
Jeremy Marvel


How Metrology Influences the User Experience in Human-Robot Interactions

The robotic interfaces connecting people and machines must balance simplicity and functionality

Published: Monday, June 10, 2019 - 12:02

I was told there would be robots.

We are living in a world in which we are surrounded by technology tailored to our needs. Our clothes are treated with nanoparticles to resist wrinkles and stains. We have sent probes beyond the farthest reaches of our solar system, and we have selfie-taking machines exploring celestial neighbors and conducting revolutionary experiments that alter our understanding of the universe.

Artificial intelligence is omnipresent and affects the way we drive, entertain ourselves, read the news, and make dinner. We can even carry out complex social relationships through online video games without ever having to physically meet another human being. The world’s knowledge can be accessed in mere seconds on computers we carry in our pockets. In our pockets! Clearly, we are living in the future so frequently and fancifully predicted in popular culture.

But... where is the plastic pal who’s fun to be with that I was told would be waiting for me?

A test evaluating the performance of an augmented reality interface for interacting with an industrial robot arm for a collaborative assembly task.
Credit: S. Bagchi/NIST

Robots are becoming increasingly prevalent in the manufacturing, medical, and service fields. They are purposefully designed to work around and with people. Robots are even marketed as being “collaborative,” in that they are supposedly safer and easier to use than ever. In every case, robots are custom-tailored for their users’ needs. Such trends imply that robots are becoming consumer products.

In the home, however, robots are largely limited to hobbyist projects, STEM toys, and single-purpose cleaning appliances. Revolutionary and sociable robots are being introduced to an eager market, only to fall short of the capabilities of the simpler, task-built devices that merely sit on a shelf. So why the discrepancy?

In reality, there is no discrepancy. It’s the task and the utility of a given robot that allow it to be custom-designed for the end user. Specific tasks get specific robots that are built to be user-friendly. General tasks get... something else. When the task is unknown or ill-defined, the manufacturer must anticipate all possible—or at least all supported—applications and design around that.

An industrial robot arm collaborates with a human operator in this test evaluating the performance of vision systems for human-robot interaction. Credit: M. Zimmerman/NIST

All robots are purpose-built, principally because there is a trade-off between simplicity and functionality. To be usable and useful, the interfaces connecting people and machines must carefully traverse the path that is flanked by “too complex” and “too simple.” The real challenge lies in the realization that experts in the field don’t accurately know where that path is, how wide it is, or where it leads.

The purpose of the interface is to facilitate communication and drive interaction. It relays important information to the person working with the machine, and it provides a mechanism for expressing the user’s desired actions. The challenge, however, is in balancing usability for a broad spectrum of users while simultaneously providing useful products. To find that Goldilocks “just right” mix of comfort and functionality often requires a lot of trial and error, especially if the ultimate application of the machine is unknown.

And that’s if all is working as it should be.

When things start to go wrong, it can be extremely difficult to diagnose the problem or predict how bad things will get. More intelligence is needed to assess the situation and provide a good prognosis. Assuming that such a prognosis is found and that it’s accurate, how is the robot supposed to share this information so that an untimely fate is avoided? That’s the interface’s job.

A good interface can enhance a user’s experience, while a bad interface can render a machine completely unusable. Thus, the interface drives the experience. Similarly, the means by which we interact with machines dictate their utility. By changing the interface, one can effectively change how a given robot is used... or if it’s used.


As such, an interface that can efficiently adapt to a user or a task is theoretically more useful for that task than an interface that attempts to accommodate all possible tasks or behaviors. To accomplish this, however, the robot needs to be aware of its environment and the user.

While we have some basic tenets to help us differentiate good graphical interfaces from bad ones, there are no metrics by which vendors can measure the effectiveness and efficiency of the interaction between people and robots before the robots are sold and used. Nor are there any standardized means by which we can measure how much better a given interface or interaction will be than another. Currently, the best measures of the effectiveness of human-machine interactions are subjective, qualitative, user-volunteered reports. There are few objective, quantitative measures by which a given interaction can be assessed.
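The article doesn’t name a specific instrument, but one widely used form of subjective, user-volunteered report is the System Usability Scale (SUS), a ten-item questionnaire scored on a five-point Likert scale. As an illustration of how such qualitative reports get reduced to a single number (the response values below are hypothetical), a minimal sketch:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten Likert
    responses (1 = strongly disagree ... 5 = strongly agree)."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses in the range 1-5")
    total = 0
    for i, r in enumerate(responses):
        # Odd-numbered items (index 0, 2, ...) are positively worded:
        # score is response minus 1. Even-numbered items are negatively
        # worded: score is 5 minus response.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    # The raw 0-40 total is rescaled to 0-100.
    return total * 2.5

# A respondent who strongly agrees with every positive item and
# strongly disagrees with every negative one scores the maximum:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # -> 100.0
```

Note that even a well-defined scale like this remains subjective: it summarizes what users report feeling, not an objective property of the interaction itself, which is exactly the gap described above.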

If such quantitative metrics existed, however, a robot could adjust its behaviors to match the user and the application, working as a collaborative tool to enable the efficient completion of a task. Similarly, if such adjustments are perceived by the people working with the robot to be both intentional and appropriate, then their confidence in the performance of the robot is strengthened and they can, in turn, respond accordingly. This mutual situational awareness is critical for effective teaming, regardless of whether it’s on the factory floor or in your kitchen at home. If the interaction breaks down, so too does the team’s performance.
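To make the idea of a robot adjusting its behavior to a metric concrete, here is a purely hypothetical sketch (not any NIST method): a collaborative robot scales its motion speed by an exponentially smoothed estimate of user comfort, so that noisy per-interaction readings produce gradual, perceivable adaptation rather than abrupt changes.

```python
class AdaptiveSpeed:
    """Hypothetical sketch: scale a collaborative robot's speed by a
    running estimate of user comfort in [0, 1]. All names and values
    here are illustrative assumptions, not a real robot API."""

    def __init__(self, base_speed=0.5, alpha=0.2):
        self.base_speed = base_speed  # nominal speed (m/s) at neutral comfort
        self.alpha = alpha            # smoothing factor for new readings
        self.comfort = 0.5            # running comfort estimate, starts neutral

    def update(self, observed_comfort):
        # Exponentially smooth noisy per-interaction comfort readings so
        # the robot's adaptation is gradual and appears intentional.
        self.comfort += self.alpha * (observed_comfort - self.comfort)
        return self.speed()

    def speed(self):
        # Slow down for uneasy users, speed up (to at most 1.5x the
        # base speed) for confident ones.
        return self.base_speed * (0.5 + self.comfort)
```

For example, repeated high-comfort readings nudge the speed upward a little at a time; a sudden low reading pulls it back down, which is one simple way the "intentional and appropriate" adjustments described above might be realized.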

This is the basis for a new research project at NIST, the Performance of Human-Robot Interaction, which seeks to establish test methods and metrics for assessing and assuring the effective teaming of humans and machines. The provision of these metrics and test methods enables the benchmarking and advancement of technology, and establishes a baseline for maintaining trust in the capabilities of the robot. Part of this project’s efforts includes reaching out to the world’s experts in human-robot interaction to develop a standardized measurement methodology.

At the recent workshop Test Methods and Metrics for Effective HRI in Collaborative Human-Robot Teams, NIST researchers and world experts established both the need for, and the means by which, human-robot interaction can be objectively measured and replicated. These needs take into account both the applications and the intercultural issues that drive the user experience and the mechanisms for interaction. Ultimately, this workshop kick-started a concerted effort to advance collaborative robot technologies into the future.

So, perhaps someday soon, we’ll get those robots.

First published May 1, 2019, on NIST’s Taking Measure blog.


About The Author

Jeremy Marvel

Jeremy Marvel is a research scientist and project leader in NIST’s Intelligent Systems Division. He has over 15 years of research experience in robotics and artificial intelligence, working in academia, industry, and government. His fields of expertise include human-robot and robot-robot collaboration, machine learning for adaptive robot control, and robot safety.