
NIST

Quality Insider

NIST and Willow Garage Launch Robot Perception Challenge

Competition to measure performance of algorithms that process data gathered with cameras and sensing devices

Published: Thursday, March 3, 2011 - 05:00

(NIST: Gaithersburg, MD) -- The National Institute of Standards and Technology (NIST) is teaming up with Willow Garage, a Silicon Valley robotics research and design firm, to launch an international “perception challenge” to drive improvements in sensing and perception technologies for next-generation robots.

“Perception is the key bottleneck to robotics. This competition will progressively advance solutions to perception problems, enabling ever-wider applications for next-generation adaptive, sensing robots,” says Willow Garage senior scientist Gary Bradski.

The competition will debut at the IEEE International Conference on Robotics and Automation (ICRA) 2011, to be held May 9–13 in Shanghai, China. It will join three other competitions: updated versions of two robotics challenges previously developed by NIST, the Virtual Manufacturing Challenge and the Micro-Robot Challenge, and the Modular and Reconfigurable Robot Challenge, a collaborative effort by the National Aeronautics and Space Administration (NASA) and the University of Pennsylvania.

The new competition will measure the performance of current algorithms that process and act on data gathered with cameras and other types of sensing devices, explains NIST computer scientist Tsai Hong. “There are hundreds—maybe even thousands—of algorithms that already have been devised to help robots identify objects and determine their location and orientation,” she says. “But we have no means for comparing and evaluating these perceptual tools and determining whether an existing algorithm will be useful for new types of robots.”

Willow Garage is putting up cash awards for top performers. The prize money grows exponentially with performance, reflecting the increasing difficulty of each new increment in capability. The top prize is up to $7,000, awarded for successful completion of all tasks within the allotted time.

All contestants will receive a common set of about 35 objects for training and tweaking their algorithms. During the competition, teams will be evaluated on how well their solutions identify and determine the positions of these 35 objects, plus an additional set of 15 objects for validation. NIST also will inform contestants of the metrics and methods they are developing for the competition.
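The evaluation described above—identifying objects and determining their positions against ground truth—can be sketched in code. The function names, data layout, and 1 cm tolerance below are illustrative assumptions for the sketch, not the competition's actual metric:

```python
import math

# Assumed tolerance for a "correct" localization (1 cm); the real
# competition metrics were still being developed by NIST at press time.
POSITION_TOLERANCE = 0.01  # meters

def position_error(predicted, truth):
    """Euclidean distance between predicted and ground-truth positions."""
    return math.dist(predicted, truth)

def score_entry(predictions, ground_truth):
    """Count objects correctly identified AND localized within tolerance.

    predictions / ground_truth: dicts mapping object name -> (x, y, z)
    position in meters. An object missing from predictions counts as
    a failed detection.
    """
    correct = 0
    for name, true_pos in ground_truth.items():
        pred_pos = predictions.get(name)
        if pred_pos is not None and position_error(pred_pos, true_pos) <= POSITION_TOLERANCE:
            correct += 1
    return correct

# Toy example: two objects on a table, one detected within tolerance.
truth = {"mug": (0.10, 0.20, 0.00), "stapler": (0.30, 0.05, 0.00)}
preds = {"mug": (0.105, 0.198, 0.001)}  # stapler not detected
print(score_entry(preds, truth))  # 1 of 2 objects found within tolerance
```

A full metric would likely also score orientation and penalize false positives; this sketch only captures the identify-and-localize core of the task.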

Robust perception is a core enabling technology for next-generation robotics being pursued for a variety of applications. Many of these applications will require operating in unstructured and cluttered environments. For anticipated uses ranging from advanced manufacturing to in-home assistance for the elderly, to search-and-rescue operations at disaster sites, robots must be able to identify objects reliably and determine their position accurately.

The practical goals of this and future perception challenges are to determine what solutions already exist for particular robot-performed jobs, and to push the entire field to develop more dynamic and more powerful perception systems critical for next-generation robotics. NIST, a pioneer in developing metrics for evaluating and comparing robots and other automated technologies, has designed a variety of competitions intended to focus research and stimulate innovation in technology areas critical to improving the capabilities of robots. (See figure 1.)

Techniques and metrics demonstrated in these competitions provide foundations for new standards and test methods for measuring perception system performance. As is true for the other competitions, the perception challenge will grow in difficulty with each passing year.

Willow Garage of Menlo Park, California, will provide a common system for testing competitors’ perception algorithms. Visual information and other environmental data will be gathered and communicated by off-the-shelf sensing technologies, and will be evaluated on Willow Garage’s Personal Robot 2 (PR2) platform.

The deadline for entering is April 15, and final submissions are due May 1.
For more information on the perception challenge and instructions for entering, go to: http://opencv.willowgarage.com/wiki/SolutionsInPerceptionChallenge.


Figure 1: Robot’s-eye view

About The Author

NIST

Founded in 1901, the National Institute of Standards and Technology (NIST) is a nonregulatory federal agency within the U.S. Department of Commerce. Headquartered in Gaithersburg, Maryland, NIST’s mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.