Imagine smart tools or robotic delivery systems that instantly know where they are on an ever-changing shop floor: a system that could automatically orient a CAD model of an object on the shop floor to the local coordinate system without targets or human intervention. Such tools could deliver parts or subassemblies to where they were needed, even if the major assembly or jig had moved. A smart torque wrench might automatically adjust its torque setting based on knowing where it is and which fastener is about to be tightened.
The challenge of concurrently building a map and estimating motion in an unknown environment, without an external reference system, is a well-known problem in the robotics community called simultaneous localization and mapping (SLAM). The scenario above is a high-end example of the SLAM problem, and work being done at Boeing's Bellevue, Washington, research center and at Australia's Commonwealth Scientific and Industrial Research Organisation (CSIRO) seeks to develop real-time SLAM solutions for these very challenging and dynamic environments. In short, the emerging SLAM solution uses hardware and software that create maps of the surroundings in real time, with no prior information from existing maps. The same solution can also be used to update or improve the accuracy of an existing map.
Those of you who work in aerospace or other fields that build large products have seen smart alignment techniques using laser trackers, indoor GPS (iGPS), or photogrammetry, for instance. As well as those products and techniques work, they all rely on one thing: alignment to targets on the object being measured or to targets on the infrastructure itself (e.g., fixed targets on the building). These targets are needed to orient the object being scanned by the tracker, iGPS, or photogrammetry system to the local workspace. And those techniques often require a human expert who knows where on the work floor the piece is located.
The SLAM solution, on the other hand, builds a map in real time using existing features from the manufacturing environment. The accuracy of the SLAM solution is directly related to the quality of the LIDAR (light detection and ranging) scanner. As it is currently implemented using industrial low-cost LIDAR scanners, the accuracy is less than what is attainable by high-end monument- or target-based metrology systems, but that isn't its purpose, say developers.
"The goal is to develop laser-based localization and mapping technologies that would be used for doing autonomous navigation on the factory floor without having to rely on fixed infrastructure that are attached to the factory itself," says Charles Erignac, an associate technical fellow with Boeing Research & Technology. "We wanted something that was completely self-contained, that is able to track its own movement and figure out where it is inside a given space."
In the Boeing video shown below, four LIDAR scanners from SICK, a producer of sensor solutions for industrial applications, are mounted on a cart. As the cart moves around the workspace, the computer collects point-cloud data from all four scanners and stitches it into a map in real time. So far, in limited testing, using only existing features detected in the local map (not targets), the software can compare the map to a CAD model, i.e., match existing features to CAD features. Because SLAM relies on existing features rather than targets, the object of interest can be anywhere on the floor. Once the object is found, it can be registered to the CAD model. Other objects, such as tools, fixtures, or other assemblies, can likewise be located and registered to the same coordinate system.
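At its core, registering a scanned object to its CAD model means estimating the rigid transform (a rotation and a translation) that best maps measured feature points onto the corresponding model points. Here is a minimal sketch of that step, not Boeing's software, using the SVD-based Kabsch method and assuming the point correspondences are already known (a real pipeline such as ICP would have to estimate those correspondences too):

```python
import numpy as np

def register_rigid(source, target):
    """Least-squares rigid transform (R, t) mapping source points onto
    target points via the Kabsch/SVD method. Assumes the two arrays are
    already in corresponding order."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Toy example: "CAD" feature points, and the same features as the scanner
# sees them after the jig has been rotated 30 degrees and shifted.
cad = np.array([[0, 0, 0], [2, 0, 0], [2, 1, 0], [0, 1, 1.0]])
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1.0]])
scan = cad @ R_true.T + np.array([5.0, -2.0, 0.3])

R, t = register_rigid(scan, cad)
aligned = scan @ R.T + t
print(np.allclose(aligned, cad))  # True: the scan snaps onto the model
```

With noisy real-world data the fit is least-squares rather than exact, but the idea is the same: once the transform is known, everything measured in the scanner's frame can be expressed in the CAD coordinate system.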
What Erignac describes could be done now using existing technology—laser scanners, portable arms and the like—but those require that a map, or scan, of the work area has previously been done and aligned with the factory floor, which limits their applicability to changing environments. SLAM does it on the fly, with no targets.
The main goal, says Erignac, is to make life simpler on the factory floor, particularly in aerospace, where the factory floor is enormous and the parts and tools are constantly being moved.
"It [SLAM] is an additional capability, which if applied correctly to our tools and processes, is going to make our life much simpler, faster, and less prone to error," says Erignac. "If a mechanic finds a discrepancy in the airplane, it is very difficult to describe where it is. Instead, he could pull a trigger and know where he was in the CAD model. That info could then be communicated to the person responsible for addressing that issue."
Where metrology systems like trackers, arms, and photogrammetry are used to accurately measure and align parts and take submillimeter measurements for quality control or precision alignment, the SLAM solution described here is really intended as a rapid, cost-effective way to identify where you are to within a few centimeters.
Boeing initially experimented with SLAM using the cart-mounted LIDAR shown in the video above (possibly as a precursor to an autonomous robot). Boeing researchers have now turned their attention to a handheld system developed by CSIRO. That agency has pushed the boundary of its software development, which now runs on a hardware configuration comprising a lightweight, portable handheld device, named Zebedee, and specialized software.
As you can see in the video below, the user simply walks through the environment waving the device, a Hokuyo 2D LIDAR, like a shaman waving a rattle. And the results are just as magical—a real-time point cloud of the operator's movement through the environment. The CSIRO concept for Zebedee arose from the need to map unstructured, rugged, and confined natural environments (such as the interior of mines), so a small, handheld, portable device was a must.
"The key is the proprietary software which estimates the sensor trajectory through the environment based on the LIDAR points," says Peter A. Kambouris, a business development manager at CSIRO ICT Centre. "The LIDAR point measurements sample surfaces in the environment. By associating multiple measurements of these surfaces and aligning them together, the position and orientation of the sensor over time can be inferred. The whole idea is to give you a pathway that you have walked, and then group and calibrate everything around you."
In a way, the software works in the same way the brain works, says Kambouris. We remember landmarks, but not objects that we know are moving. It is these landmarks that become the basis for how we navigate our environment. In the same way, with the CSIRO SLAM solution, as you move through the environment, the software knows to ignore stuff that isn't stationary, says Kambouris, such as people or vehicles.
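One simple way to separate landmarks from moving clutter, offered here as a hedged illustration rather than CSIRO's actual method, is to keep only points that reappear near the same place in consecutive aligned scans; anything without a nearby counterpart is treated as a mover and left out of the map:

```python
import numpy as np

def static_mask(prev_scan, curr_scan, thresh=0.2):
    """Flag points in curr_scan that have a close counterpart in prev_scan
    (both already expressed in the same frame). Points with no nearby match
    are treated as moving objects. Brute-force nearest neighbour for
    clarity; a real system would use a k-d tree."""
    d = np.linalg.norm(curr_scan[:, None, :] - prev_scan[None, :, :], axis=2)
    return d.min(axis=1) < thresh

# Toy example: a wall seen in two successive scans, plus one stray return
# from a person walking past.
wall = np.array([[x, 5.0] for x in np.arange(0.0, 3.0, 0.5)])
prev_scan = wall
curr_scan = np.vstack([wall + 0.01, [[1.5, 2.0]]])  # wall again + a mover

mask = static_mask(prev_scan, curr_scan)
print(mask)              # wall points flagged True, the mover False
print(curr_scan[mask])   # only the stationary landmarks survive
```

Persisting points act like the landmarks the brain remembers; transient ones, like people and vehicles, never accumulate in the map.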
In the future, Kambouris and Erignac expect SLAM technology will be used to streamline a variety of manufacturing functions, from delivering parts to work cells to helping align tools or subassemblies on the manufacturing floor. The possibilities are limitless.