Just What Is Immersive Engineering, Anyway?
Immersive engineering is a digital technology, engineering practice, and methodology, all in one, for solving multifaceted engineering problems in the design, assembly, and maintenance of large, complicated structures (e.g., automobiles, aircraft, and spacecraft). Immersive engineering is equally effective for developing practical workarounds or delving into root causes.
Immersive engineering simulations represent a fundamental advance over conventional digital modeling and engineering analyses.
Few things underscore the evolution of quality control and quality assurance as much as immersive engineering. The technology is a new way of probing engineering issues such as productivity, manufacturability, assembly, maintainability, life-cycle costs, and ergonomics all at once. Gains in any of them have large and lasting effects on quality.
In the process, immersive engineering is also expanding the definition of quality from better-faster-cheaper products to entire business processes. Two outstanding examples are at Ford Motor Co. and Lockheed Martin Corp., where immersive engineering is shortening new-product development cycles and production-launch learning curves. These industry leaders are using the technology to improve the quality of product design and engineering management.
The companies are going far beyond long-established quality control and quality assurance approaches such as geometric dimensioning and tolerancing (GD&T) for inspecting physical products, and end-of-line gauging to track manufacturing variances and process drift. Using virtual reality (VR) as well as motion capture, tracking, and analysis, immersive engineering is even pushing beyond portable coordinate measuring machines and laser and optical scanners (see sidebar).
Fundamentally, immersive engineering is a dramatic advance in the “graphics” and visualization of engineering. It integrates, almost seamlessly, VR, digital video, and related 3-D technologies; computer-aided design (CAD); simulation and analysis; and solid modeling. Immersive engineering uncovers potentially troublesome issues early in the design phase, often months before designs are nailed down for final approval or design freeze. Modifications can then be made in the digital model well ahead of product launches.
Once production starts, the cost of any modification soars. Typically, these engineering challenges cannot be readily understood, or even visualized, on desktop computer monitors. To help everyone cope, theater-like immersive engineering systems surround the problem solvers: engineers, technicians, production workers, managers, and customers. They see information presented as real-time, life-sized digital displays with motion-tracked, ergonomically accurate avatars called “digital humans.”
Tying all this together is Cortex image-analysis software, a motion-tracking and motion-analysis technology from Motion Analysis Corp. in Santa Rosa, California. Cortex scales to multiple independent, concurrent, live, full-body kinematic simulations.
Motion Analysis is the world’s largest manufacturer of high-performance optical instrumentation systems that measure and analyze the movement of physical objects, human or otherwise. Its systems are used in medicine and sports, for industrial measurement and control, and for animation in filmmaking.
Immersive engineering is most effective whenever a knotty design or production problem requires that engineers, technicians, and factory-floor workers come together to find a solution. Whether that solution goes to root causes or is just a practical workaround, immersive engineering is a powerful tool.
The fundamentals of immersive engineering were developed at the Lockheed Martin Aeronautics Co. in Fort Worth, Texas. The newest of Lockheed’s three systems is the Collaborative Human Immersive Lab (CHIL); it opened at the end of 2010 at Lockheed Martin Space Systems Co. near Denver. Lockheed’s third system is at the corporate Center for Innovation in Suffolk, Virginia. All three are linked so analyses done in one can be shared in the others.
In an immersive engineering simulation, technicians and workers perform a task in a fully interactive 3-D environment, wearing two dozen spherical reflectors or “markers” that enable motion tracking—the digital replication of every body movement. Attached to tight-fitting body suits or just mounted on heads and hands, markers locate and orient each joint and its movement in the virtual world of an immersive engineering simulation.
Motion Analysis’ digital video cameras capture and track all that movement. The company’s Cortex software sorts out and analyzes the overlaps in the digital images, then calculates object positions and human motion for use in additional simulations. These uses include:
• Production-engineering decisions on how a task can best be done given workspace clearances, workers’ reaches, sequencing, and pacing. All of these can affect manufacturing variability and quality.
• Ergonomic decisions on minimizing the risk of repetitive-stress injuries due to heavy lifts or awkward work positions. If risks are anything more than minimal, the task will get mechanical assistance, and the tooling or the part may be modified. Among other benefits, these improvements reduce process variance and boost quality.
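How Cortex actually reconstructs marker positions from overlapping camera images is proprietary, but the core idea behind optical motion capture can be illustrated with a generic two-camera ray-intersection (midpoint) method. Everything below is an illustrative sketch under that assumption; the function names and camera geometry are invented for the example and are not Motion Analysis's software.

```python
# Illustrative two-camera marker triangulation (midpoint method).
# A generic sketch of how optical motion capture recovers a marker's
# 3-D position from two camera views; NOT Motion Analysis's Cortex API.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sub(u, v):
    return [a - b for a, b in zip(u, v)]

def unit(v):
    n = dot(v, v) ** 0.5
    return [x / n for x in v]

def triangulate(c1, d1, c2, d2):
    """Closest point between two viewing rays (camera center plus unit
    direction toward the marker). For rays that truly intersect, this
    returns the exact marker position."""
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    w = sub(c2, c1)
    d, e = dot(d1, w), dot(d2, w)
    det = b * b - a * c
    if abs(det) < 1e-12:
        raise ValueError("viewing rays are (nearly) parallel")
    t1 = (b * e - c * d) / det   # distance along ray 1
    t2 = (a * e - b * d) / det   # distance along ray 2
    p1 = [x + t1 * y for x, y in zip(c1, d1)]
    p2 = [x + t2 * y for x, y in zip(c2, d2)]
    return [(x + y) / 2 for x, y in zip(p1, p2)]  # midpoint of the gap

# Example: a marker at (1, 1, 0) seen by cameras at the origin and (2, 0, 0).
marker = [1.0, 1.0, 0.0]
cam1, cam2 = [0.0, 0.0, 0.0], [2.0, 0.0, 0.0]
est = triangulate(cam1, unit(sub(marker, cam1)), cam2, unit(sub(marker, cam2)))
```

Real systems refine this with many cameras, lens-distortion correction, and least-squares fitting, but the geometric principle is the same: each camera constrains the marker to a ray, and the rays' intersection fixes it in 3-D space.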
Some pictures from immersive engineering installations show technicians wearing both reflective markers and helmet-like, head-mounted displays. In these instances, the technicians’ work requires them to alternate between motion tracking and watching their movements displayed on avatars in real-time 3-D video. This back-and-forth, real-time iterative capability is a key benefit in immersive engineering.
For the first time in everyday problem solving, immersive engineering enables engineers and anyone else to collaborate, locally or remotely, in a lifelike, fully realistic virtual environment that is interactive, holistic, and concurrent.
Ford Motor Co. and defense contractor Lockheed Martin are two of immersive engineering’s leading proponents. Their six systems, three apiece, go beyond imaging and 3-D presentations to immerse those responsible for quality in the engineering data on which their decisions are to be based. Both companies’ immersive engineering installations are built around Motion Analysis systems.
Ford pioneered the combined push for immersive engineering in productivity, manufacturability, life-cycle costs, and ergonomics in its Dearborn, Michigan, ergonomics laboratory, part of Ford’s Advanced Engineering & Technology Dept., which is a unit of Ford Vehicle Operations Manufacturing Engineering. Ford’s other two immersive engineering systems are in Merkenich, Germany, near Cologne, and in Dunton, England.
Ford carefully measures and tracks quality. Its latest posted results show total things gone wrong (TGW) per 1,000 Ford vehicles falling to 1,107 for 2009 from 1,405 in 2007, a drop of roughly 21 percent. Ford noted that this gain put it on a par with Honda and surpassed the rest of the world’s automakers, even Toyota.
In the same period, customer satisfaction rose dramatically, by 7 points from 77 percent to 84 percent measured three months after sale. Data are from the Global Quality Research System (GQRS), a Ford-sponsored competitive-research survey done by RDA Group of Bloomfield Hills, Michigan.
On its website, Ford notes that it “pairs advanced motion-capture technology—commonly used in animated movies and digital games—with human modeling software to design jobs that are less physically stressful on workers. The benefits include fewer injuries, lower cost of tooling changes, higher quality, and faster time to market, and Ford is seeing improvement in every one of those metrics.”
Ford adds that “industry-exclusive virtual tools are used by Ford engineers and designers to shave months off the product development process, while improving the quality, comfort, and appeal of Ford vehicles. Ford product development is anywhere from eight to 14 months faster than it was five years ago.”
The Ford immersive engineering installation in Dearborn is run by Allison A. Stephens, Ford global technical leader in assembly ergonomics. In addition to her focus on preventing repetitive stress injuries, she strives to reduce what Ford calls “parts churn,” the number of parts that must be modified, replaced, or retooled as a vehicle gets into production.
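Lift-risk screening of the kind described here is commonly grounded in the revised NIOSH lifting equation, a public formula for the recommended weight limit of a manual lift. The sketch below shows its metric form as an illustration of the idea; it is an assumption for explanatory purposes, not Ford's in-house ergonomics software, and the frequency and coupling multipliers are taken as given table lookups rather than computed.

```python
# Revised NIOSH lifting equation (metric form): an illustrative
# ergonomic risk screen, not Ford's actual tooling.

LC = 23.0  # load constant, kg: the recommended limit under ideal conditions

def rwl(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """Recommended weight limit (kg) for a lifting task.
    h_cm: horizontal hand distance from the ankles; v_cm: hand height
    at the lift origin; d_cm: vertical travel; a_deg: torso twist.
    fm, cm: frequency and coupling multipliers from the NIOSH tables."""
    hm = 1.0 if h_cm <= 25 else (25.0 / h_cm if h_cm <= 63 else 0.0)
    vm = max(0.0, 1.0 - 0.003 * abs(v_cm - 75.0)) if v_cm <= 175 else 0.0
    dm = 1.0 if d_cm <= 25 else (0.82 + 4.5 / d_cm if d_cm <= 175 else 0.0)
    am = 1.0 - 0.0032 * a_deg if a_deg <= 135 else 0.0
    return LC * hm * vm * dm * am * fm * cm

def lifting_index(load_kg, *args, **kwargs):
    """LI > 1.0 flags elevated injury risk: the task should get
    mechanical assistance, or the tooling or part should be modified."""
    return load_kg / rwl(*args, **kwargs)

# Ideal lift (hands close, waist height, short travel, no twist): RWL = 23 kg.
ideal = rwl(h_cm=25, v_cm=75, d_cm=25, a_deg=0)
```

In practice, the posture inputs (hand positions, reach distances, twist angles) are exactly what the motion-capture avatars supply, which is why immersive simulation and ergonomic screening pair so naturally.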
Lockheed Martin Space Systems Co. is making its most robust push yet into immersive engineering at the Collaborative Human Immersive Lab (CHIL) in Littleton, Colorado. CHIL was created to find potential design and assembly difficulties and to develop solutions while everything is still in digital form. The lab is located in the company’s cavernous final-assembly building, itself undergoing renovation.
“The purpose of CHIL is to ensure flawless execution, that is, getting it [spacecraft assembly] done right the first time,” says Mark D. Stewart, Lockheed Martin Space Systems’ vice president for assembly, test, and launch operations. “This is especially important for spacecraft because nearly every one is different, and each presents new challenges to the people who build them.”
“CHIL is about the virtual creation of our products and associated processes in digital form before we build the physical products,” says Jeff D. Smith, director of special projects at Lockheed Martin Space Systems. “This becomes especially important for spacecraft. Nearly every spacecraft is unique; each one presents different challenges to the people who build them.
“The capabilities of CHIL provide the means to fundamentally change the way we work. As customer budget pressures continue, and the need for critical assets in space escalates, CHIL will help increase the affordability and value of our programs,” adds Smith. He says similar VR and motion-tracking technology “is being used in the movie industry to create fictitious worlds, but the CHIL is real, and it is driving quality and affordability into our products.”
Lockheed Martin’s pioneering immersive engineering system, the Ship/Air Integration Lab (SAIL) in Fort Worth, Texas, supported and verified key engineering solutions, helping the defense contractor win contract bidding on the Joint Strike Fighter—now known as the F-35 Lightning II. After the F-35’s design was complete, SAIL’s mission was broadened to include safety analyses and facility reviews, and its name was changed to the Human Immersive Lab (HIL).
The CHIL also represents Lockheed Martin’s significant expansion of immersive-engineering technology into final assembly. While HIL focuses on holding down life-cycle costs of aircraft, CHIL’s capabilities span the entire life cycle, from concept phase to operations and sustainability. Much of CHIL’s focus is on mistake-proofing the production phase of multibillion-dollar systems.
The differences in the ways HIL and CHIL approach immersive engineering are huge. Aircraft in service get lots of hands-on maintenance, but spacecraft get almost none. Spacecraft can weigh thousands of pounds and have tens of thousands of electronic and electromechanical components. Except for a few military and communications satellites, each spacecraft is one-of-a-kind, and all are assembled by hand. Every connection must withstand the shock and vibration of riding a rocket into space, so every assembly step is verified and documented.
Once launched, spacecraft traverse millions of miles of space and arrive unaffected by the deep cold and low-level radiation of space travel. After all that, they must perform for years in unearthly places like Mars. Electronic tweaking aside, maintenance and repairs are impossible. Right First Time is, in other words, just the beginning of spacecraft quality.
“Using motion tracking and VR, the CHIL creates a unique collaborative virtual environment for exploring and solving problems quickly, and where hardware designs and manufacturing processes can be fine tuned before production or development begins,” notes Lockheed Martin Space Systems in a media release. “This allows engineers to identify risks and increase efficiencies early in program development, when the cost, risk, and time associated with making modifications are still low. With a range of technology applications, the CHIL can improve every stage of a program, from the concept phase to the operations and sustainability phase.”
Lockheed Martin summarized CHIL’s anticipated benefits as “a rich virtual environment” that allows engineers to:
• Establish, optimize, and validate processes before release to manufacturing
• Identify bottlenecks, collisions, and worker issues before they happen
• Improve resource utilization and material flow
• Improve productivity
• Reduce rework and scrap
• Mitigate program risk
Lockheed points out that HIL generated a 15-fold internal return on the company’s total investment in applying immersive engineering to detailed aircraft design. The company also credits SAIL with helping it save more than $100 million that would otherwise have been spent on modifications to the aircraft-carrier version of the F-35. Management at Lockheed Martin, the world’s largest defense contractor, made sure the word got out.
Immersive engineering doesn’t really change the basics of quality. That is still about getting the product, whatever it may be, made exactly to the customer’s specification, on or under budget and delivered on time with no glitches or snags. Immersive engineering allows everyone involved to see and understand what’s going on, spot problems, and recommend solutions across the broad spectrum of manufacturing—productivity, manufacturability, life-cycle costs, and ergonomics.