Quality improvement has stalled in manufacturing due to an inability to capture, continuously improve, and leverage performance knowledge in design and manufacturing activities. Other enterprise systems, such as product life-cycle management (PLM), fail to improve quality because they treat it as a process management problem. The fundamental challenges to achieving quality are knowledge-management and continuous-improvement issues. Recently, quality life-cycle management has received a boost from enterprise software solutions designed to change how manufacturers go about designing quality into their manufacturing processes and products.
Some statistics reported by manufacturers highlight the current dilemma in quality performance:
• Eighty percent of all quality issues are repeat issues. These are errors that have happened before and were fixed, yet the lesson learned wasn’t recalled by or communicated to another group so that preventive action could be taken.
• Eighty percent of all issues are with conforming materials. This happens when a product is manufactured to specification but problems still arise. Root cause analysis may determine that all parts were made to specification, but the specification itself was wrong. Costly root cause analysis involving simulation, testing, and design of experiments must be performed to properly characterize the product and the process. Engineering changes, new controls, and manufacturing process changes are introduced as a result.
When asked why this is, manufacturers invariably answer the “inability to manage lessons learned and best practices” and “poor communication between engineering and manufacturing.” Even for companies that are happy with their delivered product quality, that quality comes at a high cost in prototyping, testing, rework, and time to market. Innovation is treated as a significant quality and time-to-market risk, so it falters, and products consequently fall behind the competition.
Quality life-cycle management (QLM) fills the quality realization gap in PLM and other enterprise systems by providing a knowledge management platform for core quality definition and process management activities such as failure mode and effects analysis (FMEA), design verification, root cause analysis, corrective actions, and control plans.
Looking at figure 1 below, we can see key quality definition and control activities happening across the product and process life cycle, with little common coverage by enterprise systems.
Figure 1 only shows a partial list. The number of in-use, structured, and continuous-improvement-oriented quality definition and management activities is staggering. They include voice of the customer, house of quality, advanced product quality planning (APQP), FMEA, measurement systems analysis, lean, kaizen, continuous improvement, corrective action, preventive action, mistake proofing, and Six Sigma. These are all important tools to determine systematic improvements, implement them, and attempt to sustain them. Unfortunately, almost all run separately as independent tools and isolated spreadsheets, trapped within the department that created them. In most cases, tremendous knowledge is gained but never applied more than a few times before it’s lost.
Once the low-hanging fruit of continuous improvement is exhausted, the repeat issues remain. Fortunately, most teams now driving quality are composed of operational excellence managers with years of Six Sigma experience and a strong focus on measurable performance management.
What’s needed is commonly known. Corrective action systems must be driven by root cause, which must be guided by detailed quality plans. Any lesson learned must become part of the current quality plan and all future plans. A lesson learned in manufacturing should be as readily accessible to the design engineer or the tier-three supplier providing a key component as it is to the machine operator who discovered it.
Quality life-cycle management is best understood by first looking at quality definition and quality process management independently. Quality definition is the set of activities that are performed when characterizing a product and its supporting processes, whereas quality process management is the set of activities that shepherd content through creation and communication. Critical quality-definition activities include:
• Voice of the customer through quality function deployment (QFD)
• Design FMEA
• Design verification
• Design rule management
• Process flow diagrams
• Process FMEA
• Control plans
• Work instructions
• Root cause analysis
Each of these activities is characteristic-driven. For any component, quality definition dives into the detailed characteristics: functions, failure modes, causes, effects, controls, gauge numbers, expected test results, performance bands, and measurement boundaries.
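To make the idea of a characteristic-driven record concrete, here is a minimal sketch in Python. All class and field names are hypothetical illustrations of the detailed characteristics listed above (failure modes, causes, effects, controls, gauge numbers, performance bands), not the schema of any actual QLM product.

```python
from dataclasses import dataclass, field

@dataclass
class Control:
    name: str          # e.g., an in-process gauge check (illustrative)
    gauge_number: str  # reference to the measurement device

@dataclass
class Characteristic:
    feature: str       # the specific measurable, controllable feature
    nominal: float     # target value
    lower_limit: float # performance band boundaries
    upper_limit: float
    failure_modes: list = field(default_factory=list)  # known failure modes
    causes: list = field(default_factory=list)         # known causes
    effects: list = field(default_factory=list)        # downstream effects
    controls: list = field(default_factory=list)       # Control instances

    def in_spec(self, measured: float) -> bool:
        """Check a measurement against the performance band."""
        return self.lower_limit <= measured <= self.upper_limit

# Usage: a hypothetical bracket-thickness characteristic with one control
thickness = Characteristic(
    feature="bracket thickness (mm)", nominal=3.0,
    lower_limit=2.8, upper_limit=3.2,
    failure_modes=["warping"], causes=["insufficient steel thickness"],
    controls=[Control("thickness gauge check", "G-104")],
)
print(thickness.in_spec(2.9))  # True
```

The point of the sketch is that every downstream quality activity (FMEA, control plans, root cause analysis) can reference the same structured record rather than a free-text description in a spreadsheet.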
Some critical quality process management activities include:
• Corrective action
• Preventive action
• Issue management
• Nonconforming materials
• Recommendation management
• Quality compliance traceability across the product and process life cycle
Traditionally, each of these has been implemented independently of the product and process characterization processes. Issues and corrective actions were based on qualitative summaries of observations, preventive actions were independent of the quality plans, and traceability was attempted at the wrong level--e.g., as requirements rather than characteristics.
The core premise of QLM is that effective quality definition is characteristic-driven, and effective quality process management is dependent on the continuous improvement and knowledge management of the quality characteristics being defined. This isn’t as easy as it sounds because characteristics are influenced by factors across the entire product and process life cycle. A design team working in isolation from manufacturing and support is unlikely to effectively characterize its product. Quality issues will result.
That brings us to QLM principle 1.
A characteristic is a specific measurable and controllable feature described at the behavioral level of a system. To succeed in developing quality products and processes, we must take product features and needs and convert them into characteristics. By doing so, we make quality tangible and manageable; we also provide the foundation of a systems approach to quality. Important characteristics are ones that must be carefully validated and controlled to ensure safety and quality performance.
However, traditional requirements represent only about 20 percent of the product and process definition. As a product moves from concept through design and manufacturing, a “requirements explosion” occurs. It can be either implicit, where the requirements aren’t captured or defined (e.g., the designer, concerned about warping, increases the thickness of a bracket, and no verification of need or subsequent capture of a best practice is done); or explicit (e.g., a prior design FMEA identified a failure mode with steel thickness as the cause, which prompted the development of a verification test and subsequent design rule).
Indeed, the requirements explosion hints at the tyranny of quality, where we find ourselves overdesigning products without understanding exactly what our goal is and what characteristics we must implement to achieve it. If we haven’t been planning quality--including identifying our characteristics and verifying, controlling, and capturing them as lessons are learned--we have no means to apply this knowledge to other programs. More important, we can’t communicate it to our supply chain and contract manufacturers, who are directly responsible for the effective delivery of 80 percent of our offering’s significant and critical quality characteristics.
In short, we don’t know what characteristics to identify and how to specify them. This leads us to the second key difference between PLM and QLM when it comes to quality.
Quality realization is the design, verification, and control of products and processes based on cumulative best practices and lessons learned.
This can be seen in the emergence of next-generation quality practices. ISO 9001 documents your processes but not your data. ISO/TS 16949, the APQP automotive standard, essentially adds content to the ISO 9001 standard. Design for Six Sigma (DFSS) adds “why” to the “how” of Six Sigma through design practices similar to APQP.
If you’re manufacturing products, this gets very complex, very quickly. The more innovation and product-centric a company’s business model is, the greater the complexity in quality activities and processes.
Figure 2, above, shows the interaction of a common quality process across the key quality definition and validation activities in the product and process life cycle. This illustrates the critical role of lessons learned and best practices in achieving high quality initially, and how an integrated QLM approach can ensure continuous improvement in the corporate quality-knowledge base.
This leads to the third core principle of QLM.
This principle is similar to the emergence of PLM over stand-alone electronic change-management systems. Without the product data, what exactly is the engineering change process managing? Similarly, if you have a quality issue, what was the quality plan against which the issue occurred?
In most manufacturing enterprises, corrective actions run as much risk of reducing quality as improving it. That’s because without a systemic quality planning process, corrective actions are tantamount to trial and error. Even if quality is improved, that improvement isn’t sustained. The lesson learned is often not communicated broadly, or, more likely, it’s simply forgotten over time.
Quality is a cross-functional discipline, and this is seen in the fabled wall between design and manufacturing and the large number of efforts to span this divide. Design for assembly, design for manufacturing, geometric dimensioning and tolerancing, value engineering--all of these strive to have design teams understand manufacturing realities. Design teams identify what they deem to be critical and special characteristics, and provide those to manufacturing. Design and process FMEA are done with cross-functional teams.
Design problems are identified in manufacturing and in customer use. Continuous improvement in design requires easy access to, and understanding of, manufacturing, customer, and warranty data.
Lots of data get generated in different systems, using different descriptions, and for different uses.
To achieve a true life-cycle approach to enterprise quality realization, three things are required:
• A common language. Functional groups across the value chain speak the same language, describing quality attributes and issues with the same terminology and meaning.
• Bill of compliance. Traceability is enabled through a bill of compliance (BOC) at each stage of the quality life cycle. A true BOC is a report that shows where things were done according to plan, and clearly highlights gaps or room for improvement. This ensures that design, engineering, and manufacturing collaborate effectively on design and manufacturing issues, and that critical customer needs aren’t forgotten.
• Seamless access and communication. Knowledge from any part of the life cycle is easily accessed and correlated to activities at any other stage of the quality life cycle.
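A bill of compliance, as described above, is essentially a gap report: planned controls compared against what was actually executed at each stage. The following sketch shows that idea in Python; the dictionary shapes and example characteristic names are assumptions for illustration only, not the format of any real BOC system.

```python
# Hypothetical BOC check: compare the controls a quality plan calls for
# against the controls actually executed, and flag gaps per characteristic.

def bill_of_compliance(planned: dict, executed: dict) -> list:
    """Return one (characteristic, control, status) row per planned control."""
    rows = []
    for characteristic, controls in planned.items():
        done = set(executed.get(characteristic, []))
        for control in controls:
            status = "compliant" if control in done else "GAP"
            rows.append((characteristic, control, status))
    return rows

# Illustrative plan vs. execution data
plan = {
    "bracket thickness": ["gauge check G-104", "first-article inspection"],
    "weld strength": ["pull test"],
}
actual = {
    "bracket thickness": ["gauge check G-104"],
    "weld strength": ["pull test"],
}

for row in bill_of_compliance(plan, actual):
    print(row)
```

The missing first-article inspection surfaces as a "GAP" row, which is exactly the kind of clearly highlighted shortfall the BOC is meant to expose to design, engineering, and manufacturing alike.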
Figure 3 below shows the typical quality workflow across knowledge management, characterization, and process management. It nicely illustrates the principles of a life-cycle approach and the need for a common language for compliance and communication. It also clearly articulates the necessity of running quality processes at the characteristic level.
The four prior QLM principles are based on decades of industry experience using core quality tools. These principles are largely agreed upon. The final, and possibly most important, requirement for a QLM system is to ensure that all the characteristics captured as lessons learned are actively used. If any lesson has been learned in the many attempts to implement knowledge-capture and preventive-action systems, this is it.
Thankfully, a characteristic-based life-cycle approach to quality makes this a very achievable goal, partly because in a QLM system, a knowledge base makes the user’s job faster and easier. By making quality processes operate based on the primary quality-definition content, continual improvement of the knowledge base is implicit in transactional quality business processes.
Here are some examples: When entering a failure mode, QLM systems actively provide the user with a list of related causes and effects. When considering controls, the systems show what controls are currently considered best practices across the organization for similar processes.
When performing a root cause analysis, users are prompted to select the failure mode and cause that were identified rather than duplicating that information; if a new control is needed, it’s added directly to the control plan for that cause.
Now when you view a control and a control plan, you can see all the historical corrective actions that produced it. This ensures that everyone is speaking the same language across functional groups, because the language is a common resource shared and continuously approved. Deviating from the common language slows down the quality definition process because if you invent a new failure mode, you lose the associations to causes, effects, controls, verification, gauge plans, and historical corrective actions.
However, if you add a new cause in the same language, it’s easily communicated across other teams using the same process with the same failure mode.
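The knowledge-base behavior described in the last few paragraphs can be sketched as a simple lookup: a known failure mode surfaces its associated causes, effects, and best-practice controls, while an invented term returns nothing. The dictionary structure and all entries below are illustrative assumptions, not a vendor API.

```python
# Hypothetical shared knowledge base keyed by failure mode (illustrative data)
KNOWLEDGE_BASE = {
    "warping": {
        "causes": ["insufficient steel thickness", "uneven cooling"],
        "effects": ["assembly misalignment"],
        "controls": ["thickness gauge check", "cooling-rate monitor"],
    },
}

def suggest(failure_mode: str) -> dict:
    """Return the related causes, effects, and controls for a failure mode."""
    empty = {"causes": [], "effects": [], "controls": []}
    return KNOWLEDGE_BASE.get(failure_mode, empty)

# A known term carries all its associations...
print(suggest("warping")["controls"])
# ...while inventing a new name for the same failure mode loses them
print(suggest("bent out of shape"))
```

This is the cost of deviating from the common language in miniature: the new name is a valid entry, but every association to causes, controls, verification, and historical corrective actions is gone.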
A lot of enterprise systems provide useful quality activities and processes. However, they’re usually transactional in nature and don’t focus specifically on the data-centric quality definition and knowledge-management capabilities essential for driving continuous improvement. Quality content is simply documents in a document management system. There’s neither a relationship among the documents across the quality life cycle, nor a knowledge-management core that maintains consistency and associations among the many characteristics that define quality in products and processes.
A wise man once said that a sure sign of a broken enterprise business process is the uncontrolled proliferation of spreadsheets. Even with current enterprise systems in place, this is definitely the reality in quality today.
QLM solves the need to take all those spreadsheets, link them, tie them to business processes, and manage the knowledge capture, continuous improvement, and communication of the data within them.
Looking at some enterprise systems involved in quality, QLM is most similar in concept to PLM. However, PLM manages quality from a document and process level. There’s no concept of characteristics or links across the quality life cycle. Whereas most PLM activities and processes run well at the document/file level, QLM activities don’t.
Quality management systems (QMS) are mostly workflow and document-control systems. They perform functions similar to PLM’s but add specific domain knowledge and preconfigured workflow. Some PLM vendors are starting to add this domain knowledge along with QMS-type functionality. However, the weakness of PLM domain expertise in both quality and manufacturing (as opposed to design) processes has limited their presence in quality. They’re also competing on so many fronts with best-of-breed solutions (e.g., project and portfolio management, and manufacturing process management) that their capacity to focus is largely limited.
Quality control modules in enterprise resource planning systems are inspection-based and do little to capture and improve quality knowledge, or share it with other functional groups such as design engineering.
Manufacturing execution systems (MES) lack a model that defines the data they collect, focusing instead on a narrow portion of the product life cycle. However, MES data are a critical component of product and process characterization.
Emerging categories such as enterprise manufacturing intelligence or enterprise performance management still focus on the manufacturing portion of the quality life cycle and are mostly portals into manufacturing transactional efficiency. As such, they are basically a heterogeneous MES integration layer.
QLM can provide critical infrastructure to improve the return on investment that manufacturers receive from all these tools, and integrations among them will be common. The incursion of vendors from these other areas into QLM will be difficult due to the significant technological and domain model differences between them.
QLM is unique because it’s characteristic-driven, strongly dependent on knowledge management, and emphatically cross-functional. As a simple example of its effect, consider the following: How do you ensure that a requirement of “long lasting” is translated into product and process characteristics? How does a manufacturing engineer operating an injection molding machine at a tier-two supplier two continents away ensure that your product is long lasting?
Fundamentally, the above activities and processes center on one concept: characterizing products and processes so that critical characteristics affecting quality are identified and controlled across the entire product and process life cycle.
Ultimately, QFD, FMEA, design verification, and design of experiments all do one thing: identify characteristics and controls. Lots of them.
To avoid repeat issues and failures, we must first find the right characteristics and controls, and then manage them so we can apply any discovered best practices on future and similar programs.