by Simon Jacobson
Highly publicized recalls, ever-increasing regulatory oversight, and skyrocketing customer demand for quality products have moved quality management from an afterthought to a strategic requirement for life science, health and beauty care, and food and beverage manufacturers, as well as any other firm regulated by the U.S. Food and Drug Administration.
The following industry pressures have moved quality management to the head of the agenda in many boardrooms:
• Downward pressure on health care costs
• Competitive product launches
• Using contract manufacturers to support global markets
• Patent expirations pushing products to generic and over-the-counter status
Companies under pressure to comply with increasingly stringent FDA regulations and maintain profitable growth need to act fast. Quality management must be viewed as a business process as critical as cash flow. Too many FDA-regulated manufacturers bolt on their quality measures after production, which results in:
• Excessive rework and scrap. The industry average for rework plus discarded product is more than 50 percent. A single scrapped batch can represent $3 million to $4 million to an enterprise, spanning the direct profit that would have been earned on the open market or customer shelf, the manufacturing costs, and the supply-chain costs associated with that batch.
• High work-in-process and finished-goods inventories. Many manufacturers report on-hold product inventories at the 40- to 60-day level; on-hold inventories of 100 days are not unheard of.
• Low capacity utilization. Plant utilization runs at about 50 percent.
• Protracted and unpredictable cycle times. On average, pharmaceutical manufacturing cycle times fall into the 30- to 90-day range. A batch release alone can take upwards of 60 days. Cycle times typically double in nonconformance scenarios because of the time needed to detect, trace, resolve, correct, and document deviations in the manufacturing process. It can take up to six days just to detect a nonconformance and marshal the resources to conduct an investigation. One manufacturer detected a batch nonconformance 25 days after the product had been packaged.
Over time, quality costs have skyrocketed without successfully reducing variability. Quality measures also add cost and require companies to police plant-level quality groups to enforce them, or face the worse situation of the FDA finding the problem and requiring a fix. When that happens, it indicates that quality problems are being detected long after they occur, and it points to significant supply-chain costs because the product batch is about to be shipped. Organizations that don’t embed quality management deeply into their manufacturing and supply-chain operations will ultimately weaken their competitive position and expose their brands to great risk.
A systematic approach to reducing variability is needed. Fixing the corrective and preventive action (CAPA) process is a logical first step for many companies because it’s what attracts the most FDA scrutiny. Most companies initiate the CAPA process only after a complaint or failure has been formally registered. Instead of identifying and mitigating risk trends, companies end up either chasing the same complaint at multiple facilities, or failing to correct it everywhere. Some organizations still lack procedures to ensure consistency in handling and timing activities, from complaint receipt, complaint trending, and holds on shipments, to determining the field action and actual recall.
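The trend-identification step described above can be sketched in code. The following is a minimal illustration, not any vendor's CAPA product; the facility names, defect labels, and threshold are hypothetical. The idea is to aggregate complaints by defect type across all facilities and flag any defect that recurs at multiple sites or exceeds a count threshold, so one enterprise-level CAPA can address it everywhere rather than each site chasing the same complaint independently.

```python
from collections import defaultdict

def find_capa_candidates(complaints, threshold=3):
    """Group complaints by defect type across facilities and flag
    defects that recur at multiple sites or exceed a count threshold.
    `complaints` is a list of (facility, defect_type) tuples."""
    counts = defaultdict(int)
    sites = defaultdict(set)
    for facility, defect in complaints:
        counts[defect] += 1
        sites[defect].add(facility)
    # A defect seen at several sites, or repeatedly at one site,
    # suggests a systemic cause worth a single enterprise-level CAPA.
    return {
        defect: {"count": counts[defect], "facilities": sorted(sites[defect])}
        for defect in counts
        if len(sites[defect]) > 1 or counts[defect] >= threshold
    }

# Hypothetical complaint log spanning three plants:
complaints = [
    ("Plant A", "seal failure"),
    ("Plant B", "seal failure"),
    ("Plant A", "label smear"),
    ("Plant C", "seal failure"),
]
print(find_capa_candidates(complaints))
```

A real deployment would pull the complaint log from a common repository rather than a list literal, which is exactly the single-version-of-the-truth problem discussed below.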
In some cases, leveraging information technology (IT) has complicated matters. Defensive application-buying patterns (i.e., purchasing software in reaction to increasing regulatory oversight rather than as a strategic architectural decision) have resulted in multiple site-level deployments and fragmented architectures that hinder true visibility into adverse events and complaints across multiple facilities. Companies that have rejected off-the-shelf solutions have crafted and implemented custom-developed workflows, an approach that further complicates matters when IT upgrades and corporatewide standardization come into effect: existing workflows and processes are invariably disrupted. Fragmented and disjointed IT solutions prevent manufacturers from seeing and responding in real time to all adverse events. This lack of transparency creates conflict with the organizational hierarchies defined to support the CAPA process, because without a single version of the truth, employees cannot take the appropriate and timely actions necessary for risk mitigation.
Manufacturing 2.0, web services, and service-oriented architectures have emerged to help close the CAPA gap. By leveraging a healthy prescription of service-oriented architecture, web services for integration, and business process management, companies can create what AMR Research of Boston refers to as operations process management (OPM). Contrary to conventional thinking about business process management, OPM allows companies to link their fragmented CAPA approaches together. OPM does the following:
• Predictably performs high volumes of complex data aggregations and transformations from shop-floor applications and data models in real time. OPM addresses the higher fidelity of services needed for orchestrating manufacturing processes across multiple data models and governing applications.
• Integrates operational manufacturing processes across disparate time domains to business process management systems. This crosses application and enterprise boundaries to drive transparency of data, eliminate latency of action on adverse events, and enforce process standardization.
• Links events from sensors, devices, and people. These represent the sensors and actuators in many manufacturing steps, or they intermix with actual manufacturing applications.
Let’s consider how OPM might be applied in a manufacturing setting. A nonconforming event happens at a site and results in the disposition of a full batch or lot. The dispositioned material is manufactured at five other sites, which most likely are unaware of the problem. Rather than rely on a feedback mechanism from an enterprise system (which often depicts only end-of-shift details and inventory consumption), real-time event detection and alarms can alert the other sites to interlock their processes or perform an audit to prevent a similar event from occurring. Furthermore, a common event-notification mechanism alerts downstream customers that there’s an issue with a product they’re awaiting, or that the order might be late. This helps them adjust their capacities or production schedules so that they can efficiently respond to their own customer demands.
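The cross-site alerting in this scenario amounts to a publish/subscribe pattern. Below is a minimal in-process sketch; the site names, topic, and batch identifier are invented for illustration, and a production OPM deployment would use an enterprise message broker rather than direct callbacks. When one site detects a nonconformance, every subscribed site receives the event immediately and can react by placing a hold or triggering an audit.

```python
class EventBus:
    """In-process stand-in for an enterprise message broker."""
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, handler):
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, event):
        # Fan the event out to every subscriber of the topic.
        for handler in self._subscribers.get(topic, []):
            handler(event)

bus = EventBus()
holds = []

def site_handler(site):
    # Each sister site reacts by placing a hold on the affected batch.
    def handle(event):
        holds.append((site, event["batch"]))
    return handle

for site in ["Site B", "Site C", "Site D"]:
    bus.subscribe("nonconformance", site_handler(site))

# Site A detects an out-of-spec batch and broadcasts the event in real
# time, instead of waiting for end-of-shift enterprise reporting.
bus.publish("nonconformance", {"site": "Site A", "batch": "LOT-1042"})
print(holds)
```

The same bus could carry the downstream customer notifications mentioned above; customers would simply subscribe to a different topic.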
Applying OPM to quality management doesn’t just reduce the costly cycle time from event detection to correction. It also establishes a networked approach to quality management that will support the enterprise and extend the value network of quality management functions.
Tying together loosely integrated CAPA technologies is one thing, but integrating quality management functions across the enterprise to support, and ultimately reduce reliance on, the CAPA process is trickier. Call it Six Sigma or right-the-first-time: manufacturers must build quality into every process that touches the product and shift the enterprise’s focus toward operational excellence. This requires an integrated approach to quality management at the enterprise level, spanning the total manufacturing life cycle of the product and providing the appropriate traceability to manage supplier and customer quality issues. IT applications supporting CAPA must move from bolt-on point products to in-line process control, management, and improvement. Embracing OPM will not only connect disparate event-tracking systems owned by a single company, it will also cut detection-to-correction time with customers and suppliers.
Don’t be fooled, though: Integrated quality management approaches aren’t complete product offerings. Instead, think in terms of enterprise architecture that must be assembled and integrated from existing products and technologies, including:
• Manufacturing operations. Production’s goal is the timely manufacture of a product that consistently adheres to customer specifications, i.e., throughput without variability. IT should support manufacturing by enabling tightly controlled and integrated manufacturing processes and the ability to infer product outcomes in real time. This requires three sets of technology. First, process analytical technology, including in-line sensors, analyzers, and simulation tools, monitors and predicts product outcomes. Second, process automation and control technologies limit process deviations and out-of-specification conditions in real time. Third, manufacturing execution systems automate the flow of information throughout the manufacturing process and maintain an electronic batch record.
• Laboratory operations. Research and development and laboratory operations need to be synchronized to support manufacturing via raw-materials testing, finished-goods testing (i.e., final assay), scientific data management, stability testing, dissolution testing, and infrequent in-process testing for out-of-spec product.
• Enterprise-level IT and business processes. Without explicitly calling out enterprise resource planning (ERP), the procedural guidelines for quality management need to be defined at the enterprise level. In some cases, ERP systems may provide the central repository for master recipes and specifications in addition to maintaining production schedules and orders. The guidelines must also be integrated with manufacturing and R&D functions. However, integrating ERP directly with the production process remains a costly and risky proposition. Instead, use ERP to increase visibility into production performance at the enterprise level. ERP systems have been leveraged successfully to retain high-level quality performance metrics from production and laboratory operations. These systems can then correlate the quality data with other metrics from finance and the supply chain to achieve complete enterprise performance management.
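The in-line monitoring role described for process analytical technology above can be illustrated with a simple specification-limit check; the parameter names and limits below are invented for illustration. Each sensor reading is checked against its specification window as it arrives, so deviations surface during the run rather than during after-the-fact batch-record review.

```python
# Hypothetical critical process parameters: (low, high) spec limits.
SPEC_LIMITS = {
    "temperature_c": (20.0, 25.0),
    "ph": (6.5, 7.5),
}

def check_reading(parameter, value):
    """Return None if the reading is in spec, else a deviation record
    suitable for the electronic batch record or a CAPA workflow."""
    low, high = SPEC_LIMITS[parameter]
    if low <= value <= high:
        return None
    return {"parameter": parameter, "value": value, "limits": (low, high)}

# A stream of in-line sensor readings; only the out-of-spec pH is flagged.
readings = [("temperature_c", 22.1), ("ph", 7.9), ("temperature_c", 24.8)]
deviations = [d for p, v in readings if (d := check_reading(p, v))]
print(deviations)
```

A real system would add trending and prediction on top of this check (the "infer product outcomes" role), but the spec-window comparison is the core of real-time deviation detection.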
Integrating quality management processes by using information technology will yield the following results:
• Enterprisewide visibility into quality performance. Visibility into key production life-cycle events will help the enterprise improve its management of schedules, costs, inventories, and resources.
• Coordinated production operations. This allows tracking of and responding to critical production, quality, and conformance events. For example, an aggregated view of several process variables in combination with a process analytical technology reading might trigger an out-of-spec event. This in turn could trigger a CAPA workflow, a process equipment-maintenance workflow, or both, depending on the context in which the event was detected.
• Correlated process and product performance information. This provides a deeper context regarding the production life cycle (e.g., at what step in the process the failure occurred) so that root causes of failures can be identified and responded to quickly.
• Centralized control of process compliance definitions. Centralizing the point where process compliance is defined and administered provides a secure clearinghouse and audit trail for change control. Ultimately, this helps drive the standardization of processes and procedures.
• A single version of the truth. ERP, laboratory information management systems, manufacturing execution systems, data historians, and document management systems will interact through a well-defined transaction touch point. This preserves data integrity in systems of record, minimizes redundant validation efforts, facilitates designs of experiments, and lays the groundwork for controlled, continual improvement efforts.
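The context-dependent routing described in the coordinated-operations bullet can be sketched as a simple dispatcher; the workflow names and event fields below are hypothetical, not from any particular quality-management product. The same out-of-spec event launches a CAPA workflow, an equipment-maintenance workflow, or both, depending on the context attached to the event when it was detected.

```python
def route_event(event):
    """Decide which workflows an out-of-spec event should trigger,
    based on the context in which it was detected."""
    workflows = []
    if event.get("product_impact"):
        workflows.append("CAPA")          # quality investigation workflow
    if event.get("equipment_fault"):
        workflows.append("maintenance")   # equipment work-order workflow
    return workflows

# An event whose context implicates both product quality and equipment,
# so both workflows are launched from the single detection:
event = {
    "type": "out_of_spec",
    "product_impact": True,
    "equipment_fault": True,
}
print(route_event(event))
```

The point of the pattern is that one detection fans out into whichever corrective processes the context warrants, rather than every event funneling into a single generic queue.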
The phrase “Six Sigma products on the market with three-sigma processes” comes from the vice president of global quality at one of the top three pharmaceutical manufacturers. The life sciences industry wouldn’t have grown to where it is today if poor product-supply productivity hadn’t been compensated for by profit margins of 60 to 90 percent. Because of the industry pressures mentioned earlier, companies won’t be able to sustain high margins and profitable growth unless quality management is closely integrated with the enterprise. Companies that have completed this journey can point to tangible results such as:
• 20-percent to 30-percent reduction in investigation times
• 50-percent reduction in batch record review
• 40-percent reduction in overall cycle times
As firms strive to fully track product and process performance across the value chain, the role of quality management can’t be overstated. Understanding true levels of quality can only enhance supplier management, continual improvement (more than half of all quality issues are repeat issues), and risk-awareness competencies. Potentially lower supply-network costs and reduced costs of poor quality are other benefits. To ensure a truly networked approach to quality management, companies need to understand and forestall events that pose risk. This will allow them to enhance existing risk portfolios of individual products and create a tighter feedback mechanism to identify the appropriate risk level and resolution strategy for quality and compliance events.
Simon Jacobson is responsible for AMR Research’s coverage of manufacturing operations and ERP providers serving the mid market. He examines how global manufacturers apply IT to their processes to become demand-driven manufacturers. In addition to ERP, Jacobson also covers manufacturing compliance and quality management (including environmental health and safety), manufacturing services organizations, and the evolving ecosystem of manufacturing execution systems providers across the range of industries that AMR Research studies.