Stephen McCarthy

FDA Compliance

Transforming Quality Operational Data Into Business Insights

Data-driven decisions are only as good as the data that drive them

Published: Thursday, August 15, 2019 - 12:03

In our constantly evolving, data-rich universe, collecting, interpreting, and understanding process data can be tricky. But it is increasingly important if we want to maintain sustainable quality across product development and manufacturing processes. This challenge is particularly evident in the life sciences arena, where pharmaceutical, biotechnology, and medical device manufacturers constantly strive to build quality processes that deliver “fit for purpose” output.

Process data typically come from a collection of diverse sources in varying formats. These data are dynamic, which often means they have a short shelf life: the longer a piece of data sits, the greater the chance it loses relevance. Data also frequently come from an extremely complex supply chain that may include development or manufacturing partners’ systems and processes.

Large amounts of data are available from myriad systems, people, and processes, but teams often struggle to make sense of them in the context of quality operations. The sheer volume of data demands some type of prioritization prior to analysis. Many companies are using data lakes to hold the data and keep them safe in a controlled environment until their data scientists can determine how to use them. But here again, the relevancy of the data and the timeliness of interpretation are critical.
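The shelf-life problem can be made concrete. In this minimal sketch, incoming process records are split by age so that analysis starts with data whose insights are still actionable. The record fields, batch IDs, and 48-hour freshness window are illustrative assumptions, not values from any particular system:

```python
from datetime import datetime, timedelta, timezone

# Illustrative process records; "collected_at" marks when the reading was taken.
records = [
    {"batch": "A101", "ph": 7.2,
     "collected_at": datetime.now(timezone.utc) - timedelta(hours=2)},
    {"batch": "A102", "ph": 6.8,
     "collected_at": datetime.now(timezone.utc) - timedelta(days=5)},
]

FRESHNESS_WINDOW = timedelta(hours=48)  # assumed cutoff; tune per process

def prioritize(records):
    """Split records into fresh (actionable) and stale (archive/review) sets."""
    now = datetime.now(timezone.utc)
    fresh = [r for r in records if now - r["collected_at"] <= FRESHNESS_WINDOW]
    stale = [r for r in records if now - r["collected_at"] > FRESHNESS_WINDOW]
    # Analyze the newest data first: its insights are still actionable.
    fresh.sort(key=lambda r: r["collected_at"], reverse=True)
    return fresh, stale

fresh, stale = prioritize(records)
```

In practice the freshness cutoff would come from the process owner, not a constant, but the principle is the same: triage by relevancy before the data scientists ever see the lake.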

Because life sciences is a highly regulated industry, the burden of compliance also puts pressure on life sciences teams to accept, analyze, and act on the data as quickly as possible while also verifying the data’s chain of custody and confirming it has not been inadvertently (or purposefully) modified. In Sparta Systems’ 2019 Pharma Quality Outlook, more than a quarter (28 percent) of those surveyed pointed to new regulations and guidance as a top obstacle for 2019, and 66 percent of respondents named compliance as a top goal for 2019.

Quality operational data are a function of several components, and each input must be accurate and timely if this intelligence is to be used effectively and efficiently for business insights.

Life sciences teams must also be at the ready to host health authority audits of documented quality processes. These audits look at quality processes—anything from raw materials to commercialization—and they document “findings” where a potential weakness in the program may result in a quality issue. As the host of a health authority audit, a quality team must be able to answer inquiries and provide supporting documentation wherever appropriate. This is where efficient and effective use of process data truly makes a difference. Accurate, well-documented responses that are based on real-time process data provide the best insight into the overall quality of a process.


The risks associated with a quality issue can vary significantly. A few years ago, a large contact lens manufacturer had to recall nearly 500,000 boxes of contact lenses after millions of complaints from customers who were experiencing uncomfortable side effects when wearing the company’s product.

The complaints were captured and tracked, but there was no easy way for the team to determine exactly where the problem originated. It took a long time, a significant number of resources, and substantial costs to figure out that the issue was being caused by higher-than-expected levels of a type of acid used in manufacturing the lenses, which had not been fully removed during the lens-rinsing process.

An uncomfortable product experience and the costs of a recall could have been much worse for the company. Quality issues in manufacturing can result in very serious or long-term illnesses or even death. This is the main reason quality programs exist in the life sciences: to ensure patient and consumer safety as innovative new medicines and medical devices are commercialized.

A quality issue can also affect the company’s relationship with health authorities. If an issue occurs, it can raise questions in the minds of health authorities as they assess a company’s development or manufacturing processes. Similarly, questions may begin to arise in the marketplace as the company takes corrective actions and attempts to market the affected product or other products in their pipeline. The echo effect of quality issues can impact an entire portfolio of products, resulting in loss of market share and diminishing competitive advantages.

Finally, companies with severe quality issues may end up operating under a consent decree, an agreement with a health authority that is enforced by the U.S. Department of Justice. A consent decree typically lasts three to five years and requires the development and implementation of an extensive mitigation plan. A consent decree can cost hundreds of millions of dollars in fines, remediation expenses, and lost sales while also having a lasting negative impact on the company’s reputation and brand.


How can life sciences teams manage, track, and analyze large amounts of process data and maintain quality processes? How can the risks be minimized across such complex processes? Below are four key recommendations that will ensure you are on the right path to transforming operational data into business insights.

Develop a quality system approach. This requires a team to look at the entire, end-to-end process and focus on delivering a consistent and reliable product through documented and controlled processes. The process should consider data sources and formats as well as accessibility and control. This holistic approach enables teams to recognize both risks and opportunities for improvement.

A quality process must also be sustainable. Product development and manufacturing teams cannot assume that this is a “one and done” activity. Continuous improvements to processes are often a natural progression as teams gain experience interpreting process data and acting on emerging insights that are coming from those data. Quality planning, assurance, control, and improvement address current and future processes at a program level and offer sustainability.

Gain team consensus on key performance indicators (KPIs). Teams must agree not only on what will be measured but also on how and when it will be measured. This requires cross-functional discussions and should also include any relevant outsourcing partners. It is important to view each piece of data in the context of the larger process and the value it brings to the quality program. It is also important to understand how the data may impact real-time decision-making processes when an issue occurs. For example, process data about a specific batch of drug product where a quality issue is identified can be incredibly valuable if they are available in real time. If those same data are only available two weeks later, when the drug has been packaged and potentially dispersed through numerous distribution channels, it puts the team in a completely different and much more reactive position. It also significantly increases the risk of the impacted product getting into the hands of a consumer or patient who could experience a serious adverse event.

Once you have gained consensus, your operational analytics will help prioritize the sorting and processing of large volumes of data.
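What a team-agreed KPI might look like in practice can be sketched briefly. The metric names, acceptance limits, and lot numbers below are hypothetical, chosen only to illustrate evaluating a batch against agreed limits the moment its data arrive rather than weeks later:

```python
# Team-agreed KPI limits (hypothetical values, for illustration only).
KPI_LIMITS = {
    "residual_acid_ppm": (0.0, 50.0),   # acceptable range, parts per million
    "fill_volume_ml":    (9.8, 10.2),
}

def evaluate_batch(batch_id, measurements):
    """Return KPI excursions for one batch, evaluated as data arrive."""
    excursions = []
    for metric, value in measurements.items():
        low, high = KPI_LIMITS[metric]
        if not (low <= value <= high):
            excursions.append((metric, value, (low, high)))
    return excursions

# A batch whose rinse step left too much residual acid is flagged while the
# product is still in the plant, not after it has entered distribution.
issues = evaluate_batch("LOT-0417",
                        {"residual_acid_ppm": 72.5, "fill_volume_ml": 10.0})
```

The point is not the code but the consensus it encodes: once the team has agreed on the metrics and limits, the check can run automatically on every batch in real time.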

Improve accessibility and transparency of key data. As you develop KPIs, consider where data are stored and their ownership and accessibility. Accessibility of data determines the degree of transparency in the process. If the data aren’t accessible to the appropriate roles or require certain skill sets to access, process transparency will be difficult to achieve, and the team’s responsiveness will be negatively impacted. For example, if key reports can only be run by a programmer or IT person, waiting on a specific resource to distribute this information to other team members creates an unnecessary bottleneck that will have consequences for any mitigation plan. The idea of “democratization of data” assumes accessibility to the data by the people and systems that will analyze and act on the data.

Since so much process data are dynamic and have a short shelf life, a discussion of accessibility should be high on the priority list. Quality processes, by nature, require control but the data should deliver actionable insight to the right people at the right time.
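One way to reconcile control with accessibility is explicit role-based read access, so any authorized team member can pull a report without waiting on IT. This is a minimal sketch; the role names, data domains, and hard-coded mapping are hypothetical, and a real system would pull permissions from an access-control service:

```python
# Hypothetical role-to-data-domain mapping (illustration only; a real
# system would query an access-control service, not a hard-coded dict).
READ_ACCESS = {
    "quality_engineer": {"deviations", "batch_records", "capa"},
    "production_lead":  {"batch_records"},
    "it_admin":         {"deviations", "batch_records", "capa", "audit_logs"},
}

def can_read(role, domain):
    """True if the role may read the given data domain."""
    return domain in READ_ACCESS.get(role, set())

def run_report(role, domain):
    """Self-service report: any authorized role runs it without an IT handoff."""
    if not can_read(role, domain):
        raise PermissionError(f"{role} may not read {domain}")
    return f"report over {domain}"
```

Control is preserved (unauthorized roles are refused), but the bottleneck of routing every report request through one programmer disappears.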

Use enabling technology to manage complex quality programs. The right quality management system can help teams sift through large quantities of data and provide alerts and notifications when a potential issue is looming. Technologies that provide industry-proven workflow capabilities can also escalate critical issues rapidly so that mitigation efforts can be implemented before the issue reaches further into the process. The longer a quality issue remains unaddressed and the deeper into a process that an issue moves, the more expensive it becomes. Technology can be used to create dashboards and reports that help teams identify, triage, and manage deviations, conduct root cause analyses, and implement corrective and preventive actions (CAPA).
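The alert-before-escalation idea above can be sketched with a rolling window over daily complaint counts. The window length and threshold are assumed values, and a production QMS would track far more signals, but the mechanism is the same: flag the trend before the issue moves deeper into the process.

```python
from collections import deque

# Hypothetical escalation rule: if complaints exceed a threshold within a
# rolling window, notify the quality team before the issue spreads further.
WINDOW = 7       # days in the rolling window (assumed)
THRESHOLD = 25   # complaints per window that triggers escalation (assumed)

class ComplaintMonitor:
    def __init__(self):
        # deque(maxlen=...) automatically drops counts older than the window.
        self.daily_counts = deque(maxlen=WINDOW)

    def record_day(self, count):
        """Add one day's complaint count; return an alert if a trend is looming."""
        self.daily_counts.append(count)
        total = sum(self.daily_counts)
        if total > THRESHOLD:
            return f"ESCALATE: {total} complaints in last {len(self.daily_counts)} days"
        return None

monitor = ComplaintMonitor()
alerts = [monitor.record_day(n) for n in [2, 3, 4, 6, 12]]  # rising trend
```

The first four days stay quiet; the fifth pushes the rolling total past the threshold and produces an escalation, days before a monthly report would have surfaced it.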

Technology can also be used to support routine internal audits and help ensure successful health authority audits. Internal audit activities can be a great source of feedback about a quality process and the data being used to manage it.


Quality management is important in ensuring that new drug therapies or medical devices deliver consistent value to consumers and patients. The complexity and volume of operational data demand a systematic approach to minimize risk. Enabling technologies can simplify the complex task of managing operational data so that teams can focus on improving processes instead of wrangling process data.

Quality programs rely on quality data. As teams continue to gain experience with new and varying types of data that come from their development and manufacturing processes, they will experience greater flexibility and improved responsiveness to quality issues. These teams will not only be able to understand but also anticipate new and emerging requirements as their processes and resulting deliverables improve. Sparta Systems’ 2019 Pharma Quality Outlook showed that the majority of teams already understand the significance of quality data and are increasingly using them to drive performance. More than three-quarters of those surveyed (76 percent) said that quality data are very or extremely important to their teams and leadership.

Final note: Quality systems and processes cannot survive or thrive if the company culture does not appreciate the benefits of a quality program. The mindset of the organization must support a team’s efforts to identify, track, and manage operational process data. The data management challenges can be significant, and the efforts may tax resources. However, if leadership teams understand the importance of process analysis and are committed to continuous improvement, the barriers to success will be much smaller.

Competitive life sciences teams not only understand the challenge of mining quality data but can also envision and respect the application of quality data to improve performance and create efficiencies throughout their value chains. Data-driven decisions are only as good as the data that drive them.


About The Author

Stephen McCarthy

Stephen McCarthy is vice president of digital innovation at Sparta Systems. McCarthy helps advance the value of quality management systems for Sparta’s customers and helps them transition successfully to digital technology when they are ready. McCarthy serves as a key resource, sharing his leadership, innovation, and experience while contributing to customer interactions, marketing activities, industry leadership events, and product development. In this role, McCarthy supports Sparta’s mission of helping the industries it serves improve product quality and safety, streamline operations, lower risk, and ensure compliance, using innovative cloud solutions that support an all-digital strategy.