What We Mean When We Talk About EvGen, Part 1

Laying the foundation for a national system for clinical evidence generation

Published: Thursday, April 28, 2016 - 12:23

Across the clinical research enterprise, there is a growing awareness of serious shortfalls in the current model for generating the scientific evidence that supports medical product evaluation and clinical care decisions. As a result, the FDA seeks to modernize its methods and meet rising expectations for this evidence base.

We know, for instance, that most clinical-practice guideline recommendations are not based on high-quality evidence, which is typically derived from appropriately designed, randomized controlled trials. We also know that adherence to standards supported by such high-quality evidence results in better outcomes for patients.

There is reason to believe that we’ve arrived at a tipping point where previously separate, “siloed” efforts can be linked to create a national system for evidence generation (EvGen). In this first of a series of articles, we’ll take a look at the elements required to build such a national system, beginning with a pair of foundational concepts—interoperability and connectivity.


Put simply, interoperability is the idea that different systems used by different groups of people can be used for a common purpose because those systems share standards and approaches. To take one example: Modern train tracks employ agreed-upon standards in terms of track gauge and other specifications so that many different kinds of vehicles can safely use the rail system.

In similar fashion, a national system for evidence generation that applies common data standards and definitions could “lay the track” for significant improvements in the exchange of biomedical data. Patients, consumers, professional groups, payers, the medical products industry, and health systems all stand to benefit from potential gains in efficiency and reductions in cost that would accompany standardized approaches to data collection, curation, and sharing, once up-front investments are absorbed. Then, with these standards in place, effort could be devoted to generating actionable knowledge rather than simply managing data.
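
The value of a shared standard can be made concrete with a small sketch. The field names and coding below are purely hypothetical, not any actual FDA or health-data standard; the point is only that a record either conforms to the common schema, and can therefore flow between systems, or it does not.

```python
# A minimal sketch (hypothetical field names, not a real standard): two sites
# can exchange records only because both validate against the same schema.
COMMON_SCHEMA = {
    "patient_id": str,    # de-identified identifier
    "event_date": str,    # ISO 8601 date, e.g. "2016-04-28"
    "outcome_code": str,  # code from an agreed-upon coding system
}

def conforms(record: dict) -> bool:
    """True if the record has exactly the shared fields, with the right types."""
    return (set(record) == set(COMMON_SCHEMA)
            and all(isinstance(record[k], t) for k, t in COMMON_SCHEMA.items()))

site_a_record = {"patient_id": "A-001", "event_date": "2016-04-28", "outcome_code": "R52"}
site_b_record = {"patient": "B-9", "date": "04/28/16"}  # local, nonstandard format

print(conforms(site_a_record))  # True: any system using the schema can consume it
print(conforms(site_b_record))  # False: must be mapped to the standard first
```

In this toy version, the "track gauge" is the schema itself: site B's data are not wrong, but until they are mapped to the shared fields, no other system on the network can use them.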


Establishing interoperable systems is a critical step in building a national system for evidence generation. An equally important step is to enable collaboration among the many groups that generate data: patients, clinicians, hospital systems, and health insurance organizations, for example. Evidence is derived from high-quality data that often originate from many different sources or settings. We can create an interconnected environment that leverages all the available data to answer important public health questions. A defining characteristic of such a network is that any of the available data can be brought to bear on different tasks as needed, allowing the network to integrate complex relationships between data inputs and outputs. Coupled with interoperable standards, a national system for evidence generation built on these principles will be capable of generating very large quantities of data and enabling those data to flow among system components.

The result? Researchers will be able to distill the data into actionable evidence that can ultimately guide clinical, regulatory, and personal decision-making about health and healthcare.

These two core constructs represent the essential scaffolding that must be developed and put in place to support a national system for evidence generation. In our next article, we’ll examine ways that we can begin building and continuously improving such a system for the benefit of all stakeholders.


About The Authors

Rachel E. Sherman

Rachel Sherman, M.D., M.P.H., is FDA’s Associate Deputy Commissioner for Medical Products and Tobacco.

Robert M. Califf

Robert M. Califf, M.D., is FDA’s Deputy Commissioner for Medical Products and Tobacco.


Automotive "International Material Data System"

I'm in the auto industry as a quality engineer on electronic components. I appreciate your need for an interconnected system, and I'd like to respond to both the theory and the application of such a system.

The auto industry realized in the 1990s that it would be better to share hazardous-materials test data for raw materials internationally, showing trace amounts of substances like mercury. The advantages manifested as reduced paperwork (the many different parts made from one batch of steel did not each need that batch's paperwork) and better traceability: as long as the material makers entered the data and lot number, you could trace lots all the way back to the point of manufacture.

Application of the system is hard, and it needs to be auditable. As I said above, the database relationship is one batch to many products: easy to think of conceptually in a SQL-type database, but implementation requires some whizzes of the database industry. The more data you try to capture with each field, the faster the data volume grows, because the data are stored and accessed in multiple locations. Analyzing the data once they have been captured should require only the data-manipulation skills of any social-sciences or statistics person.
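
The one-batch-to-many-products relationship can be sketched in a few lines. The table and column names below are illustrative, not the actual IMDS schema, and the example uses Python's built-in sqlite3 module simply because it is a self-contained way to show the SQL.

```python
# Sketch of the one-batch-to-many-parts relationship: hazardous-material data
# are recorded once per lot, and every part references its lot instead of
# carrying a copy of the paperwork.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE batch (
        lot_number  TEXT PRIMARY KEY,
        material    TEXT,
        mercury_ppm REAL  -- trace hazardous content, reported once per lot
    );
    CREATE TABLE part (
        part_id    TEXT PRIMARY KEY,
        lot_number TEXT REFERENCES batch(lot_number)  -- many parts -> one batch
    );
""")
con.execute("INSERT INTO batch VALUES ('STEEL-2024-07', 'cold-rolled steel', 0.02)")
con.executemany("INSERT INTO part VALUES (?, ?)",
                [(f"BRKT-{i:03d}", "STEEL-2024-07") for i in range(3)])

# Trace any part back to its lot's hazardous-material data with a single join.
row = con.execute("""
    SELECT b.lot_number, b.mercury_ppm
    FROM part p JOIN batch b ON p.lot_number = b.lot_number
    WHERE p.part_id = 'BRKT-001'
""").fetchone()
print(row)  # ('STEEL-2024-07', 0.02)
```

The design choice is the whole point: the mercury figure lives in exactly one row, so correcting it once corrects it for every part traced to that lot.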

Auditing should be the explicit goal of an external body that monitors requirements. For IMDS, I am not a fan of the auditability of the system. An ISO 14001 auditor might be the person to dig up inconsistencies with PPAP (Production Part Approval Process), but I don't know of an organization that strictly audits IMDS requirements. The main thing an auditor should look for is that when an engineer or team approves the use of a certain material or part, they retain the material certifications for those items and recertify on some regular basis. Similarly, when an organization in your system implements the use of a certain product, it should retain and reevaluate the raw components of that product on a regular basis.
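
The recertification check an auditor would look for is simple to state in code. The material names, dates, and one-year interval below are made up for illustration; a real system would pull these from the certification records themselves.

```python
# A minimal sketch of a recertification check: each approved material keeps
# its last certification date, and anything past the recertification interval
# is flagged. Interval, material names, and dates are illustrative.
from datetime import date, timedelta

RECERT_INTERVAL = timedelta(days=365)

certs = {
    "PA66-GF30": date(2015, 1, 15),  # last material certification on file
    "EPDM-70":   date(2016, 3, 1),
}

def overdue(today: date) -> list:
    """Materials whose certification is older than the allowed interval."""
    return [m for m, certified in certs.items()
            if today - certified > RECERT_INTERVAL]

print(overdue(date(2016, 4, 28)))  # ['PA66-GF30'] -> recertify before next approval
```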

You run into issues with proprietary products (especially plastics in the auto industry), but this can be worked around by setting key process indicators for the product (hardness, resistivity, specific mass). Drawing the analogy to your products, they could have key process indicators such as percent active ingredient, time until effects are felt, and side effects.
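
The key-process-indicator workaround amounts to range checks against an agreed specification. The indicator names and limits below are invented for illustration; the supplier never reveals the formulation, only that each lot's indicators fall inside the agreed ranges.

```python
# Hypothetical KPI checks for a proprietary product: you can't see the
# supplier's formulation, but you can verify the agreed indicators stay
# within specified (low, high) ranges. Names and limits are made up.
KPI_SPEC = {
    "hardness_shore_a":    (60.0, 80.0),
    "resistivity_ohm_cm":  (1e12, 1e16),
    "specific_mass_g_cm3": (1.10, 1.30),
}

def kpis_out_of_spec(measured: dict) -> list:
    """Return the indicators that fall outside their specified range."""
    return [name for name, (lo, hi) in KPI_SPEC.items()
            if not (lo <= measured.get(name, float("-inf")) <= hi)]

lot = {"hardness_shore_a": 72.5, "resistivity_ohm_cm": 5e13, "specific_mass_g_cm3": 1.42}
print(kpis_out_of_spec(lot))  # ['specific_mass_g_cm3'] -> reject or investigate
```

For the medical-product analogy in the comment, "percent active ingredient" would slot into the same structure as another (name, range) pair.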

The payoff of a system like the one you are suggesting is fantastic, but the application will take some hard work from a good database team. I suggest you benchmark against IMDS, though I would take it to the next level and make it auditable.

Cheers, this sounds like a great project.