By: NIST

A new measurement approach proposed by scientists at the National Institute of Standards and Technology (NIST) could lead to a better way to calibrate computed tomography (CT) scanners, potentially streamlining patient treatment by improving communication among doctors. 

The approach, detailed in a research paper in the journal PLOS One, suggests how the X-ray beams generated by CT can be measured in a way that allows scans from different devices to be usefully compared to one another. It also offers a pathway to create the first CT measurement standards connected to the International System of Units (SI) by creating a more precise definition of the units used in CT—something the field has lacked.

“If the technical community could agree on a definition, then the vendors could create measurements that are interchangeable,” says NIST’s Zachary Levine, a physicist and one of the paper’s authors. “Right now, calibration is not as thorough as it could be.” 

By: Hubert Gatignon

Health and economics are linked in more ways than just health insurance. When we look past the obvious, research shows us how brain scans, the gig economy, and even hospital queues are all part of the expanding domain of health economics.

Recently, professors and researchers from the Sorbonne University Alliance, including INSEAD, came together to share their work on the relationship between health costs and sustainability. They also discussed a host of other topics related to health economics across disciplines.

Poor health and the working poor

The vagaries of health and employment inequality were central to the researchers’ conversation.

Mark Stabile, professor of economics and academic director of the Stone Centre for the Study of Wealth Inequality at INSEAD, spoke about his past work on the effect of mental health on educational attainment and his recent work on the gig economy.

By: Taran March @ Quality Digest

Life science companies are no strangers to data, so it would be easy to assume they are adept at making innovative use of huge amounts of it. Not necessarily. A tradition of rigorous scientific method and clinical trials hasn’t prepared them for the shifting inundation of big data or all its baffling potential. If anything, the reliable, “clinically proven” analytical habits of former decades have hampered some manufacturers from leveraging data in new and needed ways.

Here’s a look at three issues facing pharmaceutical and biotech companies as they adapt to the reality of all kinds of data, all the time.

By: Dirk Dusharme @ Quality Digest

For centuries, medical procedures, prescriptions, and other medical interventions have been based largely on experience—what is known about a set of symptoms. The doctor looks at those symptoms, tests you in various ways (blood tests, X-rays, MRIs), and interprets the results based on experience with past patients or what is widely known in the medical community. Then she prescribes a treatment. There are two problems with this.

First, diagnosis relies on the doctor’s or medical profession’s interpretation of an examination and test results. Second, treatments themselves target populations, not people: This is the treatment that worked for most people in the past, so this treatment should work for you.

This isn’t to bad-mouth the medical or pharma community. But medicine has been, and still is, essentially statistical in nature. It’s based on populations, not individuals. That has been the most cost-effective way to treat the most people as efficiently as possible. It hasn’t been possible, either technologically or, more important, in terms of time, to test every patient for every pathogen he might ever have been exposed to, or to personally interview every family member to understand the patient’s family health history.

By: Mike Richman

Great quality is pretty much the same everywhere, but the cost of poor quality is not equivalent from industry to industry. For example, it’s conceivable (but I hope not probable) that this article may turn out to be a real bomb, or worse, a complete snoozer. What’s the cost of that poor quality? To you, the reader, it will likely mean little except some lost time. For me, as the writer, the reputational hit could be considerable. To Quality Digest, as the publisher of the piece, the fallout could be even worse—lost readers and advertisers.

Now consider the risks of poor quality for a company operating within the life sciences sector. For these organizations, poor quality can lead to deadly outcomes for customers. That’s why U.S. Food and Drug Administration (FDA) agents won’t hesitate to shut down organizations that have been warned about faulty procedures and bad quality but can’t or won’t correct the situation.

By: Quality Digest

Within the life science industry, federal and industry regulations have made compliance a necessity, and those regulations have only grown in magnitude and complexity. Along with them have come technological solutions that enable both compliance and efficiency, without which life science organizations can’t thrive.

The following short timeline offers a broad summary of how these elements have interacted during the last several decades. The industry has come a long way.

1978

Thou shalt obey the FDA

The U.S. Food and Drug Administration (FDA) first prescribes current good manufacturing practices (cGMP) requirements for medical devices, including quality management system (QMS) requirements.

By: Dirk Dusharme @ Quality Digest

As the United States struggles with rising healthcare costs, reducing the money pharmaceutical companies spend dealing with regulation and, at the same time, meeting drug safety requirements would seem to be competing interests.

The goal of any honest pharmaceutical company is to make money by producing a safe product that consumers need and getting it to market as quickly as possible. But the U.S. Food and Drug Administration’s (FDA) job is to make sure drugs are safe, and that means oversight (some would say excessive oversight), and oversight means costs and delays for manufacturers.

The FDA isn’t blind to this issue. At an October 2005 FDA/ISPE workshop, Dr. Janet Woodcock, director of the FDA Center for Drug Evaluation and Research (CDER), stated that a common goal of industry, consumers, and regulators was to have “a maximally efficient, agile, flexible pharmaceutical manufacturing sector that reliably produces high-quality drugs without extensive regulatory oversight.”1

By: Laurel Thoennes @ Quality Digest

Compliance with U.S. Food and Drug Administration (FDA) regulations has come a long way in the past 30 years. Here are the main changes. Have they affected your business?

1988: Food and Drug Administration Act
Officially establishes the FDA as an agency of the Department of Health and Human Services and broadly spells out its responsibilities for research, enforcement, education, and information

1988: The Prescription Drug Marketing Act
Requires drug wholesalers to be licensed by the states; restricts reimportation from other countries; and bans the sale, trade, or purchase of drug samples, as well as the trafficking or counterfeiting of redeemable drug coupons

By: Taran March @ Quality Digest

It’s been a year and a month since Stephen McCarthy switched C-suites, moving from Johnson & Johnson, where he served as vice president of quality system shared services, to Sparta Systems, where he’s now vice president of digital innovation. His focus has switched as well.

At J&J, he looked inward, responsible for delivering the strategy and plans for simplifying and standardizing quality systems across the company of 126,000 employees. At Sparta, he looks forward, driving growth strategy, supporting R&D in product development, and influencing how the company’s products and solutions can “participate in the larger IT ecosystem.” McCarthy’s outlook on quality is both deep and broad, and with both types of black belts to his name, he understands the value of practical disciplines. So when he speaks with an evangelist’s enthusiasm about quality management’s role in life sciences, he also promotes a sense of what’s actually possible. The result is a fascinating glimpse around the bend to what lies ahead.

By: Graham Freeman

Many industries have no clear boundary between safety culture and quality culture. In fact, the two are often closely integrated. Quality failures and nonconformances that require rework have been correlated with increased accidents and recordable injury rates in manufacturing organizations. These injuries frequently result from fatigue, workplace pressure, and the extra work created by quality failures.

Among the important elements of people, processes, and tools, people are the primary point of failure in increasingly automated systems. Unlike machines, we are subject to fatigue, information overload, and stress that can have a serious impact on our ability to work safely and efficiently. However, people are also where dynamic sense-making, decision-making, and situational awareness reside, which are vital ingredients in complex and high-reliability organizations (HROs).

The aviation industry represents the best example of an HRO, in which automated systems such as navigation and air traffic control are integrated with highly developed human competencies. As a result, the industry sees an extremely small number of safety violations relative to the millions of hours of commercial aviation operation annually.
