By: Matthew M. Lowe

While most business sectors have welcomed the efficiencies and benefits that cloud technologies and software-as-a-service (SaaS) offerings bring, the life sciences industry has been slow to embrace external cloud networks. Merely a decade ago, in fact, an International Data Corp. survey found that 75 percent of CIOs and IT executives in the life sciences and healthcare fields cited security risks as their primary reason for opposing cloud technologies.

Cloud-averse attitudes are slow to change, and industry research indicates that companies managing health information remain strongly resistant to cloud technology.

By: Jon Speer

The European Medical Device Regulation (MDR) is a new regulation governing the production and distribution of medical devices in Europe, and compliance with it is mandatory for medical device companies that want to sell their products in the European marketplace.

If your company was already compliant with the Medical Devices Directive (MDD), don’t be fooled into complacency: The MDR is a brand-new regulation with significant changes.

For those seeking to better understand why the regulations have changed, and what some of the major changes are, let’s take a look at some of the most common questions we hear from our users.

1. Why did the MDD need an update?

There were many reasons the MDD needed to be updated. For instance, when the MDD came into law in 1992, software as a medical device (SaMD) did not yet exist. Software was something that controlled electric machines, and apps that patients could use to monitor their own health were still nearly 20 years away.

By: AssurX

Last month an investigative report revealed that the U.S. Food and Drug Administration (FDA) has millions of “hidden” serious injury and malfunction reports on medical devices. According to the report from Kaiser Health News, “Since 2016, at least 1.1 million incidents have flowed into the internal ‘alternative summary reporting’ [ASR] repository, instead of being described individually as device-adverse events in the public database known as MAUDE.”

Medical experts rely on the Manufacturer and User Facility Device Experience (MAUDE) database to identify problems that could put patients in jeopardy—making reports that are not in that database essentially concealed.

In 2017 alone, 480,000 injuries or malfunctions were reported through the ASR. The FDA has declined to provide a complete list of the approximately 100 devices that have been granted reporting exemptions. Requests for those data through the Freedom of Information Act could take up to two years to fulfill.

Critics have pointed out that many of those devices, which include staplers, vaginal mesh devices, robotic surgical devices, breast implants, and heart valves, come from medical device industry leaders.

By: Matthew M. Lowe

Despite the life science industry’s infatuation with modernity and trend chasing, even its most forward-thinking organizations have struggled to fully digitize and integrate their operations.

Yet, while the industry lags behind most other sectors in implementing business-streamlining digital technologies, many shrewd life science companies are working to close the digital gap so they can capitalize on the competitive advantages digitization affords.

As digital initiatives gain traction, and as advanced technologies take over more of our mundane tasks, skilled life science professionals’ fears about job displacement are intensifying. Their apprehensions are intertwined with the global workforce’s general anxieties about automation: A 2017 PwC survey reported that 37 percent of the world’s workers worried about eventually losing their jobs to automation. The unease appears to be worsening; only 33 percent of workers reported such concerns in the same survey in 2014.

By: Robyn Metcalf

In an outbreak that has now run for more than 28 months, at least 279 people across 41 states have fallen ill with multidrug-resistant Salmonella infections linked to raw turkey products. Federal investigators are still trying to determine the cause. In response to food company recalls, more than 150 tons of raw turkey products have flowed back through the supply chain as waste.

In an age when companies envision drone pizza delivery and hamburgers prepared by robots, why is it so hard to locate the source of food-borne diseases like this one?

As I show in my book, Food Routes: Growing Bananas in Iceland and Other Tales from the Logistics of Eating (MIT Press, 2019), the challenge of tracing food-borne illnesses in the United States demonstrates that our high-tech food system is broken in fundamental ways. It also reveals a lag between the announcement of cool new tech advances and their application to real problems. In the meantime, people get sick, some die, and food piles up in landfills.

By: NIST

A new measurement approach proposed by scientists at the National Institute of Standards and Technology (NIST) could lead to a better way to calibrate computed tomography (CT) scanners, potentially streamlining patient treatment by improving communication among doctors. 

The approach, detailed in a research paper in the journal PLOS One, suggests how the X-ray beams generated by CT can be measured in a way that allows scans from different devices to be usefully compared to one another. It also offers a pathway to create the first CT measurement standards connected to the International System of Units (SI) by creating a more precise definition of the units used in CT—something the field has lacked.

“If the technical community could agree on a definition, then the vendors could create measurements that are interchangeable,” says NIST’s Zachary Levine, a physicist and one of the paper’s authors. “Right now, calibration is not as thorough as it could be.” 

By: Hubert Gatignon

Health and economics are linked in more ways than just health insurance. Look past the obvious, and research shows how brain scans, the gig economy, and even hospital queues all fall within the expanding domain of health economics.

Recently, professors and researchers from the Sorbonne University Alliance, including INSEAD, came together to share their work on the relationship between health costs and sustainability. They also discussed a host of other topics related to health economics across disciplines.

Poor health and the working poor

The links between poor health and employment inequality were central to the researchers’ main conversation.

Mark Stabile, professor of economics and academic director of the Stone Centre for the Study of Wealth Inequality at INSEAD, spoke about his past work on the effect of mental health on educational attainment and his recent work on the gig economy.

By: Taran March @ Quality Digest

Life science companies are no strangers to data, so it would be easy to assume they are adept at making innovative use of huge amounts of it. Not necessarily. A tradition of rigorous scientific method and clinical trials hasn’t prepared them for the inundation of big data or all its baffling potential. If anything, the reliable, “clinically proven” analytical habits of past decades have hampered some manufacturers from leveraging data in new and needed ways.

Here’s a look at three issues facing pharmaceutical and biotech companies as they adapt to the reality of all kinds of data, all the time.

By: Dirk Dusharme @ Quality Digest

For centuries, medical procedures, prescriptions, and other medical interventions have been based largely on experience—what is known about a set of symptoms. The doctor looks at those symptoms, tests you in various ways (blood tests, X-rays, MRIs), and interprets the results based on experience with past patients or what is widely known in the medical community. Then she prescribes a treatment. There are two problems with this.

First, diagnosis relies on the doctor’s or medical profession’s interpretation of an examination and test results. Second, treatments themselves target populations, not people: This is the treatment that worked for most people in the past, so this treatment should work for you.

This isn’t to bad-mouth the medical or pharma community. But medicine has been, and still is, essentially statistical in nature. It’s based on populations, not individuals. That has been the most cost-effective way to treat the most people as efficiently as possible. It hasn’t been feasible, either technologically or, more important, in terms of time, to test every patient for every pathogen he or she might ever have been exposed to, or to personally interview every family member to understand the patient’s family health history.

By: Mike Richman

Great quality is pretty much the same everywhere, but the cost of poor quality varies from industry to industry. For example, it’s conceivable (but I hope not probable) that this article may turn out to be a real bomb, or worse, a complete snoozer. What’s the cost of that poor quality? To you, the reader, it will likely mean little except some lost time. For me, as the writer, the reputational hit could be considerable. To Quality Digest, as the publisher of the piece, the fallout could be even worse—lost readers and advertisers.

Now consider the risks of poor quality for a company operating within the life sciences sector. For these organizations, poor quality can lead to deadly outcomes for customers. That’s why U.S. Food and Drug Administration (FDA) agents won’t hesitate to shut down organizations that have been warned about faulty procedures and bad quality but can’t or won’t correct the situation.
