
By: NIST

Artificial intelligence (AI) promises to grow the economy and improve our lives, but along with these benefits, it also brings new risks that society is grappling with. How can we be sure this new technology is not just innovative and helpful, but also trustworthy, unbiased, and resilient in the face of attack? We talked with NIST’s Information Technology Lab director Chuck Romine to learn how measurement science can help provide answers.

How would you define AI? How is it different from regular computing?

One of the challenges with defining AI is that if you put 10 people in a room, you get 11 different definitions. It’s a moving target. We haven’t converged yet on exactly what the definition is, but I think NIST can play an important role here. What we can’t do, and what we never do, is go off in a room and think deep thoughts and say we have the definition. We engage the community.


By: Peter Dizikes

Given the complexities of healthcare, do basic statistics used to rank hospitals really work well? A study co-authored by MIT economists indicates that some fundamental metrics do, in fact, provide real insight about hospital quality.

“The results suggest a substantial improvement in health if you go to a hospital where the quality scores are higher,” says Joseph Doyle, an MIT economist and co-author of a new paper detailing the study’s results.

The study was designed to work around a difficult problem in evaluating hospital quality: Some high-performing hospitals may receive an above-average number of very sick patients. Accepting those difficult cases could, on the surface, worsen the aggregate outcomes of a given hospital’s patients and make such hospitals seem less effective than they are.

However, the scholars found a way to study equivalent pools of patients, allowing them to judge the hospitals on level terms. Overall, the study shows that when patient sickness levels are accounted for, hospitals that score well on quality measures have 30-day readmission rates that are 15 percent lower than those of a set of lesser-rated hospitals, and 30-day mortality rates that are 17 percent lower.
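To see why comparing equivalent pools of patients matters, here is a minimal illustrative sketch in Python. The numbers are entirely invented, and the simple severity stratification below is not the study's actual methodology; it only shows how a hospital that admits a sicker mix of patients can look worse on crude rates while doing as well or better within comparable groups.

```python
# Illustrative only: invented numbers, and a simple severity stratification
# that is not the study's actual methodology.
# Each hospital maps severity -> (admissions, 30-day readmissions).
hospital_a = {"low": (800, 80), "high": (200, 70)}   # mostly low-severity patients
hospital_b = {"low": (200, 16), "high": (800, 240)}  # mostly high-severity patients

def crude_rate(h):
    admissions = sum(a for a, _ in h.values())
    readmissions = sum(r for _, r in h.values())
    return readmissions / admissions

def stratified_rates(h):
    return {sev: r / a for sev, (a, r) in h.items()}

print("Crude rates:        A = %4.1f%%  B = %4.1f%%"
      % (100 * crude_rate(hospital_a), 100 * crude_rate(hospital_b)))
for sev in ("low", "high"):
    print("%-4s severity rate: A = %4.1f%%  B = %4.1f%%"
          % (sev, 100 * stratified_rates(hospital_a)[sev],
             100 * stratified_rates(hospital_b)[sev]))
# Hospital B looks worse on the crude rate (25.6% vs. 15.0%) only because it
# admits a sicker mix; within each severity group it actually does better.
```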


By: Paavo Käkelä

After two decades of offshoring production to low-cost countries, manufacturers are now struggling with rapidly rising wages and the side effects of cheap production. The question that industries are asking today is: Do we continue offshoring, or should we consider reshoring?

The right answer, according to Paavo Käkelä, CEO of EID Robotics, which provides modular microfactory systems, is that manufacturers should transition their operations to rightshoring.

During the 1990s, U.S. manufacturers were sold on the lower cost of Asian labor, and the global offshoring boom in Asia began. Around the turn of the millennium, offshoring surged, and the growth trend continued until 2010, the year U.S. domestic-manufacturing employment reached an all-time low.


By: Jon Speer

Believe it or not, paper is very expensive. Although the going rate for a ream of standard copy paper is only about 10 bucks, the expense of relying on paper for your medical device quality management system is downright outrageous.

Some medical device manufacturers have recognized how expensive paper really can be and rely instead on spreadsheets and documents, so-called “digital paper,” only to realize that this variation of paper is just as expensive, if not more so, because of the false sense of security digital file systems can give.

If you’re still relying on these outdated systems, let’s consider three key ways paper ends up costing significantly more than $10 per ream.

Loss of company valuation

We had an eye-opening conversation with Ronny Bracken, executive and principal at Paladin Biomedical Consultants, whose career in medical device research and development spans more than two decades. In addition to helping his clients with all manner of regulatory and engineering services, Bracken invests in early-stage medical device companies.


By: Anne Trafton

After a patient has a heart attack or stroke, doctors often use risk models to help guide their treatment. These models can calculate a patient’s risk of dying based on factors such as the patient’s age, symptoms, and other characteristics.
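As a purely illustrative sketch (synthetic data and made-up coefficients, not any published clinical risk score), the Python example below shows the basic shape of such a model: a logistic regression that maps a few patient characteristics to an estimated probability of an adverse outcome.

```python
# Minimal sketch of a clinical-style risk model: logistic regression mapping
# patient features to a probability of an adverse outcome.
# Synthetic data and coefficients; not any published risk score.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
age = rng.normal(65, 12, n)              # years
systolic_bp = rng.normal(130, 20, n)     # mmHg
prior_mi = rng.integers(0, 2, n)         # prior heart attack (0/1)

# Synthetic "true" risk, used only to label the toy data
logit = -12 + 0.12 * age + 0.02 * systolic_bp + 0.8 * prior_mi
outcome = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, systolic_bp, prior_mi])
model = LogisticRegression(max_iter=1000).fit(X, outcome)

# Estimated risk for a hypothetical new patient: 72 years old, BP 145, prior MI
new_patient = np.array([[72, 145, 1]])
print("Estimated risk of adverse outcome: %.1f%%"
      % (100 * model.predict_proba(new_patient)[0, 1]))
```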

While these models are useful in most cases, they do not make accurate predictions for many patients, which can lead doctors to choose ineffective or unnecessarily risky treatments.

“Every risk model is evaluated on some dataset of patients, and even if it has high accuracy, it is never 100-percent accurate in practice,” says Collin Stultz, a professor of electrical engineering and computer science at MIT and a cardiologist at Massachusetts General Hospital. “There are going to be some patients for which the model will get the wrong answer, and that can be disastrous.”

Stultz and his colleagues from MIT, IBM Research, and the University of Massachusetts Medical School have now developed a method that allows them to determine whether a particular model’s results can be trusted for a given patient. This could help guide doctors to choose better treatments for those patients, the researchers say.


By: Anthony Veal

When Microsoft gave its 2,300 employees in Japan five Fridays off in a row, it found productivity jumped 40 percent.

When financial services company Perpetual Guardian in New Zealand trialed eight Fridays off in a row, its 240 staff reported feeling more committed, stimulated, and empowered.

Around the world there’s renewed interest in reducing the standard working week. But a question arises: Is instituting the four-day week, while retaining the eight-hour workday, the best way to reduce working hours?


Figure: Perpetual Guardian trial outcomes, as measured by researchers from the University of Auckland and Auckland University of Technology. Source: 4dayweek.com, CC BY-SA


By: Tom Taormina

In part one of this series, I said that I want to help my colleagues use their ISO 9001 implementation as a profit center and turn risk-based thinking into risk avoidance. To do this, I will share a set of tools that help evolve quality management into business management.

These tools include:
• Evolving the requirements of ISO 9001's Section 4 from merely defining the context of the organization to working with senior management to create and implement a shared vision, mission, and values, and to make them a cultural imperative
• Redefining Section 5 to include roles and responsibilities for everyone in the organization that are measurable and inextricably tied to the key business success goals and metrics
• Including in Section 6 the tools and culture of risk avoidance
• Evolving Section 7 from support to an outcome-based, risk-and-reward culture
• Expanding the scope of Section 8 into a holistic business management system
• Redefining Section 9 from performance evaluation to an enterprisewide culture of individual and team accountability
• Expanding Section 10 from continual improvement to business excellence


By: Phanish Puranam

Machine learning, the latest incarnation of artificial intelligence (AI), works by detecting complex patterns in past data and using them to predict future data. Since almost all business decisions ultimately rely on predictions (about profits, employee performance, costs, regulation, etc.), it would seem obvious that machine learning (ML) could be useful whenever “big” data are available to support business decisions. But that isn’t quite right.

The reality in most organizations is that data may be captured but they are stored haphazardly. Their quality is uneven, and integrating them is problematic because they sit in disparate locations and jurisdictions. But even when data are cleaned up and stored properly, they’re not always appropriate for the questions or decisions that management has in mind. So, how do you know whether applying predictive analytics through AI techniques to a particular business problem is worthwhile? Although every organization and context is different, here are five general principles that should be useful in answering that question.
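For concreteness, here is a minimal sketch of the "learn from past data, predict future data" pattern described above. The dataset, column names, and churn target are all hypothetical; the point is only that the model is fit on historical records and then judged by how well its patterns hold up on later, unseen records.

```python
# Minimal sketch of the "learn from past, predict future" pattern on a
# synthetic tabular dataset; features and the churn target are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 4000
tenure_months = rng.integers(1, 60, n)
monthly_spend = rng.gamma(2.0, 50.0, n)
support_tickets = rng.poisson(1.5, n)

# Synthetic target: short-tenure customers with many tickets churn more often
p_churn = 1 / (1 + np.exp(0.05 * tenure_months - 0.6 * support_tickets + 1.0))
churned = rng.random(n) < p_churn

X = np.column_stack([tenure_months, monthly_spend, support_tickets])

# "Past" records train the model; later "future" records test whether the
# learned patterns generalize to data the model has not seen.
split = 3000
model = GradientBoostingClassifier().fit(X[:split], churned[:split])
auc = roc_auc_score(churned[split:], model.predict_proba(X[split:])[:, 1])
print("Out-of-sample AUC: %.2f" % auc)
```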


By: Jennifer Lauren Lee

3D printing of metal objects is a booming industry, with the market for products and services worth more than an estimated $2.3 billion in 2015, nearly fivefold growth since 2010, according to Wohlers Report 2016. For this type of manufacturing, a metal part is built up successively, layer by layer, over minutes or hours. Sometimes thousands of layers are added together to make a single piece, which is why the process is conventionally referred to as “additive manufacturing” (AM). In commercial settings, 3D printers that create functional parts, often metal, are referred to as “additive manufacturing machines.” The term “3D printing” usually refers to the process used to make plastic parts, one-off pieces, art pieces, or prototypes.
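As a rough illustration of where "thousands of layers" comes from, the back-of-envelope sketch below assumes a typical metal powder-bed layer thickness of about 30 micrometers; the figures are assumptions for illustration, not values from the article.

```python
# Back-of-envelope arithmetic with assumed, typical values (not from the article).
part_height_mm = 60.0        # height of a hypothetical metal part
layer_thickness_mm = 0.030   # ~30-micrometer layers, common in metal powder-bed fusion
layers = part_height_mm / layer_thickness_mm
print("Layers required: about %d" % layers)   # -> about 2,000 layers for one part
```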

Additive manufacturing machines are particularly handy for making objects with complex forms or geometry, or internal features like ducts or channels. They are becoming increasingly popular in the aerospace, automotive, medical, and technology industries, to make complex pieces such as fuel injector nozzles for engines or titanium bone implants for skull, hip, and other repairs.


By: Kelvin Lee

Biopharmaceutical manufacturing uses living cells to produce therapies that treat diseases like cancer, diabetes, and autoimmune disorders. Manufacturing medicine using biology presents different challenges from the traditional chemical manufacturing processes that stamp out identical pressed pills.

Biomanufacturing processes are hard to control, and the products are difficult to define as “identical” from batch to batch. Despite these challenges, biopharmaceuticals are critical to public health because their advantages significantly outweigh the difficulties. Scientific understanding of diseases, and the success of biologically manufactured therapies in treating them, have increased dramatically. But it can take a decade to go from design to full production of a biopharmaceutical, which is not fast enough to meet the needs of all patients or to beat competition from overseas.
