By: NIST


A research team has found that a method commonly used to skirt one of metal 3D printing’s biggest problems may be far from a silver bullet.

For manufacturers, 3D printing, or additive manufacturing, provides a means of building complex-shaped parts that are more durable, lighter, and more environmentally friendly than those made through traditional methods. The industry is burgeoning, with some predicting it will double in size every three years, but growth often goes hand in hand with growing pains.

Residual stress, a by-product of the repeated heating and cooling inherent to metal printing processes, can introduce defects into parts and, in some cases, damage printers. To better understand how residual stress forms, and how it might be curbed, researchers at the National Institute of Standards and Technology (NIST), Lawrence Livermore National Laboratory, Los Alamos National Laboratory, and other institutions closely examined the effects of different printing patterns in titanium alloy parts made with a common laser-based method.


By: Lee Seok Hwai

As a young man of 20 in his first job at a state-owned enterprise in China, Guoli Chen found senior management fascinating, but not in a good way. His boss’s boss did very little—unless one counts reading newspapers, drinking tea, and gossiping as work. “I wondered whether anyone could replace him without affecting the [organization’s] overall performance,” recalls Chen, now a professor of strategy at INSEAD.

He didn’t stick around to find out. After two years, Chen moved on. His second job couldn’t have been more different, in culture as well as in the lessons he learned. Working at an investment bank, Chen observed how the company’s venture capital arm picked firms chiefly on the strength of the founding team, especially the chief executive. “Given the uncertainty of the [firms’] business potential… the VC literally bet the success of their investments on the individual,” Chen says in an INSEAD Knowledge podcast.


By: Adam J. Fleisher

In an essay titled “The end of artefacts,” Nobel laureate and National Institute of Standards and Technology (NIST) fellow William D. Phillips details how scientists came to realize the original vision of the metric system, or the International System of Units (SI)—a system of units “for all times, for all people.” With the redefinition of the kilogram in 2019, the new SI was rightly celebrated as a unifying achievement toward the democratization of science, with NIST and its international partners having collectively led the charge.


By: Ryan McKenna

To date, this series has focused on relatively simple data analyses, such as learning one summary statistic about our data at a time. In reality, we’re often interested in slightly more sophisticated analyses that let us learn multiple trends and takeaways at once and paint a richer picture of our data.

In this article, we will look at answering a collection of counting queries—which we call a workload—under differential privacy. This has been the subject of considerable research effort because it captures several interesting and important statistical tasks. By analyzing the specific workload queries carefully, we can design very effective mechanisms for this task that achieve low error.
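As a minimal illustration of the setup (not one of the optimized mechanisms the article goes on to design), a workload of counting queries can be answered naively by splitting the total privacy budget ε evenly across the queries and adding Laplace noise to each count. The function and example data below are hypothetical, chosen only to make the idea concrete:

```python
import numpy as np

def answer_workload(data, workload, epsilon, rng=None):
    """Answer a workload of counting queries with the Laplace mechanism.

    Each counting query (a predicate over records) has sensitivity 1:
    adding or removing one record changes its count by at most 1. Under
    basic sequential composition, spending epsilon/len(workload) per
    query keeps the total privacy cost at epsilon.
    """
    if rng is None:
        rng = np.random.default_rng()
    eps_per_query = epsilon / len(workload)
    answers = []
    for predicate in workload:
        true_count = sum(1 for record in data if predicate(record))
        # Laplace noise with scale = sensitivity / per-query budget
        noisy = true_count + rng.laplace(loc=0.0, scale=1.0 / eps_per_query)
        answers.append(noisy)
    return answers

# Hypothetical example: ages of individuals, three counting queries.
ages = [23, 35, 47, 52, 61, 38, 29, 44]
workload = [lambda a: a < 40, lambda a: a >= 40, lambda a: a >= 60]
print(answer_workload(ages, workload, epsilon=1.0))
```

Note that the error of this naive approach grows with the number of queries, since each one gets a smaller slice of the budget; exploiting structure in the workload, as discussed above, is what lets more careful mechanisms do better.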


By: Christopher Allan Smith

This series is about planning for the worst that can face us.

Its jumping-off point is the National Institute of Standards and Technology publication, “A Case Study of the Camp Fire—Fire Progression Timeline,” an epic and thorough study of the wildfire that changed the lives of my family, friends, and some fellow Quality Digest associates in November 2018. That fire razed most of the communities on the Paradise Ridge in Butte County, California, destroyed about 19,000 structures—95 percent of the residences in Paradise—and killed 85 people.

I have come to see my part in my community’s recovery as voicing the lessons we learned—literally taking the awful and searing things we learned that are of some use before, during, and after a disaster—and passing them on to other communities so they may face their trials with some better measure of success and safety.


By: Jason Spera

In a customer-centered world, meeting customers’ needs is more demanding and business-critical than ever. At the same time, manufacturers struggle to reduce operating costs as margins compress and the competitive landscape intensifies. But the pressure to “choose” between reducing costs and delighting customers presents a false dichotomy; the two goals are not mutually exclusive.

Best-in-class manufacturers recognize there’s no trade-off; they take a holistic approach to quality management that allows them to excel in both arenas. A quality-driven mindset across every layer of an enterprise strategically enhances process visibility and compliance to enable improvements in both cost and customer satisfaction.

The trade-off mindset

For many discrete manufacturers, delivering more robust customizations at a reduced margin or choosing to settle for less than zero-defect quality in favor of cutting operating costs is part of the old trade-off mindset.


By: Caroline Zimmerman

With big data and artificial intelligence (AI) transforming business, it’s almost certain that every executive will need to leverage these technologies at some point to advance their organization—and their career. However, doing so carries a heavy intimidation factor for most leaders, often exacerbated by skill-heavy job descriptions for leadership roles related to data, analytics, or AI. Yet many of these descriptions misunderstand what’s required to drive successful business outcomes using data and AI.

Although analytical thinking is certainly important, many traditional leadership skills are equally essential when undertaking AI/big data for the first time. It’s also critical to be comfortable with ambiguity, have the capacity to drive consensus among disparate players, and understand the levers of value to prioritize accordingly. Some of the most effective leaders in data and AI are those who think commercially while applying expertise. Moreover, today’s data-driven business leaders must be politicians and communicators, able to harness the potential of data and AI to drive revenue, efficiency gains, and innovation, while exerting influence and explaining the value they create.


By: Wade Schroeder

Medical-device usability testing and validation are critical tasks leading up to a medical device’s debut on the market. “Usability” looks at how the user interacts with your device and forms a key component of overall risk management and safety.

If there’s any “spoiler alert” to this article, it’s that human factors, including usability and validation, should not be put off to the last minute. The ultimate goal of early planning for usability testing and validation of your medical device is to ensure you’re building a safe and effective product for the end user.

Early considerations should evolve into formal procedures for usability testing and validation activities that live within your quality system. This will prevent you from having to scramble at the last minute or work retroactively once it’s time for your device submission.

This article provides an introductory look at usability testing and validation, and how to build these key components into your quality system.

Introduction to medical device usability

Usability research looks at the interactions between human operators and the medical device. Testing should involve anyone who plays a role in operating the device, from patients to clinicians to people responsible for sterilizing or maintaining it.


By: Lawrence Berkeley National Laboratory

Plastics are a part of nearly every product we use on a daily basis. The average person in the United States generates about 100 kg of plastic waste per year, most of which goes straight to a landfill. A team led by Corinne Scown, Brett Helms, Jay Keasling, and Kristin Persson at Lawrence Berkeley National Laboratory (Berkeley Lab) set out to change that.

Less than two years ago, Helms announced the invention of a new plastic that could tackle the waste crisis head on. Called poly(diketoenamine), or PDK, the material has all the convenient properties of traditional plastics while avoiding the environmental pitfalls, because unlike traditional plastics, PDKs can be recycled indefinitely with no loss in quality.

Now, the team has released a study that shows what could be accomplished if manufacturers began using PDKs on a large scale. The bottom line? PDK-based plastic could quickly become commercially competitive with conventional plastics, and the products will get less expensive and more sustainable as time goes on.


By: Alessandro Messina

A challenge with the latest generation of electric motors is optimizing component manufacturing for efficiency, quality, and cost.

Electric motors are a critical factor in the unprecedented global growth of e-mobility. The rapid, large-scale diffusion of electric vehicles places heightened expectations of component reliability on manufacturers. This in turn raises the quality-control and process-control requirements across the production chain.

To meet the high-quality requirements for an e-motor, both in mobile and stationary use, measurement and testing technology must be applied systematically during the production process: first to meet the demand for safety and performance, and second to shift production toward higher quality.
