James Warren


Revolution by Design: The Materials Genome Initiative

Computation, theory, and experiment flourish in materials research platform

Published: Wednesday, August 31, 2016 - 15:26

Creating a new material has long been either an accident or a matter of trial and error. Steel, for instance, was developed over hundreds of years by people who didn’t know why what they were doing worked (or didn’t work). Generations of blacksmiths observed that iron forged in charcoal was stronger than iron that wasn’t, and iron that was forged in a very high-temperature, charcoal-fired furnace and rapidly cooled was even stronger, and so on.

Although we’re still learning things about steel, we now have all kinds of recipes we can use to make steels with different properties, depending on the application. However, those recipes took a lot of time, sweat, and toil to develop. Wouldn’t it be great if we could skip over all the trials and errors and design new materials from scratch with the exact properties we want?

Think about it. If Scotty had been able to access new materials in Star Trek IV: The Voyage Home, he wouldn’t have had to trade the formula for transparent aluminum to get the acrylic glass he needed to build a tank large enough to store a pair of humpback whales in the cargo hold (and thus save the Earth from being destroyed by whale-loving aliens).

No, he could have just told the ship’s computer that he needed a transparent, watertight material that was stronger than tritanium. The computer, having a detailed database of the properties of various elements and compounds, could predict what properties various combinations of those constituents would have and develop a formula for a new material with the desired properties. (Then it would have just been a matter of gathering the raw materials, so maybe he would have had to trade the formula for transparent aluminum after all.)

Admittedly, all that is probably some ways off. In the shorter term, the Materials Genome Initiative is working to create what President Obama called a “materials innovation infrastructure.” In the five years since its inception (the initiative just celebrated its fifth anniversary at the White House a few weeks ago), we’ve built an integrated platform that brings together computation, theory, and experiment, and enables the broad sharing of materials data and software.

Since the launch of the Materials Genome Initiative in 2011, the federal government has invested more than $250 million in new R&D and innovation infrastructure to anchor the use of advanced materials in existing and emerging U.S. industrial sectors.

Researchers across the country in government, university, and industry labs can use these tools to design and run smarter simulations, speeding the development and deployment of materials with whatever properties they seek. Some of the applications being pursued include heat-sinking electronics, long-lived batteries, better body armor, and auto bodies strong enough to save your life and light enough to save you gas.

Ethylene (on the left in gray) is usually contaminated with acetylene (blue), which can ruin the process that creates the polyethylene used in most plastic. SIFSIX metal-organic frameworks (center) can capture the acetylene efficiently, leaving pure ethylene (right). Credit: Zhou/NIST

Starting out

When I first came to NIST, I was fortunate to find myself among a number of other National Research Council postdoctoral researchers who shared an interest in computational materials science, a field that was just beginning to take off. Indeed, NIST was a great place to do such work, with NIST senior fellow John W. Cahn (as well as many other outstanding scientists) creating an intellectual environment that valued serious theoretical work that could have real engineering impact.

Some of my early work predicted how microstructures form as complex liquid mixtures cool and solidify. These four sets of simulation images show the striking similarities between crystals “grown” under different conditions. As greater amounts of impurities are added, the crystal grows more and more randomly. Credit: J. Warren, B. Boettinger/NIST

In 1994, my colleagues and I co-founded the NIST Center for Theoretical and Computational Materials Science. Empowered by the newly created World Wide Web, our mission was to provide the basic data and computational tools that industry needed to “get over the hump” and begin their own materials research. Now in its 22nd year, the center is a source for numerous software tools in wide use by both academic and industrial materials researchers.

While the Center for Theoretical and Computational Materials Science was, and is, something we’re proud of, it was tough starting out. We learned that scientists simply didn’t have much incentive to publish their simulation code; they’d much rather publish the findings they made with the code. What was not widely accepted was that, in general, a well-written simulation code can be at least as valuable as, if not more valuable than, a paper in a journal. After all, a paper furthers knowledge in one area, but other researchers can pick up and adapt simulation software to study all kinds of different things.

Although that sounds great in theory, in practice, scientists can be a competitive bunch. Sharing code that you worked so hard to develop risks giving your competitors a leg up, and, what’s worse, they often wouldn’t even give you credit when they used it. Thankfully, scientists and their patrons have begun asking that data and code be shared, and demanding that data and code be cited when used by others.

Changing the culture of scientific prestige is not easy, but it is now happening, and of course this kind of data/code sharing is an integral part of the Materials Genome Initiative’s DNA. Luckily, NIST management valued and continues to value our output of simulation codes as a class “A” research product.

But we were ahead of the curve back then.

Proving our mettle

If we jump forward 15 years to 2010, we find that code-sharing had become commonplace (in the open-source community, for example) but was still not particularly encouraged or rewarded within the wider research community, because the risks were still largely viewed as outweighing the rewards. Fortuitously, that year the Office of Science and Technology Policy put out a call to federal researchers to participate in formulating a materials modeling and simulation initiative. Chuck Romine and I were selected to serve on the National Science and Technology Council committee that drafted the white paper that resulted in the creation of the Materials Genome Initiative. Subsequently, I became, and remain, the executive secretary of the Materials Genome Initiative subcommittee.

In the nearly five years since the initiative’s inception, it has grown to more than $25 million in annual funding, including the Chicago-based Center for Hierarchical Materials Design, a NIST Center of Excellence. The Center for Hierarchical Materials Design specifically supports NIST’s Materials Genome Initiative activities to make the exchange of data and models easier, ensure the quality of data and models, and develop a conduit through which new methods and measurements can flow from the infrastructure as it matures.

Simultaneously, NIST has established its Office of Data and Informatics to focus on collecting the highest quality reference data, and developing and disseminating best practices in data science. At this time, we are deploying a materials data repository, a materials data curation system, and to make discovery easier, a materials resource registry.

A new paradigm

One of the ways we know we’re making scientific progress is when we can extend the frontiers of an existing model to describe something new. Even better is when an old model falls in favor of an improved description of reality. The proliferation of data promises to usher in a “fourth paradigm” of data-driven materials science where the discovery of new applications and new materials will become a daily occurrence.
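To give a flavor of what “data-driven” discovery means in practice, here is a minimal, purely illustrative sketch of property-based screening: given a database of candidate materials and their measured or computed properties, filter for the combination you need (much like asking the ship’s computer for a transparent material above a strength threshold). The material names and property values below are invented for illustration, not real data.

```python
# Hypothetical sketch of property-based materials screening.
# All records and values below are invented for illustration.

candidates = [
    {"name": "alloy-A", "transparent": False, "strength_mpa": 1200},
    {"name": "polymer-B", "transparent": True, "strength_mpa": 80},
    {"name": "ceramic-C", "transparent": True, "strength_mpa": 900},
]

def screen(materials, min_strength):
    """Return names of transparent materials at or above a strength threshold."""
    return [
        m["name"]
        for m in materials
        if m["transparent"] and m["strength_mpa"] >= min_strength
    ]

print(screen(candidates, 500))
```

Real materials-data infrastructures do the same thing at scale, querying curated repositories of computed and experimental properties rather than a hand-written list.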

Imagine a roof that could repair its own hail damage. DARPA, a Materials Genome Initiative member agency, is working to develop design tools and methods for creating programmable, self-healing, living, building materials. Credit: DARPA

By integrating experiment and computation more tightly, and organizing and making the results of each more readily available, the Materials Genome Initiative is poised to be an integral part of this new paradigm.

It’s an exciting time to be a materials scientist, and I’m thrilled to be a part of the coming scientific revolution in how we discover new materials. Now, if you’ll excuse me, I need to go tinker with my tritanium recipe.

Published first on NIST's Taking Measure blog.


About The Author

James Warren

James Warren is the technical program director for materials genomics at the National Institute of Standards and Technology (NIST). He works with a governmentwide team to build out the materials innovation infrastructure needed to realize the goals of the Materials Genome Initiative: a multi-agency initiative designed to create a new era of policy, resources, and infrastructure that support U.S. institutions in the effort to discover, design, develop, and deploy advanced materials twice as fast, at a fraction of the cost. Warren is also a co-founder and current director of the NIST Center for Theoretical and Computational Materials Science.