Copyright © 2020 Albuquerque Journal
People have been using metals for thousands of years and, throughout that time, they have been looking for ways to improve them. The earliest metal used by humans was copper, more than 10,000 years ago. A few thousand years later, humans began experimenting with metallurgy and created bronze, a mixture of copper and tin. The Bronze Age gave way to the Iron Age and, eventually, people began using steel, aluminum, titanium and numerous other metals in everyday life.
Today, metals have become reliable enough that they are used in everything from simple tools, such as a key that opens a door, to far more complicated applications, such as the engine in a car. But as reliable as these materials have become, they can still fail, particularly in harsh conditions.
So what about the materials exposed to some of the harshest conditions imaginable, like those inside a nuclear reactor? Reactors work by sustained fission of uranium, splitting atoms at a slow and steady rate. To create steam, temperatures inside a reactor vessel can reach up to 572 degrees Fahrenheit. Inside the nuclear fuel itself, the temperature is even higher, up to 1,832 degrees Fahrenheit. During operation of the reactor, both the fuel and the inner wall are bombarded by large amounts of radiation resulting from nuclear reactions, which makes the environment even harsher for the materials.
So it is easy to understand why the materials inside reactors need to be especially durable. More durable materials can make reactors safer and more efficient. But how can these materials be improved?
Traditionally, this research is done through physical experiments, meaning scientists go into a lab, create a material, put it into a reactor and wait – up to a year in some cases. When the material comes out, it is radioactive and can be difficult to handle. This process can be complicated, time-consuming and expensive.
This is where supercomputers can help, allowing researchers to simulate molecular dynamics – basically, the movement of every single atom in a material. This provides fundamental insight into materials down to the atomic scale and shows where each atom is at every point in time. Ultimately, the goal of computational materials science is to use such simulations to shorten the time it takes to develop new materials by developing them from scratch on the computer. That way, researchers can get a sense of how a material would perform just from computer simulations, without having to go to the lab and try different possibilities.
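To give a flavor of what a molecular dynamics simulation does, here is a deliberately tiny sketch in Python – not EXAALT code, and far simpler than any production simulation. It tracks a handful of atoms interacting through a Lennard-Jones potential and advances their positions with velocity Verlet integration, the same basic time-stepping idea that real codes scale up to billions of atoms; every name and parameter here is a hypothetical simplification.

```python
import math

# Minimal molecular-dynamics sketch (illustrative only): a few atoms in a
# one-dimensional chain, interacting through a Lennard-Jones potential,
# advanced in time with velocity Verlet integration.

def lj_force(r, epsilon=1.0, sigma=1.0):
    """Repulsive-positive force magnitude for a pair of atoms at distance r."""
    sr6 = (sigma / r) ** 6
    return 24.0 * epsilon * (2.0 * sr6 * sr6 - sr6) / r

def step(positions, velocities, dt=0.001, mass=1.0):
    """Advance every atom by one velocity-Verlet time step."""
    n = len(positions)

    def forces(pos):
        f = [0.0] * n
        for i in range(n):
            for j in range(i + 1, n):
                r = pos[j] - pos[i]
                # Positive lj_force pushes the pair apart along the chain.
                fij = lj_force(abs(r)) * (1.0 if r > 0 else -1.0)
                f[i] -= fij
                f[j] += fij
        return f

    f_old = forces(positions)
    new_pos = [positions[i] + velocities[i] * dt + 0.5 * f_old[i] / mass * dt * dt
               for i in range(n)]
    f_new = forces(new_pos)
    new_vel = [velocities[i] + 0.5 * (f_old[i] + f_new[i]) / mass * dt
               for i in range(n)]
    return new_pos, new_vel

# Three atoms spaced near the Lennard-Jones equilibrium distance (~1.12 sigma)
# jiggle only slightly over 100 time steps.
pos, vel = [0.0, 1.12, 2.24], [0.0, 0.0, 0.0]
for _ in range(100):
    pos, vel = step(pos, vel)
```

The point of the sketch is the bottleneck it makes visible: the force loop touches every pair of atoms at every tiny time step, which is why trillion-atom, long-duration simulations demand supercomputers.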
However, molecular dynamics simulations are challenging to create. They can take weeks and considerable expense to produce: a simulation that tracks the movement of individual atoms requires significant computing power, and time is money on a supercomputer.
For example, on the largest computer systems, researchers can simulate around one trillion atoms. That may sound like a lot, but it actually corresponds to a cube of material roughly one millionth of a meter on each side. And perhaps the most stringent constraint is time scale. Most simulations cover less than one millionth of a second. That’s typically far too short to see the structure of the material evolve in reaction to such external factors as stress, pressure or temperature, as these changes tend to occur on the seconds-to-minutes timescale.
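A quick back-of-the-envelope check shows why a trillion atoms occupies so little material. Assuming a typical metal packs roughly 8.5 × 10²⁸ atoms per cubic meter (about the density of iron; the exact figure varies by material and is an assumption here, not from the article):

```python
# How big is a cube containing one trillion atoms of a typical metal?
atoms = 1e12           # one trillion atoms
density = 8.5e28       # atoms per cubic meter (assumed, roughly iron)

volume = atoms / density        # cubic meters of material
side = volume ** (1.0 / 3.0)    # edge length of the cube, in meters

print(f"cube side: {side * 1e6:.1f} micrometers")  # prints "cube side: 2.3 micrometers"
```

A couple of millionths of a meter on a side – smaller than many bacteria – which is why even the largest simulations capture only a speck of a real engineering component.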
So, even as researchers prepare for the arrival of the next generation of high-performance computers that will make one quintillion calculations per second, these limitations remain. These machines, dubbed exascale computers, will carry out billions of operations simultaneously, and using them efficiently will be no simple feat. Writing codes that can harness so much computing power requires fundamentally new ways to arrange the calculations, which often means rethinking algorithms from the ground up.
The Exascale Atomistic capability for Accuracy, Length, and Time project, or EXAALT – led by Los Alamos National Laboratory in collaboration with Sandia National Laboratories and the University of Tennessee, Knoxville – aims to address these problems. The goal of the project is to develop a new generation of algorithms that let researchers use very large computers in new and more flexible ways, choose the length, time and accuracy they need for a simulation, and run efficiently on supercomputers of any size.
Coupled with more powerful computers, EXAALT will enable a dramatic increase in the power of molecular dynamics simulations for materials, including potentially reaching an entire second of simulated time. This important milestone would propel molecular dynamics from basic research into engineering that affects day-to-day life.
The capabilities of EXAALT are meant to be general and could be applied to any hard material, not just the inside of a reactor. A broad range of applications could include developing new alloys for high-temperature turbines or jet engines, microelectronic components, or lightweight materials in cars. Consider that almost a ton of steel is used to build a car. Because humanity has literally thousands of years of experience with steel, researchers know how it is going to behave during a crash, for instance. The same cannot be said of new lightweight alternatives that could improve the gas mileage of future cars. EXAALT could help provide the answers.
Improving materials through computer simulations has long been a goal of the materials science community, and EXAALT is helping researchers achieve it.
Danny Perez is a staff scientist in the Theoretical Division of Los Alamos National Laboratory and part of the Department of Energy’s Exascale Computing Project. The EXAALT project is funded by the Department of Energy’s Exascale Computing Project.