It is estimated that around 70 percent of the energy generated worldwide is discarded as waste heat.
If scientists could more accurately predict the transfer of heat through semiconductors and insulators, they could design more efficient power generation systems, but modeling the thermal properties of materials can be extremely difficult.
The problem stems from phonons, the quasiparticles that carry heat through a crystal: a material’s thermal properties depend in part on its phonon dispersion relation, which is notoriously difficult to obtain, let alone use when designing a system.
A team of researchers at MIT and elsewhere has tackled this challenge by rethinking the problem from the ground up. The result is a new machine learning framework that can predict phonon dispersion relations up to 1,000 times faster than other AI-based techniques, and with similar or better accuracy—potentially as much as a million times faster than traditional non-AI-based approaches.
The method could help engineers design energy generation systems that produce more power more efficiently, and it could also be used to develop more efficient microelectronics, as thermal management is a major bottleneck in making electronic devices faster.
“Phonons are responsible for thermal loss, but their characterization is notoriously difficult both computationally and experimentally,” says Mingda Li, associate professor of nuclear science and engineering and senior author of a paper on the technique.
In addition to Li, the paper’s authors include Ryotaro Okabe, a graduate student in chemistry; Abhijatmedhi Chotrattanapituk, a graduate student in electrical engineering and computer science; Tommi Jaakkola, the Thomas Siebel Professor of Electrical Engineering and Computer Science at MIT; and co-authors from MIT, Argonne National Laboratory, Harvard University, the University of South Carolina, Emory University, the University of California, Santa Barbara, and Oak Ridge National Laboratory. The research appears in Nature Computational Science.
Phonon prediction
Heat-carrying phonons are difficult to predict because they span such a wide frequency range, and the particles interact with one another and travel at different speeds.
The phonon dispersion relation of a material is the relationship between the energy and momentum of phonons in its crystal structure. Researchers have tried to predict phonon dispersion relations using machine learning for years, but the models have run into difficulties because they require so many high-precision calculations.
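To make the idea of a dispersion relation concrete, the textbook example is a one-dimensional chain of identical atoms connected by springs, where the phonon frequency is a simple closed-form function of the wavevector. The short Python sketch below evaluates that curve; the spring constant, mass, and lattice spacing are arbitrary illustrative values, not numbers from the study.

```python
import numpy as np

# Textbook 1D monatomic chain: atoms of mass m coupled by springs of
# stiffness K with lattice spacing a. The dispersion relation is
#     omega(k) = 2 * sqrt(K / m) * |sin(k * a / 2)|
# i.e. a single curve relating phonon frequency (energy) to wavevector (momentum).
K = 10.0   # spring constant (arbitrary units, illustrative)
m = 1.0    # atomic mass (arbitrary units)
a = 1.0    # lattice spacing

k = np.linspace(-np.pi / a, np.pi / a, 201)   # wavevectors across the first Brillouin zone
omega = 2.0 * np.sqrt(K / m) * np.abs(np.sin(k * a / 2.0))

for kv, w in zip(k[::50], omega[::50]):
    print(f"k = {kv:+.3f}, omega = {w:.3f}")
```

In a real three-dimensional crystal with many atoms per unit cell, there are many such branches rather than one curve, which is part of why the full dispersion relation is so expensive to compute.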
“With 100 CPUs and a few weeks, you could probably calculate the phonon dispersion relations for one material, and the whole community really wants a way to do this more efficiently,” Okabe says.
A machine-learning model that scientists often use for these calculations is known as a graph neural network (GNN). A GNN converts a material’s atomic structure into a crystal graph, consisting of nodes that represent atoms connected by edges that represent the bonds between them.
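As a rough sketch of what that conversion can look like (not the authors’ code; the atom coordinates and the bonding cutoff below are invented for illustration), a crystal graph can be built by treating each atom as a node and adding an edge between any pair of atoms closer than a distance cutoff:

```python
import numpy as np

def build_crystal_graph(positions, cutoff=3.0):
    """Build a simple graph from atomic positions: nodes are atoms,
    edges connect pairs of atoms closer than `cutoff` (in angstroms).
    Real GNN pipelines also handle periodic boundary conditions and
    encode chemical species; this sketch omits both for brevity."""
    positions = np.asarray(positions)
    n_atoms = len(positions)
    edges = []
    for i in range(n_atoms):
        for j in range(i + 1, n_atoms):
            if np.linalg.norm(positions[i] - positions[j]) < cutoff:
                edges.append((i, j))
                edges.append((j, i))  # store both directions for message passing
    return {"num_nodes": n_atoms, "edges": edges}

# Toy example: four atoms at made-up coordinates (angstroms)
graph = build_crystal_graph([[0, 0, 0], [1.5, 0, 0], [0, 1.5, 0], [5, 5, 5]])
print(graph)   # the isolated atom at (5, 5, 5) ends up with no edges
```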
Although GNNs are well suited to computing many quantities, such as magnetization and electric polarization, they are not flexible enough to efficiently predict an extremely high-dimensional quantity like the phonon dispersion relation: atoms can vibrate along the X, Y, and Z axes, which makes phonon momentum space difficult to model with a fixed graph structure.
To gain the necessary flexibility, Li and his collaborators came up with virtual nodes.
They created what they call a Virtual Node Graph Neural Network (VGNN) by adding a set of flexible virtual nodes to a fixed crystal structure to represent phonons. The virtual nodes allow the output of the neural network to be resized so that it is no longer limited by a fixed crystal structure.
Virtual nodes are connected to the graph in such a way that they can only receive messages from real nodes. During a computation, when the model updates real nodes, the virtual nodes are also updated, but this does not affect the accuracy of the model.
“This method is very efficient in coding; we just generate a few more nodes in the GNN. Their physical location doesn’t matter, and the real nodes don’t even know that the virtual nodes exist,” says Chotrattanapituk.
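A minimal sketch of that idea, written in plain NumPy rather than the authors’ VGNN implementation: real nodes exchange messages with one another along the graph edges as usual, while each virtual node only aggregates information from the real nodes and sends nothing back, so the real-node update is unaffected by how many virtual nodes are attached. The sizes and the pooling scheme below are illustrative assumptions.

```python
import numpy as np

def message_passing_step(real_feats, virtual_feats, edges, W_real, W_virtual):
    """One illustrative update step.
    real_feats:    (n_real, d) features of real (atom) nodes
    virtual_feats: (n_virtual, d) features of virtual (phonon) nodes
    edges:         list of (src, dst) pairs among real nodes
    Virtual nodes receive messages from the real nodes but send none,
    so the real-node update below does not depend on them."""
    # Real -> real: aggregate neighbor features along graph edges.
    agg = np.zeros_like(real_feats)
    for src, dst in edges:
        agg[dst] += real_feats[src]
    new_real = np.tanh((real_feats + agg) @ W_real)

    # Real -> virtual only: each virtual node pools over all real nodes.
    pooled = new_real.mean(axis=0, keepdims=True)          # shape (1, d)
    new_virtual = np.tanh((virtual_feats + pooled) @ W_virtual)

    return new_real, new_virtual

# Toy sizes: 4 atoms, 6 virtual nodes, feature dimension 8 (all made up).
rng = np.random.default_rng(0)
real = rng.normal(size=(4, 8))
virtual = rng.normal(size=(6, 8))
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
W_r, W_v = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))
real, virtual = message_passing_step(real, virtual, edges, W_r, W_v)
print(real.shape, virtual.shape)   # (4, 8) (6, 8)
```

Because the number of virtual nodes is chosen independently of the number of atoms, the size of the predicted output can be set freely, which is exactly the flexibility a fixed crystal graph lacks.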
Eliminating complexity
Because a VGNN uses virtual nodes to represent phonons, it can skip many of the complex calculations involved in estimating phonon dispersion relations, which makes the method more efficient than a standard GNN.
The researchers proposed three different versions of VGNN of increasing complexity, each of which can be used to predict phonons directly from a material’s atomic coordinates.
The approach has the flexibility to rapidly model high-dimensional properties and can therefore be used to estimate phonon dispersion relations in alloy systems. These complex combinations of metals and nonmetals are particularly difficult to model using traditional approaches.
The researchers also found that VGNN offered slightly higher accuracy in predicting the heat capacity of materials: in some cases, their technique reduced prediction errors by two orders of magnitude.
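The connection between the two quantities is standard lattice dynamics: once the phonon frequencies are known, the harmonic heat capacity follows by summing each mode’s Bose-Einstein contribution, so a more accurate predicted dispersion translates directly into a more accurate heat capacity. The sketch below shows that bookkeeping step; the frequency list is invented for illustration.

```python
import numpy as np

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
KB = 1.380649e-23        # Boltzmann constant, J/K

def heat_capacity(frequencies_thz, temperature=300.0):
    """Harmonic lattice heat capacity (J/K) from a list of phonon
    frequencies in THz, summing each mode's Bose-Einstein contribution:
        C = sum_i kB * x_i**2 * exp(x_i) / (exp(x_i) - 1)**2,
        x_i = hbar * omega_i / (kB * T)
    """
    omega = 2.0 * np.pi * np.asarray(frequencies_thz) * 1e12   # rad/s
    x = HBAR * omega / (KB * temperature)
    return float(np.sum(KB * x**2 * np.exp(x) / (np.expm1(x) ** 2)))

# Invented frequencies (THz) standing in for a predicted dispersion sampled on a k-grid.
print(heat_capacity([1.0, 2.5, 4.0, 7.5], temperature=300.0))
```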
Li says that with VGNN, a personal computer can calculate the phonon dispersion relations for thousands of materials in just a few seconds.
This efficiency could allow scientists to explore a larger space when searching for materials with specific thermal properties, such as superior heat storage, energy conversion or superconductivity.
Moreover, the virtual node technique is not limited to phonons but can also be used to predict challenging optical and magnetic properties.
In the future, the researchers want to refine the technique so the virtual nodes are more sensitive to small changes that can affect a material’s phonon structure.
“Researchers are so used to representing atoms using graph nodes, but we can rethink that. A graph node can be anything, and virtual nodes are a very general approach that can be used to predict many high-dimensional quantities,” Li says.
“The authors’ innovative approach significantly enhances graph neural network descriptions of solids by incorporating key physics elements through virtual nodes, for example informing wavevector-dependent band structures and dynamical matrices,” says Olivier Delaire, an associate professor in the Thomas Lord Department of Mechanical Engineering and Materials Science at Duke University, who was not involved in the work. “The level of acceleration in predicting complex phonon properties is remarkable, orders of magnitude faster than state-of-the-art universal machine learning interatomic potentials. Remarkably, the advanced neural net captures fine features and obeys the laws of physics. There is great potential to extend the model to describe other important materials properties, such as electronic, optical, and magnetic spectra and band structures.”
This research is supported by the U.S. Department of Energy, the National Science Foundation, a Mathworks Fellowship, a Sow-Hsin Chen Fellowship, the Harvard Quantum Initiative, and Oak Ridge National Laboratory.