It is estimated that approximately 70 percent of the energy generated worldwide ends up as waste heat.
If scientists could better predict how heat travels through semiconductors and insulators, they could design more efficient power-generating systems. However, the thermal properties of materials can be extremely difficult to model.
The problem comes from phonons, quasiparticles (quantized vibrations of a crystal lattice) that carry heat. Some of a material's thermal properties depend on a quantity called the phonon dispersion relation, which can be incredibly difficult to obtain, let alone use in system design.
A team of researchers from MIT and elsewhere tackled this challenge by rethinking the problem from scratch. The result of their work is a new machine learning framework that can predict phonon dispersion relations up to 1,000 times faster than other AI-based techniques, with comparable or even better accuracy. Compared to more traditional non-AI-based approaches, it could be a million times faster.
This method could help engineers design power generation systems that produce more energy more efficiently. It could also be used to develop more efficient microelectronics, as heat management remains a major hurdle to speeding up electronics.
“Phonons are the culprits of thermal loss, but obtaining their properties is notoriously difficult, either computationally or experimentally,” says Mingda Li, associate professor of nuclear science and engineering and senior author of a paper on the technique.
Li is joined on the paper by co-lead authors Ryotaro Okabe, a graduate student in chemistry, and Abhijatmedhi Chotrattanapituk, a graduate student in electrical engineering and computer science; Tommi Jaakkola, the Thomas Siebel Professor of Electrical Engineering and Computer Science at MIT; as well as others at MIT, Argonne National Laboratory, Harvard University, the University of South Carolina, Emory University, the University of California at Santa Barbara, and Oak Ridge National Laboratory. The research appears in Nature Computational Science.
Phonon prediction
Heat-carrying phonons are difficult to predict because they have an extremely wide frequency range and the particles interact and travel at different speeds.
The phonon dispersion relation of a material describes the relationship between the energy and the momentum of phonons in its crystal structure. For years, researchers have tried to predict phonon dispersion relations using machine learning, but there are so many high-precision calculations involved that the models get bogged down.
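To make the idea concrete, here is a minimal sketch of the textbook dispersion relation for a one-dimensional monatomic chain of atoms connected by springs, where the frequency omega depends on the wavevector k as omega(k) = 2*sqrt(K/m)*|sin(k*a/2)|. This toy model is far simpler than the three-dimensional crystals the paper targets, and the parameter names here are illustrative, not from the paper:

```python
import math

def monatomic_chain_dispersion(k, spring_const=1.0, mass=1.0, lattice_const=1.0):
    """Angular frequency omega(k) for a 1D monatomic chain:
    omega(k) = 2 * sqrt(K/m) * |sin(k*a/2)|."""
    return 2.0 * math.sqrt(spring_const / mass) * abs(math.sin(k * lattice_const / 2.0))

# Sample the single phonon branch across the first Brillouin zone,
# i.e., wavevectors k in [-pi/a, pi/a].
ks = [-math.pi + 2.0 * math.pi * i / 20 for i in range(21)]
branch = [monatomic_chain_dispersion(k) for k in ks]
```

Even in this one-atom, one-dimensional case, the dispersion is a full curve rather than a single number; in a real 3D crystal with many atoms per unit cell, it becomes many branches over a 3D momentum space, which is what makes it so expensive to compute.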
“If you have 100 CPUs and a few weeks, you can probably calculate the phonon dispersion relation for a material. The whole community wants a more efficient way to do that,” Okabe says.
The machine learning models that scientists typically use for these calculations are known as graph neural networks (GNNs). A GNN converts the atomic structure of a material into a crystalline graph comprising multiple nodes, representing atoms, connected by edges, which represent the interatomic bonding between atoms.
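A common way to build such a crystalline graph is to connect atoms that lie within some distance cutoff of one another. The sketch below is a generic illustration of that idea, not the paper's actual featurization; the cutoff value and the toy atomic positions are arbitrary:

```python
import math

def build_crystal_graph(atoms, cutoff=3.0):
    """Build a simple graph from atomic positions: each atom is a node,
    and an edge connects any pair of atoms closer than `cutoff` (in angstroms).
    `atoms` is a list of (element, (x, y, z)) tuples."""
    nodes = [element for element, _ in atoms]
    edges = []
    for i in range(len(atoms)):
        for j in range(i + 1, len(atoms)):
            (_, pos_i), (_, pos_j) = atoms[i], atoms[j]
            dist = math.dist(pos_i, pos_j)
            if dist <= cutoff:
                edges.append((i, j, dist))  # each edge carries the bond length
    return nodes, edges

# A toy three-atom fragment (illustrative coordinates, not a real unit cell).
atoms = [("Na", (0.0, 0.0, 0.0)), ("Cl", (2.8, 0.0, 0.0)), ("Na", (2.8, 2.8, 0.0))]
nodes, edges = build_crystal_graph(atoms)
```

A GNN then repeatedly passes messages along these edges so that each node's features come to reflect its chemical environment.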
While GNNs work well for calculating many quantities, such as magnetization or electric polarization, they are not flexible enough to efficiently predict an extremely high-dimensional quantity such as the phonon dispersion relation. Because phonons can travel around atoms in the X, Y, and Z axes, their momentum space is difficult to model with a fixed graph structure.
To gain the flexibility they needed, Li and his collaborators came up with virtual nodes.
They create what they call a virtual node graph neural network (VGNN) by adding a series of flexible virtual nodes to the fixed crystal structure to represent phonons. The virtual nodes allow the output of the neural network to vary in size, so it is not restricted by the fixed crystal structure.
Virtual nodes are connected to the graph in such a way that they can only receive messages from real nodes. While virtual nodes will be updated as the model updates real nodes during computation, they do not affect the accuracy of the model.
“The way we do it is very efficient in coding. You just generate a few more nodes in the overall neural network. The physical location doesn’t matter and the real nodes don’t even know that the virtual nodes are there,” Chotrattanapituk says.
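The one-way connection described above can be sketched with a toy message-passing step. This is a drastically simplified, scalar-feature illustration of the general idea, with hedged assumptions: the averaging update, the `mix` parameter, and the pooling over all real nodes are choices made for this sketch, not details from the paper:

```python
def vgnn_step(real_feats, virtual_feats, edges, mix=0.5):
    """One simplified message-passing step with virtual nodes (scalar features).

    Real nodes average messages from their graph neighbors, as in a plain GNN.
    Each virtual node receives the mean of all real-node features but never
    sends messages back, so adding virtual nodes leaves the real-node updates
    (and hence the original predictions) untouched.
    """
    # Real -> real: standard neighbor averaging over the crystal graph.
    new_real = []
    for i, feat in enumerate(real_feats):
        msgs = [real_feats[b] for a, b in edges if a == i] + \
               [real_feats[a] for a, b in edges if b == i]
        agg = sum(msgs) / len(msgs) if msgs else 0.0
        new_real.append((1 - mix) * feat + mix * agg)

    # Real -> virtual only: virtual nodes listen but never talk back.
    pooled = sum(real_feats) / len(real_feats)
    new_virtual = [(1 - mix) * v + mix * pooled for v in virtual_feats]
    return new_real, new_virtual

real, virtual = vgnn_step([1.0, 3.0], [0.0], edges=[(0, 1)])
```

Because the real-node update never reads the virtual features, the number of virtual nodes, and thus the size of the output, can vary per material without perturbing the underlying crystal-graph computation.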
Removing the complexity
Since it has virtual nodes to represent phonons, the VGNN can omit many complex calculations when estimating phonon dispersion relations, making the method more efficient than a standard GNN.
The researchers proposed three different versions of VGNN with increasing complexity. Each of them can be used to predict phonons directly from the atomic coordinates of a material.
Because their method has the flexibility to rapidly model high-dimensional properties, they can use it to estimate phonon dispersion relations in alloy systems. These complex combinations of metals and nonmetals are especially challenging for traditional modeling methods.
The researchers also found that VGNNs offered slightly higher accuracy when predicting a material's heat capacity. In some cases, prediction errors were two orders of magnitude lower using this technique.
Li says a VGNN could be used to calculate phonon dispersion relations for a few thousand materials in just a few seconds with a personal computer.
This efficiency could allow scientists to search a larger space when looking for materials with certain thermal properties, such as superior thermal storage, energy conversion, or superconductivity.
Furthermore, the virtual node technique is not unique to phonons and could also be used to predict challenging optical and magnetic properties.
In the future, the researchers want to refine the technique so that the virtual nodes have greater sensitivity to capture small changes that can affect the structure of phonons.
“Researchers have become too accustomed to using graph nodes to represent atoms, but we can rethink that idea. Graph nodes can be anything, and virtual nodes are a very generic approach that can be used to predict many high-dimensional quantities,” Li says.
“The authors’ innovative approach significantly augments the description of solids by graph neural networks by incorporating key physics-based elements via virtual nodes, for example by reporting wavevector-dependent band structures and dynamic matrices,” says Olivier Delaire, an associate professor in the Thomas Lord Department of Mechanical Engineering and Materials Science at Duke University, who was not involved in this work. “I think the level of speedup in predicting complex phonon properties is astonishing — several orders of magnitude faster than a state-of-the-art universal interatomic machine learning potential. It’s impressive that the advanced neural network captures fine features and obeys physical rules. There is great potential to expand the model to describe other important materials properties — electronic, optical, and magnetic spectra and band structures come to mind.”
This work is supported by the U.S. Department of Energy, the National Science Foundation, a MathWorks Fellowship, a Sow-Hsin Chen Fellowship, the Harvard Quantum Initiative, and Oak Ridge National Laboratory.