Multiscale modeling: understanding strongly correlated materials with massively parallel computing
Project lead: Mark Jarrell, Ph.D., University of Cincinnati
Research title: Next generation multiscale quantum simulations software for strongly correlated materials
Funding source: U.S. Department of Energy, Scientific Discovery through Advanced Computing
“This project will advance researchers’ understanding, simulation and design of magnetic materials and superconductors for energy and national security applications, as well as basic research applications.” – Dr. Mark Jarrell
Materials are fundamental to all forms of technology; the examples are all around us. Magnetic Resonance Imaging machines, supercolliders and novel electronic computer circuits illustrate just a few ways that scientists and industries have leveraged the unique properties of strongly correlated materials, a wide class of materials with unusual electronic and magnetic properties.
However, while these materials hold great technological promise, they are poorly understood.
That should change with ongoing research by an interdisciplinary team of investigators from the University of Cincinnati, Oak Ridge National Laboratory and the University of California-Davis. Led by Mark Jarrell, Ph.D., a physicist at the University of Cincinnati, the team of computational physicists, applied mathematicians and computer scientists is developing a massively parallel, multiscale method to study strongly correlated materials.
“We are working towards developing better materials with greater functionality,” said Dr. Jarrell, “whether it’s for power transmission through superconductors or faster computer chips. We want this work to lead to materials that companies such as Goodyear, Battelle or Procter & Gamble can use to develop better technology for areas such as electronic devices, medical science — even magnetic levitation trains.”
Strongly correlated materials include lanthanide and actinide heavy-fermion materials, transition metal oxides, high-temperature superconductors and high-density ferromagnets. These highly intricate materials display different characteristics at different levels, or length scales.
“Our goal is to develop computational methods that separate a material’s correlations by length scale,” Dr. Jarrell said. “These algorithms will enable physicists, for the first time, to accurately study the complex interactions of strongly correlated materials.”
At the smallest length scales, the researchers are looking at individual particles, such as electrons, where a change in the electrical or magnetic properties of a single electron at one point in time affects the electrical or magnetic behavior of its neighbors. These short length scales are considered “strongly correlated” and are calculated with numerically exact quantum Monte Carlo simulations.
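The idea of recovering correlations by Monte Carlo sampling can be illustrated with a deliberately tiny example: two coupled classical Ising spins sampled with the Metropolis algorithm. This is not the team’s quantum Monte Carlo code, only a minimal classical sketch, and the function name and parameters are illustrative; its appeal is that the sampled spin-spin correlation can be checked against the exact answer, tanh(βJ).

```python
import math
import random

def spin_pair_correlation(beta_j: float, n_steps: int = 200_000, seed: int = 0) -> float:
    """Metropolis sampling of two coupled Ising spins with energy E = -J*s1*s2.

    Returns the estimated correlation <s1*s2>; for this toy model the exact
    value is tanh(beta*J), so the estimate can be verified in closed form.
    """
    rng = random.Random(seed)
    s = [1, 1]  # start from aligned spins
    total = 0
    for _ in range(n_steps):
        i = rng.randrange(2)                 # pick one spin to try flipping
        d_beta_e = 2 * beta_j * s[0] * s[1]  # beta * (energy change of the flip)
        if d_beta_e <= 0 or rng.random() < math.exp(-d_beta_e):
            s[i] = -s[i]                     # accept the flip
        total += s[0] * s[1]                 # accumulate the correlation
    return total / n_steps
```

The point of the sketch is the workflow: propose a local change, accept or reject it with the Boltzmann weight, and average an observable over the resulting samples. Real quantum Monte Carlo replaces the two spins with a cluster of interacting electrons, but the sampling logic is the same in spirit.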
The second segment evaluates the material at the other end of the spectrum, where the interactions between particles are weak. Physicists have long used a mathematical approach called the dynamical mean-field approximation to evaluate these long length scales.
It’s the space in the middle, the intermediate length scales, that could break open investigations of strongly correlated materials. Until now, evaluating even a cluster of 10 or so atoms was mathematically overwhelming: the cost of the equations describing this missing middle grows so quickly that even the most advanced supercomputers could not crunch the numbers in a realistic timeframe.
However, today’s trend toward petascale computing, combined with new mathematical equations and codes, holds the promise of fundamental discovery in the study of complex correlated materials. To evaluate the intermediate length scales, Dr. Jarrell and his team are tapping the power of massively parallel supercomputers by using a Feynman diagrammatic approach called the “parquet equations.”
“Our approach, from the treatment of the correlations at intermediate length scales to the integration of the three length scales, is completely new,” Dr. Jarrell said. “Resources at the Ohio Supercomputer Center have helped us both qualify and quantify the method. The codes we are developing will allow us to take advantage of the next generation of supercomputers, where the use of ten thousand processors — or more — will be common.”