By Raphael Rosen | October 27, 2016
A proposal from scientists at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) has been chosen as part of a national initiative to develop the next generation of supercomputers. Known as the Exascale Computing Project (ECP), the initiative will include a focus on exascale-related software, applications, and workforce training.
Once developed, exascale computers will perform a billion billion (10^18) operations per second, a rate 50 to 100 times faster than the most powerful U.S. computers now in use. The fastest computers today operate at the petascale, performing a million billion (10^15) operations per second. Exascale machines in the United States are expected to be ready in 2023.
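For readers who want to check those figures, a minimal sketch of the arithmetic follows; the 10-to-20-petaflop range assumed for the fastest U.S. systems of 2016 is an illustrative assumption, not a figure stated in the article.

```python
# Illustrative arithmetic only: comparing petascale and exascale computing rates.
PETAFLOP = 1e15  # a million billion operations per second
EXAFLOP = 1e18   # a billion billion operations per second

# Assumed range for the most powerful U.S. systems in 2016 (roughly 10-20 petaflops);
# this assumption is what makes exascale "50 to 100 times faster" in the comparison.
fastest_us_systems = [10 * PETAFLOP, 20 * PETAFLOP]

print(f"Exascale vs. one petaflop: {EXAFLOP / PETAFLOP:.0f}x")
for rate in fastest_us_systems:
    print(f"Exascale vs. {rate / PETAFLOP:.0f} petaflops: {EXAFLOP / rate:.0f}x")
```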
The PPPL-led multi-institutional project, titled “High-Fidelity Whole Device Modeling of Magnetically Confined Fusion Plasmas,” was selected during the ECP’s first round of application development funding, which distributed $39.8 million. The overall project will receive $2.5 million a year for four years, to be distributed among the partner institutions: Argonne, Lawrence Livermore, and Oak Ridge national laboratories, together with Rutgers University, the University of California, Los Angeles, and the University of Colorado, Boulder. PPPL itself will receive $800,000 per year. The project it leads was one of 15 selected for full funding, and the only one dedicated to fusion energy; seven additional projects received seed funding.
The application efforts will help guide DOE’s development of a U.S. exascale ecosystem as part of President Obama’s National Strategic Computing Initiative (NSCI). DOE, the Department of Defense and the National Science Foundation have been designated as NSCI lead agencies, and ECP is the primary DOE contribution to the initiative.
The ECP’s multi-year mission is to maximize the benefits of high-performance computing (HPC) for U.S. economic competitiveness, national security, and scientific discovery. In addition to applications, the project addresses the hardware, software, platform, and workforce-development needs critical to the effective development and deployment of future exascale systems. The ECP is supported jointly by two DOE organizations: the Office of Science and the National Nuclear Security Administration.
PPPL has been involved with high-performance computing for years. PPPL scientists created the XGC code, which models the behavior of plasma in the boundary region where the plasma’s ions and electrons interact with each other and with neutral particles produced by the tokamak’s inner wall. The high-performance code is maintained and updated by PPPL scientist C.S. Chang and his team.
XGC runs on Titan, the fastest computer in the United States, at the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility at Oak Ridge National Laboratory. The calculations needed to model the behavior of the plasma edge are so complex that the code uses 90 percent of the computer’s processing capability. Titan performs at the petascale, completing a million billion calculations each second, and the DOE was primarily interested in proposals from institutions with petascale-ready codes that can be upgraded for exascale computers.
The PPPL proposal lays out a four-year plan to combine XGC with GENE, a computer code that simulates the behavior of the plasma core. GENE is maintained by Frank Jenko, a professor at the University of California, Los Angeles. Combining the codes would give physicists a far better sense of how the core plasma interacts with the edge plasma at a fundamental kinetic level, giving a comprehensive view of the entire plasma volume.
Leading the overall PPPL proposal is Amitava Bhattacharjee, head of the Theory Department at PPPL. Co-principal investigators are PPPL’s Chang and Andrew Siegel, a computational scientist at the University of Chicago.
The multi-institutional effort will develop a full-scale computer simulation of a fusion plasma. Unlike current simulations, which model only part of the hot, charged gas, the proposed simulations will capture the physics of the entire plasma at once. The completed model will integrate the XGC and GENE codes and will be designed to run on exascale computers.
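To make the idea of coupling a core code with an edge code concrete, here is a purely conceptual sketch. The function names, the toy "physics," and the data-exchange scheme are hypothetical illustrations; they do not represent the actual XGC or GENE interfaces or the project's coupling framework.

```python
# Conceptual sketch of core-edge coupling: two regional solvers exchange
# interface information and advance together, one step at a time.

def advance_core(core_temp, edge_temp, dt):
    """Toy stand-in for a core solver (GENE's role): relax the core value
    toward the boundary value supplied by the edge code."""
    return core_temp + dt * 0.1 * (edge_temp - core_temp)

def advance_edge(edge_temp, core_temp, dt):
    """Toy stand-in for an edge solver (XGC's role): relax the edge value
    toward the boundary value supplied by the core code."""
    return edge_temp + dt * 0.1 * (core_temp - edge_temp)

def whole_device_step(core_temp, edge_temp, dt):
    # Exchange values across the core-edge interface, then advance both regions.
    new_core = advance_core(core_temp, edge_temp, dt)
    new_edge = advance_edge(edge_temp, core_temp, dt)
    return new_core, new_edge

core, edge = 10.0, 1.0   # arbitrary starting values for the two regions
for _ in range(100):
    core, edge = whole_device_step(core, edge, dt=0.5)
print(core, edge)        # the two regions converge as they exchange information
```

The real project couples two kinetic turbulence codes with vastly more state than a single number per region, which is part of why exascale resources are needed; the sketch only illustrates the coupling loop itself.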
The modeling will enable physicists to understand plasmas more fully and to predict their behavior within doughnut-shaped fusion facilities known as tokamaks. The exascale computing fusion proposal focuses primarily on ITER, the international tokamak being built in France to demonstrate the feasibility of fusion power. But the proposal will be developed with other applications in mind, including stellarators, another variety of fusion facility. Better predictions can lead to better-engineered facilities and more efficient fusion reactors. Currently, support for this work comes from DOE’s Advanced Scientific Computing Research program.
“This will be a team effort involving multiple institutions,” said Bhattacharjee. He noted that PPPL will be involved in every aspect of the project, including working with applied mathematicians and computer scientists on the team to develop the simulation framework that will couple GENE with XGC on exascale computers.
“You need a very-large-scale computer to calculate the multiscale interactions in fusion plasmas,” said Chang. “Whole-device modeling is about simulating the whole thing: all the systems together.”
Because plasma behavior is immensely complicated, developing an exascale computer is crucial for future research. “Taking into account all the physics in a fusion plasma requires enormous computational resources,” said Bhattacharjee. “With the computer codes we have now, we are already pushing on the edge of the petascale. The exascale is very much needed in order for us to have greater realism and truly predictive capability.”
PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.