Princeton Research Computing Jun 15, 2018
In scientific research, results don't just have to be good; they have to be reproducible. Members of a PICSciE-affiliated research team are achieving both as they apply a deep-learning AI approach to the problem of disruption forecasting in tokamak fusion plasmas.
In the race to make human-engineered fusion a feasible source of unlimited clean energy, one major challenge is how to predict and mitigate the very damaging disruption events in fusion-grade plasmas. Without enough predictive lead time (with “enough” measured here in milliseconds), hot charged plasma escaping magnetic confinement can result in rapid and costly cooling and termination of energy production in multibillion-dollar thermonuclear reactors.
Using vast amounts of data on disruptions provided by the UK-based Joint European Torus (JET), Princeton physicist Bill Tang and his colleagues, working on Princeton University's TIGER cluster of modern graphics processing units (GPUs), conducted the first "deep learning" tests with software they developed to markedly improve the speed and accuracy with which scientists can predict and respond to the first signs of disruptions. (Deep learning represents an exciting new avenue toward the prediction of disruptions, explains Tang, allowing the computer to analyze immense quantities of multi-dimensional data in search of meaningful patterns.)
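For readers curious what such a network looks like in code, the sketch below (a minimal illustration, not the team's actual FRNN implementation) shows a recurrent model of the general kind described: stacked LSTM layers that read multichannel diagnostic time series and emit a disruption score at every timestep. The channel count, sequence length, and layer sizes are assumptions chosen for illustration.

    import numpy as np
    import tensorflow as tf

    N_CHANNELS = 14  # number of plasma diagnostic signals (illustrative)
    SEQ_LEN = 128    # timesteps per training subsequence (illustrative)

    # Stacked recurrent layers read the multichannel time series; a
    # time-distributed sigmoid head emits a disruption score per timestep,
    # so an alarm can be raised mid-shot rather than only at the end.
    model = tf.keras.Sequential([
        tf.keras.layers.LSTM(200, return_sequences=True,
                             input_shape=(SEQ_LEN, N_CHANNELS)),
        tf.keras.layers.LSTM(200, return_sequences=True),
        tf.keras.layers.TimeDistributed(
            tf.keras.layers.Dense(1, activation="sigmoid")),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")

    # Toy data: X is (shots, time, channels); y marks timesteps that fall
    # inside the pre-disruption window of disruptive shots.
    X = np.random.randn(32, SEQ_LEN, N_CHANNELS).astype("float32")
    y = np.random.randint(0, 2, size=(32, SEQ_LEN, 1)).astype("float32")
    model.fit(X, y, epochs=1, batch_size=8)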
Not only did it work, but the success of the software, called the Fusion Recurrent Neural Network, or FRNN code, justified its further testing: first on 6,000 Tesla K20 GPUs of Oak Ridge National Laboratory's Titan supercomputer and, more recently, on Japan's new TSUBAME-3 supercomputer at the Tokyo Institute of Technology, where the code demonstrated the ability to scale to over 1,000 NVIDIA Tesla P100 GPUs. The exciting thing, notes Tang, a principal research physicist at the U.S. Department of Energy's Princeton Plasma Physics Laboratory and Lecturer with Rank and Title of Professor in the Department of Astrophysical Sciences at Princeton University, is that results with FRNN have continued to show excellent scaling with the number of GPUs engaged.
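The scaling behavior Tang highlights typically comes from synchronous data parallelism: every GPU trains a replica of the network on its own shard of the shot database, and gradients are averaged across workers at each step. The sketch below illustrates that pattern with mpi4py and NumPy stand-ins for the model and its gradients; it is not FRNN's implementation, and the sizes and learning rate are placeholders.

    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    params = np.zeros(1000)                 # stand-in for the model weights
    rng = np.random.default_rng(seed=rank)  # each rank sees different data

    for step in range(100):
        # Stand-in for a backward pass over this rank's mini-batch of shots
        local_grad = rng.normal(size=params.shape)

        # Average gradients across all ranks; the cost of this collective
        # largely determines how well training scales with GPU count
        global_grad = np.empty_like(local_grad)
        comm.Allreduce(local_grad, global_grad, op=MPI.SUM)
        global_grad /= size

        params -= 0.01 * global_grad        # identical update on every rank

    if rank == 0:
        print(f"completed 100 synchronous steps on {size} workers")

Launched with, say, mpirun -n 1024 python train.py, each added worker processes its own share of the data, which is why near-linear scaling with the number of GPUs is possible when the gradient averaging is fast.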
These promising results also led to Tang winning the 2018 Global Impact Award from NVIDIA Corp., the leading producer of GPUs, for the work he and his team carried out involving artificial intelligence (AI) computing. This highly competitive $100,000 award was one of two presented at NVIDIA's GPU Technology Conference in San Jose, California, in March 2018.
The win provides not only material support for ongoing research by Tang and his team, but also recognition of the role played by the Princeton Institute for Computational Science and Engineering (PICSciE) in facilitating leading-edge interdisciplinary collaborations that leverage the expertise of faculty and researchers from diverse backgrounds to address new and relevant computational problems.
In this way, says Tang, PICSciE has contributed to this advance in the speed and accuracy of disruption predictions. The next goal? The researchers envision improving their code to the point where it contributes to software enabling control systems to avoid plasma disruptions, both in current experimental facilities and, in the near future, at ITER, the $25 billion international burning-plasma fusion facility under construction in France.
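In practice, plugging a trained predictor into a control system means a streaming inference loop: score each new diagnostic sample as it arrives and raise an alarm while enough lead time remains for mitigation to act. The sketch below is a hypothetical illustration only; the threshold, sampling period, and placeholder scoring function are assumptions, not parameters of any actual tokamak control system.

    import numpy as np

    ALARM_THRESHOLD = 0.7   # illustrative; tuned to trade false alarms vs. misses
    SAMPLE_PERIOD_MS = 1.0  # assumed diagnostic sampling interval

    def disruption_score(sample):
        """Placeholder for one step of a trained, stateful recurrent model."""
        return float(1.0 / (1.0 + np.exp(-sample.mean())))

    def control_loop(stream):
        for t, sample in enumerate(stream):
            score = disruption_score(sample)
            if score > ALARM_THRESHOLD:
                # Hand off to the mitigation system (e.g., triggering gas
                # injection) while milliseconds of lead time remain.
                print(f"ALARM at t = {t * SAMPLE_PERIOD_MS:.1f} ms (score {score:.2f})")
                return True
        return False

    # Toy stream: 200 samples of 14 diagnostic channels
    control_loop(np.random.randn(14) for _ in range(200))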
ITER's targeted goal of more than ten times the "break-even-like" (energy out = energy in) conditions currently achieved at JET, explains Tang, could be compared to the Wright brothers getting their plane airborne at Kitty Hawk. The challenge, he adds, is to scale up the process to commercial proportions in an efficient, economical, and environmentally benign way. "The subsequent impact of aviation on transportation was, of course, monumental, and the delivery of clean fusion energy could well have even greater beneficial consequences for the world."