Three Fermilab scientists awarded $17.5 million in SciDAC funding

Three Fermilab-led collaborations have been awarded a combined $17.5 million by the U.S. Department of Energy’s Scientific Discovery through Advanced Computing (SciDAC) program. Researchers James Amundson and James Kowalkowski, whose awards will be disbursed over five years, and researcher Giuseppe Cerati, who received a three-year award, will use the funds to support collaborations with external partners in computer science and applied mathematics to address problems in high-energy physics with advanced computing solutions.

SciDAC is a partnership between the DOE Office of Advanced Scientific Computing Research and the Office of High Energy Physics.

The awards mark the fourth consecutive cycle of successful SciDAC-funded proposals from Fermilab scientists. The series of computational collaborations has enabled Fermilab to propose progressively more sophisticated projects. One, an accelerator simulation project, builds directly on previous SciDAC-funded projects. The other two, which are new to the SciDAC program, build on research supported by the DOE Computational High-Energy Physics program, as well as on Fermilab’s previous involvement with SciDAC: one focuses on speeding up event reconstruction, and the other on designing new data analysis workflows.

“Not only have we had successful projects for the last decade, but we acquired enough expertise that we’re now daring to do things that we wouldn’t have dared before,” said Panagiotis Spentzouris, head of Fermilab’s Scientific Computing Division, who was also a lead principal investigator of a successful SciDAC project.

This round’s winners

James Amundson

SciDAC is enabling Amundson and his team to enhance both the depth and accuracy of simulation software to meet the challenges of emerging accelerator technology.

His project, ComPASS4, will do this by first developing integrated simulations of entire accelerator complexes. For example, simulating unwanted emitted radiation will let scientists mitigate its effects and so help ensure the success of the PIP-II upgrades. PIP-II is the lab’s plan for providing powerful, high-intensity proton beams for the international Long-Baseline Neutrino Facility and Deep Underground Neutrino Experiment. The work also supports long-term goals for accelerators now in various stages of development.

“We will be able to study plasma acceleration in much greater detail than currently possible, then combine those simulations with simulations of the produced beam in order to create a virtual prototype next-generation accelerator,” Amundson said. “None of these simulations would have been tractable with current software and high-performance computing hardware.”
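For readers curious about what such a simulation involves at its simplest, here is a minimal, purely illustrative sketch of single-particle beam transport, the kind of calculation at the core of accelerator-modeling codes. The lattice, element parameters and beam sizes are invented for demonstration; this is not the ComPASS4 interface.

```python
# Toy linear beam transport: track a particle bunch through one FODO cell
# (focusing quad, drift, defocusing quad, drift) using transfer matrices.
# All parameters here are hypothetical, chosen only for illustration.
import numpy as np

def drift(length):
    """Transfer matrix for a field-free drift of the given length (m)."""
    return np.array([[1.0, length],
                     [0.0, 1.0]])

def quad(focal_length):
    """Thin-lens quadrupole with the given focal length (m); negative defocuses."""
    return np.array([[1.0, 0.0],
                     [-1.0 / focal_length, 1.0]])

lattice = [quad(2.0), drift(1.0), quad(-2.0), drift(1.0)]

# A bunch of 10,000 particles in (x, x') phase space.
rng = np.random.default_rng(0)
bunch = rng.normal(scale=[1e-3, 1e-4], size=(10_000, 2))

for element in lattice:
    bunch = bunch @ element.T  # apply the element's map to every particle

print(f"rms beam size after one cell: {bunch[:, 0].std():.3e} m")
```

Production codes layer collective effects such as space charge on top of this kind of single-particle transport, and it is those effects, computed self-consistently for millions of particles across an entire accelerator complex, that drive the need for high-performance computing.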

Giuseppe Cerati

The next generation of high-energy physics experiments, including the Deep Underground Neutrino Experiment, will produce an unprecedented amount of data, which needs to be reconstructed into useful information, including a particle’s energy and trajectory. Reconstruction takes an enormous amount of computing time and resources.

“Processing this data in real time, and even offline, will become unsustainable with the current computing model,” Cerati said. He has therefore proposed to lead an exploration of modern computing architectures to speed up reconstruction.

“Without a fundamental transition to faster processing, we would face significant reductions in efficiency and accuracy, which would have a big impact on an experiment’s discovery potential,” he added.
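As a rough illustration of the kind of restructuring Cerati describes, the sketch below contrasts an event-at-a-time reconstruction loop with the same computation expressed in data-parallel form, which vector units and many-core processors can exploit. The toy straight-line trajectory fit and all numbers are invented for demonstration and are not taken from any experiment’s software.

```python
# Toy "reconstruction": fit straight-line trajectories to detector hits,
# first one event at a time, then batched across all events at once.
import numpy as np

rng = np.random.default_rng(1)
n_events, n_planes = 100_000, 8
z = np.linspace(0.0, 7.0, n_planes)            # detector plane positions (cm)
true_slopes = rng.normal(scale=0.1, size=(n_events, 1))
hits = true_slopes * z + rng.normal(scale=0.05, size=(n_events, n_planes))

def fit_loop(hits):
    """Baseline: fit one event at a time in a Python loop."""
    return np.array([np.polyfit(z, event, 1)[0] for event in hits])

def fit_vectorized(hits):
    """Same least-squares slope, computed for every event simultaneously."""
    zc = z - z.mean()
    return (hits * zc).sum(axis=1) / (zc ** 2).sum()

slopes = fit_vectorized(hits)  # typically far faster than fit_loop(hits)
```

The physics output is identical; only the arrangement of the computation changes. That is what makes such transitions attractive but also delicate: reconstruction code must stay numerically faithful while being reorganized for new hardware.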

James Kowalkowski

James Kowalkowski’s group will aim to redefine data management, refining optimization procedures so that experiments can use computing resources in ways that have not been available in the past. This means fundamental changes in computational techniques and software infrastructure.

In this new way of working, rather than treating data sets as collections of files used to ferry chunks of information from one processing or analysis stage to the next, researchers can view data as immediately available and movable throughout a unified, large-scale distributed application. This will let scientists within a collaboration process large portions of the collected experimental data in short order, nearly on demand.
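A minimal sketch of that contrast, with invented stage names and toy data (real high-energy physics workflows run at far larger scales and across many machines): the first version stages an intermediate result through a file between stages, while the second composes the same stages inside one application, so the data stay immediately available in memory.

```python
# Two ways to chain analysis stages: round-tripping through files versus
# keeping data resident in memory within a single application.
# Stage names, calibration constants and thresholds are all hypothetical.
import numpy as np

def calibrate(raw):
    """Stage 1: apply a toy calibration to raw detector values."""
    return raw * 1.02 - 0.5

def select(calibrated):
    """Stage 2: keep values above a toy threshold."""
    return calibrated[calibrated > 1.0]

raw = np.random.default_rng(2).exponential(size=1_000_000)

# File-staged style: each stage writes its output for the next to read.
np.save("stage1.npy", calibrate(raw))
selected = select(np.load("stage1.npy"))

# Unified style: the same stages composed in one application, with the
# intermediate result handed directly to the next stage.
selected_in_memory = select(calibrate(raw))

assert np.allclose(selected, selected_in_memory)
```

Scaled up from this toy, the unified picture is a large distributed application in which many such stages share data across nodes without the storage round-trips, which is what makes near-on-demand processing plausible.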

“Without the special funding from SciDAC to pull people from diverse backgrounds together, it would be nearly impossible to carry out this work,” Kowalkowski said.