Panagiotis Spentzouris, head of the Scientific Computing Division, wrote this column.
Massive particle detectors are the hallmark of neutrino physics research. A less visible but equally essential element in unraveling the mysteries of these elusive particles, and ultimately of the universe itself, is the set of intricate software systems that collect and analyze the data from those detectors. Through the Scientific Computing Division, Fermilab supports a wide range of such software tools to assist the neutrino program in this quest.
Our software portfolio includes a range of tools that SCD reuses and integrates into flexible end-to-end solutions for the Fermilab neutrino experiments. This approach allows scientists, who often participate in more than one experiment, to operate within similar and familiar software environments.
artdaq-based data acquisition (DAQ) systems have been developed for the DUNE 35-ton prototype and the LArIAT test beam experiment. These systems are currently being used in detector and electronics testing and are well-positioned for production data taking later this year. Because its capabilities can be extended and its configurations customized in the same way for each new detector, artdaq can serve as the core of many different DAQ systems. In addition to artdaq, SCD members, working in close collaboration with the experimenters, have developed custom software components for the MicroBooNE and NOvA DAQ systems.
The art framework, used by the DUNE prototype, LArIAT, MicroBooNE and NOvA experiments, supports offline event reconstruction and simulation. art is also used for online event filtering in several of these experiments and for online, or “nearline,” monitoring in all of them.
The software that makes up the LArSoft suite exploits the common properties of liquid-argon time projection chambers to provide shared data structures, algorithms and tools for the simulation, reconstruction and analysis of data from these detectors. SCD partners with the experiments to produce, test and distribute the software and to ensure that the code works for all of the detectors. By providing a means to leverage effort and expertise across participating experiments, LArSoft dramatically reduces the cost of code development, freeing scientists to concentrate on detector-specific work and physics analysis. DUNE, LArIAT, MicroBooNE and SBND all make use of liquid-argon time projection chambers, and all use and participate in the development of LArSoft.
Fabric for Frontier Experiments (FIFE) provides a collaborative platform where experiment offline computing groups and the core SCD support teams deploy distributed computing solutions for simulation and event processing using grid, cloud and, soon, high-performance parallel computing. FIFE software provides the glue: integration, deployment and monitoring of the various software components developed at Fermilab. This includes art, artdaq and packages from other sources such as the LHC experiments and Open Science Grid. The software covers workload scheduling and data management, collaborative tools and the associated database applications.
Computing and software are an increasingly complex and resource-intensive part of building and operating the current and future generation of neutrino experiments. artdaq, art, LArSoft, FIFE and other reusable software tools are important components of the lab’s Neutrino Platform. They provide cost-effective solutions that experiments can adopt, adapt and rely on to provide the capabilities they need in a familiar environment while ensuring sustainable support across their lifetime.
Editor’s note: The art and LArSoft development teams will present a one-week course on the use of art and LArSoft from Aug. 3-7 at Fermilab. The goal of the course is to help newcomers to art and LArSoft become able to contribute to their experiment’s code or to the shared LArSoft software. For more information, visit the course wiki.