The science of data collection

Kurt Biery

Kurt Biery, head of the Real-Time Software Infrastructure Department, wrote this column.

Whether an experiment at Fermilab involves a short exposure of a prototype detector to a particle beam or a multiyear run of a large detector with hundreds of thousands of data-taking channels, it needs a data acquisition system to collect its data.

The primary purpose of such a system is to convert the signals from the detector components into digital data, organize the data into well-defined blocks, select the subsets of the data that pass specific acceptance criteria, and store the selected data on disk for later analysis.
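The four stages just described can be pictured with a toy sketch. All of the names, the ADC conversion, and the threshold below are illustrative inventions for this column, not part of any real Fermilab DAQ system, which operates on raw electronics signals rather than Python lists.

```python
# Toy sketch of the four DAQ stages: digitize, organize into blocks,
# select by an acceptance criterion, and store to disk.
import json

def digitize(signals):
    """Convert (channel, voltage) detector signals into digital counts."""
    return [(ch, int(v * 4096 / 5.0)) for ch, v in signals]

def build_event(event_id, samples):
    """Organize digitized data into a well-defined block (an 'event')."""
    return {"event_id": event_id, "samples": samples}

def passes_trigger(event, threshold=2000):
    """Acceptance criterion: keep events with at least one large pulse."""
    return any(adc >= threshold for _, adc in event["samples"])

def run_daq(readouts, path):
    """Digitize, build, select and store events; return the number stored."""
    stored = 0
    with open(path, "w") as f:
        for event_id, signals in enumerate(readouts):
            event = build_event(event_id, digitize(signals))
            if passes_trigger(event):
                f.write(json.dumps(event) + "\n")
                stored += 1
    return stored
```

In a real system each stage runs on dedicated hardware or on a farm of computers, but the logical flow is the same.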

In some experiments, the data acquisition needs are modest, but many experiments have technically challenging requirements, such as transferring tens of gigabytes of data per second and selecting the 0.1 percent of the data that contains the interesting physics.
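A quick back-of-the-envelope calculation shows what that selection buys. The 20 GB/s input rate below is an illustrative stand-in for "tens of gigabytes per second"; the 0.1 percent figure is from the text.

```python
# Rough scale of the selection problem: if a detector streams 20 GB/s
# (illustrative) and the filters keep 0.1 percent of the data, the
# storage system sees only about 20 MB/s.
input_rate_gb_s = 20.0
kept_fraction = 0.001
stored_rate_mb_s = input_rate_gb_s * 1000 * kept_fraction
print(stored_rate_mb_s)  # 20.0
```

Filtering out 99.9 percent of the stream turns an unmanageable storage problem into a routine one.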

Engineers and software developers in the Scientific Computing Division work with experiment collaborators, as well as engineers, developers and scientists in the Particle Physics Division, to develop the data acquisition (DAQ) systems needed for today’s high-energy physics and cosmology experiments. SCD engineers develop hardware and firmware to read out the data and provide the precise timing information needed to correlate data from different parts of the detector. SCD software developers write applications and tools to transfer and filter the data and to control, configure and monitor the performance of the detector and the DAQ system.

As a result of technology improvements, today’s DAQ systems can be designed almost entirely from commercial off-the-shelf hardware. Multiple levels of custom electronics can be consolidated into a single level of processors. This fits well with the needs of experiments that require a DAQ system in which all of the data is streamed into a cluster of computers. Software filters are run on those computers to select the events of interest.
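The stream-and-filter architecture can be sketched as events fanned out to a pool of workers that all run the same software filter, with only accepted events kept. The function names, the `energy_gev` field and the selection cut are hypothetical; a real filter farm spans many machines rather than threads in one process.

```python
# Toy sketch of streaming events into a pool of workers, each running
# the same software filter, and keeping only the accepted events.
from concurrent.futures import ThreadPoolExecutor

def software_filter(event):
    """Return the event if it passes the selection, else None."""
    return event if event["energy_gev"] > 50.0 else None

def filter_farm(events, workers=4):
    """Run the software filter in parallel over a stream of events."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return [e for e in pool.map(software_filter, events) if e is not None]
```

Because the filtering is ordinary software running on ordinary computers, the selection logic can be changed without touching any custom electronics.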

Software frameworks provide reusable core functionality and allow experimenters to create components that perform experiment-specific functions, significantly reducing the effort needed to build software for an experiment’s DAQ system. Two such frameworks that have been developed within SCD are the artdaq data acquisition toolkit and the art event analysis framework. artdaq provides the core DAQ functionality and allows experimenters to develop the software that configures and reads out their particular detector components. art is used for offline analysis in many experiments at Fermilab, and it is included in artdaq to provide data analysis and filtering in the online environment.

Members of SCD contribute to data acquisition development on the DarkSide-50, LArIAT, LBNF, MicroBooNE, Mu2e and NOvA experiments. NOvA, which started taking data this year, uses custom hardware and software DAQ components. DarkSide-50 and LArIAT use commercial DAQ hardware and the artdaq toolkit. Mu2e, which will be constructed over the next five to seven years, will use commercial DAQ hardware and the artdaq toolkit. The Mu2e DAQ will take advantage of improvements in commercial hardware while still providing the high data rates that are needed.

The work on these experiments shows the evolution of our approach to DAQ systems, providing state-of-the-art systems to current and future experiments at Fermilab.