Hot on the trail of charged particles in the CMS detector

The CMS detector at CERN’s Large Hadron Collider can be thought of as an enormous, high-resolution digital camera that takes snapshots of energetic proton-proton collisions. By analyzing these snapshots, scientists are able to study the laws that govern the interactions of fundamental particles. A key step in deciphering what happened inside a particle collision (known as an “event”) is referred to as reconstruction. Reconstruction algorithms use the patterns of electronic signals generated within the CMS subdetectors to identify the particles produced in the collisions and to measure their properties.

Almost 60% of the reconstruction time is spent on tracking, the process of identifying the trails left by charged particles passing through the highly granular silicon detectors closest to the beam pipe. Tracking plays a crucial role in CMS reconstruction; almost all other reconstruction algorithms rely on tracking as an input. Without tracking, scientists would be unable to accurately measure the momenta of charged particles such as electrons and muons, reconstruct particle jets (collimated sprays of particles initiated by quarks and gluons), mitigate the effect of many overlapping proton-proton collisions (“pileup”), or correctly measure missing momentum.

This event display shows two “bunches” of protons colliding in the CMS detector at a center-of-mass energy of 13 TeV. The yellow lines represent the reconstructed particle tracks. This event has approximately 100 overlapping proton-proton interactions (pileup) and was recorded during a dedicated high-pileup data-taking run in 2016. At the HL-LHC, the average number of pileup interactions is expected to be about 200, corresponding to 6,000 charged particles with transverse momentum above 300 MeV per bunch crossing. Image: CERN

The LHC accelerator complex (and the detectors) will soon undergo major upgrades to increase the luminosity by a factor of 10, relative to the LHC design. The High-Luminosity LHC (HL-LHC) will provide exciting physics opportunities but also daunting computing challenges for CMS. The amount of data will increase by a factor of 60, and the average number of pileup interactions per event is expected to increase by a factor of 5. This poses a particular challenge for tracking, because the tracking time increases exponentially with pileup. New tracking solutions are needed to deal with the reconstruction demands of the HL-LHC without sacrificing physics potential.

Researchers at Fermilab, along with collaborators from Cornell University, Princeton University, UC San Diego, and the University of Oregon, are working to speed up the CMS tracking algorithm by taking advantage of modern, highly parallel CPU architectures. The new implementation, referred to as the “mkFit” algorithm and presented in an article published in JINST, is designed to efficiently use multiple CPU cores as well as each core’s vector processing units, which perform SIMD (single instruction, multiple data) operations on several data elements at once. When running on a single thread, mkFit is a factor of 6 faster than the nominal CMS algorithm, with comparable physics performance, and significant additional speedups are achieved using multiple threads.

The core of the mkFit algorithm is the Matriplex library, a custom library that efficiently vectorizes the multiplication of small matrices (6×6 elements or fewer). Streamlining of the CMS geometry information and optimized bookkeeping of track candidates provide additional gains. The mkFit collaboration is also exploring code portability solutions such as Kokkos, Alpaka and OpenMP to run reconstruction algorithms efficiently on both CPUs and GPUs.
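To illustrate the idea behind this kind of vectorization, here is a minimal C++ sketch of a Matriplex-style structure-of-arrays layout. The type and function names are hypothetical, not the actual Matriplex API: instead of storing each small matrix contiguously, element (i,j) of many matrices is stored side by side, so the innermost loop runs across matrices and can be auto-vectorized by the compiler into SIMD instructions.

```cpp
#include <array>
#include <cstddef>

// Hypothetical sketch of a Matriplex-style layout (not the real Matriplex API).
// elem[i][j][n] holds element (i,j) of the n-th matrix, so all N copies of a
// given element sit contiguously in memory (structure-of-arrays).
template <std::size_t D, std::size_t N>
struct MatriplexSketch {
    std::array<std::array<std::array<float, N>, D>, D> elem{};  // zero-initialized

    float& at(std::size_t i, std::size_t j, std::size_t n) { return elem[i][j][n]; }
};

// Multiply N pairs of DxD matrices in lockstep: C[n] += A[n] * B[n].
// The innermost loop over n touches contiguous memory with no cross-iteration
// dependence, which is the pattern compilers can turn into SIMD operations.
template <std::size_t D, std::size_t N>
void multiply(const MatriplexSketch<D, N>& a,
              const MatriplexSketch<D, N>& b,
              MatriplexSketch<D, N>& c) {
    for (std::size_t i = 0; i < D; ++i)
        for (std::size_t j = 0; j < D; ++j)
            for (std::size_t k = 0; k < D; ++k)
                for (std::size_t n = 0; n < N; ++n)  // vectorizable inner loop
                    c.elem[i][j][n] += a.elem[i][k][n] * b.elem[k][j][n];
}
```

In mkFit the matrices are at most 6×6 (track parameters and covariances), and N is matched to the width of the vector unit so each SIMD instruction processes one element from N different track candidates at once.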

Neutrino experiments that use liquid-argon time projection chamber, or LArTPC, detectors such as MicroBooNE, ICARUS and DUNE have to cope with similar reconstruction challenges as the size of the detectors increases. Scientists from the Fermilab Scientific Computing Division are applying many of the same tools and techniques learned from mkFit in order to speed up LArTPC reconstruction. Adopting the mkFit approach, the LArTPC hit-finding algorithm has been sped up by a factor of 10 in standalone tests. The new algorithm has been incorporated into LArSoft (the common software framework for LArTPC experiments) and is currently being used by the ICARUS and ProtoDUNE experiments.

Future high-energy physics experiments face unprecedented computing challenges. The mkFit algorithm and LArTPC reconstruction efforts are exciting solutions that promise to help maximize the physics potential of the HL-LHC and DUNE.

Allison Hall is a Fermilab research associate and works on the CMS experiment.

CMS Department communications are coordinated by Fermilab scientist Pushpa Bhat.