Large Hadron Collider

The year 2018 will be remembered as a very eventful year for CMS as a whole and especially for the Fermilab group. Thanks to excellent accelerator performance, the LHC delivered much more proton-proton collision data than anticipated, making LHC Run 2 a very successful data-taking period. At the core of detector operations and computing, the Fermilab group was key to ensuring that a large, high-quality data set was collected for searches and precision measurements.

A proton describes its final moments in the Large Hadron Collider. During its second run, between 2015 and 2018, the Large Hadron Collider at CERN collided about 16 million billion particle pairs. This 3-minute animation is the story of one of them.

During the last four years, LHC scientists have filled in gaps in our knowledge and tested the boundaries of the Standard Model. Since the start of Run 2 in March 2015, they've recorded an incredible amount of data, five times more than the LHC produced in Run 1. The accelerator produced approximately 16 million billion proton-proton collisions, about one collision for every ant currently living on Earth.

During the short heavy-ion run at the Large Hadron Collider at CERN, every moment counts. As one scientist puts it, experimenters have “four weeks to collect all the data we will use for the next three years.” The data arising from LHC’s collisions of heavy nuclei, such as lead, will be used to study the properties of a very hot and dense subatomic material called the quark-gluon plasma.

From the Pittsburgh Supercomputing Center, Oct. 10, 2018: Fermilab's Dirk Hufnagel is quoted in this piece on the Pittsburgh Supercomputing Center now supplying computation for the LHC. Fermilab scientists working on the CMS experiment, in collaboration with the Open Science Grid, have begun analyzing LHC data using PSC's Bridges supercomputer.

A specialized measuring machine at SLAC is helping scientists build precise detectors for the ATLAS experiment.