CMS

For the last six years, the mission of U.S. CMS scientists has been, in a phrase, to complete the LHC Phase 1 Upgrades. On May 1, with the successful outcome of the DOE Critical Decision 4 review, the U.S. CMS group fulfilled that mission. We’re proud of all the work we’ve done to upgrade the CMS detector so it can handle the increased luminosity of the Large Hadron Collider.

The CMS collaboration reached a major milestone last week by submitting for publication its 900th paper. Since 2010, CMS has been publishing about 100 papers every year on physics analyses using LHC collision data.

From CERN, May 24, 2019: The CMS collaboration used a large proton–proton collision dataset, collected during the Large Hadron Collider’s second run, to search for instances in which the Higgs boson might transform, or “decay”, into a photon and a massless dark photon.

This March, scientists from around the world gathered in La Thuile, Italy, for the 53rd annual Rencontres de Moriond conference, one of the longest-running and most prestigious conferences in particle physics. The conference is broken into two distinct weeks, with the first week usually covering electroweak physics and the second covering processes involving quantum chromodynamics. Fermilab and the LHC Physics Center were well represented at the conference.

Fermilab operates the world's largest CMS Tier-1 facility. It provides 115 petabytes of data storage, grid-enabled CPU resources and high-capacity network to other centers. Photo: Reidar Hahn

Data science is one of the world’s fastest growing industries, and as a consequence, a large ecosystem of software tools has emerged to enable data mining at ever increasing scales. Data processing campaigns have distilled the more than 100 petabytes of raw data produced by the CMS experiment to around 10 terabytes. Even this reduced dataset is still unwieldy for HEP researchers to analyze. Fermilab researchers are currently leading an effort that uses novel approaches to complete two full CMS analyses.
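The reduction described above rests on a simple core operation: selecting the small fraction of events that pass an analysis cut. The sketch below is purely illustrative, not CMS software; the event count, the threshold, and the exponential "transverse-momentum-like" variable are all invented for the example, but the effect shown is the real one, namely that a selection can shrink a dataset by orders of magnitude.

```python
import numpy as np

# Illustrative sketch (not CMS code): a "skim" step that keeps only
# events passing a selection -- the basic operation behind reducing
# petabytes of raw data to an analysis-sized dataset.
rng = np.random.default_rng(42)
n_events = 1_000_000

# Hypothetical per-event quantity with a falling spectrum,
# loosely mimicking a transverse-momentum-like variable.
pt = rng.exponential(scale=20.0, size=n_events)

# Keep only events above a threshold. Real selections are far more
# elaborate, but the net effect is the same: a large reduction factor.
selected = pt[pt > 100.0]
reduction_factor = n_events / len(selected)
print(f"kept {len(selected)} of {n_events} events "
      f"(reduction factor ~{reduction_factor:.0f}x)")
```

Because the spectrum falls steeply, even a single cut discards well over 99 percent of the simulated events here; chaining many such selections and storing only the surviving events in a compact columnar format is what makes the petabyte-to-terabyte reduction tractable.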

With the warmth of holiday cheer in the air, some physicists decided to hit the pub after a conference in December 2014 and do what many physicists tend to do after work: keep talking about physics. That evening’s topic of conversation: dark energy particles. The chat would lead to a new line of investigation at the Large Hadron Collider at CERN. Every second, the universe grows a little bigger. Scientists are using the LHC to try to find out why.

Fermilab has enormous, decades-long experience in building silicon detectors. Thanks to an exceptional, cooperative team with levels of experience and capabilities that lead the world, we were able to quickly put together a design for a tracking system that could be used for muon tomography — using muons to see inside solid objects, similar to how we use X-rays. The system won an R&D 100 Award.

Science fiction sometimes borrows from science fact. In the movie “Spider-Man: Into the Spider-Verse,” the writers blended multiverses and alternate realities with the real-world Large Hadron Collider and the Compact Muon Solenoid. In this 6-minute video, Fermilab’s Don Lincoln gives you the low-down on what is real and what is made up.

Machine learning is revolutionizing data analysis. Recent leaps in driverless car navigation and the voice recognition features of personal assistants are possible because of this form of artificial intelligence. As data sets in the Information Age continue to grow, companies are building tools that make machine learning faster and more efficient. Fermilab is taking cues from industry to address its own “big data” processing challenges.