CMS

A pioneer in particle physics and high-performance computing, Fermilab has launched HEPCloud, a cloud computing service that will enable the lab’s demanding experiments to make the best, most efficient use of computing resources. This flagship project lets experiments rent computing resources from external sources during peak demand, reducing the cost of maintaining local resources while also providing failsafe redundancy.

From CERN, May 24, 2019: The CMS collaboration used a large proton–proton collision dataset, collected during the Large Hadron Collider’s second run, to search for instances in which the Higgs boson might transform, or “decay”, into a photon and a massless dark photon.

With the warmth of holiday cheer in the air, some physicists decided to hit the pub after a conference in December 2014 and do what many physicists tend to do after work: keep talking about physics. That evening’s topic of conversation: dark energy particles. The chat would lead to a new line of investigation at the Large Hadron Collider at CERN. Every second, the universe grows a little bigger. Scientists are using the LHC to try to find out why.

Top quark couture

The mentorship of a scientist on the CMS experiment meant everything to Evan Coleman, a former physics undergraduate at Brown University. What do you give a physicist who helped discover a fundamental particle and jump-started your science career? Something individual, artistic and science-themed.


During the last four years, LHC scientists have filled in gaps in our knowledge and tested the boundaries of the Standard Model. Since the start of Run II in March 2015, they’ve recorded an enormous amount of data: five times more than the LHC produced in Run I. The accelerator produced approximately 16 million billion proton-proton collisions, about one collision for every ant currently living on Earth.

From the Pittsburgh Supercomputing Center, Oct. 10, 2018: Fermilab’s Dirk Hufnagel is quoted in this piece on the Pittsburgh Supercomputing Center now supplying computation for the LHC. Fermilab scientists working on the CMS experiment, in collaboration with the Open Science Grid, have begun analyzing LHC data using PSC’s Bridges supercomputer.