With the warmth of holiday cheer in the air, some physicists decided to hit the pub after a conference in December 2014 and do what many physicists tend to do after work: keep talking about physics. That evening’s topic of conversation: dark energy particles. The chat would lead to a new line of investigation at the Large Hadron Collider at CERN. Every second, the universe grows a little bigger. Scientists are using the LHC to try to find out why.
If you want to visit the Pasner family farm, you’ll need a truck with four-wheel drive. You’ll need to traverse 4 miles of bumpy dirt road deep into the countryside of Penn Valley, California. But once you arrive, you’ll be greeted by fields of organic onions and garlic, nestled between rolling grassy hills speckled with oak trees. For physicist Jake Pasner, this will always be home.
From The New York Times, Dec. 21, 2018: The largest machine ever built is shutting down for two years of upgrades. Take an immersive tour of the collider and study the remnants of a Higgs particle in augmented reality.
From Live Science, Dec. 23, 2018: Fermilab scientist Don Lincoln summarizes the past, present and future of research at the Large Hadron Collider.
Over the last four years, LHC scientists have filled in gaps in our knowledge and tested the boundaries of the Standard Model. Since the start of Run II in March 2015, they’ve recorded an incredible amount of data: five times more than the LHC produced in Run I. The accelerator produced approximately 16 million billion proton-proton collisions, about one collision for every ant currently living on Earth.
During the short heavy-ion run at the Large Hadron Collider at CERN, every moment counts. As one scientist puts it, experimenters have “four weeks to collect all the data we will use for the next three years.” The data from the LHC’s collisions of heavy nuclei, such as lead, will be used to study the properties of a very hot and dense subatomic material called the quark-gluon plasma.
From 9to5Google, Nov. 15, 2018: The LHC’s massive physics experiments will require an estimated 50 to 100 times more computing capacity than is available today. Google finds the challenge exciting and has already been working with Fermilab and Brookhaven National Laboratory to store and analyze data from the LHC using Google Compute Engine.
From the Pittsburgh Supercomputing Center, Oct. 10, 2018: Fermilab’s Dirk Hufnagel is quoted in this piece on the Pittsburgh Supercomputing Center now supplying computation for the LHC. Fermilab scientists working on the CMS experiment, in collaboration with the Open Science Grid, have begun analyzing LHC data using PSC’s Bridges supercomputer.