The U.S. Department of Energy recently announced $13.7 million in funding for nine research projects that will advance the state of the art in computer science and applied mathematics. One of the recipients of this funding, Fermilab scientist Nhan Tran, will lead a project exploring methods for programming custom hardware accelerators for streaming data compression.
From UKRI, Feb. 22, 2021: UKRI scientists are developing vital software to exploit the large data sets collected by next-generation high-energy physics experiments. The new software will be able to crunch the masses of data that the LHC at CERN and next-generation neutrino experiments, such as the Fermilab-hosted Deep Underground Neutrino Experiment, will produce this decade.
The prodigious amount of data produced at the Large Hadron Collider presents a major challenge for data analysis. Coffea, a Python package developed by Fermilab researchers, speeds up computation and helps scientists work more efficiently. Around a dozen international LHC research groups now use Coffea, which draws on big data techniques used outside physics.
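To give a sense of the columnar analysis style Coffea enables, here is a minimal sketch of a Coffea processor that histograms the dimuon invariant mass. It assumes a Coffea 0.7.x release with the NanoEvents/awkward-array interface and the companion hist package; the input file name, muon pT cut, and histogram binning are illustrative placeholders rather than part of any published analysis.

```python
import awkward as ak
import hist
from coffea import processor
from coffea.nanoevents import NanoEventsFactory, NanoAODSchema


class DimuonProcessor(processor.ProcessorABC):
    """Histogram the dimuon invariant mass from NanoAOD-format events."""

    def process(self, events):
        # Columnar selection: keep muons above a pT cut, then keep events
        # with exactly two opposite-charge muons.
        muons = events.Muon[events.Muon.pt > 20]
        dimu = muons[ak.num(muons) == 2]
        dimu = dimu[ak.sum(dimu.charge, axis=1) == 0]
        mass = (dimu[:, 0] + dimu[:, 1]).mass

        # Fill a 1D histogram of the invariant mass.
        h = hist.Hist.new.Reg(
            120, 0, 120, name="mass", label="dimuon mass [GeV]"
        ).Double()
        h.fill(mass=ak.to_numpy(mass))
        return {"dimuon_mass": h}

    def postprocess(self, accumulator):
        return accumulator


if __name__ == "__main__":
    # Hypothetical NanoAOD file; replace with a real sample path.
    events = NanoEventsFactory.from_root(
        "nanoaod_sample.root", schemaclass=NanoAODSchema
    ).events()
    print(DimuonProcessor().process(events)["dimuon_mass"])
```

In practice such a processor is handed to one of Coffea's executors, which distributes the per-file work across many cores or a batch system; the sketch above runs it directly on a single file for clarity.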
From MIT News, Aug. 19, 2019: A new prototype machine-learning technology co-developed by Fermilab and MIT scientists speeds Large Hadron Collider data processing by up to 175 times over traditional methods.
A new machine learning technology tested by Fermilab scientists and collaborators can spot specific particle signatures among an ocean of LHC data in the blink of an eye, much faster than standard methods. Sophisticated and swift, its performance gives a glimpse into the game-changing role machine learning will play in making future discoveries in particle physics as data sets get bigger and more complex.
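The sketch below illustrates, in broad strokes, the kind of task such a system performs: scoring batches of events with a trained classifier and keeping only those above a threshold. The feature count, network architecture, threshold, and random stand-in data are all placeholders; the technology described in the article runs trained models on specialized inference hardware rather than a generic Keras model on a CPU.

```python
import numpy as np
import tensorflow as tf

N_FEATURES = 16   # hypothetical per-event inputs (jet/lepton kinematics, etc.)
THRESHOLD = 0.9   # hypothetical score cut for flagging an event as interesting

# A compact dense network; small models keep per-event inference latency low.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_FEATURES,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])


def select_events(batch: np.ndarray) -> np.ndarray:
    """Return the rows of `batch` whose classifier score exceeds THRESHOLD."""
    scores = model.predict(batch, verbose=0).ravel()
    return batch[scores > THRESHOLD]


# Score a batch of simulated events (random stand-ins for real detector data).
events = np.random.rand(10_000, N_FEATURES).astype(np.float32)
kept = select_events(events)
print(f"kept {len(kept)} of {len(events)} events")
```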