At CERN’s Large Hadron Collider, as many as 40 million particle collisions occur within the span of a single second inside the CMS particle detector’s more than 80 million detection channels. These collisions create an enormous digital footprint, even after computers winnow it to the most meaningful data. The simple act of retrieving information can mean battling bottlenecks.
CMS physicists at the U.S. Department of Energy’s Fermi National Accelerator Laboratory, which stores a large portion of LHC data, are now experimenting with the use of NVMe, or nonvolatile memory express, solid-state technology to determine the best way to access stored files when scientists need to retrieve them for analysis.
The trouble with terabytes
The results of the CMS experiment at CERN have the potential to help answer some of the biggest open questions in physics, such as why there is more matter than antimatter in the universe and whether there are more than three spatial dimensions.
Before scientists can answer such questions, however, they need to access the collision data recorded by the CMS detector, much of which was built at Fermilab. Data access is by no means a trivial task. Without online data pruning, the LHC would generate 40 terabytes of data per second, enough to fill the hard drives of 80 typical laptop computers every second. An automated selection process keeps only the most interesting collisions, trimming the number of saved events from 40 million per second to about 1,000 per second.
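The scale of that pruning is easy to check with a few lines of arithmetic. This is only a back-of-envelope sketch; the 500-gigabyte laptop-drive capacity is our assumption, chosen to match the comparison above.

```python
# Back-of-envelope check of the figures above. The raw data rate and event
# counts come from the text; the laptop-drive size is an assumed 500 GB.
RAW_RATE_TB_PER_S = 40          # untriggered LHC data rate, terabytes/second
LAPTOP_DRIVE_TB = 0.5           # assumed typical laptop hard drive, terabytes

laptops_per_second = RAW_RATE_TB_PER_S / LAPTOP_DRIVE_TB
print(f"{laptops_per_second:.0f} laptop drives filled per second")  # 80

EVENTS_PER_S = 40_000_000       # collisions per second
KEPT_PER_S = 1_000              # events surviving the automated selection
rejection = EVENTS_PER_S / KEPT_PER_S
print(f"selection keeps 1 in {rejection:,.0f} events")  # 1 in 40,000
```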

When scientists need to access the stored files to perform analyses, a long robotic arm descends from the ceiling, selects a tape, and transfers the data it stores to a hard drive. Photo: Reidar Hahn, Fermilab
“We care about only a fraction of those collisions, so we have a sequence of selection criteria that decide which ones to keep and which ones to throw away in real time,” said Fermilab scientist Bo Jayatilaka, who is leading the NVMe project.
Still, even with selective pruning, tens of thousands of terabytes of data from the CMS detector alone have to be stored each year. Not only that, but to ensure that none of the information ever gets lost or destroyed, two copies of each file have to be saved. One copy is stored in its entirety at CERN, while the other copy is split between partnering institutions around the world. Fermilab is the main designated storage facility in the U.S. for the CMS experiment, with roughly 40% of the experiment’s data files stored on tape.
A solid-state solution
The Feynman Computing Center at Fermilab houses three large data libraries filled with rows upon rows of magnetic tapes that store data from Fermilab’s own experiments, as well as from CMS. Combined, Fermilab’s tape libraries have roughly enough capacity to store 13,000 years’ worth of HD TV footage.
“We have racks full of servers that have hard drives on them, and they are the primary storage medium that scientists are actually reading and writing data to and from,” Jayatilaka said.
But hard drives — which have been used as storage devices in computers for the last 60 years — are limited in how quickly they can load data into applications. Each drive reads data with a mechanical head that must physically seek to the right spot on a spinning disk, and that single point of access caps how fast information can flow. Scientists are investigating new types of technology to speed up the process.
To that end, Fermilab recently installed a single rack of servers full of solid-state NVMe drives at its Feynman Computing Center to speed up particle physics analyses.
Solid-state drives in general use compact electrical circuits, rather than moving parts, to transfer data quickly. NVMe is an advanced interface for solid-state drives that can move up to 4,000 megabytes per second. To put that into perspective, the average hard drive tops out at around 150 megabytes per second, making solid state the obvious choice if speed is the main goal.
But hard drives haven’t been relegated to antiquity just yet. What they lack in speed, they make up for in storage capacity. A typical solid-state drive today holds about 500 gigabytes, roughly the minimum capacity you’d find on a modern hard drive. Determining whether Fermilab should replace more of its hard-drive storage with solid-state drives will therefore require a careful analysis of costs and benefits.
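A quick sketch shows what those throughput figures mean in practice. Only the two transfer rates come from the text; the 1-terabyte dataset size is an illustrative assumption.

```python
# Rough comparison of sequential-read time for a 1-terabyte dataset, using
# the throughput figures quoted above. Real drives vary; this is an
# illustration, not a benchmark.
HDD_MB_PER_S = 150       # typical hard-drive sequential throughput
NVME_MB_PER_S = 4_000    # NVMe throughput from the text

dataset_mb = 1_000_000   # assumed 1 TB dataset, expressed in megabytes

hdd_hours = dataset_mb / HDD_MB_PER_S / 3600
nvme_hours = dataset_mb / NVME_MB_PER_S / 3600
print(f"HDD:  {hdd_hours:.1f} h")   # ~1.9 hours
print(f"NVMe: {nvme_hours:.2f} h")  # ~0.07 hours, roughly 27x faster
```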
Undertaking an analysis
When researchers analyze their data using large computer servers or supercomputers, they typically do so by sequentially retrieving portions of that data from storage, a task well-suited for hard drives.
“Up until now, we’ve been able to get away with using hard drives in high-energy physics because we tend to handle millions of events by analyzing each event one at a time,” Jayatilaka said. “So at any given time, you’re asking for only a few pieces of data from each individual hard drive.”

In an attempt to speed up analyses in high-energy physics research, Fermilab recently installed a single rack of servers full of solid-state NVMe drives. Photo: Bo Jayatilaka, Fermilab
But newer techniques are changing the way scientists analyze their data. Machine learning, for example, is becoming increasingly common in particle physics, especially for the CMS experiment, where this technology is responsible for the automated selection process that keeps only the small fraction of data scientists are interested in studying.
Instead of accessing small portions of data once, machine learning algorithms need to access the same pieces of data repeatedly, whether they are stored on a hard drive or a solid-state drive. That wouldn’t be much of a problem if only a few processors were trying to read the data, but in high-energy physics calculations, thousands of processors vie for the same data simultaneously. With traditional hard drives, that contention quickly creates bottlenecks, and the end result is slower computing times.
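A toy model makes the contention concrete. The random-read rates below are order-of-magnitude assumptions for illustration, not measurements from Fermilab’s systems.

```python
# Toy model of the bottleneck described above: N processes each issue small
# random reads against a single drive, which serves them one at a time.
# The IOPS figures are illustrative assumptions, not measured values.
HDD_IOPS = 200        # random reads/second a spinning disk can serve
NVME_IOPS = 500_000   # random reads/second an NVMe drive can serve

def wait_seconds(processes: int, reads_each: int, iops: float) -> float:
    """Total time for all reads if the drive serves requests serially."""
    return processes * reads_each / iops

n, reads = 1_000, 100
print(f"HDD:  {wait_seconds(n, reads, HDD_IOPS):,.0f} s")   # 500 s
print(f"NVMe: {wait_seconds(n, reads, NVME_IOPS):,.1f} s")  # 0.2 s
```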
Fermilab researchers are currently testing NVMe technology for its ability to reduce the number of these data bottlenecks.
The future of computing at Fermilab
Fermilab’s storage and computing power are much more than just a powerhouse for the CMS experiment. The CMS computing R&D effort is also setting the foundations for the success of the upcoming High-Luminosity LHC program and enabling the international, Fermilab-hosted Deep Underground Neutrino Experiment, both of which will start taking data in the late 2020s.
Jayatilaka and his team’s work will also allow physicists to prioritize where NVMe drives should be primarily located, whether at Fermilab or at other LHC partner institutions’ storage facilities.
With the new servers in hand, the team is exploring how to deploy the new solid-state technology in the existing computing infrastructure at Fermilab.
The CMS experiment and scientific computing at Fermilab are supported by the DOE Office of Science.
Fermilab is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.
All summer long, progress on preparing the Fermilab site for the construction of the Long-Baseline Neutrino Facility has been proceeding at a healthy clip. Now, as summer winds down, that site prep is nearing completion.
LBNF provides the infrastructure that houses and supports the international Deep Underground Neutrino Experiment, hosted by Fermilab. DUNE will comprise a giant neutrino detector a mile underground at the Sanford Underground Research Facility in South Dakota and a smaller one located on the Fermilab site in Illinois. The development of the latter, known as the “near site,” involves clearing an area currently dense with accelerator-related utilities and other infrastructure.
The site prep work involved clearing the site of conflicting infrastructure: relocating utilities such as power and communications, realigning a small stream called Indian Creek, and replacing an accelerator cooling pond with a cooling tower. The work started last winter, and most of it was completed during the annual summer maintenance shutdown of the Main Injector. The remainder will be done before the lab’s accelerator complex turns on again, keeping the entire effort on schedule to finish before the end of the year.
View the gallery below for a sample of the busy summer the crews have had preparing Fermilab for LBNF.
The Long-Baseline Neutrino Facility is supported by the Department of Energy Office of Science.
Kate Sienkiewicz is the LBNF Near Site Conventional Facilities project manager at Fermilab.
From light bulbs to cell phones, all electronic devices in everyday life rely on the flow of electrons to function. Just as scientists use meters to describe the length of an object or seconds to measure the passage of time, they use amperes, or amps, to quantify electric current — the rate at which electric charge moves through a circuit.
In everyday life, you can safely use a hair dryer or toaster without knowing exactly how many electrons are flowing through it every second. But researchers on the frontiers of physics must have a precise definition of the ampere to detect when experiments unexpectedly deviate from theoretical predictions.
“As technology progresses, lots of measurements that we could not do before become available, and then you can have extremely high-precision measurements,” said Fermilab scientist Javier Tiffenberg. “So you want to have a definition of the unit that is much more precise than whatever you are trying to measure.”
For decades, scientists have struggled to achieve the necessary precision for the ampere. But now, a device called the skipper CCD, developed by Tiffenberg and his collaborators at Fermilab and the Lawrence Berkeley National Laboratory Microsystems Lab, could spark an advance in measurement science.

Fermilab scientist Javier Tiffenberg examines a skipper CCD in a cleanroom at Fermilab’s Silicon Detector Facility. Photo: Reidar Hahn, Fermilab
Counting electrons, one by one
Two current-carrying wires exert a force on each other that depends on the distance between the wires as well as the value of the current. Until recently, 1 amp was defined to be the current that would cause two infinitely long wires placed parallel to each other a meter apart to experience a force of exactly 0.2 millionths of a newton per meter of length.
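That old definition can be written out and checked directly. The sketch below uses the standard formula for the force per unit length between two long parallel wires, F/L = μ₀I₁I₂/(2πd), with the magnetic constant μ₀ exact under the pre-2019 SI.

```python
import math

# Pre-2019 definition of the ampere in formula form: the force per unit
# length between two long parallel wires is F/L = mu0 * I1 * I2 / (2*pi*d).
MU0 = 4 * math.pi * 1e-7   # magnetic constant, exact under the old SI

def force_per_meter(i1_amps: float, i2_amps: float, d_meters: float) -> float:
    """Force per meter of length between two long parallel wires."""
    return MU0 * i1_amps * i2_amps / (2 * math.pi * d_meters)

# Two 1-amp wires a meter apart: 2e-7 N/m, i.e. exactly the 0.2 millionths
# of a newton per meter quoted in the definition above.
print(force_per_meter(1.0, 1.0, 1.0))
```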
But that definition troubled the scientific community — an experiment requiring infinitely long wires is impossible to perform. Other base units also had unsatisfactory definitions: For example, the kilogram was defined to be the mass of a particular metal cylinder in a vault near Paris. So in 2019, the General Conference on Weights and Measures adopted new definitions for four of the seven base units of the International System of Units, or SI, including the kilogram and the ampere.
“Now the idea is to link all the units to fundamental constants of the universe,” Tiffenberg said. “In the case of the ampere, the link is done through the charge of the electron.”
Yet one problem remains: The charge of a single electron is minuscule. Under the new definition, the current generated by a single electron passing a given point each second is exactly 1.602176634 × 10⁻¹⁹ amps, or less than 2 tenths of a billionth of a billionth of an amp. Many experts say that an instrument to calibrate the definition of the ampere must generate a current of at least 1 microamp, or 1 millionth of an amp, while counting individual electrons — trillions of them every second. No such device exists yet.
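Turning the new definition around gives the electron-counting rate such an instrument would need. This is a direct calculation from the exact charge value quoted above.

```python
E_CHARGE = 1.602176634e-19   # electron charge in coulombs, exact in the 2019 SI

# One amp is one coulomb per second, so dividing a current in amps by the
# electron charge gives the number of electrons passing per second.
def electrons_per_second(current_amps: float) -> float:
    return current_amps / E_CHARGE

microamp = 1e-6
# About 6.24 trillion electrons every second at the 1-microamp threshold:
print(f"{electrons_per_second(microamp):.3e} electrons/s")
```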
Enter Fermilab’s skipper charge-coupled device, which builds on improvements made in the 1990s to standard CCDs. Pixels connected in a grid store the electrons produced when light hits them. Then the electrons are shuttled to a detector that measures the charge contained in each pixel.
Widely used in digital cameras and scientific instruments, standard CCDs can measure the charge in each pixel only once before losing the information. Skipper CCDs, on the other hand, can measure each pixel repeatedly at a rate of 100 times per millisecond. This allows skipper CCDs, unlike standard ones, to count individual electrons.
“Because these measurements are independent, just by taking many, many samples and averaging them, you are able to reduce the uncertainty on how much charge was sitting in the pixel,” explained Tiffenberg, who won the 2021 New Horizons in Physics Prize and the 2020 URA Early Career Award for his work on skipper CCDs. “In principle, you can reduce this to a number that is arbitrarily small. We have done this to uncertainty levels of 0.06 electrons.”
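The statistics behind that claim: averaging N independent reads shrinks the uncertainty by a factor of √N. In the sketch below, the 3-electron single-read noise is an assumed illustrative value; only the 0.06-electron figure comes from the text.

```python
import math

# Averaging N independent reads of one pixel shrinks the noise by sqrt(N).
# The 3-electron single-read noise is an assumed illustrative value; only
# the 0.06-electron uncertainty level comes from the text.
single_read_noise = 3.0   # assumed rms noise of one read, in electrons

def noise_after(n_samples: int, sigma1: float = single_read_noise) -> float:
    """Uncertainty on the pixel charge after averaging n_samples reads."""
    return sigma1 / math.sqrt(n_samples)

# With 2,500 reads per pixel, a 3-electron noise drops to 0.06 electrons:
print(noise_after(2_500))  # 0.06
```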
Tiffenberg and his collaborators began the skipper CCD project with the goal of detecting dark matter, the mysterious substance that makes up about 85 percent of the matter in the universe. Some theories predict that collisions with lightweight dark matter particles would cause individual electrons to recoil, which a skipper CCD could detect with extreme precision.
Now that the ampere is defined in terms of single electrons, researchers at Fermilab are working to scale up skipper CCD technology to reach the current needed for a successful calibration of the definition.
“I’m not saying this is going to be easy, but there’s no theoretical limitation,” said Guillermo Fernandez Moroni, a postdoc at Fermilab working on skipper CCDs.

Unlike standard charge-coupled devices, the skipper CCD pictured here can detect individual electrons. As a result, skipper CCD technology could enable precise calibrations for measurements of electric current. Photo: Miguel Sofo Haro, Bariloche Atomic Center
Building a larger current source
In the 2019 redefinition of the SI units, the General Conference on Weights and Measures provided three candidate methods to calibrate the ampere. The most promising hinges on single-electron transistors, which, like skipper CCDs, can count individual electrons. But the current produced by today’s SETs falls far short of the threshold for a precise calibration.
The first generation of skipper CCDs can already produce a larger current than SETs. Tiffenberg and Moroni expect that future refinements will allow them to build skipper CCDs that generate a current as large as 1 billionth of an amp while still counting individual electrons.
To reach the 1 microamp threshold from there, researchers would need to link together a thousand skipper CCDs. This, too, seems feasible to Tiffenberg. His team’s prototype dark matter detector contains around a hundred skipper CCDs. While SETs must be chilled to a few thousandths of a degree above absolute zero, skipper CCDs can operate at minus 133 degrees Celsius — a balmy temperature by comparison. As a result, scaling up the latter is more practical.
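A quick check of the scaling argument, using only the figures quoted above:

```python
# Checking the scaling argument above: if each future skipper CCD can source
# about 1 nanoamp, how many devices reach the 1-microamp threshold?
PER_CCD_AMPS = 1e-9      # projected current per skipper CCD (from the text)
THRESHOLD_AMPS = 1e-6    # calibration threshold (from the text)

ccds_needed = THRESHOLD_AMPS / PER_CCD_AMPS
# 1,000 devices, about 10x the ~100-CCD dark matter prototype:
print(round(ccds_needed))
```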
In the meantime, Fermilab researchers are exploring a host of other uses for skipper CCDs.
“We have been adding a lot of people to this effort, and now our days are full of meetings. Every day is a different subject around the skipper,” said Moroni, who received the 2019 URA Tollestrup Award for his skipper CCD research. “Monday and Wednesday are dark matter, Wednesday and Friday are neutrinos, Tuesday is quantum, Thursday is astronomy and satellites. It is very exciting.”
Tiffenberg agrees that skipper CCDs hold great promise for measurement science and physics research more broadly.
“The applications seem to just pop out everywhere, so it’s a lot of fun,” he said.
Fermilab research on particle detector technology is supported by the DOE Office of Science.