In February and March, three batches of copper plates arrived at Fermilab and were rushed into storage 100 meters underground. The copper had been mined in Finland, rolled into plates in Germany and shipped across land and sea to the lab — all within 120 days. In the quest to detect dark matter, the mysterious substance making up 85% of the matter in the universe, every day that the copper spent above ground mattered.
“At the surface of the Earth, we’re in a shower of cosmic rays,” said Fermilab scientist Dan Bauer.
When these high-energy particles originating from space strike a copper atom, they can knock protons and neutrons out of its nucleus, converting it into an atom of cobalt-60. Cobalt-60 is radioactive, meaning that it is unstable and spontaneously decays into other particles. The minuscule number of copper atoms converted into cobalt has no impact on everyday uses for copper. But Bauer and others working on the Super Cryogenic Dark Matter Search must take drastic steps to ensure the copper they use is as pure as possible.

The SNOBOX — the device designed to detect dark matter particles for the SuperCDMS experiment — will use nesting copper cans similar to this one, which was used in the progenitor CDMS experiment at Soudan. Photo: Dan Bauer, Fermilab
The latest in a lineage of similar experiments, SuperCDMS will search for dark matter at SNOLAB, an underground laboratory near Sudbury, Ontario, Canada. The copper plates will eventually take the shape of six oversized soda cans arranged like nesting dolls. The innermost can will house germanium and silicon devices designed to detect hypothesized weakly interacting massive particles, or WIMPs, especially those with less than 10 times the mass of a proton. The vacuum-sealed outermost can will measure a little over a meter in diameter. The whole contraption, dubbed the SNOBOX, will be linked via a set of copper stems to a special refrigerator that will cool the detectors to a tiny fraction of a degree above absolute zero.
At such frigid temperatures, thermal vibrations are so small that a WIMP could leave a detectable signal upon colliding with an atom.
But “you’re looking for a needle in a haystack with dark matter,” Bauer said. “The best you’re going to get is maybe a few events per year.”
Meanwhile, ordinary matter particles flying through the SuperCDMS detectors could produce extraneous signatures, known as background, that would drown the signals from dark matter interactions.

The ultrapure copper plates will be shaped into nested cans, as shown in this cutaway of the SNOBOX design. The hexagonal holes at the center will hold the dark matter detectors. Image: SuperCDMS collaboration
Burying SuperCDMS two kilometers underground and encasing the SNOBOX in layers of lead, plastic and water will screen out almost all the unwanted particles in the environment. But nothing stands between the copper cans and the detectors. And while copper’s superior ability to transport heat makes it ideal for cooling the detectors, any radioactive impurities in the metal would emit background particles.
That brings us back to cobalt-60.
“The bottom line is that the longer the copper sits around on the surface being exposed to cosmic rays, the more cobalt-60 is created,” explained Fermilab’s Matthew Hollister, the manager of the SuperCDMS cryogenics system. “So part of the background budget for the experiment includes a time limit for surface exposure.”
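That exposure budget follows from the standard activation-and-decay relation: cobalt-60 builds up toward a saturation level set by the cosmic-ray production rate, with a half-life of about 5.27 years. A minimal sketch of the buildup curve (the saturation normalization is illustrative; the real production rate depends on altitude, shielding and the copper itself):

```python
import math

CO60_HALF_LIFE_DAYS = 5.27 * 365.25  # cobalt-60 half-life, ~5.27 years

def co60_activity_fraction(surface_days: float) -> float:
    """Fraction of the saturation cobalt-60 activity reached after
    `surface_days` of cosmic-ray exposure at the surface.

    A(t) = A_sat * (1 - exp(-lambda * t)), with lambda = ln(2) / half-life.
    """
    lam = math.log(2) / CO60_HALF_LIFE_DAYS
    return 1.0 - math.exp(-lam * surface_days)

# The ~120-day trip from Finnish mine to underground storage keeps the
# copper at only a few percent of its saturation cobalt-60 level.
print(f"{co60_activity_fraction(120):.1%}")  # → 4.2%
```

Because the buildup is nearly linear at short times, every extra month on the surface adds cobalt-60 at close to the full production rate, which is why the schedule was so tight.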
Cobalt-60 is not the only impurity to worry about. Radioactive isotopes of uranium, thorium and potassium occur naturally in Earth’s crust, so the SuperCDMS team had to buy copper sourced from a mine with as little of these metals as possible. Nonradioactive impurities matter, too — they can decrease the copper’s ability to conduct heat, thus making it harder to keep the detectors cold. In total, the copper for SuperCDMS must be over 99.99% pure with fewer than 0.1 parts per billion of radioactive impurities.
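In concrete terms, those thresholds are vanishingly small masses. A quick unit conversion (the per-kilogram framing is ours; the 99.99% and 0.1-ppb figures are from the requirements above, with ppb taken by mass):

```python
# Purity requirements for the SuperCDMS copper, expressed per kilogram.
copper_g = 1000.0  # one kilogram of copper, in grams

# "Over 99.99% pure": at most 0.01% of anything else.
max_total_impurity_g = copper_g * 1e-4      # 0.1 g per kg

# "Fewer than 0.1 parts per billion of radioactive impurities."
max_radioactive_g = copper_g * 0.1e-9       # 1e-7 g = 0.1 micrograms per kg
```

So each kilogram of metal may carry at most a tenth of a gram of impurities overall, and only a tenth of a microgram of radioactive ones.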
Between intrinsic impurities and those introduced through cutting, rolling and transporting the copper, the plates now sitting underground at Fermilab are not quite pristine.
“A lot of the process is not something that we have direct control over,” Hollister said. “Some of it really is a shot in the dark as to what we’re going to end up with at the end of the day.”

After traversing the Atlantic Ocean, the copper plates for SuperCDMS were delivered to a plant in South Bend, Indiana, before being brought to Fermilab for storage underground. Photo: Luke Martin, Fermilab
After receiving the plates, the researchers sent samples to the U.S. Department of Energy’s Pacific Northwest National Laboratory for detailed testing to quantify the remaining impurities. Soon, the plates will leave Fermilab for fabrication, and the cobalt clock will be ticking once again until the cans reach their home at SNOLAB.
“The last step before we take them underground will be to spray them with an acid etch that will take off some tens of microns of the surface,” Bauer said.
A solution of hydrogen peroxide and diluted hydrochloric acid will remove any surface impurities that have accumulated in the manufacturing process. And a weak citric acid solution will preserve copper’s high thermal conductivity by protecting it from oxidizing over the course of the experiment.
The SuperCDMS collaboration plans to begin collecting data in 2022. All in all, this iteration of the experiment is aiming for background levels 100 times lower than its predecessor, thanks in large part to the purity of the copper. With the increased sensitivity, researchers hope to spot any low-mass WIMPs that might be in the neighborhood.
“This program’s been quite a long time in development, so it’s good to see it starting to come together,” Hollister said. “The SNOBOX is really the last major piece, so we’re looking forward to getting this thing installed and getting it operational as soon as we can.”
SuperCDMS research on dark matter is supported by DOE’s Office of Science and the National Science Foundation, as well as the Canada Foundation for Innovation and SNOLAB.
Fermilab is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.
This is a version of an article originally published by Argonne National Laboratory.
A limiting factor in modern physics experiments is the precision at which scientists can measure important values, such as the magnetic field within a detector. Scientists at the U.S. Department of Energy’s Argonne National Laboratory and their collaborators have developed a unique facility to calibrate field measurement devices and test their limits inside powerful magnetic fields.
The facility features a solenoid magnet from a former magnetic resonance imaging scanner originally housed at a San Francisco hospital. The magnet produces a maximum field of four teslas — over 400 times the strength of a refrigerator magnet. Its large opening, originally intended to hold a patient during an MRI scan, gives scientists ample space to position devices and machinery inside the magnetic field. The field produced by the magnet is also exceptionally uniform and stable, a requirement for calibrating measurement devices to the ultrahigh precision necessary for many particle and nuclear physics experiments.
“We have worked with several researchers, at Argonne and from other institutions, that need a strong magnetic field and large bore to test their research,” said Peter Winter, physicist and group leader in Argonne’s High Energy Physics Division. “Scientists bring their devices and electronics, and we provide our magnet, expertise and infrastructure to help automate the processes and ensure the success of the tests.”
The team is seeking new users to continue to broaden the facility’s application portfolio.

This panorama view of the 4-tesla solenoid facility shows Argonne’s Midhat Farooq and Joe Grange aligning an NMR calibration setup (left of the magnet), Ran Hong and students improving the calibration motion control system (right of the magnet) and David Flay analyzing current NMR calibration data. Photo: Argonne National Laboratory
Calibration station
A primary application of Argonne’s solenoid test facility is calibrating and cross-calibrating measurement probes to achieve high precision and to add layers of consistency between similar experiments across the world.
Originally, Argonne scientists acquired the magnet to test and calibrate several probes developed by the University of Massachusetts for measuring the magnetic field in the Muon g-2 experiment currently taking place at DOE’s Fermi National Accelerator Laboratory. The test facility allowed the scientists to achieve precise field measurements down to several parts per billion — like measuring the circumference of the Earth down to about two inches.
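The analogy checks out: Earth's circumference is about 40,075 kilometers, and one part per billion of that is roughly four centimeters, or about an inch and a half. A quick sketch:

```python
EARTH_CIRCUMFERENCE_M = 40_075_000  # equatorial circumference, meters

def ppb_of_circumference_inches(ppb: float) -> float:
    """Length corresponding to `ppb` parts per billion of Earth's circumference."""
    meters = EARTH_CIRCUMFERENCE_M * ppb * 1e-9
    return meters / 0.0254  # meters -> inches

# 1 ppb of the circumference is ~1.6 inches, so "several ppb" is inch-scale.
print(f"{ppb_of_circumference_inches(1):.2f} in")  # → 1.58 in
```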
Precise measurement of the field in the experiment is crucial because the magnetic field strength is a major factor in the ultimate determination of “g,” a property of the muon whose value will either confirm present theories of particle physics or point to the existence of undiscovered particles.
“This facility has enabled the magnetic field team on Muon g-2 to meet strict goals on the experiment by reducing uncertainties and improving the robustness of our measurements,” said David Kawall, a physicist and professor from the University of Massachusetts. “To the best of my knowledge, there are no peer facilities in the world, and having access to these tools at Argonne has been essential to the success of the magnetic field effort on Muon g-2.”
Future g-2 experiments will be conducted in Japan at the Japan Proton Accelerator Research Complex, or J-PARC, of the High Energy Accelerator Research Organization, or KEK. The Japanese collaborators, led by Ken-ichi Sasaki, are using the facility to cross-calibrate their magnetic field probes with the ones used at Fermilab.
“By ensuring our probes all read the same values in the same magnetic field, we are adding certainty to the measurements coming from both g-2 experiments,” said Sasaki, who is a professor at KEK and subsection leader of the cryogenic section in J-PARC.
Another muon experiment, the Muonium Spectroscopy Experiment Using Microwaves, or MuSEUM, will contribute to the Japanese g-2 experiment by precisely measuring the mass ratio of the muon to the electron, a value also included in the g-2 determination.
The experiment at KEK in Japan uses nuclear magnetic resonance calibration probes very similar to those of the g-2 experiment. The development of the probe for MuSEUM has been led by Toya Tanaka, a graduate student at the University of Tokyo who uses the solenoid facility to calibrate the experiment’s probes. The collaboration between Japanese and U.S. scientists will ensure that both g-2 experiments and the MuSEUM experiment have consistent field measurements.
Helium and Hall probe development
Through a partnership with Fermilab’s Thomas Strauss, another Japanese group, led by Norihito Ohuchi and Yasushi Arimoto from KEK, is using the facility to calibrate their own probe — called a Hall probe — for the upcoming SuperKEKB experiment.
Although less precise than the NMR probes used in the current g-2 experiments, Hall probes can measure not only the magnitude and gradient of a magnetic field, but also its direction.
SuperKEKB, a recently upgraded, three-kilometer electron-positron collider, accelerates electrons and positrons to very nearly the speed of light. The scientists will use the measurements from particles created in collisions to investigate a potential explanation of the matter-antimatter asymmetry in the universe.
The SuperKEKB experiment involves five superconducting solenoid magnets in the beam-colliding region, and the solenoid fields have a heavy influence on the efficiency of the collisions. To improve that efficiency, the team will use the calibrated Hall probe data to construct more precise solenoid field profiles.
“Using Argonne’s test facility, we believe we can improve the accuracy of the Hall probes by one order of magnitude,” said Ohuchi, who is a professor at KEK and leader of the superconducting magnet group in the Accelerator Laboratory. “This will enable us to map the complex magnetic fields produced by the SuperKEKB magnets and improve the quality of the beams.”
Another upcoming experiment at Fermilab, called Mu2e, will also employ Hall probes for field mapping. The experiment uses a solenoid magnet like Argonne’s, but bigger, to measure muon interactions. The reigning Standard Model of particle physics allows muons to decay in a specific way, but for this experiment, scientists will search for a “forbidden” interaction whose occurrence would violate the Standard Model and point to new physics.
The ability of Hall probes to measure the direction of a field makes them the preferred probes for the Mu2e experiment, but the added capability necessitates even more quality control. Argonne scientists have taken responsibility for field mapping in the Mu2e experiment, and they are using the test facility to calibrate the probes.
“If you have a slight misalignment between the direction from which the probe reads its measurement and where the field is actually pointing, the measurement can veer away from the true value very quickly,” said Bob Wagner, leader of the field mapping team at Argonne. “Our magnet allows us to align the axes of the probes with the field and with each other.”
As Hall probes become more accurate and precise with the help of Argonne’s test facility, a new probe — one that uses helium — is making its debut. A group of researchers from the University of Michigan, led by Professor Tim Chupp and Midhat Farooq, developed the new calibration probe to act as an additional check for measuring fields.
The helium isotope in the probe, helium-3, is an inert gas that behaves differently from the water used in traditional probes and has the potential for greater accuracy.
“We used the Argonne test magnet to cross-calibrate our probe with two water probes, including one with the same design as the UMass probe, and found agreement with high precision, confirming that any effects we had not considered are pretty small,” Chupp said. “Our next step is cross calibration of the UMass probe with an improved helium-3 probe that will be even more precise.”
Farooq and team published a paper in Physical Review Letters in June 2020 on the success of their helium probe.
A growing list of applications
Since accepting its first group of external users — scientists from Stony Brook University who tested a magnetic cloak to shield electronics in experiments — the facility’s applications and user base have grown significantly.
In addition to probe calibration, the magnet has aided in the testing and development of a variety of experimental equipment. Argonne’s Junqi Xie, a scientist in the lab’s Physics Division, uses the magnet to develop detectors that operate in high magnetic fields for photosensing applications. The detectors will have future applications in the Electron-Ion Collider to be built at DOE’s Brookhaven National Laboratory.
Fermilab recently used the magnet to test the laser metrology systems it uses to measure distances and align equipment in experiments. Researchers tested the ability of several laser trackers, which can measure distances at the submillimeter level, to remain accurate in the presence of high magnetic fields.
“The facility has also been helpful for training the next generation of scientists,” Kawall said, “and the international collaborations formed will be of enduring benefit.”
For inquiries about using the magnet for research and development, contact Peter Winter at winterp@anl.gov.
Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.
The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.
At CERN’s Large Hadron Collider, as many as 40 million particle collisions occur within the span of a single second inside the CMS particle detector’s more than 80 million detection channels. These collisions create an enormous digital footprint, even after computers winnow it to the most meaningful data. The simple act of retrieving information can mean battling bottlenecks.
CMS physicists at the U.S. Department of Energy’s Fermi National Accelerator Laboratory, which stores a large portion of LHC data, are now experimenting with the use of NVMe, or nonvolatile memory express, solid-state technology to determine the best way to access stored files when scientists need to retrieve them for analysis.
The trouble with terabytes
The results of the CMS experiment at CERN have the potential to help answer some of the biggest open questions in physics, such as why there is more matter than antimatter in the universe and whether there are more than three physical dimensions.
Before scientists can answer such questions, however, they need to access the collision data recorded by the CMS detector, much of which was built at Fermilab. Data access is by no means a trivial task. Without online data pruning, the LHC would generate 40 terabytes of data per second, enough to fill the hard drives of 80 regular laptop computers. An automated selection process keeps only the important, interesting collisions, trimming the number of saved events from 40 million per second to just 1,000.
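The figures in that paragraph imply a rejection factor of 40,000: of every 40 million collisions per second, only about 1,000 events survive. A quick sanity check (the 500-gigabyte laptop drive size is inferred from the 80-laptop comparison, not stated in the text):

```python
RAW_RATE_TB_PER_S = 40          # unpruned LHC data rate, terabytes per second
COLLISIONS_PER_S = 40_000_000   # collisions per second in the CMS detector
KEPT_PER_S = 1_000              # events surviving the automated selection

# 40 TB/s spread across 80 laptops implies 0.5 TB (500 GB) per laptop drive.
laptop_drive_tb = RAW_RATE_TB_PER_S / 80

# Online selection keeps only 1 event in every 40,000.
rejection_factor = COLLISIONS_PER_S // KEPT_PER_S
print(laptop_drive_tb, rejection_factor)  # → 0.5 40000
```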

When scientists need to access the stored files to perform analyses, a long robotic arm descends from the ceiling, selects a tape, and transfers the data it stores to a hard drive. Photo: Reidar Hahn, Fermilab
“We care about only a fraction of those collisions, so we have a sequence of selection criteria that decide which ones to keep and which ones to throw away in real time,” said Fermilab scientist Bo Jayatilaka, who is leading the NVMe project.
Still, even with selective pruning, tens of thousands of terabytes of data from the CMS detector alone have to be stored each year. Not only that, but to ensure that none of the information ever gets lost or destroyed, two copies of each file have to be saved. One copy is stored in its entirety at CERN, while the other copy is split between partnering institutions around the world. Fermilab is the main designated storage facility in the U.S. for the CMS experiment, with roughly 40% of the experiment’s data files stored on tape.
A solid-state solution
The Feynman Computing Center at Fermilab houses three large data libraries filled with rows upon rows of magnetic tapes that store data from Fermilab’s own experiments, as well as from CMS. If you were to combine all of Fermilab’s tape storage capacity, you’d have roughly the capability to store the equivalent of 13,000 years’ worth of HD TV footage.
“We have racks full of servers that have hard drives on them, and they are the primary storage medium that scientists are actually reading and writing data to and from,” Jayatilaka said.
But hard drives — which have been used as storage devices in computers for the last 60 years — are limited in the amount of data they can load into applications in a given time. This is because they retrieve data from spinning platters through a single mechanical access point, so reads must wait their turn. Scientists are investigating ways to implement new types of technology to help speed up the process.
To that end, Fermilab recently installed a single rack of servers full of solid-state NVMe drives at its Feynman Computing Center to speed up particle physics analyses.
Generally, solid-state drives use compact electrical circuits to transfer data quickly. NVMe is an advanced type of solid-state drive that can handle up to 4,000 megabytes per second. To put that into perspective, the average hard drive tops out at around 150 megabytes per second, making solid state the obvious choice if speed is your main goal.
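At those headline figures, the gap is dramatic. Reading a single terabyte sequentially, under the idealized assumption that the drive sustains its peak rate with no seek or protocol overhead:

```python
NVME_MB_PER_S = 4_000   # NVMe throughput cited above
HDD_MB_PER_S = 150      # typical hard-drive throughput cited above

def read_time_minutes(size_mb: float, rate_mb_per_s: float) -> float:
    """Idealized sequential read time, in minutes, at a fixed transfer rate."""
    return size_mb / rate_mb_per_s / 60

one_tb_mb = 1_000_000
print(f"NVMe: {read_time_minutes(one_tb_mb, NVME_MB_PER_S):.1f} min, "
      f"HDD: {read_time_minutes(one_tb_mb, HDD_MB_PER_S):.1f} min")
# → NVMe: 4.2 min, HDD: 111.1 min
```

Real workloads with scattered, small reads widen the gap further, since a hard drive's mechanical seeks eat into that 150 MB/s while solid-state drives barely slow down.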
But hard drives haven’t been relegated to antiquity just yet. What they lack in speed, they make up for in storage capacity. At present, a typical solid-state drive holds about 500 gigabytes, roughly the minimum capacity you’d find on a modern hard drive. Determining whether Fermilab should replace more of its hard drive storage with solid-state drives will thus require a careful analysis of costs and benefits.
Undertaking an analysis
When researchers analyze their data using large computer servers or supercomputers, they typically do so by sequentially retrieving portions of that data from storage, a task well-suited for hard drives.
“Up until now, we’ve been able to get away with using hard drives in high-energy physics because we tend to handle millions of events by analyzing each event one at a time,” Jayatilaka said. “So at any given time, you’re asking for only a few pieces of data from each individual hard drive.”

In an attempt to speed up analyses in high-energy physics research, Fermilab recently installed a single rack of servers full of solid state drives called NVMe. Photo: Bo Jayatilaka, Fermilab
But newer techniques are changing the way scientists analyze their data. Machine learning, for example, is becoming increasingly common in particle physics, especially for the CMS experiment, where this technology is responsible for the automated selection process that keeps only the small fraction of data scientists are interested in studying.
But machine learning algorithms, rather than reading small portions of data once, need to access the same pieces of data repeatedly — whether they are stored on a hard drive or a solid-state drive. That wouldn’t be much of a problem if only a few processors were trying to read the data, but in high-energy physics calculations, thousands of processors vie for the same data simultaneously. With traditional hard drives, that contention quickly creates bottlenecks and slows computing down.
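The contention problem shows up even in a toy fair-share model: a drive's fixed bandwidth gets divided among however many processes are reading from it, so per-reader throughput collapses as readers pile on. A sketch (reader counts and rates are illustrative, not measurements from the Fermilab systems):

```python
def per_reader_mb_per_s(drive_mb_per_s: float, n_readers: int) -> float:
    """Ideal fair-share bandwidth each reader sees on one drive, ignoring
    seek overhead (which makes spinning disks even worse in practice)."""
    return drive_mb_per_s / n_readers

# A 150 MB/s hard drive shared by 1,000 analysis processes leaves each
# with a trickle; a 4,000 MB/s NVMe drive in the same scenario does
# roughly 27 times better per reader.
hdd_share = per_reader_mb_per_s(150, 1_000)     # 0.15 MB/s per reader
nvme_share = per_reader_mb_per_s(4_000, 1_000)  # 4.0 MB/s per reader
```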
Fermilab researchers are currently testing NVMe technology for its ability to reduce the number of these data bottlenecks.
The future of computing at Fermilab
Fermilab’s storage and computing power are much more than just a powerhouse for the CMS experiment. The CMS computing R&D effort is also setting the foundations for the success of the upcoming High-Luminosity LHC program and enabling the international, Fermilab-hosted Deep Underground Neutrino Experiment, both of which will start taking data in the late 2020s.
The work by Jayatilaka and his team will also help physicists decide where NVMe drives would be most effective, whether at Fermilab or at other LHC partner institutions’ storage facilities.
With the new servers in hand, the team is exploring how to deploy the new solid-state technology in the existing computing infrastructure at Fermilab.
The CMS experiment and scientific computing at Fermilab are supported by the DOE Office of Science.