Secrets of the universe may lie in an old gold mine in South Dakota

If bison could talk …

I used to feed the bison back in the day. We used to have two herds. One was where the current herd lives now, and the other was across the street. For that second herd, there were no trees or anything, so when the bison would shed their winter fur, they would rub against the fence posts to get the old fur off. They could actually destroy a fence. So a co-worker and I had to put up a scratch post, a big post with a big steel cable.

When we started setting up this post, the bison were way in the corner of the field. We’re sitting there working, and all of a sudden we heard this grunt. We look and the bull’s right there. Other bison are surrounding us. And I said to the other guy, “Let’s go.” He said, “Oh, they’re not going to mess with you.” I said, “I know, because I’m leaving. If you want to go, we can go.”

He didn’t want to go, so I got in the truck and left. They didn’t mess with him.

I’d seen how young bulls behaved before. Once a guy had left a tractor idling as we were going to lunch. The young bull was disturbed by it, and he kept charging it, and he finally hit it. He popped that tractor tire like it was a balloon. He almost knocked it over.

At Fermilab, tape libraries house data from particle physics and astrophysics experiments. Photo: Reidar Hahn

VHS may have vanished, and cassettes are no longer cool, but tape is still on top when it comes to particle physics data: Fermilab stores over 100 petabytes of data, equivalent to 1,300 years of HD TV, on tape cartridges.
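That comparison is easy to sanity-check. Assuming a typical HD stream of about 20 megabits per second (an illustrative figure, not one from the lab), 100 petabytes does indeed come out to roughly 1,300 years of continuous video:

```python
# Back-of-the-envelope check of the "1,300 years of HD TV" comparison.
# The ~20 Mbit/s HD bitrate is an assumed typical value, not a Fermilab figure.
PETABYTE = 1e15                      # bytes
hd_bytes_per_second = 20e6 / 8       # ~20 Mbit/s HD stream, in bytes per second

seconds = 100 * PETABYTE / hd_bytes_per_second
years = seconds / (365.25 * 24 * 3600)
print(f"{years:,.0f} years of HD video")   # ~1,268 years, close to 1,300
```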

But why is tape, which is generally considered obsolete for music or movies, the go-to for storing all this data?

“Tape is the safest medium you can have,” said Stu Fuess, a senior scientist working in computing at Fermilab. “With tape, a machine can’t crash and cause you to lose your data.”

Fermilab’s seven tape libraries — three at the laboratory’s Feynman Computing Center and four at its Grid Computing Center — have the capacity to hold 10,000 tapes each, adding up to 600 petabytes of data storage. That leaves a lot of storage capacity that isn’t in use — yet.
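Dividing that headline capacity across the slots gives a sense of scale per cartridge. The per-tape figure below is inferred from the article's numbers, not quoted by the lab:

```python
# Implied average capacity per cartridge, inferred from the numbers above.
libraries = 7
tapes_per_library = 10_000
total_capacity_pb = 600

slots = libraries * tapes_per_library                 # 70,000 cartridges
tb_per_tape = total_capacity_pb * 1_000 / slots       # petabytes -> terabytes
print(f"{slots:,} slots, ~{tb_per_tape:.1f} TB per cartridge")   # ~8.6 TB
```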

“One challenge of data storage is that we don’t ever really want to throw data away — sometimes experimenters will reanalyze data years later to find something they never thought to look for,” Fuess said.

And now, increasingly complex particle detectors are accumulating ever larger amounts of data, growth Fuess called “almost exponential.”

Fermilab’s expedition into the intensity frontier — the realm of physics that uses highly intense particle beams to search for new physics — requires sophisticated detectors and a lot of data, enough to say whether an observation signals a fundamental particle or is just a fluke.

“Particle physics is a statistical science, especially the intensity frontier,” Fuess said. “The more data you can accumulate, the more statistical power you can add to your measurements.”
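Behind that quote is the familiar square-root law of counting statistics: the relative uncertainty on a count of N events shrinks as 1/√N, so collecting four times the data only halves the error bar. A minimal illustration, with made-up event counts:

```python
import math

# Relative statistical uncertainty of a counting measurement scales as 1/sqrt(N):
# quadrupling the data set halves the error bar. Event counts are illustrative.
for n_events in (100, 10_000, 1_000_000):
    rel_uncertainty = 1 / math.sqrt(n_events)
    print(f"N = {n_events:>9,}: ~{rel_uncertainty:.1%} statistical uncertainty")
```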

In addition to 40 petabytes of data from intensity frontier experiments such as Fermilab’s neutrino experiments MicroBooNE and NOvA, the lab stores 40 petabytes of data from the CMS detector at the Large Hadron Collider in Switzerland. The legacy Tevatron experiments, CDF and DZero, each contribute another 10 petabytes. That brings the total to 100 petabytes of active data storage on tape. A few extra petabytes of data from the Dark Energy Survey are housed in the Fermilab tape repositories, too, along with an unlikely data neighbor: genomic research from the Simons Foundation Autism Research Initiative.

One of tape’s biggest drawbacks is speed, even though tape libraries are fully automated and served by robotic retrieval arms.

“When you want to access a file, you have to find a free tape drive — the robot’s got to find the tape and put it in the tape drive,” said Gene Oleynik, a computing services manager in charge of data movement and storage at Fermilab. “All this communication has to happen to access data, and this happens on the order of minutes,” which, compared to the fractions of a second it takes to retrieve data from disk, is pretty slow going.

To speed up this process for high-demand files, about 35 petabytes of the data on tape also have copies on disk, which offers much faster, though not instantaneous, file access. Disks don’t have to be physically retrieved and mounted by robots, and they can be read nonsequentially, more efficiently than tape.
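Conceptually, the disk layer works like a cache in front of the tape archive: reads are served from disk when a copy exists there, and otherwise trigger a slow robotic recall that also stages the file onto disk. A minimal sketch of that tiering logic, not Fermilab's actual software:

```python
class TieredStorage:
    """Toy model of a disk cache in front of a tape archive (illustrative only)."""

    def __init__(self):
        self.disk_cache = {}   # filename -> data; fast, limited capacity
        self.tape = {}         # filename -> data; slow robotic retrieval

    def read(self, filename):
        if filename in self.disk_cache:      # cache hit: effectively instant
            return self.disk_cache[filename]
        # Cache miss: a robot must find the cartridge, mount it in a free
        # drive, and wind to the file -- minutes in a real library.
        data = self.tape[filename]
        self.disk_cache[filename] = data     # stage onto disk for next time
        return data
```

A real staging layer would also evict cold files when the disk pool fills; the sketch omits that.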

Eventually disk systems may overtake tape if they become a cheaper data storage option, but disks can be unreliable.

“Tapes usually don’t fail. You should be able to keep a tape for 30 years and it’ll retain the data. But the problem with disks is, they do fail,” Oleynik said.

A disk system would have to store redundant pieces of files across many different disks to prevent data loss, but duplicating data means more data to store, making the system less cost-effective.
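The overhead is easy to quantify. Plain replication multiplies the raw storage required, while erasure coding, a common alternative that spreads parity blocks across disks, adds a smaller fraction; the parameters below are generic examples, not anything Fermilab has specified:

```python
# Raw storage required by two common redundancy schemes (example parameters).
data_pb = 100                                   # logical data to protect, in PB

replication_raw = data_pb * 3                   # 3-way replication: 200% overhead

data_blocks, parity_blocks = 10, 4              # 10+4 erasure coding: 40% overhead
erasure_raw = data_pb * (data_blocks + parity_blocks) / data_blocks

print(f"3x replication:    {replication_raw:.0f} PB raw")   # 300 PB
print(f"10+4 erasure code: {erasure_raw:.0f} PB raw")       # 140 PB
```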

For particle physics data, disks probably can’t make tape obsolete; even a disk-based system would likely keep tape as a backup, just in case. Until a cost-effective storage medium arrives that can match tape’s reliability, it looks like tape is sticking around.

New result rivals precision of cosmic microwave background measurements, supports view that dark matter and dark energy make up most of the cosmos


Map of dark matter made from gravitational lensing measurements of 26 million galaxies in the Dark Energy Survey. The map covers about 1/30th of the entire sky and spans several billion light-years in extent. Red regions have more dark matter than average, blue regions less dark matter. Image: Chihway Chang of the Kavli Institute for Cosmological Physics at the University of Chicago and the DES collaboration

Imagine planting a single seed and, with great precision, being able to predict the exact height of the tree that grows from it. Now imagine traveling to the future and snapping photographic proof that you were right.

If you think of the seed as the early universe, and the tree as the universe the way it looks now, you have an idea of what the Dark Energy Survey (DES) collaboration has just done. In a presentation today at the American Physical Society Division of Particles and Fields meeting at the U.S. Department of Energy’s (DOE) Fermi National Accelerator Laboratory, DES scientists will unveil the most accurate measurement ever made of the present large-scale structure of the universe.

These measurements of the amount and “clumpiness” (or distribution) of dark matter in the present-day cosmos were made with a precision that, for the first time, rivals that of inferences from the early universe by the European Space Agency’s orbiting Planck observatory. The new DES result (the tree, in the above metaphor) is close to “forecasts” made from the Planck measurements of the distant past (the seed), allowing scientists to understand more about the ways the universe has evolved over 14 billion years.

“This result is beyond exciting,” said Scott Dodelson of Fermilab, one of the lead scientists on this result. “For the first time, we’re able to see the current structure of the universe with the same clarity that we can see its infancy, and we can follow the threads from one to the other, confirming many predictions along the way.”

Composite picture of stars over the Cerro Tololo Inter-American Observatory in Chile. Photo: Reidar Hahn

Most notably, this result supports the theory that 26 percent of the universe is in the form of mysterious dark matter and that space is filled with an also-unseen dark energy, which makes up 70 percent of the cosmos and is causing its accelerating expansion.

Paradoxically, it is easier to measure the large-scale clumpiness of the universe in the distant past than it is to measure it today. In the first 400,000 years following the Big Bang, the universe was filled with a glowing gas, the light from which survives to this day. Planck’s map of this cosmic microwave background radiation gives us a snapshot of the universe at that very early time. Since then, the gravity of dark matter has pulled mass together and made the universe clumpier over time. But dark energy has been fighting back, pushing matter apart. Using the Planck map as a start, cosmologists can calculate precisely how this battle plays out over 14 billion years.
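For readers who want the equation behind that tug-of-war: in standard cosmology the growth of clumpiness is governed by the linear growth equation for the matter density contrast δ, in which gravity acts as the source and the expansion rate H, boosted at late times by dark energy, acts as friction. This is the textbook form, shown purely as illustration:

$$\ddot{\delta} + 2H(t)\,\dot{\delta} - 4\pi G\,\bar{\rho}_m\,\delta = 0$$

More dark energy means a larger H at late times, more friction, and thus less clumpiness today, which is exactly what comparing the Planck “seed” to the DES “tree” tests.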

“The DES measurements, when compared with the Planck map, support the simplest version of the dark matter/dark energy theory,” said Joe Zuntz of the University of Edinburgh, who worked on the analysis. “The moment we realized that our measurement matched the Planck result within 7 percent was thrilling for the entire collaboration.”

The primary instrument for DES is the 570-megapixel Dark Energy Camera, one of the most powerful in existence, able to capture digital images of light from galaxies eight billion light-years from Earth. The camera was built and tested at Fermilab, the lead laboratory on the Dark Energy Survey, and is mounted on the National Science Foundation’s 4-meter Blanco telescope, part of the Cerro Tololo Inter-American Observatory in Chile, a division of the National Optical Astronomy Observatory. The DES data are processed at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

Scientists on DES are using the camera to map an eighth of the sky in unprecedented detail over five years. The fifth year of observation will begin in August. The new results released today draw from data collected only during the survey’s first year, which covers 1/30th of the sky.

“It is amazing that the team has managed to achieve such precision from only the first year of their survey,” said National Science Foundation Program Director Nigel Sharp. “Now that their analysis techniques are developed and tested, we look forward with eager anticipation to breakthrough results as the survey continues.”

DES scientists used two methods to measure dark matter. First, they created maps of galaxy positions as tracers, and second, they precisely measured the shapes of 26 million galaxies to directly map the patterns of dark matter over billions of light-years, using a technique called gravitational lensing.

This image of the NGC 1398 galaxy was taken with the Dark Energy Camera. This galaxy lives in the Fornax cluster, roughly 65 million light-years from Earth. It is 135,000 light-years in diameter, just slightly larger than our own Milky Way galaxy, and contains more than a billion stars. Image: Dark Energy Survey

To make these ultraprecise measurements, the DES team developed new ways to detect the tiny lensing distortions of galaxy images, an effect not even visible to the eye, enabling revolutionary advances in understanding these cosmic signals. In the process, they created the largest guide to spotting dark matter in the cosmos ever drawn (see image). The new dark matter map is 10 times the size of the one DES released in 2015 and will eventually be three times larger than it is now.
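Why can a distortion “not even visible to the eye” be measured at all? Because intrinsic galaxy shapes are random while the lensing shear is coherent, averaging many shapes beats the noise down as 1/√N. A minimal sketch of that arithmetic, with illustrative numbers (a per-galaxy shape scatter of ~0.3 and a ~1 percent shear are typical textbook values, not DES measurements):

```python
import math

# Weak lensing in one line of arithmetic: per-galaxy shapes are noisy but random,
# while the lensing shear is coherent, so averaging N galaxies wins as 1/sqrt(N).
shape_noise = 0.3      # per-galaxy ellipticity scatter (illustrative)
shear = 0.01           # typical cosmic shear signal (illustrative)

for n_galaxies in (1, 10_000, 26_000_000):
    noise = shape_noise / math.sqrt(n_galaxies)
    print(f"N = {n_galaxies:>10,}: signal-to-noise ~ {shear / noise:.2f}")
```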

“It’s an enormous team effort and the culmination of years of focused work,” said Erin Sheldon, a physicist at the DOE’s Brookhaven National Laboratory, who co-developed the new method for detecting lensing distortions.

These results and others from the first year of the Dark Energy Survey will be released today online and announced during a talk by Daniel Gruen, NASA Einstein fellow at the Kavli Institute for Particle Astrophysics and Cosmology at DOE’s SLAC National Accelerator Laboratory, at 5 p.m. Central time. The talk is part of the APS Division of Particles and Fields meeting at Fermilab and will be streamed live.

The results will also be presented by Kavli fellow Elisabeth Krause of the Kavli Institute for Particle Astrophysics and Cosmology at SLAC at the TeV Particle Astrophysics Conference in Columbus, Ohio, on Aug. 9; and by Michael Troxel, postdoctoral fellow at the Center for Cosmology and AstroParticle Physics at Ohio State University, at the International Symposium on Lepton Photon Interactions at High Energies in Guangzhou, China, on Aug. 10. All three of these speakers are coordinators of DES science working groups and made key contributions to the analysis.

“The Dark Energy Survey has already delivered some remarkable discoveries and measurements, and they have barely scratched the surface of their data,” said Fermilab Director Nigel Lockyer. “Today’s world-leading results point forward to the great strides DES will make toward understanding dark energy in the coming years.”

The Dark Energy Survey is a collaboration of more than 400 scientists from 26 institutions in seven countries. Funding for the DES Projects has been provided by the U.S. Department of Energy Office of Science, U.S. National Science Foundation, Ministry of Science and Education of Spain, Science and Technology Facilities Council of the United Kingdom, Higher Education Funding Council for England, ETH Zurich for Switzerland, National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, Kavli Institute of Cosmological Physics at the University of Chicago, Center for Cosmology and AstroParticle Physics at Ohio State University, Mitchell Institute for Fundamental Physics and Astronomy at Texas A&M University, Financiadora de Estudos e Projetos, Fundação Carlos Chagas Filho de Amparo à Pesquisa do Estado do Rio de Janeiro, Conselho Nacional de Desenvolvimento Científico e Tecnológico and Ministério da Ciência e Tecnologia, Deutsche Forschungsgemeinschaft, and the collaborating institutions in the Dark Energy Survey, the list of which can be found at www.darkenergysurvey.org/collaboration.

Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @Fermilab.

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

It was in August 1972 that Fermilab published its first experimental results. What else happened in the waning summer days?

The three buildings in the foreground are part of the Antiproton Source.

Aug. 16, 1983: Antiproton Source groundbreaking
In 1981, the lab began work on the design for the Antiproton Source, a key part of the plan to operate the Tevatron as a proton-antiproton collider. The lab broke ground on the Antiproton Source on Aug. 16, 1983. It would be completed in 1985, and the first antiprotons would circulate in the Tevatron on Oct. 12, 1985.

This is one of the earliest photos taken with the 30-inch bubble chamber. This image was captured on June 12, 1972.

Aug. 21, 1972: First published experimental results
E-141, the Study of pp Interactions in the 30-inch Hydrogen Bubble Chamber, was the first experiment at the lab to have its results published. The experimenters submitted their paper, titled “Charged Particle Multiplicity Distribution from 200 GeV pp Interactions,” to Physical Review Letters on July 18, 1972, and it appeared in print on Aug. 21.

This event display depicts a CDF top quark event from Tevatron Run I.

Aug. 31, 1992: Tevatron Collider Run I begins
Tevatron Collider Run I, which began on Aug. 31, 1992, was the first Tevatron run with two collider detectors. It ended on Feb. 20, 1996.