3 new studies indicate a conflict at the heart of cosmology

Physicists used MINERvA, a Fermilab neutrino experiment, to measure the proton’s size and structure with a neutrino-scattering technique.

For the first time, particle physicists have been able to precisely measure the proton’s size and structure using neutrinos. With data gathered from thousands of neutrino-hydrogen scattering events collected by MINERvA, a particle physics experiment at the U.S. Department of Energy’s Fermi National Accelerator Laboratory, physicists have found a new lens for exploring protons. The results were published today in the scientific journal Nature.

This measurement is also important for analyzing data from experiments that aim to measure the properties of neutrinos with great precision, including the future Deep Underground Neutrino Experiment, hosted by Fermilab.


One of two magnetic focusing horns used in the beamline at Fermilab that produces intense neutrino beams for MINERvA and other neutrino experiments. Photo: Reidar Hahn, Fermilab

“The MINERvA experiment has found a novel way for us to see and understand proton structure, critical both for our understanding of the building blocks of matter and for our ability to interpret results from the flagship DUNE experiment on the horizon,” said Bonnie Fleming, Fermilab deputy director for science and technology.

Protons and neutrons are the particles that make up the nucleus, or core, of an atom. Understanding their size and structure is essential to understanding particle interactions. But it is very difficult to measure things at subatomic scales. Protons — about a femtometer, or 10⁻¹⁵ meters, in diameter — are too small to examine with visible light. Instead, scientists use particles accelerated to high energies, whose short wavelengths can probe these minuscule scales.
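As a rough guide to why higher energies resolve smaller scales (a standard back-of-the-envelope estimate, not a figure from the MINERvA paper), the de Broglie relation links a particle’s wavelength to its momentum; for a highly relativistic beam this reduces to

\lambda = \frac{h}{p} \approx \frac{hc}{E} \approx \frac{1.24\ \mathrm{GeV\cdot fm}}{E}

so a beam with an energy around a GeV has a wavelength of roughly a femtometer, comparable to the proton itself.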

Starting in the 1950s, particle physicists used electrons to measure the size and structure of the proton. Electrons are electrically charged, which means they interact with the electromagnetic force distribution in the proton. By shooting a beam of accelerated electrons at a target containing lots of atoms, physicists can observe how the electrons interact with the protons and thus how the electromagnetic force is distributed in a proton. Performing increasingly precise experiments, physicists have now measured the proton’s electric charge radius to be 0.877 femtometers.
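In such electron-scattering experiments, the measured quantity is the proton’s electric form factor G_E(Q^2) as a function of the squared momentum transfer; by the standard convention, the charge radius is defined from the slope of the form factor at zero momentum transfer,

\langle r_E^2 \rangle = -6 \left. \frac{dG_E(Q^2)}{dQ^2} \right|_{Q^2 = 0}

with G_E(0) = 1.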

The MINERvA collaboration achieved its groundbreaking result by using particles called neutrinos in lieu of electrons. Specifically, they used antineutrinos, the antimatter partners of neutrinos. Unlike electrons, neutrinos and antineutrinos have no electric charge; they only interact with other particles via the weak nuclear force. This makes them sensitive to the “weak charge” distribution inside a proton.

However, neutrinos and antineutrinos rarely interact with protons — hence the name weak force. To collect enough scattering events to make a statistically meaningful measurement, MINERvA scientists needed to smash a lot of antineutrinos into a lot of protons.

Fortunately, Fermilab is home to the world’s most powerful high-energy neutrino and antineutrino beams. And MINERvA contains a lot of protons. Located 100 meters underground at Fermilab’s campus in Batavia, Illinois, MINERvA was designed to perform high-precision measurements of neutrino interactions on a wide variety of materials, including carbon, lead and plastic.

The MINERvA experiment has found a novel way for us to see and understand proton structure.

– Bonnie Fleming

To measure the proton structure with high precision, scientists ideally would send neutrinos or antineutrinos into a very dense target made only of hydrogen, which contains protons but no neutrons. That is experimentally challenging, if not impossible to achieve. Instead, the MINERvA detector contains hydrogen that is closely bonded to carbon in the form of a plastic called polystyrene. But no one had ever tried to separate hydrogen data from carbon data.

“If we were not optimists, we would say it’s impossible,” said Tejin Cai, a postdoctoral researcher at York University and lead author on the Nature paper. Cai performed this research for his doctorate at the University of Rochester. “The hydrogen and carbon are chemically bonded together, so the detector sees interactions on both at once. But then, I realized that the very nuclear effects that made scattering on carbon complicated also allowed us to select hydrogen and would allow us to subtract off the carbon interactions.”

Cai and Arie Bodek, a professor at the University of Rochester, proposed to Cai’s Ph.D. advisor, Kevin McFarland, that they use MINERvA’s polystyrene target to measure antineutrinos scattering off protons in hydrogen and carbon nuclei. Together, they developed algorithms to subtract the large carbon background by identifying neutrons produced from antineutrinos scattering off carbon atoms.
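A minimal sketch of this kind of template-based background subtraction is shown below in Python. It is illustrative only, not the collaboration’s actual code; the discriminating variable, the sideband boundary and the toy event counts are assumptions chosen for the example.

# Illustrative sketch (not MINERvA's actual analysis code) of subtracting a
# carbon background from a hydrogen-plus-carbon sample using a simulated
# carbon template. "dtheta" is a hypothetical discriminating angle: hydrogen
# events cluster at small values, carbon events are smeared to larger ones.
import numpy as np

rng = np.random.default_rng(42)

# Toy "data": a small hydrogen signal on top of a large carbon background.
hydrogen = np.abs(rng.normal(0.0, 5.0, 5_000))    # degrees, assumed shape
carbon = rng.uniform(0.0, 90.0, 100_000)          # degrees, assumed shape
data = np.concatenate([hydrogen, carbon])

# Independent carbon-only simulation used as the background template.
carbon_mc = rng.uniform(0.0, 90.0, 200_000)

bins = np.linspace(0.0, 90.0, 46)
data_hist, _ = np.histogram(data, bins)
carbon_hist, _ = np.histogram(carbon_mc, bins)

# Normalize the carbon template in a sideband where hydrogen is negligible.
sideband = bins[:-1] >= 30.0
scale = data_hist[sideband].sum() / carbon_hist[sideband].sum()

# Subtract the scaled template; what remains at small angles is the
# hydrogen-enhanced signal.
signal_hist = data_hist - scale * carbon_hist
print(f"estimated hydrogen events: {signal_hist[~sideband].sum():.0f}")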

“When Tejin and Arie first suggested trying this analysis, I thought it would be too difficult, and I wasn’t encouraging. Tejin persevered and proved it could be done,” said McFarland, a professor at the University of Rochester. “One of the best parts of being a teacher is having a student who learns enough to prove you wrong.”

Cai and his collaborators used MINERvA to record more than a million antineutrino interactions over the course of three years. They determined that about 5,000 of these were antineutrino-hydrogen scattering events.


MINERvA used a collider detector design and a high-intensity beam to study neutrino reactions with five different nuclei, creating the first self-contained comparison of interactions in different elements. Credit: Fermilab

With these data, they inferred the proton’s weak charge radius to be 0.73 ± 0.17 femtometers. It is the first statistically significant measurement of the proton’s radius using neutrinos. Within its uncertainties, the result aligns with the electric charge radius measured with electron scattering.
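As a quick consistency check, treating the weak-radius uncertainty as Gaussian and dominant, the two radii differ by less than one standard deviation:

\frac{0.877 - 0.73}{0.17} \approx 0.9\,\sigma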

The result shows that physicists can use this neutrino-scattering technique to see the proton through a new lens. It is similar to how observing the sky at different wavelengths, as with the infrared-sensitive James Webb Space Telescope, allows astronomers to study the universe in new ways.

The result also provides a better understanding of the proton’s structure. This can be used to predict the behavior of groups of protons in an atom’s nucleus. If physicists start with a better measurement of neutrino-proton interactions, they can make better models of neutrino-nucleus interactions. This will improve the performance of other neutrino experiments, such as NOvA at Fermilab and T2K in Japan.

For Cai, one of the most exciting things about the result is showing that, “Even with a general particle detector, we can do things we didn’t imagine we could do.”

Co-spokesperson for the MINERvA collaboration Deborah Harris, scientist at Fermilab and professor of physics at York University, said, “When we proposed MINERvA, we never thought we’d be able to extract measurements from the hydrogen in the detector. Making this work required great performance from the detector, creative analysis ideas and years of running in the most intense high-energy neutrino beam on the planet.”

This work is supported by the DOE Office of Science, National Science Foundation, Coordination for the Improvement of Higher Education Personnel in Brazil, Brazilian National Council for Scientific and Technological Development, Mexican National Council of Science and Technology, Basal Project in Chile, Chilean National Commission for Scientific and Technological Research, Chilean National Fund for Scientific and Technological Development, Peruvian National Council for Science, Technology and Technological Innovation, Research Management Directorate at the Pontifical Catholic University of Peru, National University of Engineering in Peru, Polish National Science Center and UK Science and Technology Facilities Council.


Editor’s note: This article was originally written by the University of Chicago and was adapted with Fermilab information.

Sometimes to know what the matter is, you have to find it first.

When the universe began, matter was flung outward and gradually formed planets, stars and galaxies. By carefully assembling a map of that matter today, scientists can try to understand the forces that shaped the evolution of the universe.

A group of scientists, including several with the U.S. Department of Energy’s Fermi National Accelerator Laboratory, has released one of the most precise measurements ever made of how matter is distributed across the universe today.

Combining data from two major telescope surveys of the universe, the Dark Energy Survey and the South Pole Telescope, the analysis involved more than 150 researchers and was published Jan. 31 as a set of three articles, featured as an Editors’ Suggestion, in Physical Review D.

Scientists have released a new survey of all the matter in the universe, using data taken by the Dark Energy Survey in Chile (above) and the South Pole Telescope. Photo: Andreas Papadopoulos, University of Portsmouth

“One of the original motivations of building DES and SPT was to combine both sets of measurements to more powerfully constrain cosmology,” said Bradford Benson, a scientist at Fermilab and associate director of operations for SPT. “This is an exciting result because, for the first time, it demonstrates combining data from both experiments, which has interestingly found some tension between early- and late-time large-scale structure growth.”

Among other findings, the analysis indicates that matter is not as “clumpy” as we would expect, based on our current best model of the universe, which adds to a body of evidence that there may be something missing from the existing standard model of the universe.

Cooling and clumps

Since the Big Bang created all the matter in the universe in a very hot, intense few moments about 13 billion years ago, that matter has been spreading outward, cooling and clumping as it goes. Scientists are very interested in tracing the path of this matter; by seeing where all the matter ended up, they can try to recreate what happened and what forces must have been in play.

The first step is collecting enormous amounts of data with telescopes.

In this study, scientists combined data from two very different telescope surveys: the Dark Energy Survey, which surveyed the sky over six years from a mountaintop in Chile, and the South Pole Telescope, which looks for the faint traces of radiation still traveling across the sky from the first few moments of the universe.

Combining two different methods of looking at the sky reduces the chance that the results are thrown off by an error in one of the forms of measurement. “It functions like a cross-check, so it becomes a much more robust measurement than if you just used one or the other,” said University of Chicago astrophysicist Chihway Chang, one of the lead authors of the studies.

In both cases, the analysis looked at a phenomenon called gravitational lensing. As light travels across the universe, it can be slightly bent as it passes objects with lots of gravity, such as galaxies.

This method catches both regular matter and dark matter — the mysterious form of matter that has only been detected due to its effects on regular matter — because both regular and dark matter exert gravity.
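For a sense of the underlying physics, the deflection of a light ray passing a point mass M at impact parameter b is, in general relativity (a textbook result, given here only for illustration),

\alpha = \frac{4GM}{c^2 b}

Mapping many such tiny deflections across the sky is what lets the surveys reconstruct the intervening matter.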

By rigorously analyzing these two sets of data, the scientists could infer where all the matter ended up in the universe. The measurement is more precise than previous analyses; that is, it narrows down the possibilities for where this matter wound up, the authors said.

By comparing maps of the sky from the Dark Energy Survey telescope (left) with data from the South Pole Telescope and the Planck satellite (right), the team could infer how the matter is distributed. Image: Yuuki Omori, University of Chicago

The majority of the results fit perfectly with the currently accepted best theory of the universe.

But there are also signs of a crack — one that has been suggested in the past by other analyses, too.

“It seems like there are slightly less fluctuations in the current universe than we would predict assuming our standard cosmological model anchored to the early universe,” said analysis coauthor and University of Hawaii astrophysicist Eric Baxter.

That is, if you made a model incorporating all the currently accepted physical laws, then took the readings from the beginning of the universe and extrapolated them forward through time, the results would look slightly different from what we actually measure around us today.

Specifically, today’s readings find the universe is less “clumpy” — less clustered in certain areas and more evenly spread out — than the model would predict.
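In weak-lensing analyses of this kind, “clumpiness” is conventionally summarized by the parameter S_8, which combines the amplitude of matter fluctuations \sigma_8 with the matter density \Omega_m; the definition below is the standard one (whether these particular papers quote exactly this combination is an assumption here),

S_8 \equiv \sigma_8 \sqrt{\Omega_m / 0.3}

and a measured S_8 below the early-universe prediction is the “less clumpy” tension described above.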

If other studies continue to find the same results, scientists say, it may mean there is something missing from our existing model of the universe, but the results are not yet to the statistical level that scientists consider to be ironclad. That will take further study.

“Perhaps most exciting, this is just the tip of the iceberg using the data that we have. This analysis used only half of the DES survey data, and the SPT-3G camera has already made measurements roughly an order of magnitude more sensitive. In the next few years, we expect to learn much more from this method using these much more sensitive data,” said Benson.

The analysis is a landmark, as it yielded useful information from two very different telescope surveys. This is a much-anticipated strategy for the future of astrophysics, as more large telescopes come online in the coming decades, but few such joint analyses have actually been carried out yet.

University of Chicago Kavli Associate Fellow Yuuki Omori was a lead co-author for the papers: Paper I, Paper II, Paper III.

The South Pole Telescope is primarily funded by the National Science Foundation and the U.S. Department of Energy and is operated by a collaboration led by the University of Chicago. The Dark Energy Survey was an international collaboration coordinated through Fermi National Accelerator Laboratory and funded by the U.S. Department of Energy, the National Science Foundation and many institutions around the world.


Just as the sound of a guitar depends on its strings and the materials used for its body, the performance of a quantum computer depends on the composition of its building blocks. Arguably the most critical components are the devices that encode information in quantum computers.

One such device is the transmon qubit — a patterned chip made of metallic niobium layers on top of a substrate, such as silicon. Between the two materials resides an ultrathin layer that contains both niobium and silicon. The compounds of this layer are known as silicides (NbxSiy). Their impact on the performance of transmon qubits has not been well understood — until now.

The silicide research team. In the front from left to right: Mark Hersam, Michael Bedzyk, James Rondinelli and Xiezeng Lu. Back: Carlos Torres and Dominic Goronzy. Photo: SQMS Center

Silicides form when elemental niobium is deposited onto silicon during the fabrication process of a transmon qubit. They need to be well understood to make devices that reliably and efficiently store quantum information for as long as possible.

Researchers at the Superconducting Quantum Materials and Systems Center, hosted by the U.S. Department of Energy’s Fermi National Accelerator Laboratory, have discovered how silicides impact the performance of transmon qubits. Their research has been published in the APS journal Physical Review Materials.

An unexpected signal

Carlos Torres-Castanedo was analyzing the materials of a transmon qubit using X-rays when he came across a peculiar signal.

“I thought the signal came from a surface oxide, because that’s just what usually happens,” said Torres-Castanedo, a doctoral candidate in materials science at Northwestern University. “After spending a day trying to fit the data to match an oxide, the only possibility was to introduce a niobium silicide layer. When the data beautifully fit the model, I showed the results to my co-workers, and we all became excited about what this could mean for transmon qubit performance.”

The SQMS Center researchers dug deeper. They identified the types of silicides present, the thickness of the layer (typically only a few nanometers) and its physical and chemical structure. After completing these measurements, they focused on figuring out how these compounds affect the performance of qubits.

The researchers simulated different types of silicides. Not only did they find that silicides are detrimental to the performance of transmon qubits, but they also found that some are more detrimental than others.

Impact on coherence time

Qubits are the basic and fragile units of information that a quantum computer uses to perform calculations. In superconducting quantum computers, they are physically encoded in transmon devices.

Similar to a street performer plucking an A note on a guitar string and allowing the tone to ring out before it becomes obscured by street noise, quantum information in a transmon qubit exists for a limited time before it dissipates or is obscured by environmental noise. This time span is known as the coherence time. The longer the coherence time, the better the performance of the transmon qubit.
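As a rough quantitative picture (a standard characterization, not a number from this study), the probability that a transmon remains in its excited state decays approximately exponentially,

P_{\mathrm{excited}}(t) \approx e^{-t/T_1}

where T_1 is the energy-relaxation time; the coherence times quoted for transmons typically refer to T_1 or to the related dephasing time T_2.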

“This interface will never be like silicon stop, niobium start,” said SQMS Center researcher James Rondinelli, Walter Dill Scott Professor of Materials Science and Engineering at Northwestern University. “The first observation was that there is not an atomically sharp interface, but rather a compositional gradient between the silicon substrate — which is the platform for the system — and the niobium.”

With that observation, Rondinelli and his group began a detailed computational study as part of a greater SQMS Center effort to improve qubit coherence times.

Simulations with a supercomputer

With a newfound curiosity about what the presence of silicides could mean for transmon qubits, the researchers used a supercomputer at the National Energy Research Scientific Computing Center, located at the DOE’s Lawrence Berkeley National Laboratory.

Think of silicides as a thin material inside the street performer’s guitar that affects the sound of the guitar string. Researchers studying transmon qubits are essentially trying to isolate an A note and see to what extent the hidden material interferes.

Some silicides, for example, have magnetic properties that can interfere with the quantum information that rings out from the transmon qubit. The stronger the magnetism, the more the quantum information is obscured.

Through simulations, researchers found that the silicide compound Nb6Si5 does not have any magnetic properties, while Nb5Si3 introduces magnetic noise. Since silicides will always be present in transmon qubits, whether researchers like it or not, Nb6Si5 is the less detrimental option, and scientists will have to make do.

“To really push the field forward, you have to embrace a little bit of an outsider perspective to make an advancement, and we’re optimistic our multidisciplinary approach will solve this challenge.” – James Rondinelli, SQMS Center researcher

“I find it interesting how the properties of these silicides have been studied since the ’80s but have never been understood in a nanometer-sized film,” said Torres-Castanedo. “I feel proud that I was able to work alongside my fellow researchers to conduct this important study.”

These findings by themselves are significant. In the greater context of the SQMS Center’s aim to develop a state-of-the-art quantum computer, however, the results have implications that go well beyond understanding the properties of materials.

“The community who’s worked on superconducting qubits has traditionally been quantum physicists and engineers. The reason the SQMS Center has been so successful is they’ve embraced material scientists,” said Rondinelli. “To really push the field forward, you have to embrace a little bit of an outsider perspective to make an advancement, and we’re optimistic our multidisciplinary approach will solve this challenge.”

The Superconducting Quantum Materials and Systems Center at Fermilab is supported by the DOE Office of Science.

The Superconducting Quantum Materials and Systems Center is one of the five U.S. Department of Energy National Quantum Information Science Research Centers. Led by Fermi National Accelerator Laboratory, SQMS is a collaboration of 24 partner institutions — national labs, academia and industry — working together to bring transformational advances in the field of quantum information science. The center leverages Fermilab’s expertise in building complex particle accelerators to engineer multiqubit quantum processor platforms based on state-of-the-art qubits and superconducting technologies. Working hand in hand with embedded industry partners, SQMS will build a quantum computer and new quantum sensors at Fermilab, which will open unprecedented computational opportunities. For more information, please visit sqmscenter.fnal.gov.

Fermi National Accelerator Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.