Fermilab feature

Nobel-winning experiment enables Fermilab-led quantum network

On Dec. 10, 2022, the 2022 Nobel Prize in Physics was bestowed upon Alain Aspect, John F. Clauser and Anton Zeilinger at a ceremony in Stockholm, Sweden. The prestigious award recognized scientific achievements in quantum physics “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.”

The Nobel-winning research directly applies to quantum efforts at the U.S. Department of Energy’s Fermi National Accelerator Laboratory. Specifically, experiments done by Zeilinger have contributed to the development of technology behind the Illinois‐Express Quantum Network, or IEQNET, led by Fermilab.

A joint research project among Fermilab, Argonne National Laboratory, Northwestern University and Caltech, IEQNET is a metropolitan-scale quantum network testbed that uses deployed optical fiber and other currently available technology. It is one step toward developing a quantum internet — a network in which information is delivered over long distances with qubits, the units of information for quantum computers and networks.

IEQNET uses a combination of cutting-edge quantum and classical technologies to transmit quantum information. Researchers also designed it to coexist with classical networks. The design and implementation of IEQNET are outlined in a scientific paper recently published in IEEE Transactions on Quantum Engineering.

IEQNET’s quantum networking architecture relies on three planes: an infrastructure plane (connectivity of physical devices), a control plane (control of individual devices and the network as a whole), and an application plane (end-to-end networking services). Image: IEQNET

“With the design, we are describing the control functions that will automate the operation of IEQNET,” said Joaquin Chung, a research scientist in the Data Science and Learning Division at Argonne National Laboratory.

“At IEQNET, we want to develop an automated system that can make those connections without a human making the changes,” said Chung. “So things work like a production system and not like a tabletop experiment.”

The paper also explains some of the experiments that IEQNET collaborators have conducted. The results inform what needs to be built into the architecture of the network.

Chung said the paper is important because it aims to bridge the gap between physicists focused on demonstrating nascent technologies and theorists who are thinking 20 years into the future.

“With the resources we have now, can we build a system that works autonomously — and thus improves upon physics experiments and approaches what is envisioned in theory papers? That’s the importance of this engineering paper that lays out the design of IEQNET,” said Chung.

Quantum teleportation and entanglement swapping

For more than five years, IEQNET researchers and collaborators have developed and deployed quantum systems that generate and distribute entangled photons over already-existing telecommunications optical fiber.

To protect and facilitate the transfer of quantum information over a quantum network, IEQNET will use technology enabled by Nobel laureate Anton Zeilinger’s research. “We are standing on the shoulders of these Nobel winners,” said Chung.

In quantum teleportation, an entangled photon pair shared between two locations is used to transfer a qubit’s quantum state between them. Demonstrated by Zeilinger in a 1997 experiment, it is a way to transfer quantum information from one system to another without physically transferring the original qubit that encodes the information, which protects the quantum information from loss during the transfer. This matters because arbitrary unknown quantum states cannot be copied, due to the no-cloning theorem of quantum mechanics, so simply making backup copies to guard against loss is not an option. In addition, teleportation allows for extremely secure communication; if the information were intercepted or manipulated by an attacker, the entanglement would break, and the sender would know immediately.
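
To make the protocol concrete, the short numerical sketch below simulates one-qubit teleportation with plain statevector arithmetic: a sender and receiver share a Bell pair, the sender performs a joint Bell-basis measurement, and the receiver applies a correction chosen by the two classical bits of the outcome. The example state, seed and variable names are illustrative assumptions; this is a minimal pedagogical sketch, not IEQNET software.

```python
import numpy as np

# Minimal statevector sketch of one-qubit quantum teleportation (illustrative only).
rng = np.random.default_rng(0)

# Arbitrary state |psi> = alpha|0> + beta|1> to be teleported (qubit A).
alpha, beta = 0.6, 0.8j
psi = np.array([alpha, beta])

# Shared Bell pair |Phi+> = (|00> + |11>)/sqrt(2): qubit B (sender), qubit C (receiver).
bell_pair = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Three-qubit state; index = 4a + 2b + c for bits (a, b, c) of qubits (A, B, C).
state = np.kron(psi, bell_pair)

# Bell basis for the sender's joint measurement on qubits A and B.
s = 1 / np.sqrt(2)
bell_basis = {
    "Phi+": s * np.array([1, 0, 0, 1]),
    "Phi-": s * np.array([1, 0, 0, -1]),
    "Psi+": s * np.array([0, 1, 1, 0]),
    "Psi-": s * np.array([0, 1, -1, 0]),
}

# Corrections the receiver applies, keyed by the measurement outcome (two classical bits).
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
corrections = {"Phi+": np.eye(2), "Phi-": Z, "Psi+": X, "Psi-": Z @ X}

# Project onto each Bell outcome and record the receiver's (unnormalized) state.
names, probs, receiver_states = [], [], []
for name, b in bell_basis.items():
    c_state = np.zeros(2, dtype=complex)
    for a_bit in range(2):
        for b_bit in range(2):
            amp = np.conj(b[2 * a_bit + b_bit])
            c_state += amp * state[4 * a_bit + 2 * b_bit : 4 * a_bit + 2 * b_bit + 2]
    names.append(name)
    probs.append(np.vdot(c_state, c_state).real)
    receiver_states.append(c_state)

# Sample one measurement outcome (each occurs with probability 1/4) and collapse.
k = rng.choice(len(names), p=np.array(probs) / sum(probs))
collapsed = receiver_states[k] / np.sqrt(probs[k])

# The receiver's corrected qubit matches the original state (fidelity 1, up to a global phase).
recovered = corrections[names[k]] @ collapsed
fidelity = abs(np.vdot(psi, recovered)) ** 2
print(f"outcome {names[k]}, teleportation fidelity = {fidelity:.6f}")
```

Note that only two classical bits travel from sender to receiver; the original photon’s state is never copied, consistent with the no-cloning theorem.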

Importantly, said Christoph Simon, who did his doctoral work with Anton Zeilinger at the University of Vienna, Zeilinger’s experiments opened the door to multi-photon entanglement — as opposed to two-photon entanglement — which is crucial for quantum networks.

“You had to somehow overlap photons from different sources, from different pairs. That was the key conceptual and technological step they had to take,” said Simon, now a professor at the University of Calgary. “Once you can do entanglement of three or four photons, you can scale it up, and now people can do dozens.”

Zeilinger also demonstrated entanglement swapping, the entanglement of two separate photons that never came in contact with each other. This type of entanglement will be essential for developing quantum repeaters, a key technology for enabling long-distance quantum communication, which is not yet feasible. IEQNET is currently a repeaterless network, which works at the metropolitan scale; demonstrating entanglement swapping is the collaboration’s next goal.
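
Schematically, entanglement swapping follows from a standard identity (shown here for illustration, not taken from the IEQNET paper): two independent Bell pairs, photons 1 and 2 and photons 3 and 4, can be rewritten in the Bell basis of the inner photons,

$$
|\Phi^{+}\rangle_{12}\,|\Phi^{+}\rangle_{34}
= \tfrac{1}{2}\Big(
|\Phi^{+}\rangle_{14}|\Phi^{+}\rangle_{23}
+ |\Phi^{-}\rangle_{14}|\Phi^{-}\rangle_{23}
+ |\Psi^{+}\rangle_{14}|\Psi^{+}\rangle_{23}
+ |\Psi^{-}\rangle_{14}|\Psi^{-}\rangle_{23}
\Big),
$$

so a Bell-state measurement on photons 2 and 3 projects photons 1 and 4, which never interacted, into the corresponding entangled state.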

“The Silicon Prairie for quantum”

IEQNET consists of multiple sites geographically dispersed across the Chicagoland area, and each site has one or more quantum nodes. These “Q-nodes” generate or measure quantum signals, such as entangled photons, and communicate the measurement results via standard, classical signals and conventional networking processes.

The architecture of IEQNET, as described in the IEEE paper, uses software-defined networking technology to perform traditional wavelength routing and assignment between the Q-nodes. Its structure facilitates a network, rather than a series of links. It is analogous to a multi-way conference call in which any two parties can talk at any given time and still be understood, as opposed to a simple two-person phone call.
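
As a rough illustration of the kind of decision such a software-defined controller makes, the toy sketch below performs first-fit routing and wavelength assignment over a hypothetical four-node topology: a lightpath is granted only if the same channel is free on every hop. The node names, channel labels and first-fit policy are assumptions for illustration; this is not IEQNET’s control software.

```python
# Toy routing-and-wavelength-assignment sketch (illustrative; not IEQNET software).
from itertools import pairwise  # Python 3.10+

# Hypothetical topology: each tuple is one deployed fiber link.
links = {("FNAL", "ANL"), ("ANL", "StarLight"), ("StarLight", "Evanston")}
wavelengths = ["ch1", "ch2", "ch3"]           # illustrative channel labels
in_use = {link: set() for link in links}      # wavelengths already assigned per link


def edge(a, b):
    """Return the link tuple in its stored orientation."""
    return (a, b) if (a, b) in links else (b, a)


def assign_lightpath(path):
    """First-fit assignment: the same channel must be free on every hop so the
    light can traverse the transparent optical switches unmodified."""
    hops = [edge(a, b) for a, b in pairwise(path)]
    for ch in wavelengths:
        if all(ch not in in_use[hop] for hop in hops):
            for hop in hops:
                in_use[hop].add(ch)
            return ch
    return None  # blocked: no common free wavelength along this route


# Establish a lightpath from Fermilab to Northwestern's Evanston campus.
route = ["FNAL", "ANL", "StarLight", "Evanston"]
print("assigned channel:", assign_lightpath(route))
```

In IEQNET, the analogous decisions are handled by the control plane described earlier, which automates them so the network runs as a production system rather than a tabletop experiment.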

With Q-nodes at Fermilab, Argonne and Northwestern University’s Evanston campus and the proposed node at Northwestern’s downtown-Chicago campus, the metropolitan Chicago area is well poised to become a quantum hub.

IEQNET employs classical networking technology and transparent optical switches to implement network functions, such as establishing lightpaths — a lightpath is a path between two nodes in the network along which light passes unmodified — between quantum nodes via routing approaches used in transparent optical networks. IEQNET deploys transparent optical switches at Fermilab, Argonne and StarLight, an internet exchange point on Northwestern’s Chicago campus. Quantum nodes are deployed at all three locations; Fermilab has multiple nodes at two locations separated by ~2.5 km. A fiber-optic connection about 150 kilometers long runs from Fermilab to Argonne to StarLight and on to Northwestern’s Evanston campus. An additional connection has been requested between the Fermilab and StarLight locations. Image: IEQNET

“We’re establishing a high-tech attractor in the Chicagoland area,” said Panagiotis Spentzouris, head of the quantum science program at the Fermilab Quantum Institute and principal investigator of IEQNET. “We want to help create the Silicon Prairie for quantum technologies and microelectronics in the Midwest.”

IEQNET collaborators have already demonstrated record teleportation fidelity at Caltech and at the Fermilab IEQNET quantum nodes. In June 2022, IEQNET achieved record synchronization between nodes at Fermilab and Argonne, with quantum and classical signals co-propagating. This “co-existence” of quantum and classical signals — also demonstrated in 2021 for polarization entangled photons, the kind used in Zeilinger’s groundbreaking work, between Northwestern’s Evanston and Chicago campuses — is essential for the deployment of a functional quantum network.

A fiber-optic connection about 150 kilometers long already runs from Fermilab to Argonne to StarLight, an internet exchange point on Northwestern’s Chicago campus, and on to Northwestern’s Evanston campus.

“The fiber links interconnecting the sites of IEQNET are rightsized for the currently available quantum hardware. As quantum technology matures, and new hardware becomes available, the IEQNET architecture and control functions are designed to accommodate the new hardware,” said Prem Kumar, professor of information technology at the McCormick School of Engineering at Northwestern. “Therefore, we anticipate a rapid progression of advances in quantum networking.”

IEQNET’s ultimate goal is to implement the architecture design and test it using these metro-scale links.

“We’re using well-established and developed Fermilab competencies that we have acquired because of our excellence in our high-energy physics program — that is, the controls engineering, system integration, architecture — things that we know how to do because of who we are,” said Spentzouris. “We are applying that know-how to new emerging fields, such as quantum networks, that will have tremendous implications on the well-being of people and the advancement of other fields outside of high-energy physics.”

The Illinois‐Express Quantum Network is supported by the Advanced Scientific Computing Research program in the DOE Office of Science.

Fermi National Accelerator Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

Physicists used MINERvA, a Fermilab neutrino experiment, to measure the proton’s size and structure using a neutrino-scattering technique.

For the first time, particle physicists have been able to precisely measure the proton’s size and structure using neutrinos. With data gathered from thousands of neutrino-hydrogen scattering events collected by MINERvA, a particle physics experiment at the U.S. Department of Energy’s Fermi National Accelerator Laboratory, physicists have found a new lens for exploring protons. The results were published today in the scientific journal Nature.

This measurement is also important for analyzing data from experiments that aim to measure the properties of neutrinos with great precision, including the future Deep Underground Neutrino Experiment, hosted by Fermilab.

One of two magnetic focusing horns used in the beamline at Fermilab that produces intense neutrino beams for MINERvA and other neutrino experiments. Photo: Reidar Hahn, Fermilab

“The MINERvA experiment has found a novel way for us to see and understand proton structure, critical both for our understanding of the building blocks of matter and for our ability to interpret results from the flagship DUNE experiment on the horizon,” said Bonnie Fleming, Fermilab deputy director for science and technology.

Protons and neutrons are the particles that make up the nucleus, or core, of an atom. Understanding their size and structure is essential to understanding particle interactions. But it is very difficult to measure things at subatomic scales. Protons — about a femtometer, or 10⁻¹⁵ meters, in diameter — are too small to examine with visible light. Instead, scientists use particles accelerated to high energies, whose wavelengths are capable of probing minuscule scales.
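
As a back-of-the-envelope illustration (a textbook relation, not a number from the paper), a highly relativistic probe of energy $E$ has wavelength

$$
\lambda \;=\; \frac{h}{p} \;\approx\; \frac{hc}{E}, \qquad hc \approx 1240~\mathrm{MeV\cdot fm},
$$

so resolving structure at the femtometer scale of a proton requires probe energies of roughly a GeV or more, which is why accelerated particle beams are needed.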

Starting in the 1950s, particle physicists used electrons to measure the size and structure of the proton. Electrons are electrically charged, which means they interact with the electromagnetic force distribution in the proton. By shooting a beam of accelerated electrons at a target containing lots of atoms, physicists can observe how the electrons interact with the protons and thus how the electromagnetic force is distributed in a proton. Performing increasingly precise experiments, physicists have now measured the proton’s electric charge radius to be 0.877 femtometers.
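
Schematically, what these scattering experiments actually measure is a form factor $G_E(Q^2)$ describing how the proton’s charge is distributed, and the charge radius is read off from its slope at zero momentum transfer (a standard definition, stated here for context):

$$
\big\langle r_E^{2} \big\rangle \;=\; -\,6\,\frac{d G_E(Q^{2})}{d Q^{2}}\bigg|_{Q^{2}=0}.
$$

The radius quoted later from antineutrino scattering is defined analogously, with the form factor probed by the weak interaction taking the place of $G_E$.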

The MINERvA collaboration achieved its groundbreaking result by using particles called neutrinos in lieu of electrons. Specifically, they used antineutrinos, the antimatter partners of neutrinos. Unlike electrons, neutrinos and antineutrinos have no electric charge; they only interact with other particles via the weak nuclear force. This makes them sensitive to the “weak charge” distribution inside a proton.

However, neutrinos and antineutrinos rarely interact with protons — hence the name weak force. To collect enough scattering events to make a statistically meaningful measurement, MINERvA scientists needed to smash a lot of antineutrinos into a lot of protons.

Fortunately, Fermilab is home to the world’s most powerful high-energy neutrino and antineutrino beams. And MINERvA contains a lot of protons. Located 100 meters underground at Fermilab’s campus in Batavia, Illinois, MINERvA was designed to perform high-precision measurements of neutrino interactions on a wide variety of materials, including carbon, lead and plastic.

To measure the proton structure with high precision, scientists ideally would send neutrinos or antineutrinos into a very dense target made only of hydrogen, which contains protons but no neutrons. That is experimentally challenging, if not impossible to achieve. Instead, the MINERvA detector contains hydrogen that is closely bonded to carbon in the form of a plastic called polystyrene. But no one had ever tried to separate hydrogen data from carbon data.

“If we were not optimists, we would say it’s impossible,” said Tejin Cai, a postdoctoral researcher at York University and lead author on the Nature paper. Cai performed this research for his doctorate at the University of Rochester. “The hydrogen and carbon are chemically bonded together, so the detector sees interactions on both at once. But then, I realized that the very nuclear effects that made scattering on carbon complicated also allowed us to select hydrogen and would allow us to subtract off the carbon interactions.”

Cai and Arie Bodek, a professor at the University of Rochester, proposed to Cai’s Ph.D. advisor, Kevin McFarland, that they use MINERvA’s polystyrene target to measure antineutrinos scattering off protons in hydrogen and carbon nuclei. Together, they developed algorithms to subtract the large carbon background by identifying neutrons produced when antineutrinos scatter off carbon atoms.
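
The arithmetic of that subtraction is simple even though executing it is not. The sketch below, with entirely invented numbers, removes a modeled carbon contribution from events recorded on the plastic (CH) target bin by bin and propagates the statistical uncertainties; it illustrates the idea and is not MINERvA analysis code.

```python
import numpy as np

# Schematic background subtraction with invented numbers (not MINERvA data or code).
rng = np.random.default_rng(42)

n_bins = 5                                   # e.g. bins of momentum transfer (illustrative)
events_on_ch = rng.poisson(1000, n_bins)     # antineutrino events observed on hydrogen + carbon
carbon_estimate = rng.poisson(950, n_bins)   # carbon-only estimate, e.g. constrained by tagged neutrons

# Hydrogen signal = observed CH events minus the estimated carbon background.
hydrogen_signal = events_on_ch - carbon_estimate

# Statistical uncertainty per bin, adding the two Poisson errors in quadrature.
uncertainty = np.sqrt(events_on_ch + carbon_estimate)

for i, (n, u) in enumerate(zip(hydrogen_signal, uncertainty)):
    print(f"bin {i}: hydrogen events = {n:4d} +/- {u:5.1f}")
```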

“When Tejin and Arie first suggested trying this analysis, I thought it would be too difficult, and I wasn’t encouraging. Tejin persevered and proved it could be done,” said McFarland, a professor at the University of Rochester. “One of the best parts of being a teacher is having a student who learns enough to prove you wrong.”

Cai and his collaborators used MINERvA to record more than a million antineutrino interactions over the course of three years. They determined that about 5,000 of these were neutrino-hydrogen scattering events.

MINERvA used a collider detector design and a high-intensity beam to study neutrino reactions with five different nuclei, creating the first self-contained comparison of interactions in different elements. Credit: Fermilab

With these data, they inferred the size of the proton’s weak charge radius to be 0.73 ± 0.17 femtometers. It is the first statistically significant measurement of the proton’s radius using neutrinos. Within its uncertainties, the result aligns with the electric charge radius measured with electron scattering.

The result shows that physicists can use this neutrino-scattering technique to see the proton through a new lens. It is akin to the way observing the sky at different wavelengths lets astronomers study the universe in new ways, as with the infrared-sensitive James Webb Space Telescope.

The result also provides a better understanding of the proton’s structure. This can be used to predict the behavior of groups of protons in an atom’s nucleus. If physicists start with a better measurement of neutrino-proton interactions, they can make better models of neutrino-nucleus interactions. This will improve the performance of other neutrino experiments, such as NOvA at Fermilab and T2K in Japan.

For Cai, one of the most exciting things about the result is showing that, “Even with a general particle detector, we can do things we didn’t imagine we could do.”

Co-spokesperson for the MINERvA collaboration Deborah Harris, scientist at Fermilab and professor of physics at York University, said, “When we proposed MINERvA, we never thought we’d be able to extract measurements from the hydrogen in the detector. Making this work required great performance from the detector, creative analysis ideas and years of running in the most intense high-energy neutrino beam on the planet.”

This work is supported by the DOE Office of Science, National Science Foundation, Coordination for the Improvement of Higher Education Personnel in Brazil, Brazilian National Council for Scientific and Technological Development, Mexican National Council of Science and Technology, Basal Project in Chile, Chilean National Commission for Scientific and Technological Research, Chilean National Fund for Scientific and Technological Development, Peruvian National Council for Science, Technology and Technological Innovation, Research Management Directorate at the Pontifical Catholic University of Peru, National University of Engineering in Peru, Polish National Science Center and UK Science and Technology Facilities Council.

Fermi National Accelerator Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

Editor’s note: This article was originally written by the University of Chicago and was adapted with Fermilab information.

Sometimes to know what the matter is, you have to find it first.

When the universe began, matter was flung outward and gradually formed planets, stars and galaxies. By carefully assembling a map of that matter today, scientists can try to understand the forces that shaped the evolution of the universe.

A group of scientists, including several with the U.S. Department of Energy’s Fermi National Accelerator Laboratory, have released one of the most precise measurements ever made of how matter is distributed across the universe today.

Combining data from two major telescope surveys of the universe, the Dark Energy Survey and the South Pole Telescope, the analysis involved more than 150 researchers and was published Jan. 31 as a set of three articles selected as the Editors’ Suggestion in Physical Review D.

Scientists have released a new survey of all the matter in the universe, using data taken by the Dark Energy Survey in Chile (above) and the South Pole Telescope. Photo: Andreas Papadopoulos, University of Portsmouth

“One of the original motivations of building DES and SPT was to combine both sets of measurements to more powerfully constrain cosmology,” said Bradford Benson, a scientist at Fermilab and associate director of operations for SPT. “This is an exciting result because for the first time it demonstrates combining data from both experiments which has interestingly found some tension between early and late time large-scale structure growth.”

Among other findings, the analysis indicates that matter is not as “clumpy” as we would expect, based on our current best model of the universe, which adds to a body of evidence that there may be something missing from the existing standard model of the universe.

Cooling and clumps

The Big Bang created all the matter in the universe in a very hot, intense few moments about 13 billion years ago. Ever since, that matter has been spreading outward, cooling and clumping as it goes. Scientists are very interested in tracing the path of this matter; by seeing where all the matter ended up, they can try to recreate what happened and what forces had to have been in play.

The first step is collecting enormous amounts of data with telescopes.

In this study, scientists combined data from two very different telescope surveys: the Dark Energy Survey, which surveyed the sky over six years from a mountaintop in Chile, and the South Pole Telescope, which looks for the faint traces of radiation that are still traveling across the sky from the first few moments of the universe.

Combining two different methods of looking at the sky reduces the chance that the results are thrown off by an error in one of the forms of measurement. “It functions like a cross-check, so it becomes a much more robust measurement than if you just used one or the other,” said University of Chicago astrophysicist Chihway Chang, one of the lead authors of the studies.

In both cases, the analysis looked at a phenomenon called gravitational lensing. As light travels across the universe, it can be slightly bent as it passes objects with lots of gravity, such as galaxies.
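
For a sense of how mass translates into bending (a textbook weak-field formula, not a number from these papers), a light ray passing a point mass $M$ at impact parameter $b$ is deflected by an angle

$$
\alpha \;=\; \frac{4\,G M}{c^{2}\, b},
$$

so the more mass along the line of sight, the larger the deflection; statistical maps of these tiny distortions across many galaxies trace where the matter is.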

This method catches both regular matter and dark matter — the mysterious form of matter that has only been detected due to its effects on regular matter — because both regular and dark matter exert gravity.

By rigorously analyzing these two sets of data, the scientists could infer where all the matter ended up in the universe. The new analysis is more precise than previous measurements — that is, it narrows down the possibilities for where this matter wound up — the authors said.

By comparing maps of the sky from the Dark Energy Survey telescope (left) with data from the South Pole Telescope and the Planck satellite (right), the team could infer how the matter is distributed. Image: Yuuki Omori, University of Chicago

The majority of the results fit perfectly with the currently accepted best theory of the universe.

But there are also signs of a crack — one that has been suggested in the past by other analyses, too.

“It seems like there are slightly less fluctuations in the current universe, than we would predict assuming our standard cosmological model anchored to the early universe,” said analysis coauthor and University of Hawaii astrophysicist Eric Baxter.

That is, if you made a model incorporating all the currently accepted physical laws, then took the readings from the beginning of the universe and extrapolated them forward through time, the results would look slightly different from what we actually measure around us today.

Specifically, today’s readings find the universe is less “clumpy” — clustering in certain areas rather than evenly spread out — than the model would predict.

If other studies continue to find the same results, scientists say, it may mean there is something missing from our existing model of the universe, but the results are not yet to the statistical level that scientists consider to be ironclad. That will take further study.

“Perhaps most exciting, this is just the tip of the iceberg using the data that we have. This analysis used only half of the DES survey data, and the SPT-3G camera has already made measurements roughly an order of magnitude more sensitive. In the next few years, we expect to learn much more from this method using these much more sensitive data,” said Benson.

The analysis is a landmark, as it yielded useful information from two very different telescope surveys. This is a much-anticipated strategy for the future of astrophysics, as more large telescopes come online in the coming decades, but few such joint analyses have actually been carried out yet.

University of Chicago Kavli Associate Fellow Yuuki Omori was a lead co-author for the papers: Paper I, Paper II, Paper III.

The South Pole Telescope is primarily funded by the National Science Foundation and the U.S. Department of Energy and is operated by a collaboration led by the University of Chicago. The Dark Energy Survey was an international collaboration coordinated through Fermi National Accelerator Laboratory and funded by the U.S. Department of Energy, the National Science Foundation and many institutions around the world.

Fermi National Accelerator Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.