Fermilab feature

They are there and they are gone: ICARUS chases a fourth neutrino

Argon. It’s all around us. It’s in the air we breathe, the incandescent lights we read by and the plasma globes many of us played with as children.

In liquid form, argon is also an inexpensive and effective target for neutrino physics experiments. On Feb. 21, scientists at Fermilab began cooling down ICARUS – the largest particle detector in the lab’s Short-Baseline Neutrino Program – and filling it with 760 tons of liquid argon, moving ICARUS closer to operation and the search for a fourth type of neutrino.

“The Short-Baseline Neutrino Program is amazing because it will finally resolve longstanding anomalous results in neutrino measurements,” said Robert Wilson, deputy spokesperson of ICARUS and professor of physics at Colorado State University.

“Neutrinos are a fundamental and abundant component of our universe: We still know too little about them, and this keeps me very interested to continue searching their properties,” added Carlo Rubbia, Nobel laureate and ICARUS spokesperson.

Over 20 years ago, scientists at Los Alamos National Laboratory found more electron antineutrinos than they expected in their results from the Liquid Scintillator Neutrino Detector. In a follow-up experiment more than 10 years later, scientists on the MiniBooNE experiment at Fermilab observed a similar inconsistency and uncovered a new anomaly in their neutrino data.

Scientists wonder whether this was more than coincidence.

Scientists at Fermilab are filling ICARUS – the largest particle detector in the lab’s Short-Baseline Neutrino Program – with 760 tons of liquid argon, moving the detector closer to operation and the search for a fourth type of neutrino. Photo: Al Johnson, Fermilab

A fourth type of neutrino

It is well-established that the three known neutrino types – electron, muon and tau – oscillate, or change, into one another. To study these oscillations and how they happen, scientists need neutrinos to interact with something. For ICARUS, that substance is liquid argon.

In the ICARUS experiment, a muon-type neutrino beam will interact with liquid argon and should, in theory, produce mostly charged particles called muons. (An electron-type neutrino beam should produce mostly electrons.) But given results from the Liquid Scintillator Neutrino Detector and MiniBooNE, this is only part of the story, and ICARUS intends to fill the gaps.

“What if the neutrinos are oscillating into a neutrino that does not interact at all, not even a little bit like other neutrinos do?” Wilson said. “This is not a natural extension of neutrino theory, but it could explain the LSND and MiniBooNE results.”

Such a fourth type of neutrino, unlike the others, would not change into a complementary charged particle upon interaction in a detector. In fact, it wouldn’t interact at all. By quantum mechanics, however, this so-called sterile neutrino could still oscillate between neutrino types and alter the oscillation pattern that ICARUS will observe.

Discovery of a sterile neutrino would upend the Standard Model of subatomic particles and affect our understanding of how the universe has evolved.

From filling to beam

ICARUS’s optimal location, size and detector material make it uniquely sensitive to detecting neutrinos that would display this oscillation effect. If ICARUS scientists find more electron neutrinos in their muon-type neutrino beam than expected, they would at long last have concrete evidence of sterile neutrinos.

ICARUS’s measurements will also inform how long-baseline neutrino experiments collect and analyze data. For example, scientists’ experience on ICARUS will inform the much larger, international Deep Underground Neutrino Experiment, hosted by Fermilab. ICARUS’s liquid-argon detection technology will be adapted for DUNE, which will use 70,000 tons of liquid argon to study the three known neutrino types and how they change from one to another.

“ICARUS has come a long way from its conception and data-taking activity at the Gran Sasso Laboratory in Italy and is now approaching a new phase of data acquisition here at Fermilab. I am thrilled to see the enthusiasm of a younger generation of scientists now at work on this experiment,” Rubbia said.

It will take approximately eight weeks to fill ICARUS with liquid argon. Once the detector is filled, scientists will check its stability and the argon’s purity. Then, they will turn on power for the first time since ICARUS made its way to Fermilab across the Atlantic Ocean. They expect to see first particle tracks later this year.

This work is supported by the U.S. Department of Energy Office of Science.

On March 2, 1995, laboratory staff, users and media gathered at the U.S. Department of Energy’s Fermi National Accelerator Laboratory in the lab’s Ramsey Auditorium to hear about one of the most significant discoveries in science.

After physicists at Fermilab discovered the bottom quark in 1977, the first in a new generation of elementary particles, scientists around the world began searching for evidence of its expected partner, the top quark. The top quark was the last fundamental particle of matter within the six-member family of quarks predicted by the Standard Model, one that physicists had not yet tracked down. Its discovery would provide strong evidence for the accuracy of the Standard Model. The task, however, would not be easy: It took almost two decades of effort by hundreds of scientists from around the world who pushed the boundaries of technology in their search for this elusive particle.

 

People pack Fermilab’s Ramsey Auditorium on March 2, 1995, during the announcement of the discovery of the top quark. Photo: Reidar Hahn, Fermilab

The top quark, as it turned out, is by far the most massive of the elementary particles of the Standard Model — it is 40 times more massive than the bottom quark, the second heaviest of the quarks. It is as massive as an entire gold atom and nearly 40% heavier than a Higgs boson. Fermilab’s original accelerator complex, which culminated in the Main Ring, could not provide sufficient energy to produce the top quark. Top quarks are thought to have existed only for a few brief moments right after the Big Bang; bringing them back into existence would require the lab’s scientists and users to replicate the intense energies of the early universe.

Before the hunt for the top quark could begin, Fermilab had to build the Tevatron, the world’s first superconducting accelerator and for more than 20 years the world’s most powerful particle collider. The hunt for the top quark also required the construction of two large sophisticated particle detectors, known as CDF and DZero, and the combined efforts of two scientific collaborations. The CDF and DZero collaborations would each comprise about 450 people in 1995. Unlike the construction of the accelerator, these detectors were international projects; partners in Italy and Japan played major roles in CDF, while Russia, France, India and Brazil provided key support to DZero. The scientific collaborations included institutions from Brazil, Canada, Colombia, France, India, Italy, Japan, Korea, Mexico, Poland, Russia, Taiwan and the United States.

The Fermilab director joins CDF and DZero spokespeople at the announcement of the top quark discovery. From left: CDF spokesperson Giorgio Bellettini, DZero spokesperson Paul Grannis, Fermilab Director John Peoples, DZero spokesperson Hugh Montgomery, CDF spokesperson Bill Carithers Photo: Reidar Hahn, Fermilab

An article in a July 1992 issue of FermiNews outlines the goals of the experimenters: “This spring, physicists at CDF and D0 began their search for the most elusive of all subatomic particles, the top quark. The top quark is the last, and most important quark that researchers have yet to discover in the Standard Model. Its discovery will not only validate the Standard Model, but also give physicists at Fermilab a closer look at the conditions that existed at the beginning of the universe.”

Fermilab Director John Peoples addresses the press during a March 2, 1995, press meeting on the top quark discovery. Photo: Reidar Hahn, Fermilab

During the next few years, CDF and DZero experimenters recorded and analyzed the products of collisions between protons and antiprotons at the Tevatron. Since top quarks were too short-lived for the experimenters to observe directly, they used their sensitive detectors to look for decay products from the top quark. At the Tevatron, collisions produced a pair of top and antitop quarks. Each decayed immediately into a b quark and a W boson. In the various decay channels of these final-state particles, the top quark signatures appeared as combinations of electrons, muons and jets of particles. The discovery of the top quark would not be a single “Eureka!” moment. Instead, scientists would arrive at their discovery after thorough study and accurate analyses of the particle combinations, separating them from the huge backgrounds that arose from millions of collisions.

CDF experimenters submit their top quark discovery paper to Physical Review Letters on Feb. 24, 1995. Photo: Lynn Johnson, Fermilab

DZero experimenters submit their top quark discovery paper to Physical Review Letters on Feb. 24, 1995. Photo: Fred Ullrich, Fermilab

On Feb. 24, 1995, CDF and DZero simultaneously submitted papers to Physical Review Letters, describing their definitive observations of the top quark: “Observation of Top Quark Production in p̄p Collisions with the Collider Detector at Fermilab” and “Observation of the Top Quark,” respectively.

The experimenters made the public announcement of their discovery at a press conference on March 2, 1995. Under the leadership of Fermilab director John Peoples, CDF co-spokespersons Giorgio Bellettini and William Carithers Jr. and DZero co-spokespersons Paul Grannis and Hugh Montgomery represented their collaborations. The papers were published in the April 3, 1995, issue of PRL.

In July 2019, the European Physical Society formally presented the CDF and DZero collaborations with the 2019 High Energy and Particle Physics Prize “for the discovery of the top quark and the detailed measurement of its properties.”

The discovery brought Fermilab international attention; newspapers from all over the world, from New York to California, from Brazil to Germany to Japan, covered this historic achievement. Within the particle physics community, the discovery would lay the groundwork for new explorations into the nature of matter and the universe for decades to come.

Major support for construction of the Tevatron and the CDF and DZero experiments was provided by DOE’s Office of Science.

Valerie Higgins is the Fermilab archivist.

Fermilab is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

Hundreds gather in Fermilab’s Ramsey Auditorium to hear the announcement of the top quark discovery. Photo: Reidar Hahn, Fermilab

Last year, researchers at Fermilab received over $3.5 million for projects that delve into the burgeoning field of quantum information science. Research funded by the grant runs the gamut, from building and modeling devices for possible use in the development of quantum computers to using ultracold atoms to look for dark matter.

For their quantum computer project, Fermilab particle physicist Adam Lyon and computer scientist Jim Kowalkowski are collaborating with researchers at Argonne National Laboratory, where they’ll be running simulations on high-performance computers. Their work will help determine whether instruments called superconducting radio-frequency cavities, also used in particle accelerators, can solve one of the biggest problems facing the successful development of a quantum computer: the decoherence of qubits.

“Fermilab has pioneered making superconducting cavities that can accelerate particles to an extremely high degree in a short amount of space,” said Lyon, one of the lead scientists on the project. “It turns out this is directly applicable to a qubit.”

Researchers in the field have worked on developing successful quantum computing devices for the last several decades; so far, it’s been difficult. This is primarily because quantum computers have to maintain very stable conditions to keep qubits in a quantum state called superposition.

Superconducting radio-frequency cavities, such as the one seen here, are used in particle accelerators. They can also solve one of the biggest problems facing the successful development of a quantum computer: the decoherence of qubits. Photo: Reidar Hahn, Fermilab

Superposition

Classical computers use a binary system of 0s and 1s – called bits – to store and analyze data. Eight bits combined make one byte of data, which can be strung together to encode even more information. (There are about 31.8 million bytes in the average three-minute digital song.) In contrast, quantum computers aren’t constrained by a strict binary system. Rather, they operate on a system of qubits, each of which can take on a continuous range of states during computation. Just as an electron orbiting an atomic nucleus doesn’t have a discrete location but rather occupies all positions in its orbit at once in an electron cloud, a qubit can be maintained in a superposition of both 0 and 1.
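As an illustration – a minimal NumPy sketch, not how a real quantum device is programmed – a single qubit can be modeled as a pair of complex amplitudes whose squared magnitudes give the probabilities of reading out 0 or 1:

```python
import numpy as np

# A qubit state is a normalized pair of complex amplitudes (a, b).
# Measurement yields 0 with probability |a|^2 and 1 with probability |b|^2.
state = np.array([1.0, 1.0]) / np.sqrt(2)  # equal superposition of 0 and 1

probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5] -- an even chance of reading out 0 or 1
```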

Since there are two possible states for any given qubit, a pair doubles the amount of information that can be manipulated: 2² = 4. Use four qubits, and that amount of information grows to 2⁴ = 16. With this exponential increase, it would take only 300 entangled qubits to encode more information than there is matter in the universe.
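The arithmetic behind that exponential growth is easy to check in plain Python (using the commonly cited estimate of roughly 10⁸⁰ atoms in the observable universe as the comparison point):

```python
# Describing n entangled qubits takes 2**n complex amplitudes.
for n in (2, 4, 10, 300):
    print(n, "qubits ->", 2**n, "amplitudes")

# 300 qubits already exceed the ~10**80 atoms estimated
# to exist in the observable universe.
print(2**300 > 10**80)  # True
```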

Qubits can be in a superposition of 0 and 1, while classical bits can be only one or the other. Image: Jerald Pinson

Parallel positions

Qubits don’t represent data in the same way as bits. Because qubits in superposition are both 0 and 1 at the same time, they can similarly represent all possible answers to a given problem simultaneously. This is called quantum parallelism, and it’s one of the properties that makes quantum computers so much faster than classical systems.

The difference between classical computers and their quantum counterparts could be compared to a situation in which there is a book with some pages randomly printed in blue ink instead of black. The two computers are given the task of determining how many pages were printed in each color.

“A classical computer would go through every page,” Lyon said. Each page would be marked, one at a time, as either being printed in black or in blue. “A quantum computer, instead of going through the pages sequentially, would go through them all at once.”

Once the computation was complete, a classical computer would give you a definite, discrete answer. If the book had three pages printed in blue, that’s the answer you’d get.

“But a quantum computer is inherently probabilistic,” Kowalkowski said.

This means the answer you get back isn’t definite. In a book with 100 pages, the output from a quantum computer wouldn’t simply be three. It would be a spread of probabilities – for example, a 1 percent chance of having three blue pages or a 1 percent chance of 50 blue pages.

An obvious problem arises when trying to interpret this data. A quantum computer can perform incredibly fast calculations using parallel qubits, but it spits out only probabilities, which, of course, isn’t very helpful – unless, that is, the right answer could somehow be given a higher probability.
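The idea of reading out a probabilistic device can be sketched in plain Python. The probability numbers below are hypothetical, chosen only to illustrate how repeated runs turn a spread of probabilities into a usable answer:

```python
import random

random.seed(0)  # deterministic for the example

# Hypothetical output distribution over "number of blue pages":
# the right answer (3) dominates, with small stray probabilities.
dist = [(3, 0.97), (50, 0.01), (7, 0.02)]

def measure():
    """Draw one outcome from the distribution, like reading out the device once."""
    r = random.random()
    cumulative = 0.0
    for outcome, p in dist:
        cumulative += p
        if r < cumulative:
            return outcome
    return dist[-1][0]

counts = {}
for _ in range(10_000):
    out = measure()
    counts[out] = counts.get(out, 0) + 1

# Repeated runs build a histogram; the most frequent outcome is taken as the answer.
print(max(counts, key=counts.get))  # 3
```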

Interference

Consider two water waves that approach each other. As they meet, they may constructively interfere, producing one wave with a higher crest. Or they may destructively interfere, canceling each other so that there’s no longer any wave to speak of. Qubit states can also act as waves, exhibiting the same patterns of interference, a property researchers can exploit to identify the most likely answer to the problem they’re given.

“If you can set up interference between the right answers and the wrong answers, you can increase the likelihood that the right answers pop up more than the wrong answers,” Lyon said. “You’re trying to find a quantum way to make the correct answers constructively interfere and the wrong answers destructively interfere.”

When a calculation is run on a quantum computer, the same calculation is run multiple times, and the qubits are allowed to interfere with one another. The result is a distribution curve in which the correct answer is the most frequent response.
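A minimal NumPy sketch of that mechanism, using nothing beyond the standard single-qubit Hadamard gate: applying the gate twice routes the two computational paths to state 1 through amplitudes +1/2 and −1/2, which cancel, while the two paths to state 0 reinforce each other.

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)  # Hadamard gate
zero = np.array([1.0, 0.0])           # qubit prepared in state 0

plus = H @ zero   # equal superposition: amplitudes (1/sqrt(2), 1/sqrt(2))
back = H @ plus   # interfere the two paths by applying H again

probs = np.abs(back) ** 2
print(probs)  # ~[1. 0.]: paths to 0 add constructively, paths to 1 cancel
```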

When waves meet, they may constructively interfere, producing one wave with a higher crest. Image: Jerald Pinson

Waves may also destructively interfere, canceling each other so that there’s no longer any wave to speak of. Image: Jerald Pinson

Listening for signals above the noise

In the last five years, researchers at universities, government facilities and large companies have made encouraging advancements toward the development of a useful quantum computer. Last year, Google announced that it had performed calculations on its quantum processor, called Sycamore, in a fraction of the time it would have taken the world’s largest supercomputer to complete the same task.

Yet the quantum devices that we have today are still prototypes, akin to the first large vacuum tube computers of the 1940s.

“The machines we have now don’t scale up much at all,” Lyon said.

There are still a few hurdles researchers have to overcome before quantum computers become viable and competitive. One of the largest is finding a way to keep delicate qubit states isolated long enough for them to perform calculations.

If a stray photon — a particle of light — from outside the system were to interact with a qubit, its wave would interfere with the qubit’s superposition, essentially turning the calculations into a jumbled mess – a process called decoherence. While the refrigerators that house these devices do a moderately good job of keeping unwanted interactions to a minimum, they can do so only for a fraction of a second.

“Quantum systems like to be isolated,” Lyon said, “and there’s just no easy way to do that.”

When a quantum computer is operating, it needs to be placed in a large refrigerator, like the one pictured here, to cool the device to less than a degree above absolute zero. This is done to keep energy from the surrounding environment from entering the machine. Photo: Reidar Hahn, Fermilab

Which is where Lyon and Kowalkowski’s simulation work comes in. If the qubits can’t be kept cold enough to maintain an entangled superposition of states, perhaps the devices themselves can be constructed in a way that makes them less susceptible to noise.

It turns out that superconducting cavities made of niobium, normally used to propel particle beams in accelerators, could be the solution. These cavities need to be constructed very precisely and operate at very low temperatures to efficiently propagate the radio waves that accelerate particle beams. Researchers theorize that by placing quantum processors in these cavities, the qubits will be able to interact undisturbed for seconds rather than the current record of milliseconds, giving them enough time to perform complex calculations.

Qubits come in several different varieties. They can be created by trapping ions within a magnetic field or by using nitrogen atoms surrounded by the carbon lattice formed naturally in crystals. The research at Fermilab and Argonne will be focused on qubits made from photons.

Lyon and his team have taken on the job of simulating how well radio-frequency cavities are expected to perform. By carrying out their simulations on high-performance computers, known as HPCs, at Argonne National Laboratory, they can predict how long photon qubits can interact in this ultralow-noise environment and account for any unexpected interactions.

Researchers around the world have used open-source software for desktop computers to simulate different applications of quantum mechanics, providing developers with blueprints for how to incorporate the results into technology. The scope of these programs, however, is limited by the amount of memory available on personal computers. In order to simulate the exponential scaling of multiple qubits, researchers have to use HPCs.

“Going from one desktop to an HPC, you might be 10,000 times faster,” said Matthew Otten, a fellow at Argonne National Laboratory and collaborator on the project.

Once the team has completed their simulations, the results will be used by Fermilab researchers to help improve and test the cavities for acting as computational devices.

“If we set up a simulation framework, we can ask very targeted questions on the best way to store quantum information and the best way to manipulate it,” said Eric Holland, the deputy head of quantum technology at Fermilab. “We can use that to guide what we develop for quantum technologies.”

This work is supported by the Department of Energy Office of Science.