A whale of a tale

It is hard these days not to encounter examples of machine learning out in the world. If your phone unlocks with facial recognition or responds to voice commands, chances are you are using machine learning algorithms, in particular deep neural networks.

What makes these algorithms so powerful is that they learn the relationships between the high-level concepts we wish to find in an image (faces) or sound wave (words) and the sets of low-level patterns (lines, shapes, colors, textures, individual sounds) that represent them in the data. Furthermore, these low-level patterns and relationships do not have to be conceived of or hand-designed by humans; instead, they are learned directly from examples of the data. Not having to come up with new patterns to find for each new problem is the reason deep neural networks have been able to advance the state of the art for so many different types of problems: from analyzing video for self-driving cars to helping robots learn how to manipulate objects.

Here at Fermilab, there has been a lot of effort to put these deep neural networks to work analyzing the data from our particle detectors, so that we can use it more quickly and effectively to look for new physics. These applications continue the high-energy physics community’s long history of adopting and advancing machine learning algorithms.

Recently, the MicroBooNE neutrino experiment published a paper describing how they used convolutional neural networks, a particular type of deep neural network, to classify individual pixels in images made by a type of detector known as a liquid-argon time projection chamber (LArTPC). The experiment designed a convolutional neural network called U-ResNet to distinguish between two types of pixels: those that are part of a track-like particle trajectory and those that are part of a shower-like particle trajectory.
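
The U-ResNet described in the paper combines a U-Net, an encoder-decoder network that labels every pixel, with residual connections. As a rough illustration of the idea only, here is a minimal PyTorch sketch of a U-Net-style network for two-class pixel labeling; the layer sizes are arbitrary assumptions, the residual connections are omitted for brevity, and this is not the MicroBooNE implementation.

```python
# Minimal sketch of a U-Net-style network that assigns each pixel one of two
# classes (track-like or shower-like). Sizes are illustrative assumptions.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(),
    )

class TinyUNet(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.enc1 = conv_block(1, 16)   # input: one-channel LArTPC image
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2)
        self.dec1 = conv_block(32, 16)  # 32 = 16 (skip) + 16 (upsampled)
        self.head = nn.Conv2d(16, n_classes, kernel_size=1)  # per-pixel scores

    def forward(self, x):
        e1 = self.enc1(x)                                    # fine features
        e2 = self.enc2(self.pool(e1))                        # coarse features
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # skip connection
        return self.head(d1)                                 # (N, 2, H, W) logits

# Training minimizes a per-pixel cross entropy against the provided labels.
model = TinyUNet()
images = torch.rand(4, 1, 64, 64)            # toy batch of detector images
labels = torch.randint(0, 2, (4, 64, 64))    # 0 = track, 1 = shower
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()
```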

This plot shows a comparison of U-ResNet performance on data and simulation, where the true pixel labels are provided by a physicist. The sample is 100 events containing a charged-current neutrino interaction candidate with neutral pions produced at the event vertex. The horizontal axis shows, for each event, the fraction of pixels where the U-ResNet prediction differed from the labels. The error bars indicate statistical uncertainties only.

Track-like trajectories, made by particles such as a muon or proton, consist of a line with small curvature. Shower-like trajectories, produced by particles such as an electron or photon, are more complex topological features with many branching trajectories. The distinction matters because separating these two topologies can be difficult for traditional algorithms. Moreover, shower-like shapes are produced when electrons and photons interact in the detector, and these two particles are often an important signal or background in physics analyses.

MicroBooNE researchers demonstrated that these networks not only perform well but also behave similarly when presented with simulated data and with real data. This is the first time such agreement has been demonstrated for data from LArTPCs.

Showing that networks behave the same on simulated and real data is critical, because these networks are typically trained on simulated data. Recall that these networks learn by looking at many examples. In industry, gathering large “training” data sets is an arduous and expensive task. However, particle physicists have a secret weapon — they can create as much simulated data as they want, since all experiments produce a highly detailed model of their detectors and data acquisition systems in order to produce as faithful a representation of the data as possible.

However, these models are never perfect. So a big question was, “Is the simulated data close enough to the real data to properly train these neural networks?” MicroBooNE answered this question by performing a Turing test that compares the performance of the network with that of a physicist. The researchers demonstrated that the accuracy of the human was similar to that of the machine when labeling simulated data, for which an absolute accuracy can be defined. They then compared the labels for real data. Here the disagreement between labels was low, and similar between machine and human. (See the top figure. See the figure below for an example of how a human and the network labeled the same data event.) In addition, a number of qualitative studies looked at how the network’s labels changed as the images were manipulated and showed that the correlations follow human-like intuitions. For example, as a line segment gets shorter, the network becomes less confident about whether the segment is due to a track or a shower. This suggests that the low-level correlations the network uses are the same physically motivated correlations a physicist would use when engineering an algorithm by hand.
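
As a concrete picture of the comparison metric described above (the per-event fraction of pixels on which two labelings disagree), here is a minimal, self-contained Python sketch using toy data; the array shapes, the 5 percent flip rate and all names are assumptions for illustration, not the experiment’s analysis code.

```python
# Sketch: per-event fraction of charged pixels where two labelings disagree.
import numpy as np

rng = np.random.default_rng(0)

def disagreement_fraction(labels_a, labels_b, image):
    """Fraction of pixels with deposited charge where the two labelings differ."""
    active = image > 0                 # consider only pixels with charge
    if not active.any():
        return 0.0
    return float(np.mean(labels_a[active] != labels_b[active]))

# Toy stand-in for a sample of events: each event is (image, network labels,
# physicist labels). A real analysis would load these from reconstruction files.
events = []
for _ in range(100):
    img = rng.random((64, 64)) * (rng.random((64, 64)) > 0.9)  # sparse charge
    human = rng.integers(0, 2, (64, 64))      # 0 = track, 1 = shower
    net = human.copy()
    flip = rng.random((64, 64)) < 0.05        # pretend 5% of pixels disagree
    net[flip] = 1 - net[flip]
    events.append((img, net, human))

fractions = [disagreement_fraction(n, h, im) for im, n, h in events]
print(f"mean per-event disagreement: {np.mean(fractions):.3f}")
```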

This example image shows a charged-current neutrino interaction with decay gamma rays from a neutral pion (left). The label image (middle) is shown with the output of U-ResNet (right), where track and shower pixels are shown in yellow and cyan, respectively.

Demonstrating this simulated-versus-real-data milestone is important because convolutional neural networks are valuable to current and future neutrino experiments that will use LArTPCs. This track-shower labeling is already being employed in upcoming MicroBooNE analyses. Furthermore, for the upcoming Deep Underground Neutrino Experiment (DUNE), convolutional neural networks show much promise toward delivering the performance necessary to achieve DUNE’s physics goals, such as the measurement of CP violation, a possible explanation for the asymmetry between matter and antimatter in the present universe. The more demonstrations there are that these algorithms work on real LArTPC data, the more confidence the community can have that convolutional neural networks will help us learn about the properties of the neutrino and the fundamental laws of nature once DUNE begins to take data.

Learn more

Victor Genty, Kazuhiro Terao and Taritree Wongjirad are three of the scientists who analyzed this result. Victor Genty is a graduate student at Columbia University. Kazuhiro Terao is a physicist at SLAC National Accelerator Laboratory. Taritree Wongjirad is an assistant professor at Tufts University.

At the upcoming workshop, scientists will explore ways that the fields of high-energy physics and quantum information science can advance each other. It will also feature Google’s first public hands-on tutorial on their quantum software. Photo: Reidar Hahn

Solving the longstanding, seemingly intractable problems of particle physics is about to get one quantum step closer. (And ideas for quantum teleportation experiments might surface at the same time.)

From Sept. 12-14, scientists, engineers and members of industry from around the globe will converge at the Department of Energy’s Fermilab to explore quantum computing technologies for high-energy physics.

The hands-on workshop, “Next Steps in Quantum Science for High-Energy Physics,” will feature speakers from Google, IBM, and several universities and national laboratories. Renowned physicist John Preskill of Caltech will anchor the first day with an address on quantum information science. And for the first time in a public setting, Google will conduct a hands-on tutorial on their quantum computing software, giving attendees an opportunity to use it directly and meet face-to-face with company representatives.

“This is the world’s first offering for people in high-energy physics to use a quantum software package — and with the people who wrote it,” said Fermilab Chief Research Officer and Deputy Director Joe Lykken. “And the topics go beyond high-energy physics. The problems our field is trying to solve through quantum computing share extensive overlap with other fields, such as nuclear physics and scientific computing.”

Fermilab is a natural host for this first-of-its-kind workshop. Home to groundbreaking discoveries in particle physics and a pioneer in high-performance computing and supercomputing, the laboratory brings world-class expertise to the intersection of these areas.

And now it is directing some of its considerable expertise to the area of quantum computing, which holds great promise for solving some of nature’s toughest problems. Quantum computers may be able to tackle in minutes problems that would take classical computers years to solve.

Google’s hands-on tutorials will provide a first taste of this approach. A Google team led by Product Manager Alan Ho will demonstrate the use of two different software packages: Cirq, a quantum computing programming framework that exposes the hardware details to the programmer, and OpenFermion, a platform for translating problems in quantum chemistry and materials science to quantum computers.
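
To give a flavor of what working with such a framework looks like, here is a minimal Cirq sketch that builds and simulates a small entangling circuit; it is an illustrative example only, not material from the workshop tutorial.

```python
# Minimal Cirq sketch: build and simulate a two-qubit Bell-state circuit.
import cirq

q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit(
    cirq.H(q0),                      # put the first qubit in superposition
    cirq.CNOT(q0, q1),               # entangle the two qubits
    cirq.measure(q0, q1, key="m"),   # read both qubits out
)
print(circuit)

result = cirq.Simulator().run(circuit, repetitions=100)
print(result.histogram(key="m"))     # roughly half 00 (0) and half 11 (3)
```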

“It’s great that Fermi National Accelerator Laboratory has convened some of the best scientists to come up with ideas for quantum computing,” Ho said. “I’m looking forward to hearing the different ideas and experiments that can be run on a quantum computer. We believe that Fermilab is a great partner for this.”

The feedback from participants, he added, will be valuable.

“We want to make the computers useful, so we need to hear the best ideas,” Ho said.

Topics include quantum simulation of quantum field theories, algorithms for traditional high-energy physics computational problems, quantum teleportation experiments and qubit technologies for quantum sensors.

“This is the meeting where scientists can say, ‘I have this problem, I want to solve it. How do I do that with quantum?’ and get an answer,” Fermilab’s Lykken said. “It’s going to be an exciting glimpse into ways we can advance the field.”

Chicago-area scientists lead effort to probe the cosmic microwave background

The South Pole Telescope measures the cosmic microwave background from the earliest days of the universe. Photo: Jason Gallicchio

Deep in Antarctica, at the southernmost point on our planet, sits a 33-foot telescope designed for a single purpose: to make images of the oldest light in the universe.

This light, known as the cosmic microwave background, or CMB, has journeyed across the cosmos for 14 billion years — from the moments immediately after the Big Bang until now. Because it is brightest in the microwave part of the spectrum, the CMB is impossible to see with our eyes and requires specialized telescopes.

The South Pole Telescope (SPT), specially designed to measure the CMB, has recently opened its third-generation camera for a multiyear survey to observe the earliest instants of the universe. Since 2007, the SPT has shed light on the physics of black holes, discovered a galaxy cluster that is making stars at the highest rate ever seen, redefined our picture of when the first stars formed in the universe, provided new insights into dark energy and homed in on the masses of neutrinos. This latest upgrade improves its sensitivity by nearly an order of magnitude, making it among the most sensitive CMB instruments ever built.

“Being able to detect and analyze the CMB, especially with this level of detail, is like having a time machine to go back to the first moments of our universe,” said University of Chicago Professor John Carlstrom, the principal investigator for the South Pole Telescope project.

“Encoded in images of the CMB light that we capture is the history of what that light has encountered in its 14 billion-year journey across the cosmos,” he added. “From these images, we can tell what the universe is made up of, how the universe looked when it was extremely young and how the universe has evolved.”

The South Pole Telescope team, led by the University of Chicago, Fermilab and Argonne National Laboratory. Photo: Brad Benson

Located at the National Science Foundation’s Amundsen-Scott South Pole Station, the South Pole Telescope is funded and maintained by the National Science Foundation in its role as manager of the U.S. Antarctic Program, the national program of research on the southernmost continent.

“The ability to operate a 10-meter telescope, literally at the end of the Earth, is a testament to the scientific capabilities of the researchers that NSF supports and the sophisticated logistical support that NSF and its partners are able to provide in one of the harshest environments on Earth,” said Vladimir Papitashvili, Antarctic astrophysics and geospace sciences program director in NSF’s Office of Polar Programs. “This new camera will extend the abilities of an already impressive instrument.”

The telescope is operated by a collaboration of more than 80 scientists and engineers from a group of universities and U.S. Department of Energy national laboratories, including three institutions in the Chicago area. These research organizations — the University of Chicago, Argonne National Laboratory and Fermi National Accelerator Laboratory — have worked together to build a new, ultrasensitive camera for the telescope, containing 16,000 specially manufactured detectors.

“Built with cutting-edge detector technology, this new camera will significantly advance the search for the signature of early cosmic inflation in the cosmic microwave background and allow us to make inroads into other fundamental mysteries of the universe, including the masses of neutrinos and the nature of dark energy,” said Kathy Turner of DOE’s Office of Science.

“Baby pictures” of the cosmos

The Aurora Australis above the South Pole Telescope. Photo: Robert Schwarz

The CMB is the oldest light in our universe, produced in the intensely hot aftermath of the Big Bang, before even the formation of atoms. These primordial particles of light, which have traveled nearly untouched for almost 14 billion years, provide unique clues about how the universe looked at the beginning of time and how it has changed since.

“This relic light is still incredibly bright — literally outshining all the stars that have ever existed in the history of the universe by over an order of magnitude in energy,” said University of Chicago professor and Fermilab scientist Bradford Benson, who headed the effort to build this new camera.

However, because most of the energy is in the microwave part of the spectrum, observing it requires special detectors at high, dry sites. The South Pole Station is better suited for this than almost anywhere else on Earth: It is located atop a two-mile-thick ice sheet, and the extremely low temperatures in Antarctica mean there is almost no atmospheric water vapor.

Scientists are hoping to plumb this data for information on a number of physical processes and even new particles.

“The cosmic microwave background is a remarkably rich source for science,” Benson said. “The third-generation camera survey can give us clues on everything from dark energy to the physics of the Big Bang to locating the most massive clusters of galaxies in the universe.”

The details of this “baby picture” of the cosmos will allow scientists to better understand the different kinds of matter and energy that make up our universe, such as neutrinos and dark energy. They may even find evidence of gravitational waves from the beginning of the universe, regarded by many as the “smoking gun” for the theory of inflation. The survey also serves as a rich astronomical resource; one of the things scientists will be looking for is some of the first massive galaxies in the universe. These massive galaxies are increasingly of interest to astronomers as “star farms” that formed the first stars in the universe, and since they are nearly invisible to typical optical telescopes, the South Pole Telescope is perhaps the most efficient way to find them.

“Nothing that comes out of a box”

The South Pole Telescope collaboration has operated the telescope since its construction in 2007. Grants from multiple sources — the National Science Foundation, the U.S. Department of Energy, and the Kavli and Moore foundations — supported a second-generation polarization-sensitive camera. The latest, third-generation focal plane contains 10 times as many detectors as the previous camera, requiring new ideas and solutions in materials science and nanoscience.

“From a technology perspective, there is virtually nothing that comes ‘out of a box,’” said Clarence Chang, an assistant professor at the University of Chicago and physicist at Argonne involved with the experiment.

For the South Pole Telescope, scientists needed equipment far more sensitive than anything made commercially. They had to develop their own detectors, which use special materials for sensing tiny changes in temperature when they absorb light. These custom detectors were developed and manufactured from scratch in ultraclean rooms at Argonne.

The detectors went to Fermilab to be assembled into modules, which included small lenses for each pixel made at the University of Illinois at Urbana-Champaign. After being tested at multiple collaborating universities around the country, the detectors made their way back to Fermilab to be integrated into the South Pole Telescope camera cryostat, designed by Benson. The assembled camera resembles an 8-foot-tall, 2,500-pound optical camera with a telephoto lens on the front, with the added complication that the lenses need to be cooled to just a few degrees above absolute zero. (Even the Antarctic isn’t that cold, so the camera needs this special cryostat to cool it down further.)

Finally, the new camera was ready for its 10,000-mile journey to Antarctica by way of land, air and sea. On the final leg, from NSF’s McMurdo Station to the South Pole, it flew aboard a specialized LC-130 cargo plane outfitted with skis so that it could land on snow near the telescope site, since the station sits atop an ice sheet. The components were carefully unloaded, and a team of more than 30 scientists raced to reassemble the camera during the brief three-month Antarctic summer, since the South Pole is not accessible by plane for most of the year due to temperatures that can drop to minus 100 degrees Fahrenheit.

The South Pole Telescope’s multiyear observing campaign brings together researchers from across North America, Europe and Australia. With the upgraded telescope taking data, the exploration of the cosmic microwave background radiation enters a new era with a powerful collaboration and an extremely sensitive instrument.

“The study of the CMB involves so many different kinds of scientific journeys,” Chang said. “It’s exciting to watch efforts from all over come together to push the frontiers of what we know.”


The South Pole Telescope collaboration is led by the University of Chicago, and includes research groups at Argonne National Laboratory, Case Western Reserve University, Fermi National Accelerator Laboratory, Harvard-Smithsonian Astrophysical Observatory, Ludwig Maximilian University of Munich, McGill University, SLAC National Accelerator Laboratory, University of California at Berkeley, University of California at Davis, University of California at Los Angeles, University of Colorado at Boulder, University of Illinois at Urbana-Champaign, University of Melbourne, University of Toronto, as well as individual scientists at several other institutions.

The South Pole Telescope is funded primarily by the National Science Foundation’s Office of Polar Programs and the U.S. Department of Energy Office of Science. Partial support also is provided by the NSF-funded Physics Frontier Center at the Kavli Institute for Cosmological Physics (KICP), the Kavli Foundation, and the Gordon and Betty Moore Foundation.

Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance LLC, a joint partnership between the University of Chicago and the Universities Research Association Inc. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @Fermilab.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2018, its budget is $7.8 billion. NSF funds reach all 50 states through grants to nearly 2,000 colleges, universities and other institutions. Each year, NSF receives more than 50,000 competitive proposals for funding and makes about 12,000 new funding awards.