UPDATE: July 9, 2018, 6:05 p.m.
Acid leak contained, clean-up begins
Fire department hazardous-material personnel have contained and stopped a pinhole-sized leak of sulfuric acid from a 400-gallon tank used for water treatment in a building on the Fermilab site. A hazardous-material contractor is now on site to remove the acid remaining in the tank and clean up the area where the leak occurred.
There was no environmental impact.
POSTED July 9, 4:02 p.m.
What happened?
A 400-gallon tank of sulfuric acid used for water treatment at Fermilab sprang a leak around 2:30 p.m. on Monday afternoon, July 9, 2018, and is slowly leaking acid.
Who was involved?
The employee responsible for the system noticed the leak and informed the Fermilab Fire Department. No one was injured.
Where?
The tank is located in the Fermilab Central Utility Building.
What is the response?
Fermilab has requested support from local fire departments to contain the leak and monitor the situation. The Fermilab Fire Department and local fire departments are at the site of the incident.
Background:
Sulfuric acid is a common chemical used in many processes. It is corrosive.
How will we provide further updates?
Fermilab will provide more updates when available. Reporters should check the Fermilab website at www.fnal.gov or call 630-840-3351.
How do you arrive at the physical laws of the universe when you’re given experimental data on a renegade particle that interacts so rarely with matter, it can cruise through light-years of lead? You call on the power of advanced computing.
The NOvA neutrino experiment, in collaboration with the Department of Energy’s Scientific Discovery through Advanced Computing (SciDAC-4) program and the HEPCloud program at DOE’s Fermi National Accelerator Laboratory, was able to perform the largest-scale analysis ever to support the recent evidence of antineutrino oscillation, a phenomenon that may hold clues to how our universe evolved.
Using Cori, the newest supercomputer at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, NOvA harnessed over 1 million computing cores, or CPUs, between May 14 and 15, with a continuation of the analysis performed on NERSC's Cori and Edison supercomputers one week later. This is the largest number of CPUs ever used concurrently over this duration, about 54 hours, for a single high-energy physics experiment. This unprecedented amount of computing enabled scientists to carry out some of the most complicated techniques used in neutrino physics, allowing them to dig deeper into the seldom seen interactions of neutrinos. The Cori allocation was more than 400 times the amount of Fermilab computing allocated to the NOvA experiment and 50 times the total computing capacity at Fermilab allocated for all of its rare-physics experiments. In total, nearly 35 million core-hours were consumed by NOvA in the 54-hour period; executing the same analysis on a single desktop computer would take about 4,000 years.
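As a rough cross-check of these figures, here is a minimal back-of-the-envelope sketch in Python. The numbers are the ones quoted above; treating the "desktop" as a single core running around the clock is our own simplifying assumption.

```python
# Back-of-the-envelope check of the scale quoted in the article (a sketch only).

CORE_HOURS = 35_000_000   # core-hours consumed by NOvA at NERSC (quoted above)
WALL_CLOCK_HOURS = 54     # duration of the computing campaign (quoted above)
HOURS_PER_YEAR = 24 * 365

# Average number of cores kept busy over the 54-hour window.
avg_cores = CORE_HOURS / WALL_CLOCK_HOURS        # roughly 650,000 cores

# The same workload on one core, expressed in years (assumed single-core desktop).
desktop_years = CORE_HOURS / HOURS_PER_YEAR      # roughly 4,000 years

print(f"average concurrent cores: {avg_cores:,.0f}")
print(f"single-core equivalent:   {desktop_years:,.0f} years")
```

The quoted "4,000 years" is consistent with simply dividing the total core-hours by the number of hours in a year.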

The Cori supercomputer at NERSC was used to perform a complex computational analysis for NOvA. NOvA used over 1 million computing cores, the largest amount ever used concurrently in a 54-hour period. Photo: Roy Kaltschmidt, Lawrence Berkeley National Laboratory
“The special thing about NERSC is that it enabled NOvA to do the science at a new level of precision, a much finer resolution with greater statistical accuracy within a finite amount of time,” said Andrew Norman, NOvA physicist at Fermilab. “It facilitated doing analysis of real data coming off the detector at a rate 50 times faster than that achieved in the past. The first round of analysis was done within 16 hours. Experimenters were able to see what was coming out of the data, and in less than six hours everyone was looking at it. Without these types of resources, we, as a collaboration, could not have turned around results as quickly and understood what we were seeing.”
The experiment presented the latest finding from the recently collected data at the Neutrino 2018 conference in Germany on June 4.
“The speed with which NERSC allowed our analysis team to run sophisticated and intense calculations needed to produce our final results has been a game-changer,” said Fermilab scientist Peter Shanahan, NOvA co-spokesperson. “It accelerated our time-to-results on the last step in our analysis from weeks to days, and that has already had a huge impact on what we were able to show at Neutrino 2018.”
In addition to the state-of-the-art NERSC facility, NOvA relied on work done within the SciDAC HEP Data Analytics on HPC (high-performance computers) project and the Fermilab HEPCloud facility. Both efforts are led by Fermilab scientific computing staff, and both teams worked with researchers at NERSC to support the analysis behind NOvA's antineutrino oscillation evidence.
The current standard practice for Fermilab experimenters is to perform similar analyses using less complex calculations through a combination of both traditional high-throughput computing and the distributed computing provided by Open Science Grid, a national partnership between laboratories and universities for data-intensive research. These are substantial resources, but they use a different model: Both use a large amount of computing resources over a long period of time. For example, some resources are offered only at a low priority, so their use may be preempted by higher-priority demands. But for complex, time-sensitive analyses such as NOvA’s, researchers need the faster processing enabled by modern, high-performance computing techniques.
SciDAC-4 is a DOE Office of Science program that funds collaboration between experts in mathematics, physics and computer science to solve difficult problems. The HEP on HPC project was funded specifically to explore computational analysis techniques for doing large-scale data analysis on DOE-owned supercomputers. Running the NOvA analysis at NERSC, the mission supercomputing facility for the DOE Office of Science, was a task perfectly suited for this project. Fermilab’s Jim Kowalkowski is the principal investigator for HEP on HPC, which also has collaborators from DOE’s Argonne National Laboratory, Berkeley Lab, University of Cincinnati and Colorado State University.
“This analysis forms a kind of baseline. We’re just ramping up, just starting to exploit the other capabilities of NERSC at an unprecedented scale,” Kowalkowski said.
The project's goal for its first year is to take compute-heavy analysis jobs like NOvA's and enable them on supercomputers. That means not just running the analysis, but also changing how calculations are done and learning how to revamp the tools that manipulate the data, all in an effort to improve techniques used for doing these analyses and to leverage the full computational power and unique capabilities of modern high-performance computing facilities. In addition, the project seeks to use all of the allocated computing cores at once to shorten the time to results.
The Fermilab HEPCloud facility provides cost-effective access to compute resources by optimizing usage across all available types and elastically expanding the resource pool on short notice by, for example, renting temporary resources on commercial clouds or using high-performance computers. HEPCloud enables NOvA and physicists from other experiments to use these compute resources in a transparent way.
For this analysis, “NOvA experimenters didn’t have to change much in terms of business as usual,” said Burt Holzman, HEPCloud principal investigator. “With HEPCloud, we simply expanded our local on-site-at-Fermilab facilities to include Cori and Edison at NERSC.”

At the Neutrino 2018 conference, Fermilab’s NOvA neutrino experiment announced that it had seen strong evidence of muon antineutrinos oscillating into electron antineutrinos over long distances. NOvA collaborated with the Department of Energy’s Scientific Discovery through Advanced Computing program and Fermilab’s HEPCloud program to perform the largest-scale analysis ever to support the recent evidence. Photo: Reidar Hahn
Building on work it has been doing with researchers at NERSC to optimize high-throughput computing in general, the Fermilab HEPCloud team was able to leverage the facility to achieve the million-core milestone. HEPCloud thus holds the record for the most resources ever provisioned concurrently at a single facility to run experimental HEP workflows.
“This is the culmination of more than a decade of R&D we have done at Fermilab under SciDAC and the first taste of things to come, using these capabilities and HEPCloud,” said Panagiotis Spentzouris, head of the Fermilab Scientific Computing Division and HEPCloud sponsor.
“NOvA is an experimental facility located more than 2,000 miles away from Berkeley Lab, where NERSC is located. The fact that we can make our resources available to the experimental researchers near real-time to enable their time-sensitive science that could not be completed otherwise is very exciting,” said Wahid Bhimji, a NERSC data architect at Berkeley Lab who worked with the NOvA team. “Led by colleague Lisa Gerhardt, we’ve been working closely with the HEPCloud team over the last couple of years, also to support physics experiments at the Large Hadron Collider. The recent NOvA results are a great example of how the infrastructure and capabilities that we’ve built can benefit a wide range of high energy experiments.”
Going forward, Kowalkowski, Holzman and their associated teams will continue building on this achievement.
“We’re going to keep iterating,” Kowalkowski said. “The new facilities and procedures were enthusiastically received by the NOvA collaboration. We will accelerate other key analyses.”
NERSC is a DOE Office of Science user facility.
Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance LLC, a joint partnership between the University of Chicago and the Universities Research Association Inc. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @Fermilab.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
At present scientists think that at the highest energies and earliest moments in time, all the fundamental forces may have existed as a single unified force. As the universe cooled just one microsecond after the Big Bang, it underwent a “phase transition” that transformed or “broke” the unified electromagnetic and weak forces into the distinct forces observed today.
One can compare the phase transition to the transformation of water into ice. In this familiar case, we call the transition a change in a state of matter. In the early-universe case, we call the transition electroweak symmetry breaking.
In the same way that we characterize the water-to-ice phase transition as occurring when the temperature drops below 32 degrees Fahrenheit, we characterize the amount of symmetry breaking with a parameter called the weak mixing angle, whose value has been measured over the years.
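For reference, the textbook tree-level definition of the weak mixing angle, which the article does not spell out, relates the electroweak couplings g and g' and the W and Z boson masses:

\[
\sin^2\theta_W \;=\; \frac{g'^{\,2}}{g^2 + g'^{\,2}} \;=\; 1 - \frac{m_W^2}{m_Z^2} \qquad \text{(tree level)}.
\]

The quantity quoted later in this article, sin²θ_eff^lept, is the effective leptonic mixing angle, which folds radiative corrections into this tree-level relation.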
By recreating the early-universe conditions in accelerator experiments, we have observed this transition and can measure the weak mixing angle that controls it. Our best understanding of the electroweak symmetry breaking involves the so-called Higgs mechanism, and the Nobel Prize-winning Higgs boson discovery in 2012 was a milestone in our understanding.

The CDF and DZero experiments at the Tevatron have published their latest measurement of the electroweak mixing angle. Photo: Reidar Hahn
Measuring the weak mixing angle: a global pursuit
Previous determinations of the weak mixing angle from around the world disagreed, allowing for the possibility that there are new fundamental particles to be discovered, or even pointing to a misunderstanding in how we think about the fundamental forces. A new result from Fermilab helps to resolve the discrepancy and reinforces our standard theory of the fundamental forces.
Our combined understanding of particle physics and cosmology hinges on our understanding of the particles and their interactions: how the Higgs boson breaks the symmetry, how the different forces mesh with each other, whether there are new and unknown particles like dark matter or other unseen forces that could have a dramatic effect on the evolution of the universe.
The details of this symmetry breaking affect why matter is stable at all and how stars and galaxies form. The much improved agreement on the measurement of the weak mixing angle helps cement our understanding of the past, the character of what we observe today, and what we believe is in store for our future.

This is a compilation of measurements of the weak mixing angle. The new Tevatron combined result of sin²θ_eff^lept = 0.23148 ± 0.00033 rivals the precision of the two previous measurements by the LEP-1 experiments at CERN and SLD at SLAC, and it lies midway between them. The Tevatron combination agrees very well with the world average of all results shown by the shaded band.
For two decades, the most precise measurements of the weak mixing angle came from experiments that collided electrons and positrons at the European laboratory CERN and at SLAC National Accelerator Laboratory in California, and the two gave different answers. Their results have been puzzling because the probability that the two measurements agree was less than one part in a thousand, suggesting the possibility of new phenomena. More input was needed.
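A compatibility figure of this kind comes from a standard two-measurement consistency test. The sketch below, in Python, illustrates only the method; the input values and uncertainties are placeholders for illustration, not the actual CERN and SLAC results.

```python
import math

def tension(x1, s1, x2, s2):
    """Two-sided p-value for the hypothesis that two independent
    measurements (value, uncertainty) agree with each other."""
    z = abs(x1 - x2) / math.hypot(s1, s2)   # significance in sigma
    p = math.erfc(z / math.sqrt(2))          # two-sided Gaussian p-value
    return z, p

# Placeholder inputs for illustration only (not the published values).
z, p = tension(0.23220, 0.00029, 0.23100, 0.00026)
print(f"discrepancy: {z:.1f} sigma, p-value ~ {p:.4f}")
```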
Although the environment in Fermilab’s proton-antiproton Tevatron Collider was much harsher than either CERN’s or SLAC’s collider, with many more background particles, the large and well-understood data sets of the Tevatron’s CDF and DZero experiments allowed a new combined measurement that gives almost the same precision as the electron-positron measurements.
The new measurement of the mixing angle, sin²θ_eff^lept = 0.23148 ± 0.00033, lies about midway between the CERN and SLAC measurements and is therefore in good agreement with both of them, as well as with the average of all previous direct and indirect measurements of weak mixing. Occam's razor thus suggests that new particles and forces are not yet necessary and that our present particle physics and cosmology models remain good descriptors of the observed universe.
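At its core, a combination like the Tevatron one is an inverse-variance weighted average (the published CDF and DZero combination also treats correlated systematic uncertainties, which this sketch ignores). The inputs below are placeholders, not the individual experiments' results; they are chosen only to show how two measurements of similar precision combine into an uncertainty smaller by roughly a factor of the square root of two.

```python
def combine(measurements):
    """Inverse-variance weighted average of independent (value, sigma) pairs.
    Correlations between the inputs are ignored in this sketch."""
    weights = [1.0 / sigma**2 for _, sigma in measurements]
    total = sum(weights)
    value = sum(w * x for w, (x, _) in zip(weights, measurements)) / total
    return value, total ** -0.5

# Placeholder inputs (illustration only, not the published CDF/DZero values).
value, sigma = combine([(0.23148, 0.00046), (0.23147, 0.00046)])
print(f"combined: {value:.5f} +/- {sigma:.5f}")  # uncertainty shrinks by ~sqrt(2)
```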
Read more in Physical Review D and on the Tevatron Run II webpage.
Breese Quinn is a member of the DZero collaboration and a physicist at the University of Mississippi. Willis Sakumoto is a member of the CDF collaboration and a physicist at the University of Rochester.