
This photo of the RFQ for the Fermilab PIP-II accelerator was taken during the assembly phase at Lawrence Berkeley National Laboratory. Photo courtesy of Andrew Lambert, Berkeley Lab
In March, the Fermilab Accelerator Division successfully sent beam through a newly commissioned linear accelerator. The brand-new radio-frequency quadrupole (RFQ) linac, designed and built by a team of engineers and physicists at Lawrence Berkeley National Laboratory, will be the start of a proposed upgrade built around an 800-MeV superconducting linear accelerator at Fermilab.
“The RFQ is one of the biggest challenges faced by our group,” said Derun Li, lead scientist on the RFQ development team and deputy head of the Center for Beam Physics at Berkeley Lab. “And seeing it take nearly 100 percent of the source beam on its first try is great!”
The new, front-end accelerator is one of several upgrade projects conducted under PIP-II, a plan to overhaul the Fermilab accelerator complex to produce high-intensity proton beams for the lab’s multiple experiments. PIP-II is supported by the DOE Office of Science.
Currently located at the Cryomodule Test Facility, approximately 1.5 miles northeast of Wilson Hall, the RFQ took first beam – 100-microsecond pulses at 10 hertz – during its first testing phase. Since its first run in March, the team has been working on various commissioning activities, including running the pulsed beam through the RFQ and its transport lines. These activities are expected to continue until June.
The goal of these tests is to show that the new front end can provide intense, focused beams to the entire accelerator complex. The lab’s current RFQ, which sits at the beginning of the laboratory’s accelerator chain, accelerates a negative hydrogen ion beam to 0.75 million electronvolts, or MeV. The new RFQ, which is longer, accelerates the beam to 2.1 MeV, nearly three times the energy. Transported beam current, and therefore power, is the key improvement with the new RFQ. The current RFQ delivers 54 watts of beam power; the new RFQ delivers beam at 21 kilowatts – an increase by a factor of nearly 400.
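A quick back-of-envelope check of those figures, using only the numbers quoted above:

```latex
% Energy gain: the new RFQ's 2.1 MeV vs. the current RFQ's 0.75 MeV
\frac{2.1\ \text{MeV}}{0.75\ \text{MeV}} = 2.8 \quad \text{(nearly three times the energy)}

% Power gain: 21 kW vs. 54 W of beam power
\frac{21{,}000\ \text{W}}{54\ \text{W}} \approx 389 \quad \text{(a factor of nearly 400)}
```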
RFQs are widely used for accelerating low-energy ion beams, and the energy of the beams they produce typically caps off at about 5 MeV, said Paul Derwent, PIP-II Department head. These low-energy protons then undergo further acceleration by other components of Fermilab’s accelerator complex, some to 8 GeV and others to 120 GeV.
The new RFQ is 4.5 meters long and made of four parallel copper vanes, as opposed to the four rods used in the current RFQ. As viewed from one end, the vanes form a symmetrical cross. At the center of the cross is a tiny aperture, or tunnel, through which the beam travels.
If you were to remove one vane and peer inside the RFQ from the side, you would see an intricate pattern of peaks and valleys, similar to a waveform, along the inner edge of each vane. Like puzzle pieces, the vanes fit together to form the small tunnel, its walls rippled with that waveform shape. The farther down the tunnel you go, and therefore the higher the beam energy, the wider the spacing between the peaks and valleys. This widening keeps the time the beam needs to go from peak to valley and back constant, which is necessary for proper acceleration.
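To make that geometry concrete, here is a minimal sketch of the relationship. It assumes an RF frequency of 162.5 megahertz and an injection energy of 0.03 MeV (both assumptions on our part, not figures from this article) and treats the beam as H⁻ ions: each peak-to-valley cell spans half an RF wavelength scaled by the beam’s speed, so a faster beam needs a longer cell to keep the transit time fixed.

```python
import math

C = 2.998e8             # speed of light, m/s
M_H_MEV = 939.3         # approximate H- ion rest mass, MeV/c^2
F_RF = 162.5e6          # assumed RF frequency, Hz
WAVELENGTH = C / F_RF   # RF wavelength, about 1.85 m

def beta(kinetic_energy_mev):
    """Speed as a fraction of c for an H- ion with this kinetic energy."""
    gamma = 1.0 + kinetic_energy_mev / M_H_MEV
    return math.sqrt(1.0 - 1.0 / gamma**2)

def cell_length_cm(kinetic_energy_mev):
    """Peak-to-valley spacing, beta * lambda / 2: the cell lengthens as the
    beam speeds up so the transit time stays at half an RF period."""
    return beta(kinetic_energy_mev) * WAVELENGTH / 2.0 * 100.0

# Assumed injection energy (0.03 MeV) and the two exit energies from the text
for e_mev in (0.03, 0.75, 2.1):
    print(f"{e_mev:5.2f} MeV: beta = {beta(e_mev):.4f}, "
          f"cell length = {cell_length_cm(e_mev):.2f} cm")
```

In this sketch the cells near the 2.1-MeV end come out several times longer than those near the entrance – the widening waveform pattern described above.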
Jim Steimel, the electrical engineering coordinator for PIP-II and a Fermilab liaison for Berkeley’s RFQ development team, said this shape is a special trait of RFQs, one that creates an electromagnetic quadrupole field that focuses low-velocity particles.
“As the beam travels through the RFQ tunnel, longitudinal electrical fields generated by the vane peaks and valleys accelerate particle energy,” Steimel said. “This helps focus the beam and keeps the particles accelerating.”
The Berkeley team successfully designed the accelerator to bring beams to a higher intensity than Fermilab’s previous RFQ technology could achieve, at an output energy that matches PIP-II’s front-end requirements.
“Our challenge was to come up with a design that uses minimum radio-frequency power and delivers the required beam quality and intensity, and to engineer a mechanical design that can withstand continuous operation at high average power,” Li said.
Li’s team took into account potential problems that may occur at the 100 kilowatts or more of radio-frequency power needed to maintain the electromagnetic quadrupole field inside the RFQ.
For example, at higher power, temperatures can rise rapidly, causing thermal stress on the RFQ components. Large water flow rates and durable materials are needed to withstand the heat and prevent deformation – a significant mechanical engineering feat.
“The Berkeley team is proud to have been a key contributor to the first phase of the PIP-II upgrade,” said Wim Leemans, director of Berkeley Lab’s Accelerator Technology and Applied Physics Division. “Berkeley physicists and engineers have been building RFQs for a number of users and purposes for 30 years, and this is a great example of getting the most leverage out of the agency investment.”
Now that the Berkeley and Fermilab teams have demonstrated that the RFQ can generate intense beams in pulses, the next step will be to create a continuous high-intensity beam for PIP-II. The team expects to achieve a continuous beam this summer.
“Fermilab and Berkeley have a long history of collaboration,” Derwent said. “This was just another one where it has worked very well, and their expertise helped us achieve one of our goals.”

Consul General of Brazil Paulo Camargo (eighth from left) recently visited Fermilab, as did Janaina Solomos (fourth from left), also from the Brazilian consulate. Two directors of Brazilian national labs also visited Fermilab: Brazilian Center for Physics Research Director Ronald Shellard (11th from left) and National Observatory Director Joao dos Anjos (13th from left). Fermilab Director Nigel Lockyer is sixth from left. Brazilian scientists at Fermilab include Marcelle Soares-Santos (second from left), Fernanda Garcia (12th from left) and Carlos Escobar (sixth from right). Scientists at Brazilian institutions include Mateus Carneiro of the Brazilian Center for Physics Research (third from left) and Ernesto Kemp of the State University of Campinas (fourth from right). Photo: Reidar Hahn
“Wish you were here” wasn’t written on the postcard Carlos Escobar received in 1981, but it was the sentiment. A friend from Mexico, writing from Brookhaven National Laboratory, had switched from theoretical to experimental physics and was encouraging his Brazilian colleague to do the same. Within a few years Escobar would follow his friend’s advice and be among the first Brazilian physicists to conduct research at Fermilab.
Other Brazilian researchers followed in his footsteps, along the same inroads he helped lay. The Brazilian user community at Fermilab now consists of nearly 80 researchers from 15 institutions working across 13 different projects and experiments.
There was a vision pervasive throughout Latin America when Escobar got that note, he recalls: the region’s accomplished theoretical high-energy physics community needed to begin running experiments of its own. That feeling led Escobar to help organize the Pan American Symposium on Elementary Particles and Technology in 1983, dedicated to showcasing HEP experiments around the globe. Directors of Fermilab, CERN, DESY and other influential labs were in attendance, and at the close of the conference, Fermilab Director Emeritus Leon Lederman bluntly said that since Brazilian institutions were not ready to pay for their researchers to come to Fermilab, he would.
Thus, Escobar, then based at the University of São Paulo, along with Alberto Santoro, João dos Anjos and Moacyr Souza, all from the Brazilian Center for Physics Research in Rio de Janeiro, formed the first collaboration between Fermilab and South American institutions. The team worked on a fixed-target experiment, known as E691, for two years before bringing what they had learned back to their home institutions.
“Of course, that was the whole purpose of us coming here,” said Escobar, now a guest scientist in the Fermilab Neutrino Division. “To return to Brazil and form our groups (Rio and São Paulo) working on experimental high-energy physics.”
So, when Marcelle Soares-Santos came to Fermilab as a graduate student in 2008, she inherited some dividends of Escobar’s and others’ work, notably an opportunity to work on a precursor of the Dark Energy Survey. She also benefited from some perks of a well-established scientific relationship, including a Brazil House in the Fermilab Village for students and a Brazilian community to help her transition to lab life.
Soares-Santos notes that as the scientific community in Brazil has blossomed, its contingent at Fermilab has grown in step and diversified into new projects. Currently, the largest group of Brazilian researchers collaborating with Fermilab works on the Dark Energy Survey, though most do so remotely. The largest groups of on-site Brazilians conduct research in the Neutrino Division or on the CMS experiment.
Soares-Santos is now a permanent employee of the lab and says most of the researchers she meets follow the path the four Brazilians laid out: They stay for a relatively short term and leave to build on the science back home. She believes this benefits the researchers and the scientific community.
“It’s noticeable how much people grow,” Soares-Santos said. “And that, I think, has an impact back home on the level of science we’re doing today and bodes well for the level of science we hope to achieve.”

Pilar Coloma (left) and Seyda Ipek write calculations from floor to ceiling as they try to find solutions to lingering questions about our current models of the universe. Photo: Rashmi Shivni, OC
Some of the ideas you’ve probably had about theoretical physicists are true.
They toil away at complicated equations. The amount of time they spend on their computers rivals that of millennials on their hand-held devices. And almost nothing of what they turn up will ever be understood by most of us.
The statements are true, but as you might expect, the resulting portrait of ivory tower isolation misses the mark.
The theorist’s task is to explain why we see what we see and predict what we might expect to see, and such pronouncements can’t be made from the proverbial armchair. Theorists work with experimentalists, their counterparts in the proverbial field, as a vital part of the feedback loop of scientific investigation.
“Sometimes I bounce ideas off experimentalists and learn from what they have seen in their results,” said Fermilab theorist Pilar Coloma, who studies neutrino physics. “Or they may find something profound in theory models that they want to test. My job is all about pushing the knowledge forward so other people can use it.”
Predictive power
Theorists in particle physics — the Higgses and Hawkings of the world — push knowledge by making predictions about particle interactions. Starting from the framework known as the Standard Model, they calculate, say, the likelihood of numerous outcomes from the interaction of two electrons, like a blackjack player scanning through the possibilities for the dealer’s next draw.
Experimentalists can then seek out the predicted phenomena, rooting around in the data for something never seen before.
Theorists’ predictions keep experimentalists from having to shoot in the dark. Like an experienced paleontologist, the theorist can tell the experimentalist where to dig to find something new.
“We simulate many fake events,” Coloma said. “The simulated data determines the prospects for an experiment or puts a bound on a new physics model.”
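A toy illustration of what those fake events can show – every number here is invented, and this stands in for a far more elaborate simulation chain – is to generate many fake experiments in which only known background processes exist, then count how often background alone fluctuates up to a signal-sized excess:

```python
import math
import random

random.seed(42)

# All numbers below are invented for illustration only
EXPECTED_BACKGROUND = 100.0   # mean background event count the theory predicts
SIGNAL_YIELD = 25.0           # extra events a hypothetical new particle would add
N_FAKE_EXPERIMENTS = 100_000

def fake_count(mean):
    """One fake experiment: a Poisson-like event count, approximated as a
    Gaussian with sigma = sqrt(mean), which is adequate for large means."""
    return max(0, round(random.gauss(mean, math.sqrt(mean))))

threshold = EXPECTED_BACKGROUND + SIGNAL_YIELD
false_alarms = sum(fake_count(EXPECTED_BACKGROUND) >= threshold
                   for _ in range(N_FAKE_EXPERIMENTS))
print(f"Background alone reaches {threshold:.0f}+ events in "
      f"{100 * false_alarms / N_FAKE_EXPERIMENTS:.2f}% of fake experiments")
```

If the background rarely fluctuates that high, an observed excess of that size would be worth chasing; if it often does, the experiment can at best put a bound on the new physics model.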
The Higgs boson provides one example. By 2011, a year before CERN’s ATLAS and CMS experiments announced they’d discovered the Higgs boson, theorists had put forth nearly 100 different predictions for the particle’s mass, arrived at by nearly as many different methods. Many of the predictions were indeed in the neighborhood of the mass the two experiments eventually measured.
And like the paleontologist presented with a new fossil, the theorist also offers explanations for unexplained sightings in experimentalists’ data. She might compare the particle signatures in the detector against her many fake events. Or, given an intriguing measurement, she might fold it into the next iteration of calculations. If experimentalists see a particle made of a quark combination not yet on the books, theorists respond by explaining the underlying mechanism or, if there isn’t one yet, working it out.
“Experimentalists give you information. ‘We think this particle is of this type. Do you know of any Standard Model particle that fits?’” said Seyda Ipek, a theorist studying the matter-antimatter imbalance in the universe. “At first it might not be obvious, because when you add something new, you change the other observations you know are in the Standard Model, and that puts a constraint on your models.”
And since the grand aim of particle physics theory is to be able to explain all of nature, the calculation developed to explain a new phenomenon must be extendible to a general principle.
“Unless you have a very good prediction from theory, you can’t convert that experimental measurement into a parameter that appears in the underlying theory of the Standard Model,” said Fermilab theorist John Campbell, who works on precision theoretical predictions for the ATLAS and CMS experiments at the Large Hadron Collider.
Calculating moves
The theorist’s calculation starts with the prospect of a new measurement or a hole in a theory.
“You look at the interesting things that an experiment is going to measure or that you have a chance of measuring,” Campbell said. “If the data agrees with theory everywhere, there’s not much room for new physics. So you look for small deviations that might be a sign of something. You’re really trying to dream up a new set of interactions that might explain why the data doesn’t agree somewhere.”
In its raw form, particle physics data is the amount and location of the energy a particle deposits in a particle detector. The more sensitive the detector, the more accurate the experimentalists’ measurement, and the more precise the corresponding calculation needs to be.

Fermilab theorists John Campbell (left) and Ye Li work on a calculation that describes the interactions you might expect to see in the complicated environment of the LHC. Photo: Rashmi Shivni
The CMS detector at the Large Hadron Collider, for example, allows scientists to measure some probabilities of particle interactions to within a few percent. And that’s after taking into account that it takes one million or even one billion proton-proton collisions to produce just one interesting interaction that CMS would like to measure.
“When you’re making the measurement that accurately, it demands a prediction at a very high level,” Campbell said. “If you’re looking for something unexpected, then you need to know the expected part in quite a lot of detail.”
A paleontologist recognizes the vertebra of a brachiosaurus, and the theoretical particle physicist knows what the production of a pair of top quarks looks like in the detector. A departure from the known picture triggers him to take action.
“So then you embark on this calculation,” Campbell said.
Embark, indeed. These calculations are not pencil-and-paper assignments. A single calculation predicting the details of a particle interaction, for example, can be a prodigious effort that takes months or years.
So-called loop corrections are one example: Theorists home in on what happens during a particle event by adding detail — a correction — to an approximate picture.
Consider two electrons that approach each other, exchange a photon and diverge. Zooming in further, you predict that the photon briefly splits into yet another pair of particles, which recombine before the photon is absorbed by the second electron. And perhaps you predict that, at the same time, one of the electrons emits and reabsorbs another photon all on its own.
Each additional quantum-scale effect, or loop, in the big-picture interaction is like pennies on the dollar, changing the accounting of the total transaction – here, the predicted value of a particle’s mass or of the interaction strength between two particles.
With each additional loop, the task of performing the calculation becomes that much more formidable. (“Loop” reflects how the effects are represented pictorially in Feynman diagrams – details in the approximate picture of the interaction.) The one-loop corrections for producing a Higgs boson from two colliding protons were not completed until 1991. It took another 10 years to complete the two-loop corrections for the process, and it wasn’t until this year, 2016, that theorists finished the three-loop corrections. Precise measurements at the Large Hadron Collider would (and do) require equally precise predictions to determine what kind of Higgs boson scientists are seeing, demanding the decades-long investment.
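Schematically, the series the theorists are working through can be written as below, with the strong coupling α_s tagging each additional loop (a standard way of writing a perturbative expansion, shown for illustration rather than as the actual Higgs formula):

```latex
% Each successive term is a smaller correction: pennies on the dollar
\sigma \;=\; \sigma_{\mathrm{LO}}
      \;+\; \alpha_s\,\sigma^{(1)}      % one loop, completed in 1991
      \;+\; \alpha_s^{2}\,\sigma^{(2)}  % two loops, roughly a decade later
      \;+\; \alpha_s^{3}\,\sigma^{(3)}  % three loops, finished in 2016
      \;+\; \cdots
```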
“Doing these calculations is not straightforward, or we would have done them a long time ago,” Campbell said.
Once theorists complete a calculation, they might publish a paper or otherwise make their code broadly available. From there, experimentalists can use the code to simulate how the predicted process will look in the detector. Farms of computers map out millions of fake events that take into account the new predictions provided courtesy of the theorist.
“Without a network of computers available, our studies can’t be done in a reasonable time,” Coloma said. “A single computer cannot analyze millions of data points, just as a human being could never take on such a task.”
If the simulation shows that, for example, a particle might decay in more ways than what the experiment has seen, the theorist could suggest that experimentalists expand their search.
“We’ve pushed experiments to look in different channels,” Ipek said. “They could look into decays of particles into two-body states, but why not also 10-body states?”
Theorists also work with an experiment, or multiple experiments, to put their calculations to best use. Armed with code, experimentalists can change a parameter or two to guide them in their search for new physics. What happens, for example, if the Higgs boson interacts a little more strongly with the top quark than we expect? How would that change what we see in our detectors?
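A minimal sketch of that kind of parameter scan, assuming a toy process whose leading-order rate grows with the square of a top-Higgs coupling modifier κ_t (the baseline of 50 expected events is invented for illustration):

```python
# Toy parameter scan: how would the expected event yield change if the
# Higgs-top coupling were scaled by kappa_t relative to the Standard Model?
SM_EXPECTED_EVENTS = 50.0   # invented baseline yield for the Standard Model case

def expected_events(kappa_t):
    """Toy model: assume the leading-order rate scales as kappa_t squared."""
    return SM_EXPECTED_EVENTS * kappa_t**2

# Scan a few values of the coupling around the Standard Model point (kappa_t = 1)
for kappa_t in (0.8, 1.0, 1.2):
    print(f"kappa_t = {kappa_t:.1f}: expect about {expected_events(kappa_t):.0f} events")
```

In this toy, a 20 percent shift in the coupling moves the expected yield by roughly 40 percent – the sort of change experimentalists can then go look for in the data.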
“That’s a question they can ask and then answer,” Campbell said. “Anyone can come up with a new theory. It is best to try to provide a concrete plan that they can follow.”
Outlandish theories and concrete plans
Concrete plans ensure a fruitful relationship between experiment and theory. But the wilder, unconventional theories scientists dream up have their utility too, taking the field into exciting, uncharted territory.
Theorists who specialize in physics beyond the Standard Model, for example, generate thousands of proposals worldwide for new physics – new phenomena that would show up as energy deposits in the detector where you don’t expect to see them.
“Even if things don’t end up existing, it encourages the experiment to look at its data in different ways,” Campbell said. An experiment could take so much data that you might worry some fun effect is hiding in it, never to be seen. Having truckloads of theories helps guard against that. “You’re trying to come up with as many outlandish ideas as you can in the hope that you cover as many of those possibilities as you can.”
Theorists bridge the gap between the pure mathematics that describes nature and the data through which nature manifests.
“The field itself is challenging, but theory takes us to new places and helps us imagine new phenomena,” Ipek said. “We collectively work toward understanding every detail of our universe, and that’s what ultimately matters most.”