The Planck scale

Consul General of Brazil Paulo Camargo (eighth from left) recently visited Fermilab, as did Janaina Solomos (fourth from left), also from the Brazilian consulate. Two directors of Brazilian national labs also visited Fermilab: Brazilian Center for Research in Physics Director Ronald Shellard (11th from left) and National Observatory Director Joao dos Anjos (13th from left). Fermilab Director Nigel Lockyer is sixth from the left. Brazilian scientists at Fermilab include Marcelle Soares-Santos (second from left), Fernanda Garcia (12th from left) and Carlos Escobar (sixth from right). Scientists at Brazilian institutions include Mateus Carneiro of the Brazilian Center for Research in Physics (third from left) and Ernesto Kemp of the State University of Campinas (fourth from right). Photo: Reidar Hahn

“Wish you were here” wasn’t written on the postcard Carlos Escobar received in 1981, but it was the sentiment. A friend from Mexico, writing from Brookhaven National Laboratory, had switched from theoretical to experimental physics and was encouraging his Brazilian colleague to do the same. Within a few years Escobar would follow his friend’s advice and be among the first Brazilian physicists to conduct research at Fermilab.

Other Brazilian researchers followed in his footsteps along the inroads he helped lay. The Brazilian user community at Fermilab now consists of nearly 80 researchers from 15 institutions working across 13 different projects and experiments.

Escobar recalls that when he received that note, a vision was pervasive throughout Latin America: its accomplished theoretical high-energy physics community needed to begin running experiments of its own. That feeling led Escobar to help organize the Pan American Symposium on Elementary Particles and Technology in 1983, dedicated to showcasing high-energy physics experiments around the globe. Directors of Fermilab, CERN, DESY and other influential labs were in attendance, and at the close of the conference, Fermilab Director Emeritus Leon Lederman bluntly said that since Brazilian institutions were not ready to pay for their researchers to come to Fermilab, he would.

Brazilian scientists were working on Fermilab experiment E769 in 1989. Photo: Fred Ullrich

Thus, Escobar, then based at the University of São Paulo, along with Alberto Santoro, João dos Anjos and Moacyr Souza, all from the Brazilian Center for Research in Physics in Rio de Janeiro, formed the first collaboration between Fermilab and South American institutions. The team worked on a fixed-target experiment, known as E691, for two years before bringing what they had learned back to their home institutions.

“Of course, that was the whole purpose of us coming here,” said Escobar, now a guest scientist in the Fermilab Neutrino Division. “To return to Brazil and form our groups (Rio and São Paulo) working on experimental high-energy physics.”

So, when Marcelle Soares-Santos came to Fermilab as a graduate student in 2008, she inherited some dividends of Escobar’s and others’ work, notably an opportunity to work on a precursor of the Dark Energy Survey. She also benefited from some perks of a well-established scientific relationship, including a Brazil House in the Fermilab Village for students and a Brazilian community to help her transition to lab life.

Soares-Santos notes that as the scientific community in Brazil has blossomed, its contingent at Fermilab has grown accordingly and diversified into new projects. Currently, the largest contingent of Brazilian researchers collaborating with Fermilab works on DES, though most do so remotely. The largest groups of on-site Brazilians conduct research in the Neutrino Division or on CMS at Fermilab.

Soares-Santos is now a permanent employee of the lab and says most of the researchers she meets follow the path the four Brazilians laid out: They stay for a relatively short term and leave to build on the science back home. She believes this benefits the researchers and the scientific community.

“It’s noticeable how much people grow,” Soares-Santos said. “And that, I think, has an impact back home on the level of science we’re doing today and bodes well for the level of science we hope to achieve.”

Pilar Coloma (left) and Seyda Ipek write calculations from floor to ceiling as they try to find solutions to lingering questions about our current models of the universe. Photo: Rashmi Shivni, OC

Some of the ideas you’ve probably had about theoretical physicists are true.

They toil away at complicated equations. The amount of time they spend on their computers rivals that of millennials on their hand-held devices. And almost nothing of what they turn up will ever be understood by most of us.

The statements are true, but as you might expect, the resulting portrait of ivory tower isolation misses the mark.

The theorist’s task is to explain why we see what we see and to predict what we might expect to see, and such pronouncements can’t be made from the proverbial armchair. Theorists work with experimentalists, their counterparts in the field, as a vital part of the feedback loop of scientific investigation.

“Sometimes I bounce ideas off experimentalists and learn from what they have seen in their results,” said Fermilab theorist Pilar Coloma, who studies neutrino physics. “Or they may find something profound in theory models that they want to test. My job is all about pushing the knowledge forward so other people can use it.”

 

Predictive power

Theorists in particle physics — the Higgses and Hawkings of the world — push knowledge by making predictions about particle interactions. Starting from the framework known as the Standard Model, they calculate, say, the likelihood of numerous outcomes from the interaction of two electrons, like a blackjack player scanning through the possibilities for the dealer’s next draw.

Experimentalists can then seek out the predicted phenomena, rooting around in the data for a signal no one has seen before.

Theorists’ predictions keep experimentalists from having to shoot in the dark. Like an experienced paleontologist, the theorist can tell the experimentalist where to dig to find something new.

“We simulate many fake events,” Coloma said. “The simulated data determines the prospects for an experiment or puts a bound on a new physics model.”
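To make that concrete, here is a minimal toy sketch, in Python, of the pseudo-experiment idea: simulate many fake experiments under a background-plus-signal hypothesis and ask how large the signal could be before the data would have ruled it out. The background rate, observed count and simple counting-experiment logic are illustrative assumptions, not code or numbers from any real analysis.

# Toy pseudo-experiments, loosely in the spirit Coloma describes.
# All numbers are made up for illustration.
import numpy as np

rng = np.random.default_rng(seed=1)

background = 50.0     # assumed expected background events
observed = 52         # assumed event count seen in (fake) data
n_pseudo = 100_000    # how many fake experiments to simulate

def is_excluded(signal):
    # Simulate many fake experiments that include a hypothetical signal.
    # If fewer than 5% of them yield a count as low as the one observed,
    # a signal of that size is excluded at roughly 95% confidence.
    counts = rng.poisson(background + signal, size=n_pseudo)
    return np.mean(counts <= observed) < 0.05

# Scan upward in signal strength until the hypothesis is excluded.
signal = 0.0
while not is_excluded(signal):
    signal += 0.5

print(f"Toy 95% CL upper limit: about {signal} signal events")

Run over many detector configurations or exposure times, this is the same pattern by which simulated data determines the prospects for an experiment before it ever records a real event.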

The Higgs boson provides one example. By 2011, a year before CERN’s ATLAS and CMS experiments announced they’d discovered the Higgs boson, theorists had put forth nearly 100 predictions for the particle’s mass, arrived at by nearly as many different methods. Many of those predictions were indeed in the neighborhood of the mass the two experiments measured.

And like the paleontologist presented with a new fossil, the theorist also offers explanations for unexplained sightings in experimentalists’ data. She might compare the particle signatures in the detector against her many fake events. Or, given an intriguing measurement, she might fold it into the next iteration of calculations. If experimentalists see a particle made of a quark combination not yet on the books, theorists respond by explaining the underlying mechanism or, if no explanation exists yet, working one out.

“Experimentalists give you information. ‘We think this particle is of this type. Do you know of any Standard Model particle that fits?’” said Seyda Ipek, a theorist studying the matter-antimatter imbalance in the universe. “At first it might not be obvious, because when you add something new, you change the other observations you know are in the Standard Model, and that puts a constraint on your models.”

And since the grand aim of particle physics theory is to be able to explain all of nature, the calculation developed to explain a new phenomenon must be extendible to a general principle.

“Unless you have a very good prediction from theory, you can’t convert that experimental measurement into a parameter that appears in the underlying theory of the Standard Model,” said Fermilab theorist John Campbell, who works on precision theoretical predictions for the ATLAS and CMS experiments at the Large Hadron Collider.

 

Calculating moves

The theorist’s calculation starts with the prospect of a new measurement or a hole in a theory.

“You look at the interesting things that an experiment is going to measure or that you have a chance of measuring,” Campbell said. “If the data agrees with theory everywhere, there’s not much room for new physics. So you look for small deviations that might be a sign of something. You’re really trying to dream up a new set of interactions that might explain why the data doesn’t agree somewhere.”

In its raw form, particle physics data is the amount and location of the energy a particle deposits in a particle detector. The more sensitive the detector, the more accurate the experimentalists’ measurement, and the more precise the corresponding calculation needs to be.

Fermilab theorists John Campbell (left) and Ye Li work on a calculation that describes the interactions you might expect to see in the complicated environment of the LHC. Photo: Rashmi Shivni

The CMS detector at the Large Hadron Collider, for example, allows scientists to measure some probabilities of particle interactions to within a few percent. And that’s after taking into account that it takes one million or even one billion proton-proton collisions to produce just one interesting interaction that CMS would like to measure.

“When you’re making the measurement that accurately, it demands a prediction at a very high level,” Campbell said. “If you’re looking for something unexpected, then you need to know the expected part in quite a lot of detail.”
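Rough arithmetic, using round and purely illustrative numbers rather than official figures, shows why such precision is within reach at all: even a once-in-a-billion process piles up millions of events over something like the 2 quadrillion collisions quoted for the 2016 run, and the statistical uncertainty on a counted sample shrinks as one over the square root of its size:

\[
N \approx 10^{-9} \times 2\times10^{15} \approx 2\times10^{6}\ \text{events}, \qquad \frac{\Delta N}{N} \approx \frac{1}{\sqrt{N}} \approx 0.07\%.
\]

In practice, systematic uncertainties rather than raw statistics usually set the few-percent limit, which is exactly why the theory predictions compared against the data need to be so precise.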

A paleontologist recognizes the vertebra of a brachiosaurus, and the theoretical particle physicist knows what the production of a pair of top quarks looks like in the detector. A departure from the known picture triggers him to take action.

“So then you embark on this calculation,” Campbell said.

Embark, indeed. These calculations are not pencil-and-paper assignments. A single calculation predicting the details of a particle interaction, for example, can be a prodigious effort that takes months or years.

So-called loop corrections are one example: Theorists home in on what happens during a particle event by adding detail — a correction — to an approximate picture.

Consider two electrons that approach each other, exchange a photon and diverge. Zooming in further, you predict that the photon emits and reabsorbs yet another pair of particles before it itself is reabsorbed by the electron pair. And perhaps you predict that, at the same time, one of the electrons emits and reabsorbs another photon all on its own.

Each additional quantum-scale effect, or loop, in the big-picture interaction is like pennies on the dollar, changing the accounting of the total transaction — the precision of a particle mass calculation or of the interaction strength between two particles.
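Written schematically, a generic perturbative expansion of this kind (not any specific Fermilab calculation) looks like

\[
\sigma \;=\; \sigma_{\text{LO}}\,\bigl(1 + c_1\,\alpha + c_2\,\alpha^{2} + c_3\,\alpha^{3} + \cdots\bigr),
\]

where \(\sigma_{\text{LO}}\) is the lowest-order, loop-free prediction, \(\alpha\) is the relevant coupling strength, and the coefficients \(c_1, c_2, c_3\) come from the one-, two- and three-loop calculations. Because the coupling is a small number, each successive term adjusts the total by less and less: those are the pennies on the dollar of the analogy.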

With each additional loop, the task of performing the calculation becomes that much more formidable. (“Loop” reflects how the effects are represented pictorially in Feynman diagrams — details in the approximate picture of the interaction.) One-loop corrections for the production of a Higgs boson from two colliding protons weren’t completed until 1991. It took another 10 years to complete the two-loop corrections for the process. And it wasn’t until this year, 2016, that theorists finished computing the three-loop corrections. Precise measurements at the Large Hadron Collider would (and do) require precise predictions to determine the kind of Higgs boson that scientists would see, demanding the decades-long investment.

“Doing these calculations is not straightforward, or we would have done them a long time ago,” Campbell said.

Once the theorist completes a calculation, they might publish a paper or otherwise make their code broadly available. From there, experimentalists can use the code to simulate how the predicted process will look in the detector. Farms of computers map out millions of fake events that take into account the new predictions provided courtesy of the theorist.

“Without a network of computers available, our studies can’t be done in a reasonable time,” Coloma said. “A single computer cannot analyze millions of data points, just as a human being could never take on such a task.”

If the simulation shows that, for example, a particle might decay in more ways than what the experiment has seen, the theorist could suggest that experimentalists expand their search.

“We’ve pushed experiments to look in different channels,” Ipek said. “They could look into decays of particles into two-body states, but why not also 10-body states?”

Theorists also work with an experiment, or multiple experiments, to put their calculations to best use. Armed with code, experimentalists can change a parameter or two to guide them in their search for new physics. What happens, for example, if the Higgs boson interacts a little more strongly with the top quark than we expect?  How would that change what we see in our detectors?

“That’s a question they can ask and then answer,” Campbell said. “Anyone can come up with a new theory. It is best to try to provide a concrete plan that they can follow.”
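As a cartoon of that kind of turn-the-knob exercise, the sketch below scales a hypothetical top-Higgs coupling modifier and watches a toy event yield respond. The baseline yield and the assumption that the rate grows with the square of the coupling are illustrative simplifications, not any experiment’s actual model.

# Illustrative only: a toy scan over a hypothetical Higgs-top coupling modifier.
# Assumes a process whose rate grows with the square of that coupling.

baseline_yield = 1000.0   # assumed expected events at the nominal coupling

def expected_yield(kappa_t):
    # Toy scaling: rate proportional to the coupling squared.
    return baseline_yield * kappa_t ** 2

for kappa_t in (0.9, 1.0, 1.1, 1.2):
    print(f"coupling modifier {kappa_t:.1f} -> about {expected_yield(kappa_t):.0f} expected events")

In this toy, a 10 percent shift in the coupling moves the yield by roughly 20 percent, the kind of difference an experiment can then go look for in its data.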

 

Outlandish theories and concrete plans

Concrete plans ensure a fruitful relationship between experiment and theory. But the wilder, more unconventional theories scientists dream up, the ones that take the field into exciting, uncharted territory, have their utility too.

Theorists who specialize in physics beyond the Standard Model, for example, generate thousands of theories worldwide for new physics – new phenomena that would show up as energy deposits in the detector where you don’t expect to see them.

“Even if things don’t end up existing, it encourages the experiment to look at its data in different ways,” Campbell said. An experiment could take so much data that you might worry that some fun effect is hiding, never to be seen. Having truckloads of theories helps guard against that. “You’re trying to come up with as many outlandish ideas as you can in the hope that you cover as many of those possibilities as you can.”

Theorists bridge the gap between the pure mathematics that describes nature and the data through which nature manifests itself.

“The field itself is challenging, but theory takes us to new places and helps us imagine new phenomena,” Ipek said. “We collectively work toward understanding every detail of our universe, and that’s what ultimately matters most.”

Experiments at the LHC are once again recording collisions at extraordinary energies

Collisions recorded on May 7, 2016, by the CMS detector on the Large Hadron Collider. After a winter break, the LHC is now taking data again at extraordinary energies. Image: CERN

Editor’s note: The following news release about the restart of the Large Hadron Collider is being issued by the U.S. Department of Energy’s Fermi National Accelerator Laboratory on behalf of the U.S. scientists working on the LHC. Fermilab serves as the U.S. hub for the CMS experiment at the LHC and the roughly 1,000 U.S. scientists who work on that experiment, including about 100 Fermilab employees. Fermilab is a Tier 1 computing center for LHC data and hosts a Remote Operations Center to process and analyze that data. Read more information about Fermilab’s role in the CMS experiment and the LHC. Fermilab scientists are available for interviews upon request, including Joel Butler, recently elected next spokesperson of the CMS experiment.  

After months of winter hibernation, the Large Hadron Collider is once again smashing protons and taking data. The LHC will run around the clock for the next six months and produce roughly 2 quadrillion high-quality proton collisions, six times more than in 2015 and just shy of the total number of collisions recorded during the nearly three years of the collider’s first run.

“2015 was a recommissioning year. 2016 will be a year of full data production during which we will focus on delivering the maximum number of data to the experiments,” said Fabiola Gianotti, CERN director general.

The LHC is the world’s most powerful particle accelerator. Its collisions produce subatomic fireballs of energy, which morph into the fundamental building blocks of matter. The four particle detectors located on the LHC’s ring allow scientists to record and study the properties of these building blocks and look for new fundamental particles and forces.

“We’re proud to support more than a thousand U.S. scientists and engineers who play integral parts in operating the detectors, analyzing the data and developing tools and technologies to upgrade the LHC’s performance in this international endeavor,” said Jim Siegrist, associate director of science for high-energy physics in the U.S. Department of Energy’s Office of Science. “The LHC is the only place in the world where this kind of research can be performed, and we are a fully committed partner on the LHC experiments and the future development of the collider itself.”

Between 2010 and 2013 the LHC produced proton-proton collisions with up to 8 teraelectronvolts of energy. In the spring of 2015, after a two-year shutdown, LHC operators ramped up the collision energy to 13 TeV. This increase in energy enables scientists to explore a new realm of physics that was previously inaccessible. Run II collisions also produce Higgs bosons — the groundbreaking particle discovered in LHC Run I — 25 percent faster than Run I collisions and increase the chances of finding new massive particles by more than 40 percent.

Almost everything we know about matter is summed up in the Standard Model of particle physics, an elegant map of the subatomic world. During the first run of the LHC, scientists on the ATLAS and CMS experiments discovered the Higgs boson, the cornerstone of the Standard Model that helps explain the origins of mass. The LHCb experiment also discovered never-before-seen five-quark particles, and the ALICE experiment studied the near-perfect liquid that existed immediately after the Big Bang. All these observations are in line with the predictions of the Standard Model.

“So far the Standard Model seems to explain matter, but we know there has to be something beyond the Standard Model,” said Denise Caldwell, director of the Physics Division of the National Science Foundation. “This potential new physics can only be uncovered with more data that will come with the next LHC run.”

For example, the Standard Model contains no explanation of gravity, which is one of the four fundamental forces in the universe. It also does not explain astronomical observations of dark matter, a type of matter that interacts with our visible universe only through gravity, nor does it explain why matter prevailed over antimatter during the formation of the early universe. The small mass of the Higgs boson also suggests that matter is fundamentally unstable.

The new LHC data will help scientists verify the Standard Model’s predictions and push beyond its boundaries. Many predicted and theoretical subatomic processes are so rare that scientists need billions of collisions to find just a small handful of events that are clean and scientifically interesting. Scientists also need an enormous amount of data to precisely measure well-known Standard Model processes. Any significant deviations from the Standard Model’s predictions could be the first step towards new physics.

The United States is the largest national contributor to both the ATLAS and CMS experiments, with 45 U.S. universities and laboratories working on ATLAS and 49 working on CMS.

CERN, the European Organization for Nuclear Research, is the world’s leading laboratory for particle physics. It has its headquarters in Geneva. At present, its member states are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Israel, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland and the United Kingdom. Romania is a candidate for accession. Cyprus and Serbia are associate members in the pre-stage to membership. Turkey and Pakistan are associate members. India, Japan, the Russian Federation, the United States of America, the European Union, JINR and UNESCO have observer status.

Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @Fermilab.

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2015, its budget was $7.3 billion. NSF funds reach all 50 states through grants to nearly 2,000 colleges, universities and other institutions. Each year, NSF receives about 48,000 competitive proposals for funding and makes about 11,000 new funding awards. NSF also awards about $626 million in professional and service contracts yearly.