Fermilab feature

Timing neutrinos with White Rabbit

From left: Donatella Torretta, William Badgett and Angela Fava fine tune the White Rabbit synchronization system for the Fermilab Short-Baseline Neutrino Program. Photo: Reidar Hahn

Being on time is important – just ask Lewis Carroll’s leporine friend – and few know this better than particle physicists, whose work revolves around keeping track of near-light-speed blips of matter.

As particle accelerators and experiments have become increasingly complex and choreographed over the decades, technology behind the scenes has had to innovate to keep up. One such example is White Rabbit, a clever timing and data transfer system that is playing a key role in modern particle physics.

“We are always pushing our experiments to higher and higher precisions,” said Angela Fava, scientist on Fermilab’s ICARUS neutrino detector and part of the team exploring White Rabbit at Fermilab. “White Rabbit is really useful because it can reach time precisions down to less than a billionth of a second.”

Keeping time

In modern particle accelerators, many separate components have to be activated in sequence at just the right moments to identify and track particles passing by at nearly the speed of light. This requires very precise synchronization and timing systems to determine when these events should occur – an egg timer won’t cut it here.

Until recently, this timing has usually been achieved with devices hard-wired into experimental equipment, such as the General Machine Timing (GMT) system at CERN. But GMT has limitations, including low data bandwidth, the ability to send signals only one way through the network, and an inability to self-calibrate — to internally calculate how long a signal has taken to travel — which results in timing errors.

As experiments grow in complexity and require nanosecond coordination, physicists have been left with a need for a one-size-fits-all system that can provide the required time synchronization and still be compatible with systems from multiple sources and vendors that are already in place.

The solution is White Rabbit, an open-source system that builds on common and accessible Ethernet technology – the same technology behind wired internet access. The system works kind of like an everyday computer network, too, with circuit boards called “nodes,” controlled by a specially written program.

Up to around 1,000 nodes can be linked in one White Rabbit network, all connected together with a web of optical fibers – up to 10 kilometers long – to exchange information. As the technology develops, the system will likely be able to support even more nodes separated by greater distances.

Since precise timing is so important in modern experiments, White Rabbit’s power comes in its ability to keep itself synchronized, no matter the cable length between nodes or other external factors. Even relatively small changes in cable temperature can affect travel time on the scale of nanoseconds, for example.
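
To put rough numbers on that effect (the values below are typical textbook figures for optical fiber, assumed here for illustration rather than taken from any particular installation): light moves through fiber at roughly two thirds of its vacuum speed, and a fiber’s propagation delay drifts by a few tens of picoseconds per kilometer for each degree of temperature change.

```python
# Back-of-the-envelope fiber-delay estimate. All constants are typical,
# assumed values for illustration, not parameters of a specific White Rabbit link.
C_VACUUM_KM_PER_S = 299_792.458       # speed of light in vacuum, km/s
GROUP_INDEX = 1.47                    # typical group refractive index of optical fiber
THERMAL_COEFF_PS_PER_KM_K = 40        # typical delay drift, picoseconds per km per degree

fiber_km = 10                                                # longest links mentioned above
delay_us = fiber_km * GROUP_INDEX / C_VACUUM_KM_PER_S * 1e6  # one-way propagation delay
drift_ns = fiber_km * THERMAL_COEFF_PS_PER_KM_K * 5 / 1000   # drift for a 5-degree swing

print(f"one-way delay: {delay_us:.0f} microseconds")                # about 49 microseconds
print(f"drift from a 5-degree change: {drift_ns:.1f} nanoseconds")  # about 2 nanoseconds
```

A fixed delay of about 49 microseconds is easy to calibrate out once; it is the nanosecond-scale drift that makes continuous, automatic self-calibration worthwhile.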

A White Rabbit network is organized as a hierarchy, in which one of the nodes is designated the “master” and is responsible for keeping all the other nodes in check. The reference time is fed into the master from high-precision atomic clocks via orbiting GPS satellites, the same technology on which Google Maps navigation is based.

This exact time is digitally attached to blips of data – which, for example, include control instructions for accelerators – that constantly fly around the network. By sending the time tags back and forth between nodes, which GMT isn’t able to do, the system can calculate the time delays it takes for data to travel through cables and correct for them, keeping all the nodes in synchronization with the correct time and ensuring experimental events are kept coordinated.
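
Under the hood, that delay measurement is the classic two-way time-transfer calculation used by Precision Time Protocol-style systems, which White Rabbit builds on. The sketch below shows only the arithmetic, with made-up nanosecond timestamps; the real hardware refines this to sub-nanosecond precision with hardware timestamping and phase measurement.

```python
# Minimal sketch of two-way time transfer between a master and a node.
# Timestamp names (t1..t4) and values are illustrative, not the actual
# White Rabbit firmware interface.

def link_delay_and_offset(t1, t2, t3, t4):
    """t1: master sends a sync message, t2: node receives it,
    t3: node sends a reply, t4: master receives the reply."""
    round_trip = (t4 - t1) - (t3 - t2)    # total time the messages spent on the fiber
    one_way = round_trip / 2              # assumes the link is symmetric
    offset = ((t2 - t1) - (t4 - t3)) / 2  # how far the node's clock is off from the master
    return one_way, offset

# Example: the node's clock runs 8 ns ahead, and the fiber adds 50,000 ns each way.
delay, offset = link_delay_and_offset(t1=0, t2=50_008, t3=50_058, t4=100_050)
print(delay, offset)  # 50000.0 ns of fiber delay, 8.0 ns of clock offset
```

The “assumes the link is symmetric” comment is the important caveat: splitting the round trip in half only works if the two directions of the fiber take the same time, so any asymmetry has to be calibrated separately.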

Fava and scientist Donatella Torretta, together with William Badgett at Fermilab, are currently working on installing White Rabbit in some of Fermilab’s experiments, including the Short-Baseline Neutrino (SBN) Program, which will study neutrinos – tiny, elusive particles. In the first use of White Rabbit in North America, the system will time-tag the neutrinos from their production at the beam source through to the detector at the end of the experiment.

On the SBN ICARUS detector, White Rabbit can also be used to get an extremely accurate tagging of unwanted cosmic particles that come from space and get in the way of the experiment, potentially hiding the neutrino signatures.
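
As a rough illustration of what such tagging looks like (the spill times, window length and hit times below are invented for the example; this is not ICARUS’s actual software), a nanosecond-accurate timestamp lets each detector signal be sorted immediately by whether it arrived while the accelerator was delivering beam:

```python
# Illustrative sketch: classify detector hits as beam-related or cosmic by
# comparing their timestamps against known beam-spill windows.
# All times are hypothetical and given in nanoseconds.

SPILL_STARTS_NS = [1_000_000, 3_000_000, 5_000_000]  # hypothetical spill start times
SPILL_LENGTH_NS = 10_000                             # hypothetical spill duration

def in_beam_window(hit_time_ns):
    """True if a hit's timestamp falls inside any beam spill."""
    return any(start <= hit_time_ns <= start + SPILL_LENGTH_NS
               for start in SPILL_STARTS_NS)

for hit in [1_004_200, 2_500_000, 5_009_999]:
    label = "beam-window hit" if in_beam_window(hit) else "likely cosmic ray"
    print(f"hit at {hit} ns: {label}")
```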

“It would be possible to run ICARUS without White Rabbit, but it’s a lot easier if we use it,” said Fava. “And it’s all in real time too, so it saves on our computing power and storage.”

Pulses on the oscilloscope indicate when the particles arrive in the detector. Photo: Reidar Hahn

Open science

White Rabbit was first conceived around 2008 as an international collaboration between CERN, the GSI Helmholtz Centre for Heavy Ion Research in Germany, and other partners, and was introduced to boost the capabilities of the Large Hadron Collider.

From the beginning, the collaboration has made both the hardware and software for the timing system openly available to anyone around the world. The physical equipment can be purchased from commercial vendors, while the software is completely free and easily accessible online.

“Everybody benefits when science is open,” said Torretta, who learned about White Rabbit at a demonstration workshop at CERN. “As the technology develops, it’s becoming more and more popular.”

Torretta has since attended further workshops to learn more, including one recently in Barcelona, which was organized by White Rabbit experts from CERN.

The CERN development team also took care to make the design as general as possible, allowing a wide range of practical applications for the technology, including outside of science. A group in the Netherlands has even used White Rabbit to transmit official time between Dutch cities with nanosecond accuracy.

Burt Holzman is one of 14 Fermilab employees who visited local schools for Hour of Code in December. Holzman visited second- through fifth-grade classrooms at Wolf’s Crossing Elementary School in Aurora. Photo: Stephanie Comes

With information technology integrated into so many aspects of our lives, having some computer science skills will be essential for the future workforce. However, less than half of all schools teach computer science. Furthermore, while 71 percent of STEM jobs are currently in computer science, only 8 percent of STEM graduates study computer science. These gaps are concerning, but there are efforts under way to close them.

Hour of Code is a worldwide movement designed to demystify code, to show that anybody can learn the basics and to broaden participation in computer science and other technical fields. Over 400 partners and 200,000 educators globally currently support the movement.

As part of Computer Science Education Week on Dec. 4-10, Fermilab partnered with Argonne National Laboratory on an initiative to bring Hour of Code activities and coding role models to local schools. Fourteen employees from Fermilab, along with several from various Argonne organizations, visited area elementary, middle and high schools and spoke about their labs, their careers and coding in general. They also assisted students with coding exercises.

This is the second year Penelope Constanta, application developer and system analyst, participated in Hour of Code. Over the course of one day, Constanta visited 13 classrooms at Oswego’s Fox Chase Elementary School, from first grade to fifth grade, with varying levels of coding experience. She explained to the children who had never coded what a computer program is, and she helped those who had with the coding tasks they were asked to do.

“I did love the kids’ reactions when I asked them to ‘program me’ to move to a location in the classroom or do some simple task,” Constanta said. “At all levels, they were able to grasp the simple concepts that I introduced.” For all but two of the classes she attended, this was the first time these kids had heard about programming.

Scientist Adam Lyon spent a morning at Thomas Jefferson Junior High School in Woodridge.

“It was a lot of fun, and the teachers and kids were great,” he said. “The Minecraft and Flappy Bird projects were by far the most popular, though several kids told me that Minecraft is so 2015.”

Application developer and system analyst Kris Brandt visited both introductory and AP computer science classes at St. Charles East High School, where the students were already coding, so she didn’t need to introduce the concept.

“Instead, my talk focused on scientific versus core computing at Fermilab and computing careers in general,” Brandt said. “They were also curious and impressed by some of the stats from ‘Computing by the Numbers,’ especially the amount of data we manage and the cybersecurity stats. Quantum computing was also a hot topic. They were a very curious and smart group of kids, which made the experience fun.”

Fermilab Chief Information Officer Rob Roser highlighted Fermilab Computing’s positive impact in the community.

“I am very proud of the enthusiastic participation from all branches of Computing in this important outreach event,” Roser said.  “Reaching out through the schools and showing these kids that coding is both fun and accessible to them and that the end result can change the world is very powerful.”

The Fermilab participants and the schools they visited were:

Kris Brandt: St. Charles East High School
Penelope Constanta: Fox Chase Elementary School, Oswego
Lynn Garren: Churchill Elementary School, Oswego
Ken Herner: Lemont High School
Burt Holzman: Wolf’s Crossing Elementary, Aurora
Tanya Levshina: Morrill Math & Science Elementary School, Chicago
Adam Lyon: Thomas Jefferson Junior High School
Marco Mambelli: UIC College Prep, Chicago
Craig Mohler: Timber Ridge Middle School, Plainfield
Keenan Newton: Thornton Fractional South High School, Lansing
Irene Shiu: Boulder Hill Elementary, Oswego
Margaret Votava: Bower Elementary, Warrenville
Tammy Whited: Grace McWayne Elementary, Batavia
Michael Zalokar: Rotolo Middle School, Batavia

Marcia Teckenbrock is the communications manager in the Office of the Chief Information Officer.

Dark Energy Survey publicly releases first three years of data

Also announces discovery of eleven stellar streams, evidence of small galaxies being eaten by the Milky Way

This image shows the full area of sky mapped by the Dark Energy Survey and the 11 newly discovered stellar streams. Four of the streams in this diagram — ATLAS, Molonglo, Phoenix and Tucana III – were previously known. The others were discovered using the Dark Energy Camera, one of the most powerful astronomical cameras on Earth. Image: Dark Energy Survey

At a special session held during the American Astronomical Society meeting in Washington, D.C., scientists on the Dark Energy Survey (DES) announced today the public release of their first three years of data. This first major release of data from the survey includes information on about 400 million astronomical objects, including distant galaxies billions of light-years away as well as stars in our own galaxy.

DES scientists are using this data to learn more about dark energy, the mysterious force believed to be accelerating the expansion of the universe, and presented some of their preliminary cosmological findings in the special session. As part of that session, DES scientists also announced today the discovery of 11 new stellar streams, remnants of smaller galaxies torn apart and devoured by our Milky Way.

The public release of the first three years of DES data fulfills a commitment scientists on the survey made to share their findings with the astronomy community and the public. The data cover the full DES footprint – about 5,000 square degrees, or one eighth of the entire sky — and include roughly 40,000 exposures taken with the Dark Energy Camera. The images correspond to hundreds of terabytes of data and are being released along with catalogs of hundreds of millions of galaxies and stars.
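
The “one eighth of the entire sky” figure is easy to check: the whole sky covers about 41,253 square degrees, so a 5,000-square-degree footprint is roughly 12 percent of it.

```python
# Quick check of the "one eighth of the sky" figure.
import math

whole_sky_sq_deg = 4 * math.pi * (180 / math.pi) ** 2  # full sphere, about 41,253 square degrees
des_footprint_sq_deg = 5_000

fraction = des_footprint_sq_deg / whole_sky_sq_deg
print(f"{whole_sky_sq_deg:.0f} square degrees in total; DES covers {fraction:.1%}")  # about 12%, roughly 1/8
```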

The Dark Energy Camera is mounted on the Blanco telescope in Chile. Photo: Fermilab

“There are all kinds of discoveries waiting to be found in the data. While DES scientists are focused on using it to learn about dark energy, we wanted to enable astronomers to explore these images in new ways, to improve our understanding of the universe,” said Dark Energy Survey Data Management Project Scientist Brian Yanny of the U.S. Department of Energy’s Fermi National Accelerator Laboratory.

“The great thing about a big astronomical survey like this is that it also opens a door to many other studies, like the new stellar streams,” added Adam Bolton, associate director for the Community Science and Data Center at the National Optical Astronomy Observatory (NOAO). “With the DES data now available as a ‘digital sky,’ accessible to all, my hope is that these data will lead to the crowdsourcing of new and unexpected discoveries.”

The DES data can be accessed online.

The Cerro Tololo Inter-American Observatory in Chile houses the Dark Energy Camera. Photo: Fermilab

The Dark Energy Camera, the primary observation tool of the Dark Energy Survey, is one of the most powerful digital imaging devices in existence. It was built and tested at Fermilab, the lead laboratory on the Dark Energy Survey, and is mounted on the National Science Foundation’s 4-meter Blanco telescope, part of the Cerro Tololo Inter-American Observatory in Chile, a division of NOAO. The DES images are processed by a team at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign.

“We’re excited that this release of high-quality imaging data is now accessible to researchers around the world,” said Matias Carrasco Kind, DES release scientist at NCSA. “While DES was designed with the goal of understanding dark energy and dark matter, the huge amount of data in these images and catalogs will bring new scientific applications, challenges, and opportunities for discovery to astronomers and data scientists. In collaboration, NCSA, NOAO and the LIneA group in Brazil are providing the tools and resources to access and analyze this rich and robust data set.”

One new discovery enabled by the data set is the detection of 11 new streams of stars around our Milky Way. Our home galaxy is surrounded by a massive halo of dark matter, which exerts a powerful gravitational pull on smaller, nearby galaxies. The Milky Way grows by pulling in, ripping apart and absorbing these smaller systems. As stars are torn away, they form streams across the sky that can be detected using the Dark Energy Camera. Even so, stellar streams are extremely difficult to find since they are composed of relatively few stars spread out over a large area of sky.

“It’s exciting that we found so many stellar streams,” said astrophysicist Alex Drlica-Wagner of Fermilab. “We can use these streams to measure the amount, distribution and clumpiness of dark matter in the Milky Way. Studies of stellar streams will help constrain the fundamental properties of dark matter.”

Prior to the new discoveries by DES, only about two dozen stellar streams had been discovered. Many of them were found by the Sloan Digital Sky Survey, a precursor to the Dark Energy Survey. The effort to detect new stellar streams in the Dark Energy Survey was led by University of Chicago graduate student Nora Shipp.

“We’re interested in these streams because they teach us about the formation and structure of the Milky Way and its dark matter halo. Stellar streams give us a snapshot of a larger galaxy being built out of smaller ones,” Shipp said. “These discoveries are possible because DES is the widest, deepest and best-calibrated survey out there.”

This image shows a portion of the sky mapped by the Dark Energy Survey. Stellar streams (including ones previously found) can be seen as yellow, blue and red streaks. Image: Dark Energy Survey

Since there is no universally accepted naming convention for stellar streams, the Dark Energy Survey has reached out to schools in Chile and Australia, asking young students to select names. Students and their teachers have worked together to name the streams after aquatic words in native languages from northern Chile and aboriginal Australia. Read more about the names in Symmetry magazine.

Read the papers drawn from the first years of DES data online. An animation of several of the newly discovered streams can be seen on Fermilab’s website.

DES plans one more major public data release, after the survey is completed, which will include nearly twice as many exposures as in this release.

“This result is an excellent example of how ‘data mining’ — the exploration of large data sets — leads to new discoveries,” said Richard Green, director of the National Science Foundation’s (NSF) Division of Astronomical Sciences. “NSF is investing in this approach through our foundationwide ‘Harnessing the Data Revolution’ initiative, which is encouraging fundamental research in data science. We’re expecting a drumbeat of exciting discoveries, particularly when the Large Synoptic Survey Telescope data floodgates are opened!”

This work is supported in part by the U.S. Department of Energy Office of Science.

The Dark Energy Survey is a collaboration of more than 400 scientists from 26 institutions in seven countries. Funding for the DES Projects has been provided by the U.S. Department of Energy Office of Science, U.S. National Science Foundation, Ministry of Science and Education of Spain, Science and Technology Facilities Council of the United Kingdom, Higher Education Funding Council for England, ETH Zurich for Switzerland, National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, Kavli Institute of Cosmological Physics at the University of Chicago, Center for Cosmology and AstroParticle Physics at Ohio State University, Mitchell Institute for Fundamental Physics and Astronomy at Texas A&M University, Financiadora de Estudos e Projetos, Fundação Carlos Chagas Filho de Amparo à Pesquisa do Estado do Rio de Janeiro, Conselho Nacional de Desenvolvimento Científico e Tecnológico and Ministério da Ciência e Tecnologia, Deutsche Forschungsgemeinschaft, and the collaborating institutions in the Dark Energy Survey, the list of which can be found at www.darkenergysurvey.org/collaboration

Cerro Tololo Inter-American Observatory, National Optical Astronomy Observatory, is operated by the Association of Universities for Research in Astronomy (AURA) under a cooperative agreement with the National Science Foundation. NSF is an independent federal agency created by Congress in 1950 to promote the progress of science. NSF supports basic research and people to create knowledge that transforms the future.

NCSA at the University of Illinois at Urbana-Champaign provides supercomputing and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students and collaborators from around the globe use advanced digital resources to address research grand challenges for the benefit of science and society. NCSA has been advancing one third of the Fortune 50® for more than 30 years by bringing industry, researchers and students together to solve grand challenges at rapid speed and scale. For more information, please visit www.ncsa.illinois.edu.

Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance LLC, a joint partnership between the University of Chicago and the Universities Research Association Inc. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @Fermilab.

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.