From Inside HPC, Sept. 15, 2019: Argonne and the National Center for Supercomputing Applications use deep learning to analyze Dark Energy Survey data.
From MIT News, Aug. 19, 2019: A new prototype machine-learning technology co-developed by Fermilab and MIT scientists speeds Large Hadron Collider data processing by up to 175 times over traditional methods.
A new machine learning technology tested by Fermilab scientists and collaborators can spot specific particle signatures among an ocean of LHC data in the blink of an eye, much faster than standard methods. Sophisticated and swift, its performance gives a glimpse into the game-changing role machine learning will play in making future discoveries in particle physics as data sets get bigger and more complex.
From Inside HPC, July 3, 2019: Particle physics researchers are using reprogrammable chips called field-programmable gate arrays (FPGAs) in combination with other computing resources to process massive quantities of data at extremely fast rates to find clues to the origins of the universe. This requires filtering sensor data in real time to identify novel particle substructures that could contain evidence of the existence of dark matter and other physical phenomena. A growing team of physicists and engineers from Fermilab, CERN and other institutions, co-led by Fermilab scientist Nhan Tran, wanted a flexible way to optimize custom event filters in the CMS detector they are working on at CERN.
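To give a flavor of what real-time event filtering means, here is a minimal software sketch of a trigger-style filter. The event structure, thresholds, and function names are illustrative assumptions for this sketch only, not the actual CMS trigger logic, which runs as firmware on FPGAs.

```python
# Toy sketch of real-time event filtering, the task the FPGA-based
# triggers described above perform in hardware. All names and
# thresholds are invented for illustration.

def passes_trigger(event, energy_threshold=50.0, min_hits=3):
    """Keep an event only if its summed sensor energy exceeds a threshold
    and enough detector channels fired -- a crude stand-in for the
    substructure-based filters described above."""
    total_energy = sum(hit["energy"] for hit in event["hits"])
    return total_energy > energy_threshold and len(event["hits"]) >= min_hits

def filter_events(events):
    """Stream events through the trigger, keeping only candidates."""
    return [e for e in events if passes_trigger(e)]

# Example: of two simulated events, only the energetic multi-hit one survives.
events = [
    {"id": 1, "hits": [{"energy": 30.0}, {"energy": 25.0}, {"energy": 10.0}]},
    {"id": 2, "hits": [{"energy": 5.0}]},
]
kept = filter_events(events)  # keeps only event 1
```

The real systems make this decision in microseconds per event, which is why the work is pushed into FPGA firmware rather than software like the above.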
Particle accelerators are some of the most complicated machines in science. In today’s increasingly autonomous era of self-driving cars and vacuuming robots, efforts to automate different aspects of accelerator operation are going strong, and the next generation of particle accelerators promises to be more automated than ever. Scientists are working on ways to run them with less and less direction from humans.
Machine learning is revolutionizing data analysis. Recent leaps in driverless car navigation and the voice recognition features of personal assistants are possible because of this form of artificial intelligence. As data sets in the Information Age continue to grow, companies are building tools that make machine learning faster and more efficient. Fermilab is taking cues from industry to tackle its own “big data” processing challenges.
A new telescope will take a sequence of snapshots with the world’s largest digital camera, covering the entire visible night sky every few days — and repeating the process for an entire decade. What’s the best way to rapidly and automatically identify and categorize all of the stars, galaxies and other objects captured in these images? Data scientists have trained computers to pick out useful information from these high-resolution snapshots of the universe.
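The classification task described above can be illustrated with a deliberately tiny sketch: a nearest-centroid classifier separating "star" from "galaxy" using two made-up image features. The features, values, and labels here are invented for illustration; production survey pipelines use far richer features and deep learning models.

```python
import math

# Illustrative sketch of automated sky-object classification: a tiny
# nearest-centroid classifier using two hypothetical image features
# (light concentration, angular extent). All numbers are invented.

TRAINING = {
    "star":   [(0.90, 1.0), (0.95, 1.2), (0.85, 0.9)],  # compact, concentrated
    "galaxy": [(0.30, 5.0), (0.40, 6.5), (0.25, 4.0)],  # diffuse, extended
}

def centroid(points):
    """Mean position of a set of 2-D feature vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(features):
    """Assign the label whose class centroid is nearest in feature space."""
    return min(CENTROIDS, key=lambda lbl: math.dist(features, CENTROIDS[lbl]))
```

A compact, concentrated detection such as `classify((0.88, 1.1))` lands near the star centroid, while a diffuse, extended one such as `classify((0.35, 5.5))` lands near the galaxy centroid; the real challenge is doing this reliably for billions of objects per night.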
Fermilab’s quantum program includes a number of leading-edge research initiatives that build on the lab’s unique capabilities as the U.S. center for high-energy physics and a leader in quantum physics research. On the tour, researchers discussed quantum technologies for communication, high-energy physics experiments, algorithms and theory, and superconducting qubits hosted in superconducting radio-frequency cavities.