From Gizmodo, May 5, 2020: Fermilab scientist Brian Nord weighs in on how automated devices, such as an autonomously operating telescope free from human biases and complications, could help answer questions about dark matter and dark energy.
Join us for the inaugural Computational Science Seminar, sponsored jointly by the University of Chicago, Argonne National Laboratory and Fermilab. Speaker: Nhan Tran (FNAL). Tuesday, Nov. 12, 2019, from 10:30 to 11:30 a.m. (US/Central). Attend in person at the University of Chicago, John Crerar Library, Kathleen A. Zar Room (first floor); watch the simulcast at Fermilab in the Racetrack (WH7X); or watch remotely via Zoom at https://fnal.zoom.us/j/720265198. Details: https://indico.fnal.gov/event/22307/ Abstract: In the first of the joint seminar series to build connections between the University of Chicago, Argonne, and Fermilab, we will highlight current activities in Artificial Intelligence (AI) at Fermilab. Machine…
From Inside HPC, Sept. 15, 2019: Argonne and the National Center for Supercomputing Applications use deep learning to analyze Dark Energy Survey data.
From MIT News, Aug. 19, 2019: A new prototype machine-learning technology co-developed by Fermilab and MIT scientists speeds Large Hadron Collider data processing by up to 175 times over traditional methods.
A new machine learning technology tested by Fermilab scientists and collaborators can spot specific particle signatures among an ocean of LHC data in the blink of an eye, much faster than standard methods. Sophisticated and swift, its performance gives a glimpse into the game-changing role machine learning will play in making future discoveries in particle physics as data sets get bigger and more complex.
From Inside HPC, July 3, 2019: Particle physics researchers are using reconfigurable chips called field-programmable gate arrays (FPGAs), in combination with other computing resources, to process massive quantities of data at extremely fast rates in the search for clues to the origins of the universe. This requires filtering sensor data in real time to identify novel particle substructures that could contain evidence of dark matter and other physical phenomena. A growing team of physicists and engineers from Fermilab, CERN and other institutions, co-led by Fermilab scientist Nhan Tran, wanted a flexible way to optimize custom event filters in the CMS detector they work on at CERN.
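The real-time filters described above act as triggers: they inspect each event as it streams off the detector and keep only those with interesting signatures. The toy Python sketch below illustrates that filtering idea at a conceptual level only; the event format, energy scale, threshold, and hit counts are invented for illustration and bear no relation to the actual CMS firmware or data.

```python
# Toy illustration of real-time event filtering (a "trigger"), loosely
# analogous in spirit to the FPGA-based filters used at the LHC.
# Hypothetical sketch: all values and formats here are invented.
import random

def passes_trigger(event, energy_cut=50.0, min_hits=3):
    """Keep an event only if it has enough high-energy detector hits."""
    high_energy_hits = [e for e in event["hits"] if e >= energy_cut]
    return len(high_energy_hits) >= min_hits

def filter_stream(events, **cuts):
    """Apply the trigger to a stream of events, keeping only those that fire it."""
    return [ev for ev in events if passes_trigger(ev, **cuts)]

random.seed(0)
# Simulated stream: each event is a list of hit energies (arbitrary units).
stream = [{"id": i, "hits": [random.expovariate(1 / 20) for _ in range(10)]}
          for i in range(1000)]

kept = filter_stream(stream)
print(f"kept {len(kept)} of {len(stream)} events")
```

In a real experiment this selection runs in firmware at the detector's data rate, which is why the team needed a flexible way to translate and tune such filters for FPGAs rather than hand-coding each one.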
Particle accelerators are some of the most complicated machines in science. In today's more autonomous era of self-driving cars and vacuuming robots, efforts are under way to automate many aspects of accelerator operation, and the next generation of particle accelerators promises to be more automated than ever. Scientists are working on ways to run them with progressively less human direction.
Machine learning is revolutionizing data analysis. Recent leaps in driverless car navigation and the voice recognition features of personal assistants are possible because of this form of artificial intelligence. As data sets in the Information Age continue to grow, companies are building tools that make machine learning faster and more efficient. Fermilab is taking cues from industry to tackle its own "big data" processing challenges.