A new telescope will take a sequence of snapshots with the world’s largest digital camera, covering the entire visible night sky every few days — and repeating the process for an entire decade. What’s the best way to rapidly and automatically identify and categorize all of the stars, galaxies and other objects captured in these images? Data scientists have trained computers to pick out useful information from these high-resolution snapshots of the universe.
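To give a flavor of what “training computers to pick out useful information” can mean in practice, here is a minimal, purely illustrative sketch of supervised classification on catalog-style features. It is not the survey’s actual pipeline: the feature values and labels below are synthetic placeholders, and the model choice (a random forest from scikit-learn) is an assumption made only for the example.

```python
# Illustrative sketch only: a toy "star vs. galaxy" classifier trained on
# made-up catalog features. Real survey pipelines use far richer inputs
# (images, shapes, colors) and models, but the train/predict pattern is similar.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)

# Hypothetical catalog: each row is one detected object with three
# stand-in measurements (e.g., brightness, color, light-profile shape).
n_objects = 5000
features = rng.normal(size=(n_objects, 3))
labels = (features[:, 2] + 0.3 * rng.normal(size=n_objects) > 0).astype(int)
# label 0 = "star-like", 1 = "galaxy-like" in this toy setup

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0
)

# Fit a simple ensemble classifier and report how well it separates the classes.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test),
                            target_names=["star-like", "galaxy-like"]))
```

The same fit-then-predict loop scales up conceptually to millions of objects per night; the hard parts in production are feature extraction from the raw images and keeping the classification fast enough to run in near real time.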
Fermilab’s quantum program includes a number of leading-edge research initiatives that build on the lab’s unique capabilities as the U.S. center for high-energy physics and a leader in quantum physics research. On the tour, researchers discussed quantum technologies for communication, high-energy physics experiments, algorithms and theory, and superconducting qubits hosted in superconducting radio-frequency cavities.
From insideHPC, Jan. 9, 2019: After scanning in depth about a quarter of the southern skies for six years and cataloging hundreds of millions of distant galaxies, the Dark Energy Survey finishes taking data on Jan. 9. The National Center for Supercomputing Applications at the University of Illinois will continue refining and serving this data for use by scientists into 2021.
From HostingAdvice.com, Dec. 14, 2018: Fermilab scientist Marc Paterno is quoted in this article on how Fermilab is raising the bar with innovative and cost-effective computing solutions that help researchers explore high-energy physics. As a repository for massive sets of scientific data, Fermilab is at the forefront of new computing approaches, including HEPCloud, a new paradigm for provisioning computing resources.
From 9to5Google, Nov. 15, 2018: The LHC’s massive physics experiments will require computing capacity an estimated 50 to 100 times higher than today’s. Google finds the challenge exciting and has already been working with Fermilab and Brookhaven National Laboratory to store and analyze data from the LHC using Google Compute Engine.
From the Pittsburgh Supercomputing Center, Oct. 10, 2018: Fermilab’s Dirk Hufnagel is quoted in this piece on the Pittsburgh Supercomputing Center now supplying computation for the LHC. Fermilab scientists working on the CMS experiment, in collaboration with the Open Science Grid, have begun analyzing LHC data using PSC’s Bridges supercomputer.