Rise of the machines
Machine learning will become an even more important tool when scientists upgrade to the High-Luminosity Large Hadron Collider.
In a collaborative, cross-laboratory effort, Fermilab’s HEPCloud program enabled NOvA to perform the largest antineutrino data analysis ever in record time.
Oak Ridge and Fermilab developers are changing the way researchers can transport and analyze data, using a new methodology that allows for compressing and streaming of data coming out of simulations in real time.
Particle physicists are studying ways to harness the power of the quantum realm to further their research.
Scientists are using cutting-edge machine-learning techniques to analyze physics data.
Particle detectors won’t be late, be late for a very important date.
As part of Computer Science Education Week on Dec. 4-10, Fermilab partnered with Argonne National Laboratory on an initiative to bring Hour of Code activities and coding role models to local schools.
From CERN Openlab, Nov. 22, 2017: Physics data reduction helps ensure researchers gain valuable insights from the vast amounts of particle collision data produced by CMS. Fermilab scientist Oliver Gutsche and colleagues will investigate techniques based on Apache Spark, a popular open-source software platform.
From ASCR Discovery, October 2017: The cosmological search in the dark is no walk in the park. With help from Berkeley Lab, Fermilab aims open-source software at data from high-energy physics. Fermilab’s Oliver Gutsche, Jim Kowalkowski and Saba Sehrish talk about Spark.