The Grid Computing Center houses small files for a number of experiments at Fermilab. Photo: Reidar Hahn

Experts in Fermilab's Scientific Computing Division (SCD) have developed a more efficient method of storing data that better serves the needs of Intensity Frontier experiments. Last year, SCD developers created a method to efficiently store and access small data files. The method is useful for several neutrino experiments, which typically store many small, sparse files ranging from hundreds of kilobytes to hundreds…
Every day researchers add another sea of data to an ocean of knowledge about the world around us — billions upon billions of measurements, images and observations of everything from the tiniest subatomic particles to the movement of planets and stars.
The world’s largest computing grid has passed its most comprehensive tests to date in anticipation of the restart of the world’s most powerful particle accelerator, the Large Hadron Collider.
The world’s largest computing grid is ready to tackle mankind’s biggest data challenge from the Earth’s most powerful accelerator.
The six experiments at the Large Hadron Collider will produce 15 million gigabytes (15 petabytes) of data every year, enough information to create a 13-mile-high stack of CDs.
High-Performance Computing and Communications Organizations Pool Capabilities to Support Vast Bandwidth Needs for Particle Physics and Other Applications