Rob Roser, head of the Scientific Computing Division, wrote this column.
We are constantly reminded what a precious commodity data is. Particle physicists have an insatiable thirst for it. It's the currency in which discoveries are made. When we are hunting for rare processes, more data improves our odds of seeing them. And in our current era of highly constrained budgets, we need to make the most of the data that we have.
Computing has always been important in particle physics, but now there is a premium on getting the most out of every collision. Scientists have developed many advanced analysis techniques, such as neural networks and boosted decision trees, to extract the maximum information from every event.
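To give a flavor of one of these techniques: a boosted decision tree combines many weak "stump" classifiers, re-weighting the events each one misclassifies so the next stump concentrates on the hard cases. The following is a minimal, self-contained AdaBoost sketch on invented toy data; the numbers are purely illustrative and not taken from any experiment, and real analyses use far richer inputs and mature toolkits rather than hand-rolled code like this.

```python
import math

# Toy 1-D dataset: one feature per event, with label +1 (signal) or -1
# (background). Invented for illustration only.
X = [0.1, 0.4, 0.5, 0.6, 0.9, 1.2, 1.5, 1.8]
y = [-1, -1, -1, 1, 1, 1, 1, 1]

def stump_predict(threshold, x):
    """Weak learner: classify by comparing one feature to a threshold."""
    return 1 if x > threshold else -1

def best_stump(weights):
    """Pick the threshold with the lowest weighted misclassification."""
    best_t, best_err = None, float("inf")
    for t in sorted(set(X)):
        err = sum(w for w, x, lab in zip(weights, X, y)
                  if stump_predict(t, x) != lab)
        if err < best_err:
            best_t, best_err = t, err
    return best_t, best_err

def adaboost(rounds=5):
    n = len(X)
    weights = [1.0 / n] * n
    ensemble = []  # list of (vote weight alpha, threshold)
    for _ in range(rounds):
        t, err = best_stump(weights)
        err = max(err, 1e-10)  # guard against a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t))
        # Boost the weight of events this stump got wrong.
        weights = [w * math.exp(-alpha * lab * stump_predict(t, x))
                   for w, x, lab in zip(weights, X, y)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def classify(ensemble, x):
    """Weighted vote of all stumps in the ensemble."""
    score = sum(alpha * stump_predict(t, x) for alpha, t in ensemble)
    return 1 if score >= 0 else -1

model = adaboost()
predictions = [classify(model, x) for x in X]
```

On this separable toy sample the ensemble classifies every training event correctly; the interesting behavior of boosting shows up on noisy, overlapping distributions like real collision data.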
It is with that background that the newly formed Scientific Computing Division must perform. Our mission is twofold: to be stewards of state-of-the-art computing techniques for the field and to use computing tools and techniques to maximize our scientific output. Scientists, engineers and computing professionals are working on a variety of issues, including improved physics simulations, accelerator modeling, lattice QCD calculations, the design and construction of DAQ systems, and the operation of experiments.
The members of the Scientific Computing Division are committed to properly completing the Tevatron program and to supporting the ongoing programs in the three frontiers. As Fermilab transitions from an era dominated by its collider experiments to one with many types of experiments and projects, both on and off site, we need to create tools flexible enough to meet the diverse needs of these efforts.
I invite people to tell me which aspects of scientific computing they feel should be emphasized in the coming years. As our experimental program diversifies, the need for strong scientific computing only becomes more acute.