Building on Fermilab Today's University Profiles, the Computing Sector followed up with several 2012 participants to inquire about the roles their computing departments play in particle physics research programs. We questioned randomly selected universities for two articles, and this feature focuses on the seven responses to our second set of questions.
The first article concentrated on the value of collaboration, as demonstrated by the software development and local computing resources that different universities' computing departments contribute to high-energy physics research groups. Unsurprisingly, collaboration remains a theme here: Six of the universities are currently Open Science Grid members; six operate CMS or ATLAS Tier-2 or Tier-3 centers, with the seventh almost finished installing a Tier-3 center; and all have contributed software to experiments. Effective collaboration also encompasses research to improve resources. Such research can be incremental, enhancing precision or efficiency, or revolutionary, with novel approaches stemming from R&D efforts that create new opportunities for experiments and analyses. The following are selected examples of R&D work from the universities' responses.
Several universities are investigating future processing, infrastructure and storage requirements. Professor Markus Wobisch referred to Louisiana Tech University's interest in “multicore particle physics computing and high-availability applications.” Research scientist Shawn McKee said that the University of Michigan is researching “next-generation infrastructures, including software-defined networking, new file systems, and tools and techniques for agilely provisioning, configuring and maintaining their infrastructure and virtualization capabilities.” The University of Notre Dame's Professor Michael Hildreth is lead principal investigator on a project looking into “data preservation issues for the future.” He is also working with others, including Professor Kevin Lannon, to “develop techniques for opportunistic computing,” an idea sketched below.
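To give a flavor of what opportunistic computing involves: jobs run on borrowed, non-dedicated resources and must expect to be evicted whenever the resource owner reclaims them, so they checkpoint frequently and resume where they left off. The minimal Python sketch below illustrates that general pattern only; it is not the Notre Dame group's actual tooling, and the checkpoint file name and per-event work function are hypothetical.

```python
# A minimal sketch of a preemption-tolerant worker for opportunistic
# resources. Assumptions: the batch system sends SIGTERM before evicting
# the job, and per-event work can be replayed from an event index.
# CHECKPOINT and process_event() are hypothetical stand-ins.
import json
import os
import signal
import sys

CHECKPOINT = "progress.json"  # hypothetical checkpoint file

def process_event(event):
    """Stand-in for the real per-event computation."""
    pass

def load_checkpoint():
    """Return the first unprocessed event index, or 0 on a fresh start."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["next_event"]
    return 0

def save_checkpoint(next_event):
    """Record progress atomically so eviction mid-write cannot corrupt it."""
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "w") as f:
        json.dump({"next_event": next_event}, f)
    os.replace(tmp, CHECKPOINT)

def on_eviction(signum, frame):
    # Exit cleanly; the next opportunistic slot resumes from the checkpoint.
    sys.exit(0)

signal.signal(signal.SIGTERM, on_eviction)

start = load_checkpoint()
for event in range(start, 1_000_000):
    process_event(event)
    if event % 1000 == 0:
        save_checkpoint(event + 1)
save_checkpoint(1_000_000)
```

At worst, an eviction costs the work done since the last checkpoint, which is the trade that makes otherwise idle cycles usable.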
Others focused more specifically on grid infrastructure. Professor Brad Abbott emphasized the University of Oklahoma's early involvement in grid computing R&D: the university had “one of the first US ATLAS grid computing test-bed setups” and was the first site to adopt an existing high-performance computing cluster as part of ATLAS. Professors Ian Shipsey and Norbert Neumeister described Purdue University's membership in the ExTENCI project, which provides an interface between the Open Science Grid and XSEDE “to bridge the efforts of these two cyberinfrastructure projects.” Professor George Alverson said that Northeastern University personnel are currently involved as grid users and testers, and that the institution hopes to begin grid integration work soon.
Finally, Professor Sung-Won Lee and postdoctoral research fellow Chris Cowden said that a group at Texas Tech University is studying further applications of its FFTJet algorithm, “which applies image processing techniques to jet finding in high-energy physics experiments,” as well as “developing an application of the Geant4 simulation toolkit to study the CMS Phase II detector upgrade designs.”
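For readers curious what “image processing techniques” for jet finding might look like, the basic picture is to treat the calorimeter's eta-phi energy deposits as a two-dimensional image, smooth that image with a kernel via FFT convolution, and take local maxima of the smoothed image as jet candidates. The Python sketch below illustrates only this general concept; it is not the FFTJet library or its API, and the function name, grid layout, and parameters are hypothetical.

```python
# Conceptual sketch of FFT-based jet finding: smooth the energy "image"
# by multiplying its spectrum with a Gaussian kernel, then pick peaks.
# This is an illustration of the idea, not the FFTJet library's API.
import numpy as np
from scipy.ndimage import maximum_filter

def find_jet_candidates(energy_grid, sigma=2.0, threshold=1.0):
    """Return (eta_bin, phi_bin) indices of peaks in the smoothed grid.

    energy_grid : 2-D array of transverse energy in (eta, phi) bins.
    sigma       : width of the Gaussian smoothing kernel, in bins.
    threshold   : minimum smoothed energy for a peak to count as a seed.
    """
    n_eta, n_phi = energy_grid.shape

    # Build a Gaussian kernel directly in frequency space. FFT convolution
    # wraps around, which is actually natural in phi, since phi is periodic.
    f_eta = np.fft.fftfreq(n_eta)
    f_phi = np.fft.fftfreq(n_phi)
    fx, fy = np.meshgrid(f_eta, f_phi, indexing="ij")
    kernel_ft = np.exp(-2.0 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))

    # Convolve via FFT: multiply spectra, then transform back.
    smoothed = np.real(np.fft.ifft2(np.fft.fft2(energy_grid) * kernel_ft))

    # A bin is a jet candidate if it is the maximum of its 3x3 neighborhood
    # and exceeds the threshold.
    peaks = (smoothed == maximum_filter(smoothed, size=3)) & (smoothed > threshold)
    return np.argwhere(peaks)
```

The actual FFTJet algorithm is considerably more sophisticated than this, but the core idea, jets as peaks in a smoothed energy image, is what the sketch shows.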
Whether fine-tuning or paradigm-shifting, these projects represent advances in computing capabilities and applications, the benefits of which are felt across the field of high-energy physics.
—Clementine Jones, Computing Sector Communications