Computing

[Illustration: four scientists in white lab coats, two typing and two drawing on a screen with equations and 3D images.]

Over time, particle physics, astrophysics, and computing have built on one another's successes. That coevolution continues today: new physics experiments drive computing innovation, from cluster computing for the Tevatron to, more recently, machine learning and quantum problem-solving.

The U.S. Department of Energy recently announced $13.7 million in funding for nine research projects that will advance the state of the art in computer science and applied mathematics. One of the recipients, Fermilab scientist Nhan Tran, will lead a project to explore methods for programming custom hardware accelerators for streaming compression.
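The project itself targets custom hardware accelerators, and its methods are not described here. Purely to illustrate what streaming compression means in software, the following is a minimal sketch using Python's standard-library zlib in incremental mode: data is compressed chunk by chunk, so the full stream never has to be held in memory. The function name and the simulated readout blocks are hypothetical, chosen only for the example.

```python
import zlib


def compress_stream(chunks):
    """Compress an iterable of byte chunks incrementally, yielding compressed pieces.

    This is the essence of streaming compression: each chunk is consumed as it
    arrives, and compressed output is emitted as soon as it is available.
    """
    compressor = zlib.compressobj()
    for chunk in chunks:
        out = compressor.compress(chunk)
        if out:
            yield out
    # Flush whatever remains in the compressor's internal buffer.
    yield compressor.flush()


# Example: a generator standing in for a stream of detector readout blocks.
blocks = (bytes([i % 7] * 4096) for i in range(256))
compressed = b"".join(compress_stream(blocks))
print(f"compressed {256 * 4096} bytes down to {len(compressed)}")
```

The same pattern, applied at much higher throughput and implemented directly in hardware rather than a CPU library, is what makes compression feasible on live data streams that are too large to buffer in full.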

From the DOE Office of Science, Aug. 25: The DOE Office of Science announced that 10 DOE national laboratory projects, including one led by Fermilab, have been selected to receive funding for research related to microelectronics co-design. Senior engineer Davide Braga's work on hybrid cryogenic detector architectures for sensing and edge computing, enabled by new fabrication processes, was chosen as one of the 10 awards.

From Silicon Republic (Ireland), July 7, 2021: Sinéad Ryan, a professor of theoretical high-energy physics at Trinity College Dublin, describes her postdoctoral research on lattice QCD at Fermilab, the next generation of exascale computing, and the structural barriers and lack of diversity in the physics community.

From UKRI, Feb. 22, 2021: UKRI scientists are developing vital software to exploit the large data sets collected by next-generation high-energy physics experiments. The new software will be able to crunch the masses of data that the LHC at CERN and next-generation neutrino experiments, such as the Fermilab-hosted Deep Underground Neutrino Experiment, will produce this decade.

The prodigious amount of data produced at the Large Hadron Collider presents a major challenge for data analysis. Coffea, a Python package developed by Fermilab researchers, speeds up computation and helps scientists work more efficiently. Around a dozen international LHC research groups now use Coffea, which draws on big data techniques used outside physics.
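Coffea's columnar style can be seen in a short example. The sketch below is illustrative only: it assumes a NanoAOD-format ROOT file, and the file name and the dimuon selection are placeholders, not taken from any Fermilab analysis. Instead of looping over events one at a time, it applies operations to whole columns of data at once.

```python
import awkward as ak
from coffea.nanoevents import NanoEventsFactory, NanoAODSchema

# Lazily open a NanoAOD file as columnar arrays (file name is a placeholder).
events = NanoEventsFactory.from_root(
    "nano_dimuon.root",
    schemaclass=NanoAODSchema,
).events()

# Keep events with exactly two oppositely charged muons.
muons = events.Muon
pair_mask = (ak.num(muons) == 2) & (ak.sum(muons.charge, axis=1) == 0)
dimuons = muons[pair_mask]

# Muons behave as Lorentz vectors, so adding a pair gives the combined
# four-vector; its .mass is the dimuon invariant mass, computed for all
# selected events in one vectorized operation.
mass = (dimuons[:, 0] + dimuons[:, 1]).mass
print(ak.to_list(mass[:5]))
```

Working column by column rather than event by event is what lets Coffea reuse the same big data machinery, such as Awkward Array's vectorized operations, that powers analytics outside physics.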

From Data Center Knowledge, Feb. 3, 2021: Fermilab and its partners' achievement of sustained, high-fidelity quantum teleportation has big implications in many fields. Fermilab scientist Panagiotis Spentzouris talks about what the results could mean for the future of data centers.

From DOE, Dec. 9, 2020: Computer Science Education Week aims to inspire students to discover computer science activities and careers. The national laboratories, including Fermilab, are scheduled to host a number of activities highlighting the Department of Energy's efforts, including increasing access to computer science education, building computational literacy, and growing the cyber workforce of the future.