Last month, a team of researchers put what was then the world’s fastest supercomputer to work on a pretty big problem: the nature of atomic and dark matter in the universe.
The supercomputer is called Frontier, and the team used it to run the largest astrophysical simulation of the universe to date, at a scale that was not possible until now. The calculations behind the simulation provide a new basis for cosmological simulations of the universe’s matter content, from the matter we can see to invisible matter that interacts with ordinary matter only through gravity.
Frontier is an exascale supercomputer capable of one quintillion (one billion billion) calculations per second. In other words, a machine worthy of a massive undertaking: simulating the physics and evolution of both the known and unknown universe.
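For a sense of that scale, here is a rough back-of-the-envelope comparison. The figures below are illustrative assumptions, not benchmarks: one quintillion (10^18) operations per second for Frontier, and an assumed one billion (10^9) operations per second sustained on a single laptop core.

```python
# Back-of-the-envelope scale comparison (illustrative numbers, not benchmarks).
FRONTIER_OPS_PER_SEC = 1e18  # exascale: one quintillion calculations per second
LAPTOP_OPS_PER_SEC = 1e9     # assumed ~1 billion ops/sec for a single laptop core

# How long would the laptop need to match one second of Frontier's work?
seconds = FRONTIER_OPS_PER_SEC / LAPTOP_OPS_PER_SEC  # 1e9 seconds
years = seconds / (60 * 60 * 24 * 365)
print(f"One second of Frontier is roughly {years:.0f} years on the laptop")
```

Under these assumptions, a single second of Frontier's output would take the laptop on the order of three decades to reproduce.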
“If we want to know what’s going on in the universe, we need to simulate these two things: gravity, as well as all the other physics, including hot gas, and the formation of stars, black holes and galaxies,” said Salman Habib, director of the Computational Sciences division at Argonne National Laboratory, in an Oak Ridge National Laboratory release. “An astrophysical ‘kitchen sink’ so to speak.”
The matter we know—the stuff we can see, from black holes to molecular clouds to planets and moons—makes up just 5% of the universe’s contents, according to CERN. A larger share of the universe is thought to interact with visible (or atomic) matter only through gravity. That invisible component is called dark matter, a catch-all term for a number of candidate particles and objects thought to account for 27% of the universe. The remaining 68% of the universe’s composition is attributed to dark energy, which is responsible for the accelerating rate of the universe’s expansion.
“If we were to simulate a large part of the universe surveyed by one of the big telescopes, like the Rubin Observatory in Chile, you’re talking about huge stretches of time—billions of years,” Habib said. “Until recently, a simulation like that wasn’t possible except in the gravity-only approximation.”
In the figure above, the left panel shows the evolution of the expanding universe over billions of years in a region containing a cluster of galaxies, and the right panel shows the formation and motion of galaxies over time in one section of that region.
“It’s not just the sheer size of the physical domain, which is necessary for direct comparison with state-of-the-art survey observations made possible by exascale computing,” said Bronson Messer, director of science at the Oak Ridge Leadership Computing Facility, in the release. “It’s also the added physical realism of including baryons and all the other dynamical physics that makes this simulation a real tour de force for Frontier.”
Frontier is one of several exascale supercomputers used by the Department of Energy; it includes more than 9,400 CPUs and more than 37,000 GPUs. It is located at Oak Ridge National Laboratory, although the latest simulations were carried out by researchers at Argonne.
Frontier’s results were made possible by the supercomputer’s code, the Hardware/Hybrid Accelerated Cosmology Code (or HACC), a fifteen-year-old code that was updated as part of the Exascale Computing Project, a $1.8 billion, eight-year DOE effort that recently concluded.
The simulation results were announced last month, when Frontier was still the fastest supercomputer in the world. However, Frontier was soon eclipsed by El Capitan as the world’s fastest. According to a Lawrence Livermore National Laboratory release, El Capitan is clocked at 1.742 quintillion calculations per second, with a total peak performance of 2.79 quintillion calculations per second.