Researchers at the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan have achieved a significant milestone in astrophysics. Collaborating with colleagues from the University of Tokyo and the Universitat de Barcelona, the team has conducted the world’s first simulation of the Milky Way that tracks more than 100 billion individual stars across 10,000 years of evolution. This groundbreaking simulation not only models 100 times more stars than previous efforts but also runs 100 times faster.
The simulation was realized through the innovative combination of 7 million CPU cores, advanced machine learning algorithms, and numerical simulations. This remarkable achievement provides astronomers with a powerful tool for studying stellar and galactic evolution on an unprecedented scale. The team’s findings were detailed in a paper titled “The First Star-by-star N-body/Hydrodynamics Simulation of Our Galaxy Coupling with a Surrogate Model,” published on March 15, 2025, in the *Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis* (SC ’25).
Advancing Galactic Simulations
Simulations that capture the dynamics of individual stars are vital for testing theories about galactic formation, structure, and evolution. Researchers have faced significant challenges in accurately representing the complex forces at play, including gravity, fluid dynamics, supernovae, element synthesis, and the influence of supermassive black holes (SMBHs). Historically, the limitations of available computing power restricted scientists to simulating galaxies with a mass limit of about one billion solar masses, which accounts for less than 1% of the stars in the Milky Way.
State-of-the-art supercomputers require over 315 hours to simulate just 1 million years of galactic evolution. Given that the Milky Way is estimated to be over 13.61 billion years old, a single such run covers roughly 0.007% of its total age. Extending these simulations to cover 1 billion years would take more than 36 years, which limits researchers to modeling only large-scale events. Additionally, simply adding more supercomputer cores does not solve the problem; in fact, performance can decline as more cores are deployed.
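The arithmetic behind these figures can be checked directly, taking the quoted numbers (315 hours per simulated million years, a 13.61-billion-year galactic age) as exact:

```python
# Back-of-envelope check of the quoted timing figures.

HOURS_PER_MYR = 315          # supercomputer time per 1 Myr of evolution
GALAXY_AGE_MYR = 13_610      # 13.61 billion years, in millions of years
HOURS_PER_YEAR = 24 * 365.25

# Fraction of the galaxy's age covered by a single 1 Myr run
fraction = 1 / GALAXY_AGE_MYR
print(f"1 Myr is {fraction:.5%} of the Milky Way's age")

# Wall-clock time to reach 1 Gyr (1000 Myr) at this rate
years_for_gyr = HOURS_PER_MYR * 1000 / HOURS_PER_YEAR
print(f"1 Gyr of evolution would take about {years_for_gyr:.1f} years")
```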
To overcome these hurdles, the team led by Hirashima implemented an AI-driven solution in the form of a machine learning surrogate model. This model, trained on high-resolution simulations of supernovae, allows researchers to predict the impact of these explosions on surrounding gas and dust up to 100,000 years post-explosion. By integrating this AI model with physical simulations, the team could simultaneously model the dynamics of a Milky Way-sized galaxy and small-scale stellar phenomena.
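The general pattern behind a surrogate model can be illustrated with a toy sketch: run an expensive sub-simulation offline to build a training set, fit a cheap learned predictor to its outputs, then call the predictor instead of the sub-simulation inside the main loop. Everything below is hypothetical and invented for illustration (the stand-in physics, the log-linear model, and all names); it is not the team’s actual model, which was trained on high-resolution supernova simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_supernova_sim(energy, density):
    """Stand-in for a costly high-resolution sub-simulation: returns a
    toy shell radius with a Sedov-like power-law scaling (hypothetical)."""
    return (energy / density) ** 0.2

# 1. Build a training set by running the expensive model offline.
energy = rng.uniform(0.5, 2.0, 200)
density = rng.uniform(0.1, 1.0, 200)
targets = expensive_supernova_sim(energy, density)

# 2. Fit a simple surrogate: linear regression in log-space, where the
#    toy scaling law is exactly linear (log r = 0.2 log E - 0.2 log n).
X = np.column_stack([np.log(energy), np.log(density), np.ones_like(energy)])
coef, *_ = np.linalg.lstsq(X, np.log(targets), rcond=None)

def surrogate(energy, density):
    """Cheap replacement, callable from inside the global simulation loop."""
    return np.exp(coef[0] * np.log(energy) + coef[1] * np.log(density) + coef[2])

# 3. Query the surrogate instead of re-running the expensive model.
print(surrogate(1.0, 0.5))              # close to expensive_supernova_sim(1.0, 0.5)
```

In the real system the surrogate is a neural network predicting the effect of a supernova on surrounding gas up to 100,000 years after the explosion, but the division of labor is the same: the expensive small-scale physics is learned once, then queried cheaply during the galaxy-scale run.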
Performance and Implications
The performance of this new simulation method was validated through extensive tests on the Fugaku and Miyabi supercomputer systems at RIKEN and the University of Tokyo. Results indicated that the new approach can resolve individual stars in a galaxy of more than 100 billion stars and complete 1 million years of evolution in just 2.78 hours. At this unprecedented speed, simulating 1 billion years of galactic history could be accomplished in a mere 115 days.
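The quoted throughput works out as follows, taking the 2.78-hour and 315-hour figures as exact:

```python
# Check of the reported throughput of the new method.

HOURS_PER_DAY = 24

days_for_gyr = 2.78 * 1000 / HOURS_PER_DAY   # 1 Gyr = 1000 Myr
print(f"1 Gyr of evolution: about {days_for_gyr:.0f} days")

speedup = 315 / 2.78                          # vs. the previous 315 h/Myr rate
print(f"throughput gain over the previous rate: ~{speedup:.0f}x")
```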
These advancements not only offer astronomers a crucial resource for testing theories of galactic evolution but also showcase the potential of integrating surrogate AI models into advanced simulations. This methodology can significantly reduce both the time required and the energy consumption for complex simulations. Beyond the realm of astrophysics, such AI shortcuts could facilitate other intricate simulations that involve both large and small-scale factors, including meteorology, ocean dynamics, and climate science.
The findings from Hirashima and his colleagues mark a transformative step in our understanding of the universe, promising to enhance the study of galactic phenomena and, potentially, to revolutionize the field of computational astrophysics.
