Home to over 100 billion stars, including our sun, the vast Milky Way is understandably difficult to simulate with computer models. The most sophisticated models available can only simulate the mass of about 1 billion suns, which means the smallest unit of resolution in these models still represents an average of around 100 stars, making for a relatively blurry picture. Now, in a major milestone for AI-assisted computer modeling, researchers have created a 100-billion-star simulation of the Milky Way that’s capable of tracking individual stars.
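
As a rough sanity check on those numbers, here is a minimal Python sketch using only the figures quoted above (the values are illustrative, taken straight from the article):

```python
# Back-of-the-envelope check on the resolution gap described above,
# using only the figures quoted in the article (illustrative numbers).
stars_in_milky_way = 100e9   # roughly 100 billion stars
resolved_mass_units = 1e9    # previous best models: about 1 billion suns' worth of elements
stars_per_element = stars_in_milky_way / resolved_mass_units
print(f"Each resolution element averages about {stars_per_element:.0f} stars")
# -> Each resolution element averages about 100 stars
```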

The breakthrough came with an assist from an AI deep-learning surrogate that helped researchers overcome a particularly thorny sticking point: supernova behavior. Simulating the fine particulate matter these exploding stars spew into space creates a heavy computational load and headaches for human researchers. By feeding the deep-learning surrogate high-resolution supernova simulations, a team led by Keiya Hirashima from the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences was able to teach the program to predict how gas from a supernova would spread 100,000 years into the future. With AI sweating the small stuff, the rest of the simulation was freed up to focus on the big picture.
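
To make the idea concrete, here is a purely illustrative Python sketch of how a surrogate-coupled simulation step might look. It is not the RIKEN team's code: the `surrogate_predict` stub, the 1-D gas array, and the toy spreading kernel are all assumptions standing in for a trained deep-learning model and a full 3-D hydrodynamics solver. Only the overall pattern, in which the global solver advances the galaxy while a pretrained surrogate fills in the expensive supernova aftermath, reflects the approach described above.

```python
import numpy as np

def surrogate_predict(local_gas, injected_energy):
    """Hypothetical stand-in for the trained deep-learning surrogate.

    In the work described above the surrogate was trained on high-resolution
    supernova simulations; here it is just a toy that smooths the local gas
    and scales it with the injected energy, to show where such a model
    would plug in.
    """
    kernel = np.array([0.25, 0.5, 0.25])            # toy spreading profile
    spread = np.convolve(local_gas, kernel, mode="same")
    return spread * (1.0 + 1e-4 * injected_energy)  # toy energy coupling

def step_galaxy(gas_density, supernova_sites):
    """One coarse timestep: the global solver would handle gravity and
    hydrodynamics, while the surrogate fills in the ~100,000-year aftermath
    of each supernova instead of resolving it directly."""
    gas = gas_density.copy()
    for site in supernova_sites:
        lo, hi = max(site - 2, 0), min(site + 3, gas.size)
        gas[lo:hi] = surrogate_predict(gas[lo:hi], injected_energy=1.0)
    # ...large-scale dynamics for the rest of the galaxy would run here...
    return gas

# Toy 1-D "galaxy": 20 cells of uniform gas with supernovae in cells 5 and 12.
gas = np.ones(20)
print(step_galaxy(gas, supernova_sites=[5, 12]).round(3))
```

Even in this toy, the division of labor the article describes is visible: the expensive small-scale physics is confined to small patches and delegated to a cheap learned model, so the global step no longer has to resolve it.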

The result is a new model that not only includes 100 times more stars than the best simulations to date, but one that was generated over 100 times faster. The model was recently presented at SC ’25, an international supercomputing conference.

Read more: “The Galaxy That Got Too Big”

Going forward, Hirashima and his team have high hopes for their new hybrid modeling technique. “I believe that integrating AI with high-performance computing marks a fundamental shift in how we tackle multi-scale, multi-physics problems across the computational sciences,” Hirashima said in a statement. “This achievement also shows that AI-accelerated simulations can move beyond pattern recognition to become a genuine tool for scientific discovery—helping us trace how the elements that formed life itself emerged within our galaxy.”

In addition to modeling galaxies, their new approach also promises to shed light on more earthly phenomena, with potential applications in oceanography, meteorology, and the study of climate change.

Lead image: NASA/JPL-Caltech/ESA/CXC/STScI
