Over a century ago, in 1917, Albert Einstein proposed the first cosmological model based on the theory of general relativity. In the 1920s and 1930s, Georges Lemaître pushed the boundaries of this model, anticipating many concepts that are now part of today’s standard cosmological model, including dark matter and dark energy, and even hinting at ideas later explored by Stephen Hawking and John Wheeler in quantum cosmology.
Today, thanks to observational programs enabled by instruments like the Planck satellite and, more recently, Euclid, we have an enormous volume of observations related to galaxies, galaxy clusters, and the large structures that bind them. To extract insights into the physics underlying these structures and their formation, powerful simulations are necessary, utilizing supercomputers and, more recently, artificial intelligence tools.
The aim is to see whether numerical simulations, based on equations and certain observations, predict the features of the Universe as we observe it today and how those features have evolved since the Big Bang. This allows for testing equations and advanced theories to understand the observable cosmos, including the nature of dark matter and dark energy. Computers can perform calculations beyond human capability, particularly when dealing with nonlinear equations.
In recent decades, increasingly sophisticated simulations have been conducted, leveraging advances in computational power, like the famous Millennium Simulation. Many of these simulations were made possible by the Virgo Consortium, founded in 1994 to carry out cosmological simulations on supercomputers. It rapidly evolved into an international group of scientists from the UK, Germany, the Netherlands, Canada, the United States, and Japan.
Simulations Posed Problems When Considering Only Dark Matter
Virgo, for instance, has had access to world-class computational resources in the UK, notably at Durham University, which carried out several simulations, including one called EAGLE (Evolution and Assembly of GaLaxies and their Environments). Today, researchers from this university, along with colleagues elsewhere, have gone a step further: as an international team of astronomers, they have created Flamingo, considered the largest cosmological computer simulation ever conducted, incorporating both dark matter and ordinary matter.
Flamingo is the latest project of the Virgo Consortium, with an acronym standing for “Full-hydro Large-scale Structure Simulations with All-sky Mapping for the Interpretation of Next Generation Observations.” It employs 300 billion “particles,” each with the mass of a small galaxy, within a cubic volume spanning ten billion light-years.
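A back-of-envelope calculation, using only the two headline numbers quoted above (300 billion particles, a box ten billion light-years on a side) and not any figure from the Flamingo papers themselves, gives a feel for the resolution such a simulation can achieve:

```python
# Estimate the mean inter-particle spacing of the simulation box --
# roughly the smallest scale on which structure can be resolved.
# Both inputs are the headline numbers quoted in the text.

N_PARTICLES = 300e9    # particles in the simulation
BOX_SIDE_LY = 10e9     # box side length, in light-years

particles_per_side = N_PARTICLES ** (1 / 3)
mean_spacing_ly = BOX_SIDE_LY / particles_per_side

print(f"particles per side: {particles_per_side:,.0f}")
print(f"mean spacing: {mean_spacing_ly / 1e6:,.1f} million light-years")
```

The spacing that comes out, on the order of 1.5 million light-years, is comparable to the separation between large galaxies, which is consistent with each particle carrying the mass of a small galaxy.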
Here’s the context: According to the standard cosmological model, after the Big Bang, concentrations of dark matter collapsed rapidly, leading to the collapse of ordinary baryonic matter, which forms stars, planets, and, of course, ourselves. Information about ordinary matter concentrations can be obtained from the time of the emission of the cosmic microwave background radiation, providing initial conditions for calculating the evolution of matter into galaxies and galaxy clusters over 13.8 billion years.
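The growth of those initial matter concentrations can be illustrated with a toy calculation. In linear theory, the amplitude of matter fluctuations in a flat matter-plus-dark-energy universe grows according to a standard closed-form integral; the sketch below uses illustrative, Planck-like parameter values (not the Flamingo calibration) to show how fluctuations grow from the cosmic-microwave-background era to today, and how dark energy slows that growth at late times:

```python
# Linear growth factor D(a) of matter fluctuations in a flat
# LambdaCDM universe, where a is the cosmic scale factor (a = 1
# today).  OMEGA_M is an illustrative matter density, not a value
# taken from the Flamingo papers.

import numpy as np

OMEGA_M = 0.31            # matter density today (illustrative)
OMEGA_L = 1.0 - OMEGA_M   # dark-energy density (flat universe)

def E(a):
    """Dimensionless Hubble rate H(a)/H0 for flat LambdaCDM."""
    return np.sqrt(OMEGA_M / a**3 + OMEGA_L)

def growth_factor(a, n=20000):
    """Linear growth factor, normalised so D(a) -> a deep in matter domination."""
    ap = np.linspace(1e-6, a, n)
    integrand = 1.0 / (ap * E(ap)) ** 3
    integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(ap))
    return 2.5 * OMEGA_M * E(a) * integral

# Fluctuations keep growing, but dark energy suppresses late-time growth,
# so D(a)/a falls below 1 as a -> 1:
for a in (0.1, 0.5, 1.0):
    print(f"a = {a:4.1f}  D(a) = {growth_factor(a):.3f}  D/a = {growth_factor(a)/a:.3f}")
```

A full simulation like Flamingo solves the nonlinear problem that this linear sketch cannot capture, which is precisely why supercomputers are needed.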
In the early 1980s, computational power in cosmology was rudimentary, and since dark matter was believed to dominate mass and gravitational force over ordinary matter, it was assumed that cosmic structures were formed solely from dark matter. This was much easier and less computationally intensive to simulate than accounting for ordinary matter.
Despite some significant successes, these simulations faced issues, particularly with predictions based on the most credible dark matter models. One major failure: they predicted far more dwarf galaxies around large galaxies like the Milky Way or Andromeda than are actually observed. In reality, such dwarf galaxies are rare.
Realistic Simulations Involving Supernovae and Black Holes
The most conservative solution to this problem, without challenging the standard model, involves conducting more complex simulations that consider supernova explosions and the accretion of ordinary matter by massive black holes while also accounting for the gravitational pull of ordinary matter distributions. The force of supernova explosions, which eject ordinary matter from galaxies, or the winds produced by black holes devouring matter, can alter the growth of galaxies, including dwarf galaxies, by ejecting gas into the intergalactic medium. In general, the pressure force of ejected gas can oppose gravitational contraction, affecting galaxy formation.
Over the past decade, with increasing computing power, it has become possible to conduct more realistic simulations of galaxy formation and evolution, taking ordinary matter into account. Flamingo’s simulation now stands as the pinnacle of these efforts to replicate the evolution of the observable universe from the Big Bang to the present. This work has resulted in three articles published in the Monthly Notices of the Royal Astronomical Society: one describing the methods, another presenting the simulations, and the third examining to what extent the simulations reproduce the large-scale structure of the universe and the populations of galaxies, like those explored by the James Webb Space Telescope.
By varying the fundamental parameters of the standard cosmological model with cold dark matter in a simulation, researchers can attempt to find the parameters that best match observations. Cosmologists have particularly varied factors such as the strength of galactic winds, the mass of neutrinos, and key cosmological parameters.
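The calibration strategy described above can be caricatured in a few lines. Everything in this sketch is a made-up stand-in — the mock observable, the parameter names, and the chi-squared figure of merit are all hypothetical — but it shows the shape of the procedure: scan a grid of parameter values and keep the combination whose predicted observable best matches the measurement.

```python
# Schematic parameter scan: minimise a chi-squared between a model
# prediction and an "observed" value.  The function, numbers, and
# parameter names below are invented for illustration; the real
# pipeline compares full simulations to real survey data.

import itertools
import numpy as np

def mock_observable(wind_strength, neutrino_mass_ev):
    """Hypothetical scalar summary statistic a simulation run would output."""
    return 0.83 - 0.05 * wind_strength - 0.02 * neutrino_mass_ev

OBSERVED, SIGMA = 0.76, 0.01   # made-up measurement and uncertainty

best = min(
    itertools.product(np.linspace(0, 2, 21), np.linspace(0, 0.5, 11)),
    key=lambda p: ((mock_observable(*p) - OBSERVED) / SIGMA) ** 2,
)
print(f"best fit: wind strength = {best[0]:.2f}, neutrino mass = {best[1]:.3f} eV")
```

In practice each "function evaluation" is a full supercomputer run, which is why Flamingo's systematic variation of galactic winds, neutrino masses, and cosmological parameters is such a large undertaking.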
The masses of ordinary neutrinos are not yet well known, but they are understood to influence the formation of small structures such as dwarf galaxies, which could help explain their scarcity. Since neutrinos have been moving rapidly since the Big Bang without interacting with other ordinary matter particles, they are considered a component of hot dark matter (the faster, on average, the particles in a gas move, the hotter the gas).
Following the Hubble Tension, There’s Now the S8 Tension With Galaxy Clusters
For some time, there has been a discrepancy in cosmology between measurements of the rate of space expansion obtained from cosmic microwave background data, approximately 380,000 years after the Big Bang, and measurements from supernovae observations covering billions of years. This inconsistency could indicate either a concealed error or the need to introduce new physics.
Specifically, this is a disagreement between estimates of what’s known as the Hubble-Lemaître constant, and it’s increasingly referred to as the Hubble tension. A second, analogous discrepancy has now emerged: the S8 tension.
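The scale of the Hubble disagreement can be quantified with rounded values widely quoted in the literature (illustrative numbers, not taken from the articles discussed here):

```python
# Two independent estimates of the Hubble-Lemaitre constant, in
# km/s/Mpc, and the statistical significance of their disagreement.
# The values are rounded figures commonly cited for the early-universe
# (CMB-based) and late-universe (supernova distance ladder) methods.

cmb_h0, cmb_err = 67.4, 0.5   # early-universe, Planck-like
sn_h0, sn_err = 73.0, 1.0     # late-universe, distance-ladder-like

tension_sigma = abs(sn_h0 - cmb_h0) / (cmb_err**2 + sn_err**2) ** 0.5
print(f"disagreement: about {tension_sigma:.1f} sigma")
```

A roughly five-sigma disagreement is far too large to dismiss as a statistical fluke, which is why it must be either a hidden systematic error or new physics.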
What does this mean?
Once again, the study of the cosmic microwave background provides the key measurement: the density fluctuations of matter, inferred from its temperature and polarization fluctuations, tell us that after more than 10 billion years of evolution the observable universe should contain a certain number of galaxy clusters per unit volume.
However, as with dwarf galaxies, the numbers don’t add up. There have been fewer galaxy clusters formed in the past few billion years than expected. The density of galaxy clusters can be studied over the past few billion years by observing the gravitational lensing effects they produce.
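The S8 parameter behind this tension is conventionally defined as S8 = σ8 · √(Ωm/0.3), where σ8 measures the amplitude of matter fluctuations on a standard reference scale and Ωm is the matter density. The sketch below uses rounded, illustrative values for the early-universe (CMB-inferred) and late-universe (lensing-inferred) estimates; they are not results from Flamingo:

```python
# Conventional definition of the S8 clustering-amplitude parameter,
# evaluated with rounded, illustrative inputs for the two competing
# inferences (not values from the Flamingo papers).

def s8(sigma8, omega_m):
    """S8 = sigma_8 * sqrt(Omega_m / 0.3)."""
    return sigma8 * (omega_m / 0.3) ** 0.5

s8_cmb = s8(0.81, 0.315)     # early-universe, CMB-like inference
s8_lensing = s8(0.76, 0.30)  # late-universe, lensing-like inference

print(f"S8 (CMB-like)     = {s8_cmb:.3f}")
print(f"S8 (lensing-like) = {s8_lensing:.3f}")
```

The late-universe value coming out lower than the early-universe one is the S8 tension in a nutshell: clustering today appears weaker than the standard model, anchored to the cosmic microwave background, predicts.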
The simulations conducted with Flamingo do not reproduce the current measured value of the S8 parameter, which is lower than the one deduced from the standard cosmological model. This, too, could indicate the need for new physics, particularly regarding dark matter particles that might interact through unknown forces. Such models have already been proposed in certain particle physics theories beyond the standard model of high-energy physics.