In 2011, the Nobel Prize in Physics was awarded to Saul Perlmutter, Brian Schmidt, and Adam Riess for discovering that the expansion of the universe is accelerating. They came to this conclusion by observing faraway exploding stars. These distant supernovae showed that the cosmos is getting bigger ever faster, because the farther away a supernova was, the faster it appeared to be receding from us.
In a recent paper published in Monthly Notices of the Royal Astronomical Society: Letters, a group of astrophysicists from the University of Canterbury in New Zealand questioned that Nobel Prize-winning finding.
The claim that our universe is not only expanding, but that the expansion is speeding up, has no simple explanation. There is no normal type of energy or matter that can make this happen; it requires a substance with negative pressure, a property that we have never otherwise observed. Indeed, we have difficulty even interpreting what negative pressure would mean. Physicists have simply called whatever is causing this accelerated expansion “dark energy.”
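For readers who want the textbook version of this argument (it is not spelled out in the article, but it is the standard reasoning behind the claim), the Friedmann acceleration equation of general-relativistic cosmology reads

\[
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right),
\]

where a is the cosmic scale factor, ρ the energy density, and p the pressure. An accelerating expansion, ä > 0, requires p < −ρc²/3, which no ordinary matter or radiation provides; a cosmological constant behaves like a fluid with p = −ρc² and therefore does the job.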
The simplest type of dark energy is one that is constant both in space and in time. This “cosmological constant” is now one of the key parameters in the standard model of cosmology—the Lambda cold dark matter model—alongside other parameters such as the amount of dark matter, a type of invisible matter that supposedly accumulates around galaxies.
The cosmological constant, if it exists, is thought to make up more than two thirds of the matter-energy budget of the universe. We can already see its effects: The extra expansion makes it harder for galaxies to clump and reduces the number of new stars that form. The cosmological constant would also determine the ultimate fate of our universe, dooming it to an ever-faster acceleration that will cool the universe and everything in it to absolute zero—eternal darkness, and very, very cold.
But the authors of the new paper boldly assert that the standard model with the cosmological constant is just the wrong model for the universe, and that if one uses what they assert is the correct model, we do not need dark energy at all, not even in the form of a cosmological constant.
This is not a bombastic “Einstein was wrong” claim: The authors use the same mathematical framework that astrophysicists commonly use, which is Einstein’s theory of general relativity. According to this theory, all forms of energy—including matter, radiation, and pressure—curve space, and the curvature in turn influences how those forms of energy move. The authors of the new paper, led by Antonia Seifert, don’t question this. They question instead how we use Einstein’s math.
Because there is a lot of matter in the universe, in practice we can’t do calculations with the exact distribution of stars and galaxies that we observe. There are just too many! Therefore, we make a big simplification. We say that if we average over sufficiently large distances, then stars and galaxies have the same distribution everywhere. It does not matter which part of the universe you find yourself in; it always looks pretty much the same. This idea is called the “cosmological principle,” and if we make use of it, we move from the overarching theory of general relativity to a specific model. And it is this model, the Lambda cold dark matter model, that requires dark energy to accelerate the expansion of the universe.
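In sketch form (my summary of the standard derivation, not something the paper spells out): assuming homogeneity and isotropy forces the geometry of spacetime into the Friedmann-Lemaître-Robertson-Walker form, and Einstein’s equations then collapse to the Friedmann equation,

\[
\left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\rho - \frac{kc^2}{a^2} + \frac{\Lambda c^2}{3},
\]

in which the entire universe is described by a single scale factor a(t) and a handful of parameters, such as the matter density and the cosmological constant Λ.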
But strictly speaking, we already know that this model is, of course, wrong. Galaxies and galaxy clusters are not uniformly distributed. Instead, they form a spongy structure, if you can imagine a sponge some dozen billion light-years wide, made of galaxies, and expanding. It has patches with many galaxies in them—such as the one that we find ourselves in—but then there are big voids in between.
This is what the authors of the new paper looked at: a universe that is a patchwork of matter-filled regions like our own and voids. And all these regions interact, pushing and pulling on each other. Now not only do we have an expanding sponge that’s some dozen billion light-years wide, we have a sponge that doesn’t expand at the same rate everywhere.
The idea itself isn’t new—it’s been around for as long as general relativity itself. The problem is that it’s mathematically difficult to deal with. This is because in general relativity, the patches of different densities and the voids need to be suitably matched together, so that space itself remains smooth. And there is no agreement among physicists on how to do this correctly.
Nevertheless, about 15 years ago, David Wiltshire—one of the co-authors of the new paper—put forward a model that does exactly this. He called it the “timescape.” This is because, in Einstein’s theory, time runs at different speeds depending on the amount of matter that a region contains. In this timescape model, what we observe in our vicinity, in our own patch, is governed by different laws than what happens on average at larger distances. It is much like how what you observe in your home city may be a poor description of what happens in the world on average. So we might feel that time runs at a normal cadence here in our neighborhood of the universe, but a few galaxy clusters north, time might run at a slower pace.
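The underlying effect is ordinary gravitational time dilation (the timescape model’s actual bookkeeping is more involved, but this is the ingredient it builds on). In the weak-field limit of general relativity, a clock sitting at gravitational potential Φ ≤ 0 ticks at the rate

\[
d\tau \approx \left(1 + \frac{\Phi}{c^2}\right) dt,
\]

so clocks in dense, matter-rich regions (where Φ is more negative) run slower than clocks in nearly empty voids, and those small rate differences accumulate over billions of years.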
Wiltshire and his co-authors say that the idea that the universe undergoes an accelerated expansion as a whole is a misinterpretation of what we observe in our vicinity. In the standard cosmological model, these two things—what we observe nearby and what happens in the entire universe—must be identical, because this is how the model works. In Wiltshire’s timescape, this is no longer so. We can then reconcile the supernovae observations with a universe that expands but whose expansion doesn’t accelerate, all without the need for dark energy.
To reach this conclusion, the group of researchers compared how well the Lambda cold dark matter model and the timescape model fit a catalogue of supernova observations (which has grown a lot since the Nobel Prize-winning discoveries). In the timescape model, one does not need the cosmological constant, but one introduces a new quantity: the ratio of matter-filled patches of the universe to voids. The authors use a Bayesian analysis, which quantifies how probable a model is given the data, and find that the timescape model actually fits the data better.
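To make the logic of such a comparison concrete, here is a minimal, self-contained sketch in Python of Bayesian model comparison on a toy supernova catalogue. It is not the paper’s analysis: it uses simulated data, fixes the Hubble constant, and uses an empty “coasting” universe as a stand-in for a no-dark-energy model (the timescape’s distance-redshift relation is more involved). The point is only to show how distance moduli plus priors get turned into a Bayes factor between two models.

```python
# Toy Bayesian model comparison on a simulated supernova catalogue.
# Models: (a) flat Lambda-CDM with Omega_m free, (b) an empty "coasting"
# universe with no free parameters (a stand-in for a no-dark-energy model).
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458   # speed of light [km/s]
H0 = 70.0             # Hubble constant [km/s/Mpc], fixed for simplicity

def dl_lcdm(z, omega_m):
    """Luminosity distance [Mpc] in flat Lambda-CDM."""
    integrand = lambda zp: 1.0 / np.sqrt(omega_m * (1 + zp)**3 + (1 - omega_m))
    comoving, _ = quad(integrand, 0.0, z)
    return (1 + z) * (C_KM_S / H0) * comoving

def dl_coasting(z):
    """Luminosity distance [Mpc] in an empty, linearly expanding universe."""
    return (C_KM_S / H0) * z * (1 + z / 2)

def mu_from_dl(dl_mpc):
    """Distance modulus from a luminosity distance in Mpc."""
    return 5.0 * np.log10(dl_mpc) + 25.0

# --- simulate a toy catalogue (truth: Lambda-CDM with Omega_m = 0.3) ---
rng = np.random.default_rng(1)
z_obs = rng.uniform(0.02, 1.2, 200)
sigma = 0.15  # magnitudes of scatter per supernova
mu_obs = np.array([mu_from_dl(dl_lcdm(z, 0.3)) for z in z_obs])
mu_obs += rng.normal(0.0, sigma, z_obs.size)

def log_likelihood(mu_model):
    """Gaussian log-likelihood of the catalogue given model distance moduli."""
    resid = mu_obs - mu_model
    return -0.5 * np.sum(resid**2 / sigma**2 + np.log(2.0 * np.pi * sigma**2))

# --- evidence for Lambda-CDM: marginalize Omega_m over a flat prior ---
omega_grid = np.linspace(0.05, 0.95, 91)
loglikes = np.array([
    log_likelihood(np.array([mu_from_dl(dl_lcdm(z, om)) for z in z_obs]))
    for om in omega_grid
])
d_omega = omega_grid[1] - omega_grid[0]
prior = 1.0 / (omega_grid[-1] - omega_grid[0])   # flat prior density on Omega_m
peak = loglikes.max()                            # subtract peak for numerical stability
ln_Z_lcdm = peak + np.log(np.sum(np.exp(loglikes - peak)) * d_omega * prior)

# --- evidence for the coasting model: no free parameters, so Z = likelihood ---
ln_Z_coast = log_likelihood(np.array([mu_from_dl(dl_coasting(z)) for z in z_obs]))

print("ln Bayes factor (Lambda-CDM vs coasting):", ln_Z_lcdm - ln_Z_coast)
```

Because the toy data here are generated from Lambda cold dark matter, the Bayes factor will favor it; the paper’s claim is that on the real, much larger supernova catalogue, the same kind of comparison comes out in favor of the timescape model.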
This agrees with earlier findings, which have been building up for some time, that call into question the cosmological principle underlying the Lambda cold dark matter model—and with it, the discovery of dark energy. For several decades now, astrophysicists have found structures in the universe that are too large to be compatible with the cosmological principle, such as the “Great Wall,” a collection of galaxies about 1 billion light-years away from us that extends over 1.5 billion light-years; the “Huge quasar group,” which spans 4 billion light-years; and the recently discovered Big Ring, which spans 1 billion light-years. According to the Lambda cold dark matter model, such large structures should not exist. Yet they do.
While I am sympathetic to the idea of the timescape universe and think it has a lot of potential, I also think it is too early to declare the end of dark energy. Analyses like the one in the new paper depend a lot on their assumptions (priors) and the data used, and I would not be surprised if another group soon claims that Lambda cold dark matter is superior after all. Questions like this one take time to settle.
Think, for example, of the long-standing argument about whether dark matter or modified gravity is a better fit to astronomical observations. Depending on which assumptions you make (the priors you choose), the same data is either evidence for modified gravity, or against it, or requires Newton’s constant to vary.
I see another problem with the timescape model. Even though the equations that Seifert, Wiltshire, and their colleagues used in the new paper have the same number of parameters as the Lambda cold dark matter model—meaning just as many numbers need to be specified for the model to be predictive—their full model is significantly more complicated than the standard one. And, at least to me, it has remained unclear how to do calculations in their model. My own cognitive shortcomings say nothing about the validity of the model, of course, but the fact that using the model would require a steep learning curve makes me expect that its adoption in the astro-community will be slow.
As they say, all models are wrong, but some are useful, and whatever your misgivings about dark energy, it certainly has proved to be useful for explaining many observations, such as the features of the cosmic microwave background and the growth of galactic structures. It will take a lot more than one paper to convince astrophysicists that dark energy should be declared dead.