It follows the publication this month of a new look at supernovae in our Universe, which the researchers say gives only a "marginal detection" of the acceleration of the Universe.
This seems to be a big deal, because the 2011 Nobel Prize was awarded to the leaders of two teams that used supernovae to discover that the expansion of the Universe is speeding up.
But never have I seen such a storm in a teacup. The new analysis, published in Scientific Reports, barely changes the original result, but puts a different (and in my opinion, misleading) spin on it.
So why does this new paper claim that the detection of acceleration is "marginal"? Well, it is marginal if you only use a single data set. After all, most big discoveries are initially marginal. If they were more obvious, they would have been discovered sooner.
The evidence, so far
The supernova data alone could, at only a slight stretch, be consistent with a Universe that neither accelerates nor decelerates. This has been known since the original discovery, and is not under dispute.
But if you also add one more piece of information - for example, that matter exists - then there’s nothing marginal about it. New physics is clearly required.
In fact, if the Universe didn’t accelerate or decelerate at all, which is an old proposal revisited in this new paper, new physics would still be required.
The important point is that even if you took all of the supernova data and threw it in the bin, we would still have ample evidence that the Universe's expansion is accelerating.
The pattern of galaxies isn't actually random, so we can use it to effectively lay grid paper over the Universe and measure how its size changes with time.
This data alone shows that the expansion of the Universe is accelerating, independently of any supernova information. The Nobel Prize was awarded only after this and many other observational techniques confirmed the supernova findings.
Something missing in the Universe
Another example is the Cosmic Microwave Background (CMB), which is the leftover afterglow from the big bang and is one of the most precise observational measurements of the Universe ever made. It shows that space is very close to flat.
Meanwhile observations of galaxies show that there simply isn’t enough matter or dark matter in the Universe to make space flat. About 70 percent of the Universe is missing.
So when observations of supernovae found that 70 percent of the Universe is made up of dark energy, that solved the discrepancy.
The supernovae were actually measured before the CMB, so they essentially predicted that the CMB would find a flat Universe, a prediction that was confirmed beautifully.
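The bookkeeping behind this "missing 70 percent" argument is simple enough to sketch. The values below are illustrative round numbers, not precise measurements:

```python
# A minimal sketch of the flatness argument, using illustrative round numbers.
omega_matter = 0.3      # matter + dark matter, from galaxy observations
omega_total_flat = 1.0  # a spatially flat Universe, as the CMB shows

# The "missing" component needed to make space flat:
omega_missing = omega_total_flat - omega_matter
print(f"Missing fraction: {omega_missing:.0%}")  # about 70 percent

# Supernovae independently measure dark energy at about this level,
# which is what resolves the discrepancy:
omega_dark_energy_sn = 0.7
assert abs(omega_missing - omega_dark_energy_sn) < 0.05
```

All densities here are expressed in units of the critical density of the Universe, as in the figure discussed below.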
So the evidence for some interesting new physics is now overwhelming.
I could go on, but everything we know so far supports the model in which the Universe accelerates. For more detail see this review I wrote about the evidence for dark energy.
What is this 'dark energy'?
One of the criticisms the new paper levels at standard cosmology is that the conclusion that the Universe is accelerating is model dependent. That’s fair enough.
Usually cosmologists are careful to say that we are studying 'dark energy', which is the name we give to whatever is causing the apparent acceleration of the expansion of the Universe. (Often we drop the "apparent" in that sentence, but it is there by implication.)
'Dark energy' is a blanket term we use to cover many possibilities, including that vacuum energy causes acceleration, or that we need a new theory of gravity, or even that we’ve misinterpreted general relativity and need a more sophisticated model.
The key feature that is not in dispute is that there is some significant new physics apparent in this data. There is something that goes beyond what we know about how the Universe works - something that needs to be explained.
So let’s look at what the new paper actually did. To do so, let’s use an analogy.
Margins of measurement
Imagine you’re driving a car down a road with a 60 km/h speed limit. You measure your speed to be 55 km/h, but your speedometer has some uncertainty. Taking that into account, you are 99 percent sure that you are travelling between 51 km/h and 59 km/h.
Now your friend comes along and analyses your data slightly differently. She measures your speed to be 57 km/h. Yes, it is slightly different from your measurement, but still consistent, because your speedometer is not that accurate.
But now your friend says: "Ha! You were only marginally below the speed limit. There’s every possibility that you were speeding!"
In other words, the answer didn’t change significantly, but the interpretation given in the paper takes the extreme of the allowed region and says "maybe the extreme is true".
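The analogy can be made concrete with a few lines of code. This is a sketch with assumed numbers: Gaussian errors, and a speedometer uncertainty chosen so the 99 percent interval spans roughly 51-59 km/h as in the story above:

```python
# Sketch of the speed-measurement analogy (illustrative numbers only).
from statistics import NormalDist

speed = 55.0   # your measured speed (km/h)
sigma = 1.55   # assumed speedometer uncertainty (km/h), chosen for the analogy

# Two-sided 99 percent confidence interval, assuming Gaussian errors:
z99 = NormalDist().inv_cdf(0.995)
lo, hi = speed - z99 * sigma, speed + z99 * sigma
print(f"99% interval: {lo:.1f} to {hi:.1f} km/h")

# Your friend's value sits inside the interval: consistent, not a contradiction.
friend = 57.0
assert lo < friend < hi

# The "marginal" claim amounts to pointing at the upper edge of the interval,
# which approaches the 60 km/h limit, and saying "maybe the extreme is true".
```

The same logic applies to the supernova analysis: a non-accelerating Universe sits at the very edge of the allowed region, not at its centre.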
For those who like detail, the three standard deviation limit of the supernova data is big enough (just) to include a non-accelerating Universe. But that is only if there is essentially no matter in the Universe and you ignore all other measurements (see figure, below).
Above is a reproduction of Figure 2 from the new research paper with annotations added. The contours encircle the values of the matter density and dark energy (in the form of a cosmological constant) that best fit the supernova data (in units of the critical density of the Universe).
The contours show one, two, and three standard deviations. The best fit is marked by a cross. The amount of matter measured by other observations lies approximately around the orange line.
The contours lie almost entirely in the accelerating region, and the tiny patch that is not yet accelerating will nevertheless accelerate in the future.
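The boundary between the accelerating and non-accelerating regions of that figure follows from a standard result: in a Universe with matter density Om and a cosmological constant OL (both in units of the critical density), the expansion accelerates today when the deceleration parameter q0 = Om/2 - OL is negative. A hedged sketch, with illustrative values:

```python
# Acceleration condition behind the contour figure.
# Om, OL are matter and cosmological-constant densities in critical units.
# The expansion accelerates today when q0 = Om/2 - OL < 0.
def accelerating_now(om: float, ol: float) -> bool:
    return om / 2.0 - ol < 0.0

# Values near the best fit (illustrative) lie well inside the accelerating region:
assert accelerating_now(0.3, 0.7)

# A point with small OL can still be decelerating today. But matter dilutes
# as the Universe expands while a cosmological constant does not, so any
# OL > 0 eventually dominates and drives acceleration in the future.
assert not accelerating_now(0.8, 0.1)  # q0 > 0: not accelerating yet
```

This is why even the tiny non-accelerating patch inside the contours corresponds to a Universe that will accelerate eventually.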
Improving the analysis
This new paper is trying to do something laudable. It is trying to improve the statistical analysis of the data.
As we get more and more data and the uncertainty on our measurement shrinks, it becomes more and more important to take into account every last detail.
In fact, with the Dark Energy Survey we have three people working full-time on testing and improving the statistical analysis we use to compare supernova data to theory.
We recognise the importance of improved statistical analysis because we’re soon going to have about 3,000 supernovae with which to measure the acceleration far more precisely than the original discoveries, which only had 52 supernovae between them.
The sample that this new paper re-analyses contains 740 supernovae.
One final note about the conclusions in the paper. The authors suggest that a non-accelerating Universe is worth considering. That’s fine. But you and I, the Earth, the Milky Way and all the other galaxies should gravitationally attract each other.
So a Universe that just expands at a constant rate is actually just as strange as one that accelerates. You still have to explain why the expansion doesn’t slow down due to the gravity of everything it contains.
So even if the non-acceleration claim made in this paper is true, the explanation still requires new physics, and the search for the "dark energy" that explains it is just as important.
Healthy scepticism is vital in research. There is still much debate over what is causing the acceleration, and whether it is just an apparent acceleration that arises because our understanding of gravity is not yet complete.
Indeed that is what we as professional cosmologists spend our entire careers investigating. What this new paper and all the earlier papers agree on is that there is something that needs to be explained.
The supernova data show something genuinely weird is going on. The solution might be acceleration, or a new theory of gravity. Whatever it is, we will continue to search for it.