A new value for the Hubble constant – the expansion rate of the universe – has been calculated by an international group of astrophysicists. The team used primordial distance scales to study more than 200 supernovae observed by telescopes in Chile and Australia. The new result agrees well with previous values of the constant obtained using a specific model of cosmic expansion, while disagreeing with more direct observations from the nearby universe – so exacerbating a long-running disagreement between cosmologists and astronomers.

The Hubble constant is calculated by looking at distant celestial objects and determining how fast they are moving away from Earth. A plot of the speeds of the objects versus their distance from Earth falls on a straight line, the slope of which is the Hubble constant.
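The slope-fitting step can be sketched in a few lines. The galaxy distances and speeds below are made-up illustrative numbers (not data from any survey), generated to sit roughly on a Hubble-law line:

```python
import numpy as np

# Hypothetical galaxy sample: distances in megaparsecs and recession
# speeds in km/s, chosen to scatter around a Hubble-law line.
distances = np.array([50.0, 120.0, 300.0, 550.0, 800.0])        # Mpc
speeds = np.array([3400.0, 8500.0, 21200.0, 38300.0, 56500.0])  # km/s

# Hubble's law, v = H0 * d, is a straight line through the origin,
# so the least-squares slope is the estimate of the Hubble constant.
h0 = np.sum(distances * speeds) / np.sum(distances ** 2)
print(f"H0 ~ {h0:.1f} km/s/Mpc")
```

Because the line passes through the origin, a single-parameter least-squares fit suffices; real analyses must also model the measurement uncertainties on both axes.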

Obtaining an object’s speed is straightforward and involves measuring the redshift of the light it emits, but quantifying its distance is much more complicated. Historically, this has been done using a “distance-ladder”, whereby progressively greater length scales are measured by using one type of “standard candle” to calibrate the output of another standard candle. The distance to stars known as Cepheid variables (one type of standard candle) is first established via parallax, and that information is used to calibrate the output of type Ia supernovae (another type of standard candle) located in galaxies containing Cepheids. The apparent brightness of other supernovae can then be used to work out distances to galaxies further away.
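The final rung of the ladder rests on a simple piece of arithmetic: once a standard candle's intrinsic (absolute) brightness is calibrated, its apparent brightness gives its distance via the distance modulus. A minimal sketch, with a hypothetical apparent magnitude (the absolute magnitude of roughly −19.3 is a standard calibration for type Ia supernovae):

```python
def candle_distance_mpc(apparent_mag, absolute_mag):
    """Distance implied by the distance modulus
    m - M = 5 * log10(d / 10 pc), returned in megaparsecs."""
    d_parsec = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return d_parsec / 1e6

# Type Ia supernovae peak near absolute magnitude M ~ -19.3;
# the apparent magnitude here is purely illustrative.
print(candle_distance_mpc(apparent_mag=16.7, absolute_mag=-19.3))
```

The ladder structure enters through the absolute magnitude: it is fixed by supernovae in galaxies whose distances are already known from Cepheids, which in turn are anchored by parallax.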

Large discrepancy

This approach has been refined over the years and has most recently yielded a Hubble constant of 73.5 ± 1.7 kilometres per second per megaparsec (one megaparsec being about 3.26 million light-years). That number, however – obtained by starting close to Earth and moving outwards – is at odds with calculations of the Hubble constant that take the opposite approach, moving inwards from the dawn of time. The baseline in that latter case comes from length scales of temperature fluctuations in the radiation dating back to just after the Big Bang, known as the cosmic microwave background. The cosmic expansion rate at that time is extrapolated to the present day by assuming that the universe’s growth has accelerated under the influence of a particular kind of dark energy. Using the final results from the European Space Agency’s Planck satellite, a very different Hubble constant of 67.4 ± 0.5 is obtained.
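The size of the disagreement can be quantified directly from the two quoted values. Treating the stated uncertainties as independent and Gaussian (a simplifying assumption), the gap works out to several standard deviations:

```python
import math

# The two headline values (km/s/Mpc) with their quoted 1-sigma errors.
h0_ladder, err_ladder = 73.5, 1.7   # local distance ladder
h0_planck, err_planck = 67.4, 0.5   # Planck CMB plus expansion model

# Discrepancy in units of the combined uncertainty,
# assuming independent Gaussian errors.
tension = abs(h0_ladder - h0_planck) / math.hypot(err_ladder, err_planck)
print(f"tension ~ {tension:.1f} sigma")
```

A roughly 3.4-sigma gap is large enough that a statistical fluke is unlikely, which is why attention has turned to systematic errors or new physics.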

To try to resolve the problem by using an alternative approach, scientists have in recent years created what is known as an “inverse distance ladder”. This also uses the cosmic microwave background as a starting point, but it calculates the expansion rate at a later time – about 10 billion years after the Big Bang – when the density fluctuations imprinted on the background radiation had grown to create clusters of galaxies distributed within “baryon acoustic oscillations”. The oscillations are used to calibrate the distance to supernovae – present in the galaxies – thanks to the fact that the oscillations lead to a characteristic separation between galaxies of 147 megaparsecs.
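The 147-megaparsec scale works as a "standard ruler": a feature of known physical size subtends a smaller angle on the sky the further away it is, so a measured angular scale yields a distance. A minimal small-angle sketch, with a hypothetical angular scale:

```python
import math

# The BAO scale: galaxies preferentially separated by ~147 Mpc.
R_BAO_MPC = 147.0

def ruler_distance_mpc(angle_deg):
    """Comoving distance implied by the angle the 147 Mpc ruler
    subtends on the sky (small-angle approximation, illustrative)."""
    return R_BAO_MPC / math.radians(angle_deg)

# A hypothetical angular scale of about 2 degrees:
print(ruler_distance_mpc(2.0))
```

This is the mirror image of the standard-candle logic: a candle's known brightness gives distance from apparent brightness, while a ruler's known length gives distance from apparent angular size.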

In the latest work, the Dark Energy Survey collaboration draws on galaxy data from the Sloan Digital Sky Survey as well as 207 newly studied supernovae captured by the Dark Energy Camera mounted on the 4-metre Víctor M Blanco telescope in Chile. Using spectra obtained mainly at the similarly sized Anglo-Australian Telescope in New South Wales, the collaboration calculates a value for the Hubble constant of 67.8 ± 1.3 – so agreeing with the Planck value while completely at odds with the conventional distance ladder.

Fewer assumptions

“The key thing with these results,” says team member Ed Macaulay of the University of Portsmouth in the UK, “is that the only physics you need to assume is plasma physics in the early universe. You don’t need to assume anything about dark energy.”

Adam Riess, an astrophysicist at the Space Telescope Science Institute in Baltimore, US, who studies the distance ladder, says that the new work “adds more weight” to the disparity in values of the Hubble constant obtained from the present and early universe. (Indeed, the distance ladder itself has gained independent support from expansion rates calculated using gravitational lensing.) He reckons that the similarity between the Planck and Dark Energy Survey results means that redshifts out to z=1 (going back about 8 billion years) are “probably not where the tension develops” and that the physics of the early universe might be responsible instead.

Chuck Bennett of Johns Hopkins University, who led the team on Planck’s predecessor WMAP, agrees. He points to a new model put forward by his Johns Hopkins colleagues Marc Kamionkowski, Vivian Poulin and others that adds extra dark energy to the universe very early on (before rapidly decaying). This model, says Bennett, “proves that it is theoretically possible to find cosmological solutions to the Hubble constant tension”.

Macaulay is more cautious. He acknowledges the difficulty of trying to find an error, reckoning that potential systematic effects in any of the measurements “are about ten times smaller” than the disparity. But he argues that more data are needed before any serious theoretical explanations can be put forward. To that end, he and his colleagues are attempting to analyse a further 2000 supernovae observed by the Dark Energy Camera, although they are doing so without the aid of (costly) spectroscopic analysis. Picking out the right kind of supernovae and then working out their redshift “will be very difficult,” he says, “and not something that has been done with this many supernovae before”.

A preprint describing the research is available on arXiv.