
Dark Energy Is Driving the Universe Apart. We May Finally Know Why.
Season 11 Episode 22 | 19m 2s
The universe expands faster. “Dark energy” may not be constant after all.
We’ve known since 1929 that the universe is expanding, and since 1998 that it’s speeding up. The unknown force behind this acceleration is called “dark energy,” assumed to stay constant in density. But new evidence hints it may change over time, possibly explaining why major measurements of the universe’s expansion rate don’t agree.

In 2014, the pale electromagnetic ghosts of an ancient exploded star barely grazed the camera of the Hubble Space Telescope.
Four points of dim light revealing the same supernova, framing the galaxy whose gravitational field had bent this light to us.
After a 9.3-billion-year journey, the images arrived in rapid succession.
But other, longer paths wove through the spacetime contortions of the surrounding galaxy cluster.
Astronomers spun up their models and predicted a single explosive encore the following year.
They waited, they prepared Hubble, and 12 months later there it was -- the laggard afterimage of the supernova, just as predicted.
It's ridiculous that we were even able to predict this.
But more incredible is its scientific value.
In a 1964 paper, Sjur Refsdal pointed out that the time-delay between gravitationally lensed supernova images could give a way to measure the expansion rate of the whole universe.
We've now done that with the eponymous Supernova Refsdal, and we've since built this up into what may become the most powerful method for unlocking the nature of dark energy.
We've known since 1929 that the universe is expanding, and since 1998 that the expansion is accelerating.
The culprit behind the acceleration is unknown, so we live with the stand-in term "dark energy".
Our modern cosmological model assumes that dark energy has a constant density--always the same amount of the outward-shoving stuff per volume of space.
But there's recent evidence to the contrary--which may be why our primary efforts to measure the expansion rate of the universe disagree with each other.
Let's review those methods.
On the one hand, studying the oldest light in the universe-the cosmic microwave background, the CMB-gives us exquisite information about a time soon after the Big Bang.
Combined with our cosmological model, we can predict how fast the universe should be expanding now; that modern expansion rate is characterized by the Hubble constant.
On the other hand, observations of supernova explosions allow us to track that expansion history in the relatively modern universe.
Exploding white dwarf stars-type Ia supernovae-have very predictable intrinsic brightnesses, allowing us to determine distances based on how faint they appear to us.
And those distances give us another measure of the Hubble constant, in a sense more directly measuring that modern expansion rate.
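The standard-candle logic can be made concrete with the distance modulus relation, m − M = 5 log₁₀(d / 10 pc). The following is a toy sketch, not the calibrated pipeline used in real supernova cosmology; the peak magnitude and example numbers are illustrative assumptions.

```python
import math

# Approximate peak absolute magnitude of a type Ia supernova (assumed value).
M_IA = -19.3

def luminosity_distance_pc(apparent_mag, absolute_mag=M_IA):
    """Distance in parsecs from the distance modulus m - M = 5*log10(d/10pc)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A type Ia supernova observed at apparent magnitude 24 would sit at
# roughly 4,600 megaparsecs -- a fair fraction of the observable universe.
d_mpc = luminosity_distance_pc(24.0) / 1e6
```

In the real analysis the absolute magnitude itself has to be calibrated off closer distance indicators, which is exactly the "chain of calibration steps" discussed below.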
If our measurements are good and our cosmological model is good then the CMB and supernova results should match.
But they don't.
The supernova program finds an expansion rate a few percent higher than the extrapolation of the CMB measurements.
So, either the CMB or the supernova result is flawed, or the cosmological model we use to extrapolate between the two is wrong.
This so-called Hubble tension has been lurking around for some years, and stubbornly refuses to go away even as we improve our measurements.
That hints that we may really need new physics in our cosmological model.
For example, that model assumes constant dark energy-but what if dark energy really changes over time?
But before we jump to conclusions, it's absolutely critical that we make independent measurements of the expansion history.
The supernova results are particularly vulnerable to systematic errors because they need to be bootstrapped off a series of distance calibration steps-more room for weak links in the calculation chain.
Now, some would say-me included-that one of the most promising independent methods is something called time-delay cosmography.
This is the method outlined by Sjur Refsdal and was applied to the eponymous supernova.
But it turns out we can also use a different cosmic cataclysm that's far brighter and more common than the supernova-the quasar-and the best time-delay cosmography so far has been done with quasars.
So I will start by talking about quasars, but supernovae do have a bright future in this field as we'll see.
So a quasar is what you get when the supermassive black hole in the center of every galaxy enters a feeding frenzy, as happens from time to time.
"Active galactic nucleus" is the more general term; quasars are the bright ones we care about.
So, matter driven into the galactic core is caught in the colossal gravitational field and spirals in to feed the black hole.
The gas also generates such prodigious heat on its way that these "accretion disks" glow bright enough to be seen across the universe.
Quasars are messy eaters.
The infalling material comes in fits and bursts and the bright central region fluctuates wildly.
This causes the entire beast to flicker on timescales of hours to months.
By itself a quasar isn't great for measuring distances.
But if we introduce a new level of chaos then the magic happens.
In Einstein's universe, light does not travel to us in straight lines.
Mass curves spacetime, and light is bound to follow those curves.
We see the effects of gravitational lensing everywhere in the universe.
Most of the lensing is subtle-manifesting as changing brightness and warping of shapes.
But in rare cases, a "gravitational lens" is so perfectly situated between us and a distant source that the latter's light follows multiple paths to reach us.
We literally see the same object through different paths through space, and that looks like duplications of the source around the lens.
This is "strong lensing" and strongly lensed quasars are going to become our new cosmic expansion tracker.
The core concept is simple enough.
The two or four different paths taken by the quasar's light have different lengths, and so the light we see in each of the images has taken a different amount of time to reach us.
So, although we're looking at the same quasar in each, we're looking at the quasar as it was at slightly different times.
And this is where the quasar variability comes in.
Its brightness fluctuations are pretty random, but they still form distinct patterns in the lightcurve-the plot of brightness over time.
If we can record the lightcurves of different images in one lensed quasar, then we can find where the lightcurve patterns match up and that gives us the time difference of the paths of different images.
We call this the time delay.
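The pattern-matching step can be sketched in a few lines: simulate a flickering lightcurve, observe it twice with one copy lagged, then slide the curves against each other and pick the lag that maximizes their correlation. This is a deliberately simplified toy (a smoothed random walk standing in for quasar variability, no seasonal gaps); real analyses use far more careful curve-shifting techniques.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quasar lightcurve: a random walk in brightness (assumed model).
signal = np.cumsum(rng.normal(size=1200))
true_delay = 50  # days; the second image's light arrives 50 days later

# Two lensed images of the same source, each with measurement noise.
image_a = signal[:1000] + rng.normal(scale=0.2, size=1000)
image_b = signal[true_delay:true_delay + 1000] + rng.normal(scale=0.2, size=1000)

def best_delay(a, b, max_lag=200):
    """Return the lag (in samples) maximizing the correlation of overlaps."""
    scores = [np.corrcoef(a[lag:], b[:len(b) - lag])[0, 1]
              for lag in range(max_lag + 1)]
    return int(np.argmax(scores))

estimated = best_delay(image_a, image_b)  # recovers ~50 days
```

The gaps and blending described later in the episode are exactly what make this step much harder on real data than in this idealized version.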
That in turn gives us a way to measure the actual distances to the lens and the quasar.
This is the first level description.
Let's dig deeper because it's not as easy as it sounds.
First up-the actual math.
It's based on Fermat's principle of least time-light will always take the path between two points that takes the least time.
This can be used to derive the bending of light by regular lenses, but it also applies to gravitational lenses.
The paths taken in a gravitational lens represent local minima in the light travel time-paths for which a slightly deviated photon would take longer to reach us.
The path lengths give us one aspect of the time delay.
The other is the Shapiro or gravitational time delay: just passing through different parts of the galaxy's gravitational field leads to an additional time difference.
The resulting time delay formula is surprisingly simple for such a complex system.
And the most important thing about this expression is that the front term scales with the Hubble constant-with the expansion speed of the universe.
That means the time delay goes straight to a Hubble constant measurement.
And none of this depends on a chain of other distance measurements.
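The expression itself isn't shown in this transcript; for reference, the standard textbook form (a sketch with conventional lensing notation, not a quote from the episode) combines both contributions just described:

```latex
% Delay between images at sky positions \theta_i and \theta_j of a source at \beta:
\Delta t_{ij} = \frac{D_{\Delta t}}{c}
    \left[ \phi(\vec{\theta}_i,\vec{\beta}) - \phi(\vec{\theta}_j,\vec{\beta}) \right],
\qquad
\phi(\vec{\theta},\vec{\beta}) = \frac{(\vec{\theta}-\vec{\beta})^2}{2} - \psi(\vec{\theta})
% The first term in \phi is the geometric (path-length) delay; the second,
% the lens potential \psi, is the Shapiro delay. The front factor
% D_{\Delta t} = (1 + z_d)\, D_d D_s / D_{ds} is the "time-delay distance":
% since cosmological distances scale as c/H_0, we get \Delta t \propto 1/H_0.
```

So a measured time delay, plus a lens model for ψ, pins down D<sub>Δt</sub> and hence the Hubble constant.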
Sounds too easy-so why hasn't this become the standard for measuring cosmic expansion instead of the supernova method?
Well several reasons actually.
One is the rarity of good gravitationally lensed quasars.
These objects are rare already due to the precise alignment needed between observer, lens and quasar.
And most lensed quasars aren't great for this purpose due to the next issue: modeling the lensing galaxy.
This is by far the biggest challenge.
Galaxies make pretty terrible lenses compared to precision-ground glass ones. They are lumpy, irregular objects made of stars and dark matter.
In order to properly calculate time delays we need to know the gravitational field of that galaxy.
It's kind of amazing that lens modeling is even possible at all.
One particular trick is especially important: because the distances involved are so vast compared to the size of the galaxy, we can pretend that all of the galaxy's mass is collapsed into a flat plane.
This is the thin-lens approximation, and it turns gravitational lensing into a classic geometric optics problem.
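To give a feel for the geometric-optics picture, here is a toy thin-lens calculation for the simplest possible deflector, a point mass, giving the angular scale (the Einstein radius) on which images get split. The numbers and the naive distance arithmetic are illustrative assumptions, not how real lens modeling is done.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
MPC = 3.086e22     # megaparsec, m

def einstein_radius_arcsec(mass_msun, d_lens_mpc, d_source_mpc):
    """Point-mass Einstein radius theta_E = sqrt(4GM D_ls / (c^2 D_l D_s))."""
    d_l = d_lens_mpc * MPC
    d_s = d_source_mpc * MPC
    # Crude shortcut: cosmological distances don't simply subtract,
    # but it's good enough for an order-of-magnitude toy.
    d_ls = d_s - d_l
    theta = math.sqrt(4 * G * mass_msun * M_SUN * d_ls / (C**2 * d_l * d_s))
    return theta * 206265  # radians -> arcseconds

# A 10^12 solar-mass galaxy halfway to a source 2000 Mpc away
# splits images by about an arcsecond or two:
theta_e = einstein_radius_arcsec(1e12, 1000, 2000)
```

That arcsecond scale is why the lensed images are so tightly blended from the ground, a problem the episode returns to below.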
So we build a lens model based on the light of the lensing galaxy's stars and the positions of the lensed images themselves.
That works pretty well when the lensing galaxy is simple-and that requirement for an easily modelable lens reduces the number of useful objects dramatically.
But even in the case of a simple lens, there are confounding factors that we can't completely rule out.
Even if we can see the light from the lens's stars, we can't see its dark matter, so we have to make some assumptions about how it's spread through the galaxy.
We also have to try to model the gravitational effects of nearby objects like other galaxies.
One of the trickiest issues is called the mass-sheet degeneracy.
We tend to assume that the lensing galaxy is the only significant source of bending for a given quasar's light.
But that light is actually traveling billions of light years, and may well pass through other gravitational fields-other "mass sheets"--on the way.
It's very difficult to rule that out.
These issues are best solved by making a different type of measurement of the lens's gravitational field.
The stars of a galaxy move in the mutual gravity of each other.
And it's possible to get a bead on that motion by looking for average Doppler shifts in their starlight.
This lets us build a much less ambiguous map of the galaxy's gravitational field.
But these sorts of "kinematic" measurements are very difficult to make, and typically only possible for relatively nearby lensing galaxies.
Okay, so let's say we have a perfect lens model.
How do we get time delays?
Well, by monitoring the system for years with a decent-sized professional telescope and building a lightcurve for each image in the lens.
But even building the lightcurve isn't easy.
The separation between lens images is tiny and they're often blended together when observed from the ground.
They need to be pried apart with clever algorithms.
That plus the faintness of these objects means every measurement has an uncertainty attached to it.
On top of that we can't monitor continuously.
There are gaps in the lightcurve during the day and a long "season gap" every year when the Sun gets in the way of our lens.
All of this adds to the uncertainty in the time delay measurement when we try to line up the light curves from different quasar images.
Combine the observing constraints with the lens model uncertainties and we end up with some big error bars in our measurement of the time delay, and in the resulting Hubble constant for any one lens.
OK, so where are we with this whole program?
In 2019, the H0LiCOW collaboration published their latest measurement of H0 based on the six best lensed quasars, getting a value of 73.3, plus or minus about 1.8, kilometers per second per megaparsec.
And just last year H0LiCOW's successor, TDCOSMO, improved the analysis and added two more lenses to get a slightly smaller 71.8.
These numbers are more consistent with the late-universe supernova value than the CMB value, but the error bars are larger than the supernova uncertainties.
Taken on their own the time delay cosmography results don't confirm the tension between the late and early expansion measurements.
But when these new results are combined with the supernova results, they actually do strengthen that tension, suggesting a less than one in 10 million chance that these values differ due to random noise alone.
But we're not quite there in confirming the Hubble tension.
There could be some unknown systematic error that is pushing the supernova result to be higher than the CMB result.
In that case the lensing value on its own isn't tight enough to confirm the tension.
Now I'll come back to how that's soon going to change.
First though, quasars are awesome for this work because they are, perhaps horrifically, extremely abundant in the universe.
But time-delay cosmography was first conceived for use with supernovae, so how do supernovae do?
Well, precise measurement of the time delay for the late fifth image of Supernova Refsdal gave us a Hubble constant of around 70, but with pretty large error bars.
There's also the lensed supernova SN H0pe, which is extra special because it's the same sort of type Ia supernova that we use for our other late-universe Hubble constant measurements.
So in a sense, this one gave us two independent measures of the Hubble constant in one object.
In practice these were combined: the lensing side gave us a fairly direct H0 measure, while the type Ia supernova analysis allowed us to break some of the degeneracies in the lensing analysis.
H0pe yielded a Hubble constant that's also consistent with other modern-universe expansion measurements, but again with huge error bars.
For both of these supernovae, the error bars are large, but no larger than we get for any single quasar.
The advantage of the quasars is we have more of them.
Statistics lets us whittle down uncertainty.
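The way many noisy lenses beat one good lens can be shown with an inverse-variance weighted mean: with N independent measurements of similar quality, the combined uncertainty shrinks like 1/√N. The numbers below are made up for illustration, not actual H0LiCOW values.

```python
def combine(values, sigmas):
    """Inverse-variance weighted mean and its uncertainty."""
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return mean, sigma

# Six hypothetical single-lens H0 estimates (km/s/Mpc), each with a
# +/- 5 uncertainty, roughly the precision of one lensed quasar:
h0_values = [74.2, 71.6, 73.0, 70.1, 75.5, 72.4]
h0_sigmas = [5.0] * 6

mean, sigma = combine(h0_values, h0_sigmas)
# Combined uncertainty is 5.0 / sqrt(6), about 2 km/s/Mpc.
```

Real combinations also have to track correlated systematics between lenses, which is why simply piling on objects isn't quite the whole story.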
In general, increasing the number of both lensed quasars and lensed supernovae is going to be the key to moving forward.
And that is about to happen.
In early 2026 the Vera Rubin Observatory will start its LSST survey, in which it'll image the entire southern sky every few days for a full 10 years.
Combined with other new facilities like ESA's Euclid satellite, we expect to discover thousands of new lensed quasars and hundreds of lensed supernovae.
It's been estimated that the H0 uncertainty will be pushed down to a percent or less, which, if it happens, will render the Hubble tension incontrovertible.
But Rubin may be able to do a lot more.
With enough lensed quasars and supernovae, we may be able to do more than measure the modern expansion rate.
We may be able to trace the evolution of the expansion history over cosmic time.
In the current standard cosmological model-Lambda CDM-dark energy is assumed to have a constant density.
But as I mentioned earlier, there's evidence that this isn't the case.
In 2025 the Dark Energy Spectroscopic Instrument team reported the results of its expansion history study using baryon acoustic oscillations, and obviously we covered that.
When combined with the CMB and supernova results, DESI reports a strong indication that dark energy has weakened over cosmic time.
This was the dark energy news of the decade.
If we can measure that change somewhat accurately, we can actually start to nail down the physics of dark energy-and figure out what the stuff actually is.
Time-delay cosmography with gravitational lensing may be the key to doing that.
In fact, this could be the most important result that will come out of Rubin-LSST, and many people are working on it.
No big deal really.
We just clock the flickering echoes of a thousand ancient cataclysms and discover the culprit behind our accelerating spacetime.















