
Supernovae evidence for foundational change to cosmological models

PaulHoule

When I worked at arXiv one of my coworkers was a fresh astrophysics PhD who was cynical about the state of the field. He thought that we didn't know what the hell was going on with accretion disks but that a few powerful people in the field created the impression that we did and that there was no dissent because it was so difficult to get established in the field.

When I first saw the ΛCDM model my first impression was that I didn't believe it. It seemed bad enough to have dark matter that we didn't understand (though WIMPs and axions are plausible), but adding equally mysterious and physically unmotivated dark energy made it seem like just an exercise in curve fitting.

There has been a longstanding problem that the history of the universe and the cosmological distance scale haven't made sense.

https://medium.com/starts-with-a-bang/the-hubble-tension-sti...

When I was getting my PhD in condensed matter physics I was going to the department colloquium all the time and seeing astrophysics talks about how some people thought the Hubble constant was 40 km/s/Mpc and others thought it was 80 km/s/Mpc. With timescape cosmology maybe they were both right.

Another longstanding problem in astronomy is that since the 1970s it's been clear we have no idea of how supermassive black holes could have formed in the time we think the universe has existed. With the JWST there is a flood of results suggesting the first 500 million years of the universe probably lasted a lot longer than 500 million years.

https://iopscience.iop.org/article/10.3847/2041-8213/ac9b22

Maro

I was doing an astrophysics PhD 15 years ago, and one of the many reasons I abandoned it is exactly this. To get published, I would have had to start all my papers introducing and assuming the ΛCDM model, even though it just didn't seem right to me (too many "dark" components, too many assumptions, inflation).

To be fair, people a lot smarter than me think it's good, or good enough.

Keysh

> There has been a longstanding problem that the history of the universe and the cosmological distance scale haven't made sense. https://medium.com/starts-with-a-bang/the-hubble-tension-sti...

> When I was getting my PhD in condensed matter physics I was going to the department colloquium all the time and seeing astrophysics talks about how some people thought the Hubble constant was 40 km/s/Mpc and others thought it was 80 km/s/Mpc. With timescape cosmology maybe they were both right.

You're (mis)remembering a different (old) problem and confusing it with a new one. The problem in the 1970s and 1980s was: what is the local expansion rate of the universe? Where "local" means "within a few hundred megaparsecs". There were two main groups working on the problem: one group tended to find values of around 50 km/s/Mpc and the other values of around 100. Gradually they began to converge (in the early 1990s, the low-H0 group getting values of around 60, the high-H0 group values of around 80), until a consensus emerged that it was in the low 70s, which is where we are now.

The "Hubble tension" is a disagreement between what we measure locally (i.e., a value in the low 70s) and what theory (e.g., LCDM) says we should measure locally, if you extrapolate the best-fitting cosmological models -- based on cosmological observations of the CMB, etc. -- down to now (a value in the upper 60s). This has only become a problem very recently, because the error bars on the local measurement and the cosmological predictions are now small enough to suggest (maybe/probably) meaningful disagreement.

> Another longstanding problem in astronomy is that since the 1970s it's been clear we have no idea of how supermassive black holes could have formed in the time we think the universe has existed. With the JWST there is a flood of results suggesting the first 500 million years of the universe probably lasted a lot longer than 500 million years https://iopscience.iop.org/article/10.3847/2041-8213/ac9b22

That's not a "longstanding" problem, it's a problem from the last 25 years or so. In order for there to be a problem, you have to have what you think are reliable estimates for the age of the universe and evidence for large supermassive black holes very early in the universe. This is something that has emerged only relatively recently.

(Your link, by the way, is to a paper that has nothing to do with black holes.)

somenameforme

Slightly tangential but how does the ad hoc nature of things like cosmic inflation seemingly not bother more people? Quite the opposite, it's a rather lauded discovery. This is more cosmology than astronomy, but at least reasonably related. That topic alone destroyed my interest in an academic pursuit of astronomy.

'Here's an idea that makes no logical sense, has no physical argument whatsoever (let alone evidence) in support of its existence, and just generally seems completely absurd - but if we ignore all of that, it solves a lot of other pesky problems with reality, as observed, practically falsifying other lauded theories.'

Just add more epicycles?

lutorm

Well, it's not like people pulled it out of thin air. Both inflation and the lambda-CDM models are solutions to the GR equations, so in that sense it's perfectly justifiable to see if general relativity can explain the data. I don't think it's fair to say that it "makes no logical sense".

misja111

I assume that OP was talking about the cosmic inflation theory that claims there was a rapid expansion immediately after the big bang. I don't see how that's a solution to the GR equations; could you maybe explain/give a link?

stouset

Nobody’s happy with dark energy, it’s just the only framework we have that fits the data. All the other ideas might be brilliant and inspired but are measurably worse at describing the real world.

Even the name “dark energy” is a tacit acknowledgment that—along with dark matter—we have no clue what the underlying physics actually is.

cryptonector

How is timescape measurably worse? It's not a new theory, so perhaps it's been tested, and if it's failed why is it back in the news? (Sometimes failures get back in the news. It's a fair question.)

gosub100

What do you mean by "bother"? The universe appears to be expanding, so I assume you don't deny that evidence, correct?

colechristensen

On the topic of early black hole growth I saw this released a couple of months ago, an early black hole apparently growing at 40x the Eddington limit 1.5 billion years after the big bang.

https://chandra.si.edu/press/24_releases/press_110424.html

> A super-Eddington-accreting black hole ~1.5 Gyr after the Big Bang observed with JWST

https://www.nature.com/articles/s41550-024-02402-9

dotancohen

Correct me if I'm wrong, but the term Eddington limit is a bit misleading as it does not describe some physical rate that cannot be exceeded. Lots of super Eddington objects are known.

PaulHoule

It's the point where light pressure can blow off the outer layers of a star:

https://en.wikipedia.org/wiki/Eddington_luminosity

Objects that pulse like

https://en.wikipedia.org/wiki/Eta_Carinae

can evade it and there are other ways too.

When it comes to super-massive black holes there is the question of how quickly stuff can even get close enough to the black hole to get into the accretion disk.

500M years is a long time for the kind of large star that becomes a black hole (it blows up in 10M years or so). But if one black hole is going to merge with another black hole, and that one with another, and so on, there is no Eddington limit (no EM radiation!) but rather the even slower process of shedding angular momentum via gravitational radiation. (One highlight of grad school was the colloquium talk where we got to hear the signal from two black holes colliding, almost 20 years before it was detected for real.)
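
As a rough back-of-the-envelope for why the timing is tight (standard textbook relations; the round numbers are assumptions of mine, not from the thread): at the Eddington limit with ~10% radiative efficiency, a black hole's mass e-folds roughly every 50 Myr, so growing a stellar-mass seed to a billion solar masses needs close to a Gyr of uninterrupted maximal accretion.

    # Back-of-the-envelope sketch (assumed round numbers, not from the thread):
    # Eddington-limited growth time for a supermassive black hole.
    import math

    G       = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
    c       = 2.998e8     # speed of light, m/s
    m_p     = 1.673e-27   # proton mass, kg
    sigma_T = 6.652e-29   # Thomson cross-section, m^2
    yr      = 3.156e7     # seconds per year

    # Eddington (Salpeter) timescale M c^2 / L_Edd -- independent of mass
    t_edd = c * sigma_T / (4 * math.pi * G * m_p)

    eps = 0.1                             # assumed radiative efficiency
    t_efold = eps / (1 - eps) * t_edd     # mass e-folding time at the Eddington limit

    n_efolds = math.log(1e9 / 10.0)       # grow a 10 Msun seed to 1e9 Msun
    print(f"e-folding time ~ {t_efold / yr / 1e6:.0f} Myr")
    print(f"growth time    ~ {n_efolds * t_efold / yr / 1e9:.1f} Gyr")
    # ~50 Myr per e-fold and ~18 e-folds => ~0.9 Gyr of continuous
    # Eddington-limited accretion, i.e. longer than "the first 500 Myr".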

I hope JWST sees

https://en.wikipedia.org/wiki/Stellar_population#Population_...

Note those Pop 3 stars have a higher Eddington limit because they've got hardly any "metal" in them, which means light interacts with them differently, although astronomers have the strange (to me) convention that anything heavier than helium is a metal, which includes, say, oxygen. (As a cond-mat PhD I think a metal is something that has free electrons, which could be one of those elements towards the left side of the periodic table, or could be a doped semiconductor or a polymer like polyaniline.)

gammarator

Here’s an extended comment by another astrophysicst: https://telescoper.blog/2025/01/02/timescape-versus-dark-ene...

The most important bit:

> The new papers under discussion focus entirely on supernovae measurements. It must be recognized that these provide just one of the pillars supporting the standard cosmology. Over the years, many alternative models have been suggested that claim to “fix” some alleged problem with cosmology only to find that it makes other issues worse. That’s not a reason to ignore departures from the standard framework, but it is an indication that we have a huge amount of data and we’re not allowed to cherry-pick what we want.

throwawaymaths

the thing is, this is not really an alternative model. it's rather actually bothering to do the hard math based on existing principles (GR) and existing observations, dropping the fairly convincingly invalidated assumption of large scale uniformity in the mass distribution of the universe.

if anything the standard model of cosmology should at this point be considered alternative as it introduces extra parameters that might be unnecessary.

so yeah it's one calculation. but give it time. the math is harder.

sandgiant

This has the same number of free parameters as LambdaCDM. Also, this result only looks at supernovae, i.e. low-redshift sources. LambdaCDM is tested on cosmological scales.

Very interesting, but “more work is needed”.

throwawaymaths

that's not the case if, as is increasingly speculated, the lambda is not constant over time. you figure two parameters for linear and three for a quadratic expression

bsder

> dropping the fairly convincingly invalidated assumption of large scale uniformity in the mass distribution of the universe.

The problem with that is then you need a mechanism that creates non-uniformly distributed mass.

Otherwise, you are simply invoking the anthropic principle: "The universe is the way it is because we are here."

throwawaymaths

> The problem with that is then you need a mechanism that creates non-uniformly distributed mass.

you need no such thing. that's like saying "i refuse to acknowledge the pacific ocean to be so damn large without a mechanism". you don't need that. it just is. this doesn't preclude the existence of such a mechanism. but for any (legit) science, mechanistic consideration should be strictly downstream of observation.

marcyb5st

I think that can be mitigated in three ways: our understanding of inflation is flawed, there were more "nucleation" sites where our universe came to be, and there are the already theorized baryonic acoustic oscillations that could introduce heterogeneity in the universe.

Maybe it's a combination of these, maybe something else. If nothing else, uniformity is less probable than a mass distribution with variance (unless there is a phenomenon like inflation that smooths things out, but that too was introduced to explain the assumption of a homogeneous universe). I concede, however, that explaining the small variance in the CMB with our current understanding is hard when dropping the homogeneity assumption.

jcarreiro

> The problem with that is then you need a mechanism that creates non-uniformly distributed mass.

The mechanism is gravity; and we have good observational evidence that the mass distribution of the universe is not uniform, at least at the scales we can observe (we can see galaxy clusters and voids).

zmgsabst

You don’t need a mechanism to point out a fact contradicts an assumption, eg, our measurements show non-uniform mass at virtually all scales (including billions of light years). There simply is no observable scale with uniform mass.

Obviously there’s some mechanism which causes that, but the mere existence of multi-billion light year structures invalidates the modeling assumption — that assumption doesn’t correspond to reality.

User23

"The calculation is harder" in a world of functionally limitless compute is sort of interesting. Where do we go from here?

austin-cheney

That sounds like regression.

If this problem of regression occurs as regularly as your quote implies then the fault is not in these proposed alternatives, or even in the likely faulty existing model, but in the gaping wide holes for testing these things quickly and objectively. That is why us dumb software guys have test automation.

abdullahkhalids

You are oversimplifying science, especially theoretical physics. At the point where we are, there are no quick/cheap tests, and there is no objectivity. The space of possible correct theories is infinite, and humans are simply not smart enough to come up with frameworks to objectively truncate the space. If we were, we would have made progress already.

There is a lot of subjectivity and art to designing good experiments, not to mention a lot of philosophical insight. I know a lot of scientists deny the role of philosophy in science, but I see all the top physicists in my fields liberally use philosophy - not philosopher type philosophy but physicist type philosophy - to guide their scientific exploration.

naasking

> and humans are simply not smart enough to come up frameworks to objectively truncate the space.

We are, but some people stubbornly resist such things. For instance, MOND reproducing the Tully-Fisher relation and being unexpectedly successful at making many other predictions suggests that any theories purporting to explain dark matter/gravitational anomalies should probably have MOND-like qualities in some limit. That would effectively prune the space of possible theories.

Instead, they've gone in the complete opposite direction, basically ignoring MOND and positing different matter distributions just to fit observations, while MOND, against all odds since it's not ultimately correct, continues to make successful predictions we're now seeing in JWST data.

austin-cheney

I am not. You are using bias as an excuse to qualify poor objectivity. I am fully aware that astrophysics contains a scale and diversity of data beyond my imagination, but that volume of data does not excuse an absence of common business practices.

> The space of possible correct theories is infinite

That is not unique to any form of science, engineering, or even software products.

> and humans are simply not smart enough to...

That is why test automation is a thing.

bubblyworld

I think automated hypothesis testing against new data in science is itself an incredibly difficult problem. Every experiment has its own methodology and particular interpretation, often you need to custom build models for your experimental setup to test a given hypothesis, there are lots of data cleanup and aggregation steps that don't generalise, etc. My partner is in neuroscience, for instance, and merging another lab's data into their own workflows is a whole project unto itself.

Test automation in the software context is comparatively trivial. Formal systems make much better guarantees than the universe.

(not to say I think it's a bad idea - it would be incredible! - but perhaps the juice isn't worth the squeeze?)

austin-cheney

> Every experiment has its own methodology

That is bias. Bias is always an implicit default in any initiative and requires a deliberate concerted effort to identify.

None of what you said is unique to any form of science or engineering. Perhaps the only thing unique to this field of science, as well as microbiology, is the sheer size and diversity of the data.

From an objective perspective test automation is not more or less trivial for any given subject. The triviality of testing is directly determined by the tests written and their quality (speed and reproducibility).

The juice is always worth the squeeze. It's a business problem that can be answered with math in consideration of risk, velocity, and confidence.

JumpCrisscross

Reading this as a layman, it looks like releasing ΛCDM's cosmological principle [1] reveals the nontrivial temporal effects mass clusters have via general relativity. As a result, there could be empty regions of space in which billions of years more have elapsed than in e.g. a galaxy. This not only changes how we interpret supernova data (the acceleration isn't generally happening, but is an artefact of looking through space which is older than our own), but may also negate the need for dark matter (EDIT: dark energy) and the meaning of a single age of our universe.

(I'm also vaguely remembering a multi-universe model in which empty space inflates quicker than massed space.)

[1] https://en.wikipedia.org/wiki/Cosmological_principle

throwawaymaths

> there could be empty regions of space in which billions of years more have elapsed than in e.g. a galaxy.

important to note that the motivation for releasing the cosmological principle is that we know that there are "small" voids and that there is strong evidence of much larger voids and structure on the scale of tens of billions of light years that is incompatible with the cosmological principle, so it's not just a thing to do on a whim, it's supported by observation.

JumpCrisscross

> we know that there are "small" voids and that there is strong evidence of much larger voids and structure on the scale of tens of billions of light years that is incompatible with the cosmological principle

Two cosmologists debate which of their town’s bars is better, the small one or the large one. The town has one bar.

throwawaymaths

fair, I should have also put voids in quotes. is the Black Sea part of the Med and is the Med part of the Atlantic?

aeve890

>Reading this as a layman, it looks like releasing ΛCDM's cosmological principle

You mean relaxing. Also... "as a layman"? Lol what kind of layman are you. Respect.

JumpCrisscross

> You mean relaxing

Fair enough, at high redshift the cosmological principle could still hold under timescape. (It doesn't require it, however.)

All that said, I'm generally sceptical about findings based on supernova data. They require so much statistical work to interpret correctly that the error rate on first publications is remarkably high.

Keysh

> there could be empty regions of space in which billions of years more have elapsed than in e.g. a galaxy.

A problem with that idea would be that galaxies in low-density regions (including voids) tend to be younger than galaxies in denser regions, suggesting that galaxy evolution proceeds more slowly in voids.

https://www.iaa.csic.es/en/news/galaxies-great-cosmic-voids-...

sesm

Overturning the Lambda-CDM model removes only one observation that is explainable by dark matter (the peaks in the spectrum of the CMB). It's not the only observation.

throwawaymaths

well the edges of a galaxy are in a less deep gravity well than the center, so time and thus rotation should go faster. is that enough to account for flat rotation curves? i don't know enough to do a back of the envelope calculation
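
A naive weak-field sense of scale (round numbers assumed here, nothing timescape-specific): the clock-rate difference across a Milky Way-like galaxy is of order 10^-6, while flat rotation curves are an order-unity anomaly in velocity.

    # Naive weak-field estimate (assumed round numbers, not from the thread):
    # size of gravitational time dilation across a Milky Way-like galaxy.
    G     = 6.674e-11        # m^3 kg^-1 s^-2
    c     = 2.998e8          # m/s
    M_sun = 1.989e30         # kg
    kpc   = 3.086e19         # m

    M_visible = 1e11 * M_sun # rough luminous mass interior to the Sun's orbit
    r         = 8 * kpc      # roughly the Sun's galactocentric radius

    # fractional clock-rate difference between radius r and far away: |Phi| / c^2
    dilation = G * M_visible / (r * c**2)
    print(f"{dilation:.1e}")  # ~6e-7

    # A ~1e-6 clock-rate effect is nowhere near an order-unity velocity anomaly
    # on its own; the timescape argument leans on cumulative, non-linear
    # averaging effects rather than this naive estimate.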

escape_goat

Back here on the lay benches, I think the best starting point on Wikipedia is probably the article on inhomogeneous cosmology, of which the Timescape Cosmology proposed by David Wiltshire (listed as an author on this paper) in 2007 is a notable example; it is discussed in the article.

<https://en.wikipedia.org/wiki/Inhomogeneous_cosmology>

Gooblebrai

This is a mind-blowing theory!

astrobe_

> As a result, there could be empty regions of space in which billions of years more have elapsed

If they are empty, those billion years didn't happen. But nothing is really empty, right?

pdonis

> If they are empty, those billion years didn't happen.

No, that's not correct. Here's a better way to look at it:

In our cosmological models, we "slice up" the spacetime of the universe into slices of "space at a constant time"--each slice is like a "snapshot" of the space of the entire universe at a single instant of "cosmological time". The models, which assume homogeneity and isotropy, take the actual elapsed proper time at every point in space in each "snapshot" to be the same--in other words, that "cosmological time" is also proper time for comoving observers everywhere in space at that instant of cosmological time--the time actually elapsed since the Big Bang on a clock moving with each observer.

What these supernova papers are examining is the possibility that "cosmological time" and proper time (clock time) for comoving observers do not always match: roughly speaking, in areas with higher mass concentration (galaxy clusters), proper time lags behind cosmological time (the time we use in the math to label each "snapshot" slice of the space of the universe), and in areas with lower mass concentration (voids), proper time runs ahead of cosmological time. The idea is that this mismatch between proper time and cosmological time can be significant enough to affect the inferences we should be drawing from the supernova observations about the expansion history of the universe.
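
A minimal weak-field sketch of that mismatch (illustrative framing only, not the actual machinery of the papers):

    % Perturb an FLRW background with a Newtonian potential \Phi; a comoving
    % observer's proper time then relates to cosmological time t as
    \[
      d\tau \;\simeq\; \Bigl(1 + \frac{\Phi}{c^{2}}\Bigr)\, dt ,
    \]
    % so in overdense regions (\Phi < 0) proper time lags cosmological time,
    % while in voids (\Phi > 0 relative to the mean) it runs ahead. The timescape
    % claim is that this effect, accumulated over a void-dominated universe,
    % is large enough to bias supernova distance inferences.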

As far as I know the jury is still out on all this; claims by proponents that what is presented in these papers is already sufficient to require "foundational change" are, I think, premature. But it is certainly a line of research that is worth pursuing.

lukasb

Does the idea of a single cosmological time even make sense? I thought one of the key parts of relativity is that which events happen simultaneously depends on your perspective.

le-mark

As a layman, what I don’t get is; the speed of light is constant, so wouldn’t that nullify any time/space fluctuations due to lack of mass/gravity?

ben_w

Assuming I correctly understood the argument in the link:

Even if the space was truly empty, the expansion of that space would have gone on for longer, and thus things on opposite sides would eventually notice they were more distant.

But also yes the space isn't really totally empty.

nine_k

The higher the density, the more curved the spacetime is in that area, and the slower the passage of time. You don't have to go to extremes like black holes vs absolute vacuum. A sufficient difference should be visible between regions closer to centers of galaxies, or just clusters of nearby galaxies, and really large "voids" between them, which contain some matter, and even some stars, but are vastly more empty. This is what the article explores.

(This connects in a funny way to Vernor Vinge's SF idea of slower and faster areas of space. The "high" / "fast" space is mostly empty, so time passes there faster than in the "unthinking depths" around galactic cores, and hugely more progress is made by civilizations in the "fast" space, as observed from the "slow" space.)

skirmish

> Vernor Vinge's SF idea of slower and faster areas of space

His ideas were more ambitious: there are advanced technologies (e.g. FTL travel) that work in "fast" space but completely stop working in the "slow zone" (where the Solar system is located). On the other hand, even human-level intelligence would stop functioning close to the galactic center, the crew would not be able to operate the ship and would be stranded.

asplake

It’s early days on this, so let me ask again what I have asked previously: What does timescape do to estimates of the age of the universe?

sigmoid10

It would mean that we literally can't calculate it anymore, because expansion and everything else we see might just be artefacts of inhomogeneities beyond the scale of the observable universe. But that would crash hard with our observation of the CMB and since this study only looks at supernovae, I would not bet on it holding up for long.

geysersam

How would

> that we literally can't calculate it (the age of the universe) anymore

crash with our observation of the CMB?

I don't see how us being unable to calculate a quantity from one set of observations could possibly clash with another set of observations (the CMB).

What am I missing?

hnuser123456

The CMB suggests we get a picture of the entire early universe.

However, other things are suggesting we might not be seeing the whole universe just by looking as far away as possible. It could be we can see some regions on the CMB that have already expanded outside of our observable universe. These regions aren't just "even fainter and we need a better telescope", they're "the last photon from that region that will ever reach us came and left billions of years ago."

Therefore, there might not be one singular Hubble constant, there might be two: one that applies to our local observable universe, and one that applies to the entire universe.

It could be that the universe is 26 billion years old: https://academic.oup.com/mnras/article/524/3/3385/7221343?lo...

And because at great enough distances (and times) expansion is faster than light, we simply can't see a significantly different epoch of the universe just by looking deeper.

User23

Even more fun, once you abandon isotropy you don't even need to posit matter inhomogeneities. It could just be that spacetime itself has irregular topology.

Which, incidentally, is probably a better theory than dark matter. For example it can produce the same results without the problem of undetectable matter.

cryptonector

Presumably we should be able to build a theory of spacetime that yields no need for dark matter, but we're not there yet.

On the other hand, determining the local time dilation factor based on all mass beyond the local area is essentially not possible. We can talk about how the great voids have less time dilation than galaxy clusters, sure. But what about the universe as a whole? Our universe could be embedded in a larger one that contributes to time dilation in ours and we could never sense that. Time dilation at cosmological scales is relative for this reason.

block_dagger

Maybe Vernor Vinge was right.

Vecr

He wasn't. He stated from the start that all of his stories were gimmicked to remove the singularity.

This theory does not do that.

incognito124

Interesting that the following wiki has been updated with this paper:

https://en.wikipedia.org/wiki/Inhomogeneous_cosmology

SaintSeiya

The ΛCDM model always felt "wrong" in my gut: dark matter? dark energy? It's just the modern equivalent of the aether theory. The fact that an alternative is more complex to calculate is not an excuse to prefer the ΛCDM model. The "God did it" theory is even simpler, as it spares us any math and physics, yet we do not use it.

dgroshev

Previously (includes informed critique of the paper): https://news.ycombinator.com/item?id=42495703

jmward01

The expansion of the universe has always come down to one question for me: in an expanding universe, when you throw a ball up, at what speed does it come down?

pezezin

My very limited understanding of the topic is that for a gravitationally bound system like the Earth, the usual rules apply, but on cosmological scales the expansion of the universe means that time-translation invariance doesn't hold, and thus conservation of energy is not well-defined.

https://en.wikipedia.org/wiki/Conservation_of_energy#General...

jmward01

Getting rid of the time side then, think of it as an orbit. If the universe were expanding, then something could be orbiting slower than gravity would allow. Basically this question keeps bringing me back to the ties between mass and the expansion of the universe. No matter how you look at it, mass must be special when it comes to expansion, because it is either giving off 'free' energy in the form of slow orbits and acceleration between two objects, or it -isn't- giving off that energy and something is canceling it out. Following this rabbit hole is pretty interesting at a minimum.

thomquaid

if you throw at less than escape velocity, it comes down with about the energy you threw it with, less system losses. if you throw at greater than escape velocity, about any energy is possible, less system losses, if you allow enough time for it to come back after its trip around the solar system. if you throw it into the Milky Way, same thing, easier potentials. if you throw it at relativistic velocities, the expansion of the universe could play a significant role.
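
For a sense of scale (assumed round values; a gravitationally bound Earth-ball system doesn't actually participate in the expansion, so this is only an upper bound): the naive Hubble-flow velocity across a throw-sized distance is immeasurably small.

    # Sense-of-scale sketch (assumed values, not from the thread): naive
    # Hubble-flow velocity across an everyday distance.
    Mpc = 3.086e22          # m
    H0  = 70e3 / Mpc        # ~70 km/s/Mpc expressed in 1/s

    r = 1e3                 # a 1 km throw, in m
    print(f"{H0 * r:.1e} m/s")  # ~2e-15 m/s -- utterly negligible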

nimish

Finally. We should use numerical relativity and simulate using full-fat GR. Not the half-assed approximations.

XorNot

Yes, I'm sure the problem was that no physicist working in the field for their entire career thought to just do this.

nimish

Well, no one bothered to actually implement it, so who cares whether they thought of it first or not?

russdill

Spoiler alert, this paper does not reach these conclusions by doing the thing you are asking.