Why Everything in the Universe Turns More Complex
105 comments
· April 14, 2025
bubblyworld
I think speculative science always starts out as philosophy. This is as true now as it was in the 18th century. If you look at any thinker on the edge of human understanding you'll find something similar (e.g. I was reading Michael Levin's stuff on bioelectricity recently and it also has a heavy dose of philosophy).
I don't really have an issue with any of the points you raised - why do they bother you?
The interesting stuff is the discussion about "functional information" later in the paper, which is their proposed quantitative measure for understanding the evolution of complexity (although it seems like early stages for the theory).
It's "just" a slight generalisation of the ideas of evolution but it applies to nonbiological systems and they can make quantitative predictions. If it turns out to be true then (for me) that is a pretty radical discovery.
I'm looking forward to seeing what can be demonstrated experimentally (the quanta article suggests there is some evidence now, but I haven't yet dug into it).
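For concreteness, the measure in question (due to Hazen and Szostak) is F(x) = -log2 of the fraction of all configurations whose degree of function is at least x. A toy sketch of the shape of the definition; the bitstring system and the popcount "fitness" are my own inventions, not anything from the paper:

```python
import math

# Functional information (Hazen/Szostak): for a degree of function x,
#   F(x) = -log2( fraction of configurations with function >= x )
# Toy system (invented for illustration): length-10 bitstrings,
# where the "function" of a configuration is its popcount.
N = 10

def function(c):
    return bin(c).count("1")

def functional_information(x):
    frac = sum(1 for c in range(2 ** N) if function(c) >= x) / 2 ** N
    return -math.log2(frac)
```

The measure behaves as you'd hope: a threshold every configuration meets carries zero information, and the rarer the function, the more bits it takes to specify (`functional_information(10)` is exactly 10 bits, since only 1 configuration in 1024 qualifies).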
cryptonector
> I don't really have an issue with any of the points you raised - why do they bother you?
Idk about GP, but bad science writing ("identification of conceptual equivalencies ...") does bother me. It's sloppy, and tends to hide possibly invalid shortcuts taken by the authors by being an impenetrable fog of words. That sort of thing is a very good indicator of bunk, and it tends to peg my BS meter. Which isn't to say that there is no place for that sort of language in a scientific paper, but that one should preface the use of it with an admission of hand-waving for some purpose.
haswell
> I think speculative science always starts out as philosophy. This is as true now as it was in the 18th century.
Indeed, and Natural Philosophy was the precursor to what we now call Science.
I still think the old name better fit what we’re doing because it admits that the work is still a philosophical endeavor.
This is not to question the validity of what we now call science. But it's common these days to believe in the ultimate supremacy of science as the answer to questions that are best explored both philosophically and scientifically, even though pure science still can't answer the important philosophical questions that the entire scientific discipline rests upon.
analog31
Tell me about the supremacy of science after the government restores the NIH, NOAA, etc. In fact most people in the world believe in the supremacy of their religious faiths.
ysofunny
> I think speculative science always starts out as philosophy
or in my words: "the first approximation is poetic. the last one is mathematical"
from philosophy to hard science and engineered tooling and other products (and/or services)
similarly to
from poetry as dubious, cloudy, and vague ideas all the way to crystal clear, fixed and unmoving (dead) formalizations
raxxorraxor
I believe model and concept can be equivalent, not sure about the required formal terminology in English.
Complexity is probably most formally modeled as entropy in thermodynamics, although it behaves in the opposite direction to what these ideas and observations suggest it should.
It still raises questions about the reason for this complexity, and there is no scientific answer aside from "probably accidental complexity".
Science is curious, so it probably shouldn't be dismissed over unmet formal requirements that aren't specified. "Layman" is unspecific, so what would your requirements be exactly?
coldtea
>- "identification of conceptual equivalencies among disparate phenomena were foundational to developing previous laws of nature" - what exactly is a "conceptual equivalence"? You mean models?
No, a model is not an "identification of conceptual equivalencies among disparate phenomena". It's a simplified representation of a system.
"identification of conceptual equivalencies among disparate phenomena were foundational to developing previous laws of nature" could be called an analogy, an isomorphism, a unifying framework, etc.
>Unifying disparate observations into models is basic science. Not sure why it is highlighted here as some important insight.
Perhaps because the most important insights are the most basic ones - it's upon those that everything else sits.
>At this point, I gave up
If you can't be bothered to go beyond the abstract or first paragraph, or are perplexed that the abstract has a 10,000 ft simplistic introduction to the basics, then it's better that you did :)
EncomLab
"Complexity" is a hugely problematic term when used in this way - remember that entropy and complexity are related, but they are not interchangeable. A complex system can have lower entropy than a simpler system, and conversely, a system can have high entropy but be relatively simple. By mingling these terms without specifying objective reference points, it all just comes out as word salad.
This paper just reads like an attempt at sounding smart while actually saying little.
titzer
> a system can have high entropy but be relatively simple.
Good examples of these are anything that Kolmogorov-compresses well. For example, by almost any measure the output of a pseudo random number generator has high entropy. Yet it has low information density (low complexity), as the program that generates the sequence, plus its state, is really small.
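A quick way to see both halves of that claim in code (a sketch; the seed and sizes are arbitrary choices of mine): the byte stream defeats a statistical compressor, yet the handful of lines that regenerate it are its real description length.

```python
import random
import zlib

random.seed(42)
data = bytes(random.randrange(256) for _ in range(10_000))

# A statistical compressor finds no redundancy: the stream looks like noise...
assert len(zlib.compress(data)) > 9_000

# ...yet seed + generator reproduce it exactly, so its Kolmogorov
# complexity is roughly the size of this tiny program, not 10,000 bytes.
random.seed(42)
assert data == bytes(random.randrange(256) for _ in range(10_000))
```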
andrewflnr
I think a better example is just hot gas. Heat up a tube of gas, and its entropy will increase, with no effect on its complexity. Still not terribly compressible either though.
pyfon
Yes indeed. As I understand it, entropy is about states that are more likely.
I wonder if it always increases though? Eventually there will be enough entropy that any change may cause it to reduce or oscillate? (At universe / reachable universe scale).
kergonath
> I wonder if it always increases though?
It always increases in an isolated system. That caveat is almost always missing in pop-sci level of discussions about entropy, but it is crucial.
> Eventually there will be enough entropy that any change may cause it to reduce or oscillate?
Assuming that the universe is actually an isolated system, entropy will reach a maximum (it cannot oscillate). It is interesting to speculate, and of course our theories are imperfect and we are certainly missing something. In particular, the relationship between time and entropy is not straightforward. Very roughly: is the entropy a function of time, which we could define otherwise, or is time a consequence of entropy changes?
In the first case, we can suppose that if the universe reaches an entropy maximum we’d be far enough outside the conditions under which our theories work that we’d just have entropy decrease with time (i.e., the rule that entropy increases with time is only valid close to our usual conditions).
But in the second case, it would mean that the universe reached the end of time. It could evolve in any conceivable way (in terms of the fundamental laws of physics), and the arrow of time would always point to the same moment. "What comes after?" would be a question just as meaningless as "what came before the Big Bang?"
In any case, there are a lot of assumptions and uncertainty. The story does not do the subject any justice.
andrewflnr
Yes, we call that state "heat death". Note that the second law is actually that entropy never decreases; it's allowed to stay constant for certain interactions (for instance I'm pretty sure an elastic collision preserves entropy).
ysofunny
that is why the complex is distinct from the complicated
kens
Coincidentally, I'm reading Walker's book "Life as No One Knows It: The Physics of Life's Emergence" on the same topic. (Walker is one of the researchers in the article.) Summary: I don't like the book. The book was motivating me to write an article "Books I don't like", but I'll comment here instead :-)
The book describes "Assembly Theory", a theory of how life can arise in the universe. The idea is that you can quantitatively measure the complexity of objects (especially chemicals) by the number of recursive steps to create them. (The molecule ATP is 21 for instance.) You need life to create anything over 15; the idea of life is it contains information that can create structures more complex than what can be created randomly. The important thing about life is that it isn't spontaneous, but forms an unbroken chain through time. Explaining how it started may require new physics.
If the above seems unclear, it's because it is unclear to me. The book doesn't do a good job of explaining things. It looks like a mass-market science book, but I found it very confusing. For instance, it's unclear where the number 21 for ATP comes from, although there's an analogy to LEGO. The book doesn't define things and goes into many, many tangents. The author is very, very enthusiastic about the ideas but reading the book is like looking at ideas through a cloud of vagueness.
The writing is also extremely quirky. Everyone is on a first-name basis, from Albert (Einstein) to Johnny (von Neumann) and Erwin (Schrödinger). One chapter is written in the second person, and "you" turn out to be "Albert." The book pushes the idea that physics is great and can solve everything, covering physics "greatest hits" from relativity and quantum mechanics to gravitational waves and the Higgs boson. (The underlying theme is: "Physics is great. This book is physics. Therefore, this book is great.") The book has a lot of discussion of how it is a new paradigm, Kuhn's paradigm shifts, how it will move astrobiology beyond the pre-paradigmatic phase and unify fields of research and so forth. It's not a crackpot book, but there are an uncomfortable number of crackpot red flags.
I'm not rejecting the idea of assembly theory. To be honest, after reading the book, I don't understand it well enough to say which parts seem good and which parts seem flawed. There seem to be interesting ideas struggling to get out but I'm not getting them. (I don't like to be negative about books, but there are a few that I regret reading and feel that I should warn people.)
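For what it's worth, the counting idea is easier to see on strings than on molecules. A toy "assembly index" (my own illustrative reconstruction, not the book's chemical definition): start from single characters, let each step concatenate two already-built strings with reuse allowed, and take the minimum number of steps to reach the target.

```python
from itertools import product

def assembly_index(target: str) -> int:
    """Minimum number of join steps to build `target`, where each step
    concatenates two already-built strings (reuse allowed).
    Brute-force BFS over sets of built substrings; short targets only."""
    subs = {target[i:j] for i in range(len(target))
            for j in range(i + 1, len(target) + 1)}
    start = frozenset(target)              # basic units: distinct characters
    frontier, seen, steps = {start}, {start}, 0
    while not any(target in state for state in frontier):
        nxt = set()
        for state in frontier:
            for a, b in product(state, repeat=2):
                joined = a + b
                # prune: only substrings of the target can ever be useful
                if joined in subs and joined not in state:
                    ns = state | {joined}
                    if ns not in seen:
                        seen.add(ns)
                        nxt.add(ns)
        frontier, steps = nxt, steps + 1
    return steps
```

Reuse is what makes repetition cheap: `assembly_index("AAAA")` is 2 (A → AA → AAAA). The book's claim, as I read it, is the chemical analogue: molecules whose index exceeds roughly 15 don't form without life carrying the assembly information forward.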
roughly
Walker gave a talk recently at Long Now on Assembly Theory that sounds like it did a better job of getting the point across:
aradox66
I felt similar reading that book. She seems very clear that she wants to develop paradigmatic physics, and wants Assembly Theory to be paradigmatic, but there's not a lot of meat on the bone.
Viliam1234
> It's not a crackpot book, but there are an uncomfortable number of crackpot red flags.
How do you know it's not a crackpot book? All evidence you mentioned here seems to support that conclusion.
andrewflnr
Amateur speculation, but informed by professionals: I think this tendency toward complexity is situational, not fundamental. Specifically, it's a product of this stage of the universe having lots of available energy. More complex structures are favored when/because they can consume more energy and increase entropy more effectively. The complexity will probably start fading when the hydrogen-fusion party dies. The second law will continue on its way.
Kungfuturtle
This reminds me of Teilhard de Chardin's take on complexification, as laid out in his seminal book Le Phénomène humain. See e.g., this article[0] for a simple overview of the hypothesis. For further reading, I recommend the excellent new translation by Sarah Appleton-Weber, The Human Phenomenon[1].
[0] <https://onlinelibrary.wiley.com/doi/pdf/10.1002/%28SICI%2910...>
[1] <https://www.liverpooluniversitypress.co.uk/doi/book/10.3828/...>
OgsyedIE
I'm fairly sure this is already in the usual canon of statistical mechanics.
"When one compares a hotplate with and without a Benard cell apparatus on top, there is an overall increase in entropy as energy passes through the system as required by the second law, because the increase in entropy in the environment (at the heat sink) is greater than the decreases in entropy that come about by maintaining gradients within the Benard cell system."
gsf_emergency
The abstract heresy innuendo'd here seems to be about an increase in global (aka universal) "complexity"*
(Think: no heat death!)
Related to another heresy understated by qmag just this week: https://news.ycombinator.com/item?id=43665831
In that case, qmag didn't (dare to?) shout loud enough that the para-particles are globally ?distinguishable..
That's like a very restricted version of TFA's claim though..
Another take on the issue:
https://scottaaronson.blog/?p=762
*I don't want to say "entropy" because it's not clear to many folks, including experts, whether entropy is uh, "correlated" or "anticorrelated" with complexity.
raxxorraxor
> "correlated" or "anticorrelated" with complexity.
Also the value of entropy has different signs in thermodynamics and computer science for example. Not helpful either...
gsf_emergency
It's because thermo counts states and CS uses probabilities.. strange swap, I know, people assume the opposite..
adrian_b
Sentences like this, i.e. "everything turns more complex", must be formulated much more precisely in order to become true.
The article talks a lot about biological evolution, but in that case the only claim that is likely to be true is that the complexity of the entire biosphere increases continuously, unless a catastrophe resets the biosphere to a lower complexity.
If you look only at a small part of the biosphere, like one species of living beings, it is extremely frequent to see that it evolves to become simpler, not more complex, because a simpler structure is usually optimal for constant environmental conditions; the more complex structures are mainly beneficial for avoiding extinction when the environmental conditions change.
kdavis
“The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations - then so much the worse for Maxwell's equations. If it is found to be contradicted by observation - well, these experimentalists do bungle things sometimes. But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” ― Arthur Eddington, New Pathways in Science
EVa5I7bHFq9mnYK
Entropy is always increasing in a closed system, but locally it can decrease, if energy is supplied from the outside. Us evolving on Earth comes at the expense of increased entropy of the Sun.
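The bookkeeping behind that is simple to write down. A minimal numeric sketch (the heat and temperatures are made-up illustrative values): heat Q leaving a hot reservoir at T_hot lowers that reservoir's entropy, but the colder reservoir at T_cold gains more, so the isolated pair's total still goes up.

```python
Q = 1000.0                    # J of heat transferred (illustrative)
T_hot, T_cold = 500.0, 250.0  # K, reservoir temperatures (illustrative)

dS_hot = -Q / T_hot           # hot reservoir *loses* entropy: -2 J/K
dS_cold = Q / T_cold          # cold reservoir gains more:     +4 J/K

# a local decrease (dS_hot < 0) is fine; the isolated total still rises
assert dS_hot < 0 and dS_hot + dS_cold > 0
```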
mr_toad
> Entropy is always increasing in a closed system
Only if that system isn’t already in thermodynamic equilibrium. A closed system that reaches thermodynamic equilibrium has maximum entropy.
Why the universe as a whole didn't start out in thermodynamic equilibrium, i.e., doesn't have maximum entropy, is something we don't understand.
EVa5I7bHFq9mnYK
If it were so, there would be no one to ask that question.
api
Maybe it's not a closed system.
https://en.wikipedia.org/wiki/Black_hole_cosmology
I'm partial to the hypothesis that our universe is actually a giant black hole in some kind of larger universe. The Big Bang was really the formation of our universe's event horizon. Cosmic inflation is the result of stuff falling into our universe, adding to its mass-energy -- there is no dark energy, our universe is just accreting mass-energy from something larger.
As for what the larger universe looks like -- in this model it may be impossible to know because the event horizon is impenetrable. It could be a much larger universe or it could be something else, like a higher dimensional one.
ThrowawayTestr
I read a theory that life in the universe might be favorable because we increase entropy so much.
pyfon
Life in the universe is pretty unfavourable! A rare thing indeed. Where it has evolved I think it is less about entropy and more about the nature of the matter - atoms, molecules. Particularly carbon and water. And the way they can replicate themselves through chemistry. That had to obey entropy but is not driven by it. Light scattering off the atmosphere will do the entropy trick well enough!
Gualdrapo
I remember reading somewhere that maybe the purpose of life is to increase entropy in the universe. If that is true and we haven't found any sound evidence of life elsewhere, I don't know.
kmoser
Where did you read this? "Purpose" is a very loaded word. If life has any purpose at all, it's to reproduce and propagate one's genes. Additional entropy just sounds like an inevitable side-effect of that.
__MatrixMan__
"to reproduce and propagate one's genes" feels like a better fit for the purpose of an organism.
If you subscribe to the big bang theory (and the idea that the purpose of a system is what it does), then the universe's purpose is to walk a path from low entropy to high entropy. Of what use is life, in such an endeavor? Well, life tends to seek out bits of stuck energy (food/fuel) and release it (metabolism/economy)--moving the universe further along on its path.
This gives a sort of answer to the question: "why bother having life at all?" And so I think the entropy purpose makes sense - more so than having it just be a side effect. Nobody will ever be absolutely right or wrong about such things (purposes), but they're handy to have around sometimes.
prabhu-yu
Can life evolve to slow down the process of increasing entropy? For example: the Sun is throwing energy into space. What if life tries to store it and use it only when it needs it? Had the sunlight gone into space (without being captured by fossilized life), it would have thinly spread out in the universe (high entropy, low energy density). But plants and humans (solar cells) capture it to create fossil fuels or some infrastructure... Is that not life going against this theory? Or is it just an intermediate step, and life will eventually blast all the energy out in a short period at the end, like an exponential system does?
justanotherjoe
What I find compelling is how it works at low and high levels. Low level because we dissipate energy just by being a living creature. And the high level because as you said, we as a civilization can't seem to escape it, and want to use pockets of low entropy like mineral veins and fuels. Until all is spent i guess. You don't mention how unsympathetic that purpose is, though. At that point any purpose you make for yourself is better than that one even if it's true.
gsf_emergency_2
Stafford Beer, "The Purpose of a System is What it Does (POSIWID)", very hot right now..
https://www.astralcodexten.com/p/come-on-obviously-the-purpo...
https://en.wikipedia.org/wiki/The_purpose_of_a_system_is_wha...
ImHereToVote
Stop it. My eyes can only roll so much.
firecall
Agreed about purpose being a loaded term.
It's my, somewhat lazy, philosophical opinion, that there isn't any purpose and there doesn't need to be one.
I don't see why the universe would need a purpose for anything. Things are what they are. Things changing state. Entropy.
I see reproduction as more of built in motivation to our system than a purpose as such. But that's semantics, and my purpose in life is not to argue about words! ;-)
justanotherjoe
Could be. Could also be that reproduction and propagation are the inevitable side effect of that, no? We can't dissipate energy when we're dead.
guerrilla
Rather a way to accomplish that. Life reproducing in order to accelerate the generation of entropy, in other words.
kouru225
Pretty sure this is what Schrödinger's opinion is in his book "What Is Life?", but I haven't read it. Maybe OP got it from that.
mjan22640
Reproduction is not really a purpose. What makes copies of itself, happens to persist.
justinator
It tracks, though "attaining a higher state of entropy" is just what universes generally do, it seems, given the n of 1 universes we've started to evaluate.
Though, I'm not sure if life is the best at it, when compared to say a black hole. Some smart apes burning off fossil fuels seems pretty insignificant in comparison -- or even seeing what our own Sun does in a few seconds.
File that under the "The Earth will be fine in the long run, it's humans that are f'd" George Carlin POV. Maybe when we start building Death Stars (plural)
nayuki
I read somewhere that life is more efficient at dissipating energy and faster at increasing entropy than non-living physical/chemical phenomena. Citation needed.
floatrock
Right, it's less about the purpose of life (which implies a directive force) and more that a characteristic of life is that it's an emergent complexity that finds more efficient ways of increasing entropy.
It gets a bit blurry when you start to substitute "life" for any "complex cosmological system" though...
flanked-evergl
I think it was from Sean Carroll's book The Big Picture.
The statement is a category error, but that criticism distracts from the very valuable insight he does provide regarding entropy, life and complexity.
He did a series on minutephysics explaining it quite well, worth a watch. He does explain why complexity increases as entropy increases (with some additional qualification).
https://www.youtube.com/playlist?list=PLoaVOjvkzQtyZF-2VpJrx...
perrygeo
POSIWID. Life on earth's primary "purpose" if observed from space would be to dissipate low-entropy solar radiation, using it to build temporary structures out of carbon.
It is puzzling why life isn't more common. Perhaps dissipative self-organizing structures are everywhere - stars, solar systems and galaxies themselves maintain their order by dissipating energy. They just don't look like "life" to us.
__MatrixMan__
I have lost the book, but I think I read this in "What is Life? And Other Scientific Essays" by Erwin Schrödinger. If I recall, it was one of the "Other Scientific Essays."
tiborsaas
It is only relatively recently that we have had good enough tooling to even talk about discovering bio- and technosignatures in the atmospheres of exoplanets. I'm really hoping that we will find some undeniable evidence in my lifetime.
robocat
Surely you mean accelerate entropy.
I presume the end-state of entropy would be the same (excluding ways to escape the universe).
seydor
Isn't that like saying that "some things take time"? Complexity also takes time to develop, through a myriad of probabilities. We even define complexity in terms of things taking time, or equivalent space/memory. As the authors say, functional information of physical systems is very difficult to quantify. Until then, this is another formulation of the anthropic principle, but with complexity instead of humanity.
afpx
Pretty cool. I often wondered if the universe was evolving similar to natural selection via a reinforcement learning process. Wave function collapses to the value that maximizes some objective function.
How would you test for it though? I've seen enough residual data from RL processes to almost see semblances of patterns that could be extracted and re-applied at a macro scale.
fpoling
The thing that is often missed in debates about entropy and the Universe is that the classical notion of entropy is not compatible with General Relativity. Richard Tolman, almost 100 years ago, proposed an extension that was compatible.
One of the consequences of that extension was a possibility of a cyclic universe. On expansion one sees that classically defined entropy increases but then it will decrease on contraction.
These days that work is pretty much forgotten, but it still showed that with GR the heat death of the universe was not the only option.
flanked-evergl
There is https://en.wikipedia.org/wiki/Conformal_cyclic_cosmology
If I had to bet money on it, I would say it's right, especially in light of things like this: https://phys.org/news/2025-03-ai-image-recognition-universe....
mr_mitm
Heat death was never the only option in GR. The field equations always allowed for a big crunch or a big rip.
fpoling
Yes, but that implies that in GR entropy or at least the value based on the classical definition can decrease.
So apparent increase in complexity can be attributed to gravity.
gsf_emergency_2
Sean Carroll, today's go-to person for GR, has been working at popularizing these ideas (for more than 10 years!):
https://arxiv.org/abs/1405.6903
>For example, our universe lacked complex structures at the Big Bang and will also lack them after black holes evaporate and particles are dispersed.
See my comment below for link to Scott's preview.
hliyan
Tried reading the paper [1]. I understand the authors are academics, which is why I'm surprised the paper reads like a layman's attempt at contributing to a "theory of everything", or at best, an inquiry written by an 18th-century European philosopher of science.
- "identification of conceptual equivalencies among disparate phenomena were foundational to developing previous laws of nature" - what exactly is a "conceptual equivalence"? You mean models? Unifying disparate observations into models is basic science. Not sure why it is highlighted here as some important insight.
- "The laws of classical physics emerged as efforts to provide comprehensive, predictive explanations of phenomena in the macroscopic world" - followed by a laymen's listing of physical laws, then goes on to claim "conspicuously absent is a law of increasing “complexity.”"
- then a jumble of examples including gravitation, stellar evolution, mineral evolution and biological evolution
- this just feels like a slight generalization of evolution: "Systems of many interacting agents display an increase in diversity, distribution, and/or patterned behavior when numerous configurations of the system are subject to selective pressure."
At this point, I gave up.
[1] https://www.pnas.org/doi/10.1073/pnas.2310223120