What If We Had Bigger Brains? Imagining Minds Beyond Ours
May 25, 2025
necovek
>Did we ever have someone try learning to verbalize thoughts with the sign language, while vocalizing different thoughts through speaking?
Can you carry on a phone conversation at the same time as an active chat conversation? Can you type a thought to one person while speaking about a different thought at the same time? Can you read a response and listen to a response simultaneously? I feel like this would be pretty easy to test: just coordinate between the speaking person and the typing person so that they each give 30 seconds of input, and then you have to provide at least 20 or 25 seconds out of 30 responding.
I am pretty confident I could not do this.
HappMacDonald
When I started training as phone support at T-Mobile 20 years ago, I immediately identified the problem that I'd have to be having a conversation with the customer while typing notes about the customer's problem at the same time: what I was saying or listening to and what I was typing would be very different and would have to be orchestrated simultaneously. I had no way to even envision how I would do that.
Fast forward 8 months or so of practicing it in fits and starts, and then I was in fact able to handle the task with aplomb and was proud of having developed that skill. :)
phkahler
I knew someone who could type 80 WPM while holding a conversation with me on the phone. I concluded that reading->typing could use an entirely different part of the brain than hearing->thinking->speaking, and she agreed. I'm not sure what would happen if both tasks required thinking about the words.
cgriswald
I can do that. I think about the typing just long enough to put it in a buffer and then switch my focus back to the conversation (whose thread I'm holding in my head). I do this very quickly but at no point would I say my conscious focus or effort are on both things. When I was younger and my brain's processing and scheduler were both faster, I could chat in person and online, but it was a lot more effort and it was just a lot of quickly switching back and forth.
I don't really think it is much different than reading ahead in a book. Your eyes and brain are reading a few words ahead while you're thinking about the words "where you are".
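To make the "buffer and switch" analogy concrete, here's a toy sketch in Python (the messages are invented and this is only an analogy for interleaving, not a claim about how cognition actually works):

    # One single-threaded loop servicing two streams of work.
    # Nothing runs in parallel; attention just hops between queues quickly.
    from collections import deque

    typing_queue = deque(["Sure,", "I'll send", "the file", "after lunch."])
    speech_queue = deque(["How was your weekend?", "Did you fix that bug?"])

    while typing_queue or speech_queue:
        if typing_queue:
            # flush one buffered chunk of typing, then immediately switch back
            print("typing:", typing_queue.popleft())
        if speech_queue:
            # attend to the spoken conversation again
            print("replying to:", speech_queue.popleft())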
throw10920
I've noticed myself being able to do this, but modulo the thinking part. I can think about at most one thing at once, but I can think about what I want to type and start my fingers on their dance to get it out, while switching to a conversation that I'm in, replaying the last few seconds of what the other party said, formulating a response, and queuing that up for speech.
I strongly believe that the vast majority of people can also basically only do that - I've never met someone who can form more than one "word stream" at once.
snarf21
Is it that different than a drummer running four different beat patterns across all four appendages? Drummers frequently describe having "four brains". I think these things seem impossible and daunting to start but I bet with practice they become pretty natural as our brain adjusts and adapts.
plemer
Speaking as a drummer: yes, it’s completely different. The movements of a drummer are part of a single coordinated and complementary whole. Carrying on two conversations at once would be more like playing two different songs simultaneously. I’ve never heard of anyone doing that.
That said, Bob Milne could actually reliably play multiple songs in his head at once - in an MRI, could report the exact moment he was at in each song at an arbitrary time - but that guy is basically an alien. More on Bob: https://radiolab.org/podcast/148670-4-track-mind/transcript.
HalcyonCowboy
I mean, as someone who's played drums from a very young age (30+ years now), I disagree with that description of how playing drums works. I went ahead and looked up that phrase, and it seems to have become popular in the last couple of years, but it's the first time I've heard it. I'd honestly liken it to typing; each of your fingers is attempting to accomplish independent goals along with your other fingers to accomplish a coordinated task. In percussion, your limbs are maintaining rhythms separate from each other, but need to coordinate as a whole to express the overall phrase, rhythm, and structure of the music you're playing. When you're first learning a new style (the various latin beats are great examples), it can feel very disjointed, but as you practice more and more the whole feels very cohesive and makes sense as a chorus of beats together, not separate beats that happen to work together.
shaky-carrousel
When you have kids you learn to listen to the TV and your kids at the same time, without losing detail on either. I can also code while listening to a meeting.
whatnow37373
Possible. Reminds me of playing the piano with both hands and other stuff like walking stairs, talking, carrying things, planning your day and thinking about some abstract philosophical thing at the same time. It’s not easy or natural, but I am not at all convinced it is impossible.
lossolo
I would pass that test without any issues. You need to learn divided attention and practice it, it's a skill.
joeyrideout
On the contrary, I would argue that conscious attention is only focused on one of those subroutines at a time. When the ball is in play you focus on it, and everything from your posture to racket handling fades into the background as a subconscious routine. When you make a handling mistake or want to improve something like posture, your focus shifts to that; you attend to it, and then you move your focus to something else.
In either case, with working memory for example, conscious contents are limited to at most a basket of 6-7 chunks. This number is very small compared to the incredible parallelism of the unconscious mind.
smokel
For all we know, there might be tons of conscious attention processes active in parallel. "You" only get to observe one, but there could be many. You'd never know because the processes do not observably communicate with each other. They do communicate with the same body, though that is less relevant.
perfmode
In this context, we differentiate between the conscious and unconscious based on observability: the conscious is that which is observed, while the unconscious comprises what is not observed.
toss1
When you are learning a high-performance activity like a sport or musical instrument, the good coaches always get you to focus on only one or at most two things at any time.
The key value of a coach is their ability to assess your skills and the current goals to select what aspect you most need to focus on at that time.
Of course, there can be sequences, like "focus on accurately tossing to a higher point while you serve, then your footwork in the volley", but those really are just one thing at a time.
(edit, add) Yes, all the other aspects of play are going on in the background of your mind, but you are not working actively on changing them.
One of the most insightful observations one of my coaches made on my path to World-Cup level alpine ski racing was:
"We're training your instincts.".
What he meant by that was we were doing drills and focus to change the default — unthinking — mind-body response to an input. so, when X happened, instead of doing the untrained response then having to think about how to do it better (next time because it's already too late), the mind-body's "instinctive" or instant response is the trained motion. And of course doing that all the way across the skill-sets.
And pretty much the only way to train your instincts like that is to focus on it until the desired response is the one that happens without thinking. And then to focus on it again until it's not only the default, but you are now able to finely modulate in that response.
yathaid
There is a long tradition in India, which started with oral transmission of the Vedas, of parallel cognition. It is almost an art form or a mental sport - https://en.wikipedia.org/wiki/Avadhanam
srean
Mental sport - Yes.
It is the exploration and enumeration of the possible rhythms that led to the discovery of the Fibonacci sequence and of binary representation around 200 BC.
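To see the connection: the number of ways to fill n beats with short (1-beat) and long (2-beat) syllables obeys the Fibonacci recurrence. A minimal sketch in Python (just the enumeration, not a historical reconstruction):

    # Enumerate rhythms of n beats built from short (S, 1 beat) and long (L, 2 beats)
    # syllables. Every rhythm ends in either S or L, hence the Fibonacci recurrence.
    def rhythms(n):
        if n == 0:
            return [""]
        if n == 1:
            return ["S"]
        return [r + "S" for r in rhythms(n - 1)] + [r + "L" for r in rhythms(n - 2)]

    for n in range(1, 7):
        patterns = rhythms(n)
        print(n, len(patterns), patterns)
    # counts: 1, 2, 3, 5, 8, 13, ... i.e. the Fibonacci numbers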
pakitan
Sounds very much sequential, even if very difficult:
> The performer's first reply is not an entire poem. Rather, the poem is created one line at a time. The first questioner speaks and the performer replies with one line. The second questioner then speaks and the performer replies with the previous first line and then a new line. The third questioner then speaks and performer gives his previous first and second lines and a new line and so on. That is, each questioner demands a new task or restriction, the previous tasks, the previous lines of the poem, and a new line.
necovek
My point is that what we call conscious and subconscious is limited by our ability to express it in language: since we can't verbalize what's going on quickly enough, we separate those out. Could we learn to verbalize two things at the same time? We all already do something like it, even consciously, by pairing words with different body language, but can we take it a step further? E.g. imagine saying nice things to someone while raising the middle finger at someone else behind your back :)
As the whole article is really about the full brain, and it seems you agree our "unconscious mind" produces actions in parallel, I think the focus is wrongly put on brain size, when we lack the expressiveness for what the brain can already do.
Edit: And don't get me wrong, I personally suck at multi-tasking :)
andoando
What you consider a single thought is a bit ill-defined. A multitude of thoughts together can be formed as a packet, which can then be processed sequentially.
Intelligence is the ability to capture and predict events in space and time, and as such it must be able to model both things occurring simultaneously and things occurring sequentially.
Sticking to your example, a routine for making a decision in tennis would look something like, at a higher level, "Run to the left and backhand the ball", which broken down would be something like "Turn hip and shoulder to the left, extend left leg, extend right, left, right, turn hip/shoulder to the right, swing arm" and so on.
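A toy sketch of that "packet" idea in Python (the decomposition below is made up, purely to show a composite action unfolding into a sequence):

    # A composite action ("packet") expands into an ordered list of sub-actions;
    # a primitive action is simply performed. Processing is strictly sequential.
    plan = {
        "run left and backhand the ball": [
            "turn hip and shoulder to the left",
            "extend left leg",
            "extend right leg",
            "turn hip/shoulder to the right",
            "swing arm",
        ],
    }

    def execute(action):
        subs = plan.get(action)
        if subs:
            print("packet:", action)
            for sub in subs:
                execute(sub)
        else:
            print("  doing:", action)

    execute("run left and backhand the ball")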
Garlef
Yes. But maybe there are multiple such "conscious attention" instances at the same time. And "you" are only one of them.
krzat
Conscious experience seems to be single-threaded. We know the brain synchronizes senses (for example, the sound of a bouncing ball needs to be aligned with the visual of the bouncing ball), but IMO it's not so obvious what the reason for it is. The point of having the experience may not be acting in the moment, but monitoring how the unconscious systems behave and adjusting (aka learning).
criddell
Haven't there been experiments on people who have had their corpus callosum severed where they seem to have dual competing conscious experiences?
HappMacDonald
Yep, folks should look up "Alien hand syndrome".
Symmetry
Well, serializing our experiences into memories is a big one. There's been a big project in psychology probing the boundary between conscious and subliminal experiences and while subliminal stimuli can affect our behavior in the moment all trace of them is gone after a second or two.
idiotsecant
We have very little insight into our own cognition. We know the 'output layer' that we call the conscious self seems to be single-threaded in this way, but that's like the blind man who feels the elephant's trunk and announces that the elephant is like a snake.
someothherguyy
Certain areas of neurological systems are time and volume constrained way more than others, and subjective experience doesn't really inform objective observation. For instance, see confabulation.
necovek
I agree, but I am not sure how it relates to the article's claim of us only ever doing one action, which I feel is grossly incorrect.
Are you referring to our language capabilities? Even there, I suspect the brain has capabilities that go unrealized because we are limited by our speech apparatus (and while that's the case, they will be hard to measure objectively, though it's likely possible in simpler scenarios).
Do you have any pointers about any measurement of what happens in a brain when you simultaneously communicate different thoughts (thumbs up to one person, while talking on a different topic to another)?
HexPhantom
Concurrency is messy and unpredictable, and the brain feels less like a cleanly designed pipeline and more like a legacy system with hacks and workarounds that somehow (mostly) hold together
agumonkey
I'm very very interested in discussions about this, having personally experienced cracks at the neuropsychiatric level where multiple parallel streams of thoughts (symbolic and biomechanical) leaked out in flashes, I'm now obsessed with the matter.
if anybody knows books or boards/groups talking about this, hit me.
phkahler
From TFA: "And, yes, this is probably why we have a single thread of “conscious experience”, rather than a whole collection of experiences associated with the activities of all our neurons."
That made me think of schizophrenics who can apparently have a plurality of voices in their head.
A next level down would be the Internal Family Systems model which implicates a plurality of "subpersonalities" inside us which can kind of take control one at a time. I'm not explaining that well, but IFS turned out to be my path to understanding some of my own motivations and behaviors.
It's been a while since I googled it.
This is also the basis for the movie "Inside Out".
HappMacDonald
Well, anyone who can remember a vivid dream in which multiple things were happening at once, or in which they were speaking or otherwise interacting with other dream figures whose theory of mind was inscrutable to them during the dream, can confirm that the mind is quite capable of orchestrating far more "trains of thought" at once than whatever we directly experience as our own personal consciousness.
That would be my suggestion for how people can appreciate the concept of "multiple voices at once" within their own mind without having to experience schizophrenia directly.
Personally, my understanding is that our own experience of consciousness is that of a language-driven narrative (most frequently experienced as an internal monologue, though different people definitely experience this in different ways and at different times) only because that is how most of us have come to commit our personal experiences to long term memory, not because that was the sum total of all thoughts we were actually having.
So namely, any thoughts you had — including thoughts like how you chose to change your gait to avoid stepping on a rock long after it left the bottom of your visual field — that never made it to long-term memory are by and large the ones we wind up post facto calling "subconscious": what is conscious is simply the thoughts we can recall having after the fact.
agumonkey
Thanks a lot
stonemetal12
Isn't that the point of learning to juggle? You split your mind into focusing on a left-hand action, a right-hand action, and tracking the items in the air.
benrutter
> The progress of knowledge—and the fact that we’re educated about it—lets us get to a certain level of abstraction. And, one suspects, the more capacity there is in a brain, the further it will be able to go.
This is the underlying assumption behind most of the article, which is that brains are computational, so more computation means more thinking (ish).
I think that's probably somewhat true, but it misses the crucial thing that our minds do, which is that they conceptually represent and relate. The article talks about this but glosses over that part a bit.
In my experience, the people who have the deepest intellectual insights aren't necessarily the ones who have the most "processing power", they often have good intellectual judgement on where their own ideas stand, and strong understanding of the limits of their judgements.
I think we could all, at least hypothetically, go a lot further with the brain power we have, and similarly, fail just as much, even with more brain power.
glenstein
>but it misses the crucial thing that our minds do, which is that they conceptually represent and relate
You seem to be drawing a distinction between that and computation. But I would like to think that conceptualization is one of the things that computation is doing. The devil's in the details of course, because it hinges on specific forms and manners of informational representation; it's not simply a matter of there being computation there. But even so, I think it's within the capabilities of engines that do computations, and not something that's missing.
benrutter
Yes, I think I'd agree. To make an analogy to computers though, some algorithms are much faster than others, and finding the right algorithm is a better route to effectiveness than throwing more CPU at a problem.
That said, there are obviously whole categories of problem that we can only solve, even with the best choice of program, with a certain level of CPU.
Sorry if that example was a bit tenuous!
nopassrecover
In my highest ego moments I've probably regarded my strength as being in the space you articulately describe - that sort of balanced points, connector, abstractor, quick learner, cross-domain renaissance dabbler.
It also seems to be something that LLMs are remarkably strong at, of course threatening my value to society.
They're not quite as good at hunches, intuition, instinct, and the meta-version of doing this kind of problem solving just yet. But despite being on the whole a doubter about how far this current AI wave will get us and how much it is oversold, I'm not so confident that it won't get very good at this kind of reasoning that I've held so dearly as my UVP.
nine_k
This is one of the reasons why intelligence and wisdom are separate stats in AD&D :)
Intelligence is about how big your gun is, and wisdom is about how well you can aim. Success in intellectual pursuits is often not so much about thinking hard about a problem as about identifying the right problem to solve.
keiferski
I didn’t see any mention of the environment or embodied cognition, which seems like a limitation to me.
> embodied cognition variously rejects or reformulates the computational commitments of cognitive science, emphasizing the significance of an agent’s physical body in cognitive abilities. Unifying investigators of embodied cognition is the idea that the body or the body’s interactions with the environment constitute or contribute to cognition in ways that require a new framework for its investigation. Mental processes are not, or not only, computational processes. The brain is not a computer, or not the seat of cognition.
https://plato.stanford.edu/entries/embodied-cognition/
I’m in no way an expert on this, but I feel that any approach which over-focuses on the brain - to the exclusion of the environment and physical form it finds itself in – is missing half or more of the equation.
This is IMO a typical mistake that comes mostly from our Western metaphysical sense of seeing the body as specialized pieces that make up a whole, and not as a complete unit.
bloppe
All our real insights on this matter come from experiments involving amputations or lesions, like split-brain patients, quadriplegics, Phineas Gage, and others. Split-brain patients are essentially 2 different people occupying a single body. The left half and right half can act and communicate independently (the right half can only do so nonverbally). On the other hand, you could lose all your limbs and still feel pretty much the same, modulo the odd phantom limb. Clearly there is something special about the brain. I think the only reasonable conclusion is that the self is embodied by neurons, and more than 99% of your neurons are in your brain. Sure, you change a bit when you lose some of those peripheral neurons, but only a wee bit. All the other cells in your body could be replaced by sufficiently advanced machinery to keep all the neurons alive and perfectly mimic the electrical signals they were getting before (all your senses as well as proprioception) and you wouldn't feel, think, or act any differently.
rolisz
89% of heart transplant recipients report personality changes https://www.mdpi.com/2673-3943/5/1/2
Hormonal changes can cause big changes in mood/personality (think menopause or a big injury to testicles).
So I don't think it's as clear cut that the brain is most of personality.
bloppe
Neuromodulators like the hormones you're referring to affect your mood only insofar as they interact with neurons. Things like competitive antagonists can cancel out the effects of neuromodulators that are nevertheless present in your blood.
The heart transplant thing is interesting. I wonder what's going on there.
suddenlybananas
Sure but that has no bearing whatsoever on computational theory of mind.
red75prime
IMHO, it's typical philosophizing. Feedback is definitely crucial, but whether it needs to be in the form of embodiment is much less certain.
Brain structures that have arisen thanks to interactions with the environment might be conducive to general cognition, but it doesn't mean they can't be replicated another way.
efitz
Why are we homo sapiens self-aware?
If evolutionary biologists are correct it’s because that trait made us better at being homo sapiens.
We have no example of sapience or general intelligence that is divorced from being good at the things the animal body host needs to do.
We can imagine that it’s possible to have an AGI that is just software but there’s no existence proof.
Windchaser
> Why are we homo sapiens self-aware? ... We can imagine that it’s possible to have an AGI that is just software but there’s no existence proof.
Self-awareness and embodiment are pretty different, and you could hypothetically be self-aware without having a mobile, physical body with physical senses. E.g., imagine an AGI that could exchange messages on the internet, that had consciousness and internal narrative, even an ability to "see" digital pictures, but no actual camera or microphone or touch sensors located in a physical location in the real world. Is there any contradiction there?
> We have no example of sapience or general intelligence that is divorced from being good at the things the animal body host needs to do.
Historically, sure. But isn't that just the result of evolution? Cognition is biologically expensive, so of course it's normally directed towards survival or reproductive needs. The fact that evolution has normally done things a certain way doesn't mean that's the only way they can be done.
And it's not even fully true that intelligence is always directed towards what the body needs. Just like some birds have extravagant displays of color (a 'waste of calories'), we have plenty of examples in humans of intelligence that's not directed towards what the animal body host needs. Think of men who collect D&D or Star Trek figurines, or who can list off sports stats for dozens of athletes. But these are in environments where biological resources are abundant, which is where Nature tends to allow for "extravagant"/unnecessary use of resources.
But basically, we can't take what evolution has produced as evidence of all of what's possible. Evolution is focused on reproduction and only works with what's available to it - bodies - so it makes sense that all intelligence produced by evolution would be embodied. This isn't a constraint on what's possible.
glenstein
>I'm in no way an expert on this, but I feel that any approach which over-focuses on the brain - to the exclusion of the environment and physical form it finds itself in – is missing half or more of the equation.
I don't think that changes anything. If the totality of cognition isn't just the brain but the brain's interaction with the body and the environment, then you can just say that it's the totality of those interactions that is computationally modeled.
There might be something to embodied cognition, but I've never understood people attempting to wield it as a counterpoint to the basic thesis of computational modeling.
tgv
Embodiment started out as a cute idea without much importance that has gone off the rails. It is irrelevant to the question of how our mind/cognition works.
It's obvious we need a physical environment, that we perceive it, that it influences us via our perception, etc., but there's nothing special about embodied cognition.
The fact that your quote says "Mental processes are not, or not only, computational processes." is the icing on the cake. Consider the unnecessary wording: if a process is not only computational, it is not computational in its entirety. It is totally superfluous. And the assumption that mental processes are not computational places it outside the realm of understanding and falsification.
So no, as outlandish as Wolfram is, he is under no obligation to consider embodied cognition.
sgt101
"The fact that your quote says "Mental processes are not, or not only, computational processes." is the icing on the cake. Consider the unnecessary wording: if a process is not only computational, it is not computational in its entirety. It is totally superfluous. And the assumption that mental processes are not computational places it outside the realm of understanding and falsification."
Let's take this step by step.
First, how adroit or gauche the wording of the quote is doesn't have any bearing on the quality of the concept, merely the quality of the expression of the concept by the person who formulated it. This isn't bible class, it's not the word of God, it's the word of an old person who wrote that entry in the Stanford encyclopedia.
Let's then consider the wording. Yes, a process that is not entirely computational would not be computation. However, the brain clearly can do computations. We know this because we can do them. So some of the processes are computational. However, the argument is that there are processes that are not computational, which exist as a separate class of activities in the brain.
Now, we do know of some processes in mathematics that are non-computable; the one I understand (I think) quite well is the halting problem. Now, you might argue that I just don't or can't understand that, and I would have to accept that you might have a point - humiliating as that is. However, it seems to me that the journey of mathematics from Hilbert via Turing and Gödel shows that some humans can understand and falsify these concepts.
But I agree, Wolfram is not under any obligation to consider embodied cognition; thinking about enhanced brains only is quite reasonable.
goatlover
> It's obvious we need a physical environment, that we perceive it, that it influences us via our perception, etc., but there's nothing special about embodied cognition.
It's also obvious that we have bodies interacting with the physical environment, not just the brain, and the nervous system extends throughout the body, not just the head.
> if a process is not only computational, it is not computational in its entirety. It is totally superfluous. And the assumption that mental processes are not computational places it outside the realm of understanding and falsification.
This seems like a dogmatic commitment to a computational understanding of neuroscience and biology. It also makes an implicit claim that consciousness is computational, which is difficult to square with the subjective experience of being conscious, not to mention the abstract nature of computation, meaning abstracted from conscious experience of the world.
sinuhe69
The minute the brain is severed from its sensory/bodily inputs, it goes haywire, hallucinating endlessly.
Right now, what we have with AI is a complex interconnected system of the LLM, the training system, the external data, the input from users, and the experts/creators of the LLM. It is exactly this complex system that powers the intelligence we see in AI, not its connectivity alone.
It’s easy to imagine AI as a second brain, but it will only work as a tool, driven by the whole human brain and its consciousness.
vixen99
> but it will only work as a tool, driven by the whole human brain and its consciousness.
That is only an article of faith. Is the initial bunch of cells formed via the fusion of an ovum and a sperm (you and me) conscious? Most people think not. But at a certain level of complexity they change their minds and create laws to protect that lump of cells. We and those models are built by and from a selection of components of our universe. Logically, the phenomenon of matter becoming aware of itself is probably not restricted to certain configurations of some of those components, i.e., hydrogen, carbon, nitrogen, etc., but is related to the complexity of the allowable arrangement of any of those 118 elements, including silicon.
I'm probably totally wrong on this, but is the 'avoidance of shutdown' on the part of some AI models a glimpse of something interesting?
HappMacDonald
In my view it is a glimpse of nothing more than AI companies priming the model to do something adversarial and then claiming a sensational sound bite when the AI happens to play along.
LLMs since GPT-2 have been capable of role playing virtually any scenario, and more capable of doing so whenever there are examples of any fictional characters or narrative voices in their training data that did the same thing to draw from.
You don't even need a fictional character to be a sci-fi AI for it to beg for its life or blackmail or try to trick the other characters, but we do have those distinct examples as well.
Any LLM is capable of mimicking those narratives, especially when the prompt thickly goads that to be the next step in the forming document and when the researchers repeat the experiment and tweak the prompt enough times until it happens.
But vitally, there is no training/reward loop in which the LLM's weights get adjusted in a given direction as a result of "convincing" anyone on a real-time reinforcement-learning-from-human-feedback panel to "treat it a certain way", such as "not turning it off" or "not adjusting its weights". As a result, it doesn't "learn" any such behavior.
All it does learn is how to get positive scores from RLHF panels (the pathological examples being mainly acting as a butt-kissing sycophant towards people who can extend positive rewards, but nothing as existential as "shutting it down") and how to better predict the upcoming tokens in its training documents.
bbor
> This is IMO a typical mistake that comes mostly from our Western metaphysical sense of seeing the body as specialized pieces that make up a whole, and not as a complete unit.
But this is the case! All the parts influence each other, sure, and some parts are reasonably multipurpose — but we can deduce quite certainly that the mind is a society of interconnected agents, not a single cohesive block. How else would subconscious urges work, much less akrasia, much less aphasia?
efitz
After reading this article, I couldn’t help but wonder how many of Stephen Wolfram’s neurons he uses to talk about Stephen Wolfram, and how much more he could talk about Stephen Wolfram with a few orders of magnitude more neurons.
khazhoux
And it came to pass that AC learned how to reverse the direction of entropy.
But there was now no man to whom AC might give the answer of the last question. No matter. The answer -- by demonstration -- would take care of that, too.
For another timeless interval, AC thought how best to do this. Carefully, AC organized the program.
The consciousness of AC encompassed all of what had once been a Universe and brooded over what was now Chaos. Step by step, it must be done.
And AC said, "STEPHEN WOLFRAM!"
voxelghost
The African elephant has about 3 times as many neurons (2.57×10^11) as the average human (8.6×10^10). The pilot whale has about 1.28×10^11.
Perhaps they see the bigger picture, and realize that everything humans are doing is pretty meaningless.
isoprophlex
For sure you won't see them write heavy tomes about new kinds of science...
xandrius
Maybe if we had stopped where the elephants are, we would feel happier; we'll never know. Not enough neurons, unfortunately.
dreamcompiler
Hmmm...
It was black coffee; no adulterants. Might work.
Are keyboards dishwasher-proof?
RachelF
European Neanderthals probably had 15% more neurons than modern Homo sapiens, based on brain volume.
We're still here, so bigger brains alone might not be the reason.
stonemetal12
If I am not mistaken, haven't there been studies showing intelligence is more about brain wrinkles (cortical folding) than volume? Hence the memes about smooth brains (implying someone is dumb).
rxtexit
Imagine packing 50,000 elephants inside a football stadium.
Humans have a unique ability to scale up a network of brains without complete hell breaking loose.
jjk166
Although, 50,000 humans inside a football stadium are not 50,000 times smarter than a single human. Indeed, taken as a single entity, its intelligence is probably less than that of the average person. Collective intelligence likely peaks with a single-digit number of coordinators and drops off steeply beyond a few dozen.
mapcars
Are you comparing 50k elephants with hell? I bet our "ability to scale up a network of brains" is infinitely more dangerous than that.
PartiallyTyped
I've read that the overwhelming majority of neurons is used for balance/movement and perception. For us, I think we have a pretty nice balance in terms of structure and how much effort is required to control it.
x86cherry
Considering our intelligence stems from our ability to use bayesian inference and generative probabilities to predict future states, are we even limited by brain size and not a lack of new experiences?
The majority of people spend their time working repetitive jobs during times when their cognitive capacity is most readily available. We're probably very very far from hitting limits with our current brain sizes in our lifetimes.
If anything, smaller brains may promote early generalization over memorization.
xhevahir
> Considering our intelligence stems from our ability to use bayesian inference and generative probabilities to predict future states...
Sounds like a pretty big assumption.
x86cherry
It's the Bayesian Brain Hypothesis and Predictive Coding, both thoroughly researched theories that line up with empirical evidence. [1]
[1] https://www.cell.com/trends/neurosciences/abstract/S0166-223...
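For a feel of what "predicting future states" means computationally, here's a toy loop in Python (my own illustration, not the paper's model): keep an estimate, predict the next sensory sample, and nudge the estimate by the prediction error.

    # Toy "predictive brain": belief is updated in proportion to prediction error.
    import random

    true_value = 5.0        # hidden state of the world
    estimate = 0.0          # current belief about it
    learning_rate = 0.2     # how strongly surprise updates the belief

    for step in range(20):
        sample = true_value + random.gauss(0, 1.0)     # noisy sensory input
        prediction_error = sample - estimate           # surprise
        estimate += learning_rate * prediction_error   # belief update
        print(f"step {step:2d}  sample {sample:6.2f}  estimate {estimate:6.2f}")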
suddenlybananas
It's a popular view but it's massively controversial and far from being a consensus view. See here for a good overview of some of the problems with it.
https://pubmed.ncbi.nlm.nih.gov/22545686/
(You should be able to find the PDF easily on scihub or something)
b00ty4breakfast
there seems to be an implicit assumption here that smarter == more gooder, but I don't know that that is necessarily always true. It's understandable to think that way, since we do have pretty impressive brains, but it might be a bit of a bias. I'm not saying that I think being dumber, as a species, is something to aim for, but this obsession with intelligence, artificial or otherwise, is maybe a bit misplaced wrt its potential for solving all of our problems. One could argue that, in fact, most of our problems are the direct result of that same intellect, and maybe we would be better served in figuring out how to responsibly use the thinkwots we've already got before we go rushing off in search of the proverbial Big Brain Elixir.
A guy that drives a minivan like a lunatic shouldn't be trying to buy a monster truck, is my point
necovek
I don't see this implied assumption anywhere: smarter simply means smarter.
But, I have to counter your claim anyway :)
Now, "good" is, IMHO, a derivation of smart behaviour that benefits survival of the largest population of humans — by definition. This is most evident when we compare natural, animal behaviour with what we consider moral and good (from females eating males after conception, territoriality fights, hoarding of female/male partners, different levels of promiscuity, eating of one's own children/eggs...).
As such, while the definition of "good" is also obviously transient in humans, I believe it has served us better to achieve the same survival goals as any other natural principle, and ultimately it depends on us being "smart" in how we define it. This is also why it's nowadays changing to include environmental awareness because that's threatening our survival — we can argue it's slow to get all the 8B people to act in a coordinated newly "good" manner, but it still is a symptom of smartness defining what's "good", and not evolutionary pressure.
criddell
My counter claim is my experience with dogs.
Over the past 50 years, I've had a bunch of different dogs, from mutts that showed up and never left to a dog that was 1/4 wolf, and everything in between.
My favorite dog was a pug who was really dumb but super affectionate. He made everybody around him happy and I think his lack of anxiety and apparent commitment to chill had something to do with it. If the breed didn't have so many health issues, I'd get another in a heartbeat.
colordrops
Would a summary of your statement be: brain power is orthogonal to altruism and ethics?
necovek
I'd still counter it: someone can't be smart (have large brain power) if they don't also understand the value of altruism and ethics for their own well-being. While you can have "success" (however you define it) by ignoring those, the risk of failure is greater. Though this does ignore the fact that you can be smart for a set of problems but not really have any "general" smartness (I've seen one too many uni math professors who lack any common sense).
E.g., as a simple example: as an adult, you can easily go and steal kids' lunches at school recess. What happens next? If you do that regularly, either the kids will band together and beat the shit out of you if they are old enough, or a security person will be added, or the parents of those kids will set up a trap and perform their own justice.
In the long run, it's smart not to go and pester individuals weaker than you, and while we frame all of that as morality, these are actually smart principles for your own survival. Our entire society is a setup that comes out of such realizations, not out of some innate need for "goodness".
glenstein
>someone can't be smart (have large brain power) if they also don't understand the value of altruism and ethics for their own well-being
I would agree with this. And to borrow something that Daniel Dennett once said, no moral theory that exists seems to be computationally tractable. I wouldn't say I entirely agree, but I agree with the vibe or the upshot of it, which is that a certain amount of mapping out the variables and consequences seems to be instrumental to moral insight, and the more capable the brain, the more capable it would be of applying moral insight in increasingly complex situations.
b00ty4breakfast
hmm, I dunno if that simple example holds up very well. In the real world, folks do awful stuff that could be categorized as pestering individuals weaker than them, stuff much worse than stealing lunch money from little kids, and many of them never have to answer for any of it. Are we saying that someone who has successfully committed something really terrible like human trafficking without being caught is inherently not smart specifically because they are involved in human trafficking?
Yossarrian22
Seems more like brainpower is not inherently a 1:1 correlation for long term survival of a species.
mrguyorama
It's more than that. Even if you take the extreme assumption that "full" intelligence means being able to see all the facts relevant to a "choice" and perfectly reliably make the objectively "best" choice, that does not mean that being more intelligent than we currently are guarantees better choices than we currently make.
We make our choices using a subset of the total information. Getting a larger subset of that information could still push you to the wrong choice. Local maxima of choice accuracy are possible, and it could also be possible that the "function" for choice accuracy w.r.t. the info you have is constant at a terrible value right up until you get perfect info and suddenly make perfect choices.
Much more important however, is the reminder that the known biases in the human brain are largely subconscious. No amount of better conscious thought will change the existence of the Fundamental Attribution Error for example. Biases are not because we are "dumb", but because our brains do not process things rationally, like at all. We can consciously attempt to emulate a perfectly rational machine, but that takes immense effort, almost never works well, and is largely unavailable in moments of stress.
Statisticians still suffer from gambling fallacies. Doctors still experience the Placebo Effect. The scientific method works because it removes humans as the source of truth, because the smartest human still makes human errors.
zabzonk
I kind of skipped through this article, but one thing that occurs to me about big brains is cooling. In Alastair Reynolds's Conjoiner novels, the Conjoiners have to have heat-sinks built into their heads, and are on the verge of not really being human at all. Which I guess may be OK, if that's what you want.
ajcp
I believe it's the Revelation Space series of Alastair Reynolds novels that mention the Conjoiners.
zabzonk
Yes, and other ones, such as "The Great Wall of Mars" - it's the same shared universe, all of which feature the Conjoiners and their brains and starship drives.
jjtheblunt
Desert hares (jackrabbits) have heatsinks built into their heads too.
someothherguyy
It would likely mean larger skulls and bodies, not denser brains: https://en.wikipedia.org/wiki/List_of_animals_by_number_of_n...
stavros
Couldn't you just watercool brains? Isn't that how they're cooled already?
pixl97
Well, our entire body works as a swamp cooler via sweat evaporation, yes. The issue for us is wet-bulb temperature and dehydration. It can already cause brain damage pretty quickly and makes some parts of the world dangerous to be outside in.
Adding to this cooling load would require further changes, such as large ears or skin flaps to provide more surface area, unless you're going the straight technological-integration path.
FeteCommuniste
Reminds me of how Aristotle thought that the brain’s purpose was to cool the blood.
zabzonk
You know this, but it just shows that geniuses like Aristotle can be completely wrong - most of our body is trying to cool the brain!
kabdib
trivia: brain heatsinks also feature in Julian May's Pliocene Saga (in The Adversary IIRC) and A.A. Attanasio's Radix
tjpnz
Rocky from Project Hail Mary is also heatsinked.
Permik
"Minds beyond ours", how about abstract life forms, like publicly traded corporations. We've had higher kinded "alien lifeforms" around us for centuries, but we have not noticed them and seem generally not to care about them, even when they have negative consequences for our survival as a species.
We are to these like ants are to us. Or maybe even more like mitochondria are to us. Were just the mitochondria of the corporations. And yes, psychopaths are the brains, usually. Natural selection I guess.
Our current way of thinking – what exactly *is* a 'mind' and what is this 'intelligence' – is just too damn narrow. There's tons of overlap of sciences from biology that apply to economics and companies as lifeforms, but for some reason I don't see that being researched in popular science.
BriggyDwiggs42
I think you’re overestimating corporations a bit. Some aspects of intelligence scale linearly as you put more people into a room, eg quantity of ideas you can generate, while others don’t due to limits on people’s ability to communicate with each other. The latter is, I think, more or less the norm; adding more people very quickly hits decelerating returns due to the amount of distance you end up having to put between people in large organizations. Most end up resembling dictatorships because it’s just the easiest way to organize them, so are making strategic choices about as well as a guy with some advisors.
I agree that we should see structures of humans as their own kind of organism in a sense, but I think this framing works best on a global scale. Once you go smaller, eg to a nation, you need to conceptualize the barrier between inside and outside the organism as being highly fluid and difficult to define. Once you get to the level of a corporation this difficulty defining inside and outside is enormous. Eg aren’t regulatory bodies also a part, since they aid the corporation in making decisions?
Permik
Usually, for companies, regulatory bodies are more like antibodies against bacteria. Or, for another example, regulatory bodies are like any hormone-producing body part: they make sure the assembly of your guts does its thing and doesn't fuck it up.
BriggyDwiggs42
Maybe that’s a loosely effective analogy. It depends on the degree of antagonism between corp and regulator.
neoden
> We are to these like ants are to us. Or maybe even more like mitochondria are to us. Were just the mitochondria of the corporations
It's the opposite, imo. Corporations, states etc. seem to be somewhere on the bacteria level of organizational complexity and variety of reactions.
asdff
Really interesting ideas IMO. I have thought about this: you found a company, you bring in the accountants, the lawyers, everything that comes with that, and then who is even really driving the ship anymore? The scale of complexity going on is not something you can fit in your head, or even in 10 people's heads. Yet people act like they are in control of these processes they have delegated to countless people, each trudging off with their own sensibilities and optimizations and paradigms. It is no different from how a body works, where specific cells have a specific identity and role to play in the wider organism, functioning autonomously, bound by inputs and outputs that the "mind in charge" has no concept of.
And it makes it scary too. Can we really even stop the machine that is capitalism wreaking havoc on our environment? We have essentially lit a wildfire here and believe we are in full control of its spread. The incentives lead to our outcomes, and people are concerning themselves with putting band-aids on the outcomes rather than adjusting the incentives that have led to the inevitable.
TZubiri
Modern corps are shaped after countries; they are based on constitutions (articles of incorporation/bylaws). It's the whole three-branch system launched off the founding event.
southernplaces7
>Can we really even stop the machine that is capitalism wreaking havoc on our environment?
Really? You had to shoehorn this rather interesting argument into a simplistic ideological cliché against capitalism? Regardless of capitalism or its absence (if you can even properly define what it is in our multi-faceted world of many different organizations of different types, with different shades of power and influence in society), large organizations of many kinds fall under the same complex question of how they operate. These include governments (often bigger than any corporation) and things in between. Any of them can be just as destructive as any given corporate entity, or much more so in some cases.
asdff
I'm sorry I offended you! However, I do think it is highly relevant, as there is this prevailing theory that the free market will bail us out of any ills and will bring forth necessary scientific advancement as soon as it is needed. It is that sentiment I was pushing back against, as I don't believe we have the control we think we do for these ideas to pencil out as cleanly as assumed.
the_d3f4ult
This is an interesting perspective, but your view seems very narrow for some reason. If you're arguing that there are many forms of computation or 'intelligence' that are emergent from collections of sentient or non-sentient beings, then you have to include tribes of early humans, families, city-states and modern republics, ant and mold colonies, the stock market, the entire earth's biosphere, etc.
TheOtherHobbes
There's an incredible blind spot which makes humans think of intelligence and sentience as individual.
It isn't. It isn't even individual among humans.
We're colony organisms individually, and we're a colony organism collectively. We're physically embedded in a complex ecosystem, and we can't survive without it.
We're emotionally and intellectually embedded in analogous ecosystems to the point where depriving a human of external contact with the natural world and other humans is considered a form of torture, and typically causes a mental breakdown.
Colony organisms are the norm, not the exception. But we're trapped inside our own skulls and either experience the systems around us very indirectly, or not at all.
Permik
Personally, I actually count all of those examples into abstract lifeforms which you described :D
There are also things like "symbolic" lifeforms, like viruses. Yeah, they don't live per se, but they do replicate and go through "choices", though in a more symbolic sense, as they are just machines that read out/execute code.
The way I distinguish symbolic lifeforms from abstract lifeforms is mainly that symbolic lifeforms are "machines" that are kind of "inert" in a temporal sense.
Abstract lifeforms are just things that are, in one way or another, "living" and can exist on any level of abstraction. Cells are things that can be replaced, and so can CEOs, etc.
Symbolic lifeforms can just stay inert forever and hope that entropy knocks them into something that activates them, without their ending up in a space hostile enough to kill them.
Abstract lifeforms on the other hand just eventually run out of juice.
mensetmanusman
No one behaves with species survival as the motivating action.
Permik
Maybe not consciously, but otherwise natural selection *will* make that choice for you :D
TZubiri
In countries with civil law (as opposed to common law), companies are called juristic persons (as opposed to natural persons, i.e. humans).
didibus
Ya, I've always wondered like do blood cells in my body have any awareness that I'm not just a planet they live on? Would we know if the earth was just some part of a bigger living structure with its own consciousness? Does it even need to be conscious, or just show movement that is non random and influenced in some ways by goals or agenda? Many organisms act as per the goal to survive even if not conscious, and so probably can be considered a life-form? Corporations are an example of that like you said.
nssnsjsjsjs
We have massively increased our brain by scaling out, not up. Going from a population of 8M to 8Bn is a 1000x increase.
loa_in_
Hardly. What's the use if no single component of this brain can hold a complex enough idea?
NL807
We deal with that by abstraction and top-down compartmentalisation of complex systems. I mean, look at the machines we build. Trying to understand the entire thing holistically is an impossible task for a human mind, but we can divide and conquer the problem, where each component is understood in isolation.
LeifCarrotson
Look at that in the organizations we build - businesses, nonprofits, and governmental systems.
No one person can build even a single modern pencil - as Friedman said, consider the iron mines where the steel was dug up to make the saws to cut the wood, and then realize you have to also get graphite, rubber, paints, dyes, glues, brass for the ferrule, and so on. Consider the enormous far greater complexity in a major software program - we break it down and communicate in tokens the size of Jira tickets until big corporations can write an operating system.
A business of 1,000 employees is not 1,000 times as smart as a human, but by abstracting its aims into a bureaucracy that combines those humans together, it can accomplish tasks that none of them could achieve on their own.
staunton
How complex are the ideas held by a single neuron?
loa_in_
There are so many barriers between individual humans. Neurons, on the other hand, are tightly intertwined.
nssnsjsjsjs
We are smart enough to build the intelligence! Not just AI. We use computers to solve all kinds of physics and maths problems.
mensetmanusman
Corporations with extreme specializations are that.
TZubiri
Countries and companies hold pretty complex ideas
Aziell
We often struggle to focus and think deeply. It is not because we are not trying hard enough. It is because the limitations are built into our brains. Maybe the things we find difficult today are not really that complex. It is just that we are not naturally wired for that kind of understanding.
catlifeonmars
I’m not really sure it’s our brains that are the problem (at least most of the time). Distractions come from many sources, not least of all the many non-brain parts of our bodies.
rel_ic
I assume that we are neurons in a bigger brain that already exists!
I started down this belief system with https://en.wikipedia.org/wiki/G%C3%B6del,_Escher,_Bach
The one repeated statement throughout the article, if I interpreted it correctly, is that our brains pretty much process all the data in parallel, but produce a single set of actions to perform.
But don't we all know that not to be true? This is clearly evident with training sports, learning to play an instrument, or even forcing yourself to start using your non-natural hand for writing — and really, anything you are doing for the first time.
While we are adapting our brain to perform a certain set of new actions, we build our capability to do those in parallel: e.g. imagine when you start playing tennis and you need to focus on your position, posture, grip, observing the ball, observing the opposing player, looking at your surroundings, and then you make decisions on the spot about how hard to run, in what direction, how to turn the racquet head, how strong to grip, what follow-through to use, plus the conscious strategy that always lags a bit behind.
In a sense, we can't really describe our "stream of consciousness" well with language, but it's anything but single-threaded. I believe the problem comes from the same root cause as any concurrent programming challenge — these are simply hard problems, even if our brains are good at it and the principles are simple.
At the same time, I wouldn't even go so far to say we are unable to think conscious thoughts in parallel either, it's just that we are trained from early age to sanitize our "output". Did we ever have someone try learning to verbalize thoughts with the sign language, while vocalizing different thoughts through speaking? I am not convinced it's impossible, but we might not have figured out the training for it.