Why are there so many rationalist cults?

skrebbel

This article is beautifully written, and it's full of proper original research. I'm sad that most comments so far are knee-jerk "lol rationalists" type responses. I haven't seen any comment yet that isn't already addressed in much more colour and nuance in the article itself.

meowface

Asterisk is basically "rationalist magazine" and the author is a well-known rationalist blogger, so it's not a surprise that this is basically the only fair look into this phenomenon - compared to the typical outside view that rationalism itself is a cult and Eliezer Yudkowsky is a cult leader, both of which I consider absurd notions.

lyu07282

> I haven't seen any comment yet that isn't already addressed in much more colour and nuance in the article itself.

I once called rationalism infantile, impotent liberal escapism; perhaps that's the novel take you are looking for.

Essentially my view is that the fundamental problem with rationalists and the effective altruist movement is that they talk about profound social and political issues with any and all politics completely and totally removed. It is liberal depoliticisation[1] driven to its ultimate conclusion. That is precisely why they are ineffective and wrong about everything, but it's also why they are popular among the tech elites who are giving millions to associated groups like MIRI[2]. They aren't going away; they are politically useful and convenient to very powerful people.

[1] https://en.wikipedia.org/wiki/Post-politics

[2] https://intelligence.org/transparency/

knallfrosch

I think it's perfectly fine to read these articles, think "definitely a cult" and ignore whether they believe in spaceships, or demons, or AGI.

The key takeaway from the article is that if you have a group leader who cuts you off from other people, that's a red flag – not really a novel, or unique, or situational insight.

andrewflnr

That's a side point of the article, acknowledged as an old idea. The central points of this article are actually quite a bit more interesting than that. He even summarized his conclusions concisely at the end, so I don't know what your excuse is for trivializing it.

mm263

[flagged]

teh_klev

I have a link for you:

https://news.ycombinator.com/newsguidelines.html

Scroll to the bottom of the page.

pavlov

A very interesting read.

My idea of these self-proclaimed rationalists was fifteen years out of date. I thought they were people who write wordy fan fiction, but it turns out they've reached the point of having subgroups that kill people and exorcise demons.

This must be how people who had read one Hubbard pulp novel in the 1950s felt decades later when they found out he was running a full-blown religion.

The article seems to try very hard to find something positive to say about these groups, and comes up with:

“Rationalists came to correct views about the COVID-19 pandemic while many others were saying masks didn’t work and only hypochondriacs worried about covid; rationalists were some of the first people to warn about the threat of artificial intelligence.”

There’s nothing very unique about agreeing with the WHO, or thinking that building Skynet might be bad… (The rationalist Moses/Hubbard was 12 when that movie came out — the most impressionable age.) In the wider picture painted by the article, these presumed successes sound more like a case of a stopped clock being right twice a day.

skybrian

The WHO didn't declare a global pandemic until March 11, 2020 [1]. That's a little slow and some rationalists were earlier than that. (Other people too.)

After reading a warning from a rationalist blog, I posted a lot about COVID news to another forum and others there gave me credit for giving the heads-up that it was a Big Deal and not just another thing in the news. (Not sure it made all that much difference, though?)

[1] https://pmc.ncbi.nlm.nih.gov/articles/PMC7569573/

qcnguy

Yeah, that paragraph was really sad; it's the point where I stopped reading, even. Both of those beliefs are dead wrong, and they are the best examples the author could find to defend this delusional and dangerous belief system.

The "threat of AI" they're claiming validates rationalism doesn't exist. These loons were the reason Google sat on their LLMs and made their image models only draw pictures of robots, because of the supposed "threat" of AI. Now everyone can run models way better on their own laptops and the sky hasn't fallen, there hasn't even been mass unemployment or anything. Not even the weakest version of this belief has proven true. AI is very friendly, even.

And masks? How many graphs of cases/day with mask-mandate transitions overlaid are required before people realize masks did nothing? Whole countries went from nearly nobody wearing them to everyone wearing them overnight, and COVID cases/day didn't even notice. You can't look at a case graph and see where the rules changed. Which makes sense, because SARS-CoV-2 is aerosolized and can enter through the masks, around the masks, when masks are removed, and even through the eyeballs.

Seems like rationalists in the end have managed to be correct about nothing. What a disappointment.

lexandstuff

The point of wearing a mask is to protect other people from your respiratory droplets. Please wear a mask when you're sick.

skybrian

It was genuinely difficult to persuade people to wear masks before everyone started doing it and it became normal.

jmoggr

I think the comments here have been overly harsh. I have friends in the community and have visited the LessWrong "campus" several times. They seemed very welcoming, sincere, and were kind and patient even when I was basically asserting that several of their beliefs were dumb (in a hopefully somewhat respectful manner).

As for the AI doomerism, many in the community have more immediate and practical concerns about AI; however, the most extreme voices are often the most prominent. I also know that there has been internal disagreement on the kind of messaging they should be using to raise concern.

I think rationalists get plenty of things wrong, but I suspect that many people would benefit from understanding their perspective and reasoning.

cynicalkane

> They seemed very welcoming, sincere, and were kind and patient even when I was basically asserting that several of their beliefs were dumb

I don't think LessWrong is a cult (though certainly some of their offshoots are) but it's worth pointing out this is very characteristic of cult recruiting.

For cultists, recruiting cult fodder is of overriding psychological importance--they are sincere, yes, but the consequences are not what you and I would expect from sincere people. Devotion is not always advantageous.

JohnMakin

One of a few issues I have with groups like these is that they often confidently and aggressively spew a set of beliefs that on their face logically follow from one another, until you realize they are built on a set of axioms that are either entirely untested or outright nonsense. This is common everywhere, but I feel it is especially pronounced in communities like this. It also involves quite a bit of navel gazing that makes me feel a little sick to participate in.

The smartest people I have ever known have been profoundly unsure of their beliefs and what they know. I immediately become suspicious of anyone who is very certain of something, especially if they derived it on their own.

jl6

I don’t think it’s just (or even particularly) bad axioms, I think it’s that people tend to build up “logical” conclusions where they think each step is a watertight necessity that follows inevitably from its antecedents, but actually each step is a little bit leaky, leading to runaway growth in false confidence.

Not that non-rationalists are any better at reasoning, but non-rationalists do at least benefit from some intellectual humility.

dan_quixote

As a former mechanical engineer, I visualize this phenomenon like a "tolerance stackup". Effectively meaning that for each part you add to the chain, you accumulate error. If you're not damn careful, your assembly of parts (or conclusions) will fail to measure up to expectations.
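To make that concrete, here is a minimal sketch of a stackup calculation (the per-part tolerances are made-up values for illustration): the worst case adds tolerances linearly, while the root-sum-square method, which assumes independent errors, grows much more slowly.

  import math

  # Hypothetical per-part tolerances, in mm.
  tolerances = [0.05, 0.02, 0.10, 0.03, 0.07]

  # Worst case: every error lands at its limit in the same direction.
  worst_case = sum(tolerances)                    # 0.27 mm

  # Root-sum-square: independent random errors partially cancel.
  rss = math.sqrt(sum(t**2 for t in tolerances))  # ~0.14 mm

  print(f"worst case: {worst_case:.2f} mm, RSS: {rss:.2f} mm")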

godelski

I like this approach. Also having dipped my toes in the engineering world (professionally), I think it naturally follows that you should be constantly rechecking your designs. Those tolerances were fine to begin with, but are they still, now that things have changed? It also makes you think about failure modes. What can make this all come down, and if it does, which way will it fail? Which is really useful, because you can then leverage this to design things to fail in certain ways, and now you've got a testable hypothesis. It won't create proof, but it at least helps in finding flaws.

robocat

I saw an article recently that talked about stringing likely inferences together but ending up with an unreliable outcome because enough 0.9 probabilities one after the other lead to an unlikely conclusion.

Edit: Couldn't find the article, but AI referenced the Bayesian "chain of reasoning fallacy".
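The arithmetic is easy to check (a minimal sketch, assuming the 0.9 per-step probability from the comment above): chain enough individually likely inferences and the conclusion becomes more likely wrong than right.

  # Confidence in a whole chain of inferences decays geometrically,
  # even when each individual step is likely (p = 0.9).
  for steps in (1, 3, 5, 7, 10, 20):
      print(f"{steps:2d} steps: {0.9 ** steps:.2f}")
  # ->  1 steps: 0.90
  # ->  7 steps: 0.48  (already more likely wrong than right)
  # -> 20 steps: 0.12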

ctkhn

Basically the same as how dead reckoning your location works worse the longer you've been traveling?

to11mtm

I like this analogy.

I think of a bike's shifting systems; better shifters, better housings, better derailleur, or better chainrings/cogs can each 'improve' things.

I suppose where that becomes relevant here is that you can have very fancy parts on various ends, but if there's a piece in the middle that's wrong you're still gonna get shit results.

guerrilla

This is what I hate about real life electronics. Everything is nice on paper, but physics sucks.

godelski

  > I don’t think it’s just (or even particularly) bad axioms
IME most people aren't very good at building axioms. I hear a lot of people say "from first principles" and it is a pretty good indication that they will not actually be working from them. First principles require a lot of effort to create. They require iteration. They require a lot of nuance, care, and precision. And of course they do! They are the foundation of everything else that is about to come. This is why I find it so odd when people say "let's work from first principles" and then just state something matter-of-factly and follow from there. If you want to really do this you start simple, attack your own assumptions, reform, build, attack, and repeat.

This is how you reduce the leakiness, but I think it is categorically the same problem as the bad axioms. It is hard to challenge yourself and we often don't like being wrong. It is also really unfortunate that small mistakes can be a critical flaw. There's definitely an imbalance.

  >> The smartest people I have ever known have been profoundly unsure of their beliefs and what they know.
This is why the OP is seeing this behavior: the smartest people you'll meet are constantly challenging their own ideas. They know they are wrong to at least some degree. You'll sometimes find them talking with a bit of authority at first, but a key part is watching how they deal with challenges to their assumptions. Ask them what would cause them to change their minds. Ask them about nuances and details. They won't always dig into those cans of worms, but they will be aware of them, and maybe nervous or excited about going down that road (or do they just outright dismiss it?). They understand that accuracy is proportional to computation, and that computation increases exponentially as you converge on accuracy. These are strong indications, since they suggest whether someone cares more about the right answer or about being right. You also don't have to be very smart to detect this.

joe_the_user

> IME most people aren't very good at building axioms.

It seems you're implying that some people are good at building good axiom systems for the real world. I disagree. There are a few situations in the world where you have generalities so close to complete that you can use simple logic on them. But for the messy parts of the real world, there simply is no set of logical claims that can provide anything like certainty, no matter how "good" someone is at "axiom creation".

guerrilla

> I don’t think it’s just (or even particularly) bad axioms, I think it’s that people tend to build up “logical” conclusions where they think each step is a watertight necessity that follows inevitably from its antecedents, but actually each step is a little bit leaky, leading to runaway growth in false confidence.

This is what you get when you naively re-invent philosophy from the ground up while ignoring literally 2500 years of actual debugging of such arguments by the smartest people who ever lived.

You can't diverge from and improve on what everyone else did AND be almost entirely ignorant of it, let alone have no training whatsoever in it. This extreme arrogance I would say is the root of the problem.

BeFlatXIII

> Not that non-rationalists are any better at reasoning, but non-rationalists do at least benefit from some intellectual humility.

Non-rationalists are forced to use their physical senses more often because they can't follow the chain of logic as far. This is to their advantage. Empiricism > rationalism.

whatevertrevor

That conclusion presupposes that rationality and empiricism are at odds or mutually incompatible somehow. Any rational position worth listening to, about any testable hypothesis, goes hand in hand with empirical thinking.

om8

Good rationalism includes empiricism though

ehmrb

[dead]

tibbar

Yet I think most people err in the other direction. They 'know' the basics of health, of discipline, of charity, but have a hard time following through. 'Take a simple idea, and take it seriously': a favorite aphorism of Charlie Munger. Most of the good things in my life have come from trying to follow through the real implications of a theoretical belief.

bearl

And “always invert”! A related mungerism.

analog31

Perhaps part of being rational, as opposed to rationalist, is having a sense of when to override the conclusions of seemingly logical arguments.

1attice

In philosophy grad school, we described this as 'being reasonable' as opposed to 'being rational'.

That said, big-R Rationalism (the Lesswrong/Yudkowsky/Ziz social phenomenon) has very little in common with what we've standardly called 'rationalism'; trained philosophers tend to wince a little bit when we come into contact with these groups (who are nevertheless chockablock with fascinating personalities and compelling aesthetics.)

From my perspective (and I have only glancing contact,) these mostly seem to be _cults of consequentialism_, an epithet I'd also use for Effective Altruists.

Consequentialism has been making young people say and do daft things for hundreds of years -- Dostoevsky's _Crime and Punishment_ being the best character sketch I can think of.

While there are plenty of non-religious (and thus, small-r rationalist) alternatives to consequentialism, none of them seem to make it past the threshold in these communities.

The other codesmell these big-R rationalist groups have for me, and that which this article correctly flags, is their weaponization of psychology -- while I don't necessarily doubt the findings of sociology, psychology, etc, I wonder if they necessarily furnish useful tools for personal improvement. For example, memorizing a list of biases that people can potentially have is like numbering the stars in the sky; to me, it seems like this is a cargo-cultish transposition of the act of finding _fallacies in arguments_ into the domain of finding _faults in persons_.

And that's a relatively mild use of psychology. I simply can't imagine how annoying it would be to live in a household where everyone had memorized everything from connection theory to attachment theory to narrative therapy and routinely deployed hot takes on one another.

In actual philosophical discussion, back at the academy, psychologizing was considered 'below the belt', and would result in an intervention by the ref. Sometimes this was explicitly associated with something we called 'the Principle of Charity', which is that, out of an abundance of epistemic caution, you commit to always interpreting the motives and interests of your interlocutor in the kindest light possible, whether in 'steel manning' their arguments, or turning a strategically blind eye to bad behaviour in conversation.

The Principle of Charity is probably the most enduring lesson I took from my decade-long sojourn among the philosophers, and mutual psychological dissection is anathema to it.

kergonath

> I don’t think it’s just (or even particularly) bad axioms, I think it’s that people tend to build up “logical” conclusions where they think each step is a watertight necessity that follows inevitably from its antecedents, but actually each step is a little bit leaky, leading to runaway growth in false confidence.

I really like your way of putting it. It’s a fundamental fallacy to assume certainty when trying to predict the future. Because, as you say, uncertainty compounds over time, all prediction models are chaotic. It’s usually associated with some form of Dunning-Kruger, where people know just enough to have ideas but not enough to understand where they might fail (thus vastly underestimating uncertainty at each step), or just lacking imagination.

ramenbytes

Deep Space 9 had an episode dealing with something similar. Superintelligent beings determine that a situation is hopeless and act accordingly. The normal beings take issue with the actions of the Superintelligents. The normal beings turn out to be right.

MajimasEyepatch

I feel this way about some of the more extreme effective altruists. There is no room for uncertainty or recognition of the way that errors compound.

- "We should focus our charitable endeavors on the problems that are most impactful, like eradicating preventable diseases in poor countries." Cool, I'm on board.

- "I should do the job that makes the absolute most amount of money possible, like starting a crypto exchange, so that I can use my vast wealth in the most effective way." Maybe? If you like crypto, go for it, I guess, but I don't think that's the only way to live, and I'm not frankly willing to trust the infallibility and incorruptibility of these so-called geniuses.

- "There are many billions more people who will be born in the future than those people who are alive today. Therefore, we should focus on long-term problems over short-term ones because the long-term ones will affect far more people." Long-term problems are obviously important, but the further we get into the future, the less certain we can be about our projections. We're not even good at seeing five years into the future. We should have very little faith in some billionaire tech bro insisting that their projections about the 22nd century are correct (especially when those projections just so happen to show that the best thing you can do in the present is buy the products that said tech bro is selling).

xg15

The "longtermism" idea never made sense to me: So we should sacrifice the present to save the future. Alright. But then those future descendants would also have to sacrifice their present to save their future, etc. So by that logic, there could never be a time that was not full of misery. So then why do all of that stuff?

human_person

"I should do the job that makes the absolute most amount of money possible, like starting a crypto exchange, so that I can use my vast wealth in the most effective way."

Has always really bothered me because it assumes that there are no negative impacts of the work you did to get the money. If you do a million dollars worth of damage to the world and earn 100k (or a billion dollars worth of damage to earn a million dollars), even if you spend all of the money you earned on making the world a better place, you aren't even going to fix 10% of the damage you caused (and that's ignoring the fact that it's usually easier/cheaper to break things than to fix them).

gen220

Strongly recommend this profile in the NYer on Curtis Yarvin (who also uses "rationalism" to justify his beliefs) [0]. The section towards the end, which reports on his meeting one of his supposed ideological heroes for an extended period of time, is particularly illuminating.

I feel like the internet has led to an explosion of such groups because it abstracts the "ideas" away from the "people". I suspect that if most people were in a room with, or spent an extended amount of time around, any of these self-professed, hyper-online rationalists, they would immediately disregard any theories they were able to cook up, no matter how clever or persuasively argued they might be in their written-down form.

[0]: https://www.newyorker.com/magazine/2025/06/09/curtis-yarvin-...

trawy081225

> I feel like the internet has led to an explosion of such groups because it abstracts the "ideas" away from the "people". I suspect that if most people were in a room with, or spent an extended amount of time around, any of these self-professed, hyper-online rationalists, they would immediately disregard any theories they were able to cook up, no matter how clever or persuasively argued they might be in their written-down form.

Likely the opposite. The internet has led to people being able to see the man behind the curtain, and realize how flawed the individuals pushing these ideas are. Whereas many intellectuals from 50 years back were just as bad if not worse, but able to maintain a false aura of intelligence by cutting themselves off from the masses.

wussboy

Hard disagree. People use rationality to support the beliefs they already have, not to change those beliefs. The internet allows everyone to find something that supports anything.

I do it. You do it. I think a fascinating litmus test is asking yourself this question: “When did I last change my mind about something significant?” For most people the answer is “never”. If we lived in the world you described, most people’s answers would be “relatively recently”.

lordnacho

> I immediately become suspicious of anyone who is very certain of something

Me too, in almost every area of life. There's a reason it's called a conman: they are tricking your natural sense that confidence is connected to correctness.

But also, even when it isn't about conning you, how do people become certain of something? They've ignored the evidence against whatever they are certain of.

People who actually know what they're talking about will always restrict the context and hedge their bets. Their explanations are tentative, filled with ifs and buts. They rarely say anything sweeping.

dcminter

In the term "conman" the confidence in question is that of the mark, not the perpetrator.

sdwr

Isn't confidence referring to the alternate definition of trust, as in "taking you into his confidence"?

jpiburn

"Cherish those who seek the truth but beware of those who find it" - Voltaire

paviva

Most likely Gide ("Croyez ceux qui cherchent la vérité, doutez de ceux qui la trouvent", "Believe those who seek Truth, doubt those who find it") and not Voltaire ;)

Voltaire was generally more subtle: "un bon mot ne prouve rien", a witty saying proves nothing, as he'd say.

Animats

Many arguments arise over the valuation of future money; see "discount function" [1]. At one extreme are the rational altruists, who rate it near 1.0; at the other are the "drill, baby, drill" people, who are much closer to 0.

The discount function really should have a noise term, because predictions about the future are noisy, and the noise increases with the distance into the future. If you don't consider that, you solve the wrong problem. There's a classic Roman concern about running out of space for cemeteries. Running out of energy, or overpopulation, turned out to be problems where the projections assumed less noise than actually happened.

[1] https://en.wikipedia.org/wiki/Discount_function
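A minimal sketch of what such a noise term might look like (the lognormal error model, the sqrt-of-horizon growth, and all of the numbers are assumptions for illustration, not from the comment above):

  import math
  import random

  def pv(cashflow, years, rate=0.03):
      # Plain exponential discounting, no uncertainty.
      return cashflow / (1 + rate) ** years

  def pv_spread(cashflow, years, rate=0.03, vol=0.10, trials=100_000):
      # Treat the forecast itself as noisy: lognormal error whose standard
      # deviation grows with the square root of the horizon. Returns the
      # 5th and 95th percentile present values.
      sigma = vol * math.sqrt(years)
      draws = sorted(
          cashflow * math.exp(random.gauss(0.0, sigma)) / (1 + rate) ** years
          for _ in range(trials)
      )
      return draws[trials // 20], draws[-trials // 20]

  print(round(pv(1000, 100), 1))     # ~52.0: a deceptively precise point estimate
  lo, hi = pv_spread(1000, 100)
  print(round(lo, 1), round(hi, 1))  # ~10 to ~270: the spread the point estimate hides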

ctoth

> I immediately become suspicious of anyone who is very certain of something, especially if they derived it on their own.

Are you certain about this?

teddyh

All I know is that I know nothing.

p1esk

How do you know?

tshaddox

Well you could be a critical rationalist and do away with the notion of "certainty" or any sort of justification or privileged source of knowledge (including "rationality").

adrianN

Your own state of mind is one of the easiest things to be fairly certain about.

ants_everywhere

The fact that this is false is one of the oldest findings of research psychology

lazide

said no one familiar with their own mind, ever!

at-fates-hands

Isaac Newton would like to have a word.

elictronic

I am not a big fan of alchemy, thank you though.

idontwantthis

Suspicious implies uncertain. It’s not immediate rejection.

JKCalhoun

You're describing the impressions I had of MENSA back in the 70's.

ambicapter

One of the only idioms that I don't mind living my life by is, "Follow the truth-seeker, but beware those who've found it".

JKCalhoun

Interesting. I can't say I've done much following though — not that I am aware of anyway. Maybe I just had no leaders growing up.

gwbas1c

Many years ago I met Eliezer Yudkowsky. He handed me a pamphlet extolling the virtues of rationality. The whole thing came across as a joke, as a parody of evangelizing. We both laughed.

I glanced at it once or twice and shoved it into a bookshelf. I wish I kept it, because I never thought so much would happen around him.

yubblegum

IMO these people are promoted. You look at their backgrounds and there is nothing that justifies their perches. Eliezer Yudkowsky is (IIRC) a Thiel baby, isn't he?

quickthrowman

I only know Eliezer Yudkowsky from his Harry Potter fanfiction, most notably Harry Potter and the Methods of Rationality.

Is he known publicly for some other reason?

meowface

He's considered the father of rationalism and the father of AI doomerism. He wrote this famous article in Time magazine a few years ago: https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-no...

His book If Anyone Builds It, Everyone Dies comes out in a month: https://www.amazon.com/Anyone-Builds-Everyone-Dies-Superhuma...

You can find more info here: https://en.wikipedia.org/wiki/Eliezer_Yudkowsky

wredcoll

> He's considered the father of rationalism

[citation needed]

Even for this weird cult that is trying to appropriate the word, would they really consider him the father of redefining the word?

yahoozoo

[flagged]

meowface

Huh, neo-Nazis in HN comment sections?? Jeez. (I checked their other comments and there are things like "Another Zionist Jew to-the-core in charge of another shady American tech company.")

vehemenz

I get the impression that these people desperately want to study philosophy but for some reason can't be bothered to get formal training because it would be too humbling for them. I call it "small fishbowl syndrome," but maybe there's a better term for it.

username332211

The reason why people can't be bothered to get formal training is that modern philosophy doesn't seem that useful.

It was a while ago, but take the infamous story of the 2006 rape case at Duke University. If you check out coverage of that case, you get the impression that every member of faculty who joined in the hysteria was from some humanities department, including philosophy. And quite a few of them refused to change their minds even as the prosecuting attorney was being charged with misconduct. Compare that to Socrates' behavior during the trial of the admirals in 406 BC.

Meanwhile, whatever meager resistance that group faced seems to have come from economists, natural scientists, or legal scholars.

I wouldn't blame people for refusing to study in a humanities department where they can't tell right from wrong.

djeastm

Modern philosophy isn't useful because some philosophy faculty at Duke were wrong about a rape case? Is that the argument being made here?

qcnguy

Which group of people giving modern training in philosophy should we judge the field by? If they can't use it correctly in such a basic case then who can?

username332211

No. The fact that they were wrong is almost irrelevant.

The faculty denounced the students without evidence, judged the case through their emotions and their preconceived notions, and refused to change their minds as new evidence emerged. Imagine having an academic discussion on a difficult ethical issue with such a teacher...

And none of that would have changed even if there somehow had been a rape-focused conspiracy among the students of that university. (Though the problem would have been significantly less obvious.)

wredcoll

> Meanwhile, whatever meager resistence was faced by that group seems to have come from economists, natural scientist or legal scholars.

> I wouldn't blame people for refusing to study in a humanities department where they can't tell right from wrong.

Man, if you have to make stuff up to try to convince people... you might not be on the right side here.

username332211

I'm not sure what you are talking about. I have to admit, I mostly wrote my comment based on my recollections, and it's a case from 20 years ago that I barely paid attention to until after its bizarre conclusion. But looking through Wikipedia's articles on the case[1], it doesn't seem I'm that far from the truth.

I guess I should have limited my statement about resisting mob justice to the economists at that university, as the other departments merely didn't sign on to the public letter of denunciation?

It's weird that Wikipedia doesn't give you a percentage of signatories of the letter of 88 from the philosophy department, but several of the notable signatories are philosophers.

[1] https://en.m.wikipedia.org/wiki/Reactions_to_the_Duke_lacros...

Edit: Just found some articles claiming that a chemistry professor by the name of Stephen Baldwin was the first to write to the university newspaper condemning the mob.

fellowniusmonk

Philosophy is interesting in how it informs computer science and vice-versa.

Mereological nihilism and weak emergence are interesting, and help protect against many forms of obsessive type-level and functional cargo culting.

But then in some areas philosophy is woefully behind, and you have philosophers poo-pooing intuitionism when any software engineer working on a sufficiently federated or real-world sensor/control system borrows constructivism into their classical language in order not to kill people (Agda is interesting, of course). Intermediate logic is clearly empirically true.

It's interesting that people don't understand the non-physicality of the abstract and you have people serving the abstract instead of the abstract being used to serve people. People confusing the map for the terrain is such a deeply insidious issue.

I mean, all the lightcone stuff: you can't predict ex ante which agents will be keystones in beneficial causal chains, so it's such a waste of energy to spin your wheels on.

freejazz

>The reason why people can't be bothered to get formal training is that modern philosophy doesn't seem that useful.

But rationalism is?

lmm

Well, maybe. It seems at least adjacent to the stuff that's been making a lot of people rich lately.

samdoesnothing

I think the argument is that philosophy hasn't advanced much in the last 1000 years, but it's still 10,000 years ahead of whatever is coming out of the rationalist camp.

username332211

Nature abhors a vacuum. After the October Revolution, the genuine study of the humanities was extinguished in Russia and replaced with the mindless repetition of rather inane doctrines. But people with awakened and open minds would always ask questions and seek answers.

Those would, of course, be people with no formal training in history or philosophy (as the study of history where you aren't allowed to question Marxist doctrine would be self-evidently useless). Their training would be in the natural sciences or mathematics. And without knowing how to properly reason about history or philosophy, they may reach fairly kooky conclusions.

Hence why Rationalism can be thought of as the same class of phenomenon as Fomenko's chronology (or, if you want to be slightly more generous, Shafarevich's philosophical tracts).

NoMoreNicksLeft

Yeh, probably.

Imagine that you're living in a big scary world, and there's someone there telling you that being scared isn't particularly useful, that if you slow down and think about the things happening to you, most of your worries will become tractable and some will even disappear. It probably works at first. Then they sic Roko's Basilisk on you, and you're a gibbering lunatic 2 weeks later...

1attice

My thoughts exactly! I'm a survivor of ten years in the academic philosophy trenches and it just sounds to me like what would happen if you left a planeload of undergraduates on a _Survivor_ island with an infinite supply of pizza pockets and adderall

joeblubaugh

Funny that this also describes these cult rationalist groups very well.

samdoesnothing

Why would they need formal training? Can't they just read Plato, Socrates, etc, and classical lit like Dostoevsky, Camus, Kafka etc? That would be far better than whatever they're doing now.

guerrilla

I'm someone who has read all of that and much more, including intense study of SEP and some contemporary papers and textbooks, and I would say that I am absolutely not qualified to produce philosophy of the quality output by analytic philosophy over the last century. I can understand a lot of it, and yes, this is better than being completely ignorant of the last 2500 years of philosophy as most rationalists seem to be, but doing only what I have done would not sufficiently prepare them to work on the projects that they want to work on. They (and I) do not have the proper training in logic or research methods, let alone the experience that comes from guided research in the field as it is today. What we all lack especially is the epistemological reinforcement that comes from being checked by a community of our peers. I'm not saying it can't be done alone, I'm just saying that what you're suggesting isn't enough and I can tell you because I'm quite beyond that and I know that I cannot produce the quality of work that you'll find in SEP today.

samdoesnothing

Oh I don't mean to imply reading some classical lit prepares you for a career producing novel works in philosophy, simply that if one wants to understand themselves, others, and the world better they don't need to go to university to do it. They can just read.

sien

Trying to do a bit of formal philosophy at University is really worth doing.

You realise that it's very hard to do well and it's intellectual quicksand.

Reading philosophers and great writers as you suggest is better than joining a cult.

It's just that you also want to write about what you're thinking in response to reading such people and ideally have what you write critiqued by smart people. Perhaps an AI could do some of that these days.

kayodelycaon

I took a few philosophy classes. I found it incredibly valuable in identifying assumptions and testing them.

Being Christian, it helped me understand what I believe and why. It made faith a deliberate, reasoned choice.

And, of course, there are many rational reasons for people to have very different opinions when it comes to religion and deities.

Being bipolar might give me an interesting perspective. Everything I’ve read about rationalists misses the grounding required to isolate emotion as a variable.

dragonwriter

> It's just that you also want to write about what you're thinking in response to reading such people and ideally have what you write critiqued by smart people. Perhaps an AI could do some of that these days.

An AI can neither write about what you are thinking in your place nor substitute for a particularly smart critic, but might still be useful for rubber-ducking philosophical writing if used well.

giraffe_lady

This is like saying someone who wants to build a specialized computer for a novel use should read the Turing paper and get to it. A lot of development has happened in the field since.

samdoesnothing

I don't think that is similar at all. People want to understand the world better, they don't want to learn how to build it from first principles.

Jtsummers

> Many of them also expect that, without heroic effort, AGI development will lead to human extinction.

> These beliefs can make it difficult to care about much of anything else: what good is it to be a nurse or a notary or a novelist, if humanity is about to go extinct?

Replace AGI causing extinction with the Rapture and you get a lot of US Christian fundamentalists. They often reject addressing problems in the environment, economy, society, etc. because the Rapture will happen any moment now. Some people just end up stuck in a belief about something catastrophic (in the case of the Rapture, catastrophic for those left behind but not those raptured) and they can't get it out of their head. For individuals who've dealt with anxiety disorder, catastrophizing is something you learn to deal with (and hopefully stop doing), but these folks find a community that reinforces the belief about the pending catastrophe(s) and so they never get out of the doom loop.

taurath

Raised to huddle close and expect the imminent utter demise of the earth, and to fear being dragged to the depths of hell if I so much as said a bad word I heard on TV, I have to keep an extremely tight handle on my anxiety in this day and age.

It’s not from a rational basis, but from being bombarded with fear from every rectangle in my house, and the houses of my entire community

tines

The Rapture isn't doom for the people who believe in it though (except in the lost sense of the word), whereas the AI Apocalypse is, so I'd put it in a different category. And even in that category, I'd say that's a pretty small number of Christians, fundamentalist or no, who abandon earthly occupations for that reason.

JohnMakin

I don't mean to well ackshually you here, but there are several different theological beliefs around the Rapture, some of which believe Christians will remain during the theoretical "end times." The megachurch/cinema version of this very much believes they won't, but, this is not the only view, either in modern times or historically. Some believe it's already happened, even. It's a very good analogy.

Jtsummers

Yes, I removed a parenthetical "(or euphoria loop for the Rapture believers who know they'll be saved)". But I removed it because not all who believe in the Rapture believe they will be saved (or have such high confidence) and, for them, it is a doom loop.

Both communities, though, end up reinforcing the belief amongst their members and tend towards increasing isolation from the rest of the world (leading to cultish behavior, if not forming a cult in the conventional sense), and a disregard for the here and now in favor of focusing on this impending world changing (destroying or saving) event.

joe_the_user

A lot of people also believe that global warming will cause terrible problems. I think that's a plausible belief, but if you combine the people believing one or another of these things, you've got a lot of the US.

Which is to say that I don't think it's just dooming going on. In particular, the belief in AGI doom has a lot of plausible arguments in its favor. I happen not to believe in it, but as a belief system it is more similar to a belief in global warming than to a belief in the Rapture.

taberiand

Replace AGI with Climate Change and you've got an entirely reasonable set of beliefs.

psunavy03

You can believe climate change is a serious problem without believing it is necessarily an extinction-level event. It is entirely possible that in the worst case, the human race will just continue into a world which sucks more than it necessarily has to, with less quality of life and maybe lifespan.

taberiand

I never said I held the belief, just that it's reasonable

ImaCake

You can treat climate change as your personal Ragnarok, but it's also possible to take a more sober view that climate change is just bad without it being apocalyptic.

NoMoreNicksLeft

You have a very popular set of beliefs.

bobson381

I keep thinking about the first Avengers movie, when Loki is standing above everyone going "See, is this not your natural state?". There's some perverse security in not getting a choice, and these rationalist frameworks, based in logic, can lead in all kinds of crazy arbitrary directions - powered by nothing more than a refusal to suffer any kind of ambiguity.

csours

Humans are not chickens, but we sure do seem to love having a pecking order.

snarf21

I think it is simpler than that: we love tribalism. A long time ago, being part of a tribe had such huge benefits over going it alone that it was always worth any tradeoffs. We have a much better ability to go it alone now, but we still love to belong to a group. Too often we pick a group based on a single shared belief and don't recognize all the baggage that comes along. Life is also too complicated today. It is difficult for someone to be knowledgeable in one topic, let alone the thousands that make up our society.

csours

maybe the real innie/outie is the in-group/out-group. no spoilers, i haven't finished that show yet

lazide

Making good decisions is hard, and being accountable for the results of them is not fun. Easier to outsource if you can.

jacquesm

They mostly seem to lean that way because it gives them carte blanche to do as they please. It is just a modern version of 'god has led my hand'.

notahacker

I agree with the religion comparison (the "rational" conclusions of rationalism tend towards millenarianism with a scifi flavour), but the people going furthest down that rabbit hole often aren't doing what they please: on the contrary they're spending disproportionate amounts of time worrying about armageddon and optimising for stuff other people simply don't care about, or in the case of the explicit cults being actively exploited. Seems like the typical in-too-deep rationalist gets seduced by the idea that others who scoff at their choices just aren't as smart and rational as them, as part of a package deal which treats everything from their scifi interests to their on-the-spectrum approach to analysing every interaction from first principles as great insights...

Mizza

It's amphetamine. All of these people are constantly tweaking. They're annoying people to begin with, but they're all constantly yakked up and won't stop babbling. It's really obvious; I don't know why it isn't highlighted more in all these post-Ziz articles.

Muromec

How do you know?

tbrake

Having known dozens of friends, family members, roommates, coworkers, etc., both before and after they started them. The two biggest telltale signs:

1. A tendency to produce - out of no necessity whatsoever, mind - walls of text. Walls of speech will happen too, but not everyone rambles.

2. Obnoxiously confident that they're fundamentally correct about whatever position they happen to be holding during a conversation with you. No matter how subjective or inconsequential. Even if they end up changing it an hour later. Challenging them on it gets you more of #1.

MinimalAction

Pretty much spot on! It is frustrating to talk with these people when they never admit they are wrong. They find new levels of abstraction to deal with your simpler counterarguments, and it is a never-ending deal unless you admit they were right.

TheAceOfHearts

Many people like to write in order to develop and explore their understanding of a topic. Writing lets you spend a lot of time playing around with whatever idea you're trying to understand, and sharing this writing invites others to challenge your assumptions.

When you're uncertain about a topic, you can explore it by writing a lot about said topic. Ideally, when you've finished exploring and studying a topic, you should be able to write a much more condensed / synthesized version.

Muromec

I mean, I know the effects of Adderall/Ritalin and it's plausible; what I'm asking is whether the GP knows that for a fact or is deducing it from what is known.

Henchman21

I call this “diarrhea of the mind”. It’s what happens when you hear a steady stream of bullshit from someone’s mouth. It definitely tracks with substance abuse of “uppers”, aka meth, blow, hell even caffeine!

ajkjk

Presumably they mean Adderall. Plausible theory tbh. Although it's just a factor not an explanation.

samdoesnothing

Yeah it's pretty obvious and not surprising. What do people expect when a bunch of socially inept nerds with weird unchallenged world views start doing uppers? lol

kridsdale3

I like to characterize the culture of each (roughly) decade with the most popular drugs of the time. It really gives you a new lens for media and culture generation.

throwanem

Who's writing them?

meroes

It grew out of many different threads: different websites, communities, etc., all around the same time. I noticed it contemporaneously in the philosophy world, where Nick Bostrom's simulation argument was boosted more than it deserved (like everyone was just accepting it at the lay level). Looking back I see it also developed from LessWrong and other sites, but at the time I was wondering what was going on with simulations taking over philosophy talk. Now I see how it all coalesced.

All of it has the appearance of sounding so smart, and a few sites were genuine. But it got taken over.

potatolicious

Yeah, a lot of the comments here are really just addressing cults writ large, as opposed to why this one was particularly successful.

A significant part of this is the intersection of the cult with money and status - this stuff really took off once prominent SV personalities became associated with it, and got turbocharged when it started intersecting with the angel/incubator/VC scene, when there was implicit money involved.

It's unusually successful because -- for a time at least -- there was status (and maybe money) in carrying water for it.

jacquesm

Paypal will be traced as the root cause of many of our future troubles.

varjag

Wish I could upvote this twice. It's like intersectionality for evil.

wredcoll

https://en.m.wikipedia.org/wiki/Barth%C3%A9lemy-Prosper_Enfa...

Sometimes history really does rhyme.

> Enfantin and Amand Bazard were proclaimed Pères Suprêmes ("Supreme Fathers") – a union which was, however, only nominal, as a divergence was already manifest. Bazard, who concentrated on organizing the group, had devoted himself to political reform, while Enfantin, who favoured teaching and preaching, dedicated his time to social and moral change. The antagonism was widened by Enfantin's announcement of his theory of the relation of man and woman, which would substitute for the "tyranny of marriage" a system of "free love".[1]

6177c40f

To be clear, this article isn't calling rationalism a cult, it's about cults that have some sort of association with rationalism (social connection and/or ideology derived from rationalist concepts), e.g. the Zizians.

throwanem

This article attempts to establish disjoint categories "good rationalist" and "cultist." Its authorship, and its appearance in the cope publication of the "please take us seriously" rationalist faction, speak volumes of how well it is likely to succeed in that project.

ImaCake

Not sure why you got downvoted for this. The opening paragraph of the article reads as suspicious to the observant outsider:

>The rationalist community was drawn together by AI researcher Eliezer Yudkowsky’s blog post series The Sequences, a set of essays about how to think more rationally.

Anyone who had just read a lot about Scientology would read that and have alarm bells ringing.

6177c40f

I think it's a meaningful distinction: most rationalists aren't running murder cults.

pizzadog

I have a lot of experience with rationalists. What I will say is:

1) If you have a criticism about them or their stupid name or how "'all I know is that I know nothing' how smug of them to say they're truly wise," rest assured they have been self flagellating over these criticisms 100x longer than you've been aware of their group. That doesn't mean they succeeded at addressing the criticisms, of course, but I can tell you that they are self aware. Especially about the stupid name.

2) They are actually well read. They are not sheltered and confused. They are out there doing weird shit together all the time. The kind of off-the-wall life experiences you find in this community will leave you wide eyed.

3) They are genuinely concerned with doing good. You might know about some of the weird, scary, or cringe rationalist groups. You probably haven't heard about the ones that are succeeding at doing cool stuff because people don't gossip about charitable successes.

In my experience, where they go astray is when they trick themselves into working beyond their means. The basic underlying idea behind most rationalist projects is something like "think about the way people suffer everyday. How can we think about these problems in a new way? How can we find an answer that actually leaves everyone happy?" A cynic (or a realist, depending on your perspective) might say that there are many problems that fundamentally will leave some group unhappy. The overconfident rationalist will challenge that cynical/realist perspective until they burn themselves out, and in many cases they will attract a whole group of people who burn out alongside them. To consider an extreme case, the Zizians squared this circle by deciding that the majority of human beings didn't have souls and so "leaving everyone happy" was as simple as ignoring the unsouled masses. In less extreme cases this presents itself as hopeless idealism, or a chain of logic that becomes so divorced from normal socialization that it appears to be opaque. "This thought experiment could hypothetically create 9 quintillion cubic units of Pain to exist, so I need to devote my entire existence towards preventing it, because even a 1% chance of that happening is horrible. If you aren't doing the same thing then you are now morally culpable for 9 quintillion cubic units of Pain. You are evil."

Most rationalists are weird but settle into a happy place far from those fringes where they have a diet of "plants and specifically animals without brains that cannot experience pain" and they make $300k annually and donate $200k of it to charitable causes. The super weird ones are annoying to talk to and nobody really likes them.

wredcoll

But are they Scotsmen?

a_bonobo

This is a great article.

There's so much in these group dynamics that repeats group dynamics of communist extremists of the 70s. A group that has found a 'better' way of life, all you have to do is believe in the group's beliefs.

Compare this part from OP:

>Here is a sampling of answers from people in and close to dysfunctional groups: “We spent all our time talking about philosophy and psychology and human social dynamics, often within the group.” “Really tense ten-hour conversations about whether, when you ate the last chip, that was a signal that you were intending to let down your comrades in selfish ways in the future.”

This reeks of Marxist-Leninist self-criticism, where everybody tried to outdo each other in how ideologically pure they were. The most extreme outgrowth of self-criticism came when the Japanese United Red Army beat its own members to death as part of self-criticism sessions.

>'These violent beatings ultimately saw the death of 12 members of the URA who had been deemed not sufficiently revolutionary.' https://en.wikipedia.org/wiki/United_Red_Army

History doesn't repeat, but it rhymes.