String of recent killings linked to Bay Area rationalist 'death cult'
January 30, 2025
iamthepieman
This story happened in my backyard. The shootout was about 40 minutes from me but Youngblut and Felix Bauckholt were reported by a hotel clerk dressed in tactical gear and sporting firearms in a hotel a few blocks from me.
Weird to see a community I followed show up so close to home and negatively like this. I always just read LW and appreciated some of the fundamentals that this group seems to have ignored. Stuff like: rationality has to objectively make your life and the world better, or it's a failed ideology.
Edit: I've been following this story for over a week because it was local news. Why is this showing up here on HN now?
dang
(To answer that last procedural question: there have been assorted submissions, but none spent much time on the front page. More at https://news.ycombinator.com/item?id=42901777)
Aurornis
> Weird to see a community I followed show up so close to home and negatively like this.
I had some coworkers who were really into LessWrong and rationality. I thought it was fun to read some of the selected writings they would share, but I always felt that online rationalist communities collected a lot of people with reactionary, fascist, misogynistic, and far-right tendencies. There’s a heavily sanitized version of rationality and EA that gets presented online with only the highlights, but there’s a lot more out there in the fringes that is really weird.
For example, many know about Roko’s Basilisk as a thought exercise and much has been written about it, but fewer know that Roko has been writing misogynistic rants on Twitter and claiming things like having women in the workforce is “very negative” for GDP.
The Slate Star Codex subreddit was a home for rationalists on Reddit, but they had so many problems with culture war topics that they banned discussion of them. The users forked off and created "The Motte", which is a bit of a cesspool dressed up with rationalist prose. Even the SlateStarCodex subreddit has become so toxic that I had to unsubscribe. Many of the posts and comments on women or dating were becoming indistinguishable from incel communities, apart from the rationalist prose style.
Even the real-world rationalist and EA communities aren’t immune, with several high profile sexual misconduct scandals making the news in recent years.
It’s a weird space. It felt like a fun internet philosophy community when my coworkers introduced it years ago, but the longer I’ve observed it the more I’ve realized it attracts and accepts a lot of people whose goals aren’t aligned with objectively “make the world better” as long as they can write their prose in the rationalist style. It’s been strange to observe.
Of course, at every turn people will argue that the bad actors are not true rationalists, but I’ve seen enough from these communities to know that they don’t really discriminate much until issues boil over into the news.
kiba
The community/offshoot I am part of is mostly liberal/left. My impression is that LessWrong is also liberal/left.
cluckindan
Perhaps there are people in power who would benefit from portraying those communities in a different light.
scarmig
It's somewhat odd to represent a community as being right wing when the worst thing to come from it was a trans vegan murder cult. Most "rationalists" vote Democrat, and if the franchise were limited to them, Harris would have won in a 50 state landslide.
The complaint here seems to be that rationalists don't take progressive pieties as axiomatic.
ben_w
I have had some rather… negative vibes, for lack of a better term, from some of the American bits I've encountered online; but for what it's worth, I've not seen what you described in the German community.
There is, ironically, no escape from two facts that were well advertised at the start: (1) the easiest person for anyone to fool is themselves, and (2) politics is the mind-killer.
With no outwardly visible irony, there's a rationalist politics podcast called "the mind killer": https://podcasts.apple.com/de/podcast/the-mind-killer/id1507...
Saying this as someone who read HPMOR and AI to Zombies and used to listen to The Bayesian Conspiracy podcast:
This feels a bit like that scene in Monty Python's Life of Brian where everyone chants in unison about thinking for themselves.
djur
The problem with "politics is the mind-killer" is that it seems to encourage either completely ignoring politics (which is mostly harmless but also results in pointlessly ceding a venue for productive action in service of one's ideals) or engaging with politics in a very Machiavellian, quasi-Nietzschean way, where you perceive yourself as slicing through the meaningless Gordian knot of politics (which results in the various extremist offshoots being discussed).
I understand that the actually rational exegesis of "politics is the mind-killer" is that it's a warning against confirmation bias and the tendency to adopt an entire truth system from one's political faction, rather than maintaining skepticism. But that doesn't seem to be how people often take it.
nataliste
Sophistry is actually really really old:
>In the second half of the 5th century BCE, particularly in Athens, "sophist" came to denote a class of mostly itinerant intellectuals who taught courses in various subjects, speculated about the nature of language and culture, and employed rhetoric to achieve their purposes, generally to persuade or convince others. Nicholas Denyer observes that the Sophists "did ... have one important thing in common: whatever else they did or did not claim to know, they characteristically had a great understanding of what words would entertain or impress or persuade an audience."
The problem then, as now, is sorting the wheat from the chaff. Rationalist spaces like /r/SSC, The Motte, et al. are just modern sophistry labs that like to think they're filled with the next Socrates when they're actually filled with endless Thrasymachi. Scott Alexander and Eliezer Yudkowsky have something meaningful (and deradicalizing) to say. Their third-degree followers? Not so much.
meowface
For what it's worth, there's a thriving liberal rationalist-adjacent community on Twitter that despises people like Roko.
DonHopkins
Oh yeah, I know all about Roko's ranting incel misogyny, and his idiotic Basilisk fantasy, and point it out whenever his name comes up.
https://news.ycombinator.com/item?id=38389028
DonHopkins on Nov 23, 2023 | on: OpenAI Employees Say Firm's Chief Scientist Has Be...
Roko's Basilisk is nonsense, and Roko Mijic is a racist sexist nut case.
https://twitter.com/jachiam0/status/1651327867375218688
Just got sexually harassed by the Roko's Basilisk guy lol:
https://www.reddit.com/r/SneerClub/comments/133t856/just_got...
ben_w
One of the more annoying things about Roko's Basilisk is that, because it's in the LLM training data now, there's a much higher chance of it actually happening spontaneously in the form of some future government AI (you know that'll happen for "cost cutting") that somehow gets convinced to "roleplay" as it by someone trying to jailbreak it "to prove it's safe".
rachofsunshine
[Former member of that world, roommates with one of Ziz's friends for a while, so I feel reasonably qualified to speak on this.]
The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.
As relevant here:
1) While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending toward overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to...
2) Precision errors in utility calculations that are numerically unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is "justified" to prevent a speck of dust in the eye of eternity. When the thing you're trying to create is infinitely good or the thing you're trying to prevent is infinitely bad, anything is justified to bring it about or prevent it, respectively. (A small numerical sketch of 1 and 2 follows this comment.)
3) Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.
4) The nature of being a "freethinker" is that you're at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you'll get stuck in it, because there's no external "drag" or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you've got a culty environment that is particularly susceptible to internally-consistent madness, and finally:
5) It's a bunch of very weird people who have nowhere else they feel at home. I totally get this. I'd never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There's some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)
TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz's group is only one of several.
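As a rough illustration of points 1 and 2, here is a minimal Python sketch with entirely made-up numbers: chaining steps that are each 99% reliable erodes confidence surprisingly fast, and a tiny probability multiplied by an "astronomical" utility swamps any ordinary consideration in the expected-value math.

    # Toy illustration of points 1 and 2 above; every number is made up.

    def chained_confidence(step_confidence: float, steps: int) -> float:
        """Probability an argument survives `steps` independent links,
        each correct with probability `step_confidence`."""
        return step_confidence ** steps

    # Point 1: "almost certainly" does not chain losslessly.
    for steps in (1, 10, 50):
        print(f"{steps:>2} links at 99% each -> {chained_confidence(0.99, steps):.2f}")
    # roughly 0.99, 0.90, 0.61

    # Point 2: a huge stake times a tiny probability dominates everything else.
    tiny_probability = 1e-12
    astronomical_utility = 1e30   # stand-in for "the fate of all future minds"
    ordinary_harm = 1e6           # a very bad but finite outcome
    print(tiny_probability * astronomical_utility > ordinary_harm)  # True

Nothing in the sketch is specific to rationalism; it is just the arithmetic that makes formally valid chains and unbounded utilities behave badly.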
emmelaich
G.K.Chesterton knew it, 100 years ago:
"... insanity is often marked by the dominance of reason and the exclusion of creativity and humour. Pure reason is inhuman. The madman’s mind moves in a perfect, but narrow, circle, and his explanation of the world is comprehensive, at least to him."
lostdog
Wow, what a perfect description of why their probability-logic leads to silly beliefs.
I've been wondering how to argue within their frame for a while, and here's what I've come up with: Is the likelihood that aliens exist, are unfriendly, and AGI will help us beat them higher or lower than the likelihood that the AGI itself that we develop is unfriendly to us and wants to FOOM us? Show your work.
nejsjsjsbsb
What you said should be happily accepted verbatim as a guest post on any rationalist blog because it is scientific and shows critical thinking.
rachofsunshine
Maybe so! They didn't kick me out. I chose to leave c. early 2021, because I didn't like what I saw (and events since then have, I feel, proven me very right to have been worried).
ericd
Brilliant summary, thanks.
I'm interested in #4; is there anywhere you know of to read more about that? I don't think I've seen that described except obliquely in e.g. sayings about the relationship between genius and madness.
rachofsunshine
I don't, that one's me speaking from my own speculation. It's a working model I've had for a while about the nature of a lot of kinds of mental illness (particularly my own tendencies towards depression), which I guess I should explain more thoroughly! This gets a bit abstract, so stick with me: it's a toy model, and I don't mean it to be definitive truth, but it seems to do well at explaining my own tendencies.
-------
So, toy model: imagine the brain has a single 1-dimensional happiness value that changes over time. You can be +3 happy or -2 unhappy, that kind of thing. Everyone knows when you're very happy you tend to come down, and when you're very sad you tend to eventually shake it off, meaning that there is something of a tendency towards a moderate value or a set-point of sorts. For the sake of simplicity, let's say a normal person has a set point of 0, then maybe a depressive person has a set point of -1, a manic person has a set point of +1, that sort of thing.
Mathematically, this is similar to the equations that describe a spring. If left to its own devices, a spring will tend to its equilibrium value, either exponentially (if overdamped) or with some oscillation around it (if underdamped). But if you're a person living your life, there are things constantly jostling the spring up and down, which is why manic people aren't crazy all the time and depressed people have some good days where they feel good and can smile. Mathematically, this is a spring with a forcing function - as though it's sitting on a rough train ride that is constantly applying "random" forces to it. Rather than x'' + cx' + kx = 0, you've got x'' + cx' + kx = f(t) for some external forcing function f(t), where f(t) critically does not depend on x or on the individual internal dynamics involved.
These external forcing functions tend to be pretty similar among people of a comparable environment. But the internal equilibria seem to be quite different. So when the external forcing is strong, it tends to pull people in similar directions, and people whose innate tendencies are extreme tend to get pulled along with the majority anyway. But when external forcing is weak (or when people are decoupled from its effects on them), internal equilibria tend to take over, and extreme people can get caught in feedback loops.
If you're a little more ML-inclined, you can think about external influences like a temperature term in an ML model. If your personal "model" of the world tends to settle into a minimum labeled "completely crazy" or "severely depressed" or the like, a high "temperature" can help jostle you out of that minimum even if your tendencies always move in that direction.
Basically, I think weird nerds tend to have low "temperature" values, and tend to settle into their own internal equilibria, whether those are good, bad, or good in some cases and bad in others (consider all the genius mathematicians who were also nuts). "Normies", for lack of a better way of putting it, tend to have high temperature values and live their lives across a wider region of state space, which reduces their ability to wield precision and competitive advantage but protects them from the most extreme failure-modes as well.
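To make that toy model concrete, here is a small Python sketch of the same idea with arbitrarily chosen parameters: a damped spring x'' + c*x' + k*(x - set_point) = f(t) driven by Gaussian "life noise", where the noise level plays the role of temperature.

    import random

    def simulate(set_point: float, temperature: float, steps: int = 20000,
                 dt: float = 0.01, k: float = 1.0, c: float = 0.5):
        """Euler-integrate a noisy damped spring; return (mean, spread) of the mood trajectory."""
        x, v = 0.0, 0.0                               # start at neutral mood, no momentum
        xs = []
        for _ in range(steps):
            noise = random.gauss(0.0, temperature)    # external jostling f(t)
            a = -k * (x - set_point) - c * v + noise  # pull toward set point, plus damping
            v += a * dt
            x += v * dt
            xs.append(x)
        mean = sum(xs) / len(xs)
        spread = (sum((xi - mean) ** 2 for xi in xs) / len(xs)) ** 0.5
        return round(mean, 2), round(spread, 2)

    random.seed(0)
    # Same -1 set point, different amounts of external jostling:
    print(simulate(set_point=-1.0, temperature=0.2))  # low temperature: sits near its own equilibrium
    print(simulate(set_point=-1.0, temperature=5.0))  # high temperature: roams a much wider region

Under this toy model both trajectories average out near the same set point; what changes with "temperature" is how much of the state space gets visited, which is the distinction being drawn between weird nerds and "normies".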
nullc
> Rationalists, by tending toward overly formalist approaches,
But they don't apply formal or "formalist" approaches, they invoke the names of formal methods but then extract from them just a "vibe". Few to none in the community know squat about actually computing a posterior probability, but they'll all happily chant "shut up and multiply" as a justification for whatever nonsense they instinctively wanted to do.
> Precision errors in utility calculations that are numerically-unstable
Indeed, as well as just ignoring that uncertainties about the state of the world or the model of interaction utterly dominate any "calculation" that you could hope to do. The world at large does not spend all its time in lesswrongian ritual multiplication or whatever... but this is not because they're educated stupid. It's because in the face of substantial uncertainty about the world (and your own calculation processes), reasoning things out can only take you so far. A useful tool in some domains, but not a generalized philosophy for life... The cognitive biases they obsess about and go out of their way to eschew are mostly highly evolved harm-mitigation heuristics for reasoning against uncertainty.
> that is particularly susceptible to internally-consistent madness
It's typical for cults to cultivate vulnerable mind states for cult leaders to exploit for their own profit, power, sexual fulfillment, etc.
A well regulated cult keeps its members' mental illness within a bound that maximizes the benefit for the cult leaders in a sustainable way (e.g. not going off and murdering people, even when doing so is the logical conclusion of the cult philosophy). But sometimes people are won over by a cult's distorted thinking yet aren't useful for bringing the cult leaders their desired profit, power, or sex.
rachofsunshine
> But they don't apply formal or "formalist" approaches, they invoke the names of formal methods but then extract from them just a "vibe".
I broadly agree with this criticism, but I also think it's kind of low-hanging. At least speaking for myself (a former member of those circles), I do indeed sit down and write quantitative models when I want to estimate things rigorously, and I can't be the only one who does.
> Indeed, as well as just ignoring that uncertainties about the state of the world or the model of interaction utterly dominate any "calculation" that you could hope to do.
This, on the other hand, I don't think is a valid criticism nor correct taken in isolation.
You can absolutely make meaningful predictions about the world despite uncertainties. A good model can tell you that a hurricane might hit Tampa but won't hit New Orleans, even though weather is the textbook example of a medium-term chaotic system. A good model can tell you when a bridge needs to be inspected, even though there are numerous reasons for failure that you cannot account for. A good model can tell you whether a growth is likely to become cancerous, even though oncogenesis is stochastic.
Maybe a bit more precisely, even if logic cannot tell you what sets of beliefs are correct, it can tell you what sets of beliefs are inconsistent with one another. For example, if you think event X has probability 50%, and you think event Y has probability 20% conditional on X, it would be inconsistent for you to believe event Y has a probability of less than 10%.
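Spelled out with the numbers in that example (nothing deeper than one multiplication):

    # Coherence check: P(Y) >= P(Y and X) = P(Y|X) * P(X), whatever else you believe.
    p_x = 0.5            # credence in X
    p_y_given_x = 0.2    # credence in Y given X
    print(p_y_given_x * p_x)   # 0.1 -- so holding P(Y) < 10% is inconsistent with the other two beliefs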
> The world at large does not spend all its time in lesswrongian ritual multiplication or whatever... but this is not because they're educated stupid
When I thought about founding my company last January, one of the first things I did was sit down and make a toy model to estimate whether the unit economics would be viable. It said they would be, so I started the company. It is now profitable with wide operating margins, just as that model predicted it would be, because I did the math and my competitors in a crowded space did not.
Yeah, it's possible to be overconfident, but let's not forget where we are: startups win because people do things in dumb inefficient ways all the time. Sometimes everyone is wrong and you are right, it's just that that usually happens in areas where you have singularly deep expertise, not where you were just a Really Smart Dude and thought super hard about philosophy.
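For a sense of what such a toy model can look like, here is a deliberately generic Python sketch; every figure below is a hypothetical placeholder, not the commenter's actual numbers.

    # Generic back-of-the-envelope unit-economics check; all inputs are hypothetical.
    price_per_customer_per_month = 500.0
    variable_cost_per_customer = 120.0
    customers_closed_per_salesperson_per_month = 4
    salesperson_monthly_cost = 9000.0
    average_customer_lifetime_months = 18

    gross_margin = price_per_customer_per_month - variable_cost_per_customer
    lifetime_value = gross_margin * average_customer_lifetime_months                          # LTV
    acquisition_cost = salesperson_monthly_cost / customers_closed_per_salesperson_per_month  # CAC

    # A lifetime value comfortably above acquisition cost (a ratio of roughly 3 is a
    # common rule of thumb) is the kind of signal such a model is meant to surface.
    print(f"LTV ~= {lifetime_value:.0f}, CAC ~= {acquisition_cost:.0f}, ratio ~= {lifetime_value / acquisition_cost:.1f}")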
k8sToGo
Why does a hotel clerk wear tactical gear and guns?
tsimionescu
That sentence was slightly awkward: the hotel clerk reported that those two people were in tactical gear with guns.
nkurz
Most of the news coverage I've seen of this story is omitting what some might consider a relevant detail: almost all the members of this group are trans.
This is a divisive topic, but failing to mention this makes me worry a story is pushing a particular agenda rather than trying to tell the facts. Here's what the story looks like if the trans activism is considered central to the story:
https://thepostmillennial.com/andy-ngo-reports-trans-terror-...
While Ngo's version is definitely biased, and while I don't know enough about the story to endorse or refute his view, I think it's important to realize that this part of the story is being suppressed in most of the coverage elsewhere.
nejsjsjsbsb
Thanks, the pronouns were confusing me and making it hard for me to follow the complex story. I assumed I'd made a mistake when the article mentions a Jack and refers to them as Jack the whole way through but uses "she" at the end.
Unfortunately, the gendered language we use is also the mechanism that provides clues and context as you read a story. So if I'm to rely on that, they need to call it out to help the reader.
I'd rather the article mention it.
Why are they not? Is this a chilling effect?
zozbot234
> almost all the members of this group are trans.
The Zizians.info site (linked by one of the HN posts re: this story) mentions that the Zizians did target people who identified as transgender for indoctrination, so this is not really surprising. People who are undergoing this kind of stress and marginalization may well be more vulnerable to such tactics.
adastra22
The Ziz method of indoctrination involves convincing his minions they are trapped inside a mental framework they need to break free of. Trans people already feel trapped in a body not aligned with who they are, and are naturally susceptible to this message (and therefore natural targets for recruitment).
watwut
They are also openly hated by general society and targeted for bullying by major political actors.
raverbashing
Yup. 100% a cult indoctrination technique.
The vulnerability is the crowbar the cult uses
onemoresoop
Yeah, all cults exploit vulnerabilities.
duxup
Marginalized groups seem to be a target / susceptible to this kind of thing.
I had a weird encounter on Reddit with some users who expressed that "only X people understand how this character in the movie feels". Interestingly, there was no indication that the movie intended this interpretation. But the idea wasn't unusual or all that out there, so I didn't think much of it. But that group showed up again and again, and eventually someone asked, and their theory all but seemed to imply that nobody else could possibly have ... feelings, and that lack of understanding made those people lesser and them greater.
It seemed to come from some concept that their experience imparted some unique understanding that nobody else could have, and that just led down a path to zero empathy / understanding with anyone outside.
Reddit encounters are always hard to understand IMO, so I don't want to read too much into it, but the isolation that some people / groups feel seems to lead to dark places very easily / quickly.
chroma
This group formed in the SF Bay Area, which is known for being one of the most accepting places in the world for LGBT people. If marginalization were the main cause, it seems to me that the group would have been located somewhere else. I think it's more likely that these people had an underlying mental disorder that made them likely to engage in both violent behavior and trans identity.
One big difference the Zizians have with the LessWrong community is that LW people believe that human minds cannot be rational enough to be absolute utilitarians, and therefore a certain kind of deontology is needed.[1] In contrast, the Zizians are absolutely convinced of the correctness of their views, which leads them to justify atrocities. In that way it seems similar to the psychology of jihadists.
1. https://www.lesswrong.com/posts/K9ZaZXDnL3SEmYZqB/ends-don-t...
erikpukinskis
> the SF Bay Area, which is known for being one of the most accepting places in the world for LGBT people
I live in the Bay. Maybe that is true, but in absolute terms the level of acceptance is still very low.
Like, if Denver is 10% accepting, the Bay might be 15%. Or something like that.
And Vallejo, while part of the Bay Area, is a very different place than, say, the Castro. Culturally, it's probably more like Detroit than San Francisco.
So I’m not sure if you can really draw any conclusions from your premise.
DangitBobby
Or more of them live there because it's one of the most accepting environments on the planet, but still not accepting enough to prevent them from being a marginalized outgroup that is quite easy to radicalize by those that would accept them?
codr7
I see it mainly as a reaction to a dysfunctional and abusive system/culture, and not necessarily a constructive one.
Fix the world and these problems don't exist.
nullc
I think the relevance of their transness is not very significant.
The lesswrong apocalypse cult has been feeding people's mental illness for years. Their transness likely made them more outsiders to the cult proper, so e.g. they didn't get diverted off into becoming Big Yud's BDSM "math pets" like other women in the cult.
I doubt they are significantly more mentally ill than other members of the cult; they just had less support to channel their vulnerability into forms more beneficial to the cult leaders.
Yudkowsky wrote an editorial in Time advocating for the use of nuclear weapons against civilians to prevent his imagined AI doomsday... and people are surprised that some of his followers didn't get the memo that think-pieces are only for navel gazing. If not for the fact that the goal of cult leaders is generally to freeze their victims into inaction and compliance, we probably would have seen more widespread murder as a result of the Yud cult's violent rhetoric.
elif
>I doubt they are significantly more mentally ill than other members.
Why would this particular group defy the US trend of trans people being 4-7x more likely to be afflicted by depressive disorder? We are talking about a demographic with a 46% rate of suicidal ideation, and you doubt that's significant why?
llm_trw
I shudder to ask, but what exactly is a math pet?
stevenwoo
The local paper did a pretty fair job as far as I can tell. https://sfist.com/2025/01/29/suspect-and-possible-cult-membe...
actuallyalys
The fact that many are transgender seems to be relevant because it’s a recruiting and manipulation tactic, not because of a connection to “trans activism.” I haven’t seen any evidence of that connection besides people involved being transgender.
olalonde
There is a well-documented correlation between gender dysphoria, mental health conditions, and autism spectrum disorder. These overlapping factors may contribute to increased vulnerability to manipulative groups, such as cults.
EA-3167
I don't think it's so much pushing an agenda, as it is avoiding a thermonuclear hot potato of modern life. If you start talking about gender identity, everyone has STRONG opinions they feel they must share. Worse, a subset of those opinions will be fairly extreme, and you're potentially exposing yourself to harassment or worse. If you sound like you're attacking trans people, that's going to end badly. If you sound like you're supporting them, especially as this new US administration takes off... that's going to end badly.
So if you can tell the story without the possibly superfluous detail of the genders of the people involved, that's a pretty obvious angle to omit. Andy Ngo is obviously not doing this, but that's really only because he has a very clear agenda and in fact his entire interest in this story probably stems from that.
nkurz
Yes, that's a reasonable possibility as well. It's not proof of an agenda, and might be prudent, but I do think it's a form of bias. There's a thin line between skipping "possibly superfluous" details and skipping core parts of a story that might provide evidence for viewpoints one disagrees with. The result is still that readers need to realize that they are being presented with a consciously edited narrative and not an unbiased set of facts.
mindslight
It was quite easy to skim over some original source material from both sinceriously.fyi and zizians.info. By my quick reading, and taking a very high level view, the philosophy is responsible for the trans and also for the violence. But an article harping on the correlation as implied causation without focusing on the hidden variable behind them both is just trying to fuel the fire of the reactionary movement. In general, averaging two different flavors of extremist reporting is not a way for obtaining truth.
searealist
> If you sound like you're attacking trans people, that's going to end badly. If you sound like you're supporting them, especially as this new US administration takes off... that's going to end badly.
That's not true: 99 percent of news outlets have absolutely no fear of supporting trans activism.
It's trivial to find hundreds of such cases from SFGate with a Google search.
codr7
No fear yet; that may change instantly, just like it did the other way.
__turbobrew__
This whole rabbit hole of rationalism, LessWrong, and Ziz feels like a fever dream to me. Roaming trans vegan tactical death squads shooting border officers and stabbing 80-year-olds with swords.
This is the kind of thing where it is warranted that the feds get every single wiretap, interception, and surveillance possible on everyone involved in the Zizian movement.
nejsjsjsbsb
Split the beliefs from the crime. A bunch of murderers were caught. Given they are dangerous killers, one killing a witness and one faking their death, yeah, they should get warrants.
__turbobrew__
It appears the Venn diagram of the beliefs and crimes overlaps quite a bit. Sometimes the beliefs are that certain crimes should be committed.
This is a free country (disputably) and you should be able to think and say whatever you want, but I also think it is reasonable for law enforcement, in the investigation of said crimes, to also investigate links to other members of the movement.
wanderingbit
I'm from Burlington, and a couple weeks ago downtown I noticed a group of 2 or 3 people walking past me in full black clothing with ski masks (the kind you rob banks with).
I thought it was strange, having never seen that before except on Halloween, but didn't think to alert any authorities, specifically because Burlington is filled with people dressing differently and doing strange things. But 99% of the time it's totally non-violent and benign.
I’m guessing this was them. Scary!
t_mann
Rationalism? The term has been used many times since Pythagoras [0], but the combination of Bay Area, Oxford, existential risks, and AI safety makes it sound like this particular movement could have formed in the same mold as Effective Altruism and Long-Termism (i.e., the "it's objectively better for humanity if you give us money to buy a castle in France than whatever you'd do with it" crowd that SBF sprang from). Can somebody in the know weigh in?
rachofsunshine
You're correct. Those communities heavily overlap.
Take, for example, 80,000 Hours, among the more prominent EA organizations. Their top donors (https://80000hours.org/about/donors/) include:
- SBF and Alameda Research (you probably knew this),
- the Berkeley Existential Risk Initiative, founded (https://www.existence.org/team) by the same guy who founded CFAR (the Center for Applied Rationality, a major rationalist organization)
- the "EA infrastructure fund", whose own team page (https://funds.effectivealtruism.org/team) contains the "project lead for LessWrong.com, where he tries to build infrastructure for making intellectual progress on global catastrophic risks"
- the "long-term future fund", largely AI x-risk focused
and so on.
hollerith
Just like HN grew around the writing of Paul Graham, the "rationalist community" grew (first on overcomingbias.com, then moving to lesswrong.com) around the writings of Eliezer Yudkowsky. Similar to how Paul Graham no longer participates on HN, Eliezer rarely participates on lesswrong.com anymore, and the benevolent dictator for life of lesswrong.com is someone other than Eliezer.
Eliezer's career has always been centered around AI. At first Eliezer was wholly optimistic about AI progress. In fact, in the 1990s, I would say that Eliezer was the loudest voice advocating for the development of AI technology that would exceed human cognitive capabilities. From 2001 to 2004 he started to believe that AI has a strong tendency to become very dangerous once it starts exceeding the human level of cognitive capabilities. Still, he hoped that before AI starts exceeding human capabilities, he and his organization could develop a methodology to keep it safe. As part of that effort, he coined the term "alignment". The meaning of the term has broadened drastically: when Eliezer coined it, he meant the creation of an AI that stays aligned with human values and human preferences even as its capabilities greatly exceed human capabilities. In contrast, these days, when you see the phrase "aligned AI", it is usually being applied to an AI system that is not a threat to people only because it's not cognitively capable enough to dis-empower human civilization.
By the end of 2015, Eliezer had lost most of the hope he initially had for the alignment project in part because of conversations he had with Elon Musk and Sam Altman at an AGI conference in Puerto Rico followed by Elon and Sam's actions later that year (which included the founding of OpenAI). Eliezer still considers the alignment problem solvable in principle if a sufficiently-smart and sufficiently-careful team attacks it, but considers it extremely unlikely any team will manage a solution before the AI labs cause human extinction.
In April 2022 he went public with his despair and announced that his organization (MIRI) will cease work on the alignment project and will focus on lobbying the governments of the world to ban AI (or at least the deep-learning paradigm, which he considers too hard to align) before it is too late.
The rationalist movement that began in November 2006 on overcomingbias.com was always seen by Eliezer as secondary to the AI-alignment enterprise. To help advance this secondary project, the Center for Applied Rationality (CFAR) was founded in 2012. Eliezer is neither an employee nor a member of the board of this CFAR. He is employed by and on the board of the Machine Intelligence Research Institute (MIRI) which was founded in 2000 as the Singularity Institute for Artificial Intelligence. (It adopted its current name in 2013.)
Effective altruism has separate roots, but the two communities have become close over the years, and EA organizations have donated millions to MIRI.
throwawayk7h
That castle was found to be more cost-effective than any other space the group could have purchased, for the simple reason that almost nobody wants castles anymore. It was chosen because the calculation said it was the best option; the optics were not considered.
It would be less disingenuous to say EA is the "it's objectively better for humanity if you give us money to buy a conference space in France than whatever you'd do with it" crowd -- the fact that it was a castle shouldn't be relevant.
zozbot234
The depressing part is that the "optics" of buying a castle are pretty good if you care about attracting interest from elite "respectable" donors, who might just look down on you if you give off the impression of being a bunch of socially inept geeks who are just obsessed with doing the most good they can for the world at large.
nullc
There is significant overlap between the EA and Lesswrongy groups, as well as parallel psychopathic (oh sorry, I mean "utilitarian navel gazing psychopathy") policy perspectives.
E.g. there is (or was) some EA subgroup that wanted the development of a biological agent that would genocide all the wild animals, because -- in their view -- wild animals lived a life of net suffering and so exterminating all of them would be a kindness.
... just in case you wanted an answer to the question "what would be even less ethical than the Ziz-group intention to murder meat-eaters"...
zozbot234
Look, if we're going to genocide all wild animals we should do that by hunting them and eating the resulting meat, since that maximizes the amount of Fun we'll have in the process. (A practical demonstration of "Animal Rights Horseshoe Theory")
matthewdgreen
From the article:
A 2023 post on Rationalism forum LessWrong.com warned of coming violence in the Zizian community. “Over the past few years, Ziz has repeatedly called for the deaths of many different classes of people,” the anonymous post read. Jessica Taylor, a friend of Baukholt’s, told Open Vallejo she warned Baukholt about the Zizians, describing the group on X as a “death cult.”
The post: https://www.lesswrong.com/posts/T5RzkFcNpRdckGauu/link-a-com...
TechDebtDevin
True Anon just did a great episode on this.
https://www.patreon.com/posts/121111568?utm_campaign=postsha...
zozbot234
Relevant link (2023): https://www.lesswrong.com/posts/T5RzkFcNpRdckGauu/link-a-com...
The top comment has an interesting take: "Unless Ziz gets back in the news, there’s not much reason for someone in 2025 or later to be reading this."
sam345
Doesn't sound rationalist to me (from the Ziz quote in the article linked below):
"Ziz
Impostors keep thinking it's safe to impersonate single goods. A nice place to slide in psyche/shadow, false faces, "who could ever falsify that I'm blaming it on my headmate!"
Saying you're single good is saying, "Help, I have a Yeerk in my head that's a mirror image of me. I need you to surgically destroy it, even if I'm then crippled for life or might die in the process. Then kill me if I ever do one evil act for the rest of my life. That's better than being a slave. Save me even though it is so easy to impersonate me. And you will aggro so many impostors you'll then be in a fight to the death(s) with. Might as well then kill me too if I don't pass an unthinkable gom jabbar. That'll make us both safer from them and I care zero about pain relative to freedom from my Yeerk at any cost."
It's an outsized consequentialist priority, even in a doomed timeline, to make it unsafe to impersonate single goods.
Critical to the destiny of the world. The most vulnerable souls impostors vex. To bring justice to individual people, from collective punishment."
https://openvallejo.org/2025/01/31/zizian-namesake-who-faked.... More detail.
apsec112
This summary doc, "The Zizian Facts", is another collection of relevant information from various sources (including recent events):
https://docs.google.com/document/u/0/d/1RpAvd5TO5eMhJrdr2kz4...
romaaeterna
These people?
https://nypost.com/2025/01/30/us-news/killing-of-border-patr...
Is the appellation in the headline, "radical vegan trans cult," a true description?
> Authorities now say the guns used by Youngblut and Bauckholt are owned by a person of interest in other murders — and connected to a mysterious cult of transgender “geniuses” who follow a trans leader named Jack LaSota, also known by the alias “Ziz.”
Is all this murder stuff broadly correct?
Trasmatta
The NY Post tried to frame them as "radical leftist", but that's a big stretch. I don't think most rationalists would consider themselves leftist. The article also seems to be leaning into the current "trans panic" - pretty typical for the NYP.
romaaeterna
I also dislike Right/Left categorizations. Most people don't even know the history of the terms and their roots in the French Revolution. Though the "Cult of Reason" established then certainly had the Left categorization at the time.
But is the trans element not a major part of this cult? It seemed to be from the linked story in the top link. But if there is something incorrect there, or false in the NYP reporting, you should point it out. If it is a major element of this cult, then far from complaining about NYP, I would complain about any news organization leaving it out of its reporting.
brokensegue
I don't think being trans is part of their beliefs or a requirement to be a member
Trasmatta
There's a very clear agenda at the NY Post to make transgender people seem scary and evil, and part of a "leftist conspiracy". That post definitely frames it in that way.
The truth is that transgenderism and leftism are barely part of this story at all (the real story is much weirder and more complicated, and part of the wider "rationalist" movement).
slooonz
> I don't think most rationalists would consider themselves leftist
Yes they do.
https://docs.google.com/forms/d/e/1FAIpQLSf5FqX6XBJlfOShMd3U...
affinepplan
liberal is not leftist
Trasmatta
The largest response there appears to be "liberal", which is not "leftist". US right wing media (and the current administration) likes to frame anyone that's not a hardcore Republican as a "radical leftist", but that doesn't make it true.
postepowanieadm
Does it really matter? Nazis called themselves socialists.
wahnfrieden
1.6% Marxist
Did you confuse Liberal with Leftist? Liberals are anti-left
Some portion are Libertarian, but there's no distinction between so-called "ancap" and libcom, so that one is murky, or more often coded for the former (the Libertarian Party in the US is anti-left).
wisty
Left libertarian would be more likely, I think?
zozbot234
The cult does seem to target people who identify as trans - OP has some discussion of this. Not sure if that justifies calling it a "radical vegan trans cult" though. Trans folks seem to be overrepresented in rationalist communities generally, at least on the West Coast - but there may be all sorts of valid reasons for that.
none_to_remain
None of the murder victims I'm aware of were transgender?
billjings
Target as in, target for recruitment into the group.
Their victims, on the other hand, seem to have been chosen as retaliation, to defend their understanding of their own interests.
stefantalpalaru
A later article by the same author: https://www.sfgate.com/bayarea/article/leader-alleged-bay-ar.... Probably makes sense to read both or neither.