String of recent killings linked to Bay Area 'Zizians'
951 comments · January 30, 2025
rachofsunshine
[Former member of that world, roommates with one of Ziz's friends for a while, so I feel reasonably qualified to speak on this.]
The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.
As relevant here:
1) While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, tending toward overly formalist approaches, lose the thread of the messiness of the real world and follow these lossy implications as though they were lossless. That leads to...
2) Precision errors in utility calculations that are numerically unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is "justified" to prevent a speck of dust in the eye of eternity. When the thing you're trying to create is infinitely good or the thing you're trying to prevent is infinitely bad, anything is justified to bring it about or prevent it, respectively. (A toy numeric sketch of points 1 and 2 follows at the end of this comment.)
3) Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.
4) The nature of being a "freethinker" is that you're at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you'll get stuck in it, because there's no external "drag" or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you've got a culty environment that is particularly susceptible to internally-consistent madness, and finally:
5) It's a bunch of very weird people who have nowhere else they feel at home. I totally get this. I'd never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There's some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)
TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz's group is only one of several.
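As promised above, a toy illustration of points 1 and 2 (rough, made-up numbers of my own, just to make the failure modes concrete):

    # Point 1: "almost certainly" compounds. Even at 95% confidence per step,
    # a modest chain of implications leaves you near coin-flip territory.
    p_step = 0.95
    for n in [1, 5, 10, 20]:
        print(f"{n} steps: confidence <= {p_step ** n:.3f}")
    # 1 -> 0.950, 5 -> 0.774, 10 -> 0.599, 20 -> 0.358

    # Point 2: one huge-payoff term swamps everything else in the ledger,
    # so the "calculation" outputs whatever the huge term says.
    p_doom, utility_doom = 1e-12, 1e30   # hypothetical magnitudes
    everything_else = 1_000.0
    print(p_doom * utility_doom + everything_else)  # ~1e18: the speck of dust
    # times eternity outvotes murder and everything else on the ledger.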
emmelaich
G.K.Chesterton knew it, 100 years ago:
"... insanity is often marked by the dominance of reason and the exclusion of creativity and humour. Pure reason is inhuman. The madman’s mind moves in a perfect, but narrow, circle, and his explanation of the world is comprehensive, at least to him."
wcfrobert
Or David Hume, 300 years ago:
"Reason is, and ought only to be, the slave of the passions"
MichaelZuo
To be fair, it did work out surprisingly well in the early days; even the really weird comment chains attracted only a small minority of the bizarrely deranged. Probably because back then the median LW commentator was noticeably smarter than the median HN commentator.
Pascal's mugging was even coined there, I believe. But then, as it grew, whatever communal anti-derangement protections existed gradually declined.
And now it is more often than not a negative example.
noduerme
I'll take Kurt Vonnegut, from "Mother Night":
I have never seen a more sublime demonstration of the totalitarian mind, a mind which might be likened unto a system of gears whose teeth have been filed off at random. Such a snaggle-toothed thought machine, driven by a standard or even by a substandard libido, whirls with the jerky, noisy, gaudy pointlessness of a cuckoo clock in Hell.
mikhmha
So did Fyodor Dostoevsky.
sublimefire
Yes, exactly: “Crime and Punishment”, “Demons”, and others. Some of the dialogues are precisely about these ideologies, how different characters think with and apply them, and how reason manifests in violence.
bowsamic
Even Aristotle knew that reason was just an aspect of being a human and not the whole thing
To be honest the only philosopher I know of who convincingly argued that everything is reason is Hegel, but he did so more by making the idea of reason so broad that even empiricism (and emotion, humour, love, the body, etc.) falls under it...
01100011
Another thing I'll add after having spent a few years in a religious cult:
It's all about the axioms. If you tweak the axioms you can use impeccable logic to build a completely incorrect framework that will appeal to otherwise intelligent (and often highly intelligent) people.
Also, people are always less rational than they are able to admit. The force of things like social connection can very easily warp the reasoning capabilities of the most devout rationalist (although they'll likely never admit that).
simplicio
I'm kinda skeptical these folks were following some hyper-logical process from flawed axioms that led them to the rigorous solution: "I should go stab our landlord with a samurai sword" or "I should start a shootout with the Feds".
The rationalist stuff just seems like meaningless patter they stuck on top of more garden-variety cult stuff.
dartos
If you convince someone smart who errs towards logic of your axioms, they’ll convince themselves to do whatever you want.
rng-concern
The axioms of rationality, morality, etc. I've always found interesting.
We have certain axioms (let me choose an arbitrary, and possibly not quite axiomy-enough, example): "human life has value". We hold this to be self-evident and construct our society around it.
We also often don't realize that other people and cultures have different axioms of morality. We talk/theorize at this high level, but don't realize we have different foundations.
CPLX
Right, and a related problem is a lot of the logic is more like Zeno’s paradox.
janderson215
Succinct. This should be handed out as a “Signs you’re being manipulated” flyer to young people.
lostdog
Wow, what a perfect description of why their probability-logic leads to silly beliefs.
I've been wondering how to argue within their frame for a while, and here's what I've come up with: Is the likelihood that aliens exist, are unfriendly, and AGI will help us beat them higher or lower than the likelihood that the AGI itself that we develop is unfriendly to us and wants to FOOM us? Show your work.
nkrisc
It’s pointless. They aren’t rational. Any argument you come up with that contradicts their personal desires will be successfully “reasoned” away by them because they want it to be. Your mistake was ever thinking they had a rational thought to begin with; they think they are infallible.
HeatrayEnjoyer
"Widespread robots that make their own decisions autonomously will probably be very bad for humans if they make decisions that aren't in our interest" isn't really that much of a stretch is it?
If we were going slower maybe it would seem more theoretical. But there are multiple Manhattan-Project-level or (often, much) larger efforts ongoing to explicitly create software and robotic hardware that makes decisions and takes actions without any human in the loop.
We don't need some kind of 10000 IQ god intelligence if a glitch token causes the majority of the labor force to suddenly and collectively engage in sabotage.
shadowgovt
Much of philosophy throughout history seems to operate this way.
I think philosophy is a noble pursuit, but it's worth noting how often people drew very broad conclusions, and then acted on them, from not very much data. Consider the dozens of theories of the constitution of the world from the time of the Greek thinkers (even the atomic theory doesn't look very much at all like atoms as we now understand them), or the myriad examples of political philosophies that ran up against people simply not acting the way the philosophy needed them to act to cohere.
The investigation of possibility is laudable, but a healthy and regular dose of evidence is important.
implements
> Much of philosophy throughout history seems to operate this way.
“Philosophy is poor at revealing truths but excellent at revealing falsehoods (or at least unsupported arguments)” was the main lesson I took from informally studying it.
suddenlybananas
The idea that you need evidence to justify your beliefs is a philosophical position.
yawboakye
very few philosophers dared to live by their theories. the famous failures of aristotle (in the case of alexander) and plato in syracuse (where he saw firsthand that the philosopher-king is at best a book character) are good examples. seneca didn’t live stoically: he was avaricious and didn’t scruple to provoke war over unpaid debts, if ancient sources are to be believed. he failed horribly with nero, who later instructed him to commit suicide for treasonous crimes. again, if ancient sources are to be believed, he fumbled his suicide out of raw fear.
the cynics, though, made a good life but that’s not because they had a better philosophy. it’s because cynicism is base/primitive logic available to the brute as well as the civilized man.
idunnoman1222
They think they can predict the future and, by extension, know what’s good for us. If they had the choice, you wouldn’t get a vote.
michaelkeenan
AGI would be extremely helpful in navigating clashes with aliens, but taking the time to make sure it's safe is very unlikely to make a difference to whether it's ready in time. Rationalists want AGI to be built, and they're generally very excited about it, e.g. many of them work at Anthropic. They just don't want a Move Fast and Break Things pace of development.
waveBidder
the term you're looking for is pascal's mugging, and it originates from within rationalism
IshKebab
It seems a bit nonsense to me.
> The mugger argues back that for any low but strictly greater than 0 probability of being able to pay back a large amount of money (or pure utility) there exists a finite amount that makes it rational to take the bet.
This is a basic logic error. It ignores the very obvious fact that increasing the reward amount decreases the probability that it will be returned.
E.g. if the probability of the reward R being returned is (0.5/R) we get "a low but strictly greater than 0 probability", and for that probability there is a (different) finite reward that would make it rational to take the bet, but it's not R.
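A quick numeric check of that construction (my own sketch): when the repayment probability falls off as 0.5/R, the expected value of accepting is flat in R, so inflating the promised reward buys the mugger nothing.

    # Expected value of taking the bet when the probability of the reward R
    # actually being returned falls off as 0.5 / R:
    for R in [10.0, 1e6, 1e12]:
        p = 0.5 / R
        print(f"R = {R:.0e}: P = {p:.1e}, EV = {p * R}")  # EV is always 0.5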
This is even simpler and more stupid than the "proofs" that 0=1. It does not change my opinion that philosophers (and lesswrong) are idiots.
throwaway4aday
It seems that you didn't understand the main point of the exposition. I'll summarize the OP's comment a bit further.
Points 1 and 2 only explain how they are able to erroneously justify their absurd beliefs, they don't explain why they hold those beliefs.
Points 3 through 5 are the heart of the matter; egotistical and charismatic (to some types of people) leaders, open minded, freethinking and somewhat weird or marginalized people searching for meaning plus a way for them all to congregate around some shared interests.
TLDR: perfect conditions for one or more cults to form.
api
No, it’s the “rationality.” Well maybe the people too, but the ideas are at fault.
As I posted elsewhere on this subject: these people are rationalizing, not rational. They’re writing cliche sci-fi and bizarre secularized imitations of baroque theology and then reasoning from these narratives as if they are reality.
Reason is a tool, not a magic superpower: it does not enable one to see beyond the bounds of available information, nor does it magically vaporize all biases.
Logic, like software and for the same reason, is “garbage in, garbage out.” If even one of the inputs (premises, priors) is mistaken the entire conclusion can be wildly wrong. Errors cascade, just like software.
That's why every step needs to be checked with experiment or observation before a next step is taken.
I have followed these people since stuff like Overcoming Bias and LessWrong appeared and I have never been very impressed. Some interesting ideas, but honestly most of them were recycling of ideas I’d already encountered in sci-fi or futurist forums from way back in the 1990s.
The culty vibes were always there and it instantly put me off, as did many of the personalities.
“A bunch of high IQ idiots” has been my take for like a decade or more.
rachofsunshine
> As I posted elsewhere on this subject: these people are rationalizing, not rational.
That is sometimes true, but as I said in another comment, I think this is on the weaker end of criticisms because it doesn't really apply to the best of that community's members and the best of its claims, and in either case isn't really a consequence of their explicit values.
> Logic, like software and for the same reason, is “garbage in, garbage out.” If even one of the inputs (premises, priors) is mistaken the entire conclusion can be wildly wrong. Errors cascade, just like software.
True, but an odd analogy: we use software to make very important predictions all the time. For every Therac-25 out there, there's a model helping detect cancer in MRI imagery.
And, of course, other methods are also prone to error.
> That's why every step needs to be checked with experiment or observation before a next step is taken.
Depends on the setting. Some hypotheses are not things you can test in the lab. Some others are consequences you really don't want to confirm. Setting aside AI risk for a second, consider the scientists watching the Trinity Test: they had calculated that it wouldn't ignite the atmosphere and incinerate the entire globe in a firestorm, but...well, they didn't really know until they set the thing off, did they? They had to take a bet based on what they could predict with what they knew.
I really don't agree with the implicit take that "um actually you can never be certain so trying to reason about things is stupid". Excessive chains of reasoning accumulate error, and that error can be severe in cases of numerical instability (e.g. values very close to 0, multiplications, that kind of thing). But shorter chains conducted rigorously are a very important tool to understand the world.
api
> "um actually you can never be certain so trying to reason about things is stupid"
I didn't mean to say that, just that logic and reason are not infallible and have to be checked. Sure we use complex software to detect cancer in MRI images, but we constantly check that this software works by... you know... seeing if there's actual cancer where it says there is, and if there's not we go back around the engineering circle and refine the design.
Let's say I use the most painstaking, arduous, careful methods to design an orbital rocket. I take extreme care to make every design decision on the basis of physics and use elaborate simulations to verify that my designs are correct. I check, re-check, and re-check. Then I build it. It's never flown before. You getting on board?
Obviously riding on an untested rocket would be insane no matter how high-IQ and "rational" its engineers tried to be. So is revamping our entire political, economic, or social system on the basis of someone's longtermist model of the future that is untestable and unverifiable. So is banning beneficial technologies on the basis of hypothetical dangers built on hypothetical reasoning from untestable priors. And so on...
... and so is, apparently, killing people, because reasons?
hollerith
>They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm".
Which leader said anything like that? Certainly not Eliezer or the leader of the Center for Applied Rationality (Anna Salamon) or the project lead of the web site lesswrong.com (Oliver Habryka)!
habryka
Hello, can confirm, criticism is like the bread and butter of LW, lol. I have very extensively criticized tons of people in the extended rationality ecosystem, and I have also never seen anyone in any leadership position react with anything like this quote. Seems totally made up.
throwaway-rat
I found Eliezer's Facebook post which OP likely was thinking of. https://www.facebook.com/yudkowsky/posts/pfbid0WN6GeX8S9DK9T...
> I feel like it should have been obvious to anyone at this point that anybody who openly hates on this community generally or me personally is probably also a bad person inside and has no ethics* and will hurt you if you trust them and will break rules to do so; but in case it wasn't obvious, consider the point made explicitly.
> (Let not this post be construed as casting aspersions on any of the many, many people who've had honest disagreements with me or us, including loud or heated or long ones, that they conducted by debates about ideas rather than insinuations about people.)
Creepy. But after people argued with Eliezer for a considerable time, he made 11 updates. The result was less shockingly bad:
> There's a certain cluster of behaviors and attitudes, which includes things like "getting excited about opportunities to make fun of furries" or uttering phrases like "group X is a bunch of neckbeards". Notably, this is not the same cluster as "strongly and vocally disagreeing with group X about idea Y". Call the first cluster "frobnitz".
> I feel like it should have been obvious to anyone at this point that anybody who openly frobnitzes me, or even more so frobnitzes this community, or even more so still frobnitzes a genuine cinnamon-roll-grade Level Ten Arch-Bodhisattva like Scott Alexander or Scott Aaronson, probably lacks an internal commitment to ordinary interpersonal ethical injunctions and will hurt you if you trust them and will break rules to do so. But in case it wasn't obvious, consider the point made explicitly. (Subtext: Topher Brennan. Do not provide any link in comments to Topher's publication of private emails, explicitly marked as private, from Scott Alexander.)
If this is the only evidence, the OP's allegation is exaggerated but not 'totally made up'.
throwaway-rat
I'm pretty sure I remember a post on Eliezer's Facebook from the early 2010s. I have definitely witnessed some... well - you know, 'culty' vibes and social pressure around Less Wrong.
nullc
> Rationalists, by tending to overly formalist approaches,
But they don't apply formal or "formalist" approaches, they invoke the names of formal methods but then extract from them just a "vibe". Few to none in the community know squat about actually computing a posterior probability, but they'll all happily chant "shut up and multiply" as a justification for whatever nonsense they instinctively wanted to do.
> Precision errors in utility calculations that are numerically-unstable
Indeed, as well as just ignoring that uncertainties about the state of the world or the model of interaction utterly dominate any "calculation" that you could hope to do. The world at large does not spend all its time in lesswrongian ritual multiplication or whatever... but this is not because they're educated stupid. It's because in the face of substantial uncertainty about the world (and your own calculation processes) reasoning things out can only take you so far. A useful tool in some domains, but not a generalized philosophy for life... The cognitive biases they obsess about and go out of their way to eschew are mostly highly evolved harm-mitigation heuristics for reasoning against uncertainty.
> that is particularly susceptible to internally-consistent madness
It's typical for cults to cultivate vulnerable mind states for cult leaders to exploit for their own profit, power, sexual fulfillment, etc.
A well-regulated cult keeps its members' mental illness within bounds that maximize the benefit for the cult leaders in a sustainable way (e.g. not going off and murdering people, even when doing so is the logical conclusion of the cult philosophy). But sometimes people are won over by a cult's distorted thinking yet aren't useful for bringing the cult leaders their desired profit, power, or sex.
rachofsunshine
> But they don't apply formal or "formalist" approaches, they invoke the names of formal methods but then extract from them just a "vibe".
I broadly agree with this criticism, but I also think it's kind of low-hanging. At least speaking for myself (a former member of those circles), I do indeed sit down and write quantitative models when I want to estimate things rigorously, and I can't be the only one who does.
> Indeed, as well as just ignoring that uncertainties about the state of the world or the model of interaction utterly dominate any "calculation" that you could hope to do.
This, on the other hand, I don't think is a valid criticism nor correct taken in isolation.
You can absolutely make meaningful predictions about the world despite uncertainties. A good model can tell you that a hurricane might hit Tampa but won't hit New Orleans, even though weather is the textbook example of a medium-term chaotic system. A good model can tell you when a bridge needs to be inspected, even though there are numerous reasons for failure that you cannot account for. A good model can tell you whether a growth is likely to become cancerous, even though oncogenesis is stochastic.
Maybe a bit more precisely, even if logic cannot tell you what sets of beliefs are correct, it can tell you what sets of beliefs are inconsistent with one another. For example, if you think event X has probability 50%, and you think event Y has probability 20% conditional on X, it would be inconsistent for you to believe event Y has a probability of less than 10%.
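In symbols, that consistency check is just:

    P(Y) \ge P(Y \cap X) = P(Y \mid X)\,P(X) = 0.2 \times 0.5 = 0.1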
> The world at large does not spend all its time in lesswrongian ritual multiplication or whatever... but this is not because they're educated stupid
When I thought about founding my company last January, one of the first things I did was sit down and make a toy model to estimate whether the unit economics would be viable. It said they would be, so I started the company. It is now profitable with wide operating margins, just as that model predicted it would be, because I did the math and my competitors in a crowded space did not.
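(For flavor, a stripped-down version of that kind of model, with made-up numbers rather than the real ones:)

    # Hypothetical unit economics: viable if contribution margin covers fixed costs.
    price, variable_cost = 500.0, 180.0            # per unit, assumed
    fixed_costs, units_per_month = 20_000.0, 90    # assumed
    profit = (price - variable_cost) * units_per_month - fixed_costs
    print(f"monthly operating profit: {profit:,.0f}")  # positive => viable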
Yeah, it's possible to be overconfident, but let's not forget where we are: startups win because people do things in dumb inefficient ways all the time. Sometimes everyone is wrong and you are right, it's just that that usually happens in areas where you have singularly deep expertise, not where you were just a Really Smart Dude and thought super hard about philosophy.
LeroyRaz
What you describe (doing basic market analysis) is pretty much unrelated to 'rationality.'
'Rationality' hasn't really made any meaningful contributions to human knowledge or thinking. The things you describe are all products of scientists and statisticians, etc...
Bayesian statistics is not rationality. It is just Bayesian statistics... And it is mathematicians who should get the credit, not Less Wrong!
nullc
Perhaps I overemphasized it, but a personal experience on that front was key to realizing that the lesswrong community was in aggregate a bunch of bullshit sophistic larpers.
In short, some real-world system had me asking a simply posed probability question. I eventually solved it. I learned two things as a result. One (which I kinda knew, but didn't 'know' before) is that the formal answer to even a very simple question can be extremely complicated (e.g. asking for the inverse of a one-line formula turning into a half page of extremely dense math). Two, that many prominent members of the lesswrong community were completely clueless about the practice of the tools they advocate, not even knowing the most basic search keywords or realizing that there was little hope of most of their fans ever applying these tools to all but the absolute simplest questions.
> You can absolutely make meaningful predictions about the world despite uncertainties. A good model can tell you that a hurricane might
Thanks for the example though-- reasoning about hurricanes is the result of decades of research by thousands of people, the inputs involve data from thousands of weather stations including floating buoys, multiple satellites, and aircraft that fly through the storms to get data. The calculations include numerous empirically derived constants that provide averages for unmeasureable quantities for inputs that the models need plus adhoc corrections to fit model outputs to previously observed behavior.
And the results, while extremely useful, are vague and not particularly precise-- there are many questions they can't answer.
While it is a calculation, it is very much an example of empiricism being primary over reason.
And if someone thinks that our success with hurricane modeling tells them anything about their ability to 'reason things out' in their own life, without decades of experience, data collection, satellite monitoring, and teams of PhDs, then they're just mistaken. It's just not comparable.
Reasoning things out, with or without the aid of data, can absolutely be of use. But that utility is bounded by the quality of our data, our understanding of the world, errors in our reasoning process, etc. And people do engage in that level of reasoning all the time. But it's not more primary than it is because of the significant and serious limitations.
I suspect that the effort required to calculate things out also comes with a big risk of overconfidence. Like, stick your thumb in the air, make some rough cash-flow calculations, etc. That's a good call and probably captures the vast majority of predictive power for some new business. But if instead you make some complicated multi-agent computational model of the business, it might only have a little bit more predictive power but a lot more risk of following it off a cliff when experience is suggesting the predictions were wrong.
> people do things in dumb inefficient ways all the time
Or, even more often, they're optimizing for a goal different than yours, one that might not even be legible to you!
> just as that model predicted it would be, because I did the math and my competitors in a crowded space did not.
or so you think! Often organizations fail to do "obvious" things because there are considerations that just aren't visible or relevant to outsiders, rather than any failure of reasoning.
For example, I've been part of an org that could have pivoted to a different product and made more money... but doing so would have meant laying off a bunch of people that everyone really liked working with. The extra money wasn't worth it. Whoever eventually scooped up that business might have thought they were smart for seeing it where we didn't, but if so they'd be wrong about why we didn't do it. We saw the opportunity and just had different objectives.
I wouldn't for a moment argue that collections of people don't do stupid things, they do-- but there is a lot less stupid than you might assume on first analysis.
> it's just that that usually happens in areas where you have singularly deep expertise, not where you were just a Really Smart Dude and thought super hard about philosophy
We agree completely there-- but it's really about the data and expertise. Sure, you have to do the thinking to connect the dots, and then have the courage and conviction (or hunger) to execute on it. You may need all three of data, expertise, and fancy calculations. But the third is sometimes optional and the former two are almost never optional and usually can only be replaced by luck, not 'reasoning'.
barnabee
> A good model can tell you that a hurricane might hit Tampa but won't hit New Orleans
On the other hand we have no model to predict that hurricane a year in advance and tell us which city it’ll hit.
Yet these people believe they can rationalise about far more unpredictable events far further in the future.
That is, I agree that they completely ignore the point at which uncertainties utterly dominate any calculation you might try to do and yet continue to calculate to a point of absurdity.
snoopertroop
[flagged]
novok
I noticed years ago too that AI doomers and rationalist types were very prone to (infinity * 0 = infinity) types of traps, which is a fairly autistic way of thinking. Humanity decided long ago that infinity * 0 = 0, for very good practical reasons.
shoo
> Humanity long time ago decided that infinity * 0 = 0
I'm guessing you don't mean this in any formal mathematical sense; without context, infinity multiplied by zero isn't formally defined. There are various formulations and contexts where you could define / calculate something like infinity * zero to evaluate to whatever you want. (E.g. define f(x) := C x and g(x) := 1/x. What does f(x) * g(x) evaluate to in the limit as x goes to infinity? C. And we can interpret f(x) as going to infinity while g(x) goes to zero, so we can use that to justify writing "infinity * 0 = C" for arbitrary C...)
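Or, restating the parenthetical in symbols:

    \lim_{x \to \infty} f(x)\,g(x) = \lim_{x \to \infty} Cx \cdot \frac{1}{x} = C,
    \qquad f(x) \to \infty, \quad g(x) \to 0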
So, what do you mean by "infinity * 0 = infinity" informally? That humans regard the expected value of (arbitrarily large impact) * (arbitrarily small probability) as zero?
lupusreal
It's true in the informal sense. Normal people, when considering an "infinitely" bad thing happening (being killed, losing their home, etc) with a very low probability will round that probability to zero ("It won't happen to ME"), multiply the two and resultantly spend zero time worrying about it, planning for it, etc.
For instance, a serial killer could kill me (infinitely bad outcome) but the chance of that happening is so tiny I treat it as zero, and so when I leave my house every day I don't look into the bushes for a psycho murderer waiting there for me, I don't wear body armor, I am unarmed, I don't even think about the chance of being killed by a serial killer. For all practical intents and purposes I treat that possibility as zero.
Important to remember that different people gave different thresholds at which they round to zero. Some people run through dark parking garages and jump into their car because they don't round the risk of a killer under their car slashing their achilles tendons down to zero. Some people carry a gun everywhere they go, because they don't round the risk of encountering a mass shooter to zero. Some people invest their time and money pursuing spaceflight development because they don't round a dino-killing asteroid to zero. A lot of people don't round the chance of wrecking a motorcycle to zero, and therefore don't buy one even though they look like fun.
The lesswrong/rationalist people have a tendency to have very low thresholds at which they'll start to round to zero, at least when the potential harm would be meted out to a large portion of humanity. Their unusually low threshold leads them to very unusual conclusions. They take seriously possibilities which most people consider to be essentially zero, giving rise to the perception that rationalists don't think that infinity * 0 = 0.
incrudible
> That humans regard the expected value of (arbitrarily large impact) * (arbitrarily small probability) as zero?
There are many arguments that go something like this: We don't know the probability of <extinction-level event>, but because it is considered a maximally bad outcome, any means to prevent it are justified. You will see these types of arguments made to justify radical measures against climate change or AI research, but also in favor of space colonization.
These types of arguments are "not even wrong", they can't be mathematically rigorous, because all terms in that equation are undefined, even if you move away from infinities. The nod to mathematics is purely for aesthetics.
saagarjha
No, humanity decided that infinity doesn't exist and anyone trying to tell you about it is selling you religion.
harrison_clarke
not exactly a rationalist thing, but a lot of bay-area people will tell you that exponential growth exists, and it's everywhere
i can't think of any case where exponential growth actually happens, though. exponential decay and logistic curves are common enough, but not exponential growth
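a quick sketch of why the two get confused (my own toy numbers): a logistic curve tracks an exponential almost exactly while the population is far below its carrying capacity, then flattens.

    import math

    def logistic(t, K=1e6, r=0.1, x0=1.0):
        # solution of dx/dt = r*x*(1 - x/K) with x(0) = x0
        return K / (1 + (K / x0 - 1) * math.exp(-r * t))

    for t in [0, 50, 100, 200]:
        print(f"t={t:>3}: logistic={logistic(t):>11.1f}  exponential={math.exp(0.1 * t):>14.1f}")
    # the curves agree until x nears the carrying capacity K=1e6,
    # then the logistic flattens out while the exponential keeps climbing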
bowsamic
They actively look for ways for infinity to happen. Look at Eliezer's irate response to Roko's basilisk. To him, even being able to imagine that there is a trap means that it will necessarily be realised.
I've seen "rationalist" AI doomers who say things like "given enough time, technology will be invented to teleport you into the future where you'll be horrifically tortured forever".
It's just extrapolation, taken to the extreme, and believed in totally religiously.
coldtea
>which is a fairly autistic way of thinking.
Any prominent one of them I've read or met either openly shares their diagnosis or 100% fits the profile.
meroes
Reminds me of the https://en.wikipedia.org/wiki/Measure_problem_(cosmology).
whimsicalism
i think you are putting too many people in one bucket
yellowapple
> Humanity long time ago decided that infinity * 0 = 0 for very good practical reasons.
Among them being that ∞ × 0 = ∞ makes no mathematical sense. Multiplying literally any other number by zero results in zero. I see no reason to believe that infinity (positive or negative) would be some exception; infinity instances of nothing is still nothing.
ryandv
The problem is that infinity is neither a real nor a complex number, nor an element of any algebraic field, and the proposition that "x * 0 = 0" only holds if x is an element of some algebraic field. It is a theorem that depends on the field axioms.
The real numbers can be extended to include two special elements ∞ and -∞, but this extension does not constitute a field, and the range of expressions in which these symbols make sense is very strictly and narrowly defined (see Rudin's PMA, Definition 1.23):
(a) If x is real then
x + ∞ = +∞, x - ∞ = -∞, x / +∞ = x / -∞ = 0.
(b) If x > 0 then x * (+∞) = +∞, x * (-∞) = -∞.
(c) If x < 0 then x * (+∞) = -∞, x * (-∞) = +∞.
The extended real number system is most commonly used when dealing with limits of sequences, where you may also see such symbols appear:
3.15 Definition. Let {sₙ} be a sequence of real numbers with the following property: For every real M there is an integer N such that n ≥ N implies sₙ ≥ M. We then write
    sₙ ⟶ +∞.
In no other contexts do the symbols ∞ and -∞ make any sense. They only make sense according to the definitions given. It's usually the case that when you see people discussing infinity that they are actually talking about sequences of numbers that are unbounded above (or below). The expression "sₙ ⟶ +∞" is meant to denote such a sequence, and the definitions that extend the real number line (as in Definition 1.23 above) are used to do some higher-level algebra on limits of sums and products of sequences (e.g. the limit of sₙ + tₙ as n becomes "very large" for two sequences {sₙ}, {tₙ}) to shortcut around the lower-level formalisms of epsilons and neighborhoods of limit points in some metric space, which is how the limits of sequences are rigorously defined.
In no case do the symbols ∞ and -∞ refer to actual numbers. They are used in expressions that refer to properties of certain sequences once you look far enough down the sequence, past its first, second, hundredth, umpteenth, "Nth" terms, and so on.
Thus when you see people informally and loosely use expressions such as "infinity times zero" they're not actually multiplying two numbers together, but rather talking about the behavior of the product of two sequences as you evaluate terms further down both sequences; one of which is unbounded, while the other can be brought arbitrarily close to (but not necessarily equal to) zero. You will notice that no conclusions can be drawn regarding the behavior of such a product in general, whether referencing the definitions comprising the extended real number system or the lower-level definitions in terms of epsilons and neighborhoods of limit points.
So much confusion today comes down to people confidently using words, symbols, and signs they don't understand the definitions nor meanings of. Sometimes I wonder if this is the real esoteric meaning of the ancient Tower of Babel mythos.
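Incidentally, IEEE 754 floating point, which is where most of these informal "calculations" would actually run, takes the same stance; a quick check in Python:

    import math

    print(math.inf * 0.0)     # nan: IEEE 754 leaves inf * 0 undefined
    print(math.inf * 1e-300)  # inf: any strictly positive factor stays infinite
    print(1.0 / math.inf)     # 0.0: the "x / +∞ = 0" rule from Definition 1.23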
ericd
Brilliant summary, thanks.
I'm interested in #4, is there anywhere you know of to read more about that? I don't think I've seen that described except obliquely in eg sayings about the relationship between genius and madness.
rachofsunshine
I don't, that one's me speaking from my own speculation. It's a working model I've had for a while about the nature of a lot of kinds of mental illness (particularly my own tendencies towards depression), which I guess I should explain more thoroughly! This gets a bit abstract, so stick with me: it's a toy model, and I don't mean it to be definitive truth, but it seems to do well at explaining my own tendencies.
-------
So, toy model: imagine the brain has a single 1-dimensional happiness value that changes over time. You can be +3 happy or -2 unhappy, that kind of thing. Everyone knows when you're very happy you tend to come down, and when you're very sad you tend to eventually shake it off, meaning that there is something of a tendency towards a moderate value or a set-point of sorts. For the sake of simplicity, let's say a normal person has a set point of 0, then maybe a depressive person has a set point of -1, a manic person has a set point of +1, that sort of thing.
Mathematically, this is similar to the equations that describe a spring. If left to its own devices, a spring will tend to its equilibrium value, either exponentially (if overdamped) or with some oscillation around it (if underdamped). But if you're a person living your life, there are things constantly jostling the spring up and down, which is why manic people aren't crazy all the time and depressed people have some good days where they feel good and can smile. Mathematically, this is a damped spring with a forcing function - as though it's sitting on a rough train ride that is constantly applying "random" forces to it. Rather than x'' + cx' + kx = 0, you've got x'' + cx' + kx = f(t) for some external forcing function f(t), where f(t) critically does not depend on x or on the individual internal dynamics involved.
These external forcing functions tend to be pretty similar among people of a comparable environment. But the internal equilibria seem to be quite different. So when the external forcing is strong, it tends to pull people in similar directions, and people whose innate tendencies are extreme tend to get pulled along with the majority anyway. But when external forcing is weak (or when people are decoupled from its effects on them), internal equilibria tend to take over, and extreme people can get caught in feedback loops.
If you're a little more ML-inclined, you can think about external influences like a temperature term in an ML model. If your personal "model" of the world tends to settle into a minimum labeled "completely crazy" or "severely depressed" or the like, a high "temperature" can help jostle you out of that minimum even if your tendencies always move in that direction.
Basically, I think weird nerds tend to have low "temperature" values, and tend to settle into their own internal equilibria, whether those are good, bad, or good in some cases and bad in others (consider all the genius mathematicians who were also nuts). "Normies", for lack of a better way of putting it, tend to have high temperature values and live their lives across a wider region of state space, which reduces their ability to wield precision and competitive advantage but protects them from the most extreme failure-modes as well.
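(A minimal numeric sketch of this toy model, with made-up constants: same set point, two different "temperatures".)

    import random, statistics

    def simulate(set_point, temperature, steps=20000, dt=0.01, k=1.0, c=0.5):
        # x'' + c*x' + k*(x - set_point) = f(t), with f(t) random "life events"
        x, v, xs = 0.0, 0.0, []
        for _ in range(steps):
            f = random.gauss(0.0, temperature)     # external forcing strength
            a = f - c * v - k * (x - set_point)    # acceleration from the ODE
            v += a * dt
            x += v * dt
            xs.append(x)
        return statistics.mean(xs), statistics.stdev(xs)

    for temp, label in [(5.0, "high temperature"), (0.1, "low temperature")]:
        m, s = simulate(set_point=-1.0, temperature=temp)
        print(f"{label}: mean mood {m:+.2f}, spread {s:.2f}")
    # both hover around the set point (-1) on average, but the low-temperature
    # trajectory stays pinned to it while the high-temperature one roams widely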
ericd
Thanks very much for typing all that out!
>These external forcing functions tend to be pretty similar among people of a comparable environment. But the internal equilibria seem to be quite different. So when the external forcing is strong, it tends to pull people in similar directions, and people whose innate tendencies are extreme tend to get pulled along with the majority anyway. But when external forcing is weak (or when people are decoupled from its effects on them), internal equilibria tend to take over, and extreme people can get caught in feedback loops.
Yeah, this makes sense, an isolated group can sort of lose the "grounding" of interacting with the rest of society and start floating off in whatever direction, as long as they never get regrounded. When you say feedback loops, do you mean obsessive tendencies tending to cause them to focus on and amplify a small set of thoughts/beliefs, or something else?
I like the ML/temperature analogy, it's always interesting watching kids and thinking in that vein, with some kids at a super high temp exploring the search space of possibilities super quickly and making tons of mistakes, and others who are much more careful. Interesting point on nerds maybe having lower temp/converging more strongly/consistently on a single answer. And I guess artist types would be sort of the opposite on that axis?
0xDEAFBEAD
Interesting, this is the first argument I've heard that emotional instability is actually good for your mental health.
hibikir
There's another way around it. People that see themselves as "freethinkers" are also ultimately contrarians. Taking contrarianism as part of your identity makes people value unconventional ideas, but turn that around: It also means devaluing mainstream ideas. Since humanity is basically an optimization algorithm, being very contrarian means that, along with throwing away some bad assumptions, one also throws away a whole lot of very good defaults. So one might be right in a topic or two, but overall, a lot of bad takes are going to seep in and poison the intellectual well.
derektank
You don't have to adopt the ideas of every fringe or contrarian viewpoint you come across to be a freethinker; you simply have to be willing to consider and evaluate those views with the same level of rigor you give to mainstream views. Most people who do that will probably adopt a handful of fringe beliefs but, for the most part, retain a very large number of conventional beliefs too. Julia Galef is kind of an archetypal rationalist/free thinker and she has spoken about the merits of traditional ideas from within a rationalist framework.
aptwebapps
Here's another analysis which comes from a slightly different angle.
nataliste
This dynamic is not exclusive to those claiming to be part of an insular community of freethinkers:
matthewdgreen
From the article:
A 2023 post on Rationalism forum LessWrong.com warned of coming violence in the Zizian community. “Over the past few years, Ziz has repeatedly called for the deaths of many different classes of people,” the anonymous post read. Jessica Taylor, a friend of Baukholt’s, told Open Vallejo she warned Baukholt about the Zizians, describing the group on X as a “death cult.”
The post: https://www.lesswrong.com/posts/T5RzkFcNpRdckGauu/link-a-com...
downrightmike
[flagged]
zxexz
This story just keeps getting more and more bizarre. Reading the charges and supporting affidavits, the whole thing is reading more and more like some sort of Yorgos Lanthimos film. The rationalist connection - a literal sequel to the 2022 events (in turn a sequel to the 2019 CFAR stuff) - is already weird enough. But I can't get over the ridiculousness of the VT situation. I have spent time in that area of VT, and the charged parties must have been acting quite bizarre for the clerk to alert the police. Checking into a motel wearing all black, open carrying does NOT cut it. The phones wrapped in foil is comical, and the fact that they were surveilled over several days is interesting, especially because it reads like the FBI only became aware of their presence after the stop and shootout?
The arresting agent seems pretty interesting, a former risk adjuster who recently successfully led the case against a large inter-state fraud scheme. This may just be the plot of Fargo season 10. Looking forward to the season arc of the FBI trying to understand the "rationalist" community. The episode titled "Roko's Basilisk", with no thematically tied elements, but they manage to turn Yudkowsky into a rat.
iamthepieman
This story happened in my backyard. The shootout was about 40 minutes from me but Youngblut and Felix Bauckholt were reported by a hotel clerk dressed in tactical gear and sporting firearms in a hotel a few blocks from me.
Weird to see a community I followed show up so close to home and negatively like this. I always just read LW and appreciated some of the fundamentals that this group seems to have ignored. Stuff like: rationality has to objectively make your life and the world better, or it's a failed ideology.
Edit: I've been following this story for over a week because it was local news. Why is this showing up here on HN now?
Aurornis
> Weird to see a community I followed show up so close to home and negatively like this.
I had some coworkers who were really into LessWrong and rationality. I thought it was fun to read some of the selected writings they would share, but I always felt that online rationalist communities collected a lot of people with reactionary, fascist, misogynistic, and far-right tendencies. There’s a heavily sanitized version of rationality and EA that gets presented online with only the highlights, but there’s a lot more out there in the fringes that is really weird.
For example, many know about Roko’s Basilisk as a thought exercise and much has been written about it, but fewer know that Roko has been writing misogynistic rants on Twitter and claiming things like having women in the workforce is “very negative” for GDP.
The Slate Star Codex subreddit was a home for rationalists on Reddit, but they had so many problems with culture war topics that they banned discussion of them. The users forked off and created “The Motte” which is a bit of a cesspool dressed up with rationalist prose. Even the SlateStarCodex subreddit has become so toxic that I had to unsubscribe. Many of the posts and comments on women or dating were becoming indistinguishable from incel communities other than the rationalist prose style.
Even the real-world rationalist and EA communities aren’t immune, with several high profile sexual misconduct scandals making the news in recent years.
It’s a weird space. It felt like a fun internet philosophy community when my coworkers introduced it years ago, but the longer I’ve observed it the more I’ve realized it attracts and accepts a lot of people whose goals aren’t aligned with objectively “make the world better” as long as they can write their prose in the rationalist style. It’s been strange to observe.
Of course, at every turn people will argue that the bad actors are not true rationalists, but I’ve seen enough from these communities to know that they don’t really discriminate much until issues boil over into the news.
nataliste
Sophistry is actually really really old:
>In the second half of the 5th century BCE, particularly in Athens, "sophist" came to denote a class of mostly itinerant intellectuals who taught courses in various subjects, speculated about the nature of language and culture, and employed rhetoric to achieve their purposes, generally to persuade or convince others. Nicholas Denyer observes that the Sophists "did ... have one important thing in common: whatever else they did or did not claim to know, they characteristically had a great understanding of what words would entertain or impress or persuade an audience."
The problem then, as now, is sorting the wheat from the chaff. Rationalist spaces like /r/SSC, The Motte, et al. are just modern sophistry labs that like to think they're filled with the next Socrates when they're actually filled with endless Thrasymachi. Scott Alexander and Eliezer Yudkowsky have something meaningful (and deradicalizing) to say. Their third-degree followers? Not so much.
wruza
Yudkowsky's texts represent my mental image of a vector continuously scanning a latent space in some general direction. Changes just pile on and on until you drift from concept A to concept B without ever making a logical step, but there's nothing to criticise because every step was a seemingly random nuance. Start at some rare values in most dimensions, crank up the temperature, and you get yourself Yudkowsky.
> our coherent extrapolated volition is "our wish if we knew more, thought faster, were more the people we wished we were, had grown up farther together; where the extrapolation converges rather than diverges, where our wishes cohere rather than interfere; extrapolated as we wish that extrapolated, interpreted as we wish that interpreted (…) The appeal to an objective through contingent human nature (perhaps expressed, for mathematical purposes, in the form of a utility function or other decision-theoretic formalism), as providing the ultimate criterion of "Friendliness", is an answer to the meta-ethical problem of defining an objective morality; extrapolated volition is intended to be what humanity objectively would want, all things considered, but it can only be defined relative to the psychological and cognitive qualities of present-day, unextrapolated humanity.
I doubt that a guy who seriously produces this can say something meaningful at all.
cvalka
I don't think Eliezer Yudkowsky has anything meaningful to say. He is a doomsday cult leader who happens to be fashionable in some circles.
kiba
The community/offshoot I am part of is mostly liberal/left. My impression that lesswrong is also liberal/left.
Aurornis
> The community/offshoot I am part of is mostly liberal/left
There isn't an official "rationalist" community. Some consider LessWrong to be the center, but there have always been different communities and offshoots. As far as I know, a lot of the famous rationalist figures haven't participated much in LessWrong for a long time now.
The far right offshoot I was referring to is known as "TheMotte" or "The Motte". It was a gathering point for people who were upset after the Slate Star Codex comment section and subreddit banned "culture war" topics because they were becoming an optics problem.
It's easy to forget because it's a "don't talk about it" topic, but after culture war topics were banned from SSC, The Motte subreddit had significantly more activity than the SlateStarCodex subreddit. They eventually left Reddit because so many posts were getting removed for violating Reddit policies. Their weekly "culture war" threads would have thousands of comments and you'd find people "steelmanning" things like how Trump actually won the 2020 election or holocaust denial.
The other groups I was referring to were CFAR, MIRI, and Leverage, all of which have been involved with allegations of cult-like behavior, manipulation, and sexual abuse. Here's one of several articles on the topic, which links to others: https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experie...
Every time I discuss this on HN I get downvoted a lot. I think a lot of people identify as rationalists and/or had fun reading LessWrong or SSC back in the day, but don't understand all of the weirdness that exists around rationalist forums and the Bay Area rationalist community.
cluckindan
Perhaps there are people in power who would benefit from portraying those communities in a different light.
LeroyRaz
I think astral codex ten did a survey recently and the majority of respondents were politically left
timmytokyo
What kind of political leftist would you say resonates most acutely with Scott Alexander's January paean to the scientific racism of Emil Kirkegaard and Richard Lynn ("How To Stop Worrying And Learn To Love Lynn's National IQ Estimates")?
scarmig
It's somewhat odd to represent a community as being right wing when the worst thing to come from it was a trans vegan murder cult. Most "rationalists" vote Democrat, and if the franchise were limited to them, Harris would have won in a 50 state landslide.
The complaint here seems to be that rationalists don't take progressive pieties as axiomatic.
Unit327
"trans vegan murder cult" is the best band name ever
Aurornis
> It's somewhat odd to represent a community as being right wing when the worst thing to come from it was a trans vegan murder cult
I was referring to "The Motte", which emerged after the SlateStarCodex subreddit finally banned "culture war" topics. Scott announced it in this post: https://slatestarcodex.com/2019/02/22/rip-culture-war-thread...
The Ziz cult did not emerge from The Motte. I don't know why you came to that conclusion.
> Most "rationalists" vote Democrat,
Scott Alexander (of SlateStarCodex) did surveys of his audience. Interestingly, the culture war thread participants were split almost 50:50 between those identifying as left-wing and those identifying as right-wing.
Following the ban on discussion of culture war topics, many of the right-wing participants left for The Motte, which encouraged these conversations.
That's how there came to be a right-wing offshoot of the rationalist community.
The history is all out there. I'm surprised how many people are doubting me about this. You can read the origin story right on Scott's blog, and the Reddit post where they discuss their problems with running afoul of Reddit's content policies (necessitating a move off-platform) is still accessible: https://old.reddit.com/r/TheMotte/comments/uaoyng/meta_like_...
> The complaint here seems to be that rationalists don't take progressive pieties as axiomatic.
No, you're putting words in my mouth. I'm not complaining about a refusal to "progressive pieties as axiomatic". I'm relaying history of rationalist communities. It's surprising to see all of the denial about the topic.
saagarjha
Being a trans vegan doesn't automatically make you left wing. Nor does voting Democrat. Being progressive is a complex set of ideals, just as conservatism is a lot more than whatever the Republican party is doing today.
llm_trw
>The complaint here seems to be that rationalists don't take progressive pieties as axiomatic.
A trans vegan gang murdering police officers is what's come out of this milieu.
I don't see how anyone can say they aren't taking "progressive pieties as axiomatic".
The OP is just taking the "everything I don't like is fascist" trope to its natural conclusion. Up next: Stalin was actually a Nazi.
ben_w
I have had some rather… negative vibes, for lack of a better term, from some of the American bits I've encountered online; but for what it's worth, I've not seen what you described in the German community.
There is, ironically, no escape from two facts that were well advertised at the start: (1) the easiest person for anyone to fool is themselves, and (2) politics is the mind-killer.
With no outwardly visible irony, there's a rationalist politics podcast called "the mind killer": https://podcasts.apple.com/de/podcast/the-mind-killer/id1507...
Saying this as someone who read HPMOR and AI to Zombies and used to listen to The Bayesian Conspiracy podcast:
This is feeling a bit like that scene in Monty Python's Life of Brian where everyone chants in unison about thinking for themselves.
djur
The problem with "politics is the mind-killer" is that it seems to encourage either completely ignoring politics (which is mostly harmless but also results in pointlessly ceding a venue for productive action in service of one's ideals) or engaging with politics in a very Machiavellian, quasi-Nietzschean way, where you perceive yourself as slicing through the meaningless Gordian knot of politics (which results in the various extremist offshoots being discussed).
I understand that the actually rational exegesis of "politics is the mind-killer" is that it's a warning against confirmation bias and the tendency to adopt an entire truth system from one's political faction, rather than maintaining skepticism. But that doesn't seem to be how people often take it.
paradox460
Politics aren't the mind killer, fear is. Politics just use fear to achieve their ends
Der_Einzige
The whole internet mainstream zeitgeist with dating among men has become identical to incel talking points from 5 years ago.
grumple
Reading about the Roko's Basilisk saga, it seems clear that these people are quite far from rational and of extremely limited emotional development. It reads like observing a group of children who are afraid of the monster in the closet, which they definitely brought into existence by chanting a phrase in front of the bathroom mirror…
Members of these or other similar communities would do well to read anything on them dispassionately and to critique anything they read. I'd also say that if they use Yudkowsky's writings as a basis for understanding the world, that understanding is going to have the same inadequacies as Yudkowsky and his writings. How many people without PhDs, or even relevant formal education, are putting out high-quality writing on both philosophy and quantum mechanics (and whatever other subjects)?
meowface
For what it's worth, there's a thriving liberal rationalist-adjacent community on Twitter that despises people like Roko.
dang
(To answer that last procedural question: there have been assorted submissions, but none spent much time on the front page. More at https://news.ycombinator.com/item?id=42901777)
datadeft
Many people here were curious whether the perpetrators used Vim or Emacs.
bbarnett
Wait, OR?!
Clearly this is a poorly organized movement, with wildly different beliefs. There is no unity of purpose here. Emacs or vi, used without core beliefs being challenged?!
And one does not form a rationalist movement and use emacs, after all.
tbrownaw
ed is the standard editor
wesapien
After seeing this news, I recall watching a video by Julia Galef about "what is rationality". Would it be fair to say that in this situation, they lack epistemic rationality but are high in instrumental rationality?
olalonde
If they had high instrumental rationality, they would be effective at achieving their goals. That doesn’t seem to be the case - by conventional standards, they would even be considered "losers": jobless, homeless, imprisoned, or on the run.
Muromec
That depends on the goal. The goal of a martyr is not life.
lukan
Hard to say without hearing them speak for themselves.
So far I have zero idea of any motive.
Supposedly it should be rational, so I would at least like to hear it before judging further.
wetpaws
[dead]
k8sToGo
Why does a hotel clerk wear tactical gear and guns?
tsimionescu
That sentence was slightly awkward: the hotel clerk reported that those two people were in tactical gear with guns.
zozbot234
Relevant link (2023): https://www.lesswrong.com/posts/T5RzkFcNpRdckGauu/link-a-com...
The top comment has an interesting take: "Unless Ziz gets back in the news, there’s not much reason for someone in 2025 or later to be reading this."
__turbobrew__
This whole rabbit hole of rationalism, LessWrong, and Ziz feels like a fever dream to me. Roaming trans vegan tactical death squads shooting border officers and stabbing 80 year olds with swords.
This is the kind of thing where it is warranted that the feds get every single wiretap, interception, and surveillance possible on everyone involved in the Zizian movement.
anon84873628
Calling them a roaming band or "tactical death squad" is giving far too much credit. It is a handful of crazy people who convinced themselves that a couple murders would solve their problems.
In particular the attack on border patrol was obviously random and illogical. And the fact that no one was convicted of the Pennsylvania murders seems to reflect more on the police and prosecutors than the intelligence of the perpetrators.
CamperBob2
Speaking of random and illogical, what prompted the Border Patrol to stop their car in the first place, I wonder? None of the news stories have elaborated on that.
umeshunni
The FBI report says it was a traffic stop. Mile marker 168 seems to be about 10 miles from the Canadian border.
https://www.fbi.gov/contact-us/field-offices/albany/news/fbi...
More information from the police report: https://drive.google.com/file/d/1wycOK3UbaQ9JWZuvo2gpZiDT-0t...
On January 20, 2025, at approximately 3:00 pm, an on-duty, uniformed United States Border Patrol (USBP) Agent initiated a stop of a blue 2015 Toyota Prius Hatchback with North Carolina license plate number KLA2040 to conduct an immigration inspection as it was driving southbound on Interstate 91 in Coventry, Vermont. The registered owner of the vehicle, Felix Baukholt, a citizen of Germany, appeared to have an expired visa in a Department of Homeland Security database. YOUNGBLUT was driving the Prius, and Baukholt was the lone passenger in the Prius. Multiple uniformed Border Patrol Agents were present at the stop in three USBP vehicles with emergency lights illuminated.
(later in the same document)
Investigators had been performing periodic surveillance of Baukholt and YOUNGBLUT since on or about Tuesday, January 14, 2025. A concerned citizen (an employee of a hotel in Lyndonville, Vermont) contacted law enforcement after a male and a female had checked into the hotel to report concerns about them, including that they appeared to be dressed in all-black tactical style clothing with protective equipment, with the woman, later identified as YOUNGBLUT, carrying an apparent firearm in an exposed-carry holster. Investigators with VSP and Homeland Security Investigations attempted to initiate a consensual conversation with Baukholt and YOUNGBLUT, but they declined to have an extended conversation, claiming that they were in the vicinity to look at purchasing property. After the contact with law enforcement, the pair checked out of the Lyndonville hotel on the afternoon of January 14, 2025. Investigators later observed the pair in similar tactical dress on Sunday, January 19, 2025, walking in downtown Newport; YOUNGBLUT was observed carrying a handgun at that time.
werdnapk
Have you ever driven near the border? You'll be flagged for doing anything out of the ordinary with your car. If you have to pull over for a moment to find your passport, for example, or if you make a wrong turn and try to turn around, or do anything else that looks "suspicious", you risk getting additional searches or being pulled over.
Workaccount2
I don't know, but this is ripe material for documentaries and podcasts, so I'm sure there will be a lot more coming out in the future.
polynomial
I don't have a link handy, but they were under surveillance for a full week before they were pulled over for a "traffic stop."
concordDance
Conflating Ziz and LessWrong feels a bit like conflating Aiden Hale with the LGBTQ movement, or the Branch Davidians with Christianity.
larsiusprime
Or even just the Branch Davidians and the Seventh-day Adventists, of which the Branch Davidians were an offshoot.
I’ve read a couple rationalist blogs for over a decade and this past week is the first I’ve ever heard of these “Zizians”
nejsjsjsbsb
Split the beliefs from the crime. A bunch of murderers were caught. Given they are dangerous killers (one killed a witness and one faked their death), yeah, they should get warrants.
liftIO
> Split the beliefs from the crime.
Pretty hard to do that when the beliefs explicitly endorse murder. Ziz used to run a blog on which she made thinly veiled death threats, argued for a personal philosophy of hair-trigger escalation and massive retribution, raged at the rationalist community for not agreeing with her on that philosophy and on theories of transness, and considered most people on Earth to be irredeemably evil for eating meat.
__turbobrew__
It appears the Venn diagram of the beliefs and crimes overlaps quite a bit. Sometimes the beliefs are that certain crimes should be committed.
This is a free country (disputably) and you should be able to think and say whatever you want, but I also think it is reasonable for law enforcement in the investigation of said crimes to also investigate links to other members in the movement.
Muromec
> but I also think it is reasonable for law enforcement in the investigation of said crimes to also investigate links to other members in the movement.
It doesn't work. Every single time a radicalized member of a marginalized community commits this kind of crime, the three-letter agency dutifully reports that it knew the person to be radicalized but had nothing to act on, because a lot of people hold weird violence-approving beliefs and talk about them openly or with friends, and very few actually hijack a Boeing or two. Those who plan to do things also happen to learn from the op-sec mistakes of those caught before them.
Israel knew about Hamas, and the 19th-century Russian Empire knew about the anarchists. Pouring a lot of resources into suppressing all of that didn't do jack shit in the long term.
concordDance
> It appears the Venn diagram of the beliefs and crimes overlaps quite a bit.
There are hundreds of thousands of rationalists (to a greater or lesser extent). Very few go shoot people.
nejsjsjsbsb
Probable cause is the key term here.
Cops can do a lot without it. They can choose what leads to follow.
But not sure what you are suggesting with respect to wiretapping.
victorbjorklund
Can you really split the beliefs of the Nazi movement in Germany in 1940 from the crimes the believers committed?
mobiledev2014
Yes, thank you for saying so. Reading about all this, and especially all the people chiming in who already knew about a lot of it? The fact that the founder of LessWrong coined the term "alignment," a subject I've read about many times… it feels like learning that lizard people have walked among us all along.
FeepingCreature
Honestly it feels like this is the first time people are realizing that six degrees of separation means that crazy people can usually be connected to influential people. In this case they're just realizing it with the rationalists.
numpad0
At least it's clear that they aren't receiving proper sword handling training. Good grief.
gaze
[flagged]
nejsjsjsbsb
Your "about" tells me you could make this comment far more specific and back it with evidence.
Der_Einzige
Rest assured, I'm pretty sure among the easiest ways to make yourself the target of surveillance is to do anything interesting at all involving technology. All serious AI researchers, for example, should assume that they are the victims of this.
Muromec
>This whole rabbit hole of rationalism, LessWrong, and Ziz feels like a fever dream to me. Roaming trans vegan tactical death squads shooting border officers and stabbing 80 year olds with swords.
I don't exactly see how it's different from a group of habitual alcoholics discussing politics and having a fatal disagreement, which is a normal day of the week for any police department with enough demographics to have this sort of low-effort, low-gain crime. It's more scandalous because the details and the people involved are more interesting, but everyone will forget about it after a week, as they don't matter.
Levitz
>I don't exactly see how it's different from a group of habitual alcoholics discussing politics and having a fatal disagreement
Intent, premeditation, possibly being designated a terrorist group depending on other factors. Big differences.
romaaeterna
These people?
https://nypost.com/2025/01/30/us-news/killing-of-border-patr...
Is the appellation in the headline, "radical vegan trans cult," a true description?
> Authorities now say the guns used by Youngblut and Bauckholt are owned by a person of interest in other murders — and connected to a mysterious cult of transgender “geniuses” who follow a trans leader named Jack LaSota, also known by the alias “Ziz.”
Is all this murder stuff broadly correct?
Trasmatta
The NY Post tried to frame them as "radical leftist", but that's a big stretch. I don't think most rationalists would consider themselves leftist. The article also seems to be leaning into the current "trans panic" - pretty typical for the NYP.
romaaeterna
I also dislike Right/Left categorizations. Most people don't even know the history of the terms and their roots in the French Revolution. Though the "Cult of Reason" established then certainly had the Left categorization at the time.
But is the trans element not a major part of this cult? It seemed to be from the story in the top link. But if there is something incorrect there, or something false in the NYP reporting, you should point it out. If it is a major element of this cult, then far from complaining about NYP, I would complain about any news organization that left it out of its reporting.
brokensegue
I don't think being trans is part of their beliefs or a requirement to be a member
Trasmatta
[flagged]
slooonz
> I don't think most rationalists would consider themselves leftist
Yes they do.
https://docs.google.com/forms/d/e/1FAIpQLSf5FqX6XBJlfOShMd3U...
moolcool
A naive observer would look at these numbers and say "wow, I am unsurprised that this movement is made mostly of straight white American male tech workers".
Anyone reaching that conclusion, though, would be a fool! For they would have forgotten to apply Bayes' Theorem.
affinepplan
liberal is not leftist
wahnfrieden
1.6% Marxist
Did you confuse Liberal with Leftist? Liberals are anti-left.
Some portion are Libertarian, but there's no distinction made between so-called "ancap" and libcom, so that category is murky, or more often coded for the former (the Libertarian Party in the US is anti-left).
postepowanieadm
Does it really matter? Nazis called themselves socialists.
Trasmatta
[flagged]
alvah
Who is making a statement about "most rationalists" here? The claim is about a trans vegan murder cult, which doesn't appear to be a natural member of the right side of the political spectrum.
mitthrowaway2
Many rationalists do consider themselves leftist. Many others do not. It's a big tent and anyone can wander in.
wisty
Left libertarian would be more likely, I think?
Aurornis
> Is the appellation in the headline, "radical vegan trans cult," a true description?
For this small group, yes. Their leader believes in Nuremberg-style trials for people who eat meat. If you want to go down the rabbit hole, it gets much weirder: https://zizians.info/
zozbot234
The cult does seem to target people who identify as trans - OP has some discussion of this. Not sure if that justifies calling it a "radical vegan trans cult" though. Trans folks seem to be overrepresented in rationalist communities generally, at least on the West Coast - but there may be all sorts of valid reasons for that.
none_to_remain
None of the murder victims I'm aware of were transgender?
billjings
Target as in, target for recruitment into the group.
The victims themselves seem chosen as retaliation, to defend their understanding of their own interests.
stefantalpalaru
[dead]
wanderingbit
I’m from Burlington and a couple weeks ago downtown I noticed a group of 2 or 3 people walking past me in full black clothing with ski masks (the kind you rob banks with).
I thought it was strange, having never seen that before except on Halloween, but didn't think to alert any authorities, specifically because Burlington is filled with people dressing differently and doing strange things. But 99% of the time it's totally nonviolent and benign.
I’m guessing this was them. Scary!
taurknaut
I can't speak to Burlington, but in Philly balaclavas (which is what those masks are called) are quite common and have been since 2020. I suspect this is true of many cities. It's been the subject of some controversy involving mask bans. In fact, seeing someone in all black with a ski mask on is a pretty typical, if intimidating, look.
mickelsen
Some bookmarks from early 2023 that seem relevant now:
https://old.reddit.com/r/SneerClub/
https://aiascendant.substack.com/p/extropias-children-chapte...
I do not engage with any of those people nor their weird communities.
joenot443
Sneer Club is one of the nastiest, most uncharitable, bad-faith subs out there these days. They generally hate HN, as well. I think any community which exists solely to take cheap shots at another community is poison at its core; SC's parasocial relationship with LW is a perfect example.
Der_Einzige
N-gate was among the only things that made this website worth reading. They existed solely to take "cheap shots" at HN. If that's "poison", then I don't want an antidote!
zozbot234
What happens if you feed the n-gate archive to an LLM and ask it to generate commentary on current HN posts in the same style?
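A minimal sketch of how one might try it, assuming the OpenAI Python client; the file name, model choice, and prompt are all illustrative assumptions, not anything n-gate or HN actually provides:

    # Few-shot style imitation: prime the model with archived n-gate blurbs,
    # then ask it to riff on a current HN post in the same voice.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    # Hypothetical local corpus of scraped n-gate posts, one blurb per paragraph.
    with open("ngate_archive.txt") as f:
        examples = f.read()[:8000]  # truncate to stay within the context budget

    def sneer(title: str, summary: str) -> str:
        """Generate commentary on an HN post in the archive's style."""
        response = client.chat.completions.create(
            model="gpt-4o",  # illustrative model name
            messages=[
                {"role": "system",
                 "content": "Write a short blurb in the exact style of these examples:\n"
                            + examples},
                {"role": "user", "content": f"Title: {title}\nSummary: {summary}"},
            ],
        )
        return response.choices[0].message.content

    print(sneer("Show HN: Yet another note-taking app", "It's markdown, but different."))

My guess is it would nail the cadence and miss the wit, which tends to be how few-shot style transfer goes.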
s1artibartfast
Just wanted to say: fantastic Substack writing. Thanks for the link as I go down this rabbit hole.
mickelsen
Yeah! I devoured the entire series of posts in one go back then; I had no idea about all the people and their ties. Plus it was a super engaging read, I could imagine being there.
dvaun
I especially enjoy how the author of the Substack series described the singularity as "The Rapture of the Nerds".
guerrilla
I really loved the language describing the singularity as "an inescapable runaway feedback loop which leads to the ascension of an enemy god". Beautiful.
guerrilla
Holy shit, there are 7 chapters to that last one. Chapter 1 is fucking mind-blowing. I could never figure out why they were obsessed with Roko's basilisk but it makes total sense now considering how it all started.
This is such an epic unbelievable story. We live in this world?
Chapter 6 covers how SlateStarCodex comes into the picture, by the way. I always wondered about that too.
nkurz
Most of the news coverage I've seen of this story is omitting what some might consider a relevant detail: almost all the members of this group are trans.
This is a divisive topic, but failing to mention this makes me worry a story is pushing a particular agenda rather than trying to tell the facts. Here's what the story looks like if the trans activism is considered central to the story:
https://thepostmillennial.com/andy-ngo-reports-trans-terror-...
While Ngo's version is definitely biased, and while I don't know enough about the story to endorse or refute his view, I think it's important to realize that this part of the story is being suppressed in most of the coverage elsewhere.
sterlind
it's been an exhausting couple of weeks for me, as a trans person. one executive order after another, explicitly attacking us. scrambling to update all my documents, navigating a Kafkaesque bureaucracy with constantly shifting rules.
now this.
there are like six Zizians. there are millions of trans people. I'm sure that many of the Zizians being trans says something about the Ziz cult, but Ziz doesn't say anything about "trans activism."
any evil one trans person does, is used to stain all trans people. recognize this tendency; don't let this become like blood libel.
radpanda
I’m not a big George W Bush fan but this quote of his has stuck with me for years:
> Too often, we judge other groups by their worst examples while judging ourselves by our best intentions
YurgenJurgensen
The more readily a group tries to restrict some behaviours, the more it implicitly endorses behaviours it doesn’t attempt to restrict.
all2
> any evil one trans person does, is used to stain all trans people. recognize this tendency; don't let this become like blood libel.
As a Christian, I can empathize. The wrongs and hypocrisies of so many are heaped on those who have no relation to the actions.
LeoPanthera
[flagged]
kurikuri
[flagged]
gerdesj
I don't think that you need to be a Christian to empathise with anyone. The big geezer's (JC) teachings imply to me that keeping quiet about your faith and simply doing good (for a given value of good) is the Way.
Declaring a religion might rile someone, before you have even engaged. I suggest that you simply proffer a hand. Empathise as best you can. Be careful.
At best, with mentioning religion, you have declared your rightful intentions and at worst you have added yet another layer of tribalism to a ... debate.
I'm pretty sure that at least the synoptic gospels and probably John too (for the full set) tell you to keep it quiet. There is no need to put your religious heart on your sleeve - that's between you and God.
accengaged
Sorry this is happening to you.
What really shatters my faith in common sense is the fact that trans people are a minority of a minority and are made out to be a problem when they're not.
Just remember that you have allies in this industry. Lots of love.
cjbgkagh
I know I sound crazy saying what I'm about to say, but it is the truth as I understand it, and I think it's important.
It appears to me that there is a certain modality of thought that occurs much more often in people with hEDS, specifically those with TNXB SNPs. If you're super deep into type theory, the odds substantially increase that you have hEDS; it's how I found out that I had it. And this same group is far more likely to be trans than the general population, a link that would be far more obvious if hEDS weren't so underdiagnosed.
Additionally, it appears to me that mental disorders are often caused by autoimmune conditions, which are extremely common in those with hEDS. So with a strong selection bias on math ability and being trans, you're going to end up with a lot of hEDS people who are strongly predisposed to mental disorders. I know someone with hEDS who obsessively studies the link between hEDS and serial killers; not something I want to be associated with, but the stats were pretty convincing. I do think it is possible that two TNXB SNPs are sufficient to explain why I think the way I do, and why I'm far more like Ted Kaczynski than I would like to be. Of note: Ted Kaczynski did consider gender reassignment back in 1966.
Which is to say two things. First, I think what people are observing is a real phenomenon, not purely a product of personal biases, though I'm not denying that biases play a part in perception. And perhaps with that in mind, the solution is in fact diagnosing and treating the underlying autoimmune conditions. And to put a hat on a hat on my "crazy": I think people are going to find that GLP-1 agonists like Ozempic, specifically at the lower doses, are quite helpful in managing autoimmune conditions, among other things.
taurath
In my experience, it is often trauma that causes the autoimmune conditions. Seeing everything from a chemical standpoint only looks at half the picture. It's hard to find an autistic person who isn't traumatized, for example.
twic
This is a new and interesting idea to me. Have you or your serial-killer-obsessed friend written up any of this?
sterlind
could you please contact me on Discord or email (in my profile)? I have also investigated the connections between hEDS, being trans and a specific kind of autism, along with TNXB.
nicecountry
>Of note; Ted Kaczynski did consider gender reassignment back in 1966.
That happened after he was brainwashed, from the age of 16, through the CIA's MKULTRA experiments, though.
shadowgovt
Precisely.
Most US cults are made up almost entirely of cis people, but nobody jumps to conclusions about the impact of cisness on indoctrination susceptibility.
mitthrowaway2
Absolutely. I think the rationalists feel this way too.
FeepingCreature
Can confirm.
Zizianism is not a logical outgrowth of rationality for many, many reasons. These people are just crazy.
tmshapland
Keep fighting, Sterlind. Most people aren't full of hate. Just the assholes who take the time to comment mean things on social platforms.
adastra22
It’s not “like blood libel.” It is blood libel. It is literally the same thing.
Muromec
Nobody is literally Hitler, because the man is for sure dead. Also nobody is literally a Nazi, as the party was disbanded. So we are all somewhere on the spectrum between being and not being Hitlers, Nazis, fascists, or what not; if nothing else, by the fact of being humans (or some other sort of sentient next-token predictor).
zozbot234
> almost all the members of this group are trans.
The Zizians.info site (linked by one of the HN posts re: this story) mentions that the Zizians did target people who identified as transgender for indoctrination, so this is not really surprising. People who are undergoing this kind of stress and marginalization may well be more vulnerable to such tactics.
adastra22
The Ziz method of indoctrination involves convincing his minions they are trapped inside a mental framework they need to break free of. Trans people already feel trapped in a body not aligned with who they are, and are naturally susceptible to this message (and therefore natural targets for recruitment).
adastra22
Following on because the edit window elapsed: the specific method of indoctrination used by Ziz (but invented by Gwen) involved a novel method of sleep deprivation to induce split personalities. It's been called "installing demons." I wouldn't be surprised either if the causality is the other way around: Ziz reworked these people to be trans. Ziz certainly seems to have treated the minds of the people around him/her as malleable putty.
I actually met Gwen and spent a weekend with him some time ago. They were a roommate of my friend for a while. The person I knew doesn't resemble the lunatic in the news articles at all, but I have no doubt it is physically/legally them. Cult indoctrination is a hell of a thing.
snickerbockers
I was wondering if it's relevant that so many of them are young people with "data science" degrees and they call themselves "rationalists". Sounds like they have some sort of superiority complex that might make them more susceptible to justifying acts of violence when it's "rational".
kranke155
Yes and no. Lots of people feel trapped. He/she (I'm not sure who this Ziz is) just sounds like someone who knew they could work their way in with trans people.
watwut
They are also openly hated by general society and targeted for bullying by major political actors.
raverbashing
Yup. 100% a cult indoctrination technique.
The vulnerability is the crowbar the cult uses
onemoresoop
Yeah, all cults exploit vulnerabilities.
transmission77
[flagged]
nullc
I think the relevance of their transness is not very significant.
The lesswrong apocalypse cult has been feeding people's mental illness for years. Their transness likely made them more of outsiders to the cult proper, so e.g. they didn't get diverted into becoming Big Yud's BDSM "math pets" like other women in the cult.
I doubt they are significantly more mentally ill than other members of the cult they just had less support to channel their vulnerability into forms more beneficial to the cult leaders.
Yudkowsky wrote an editorial in Time advocating for the use of nuclear weapons against civilians to prevent his imagined AI doomsday... and people are surprised that some of his followers didn't get the memo that think-pieces are only for navel gazing. If not for the fact that the goal of cult leaders is generally to freeze their victims into inaction and compliance, we probably would have seen more widespread murder as a result of the Yud cult's violent rhetoric.
elif
>I doubt they are significantly more mentally ill than other members.
Why would this particular group defy the US trend of being 4-7x more likely to be afflicted by depressive disorders? We are talking about a demographic with a 46% rate of suicidal ideation, and you doubt that's significant why?
nullc
Suicidal ideation, for example, is common in the lesswrong community, even with people pledging to end their own lives before the machine overlord is able to scan their brains and simulate an infinitude of copies of them in a state of perpetual torture.
Essentially, the community under discussion already selects for and generates mental illness, so the ordinary comorbidities of transgender persons are likely less relevant.
llm_trw
I shudder to ask, but what exactly is a math pet?
CSMastermind
I was unacquainted with the term but after searching it seems that Eliezer Yudkowsky wrote several posts on a BDSM website where he fantasized about recruiting a harem of highly educated women to service him which he called "math pets".
photonthug
Someone that you introduce to free group actions and the Cox-Zucker machine, hairy ball theorems, stuff like that
Texasian
… lesswrong apocalypse cult?
Like the guy who wrote that insufferable Harry Potter fanfiction?
duxup
Marginalized groups seem to be a target / susceptible to this kind of thing.
I had a weird encounter on reddit with some users who expressed that "only X people understand how this character in the movie feels". Interestingly, there was no indication that the movie intended this interpretation. But the idea wasn't unusual or all that out there, so I didn't think much of it. But that group showed up again and again, and eventually someone asked, and their theory all but seemed to imply that nobody else could possibly have ... feelings, and that this lack of understanding made those people lesser and them greater.
It seemed to come from some concept that their experience imparted some unique understanding that nobody else could have, and that just led down a path toward zero empathy for / understanding of anyone outside.
Reddit encounters are always hard to understand IMO, so I don't want to read too much into it, but the isolation that some people / groups feel seems to lead to dark places very easily / quickly.
chroma
This group formed in the SF Bay Area, which is known for being one of the most accepting places in the world for LGBT people. If marginalization were the main cause, it seems to me that the group would have formed somewhere else. I think it's more likely that these people had an underlying mental disorder that predisposed them to both violent behavior and trans identity.
One big difference the Zizians have with the LessWrong community is that LW people believe that human minds cannot be rational enough to be absolute utilitarians, and therefore a certain kind of deontology is needed.[1] In contrast, the Zizians are absolutely convinced of the correctness of their views, which leads them to justify atrocities. In that way it seems similar to the psychology of jihadists.
1. https://www.lesswrong.com/posts/K9ZaZXDnL3SEmYZqB/ends-don-t...
erikpukinskis
> the SF Bay Area, which is known for being one of the most accepting places in the world for LGBT people
I live in the Bay. Maybe that is true, but in absolute terms the level of acceptance is still very low.
Like, if Denver is 10% accepting, the Bay might be 15%. Or something like that.
And Vallejo, while part of the Bay Area is a very different place than, say, the Castro. Culturally, it’s probably more like Detroit than San Francisco.
So I’m not sure if you can really draw any conclusions from your premise.
JumpCrisscross
> If marginalization were the main cause
I think they're crazy first, trans second. They were marginalised for being crazy. Then they found each other because they're trans. Many cults have random attributes shared by the members, whether it be race or sexual preferences. Their race or sexual preference didn't cause them to join a cult, they had other things going on that drove that. But when it came time to join one, they gravitated towards the one that identified with them.
DangitBobby
Or more of them live there because it's one of the most accepting environments on the planet, but still not accepting enough to prevent them from being a marginalized outgroup that is quite easy to radicalize by those that would accept them?
TMWNN
>I had a weird encounter on reddit with some users who expressed that "only X people understand how this character in the movie feels". Interestingly, there was no indication that the movie intended this interpretation.
The death of the author is a reasonable approach to reading a work. But what you said reminded me of the more delusional view in which a) the watcher/reader's approach is the only "correct" one, and b) anyone who disagrees is *EVIL*. An instance of this happened among Tumblrinas obsessed with the supposed homosexual relationship between Holmes and Watson on BBC's Sherlock, who were certain that the next episode of the show would reveal this to the world. Welp. <https://np.reddit.com/r/the_meltdown/comments/5oc59t/tumblr_...>
habinero
You can find all kinds of ridiculous people online, and they're all mostly harmless.
I mean, every LLM post on HN gets people writing fanfic about how AI is developing human intelligence and other silly things.
There's frankly no difference between the two groups -- they are equally silly -- except one is coded female and people like to shit on those hobbies more than male-coded AI fanfic.
codr7
I see it mainly as a reaction to a dysfunctional and abusive system/culture, and not necessarily a constructive one.
Fix the world and these problems don't exist.
akoboldfrying
>Fix the world and these problems don't exist.
Hard disagree. Plenty of antisocial (or worse) behaviour has been perpetrated by those indisputably at the top of the social food chain -- almost every war of conquest, for example. Did the British Empire expand throughout the world because the British felt marginalised? No, the rest of the world considered them a great power, and many other cultures voluntarily adopted their styles of dress and other customs as a mark of "modernity".
A sense of marginalisation (real or imagined) can certainly be a force that acts to reduce empathy and encourage violence, but it's by no means necessary.
olalonde
There is a well-documented correlation between gender dysphoria, mental health conditions, and autism spectrum disorder. These overlapping factors may contribute to increased vulnerability to manipulative groups, such as cults.
nejsjsjsbsb
Thanks, the pronouns were confusing me and making it hard to follow an already complex story. I assumed I had made a mistake when the article mentioned a Jack, referred to them as Jack the whole way through, but used "she" at the end.
Unfortunately, the gendered language we use is also a mechanism that provides clues and context as you read a story. So if I can't rely on that, they need to call it out to help the reader.
I'd rather the article mention it.
Why are they not? Is this a chilling effect?
jl6
It goes unmentioned because there is an unwritten rule in progressive media that marginalized groups must never be perceived as doing wrong, because that will deepen their marginalization.
In practice it creates a moral blind spot where the worst extremists get a pass, in the name of protecting the rest. Non-progressive media are all too happy to fill in the gap. Cue resentment, cue backlash, cue Trump. Way to go, guys!
sterlind
conservative media has the opposite rule: make every story about a trans person into a narrative about trans ideology.
this should be a story about an ideological cult with trans members, but instead it's a story about the cult of trans ideology. it's called "nut picking" - use the worst examples of a group to tarnish the group as a whole.
a good example of this is attacks against Muslim Americans after 9/11.
xereeto
I think that’s a drastic oversimplification.
actuallyalys
The fact that many are transgender seems to be relevant because it’s a recruiting and manipulation tactic, not because of a connection to “trans activism.” I haven’t seen any evidence of that connection besides people involved being transgender.
red75prime
Why scare quotes? There are political organizations representing trans people (and doing quite a bit of activism).
EA-3167
I don't think it's so much pushing an agenda, as it is avoiding a thermonuclear hot potato of modern life. If you start talking about gender identity, everyone has STRONG opinions they feel they must share. Worse, a subset of those opinions will be fairly extreme, and you're potentially exposing yourself to harassment or worse. If you sound like you're attacking trans people, that's going to end badly. If you sound like you're supporting them, especially as this new US administration takes off... that's going to end badly.
So if you can tell the story without the possibly superfluous detail of the genders of the people involved, that's a pretty obvious angle to omit. Andy Ngo is obviously not doing this, but that's really only because he has a very clear agenda and in fact his entire interest in this story probably stems from that.
nkurz
Yes, that's a reasonable possibility as well. It's not proof of an agenda, and might be prudent, but I do think it's a form of bias. There's a thin line between skipping "possibly superfluous" details and skipping core parts of a story that might provide evidence for viewpoints one disagrees with. The result is still that readers need to realize that they are being presented with a consciously edited narrative and not an unbiased set of facts.
mindslight
It was quite easy to skim over some original source material from both sinceriously.fyi and zizians.info. By my quick reading, and taking a very high level view, the philosophy is responsible for the trans and also for the violence. But an article harping on the correlation as implied causation without focusing on the hidden variable behind them both is just trying to fuel the fire of the reactionary movement. In general, averaging two different flavors of extremist reporting is not a way for obtaining truth.
dinkumthinkum
No, that is omitting quite a significant detail. If the majority of the people involved have a characteristic that occurs in only a tiny percentage of the overall population, there is some correlation, or at least something newsworthy, there.
llm_trw
To quote a hot potato of a previous age: all bankers may be Jews, but not all Jews are bankers.
We know where that one took us.
searealist
> If you sound like you're attacking trans people, that's going to end badly. If you sound like you're supporting them, especially as this new US administration takes off... that's going to end badly.
That's not true: 99 percent of news outlets have absolutely no fear of supporting trans activism.
It's trivial to find hundreds of such cases from SFGate with a Google search.
codr7
No fear yet; that may change instantly, just like it did the other way.
sam345
Doesn't sound rationalist to me (from the Ziz quote in the article linked below):
"Ziz
Impostors keep thinking it's safe to impersonate single goods. A nice place to slide in psyche/shadow, false faces, "who could ever falsify that I'm blaming it on my headmate!"
Saying you're single good is saying, "Help, I have a Yeerk in my head that's a mirror image of me. I need you to surgically destroy it, even if I'm then crippled for life or might die in the process. Then kill me if I ever do one evil act for the rest of my life. That's better than being a slave. Save me even though it is so easy to impersonate me. And you will aggro so many impostors you'll then be in a fight to the death(s) with. Might as well then kill me too if I don't pass an unthinkable gom jabbar. That'll make us both safer from them and I care zero about pain relative to freedom from my Yeerk at any cost."
It's an outsized consequentialist priority, even in a doomed timeline, to make it unsafe to impersonate single goods.
Critical to the destiny of the world. The most vulnerable souls impostors vex. To bring justice to individual people, from collective punishment."
More detail: https://openvallejo.org/2025/01/31/zizian-namesake-who-faked....
grumple
This Ziz person is really unhinged. I read some of their writing, it reminds me of every eloquent, manipulative narcissist I've met. They are never as smart as they think they are - or as smart as they want you to think they are - though they may be smart, charming, and engaging. They've created an alternate universe in their mind and haphazardly abuse whatever ideas they've encountered to justify it.
anon84873628
what the heck does any of this mean??
guerrilla
Sadly, I completely understand after reading all the links in this thread tonight.
The specific theory they're speaking in: https://zizians.info/
Backstory (7 chapters): https://aiascendant.substack.com/p/extropias-children-chapte...
squigz
A relevant bit from zizians.info
> This jargon serves multiple purposes. An important one is that it separates Zizians from others. The cost to read and understand Sinceriously is substantial, and most people are not willing to pay it. Another is to warp the beliefs of people who use it. The connotations and affect of Ziz's jargon encode her moral beliefs about the world separated from the quality of reasoning used to produce them. By offering language that reinforces these judgments Ziz creates conditions where even engaging with the beliefs of her followers requires the repetition and reinforcement of their frame.
kragen
Could you offer us a translation of the first two or three sentences? Apparently this comment was interpreted as a threat of murder.
jschoe
They write and talk in their group lingo so outsiders can't understand it without diving deep into their lore, mindset and community. It's a common thing. Seen it numerous times. Don't waste your time.
FeepingCreature
To be clear, this is not rationalist lingo.
booleandilemma
I'm sure it makes sense to the most indoctrinated of the cult members.
Muromec
One of the many rabbit holes, the deeper layers of.
null
A later article by the same author: https://www.sfgate.com/bayarea/article/leader-alleged-bay-ar.... Probably makes sense to read both or neither.