Facts don't change minds, structure does
196 comments
· July 22, 2025 · andrewmutz
ianbicking
I remember reading an article on one of the classic rationalist blogs (but they write SO MUCH I can't possibly find it) describing something like "rational epistemic skepticism" – or maybe a better term I can't recall either. (As noted below: "Epistemic learned helplessness")
The basic idea: an average person can easily be intellectually overwhelmed by a clever person (maybe the person is smarter, or more educated, or maybe they just studied up on a subject a lot). They basically know this... and also know that it's not because the clever person is always right. Because there's lots of these people, and not every clever person thinks the same thing, so they obviously can't all be right. But the average person (average with respect to whatever subject) is still rational and isn't going to let their beliefs bounce around. So they develop a defensive stance, a resistance to being convinced. And it's right that they do!
If someone confronts you with the PERFECT ARGUMENT, is it because the argument is true and revelatory? Or does it involve some sleight of hand? The latter is much more likely.
tunesmith
I tend to like the ethos/logos/pathos model. Arguments from clever people can sound convincing because ethos gets mixed in. And anyone can temporarily confuse someone by using pathos. This is why it's better to have arguments externalized in a form that can be reviewed on their own, logos only. It's the only style that can stand on its own without that ephemeral effect (aside from facts changing), and it's also the only one that can be adopted and owned by any listener that reviews it and proves it true to themselves.
marcosdumay
It's usually dumb people who have so many facts and different arguments that one can't keep up.
And they usually have so many of those because they were convinced to pay disproportionate attention to them and don't see the need to check anything or reject bad sources.
scotty79
I noticed something similar. People who believe in absolute garbage tend to be the ones who don't have a robust BS filter that would let them quickly reject it. And it's surprisingly orthogonal to a person's intelligence. There's a correlation, but even very intelligent people can have a very weak BS filter, and their intelligence post-rationalizes the absolute garbage they were unable to reject.
ayoubd
Was it this one? “Epistemic learned helplessness”
https://slatestarcodex.com/2019/06/03/repost-epistemic-learn...
ianbicking
Yes, that's the one, thank you!
TheOtherHobbes
The problem isn't the PERFECT ARGUMENT, it's the argument that doesn't look like an argument at all.
Take anti-vaxxers. If you try to argue with the science, you've already lost, because anti-vaxxers have been propagandised into believing they're protecting their kids.
How? By being told that vaccinations are promoted by people who are trying to harm their kids and exploit the public for cash.
And who tells them? People like them. Not scientists. Not those smart people who look down on you for being stupid.
No, it's influencers who are just like them, part of the same tribe. Someone you could socialise with. Someone like you.
Someone who only has your best interests at heart.
And that's how it works. That's why the anti-vax and climate denial campaigns run huge bot farms with vast social media holdings which insert, amplify, and reinforce the "These people are evil and not like us and want to make you poor and harm your kids" messaging, combined with "But believe this and you will keep your kids safe".
Far-right messaging doesn't argue rationally at all. It's deliberate and cynically calculated to trigger fear, disgust, outrage, and protectiveness.
Consider how many far-right hot button topics centre on protecting kids from "weird, different, not like us" people - foreigners, intellectuals, scientists, unorthodox creatives and entertainers, people with unusual sexualities, outgroup politicians. And so on.
So when someone tries to argue with it rationally, they get nowhere. The "argument" is over before it starts.
It's not even about rhetoric or cleverness - both of which are overrated. It's about emotional conditioning using emotional triggers, tribal framing, and simple moral narratives, embedded with constant repetition and aggressive reinforcement.
dasil003
I liked your point about tribalism up until you said one tribe is rational and the other not. The distribution of rational behavior does not change much tribe to tribe, it's the values that change. As soon as you say one tribe is more rational than another you're just feeding into more tribalism by insulting a whole group's intelligence.
I think the real problem is that zero friction global communication and social media has dramatically decreased the incentive to be thoughtful about anything. The winning strategy for anyone in the public eye is just to use narratives that resonate with people's existing worldview, because there is so much information out there and our civilization has become so complex that it's overwhelming to think about anything from first principles. Combine that with the dilution of local power as more and more things have gone online and global, a lot of the incentives for people to be truthful and have integrity are gone or at least dramatically diminished compared to the entirety of human history prior to the internet.
hn_acc1
It's also mentioned in "the authoritarians" (search for the book and the short-form essay) - roughly half the population is driven by intellectual curiosity about all kinds of things and don't always agree on much - they just want freedom to be individuals.
The other half is driven by fear, disgust, paranoia, etc.. That second group is much easier to trigger / convince - just play on their fears about their kids, their friends, their church ("will ban Bibles and churches"), etc.. (I was raised in this kind of environment).
Authoritarians WANT a "strong leader" to tell them what to think, how to act, etc. That's how they show they belong to the tribe: they believe everything that is said, they give the most $$ to their church, etc.
brailsafe
I really think most of these statements apply to both political sides of messaging in a majority of cases. You can't talk about in-group out-group unless you draw a line somewhere, and in your comment you drew a line between people who represent science and rationality and those that are fearful and reactionary, which you'd believe to be a sensible place to draw that line if you habitually consume basically any media. The actual science seems mostly incidental to any kind of conversation about it.
Some people are crippled by anxiety and fear of the unknown or fear of their neighbors. It's sad, but it's not unique to political alignment.
prometheus76
Ah yes. People who think like you and agree with you are rational, not prone to fear, disgust, outrage, or protectiveness. But people who disagree with you are obviously irrational and can't be reasoned with. You are "educated" and they are "fear-mongers".
webnrrd2k
Just to add a little to the discussion, I suspect that the "not like us" messaging is mostly a right-wing thing, while there's more of a "don't contaminate my fluids" argument from the far-left.
Neither is a rational argument, and both still trigger the same disgust and fear, but they tend to have different implications for outgroups.
mpyne
Was it this one? https://slatestarcodex.com/2019/06/03/repost-epistemic-learn...
nudgeOrnurture
repetition breeds rationalism. variety of phrasing breeds facts.
it's how the brain works. the more cognitive and perceptive angles agree on the observed, the more likely it is that the observed is really / actually observed.
polysemous language (ambiguity) makes it easy to manipulate the observed. reinterpretation, mere exposure and thus coopted, portfolio communist media and journalism, optimize, while using AI for everything will make it as efficient as it gets.
keep adding new real angles and they'll start to sweat or throw towels and tantrums and aim for the weak.
mistermann
[dead]
zaphar
The best way to lie is not presenting false facts; it's curating facts to suit your narrative. It's also easy to accidentally lie to yourself or others this way. See a great many news stories.
prometheus76
The act of curating facts itself is required to communicate anything because there are an infinite number of facts. You have to include some and exclude others, and you arrange them in a hierarchy of value that matches your sensibilities. This is necessary in order to perceive the world at all, because there are too many facts and most of them need to be filtered. Everyone does this by necessity. Your entire perceptual system and senses are undergirded by this framework.
There is no such thing as "objective" because it would include all things, which means it could not be perceived by anyone.
__MatrixMan__
The subjective/objective split is useful. What good is raising the bar for objectivity such that it can never be achieved? Better to have objective just mean that nobody in the current audience cares to suggest contradictory evidence.
It's for indicating what's in scope for debate, and what's settled. No need to invoke "Truth". Being too stringent about objectivity means that everything is always in scope for debate, which is a terrible place to be if you want to get anything done.
api
I often put it this way: you can lie with the truth. I feel like most people don't get this.
darksaints
To add to your second point, those algorithms are extremely easy to game by states with the resources and desire to craft narratives. Specifically Russia and China.
There has actually been a pretty monumental shift in Russian election meddling tactics in the last 8 years. Previously we had the troll army, in which the primary operating tactic of their bot farms was to pose as Americans (as well as Poles, Czechs, Moldovans, Ukrainians, Brits, etc.) but push Russian propaganda. Those bot farms were fairly easy to spot and ban, and there was a ton of focus on it after the 2016 election, so that strategy was short lived.
Since then, Russia has shifted a lot closer to Chinese style tactics, and now have a "goblin" army (contrasted with their troll army). This group no longer pushes the narratives themselves, but rather uses seemingly mindless engagement interactions like scrolling, upvoting, clicking on comments, replying to comments with LLMs, etc., in order to game what the social media algorithms show people. They merely push the narratives of actual Americans (not easily bannable bots) who happen to push views that are either in line with Russian propaganda, or rhetoric that Russian intelligence views as being harmful to the US. These techniques work spectacularly well for two reasons: the dopamine boost to users who say abominable shit as a way of encouraging them to do more, and as a morale-killer to people who might oppose such abominable shit but see how "popular" it is.
https://www.bruegel.org/first-glance/russian-internet-outage...
yorwba
> These techniques work spectacularly well for two reasons
Do they work spectacularly well, though? E.g. the article you link shows that Twitter accounts holding anti-Ukrainian views received 49 fewer reposts on average during a 2-hour internet outage in Russia. Even granting that all those reposts were part of an organized campaign (it's hardly surprising that people reposting anti-Ukrainian content are primarily to be found in Russia) and that 49 reposts massively boosted the visibility of this content, its effect is still upper-bounded by the effect of propaganda exposure on people's opinions, which is generally low. https://www.persuasion.community/p/propaganda-almost-never-w...
darksaints
Notice that the two reasons I mentioned don't hinge on changing anyone's mind.
1 - They boost dopamine reward systems in people who get "social" validation of their opinions/persona as an influencer. This isn't something specific to propaganda...this is a well-observed phenomenon of social media behavior. This not only gives false validation to the person spreading the misinformation/opinions, but it influences other people who desire that sort of influence by giving them an example of something successful to replicate.
2 - In aggregate, it demoralizes those who disagree with the opinions by demonstrating a false popularity. Imagine, for example, going to the comments of an instagram post of something and you see a blatant neo-nazi holocaust denial comment with 50,000 upvotes. It hasn't changed your mind, but it absolutely will demoralize you from thinking you have any sort of democratic power to overcome it.
No opinions have changed, but more people are willing to do things that are destructive to social discourse, and fewer people are willing to exercise democratic methods to curb it.
foobarian
> a "goblin" army
Hah, a "monkey amplifier" army! Look at the garbage coming out of infinite monkeys' keyboards and boost what fits. Sigh
NooneAtAll3
> Specifically Russia and China.
...or USA
psychoslave
What should make us believe any other state propaganda is better, even for its own general population?
staph
Thanks for your thoughts, they perfectly extend mine. I agree that it would be a sign of a very fragile belief system if it could be unwound by a single bit of contradictory evidence. As for the "facts" we're getting 24/7 out of every microwave, they're just a sign of a complete decoupling of people's beliefs from empirical reality, in my humble opinion. Supply and demand and all that.
prometheus76
I would contend that empiricism is inadequate to discern what is real and what is true. Much of human experience and what is meaningful to being a person is not measurable nor quantifiable.
mike_hearn
> If you believe in climate change and encounter a situation where a group of scientists were proven to have falsified data in a paper on climate change, it really isn't enough information to change your belief in climate change, because the evidence of climate change is much larger than any single paper.
Although your wider point is sound that specific example should undermine your belief quite significantly if you're a rational person.
1. It's a group of scientists and their work was reviewed, so they are probably all dishonest.
2. They did it because they expected it to work.
3. If they expected it to work it's likely that they did it before and got away with it, or saw others getting away with it, or both.
4. If there's a culture of people falsifying data and getting away with it, that means there's very likely to be more than one paper with falsified data. Possibly many such papers. After all, the authors have probably authored papers previously and those are all now in doubt too, even if fraud can't be trivially proven in every case.
5. Scientists often take data found in papers at face value. That's why so many claims are only found to not replicate years or decades after they were published. Scientists also build on each other's data. Therefore, there are likely to not only be undetected fraudulent papers, but also many papers that aren't directly fraudulent but build on them without the problem being detected.
6. Therefore, it's likely the evidence base is not as robust as previously believed.
7. Therefore, your belief in the likelihood of their claims being true should be lowered.
In reality how much you should update your belief will depend on things like how the fraud was discovered, whether there were any penalties, and whether the scientists showed contrition. If the fraud was discovered by people outside of the field, nothing happened to the miscreants and the scientists didn't care that they got caught, the amount you should update your belief should be much larger than if they were swiftly detected by robust systems, punished severely and showed genuine regret afterwards.
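The updating argument above can be made concrete with a toy Bayesian calculation. This is only an illustrative sketch: the prior and the likelihoods below are invented numbers, chosen to show how the circumstances of the fraud's discovery change the size of the update, not to estimate anything real.

```python
def bayes_update(prior, p_evidence_given_true, p_evidence_given_false):
    """Posterior P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_given_true * prior
    denominator = numerator + p_evidence_given_false * (1 - prior)
    return numerator / denominator

# Hypothetical prior confidence that the field's overall conclusion is sound.
prior = 0.95

# Case A: fraud caught quickly by robust internal review. Seeing one such
# case is only slightly more likely if the evidence base is weak.
post_a = bayes_update(prior, 0.10, 0.15)

# Case B: fraud found by outsiders, no penalties, no contrition. Such a
# case is far more likely in a field whose checks are broken.
post_b = bayes_update(prior, 0.10, 0.60)

print(round(post_a, 3))  # ~0.927: a small downgrade
print(round(post_b, 3))  # 0.76: a much larger downgrade
```

Same observed event, very different posteriors: the update size is driven by how probable that event would be under the "weak evidence base" hypothesis, which matches the comment's point about discovery, penalties, and contrition.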
jmcqk6
You're making a chain of assumptions and deductions that are not necessarily true given the initial statement of the scenario. Just because you think those things logically follow doesn't mean that they do.
You also make throwaway assertions like "That's why so many claims are only found to not replicate years or decades after they were published." What is "so many claims"? The majority? 10%? 0.5%?
I totally agree with you that the nuances of the situation are very important to consider, and the things you mention are possibilities, but you are too eager to reject things if you think "that specific example should undermine your belief quite significantly if you're a rational person." You made lots of assumptions in these statements and I think a rational person with humility would not make those assumptions so quickly.
goatlover
The idea that people believe in climate change (or evolution) is odd considering people don't say they believe in General Relativity or atomic theory of chemistry. They just accept those as the best explanations for the evidence we have. But because climate change and evolution run counter to some people's values (often religious but also financially motivated), they get called beliefs.
wredcoll
> But because climate change and evolution run counter to some people's values (often religious but also financially motivated), they get called beliefs
Hey, weren't we just talking about propaganda?
psychoslave
You generally only oppose things you can grasp well enough to understand how they challenge other beliefs you culturally or intuitively integrated.
Evolution directly challenges the idea that humans are very special creatures in a universe where mighty mystic forces care about them a lot.
Climate change, and the weight of human industry in it, directly challenges the lifestyle expectations of the wealthiest.
SkyBelow
To some extent, physics/chemistry/etc. challenge the notion that free will exists, but that challenge is far enough removed and rarely touched upon that people who believe in free will don't feel that modern science is attacking that belief, and the scientists working on it generally see free will or any mechanisms of the brain as far too complex when they are studying things on the order for a few particles or few molecules.
Some of neurology/psychology gets a bit closer, but science of the brain doesn't have major theories that are taught on the same level nor have much impact on public policy. The closest I can think of is how much public awareness of what constitutes a mental disorder lags behind science, but that area is still constantly contested even among the researchers themselves and thus prevents a unified message being given to the public that they must then respond to (choosing to believe the science or not).
zahlman
> the previous era of corporate controlled news media... The facts you are exposed to today are usually decided by an algorithm
... But that algorithm is still corporate controlled.
miki123211
See also: the Chinese robber fallacy.
Even if only 0.1% of Chinese people engaged in theft, and that would be a much lower rate than in any developed country, you'd still get a million Chinese thieves. You could show a new one every day, bombarding people with images and news reports of how untrustworthy Chinese people are. The news reports themselves wouldn't even be misinformation, as all the people shown would actually be guilty of the crimes they were accused of. Nevertheless, people would draw the wrong conclusion.
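The base-rate arithmetic in that comment can be sanity-checked in a few lines; the population figure is a rough assumption (~1.4 billion), and the 0.1% rate is the comment's hypothetical.

```python
population = 1_400_000_000   # rough assumption for China's population
theft_rate = 0.001           # the hypothetical 0.1% from the comment

thieves = int(population * theft_rate)
print(thieves)  # 1,400,000 -> "a million Chinese thieves"

# At one new example per day, the stream of true-but-misleading stories
# could run for millennia without ever repeating a case:
years_of_daily_stories = thieves / 365
print(round(years_of_daily_stories))  # ~3,836 years
```

This is the core of the fallacy: a tiny rate times a huge base produces an inexhaustible supply of individually accurate examples, so the volume of examples tells you nothing about the rate.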
jfarmer
CS Peirce has a famous essay "The Fixation of Belief" where he describes various processes by which we form beliefs and what it takes to surprise/upset/unsettle them.
The essay: https://www.peirce.org/writings/p107.html
This blog post gestures at that idea while being an example of what Peirce calls the "a priori method". A certain framework is first settled upon for (largely) aesthetic reasons and then experience is analyzed in light of that framework. This yields comfortable conclusions (for those who buy the framework, anyhow).
For Peirce, all inquiry begins with surprise, sometimes because we've gone looking for it but usually not. About the a priori method, he says:
“[The a priori] method is far more intellectual and respectable from the point of view of reason than either of the others which we have noticed. But its failure has been the most manifest. It makes of inquiry something similar to the development of taste; but taste, unfortunately, is always more or less a matter of fashion, and accordingly metaphysicians have never come to any fixed agreement, but the pendulum has swung backward and forward between a more material and a more spiritual philosophy, from the earliest times to the latest. And so from this, which has been called the a priori method, we are driven, in Lord Bacon's phrase, to a true induction.”
CGMthrowaway
Wow. I'm reminded of a great essay/blog I read years ago, which I'll never find again, that said a good, engaging talk/presentation has to have an element of surprise. More specifically, you start with an exposition of what your audience already knows/believes, then you introduce your thesis, which is SURPRISING in terms of what they already know. Not too far out of the realm of belief, but just enough.
The bigger/more thought-diverse the audience, the harder this is to do.
mswen
I had a grad school mentor William Wells who taught us something similar. A good research publication or presentation should aim for "just the right amount of surprise".
Too much surprise and the scientific audience will dismiss you out of hand. How could you be right while all the prior research is dead wrong?
Conversely, too little surprise and the reader / listener will yawn and say but of course we all know this. You are just repeating standard knowledge in the field.
Despite the impact on audience reception we tend to believe that most fields would benefit from robust replication studies and the researchers shouldn't be penalized for confirming the well known.
And, sometimes there really is paradigm breaking research and common knowledge is eventually demonstrated to be very wrong. But often the initial researchers face years or decades of rejection.
joelg
my understanding (which is definitely not exhaustive!) is that the case between Galileo and the church was way more nuanced than is popularly retold, and had nothing whatsoever to do with Biblical literalism like the passage in Joshua about making the sun stand still.
Paul Feyerabend has a book called Against Method in which he essentially argues that it was the Catholic Church who was following the classical "scientific method" of weighing evidence between theories, and Galileo's hypothesis was rationally judged to be inferior to the existing models. Very fun read.
marcofloriano
I completely agree with your comment. The common narrative about Galileo and the Church is often oversimplified and overlooks the intellectual context of the time. As you pointed out, it wasn’t about a crude Biblical literalism—after all, even centuries before Galileo, figures like Saint Thomas Aquinas, drawing on Aristotle, already accepted that the Earth is spherical.
By Galileo’s era, the Catholic Church was well aware of this scientific truth and actively engaged with astronomy and natural philosophy. The dispute was far more about competing models and the standards of evidence required, not a refusal to accept reason or observation.
Then I can’t help but think: if the author of the article didn’t even understand this, how can the rest of the article be correct if it started from a biased and almost false premise?
ajkjk
> Then I can’t help but think: if the author of the article didn’t even understand this, how can the rest of the article be correct if it started from a biased and almost false premise?
That seems pretty unfair. The article is clearly structured to treat the Galileo thing as an example, not a premise. It is supposed to be a familiar case to consider before going into unfamiliar ones. In that sense it clearly still works as an example even if it's false: does it not set you up to think about the general problem, even if it's a fictional anecdote? It's no different than using some observation about Lord of the Rings or Harry Potter as an example before setting into a point. The fact that it's fictional doesn't affect its illustrative merits.
carbonguy
> The fact that it's fictional doesn't affect its illustrative merits.
Indeed, it may even reinforce the overall argument being made in the post we're discussing; the "Galileo vs. Catholicism" narrative is itself a linchpin trope in an empirical scientific worldview, with the trope reinforcing (among other beliefs) that "it's right and proper to pursue and advocate for objective truth even to the extent of making enemies of the most powerful."
Considering the likely audience for a piece like this post we're discussing, that the Galileo narrative doesn't necessarily reflect what actually happened historically makes it a pretty good example on a meta-level. Are any of us who have the belief in the ultimate value of objectivity going to give up on it because a potentially weak example was used to support it?
psychoslave
Galileo started the trolling himself by putting the opposing theory in the mouth of Simplicius.
And even with his acquaintance with the pope, he ended up jailed at home. Far better than being burned alive, as the Church did to Giordano Bruno.
So, yes, there are more nuances to the affair. But on the one hand, the case around the lack of observable parallax, or other indeed judicious reasoning, doesn't make a great narrative to sell; and on the other hand, focusing on technical details is kind of missing the forest for the trees regarding the social issues at stake that the trial exemplified.
SkyBelow
Was it during Galileo's era, or was it much earlier, with the Greek philosophers, that the idea of heliocentrism was rejected because of the lack of parallax movement of the stars? The idea of stars being so far away that they wouldn't show parallax movement wasn't acceptable without stronger evidence than was available at the time, given how massive that would make outer space, so the simpler explanation was that the sun moved.
kijin
The author doesn't use the Galileo episode as a premise, only as a catchy illustration. If anything, the more nuanced version of the story seems to support their argument better than the simplified version does.
ajuc
> Then I can’t help but think: if the author of the article didn’t even understand this, how can the rest of the article be correct if it started from a biased and almost false premise?
Same way Galileo could be correct about Earth circling the Sun despite basing it on incorrect assumptions :)
Asraelite
> By Galileo’s era, the Catholic Church was well aware of this scientific truth and actively engaged with astronomy and natural philosophy.
I'm confused. Are you saying that the Church knew the Earth was round or not? If they knew, then it doesn't matter what arguments were made, it was all in bad faith and therefore wasn't scientific.
EDIT: Never mind, I misread
mcswell
The sphericity of the Earth was not what Galileo and the Church were arguing about--they were arguing about whether the Sun revolved around the Earth, or the Earth around the Sun.
mike_hearn
The idea that people used to think the Earth was flat is a common misconception. Sometimes medieval painters would draw the Earth that way for artistic purposes, but nobody seriously thought it worked that way for real.
Why not? It's obvious to anyone who has watched a ship sail over the horizon that the Earth must be a sphere, because you see the body of the ship disappear before its mast does.
looperhacks
The church knew that the earth was round. Which is largely irrelevant, because Galileo argued for a heliocentric model vs the (at the time popular) geocentric model. Nobody was disputing that the earth was round.
libraryofbabel
> the case between Galileo and the church was way more nuanced than is popularly retold
Ex historian here. This is true. It’s a complicated episode and its interpretation is made more murky by generations of people trying to use it to make a particular rhetorical point. Paul Feyerabend is guilty of this too, although he’s at least being very original in the contrarian philosophy of science he’s using it for.
If anyone is interested in the episode for its own sake (which is rare actually, unless you’re a renaissance history buff first and foremost), I’d probably recommend John Heilbron’s biography which has a pretty balanced take on the whole thing.
legitster
I just recently watched a lecture about this and was fascinated.
Specifically, the (incorrect) model of the universe that was used in Europe at the time had been refined to the point that it was absurdly accurate. Even had they adopted a heliocentric model, there would have been no direct benefit for a long, long time. If anything, Galileo's work was rife with errors and mathematical problems that would have taken a lot of work to figure out.
So the argument was to take on a bunch of technical debt and switching costs for almost no benefits.
TheOtherHobbes
Does Feyerabend explain why Galileo was placed under house arrest?
Perhaps I'm missing some nuance here, but I don't see why a rational argument about competing models would require such drastic suppression.
opo
I have always thought the lesson here is to be careful when insulting those with a great deal of power over you. Pope Urban VIII was originally a patron and supporter of Galileo:
>...Indeed, although Galileo states in the preface of his book that the character is named after a famous Aristotelian philosopher (Simplicius in Latin, Simplicio in Italian), the name "Simplicio" in Italian also had the connotation of "simpleton."[55] Authors Langford and Stillman Drake asserted that Simplicio was modeled on philosophers Lodovico delle Colombe and Cesare Cremonini. Pope Urban demanded that his own arguments be included in the book, which resulted in Galileo putting them in the mouth of Simplicio. Some months after the book's publication, Pope Urban VIII banned its sale and had its text submitted for examination by a special commission
akurtzhs
He indirectly called the Pope a simpleton, and the Pope took offense.
PhasmaFelis
It wasn't his theory, it was that he presented it in the form of a dialogue with a character who was an obvious stand-in for the Pope, and then made that character sound like a complete idiot.
The heresy charges were an excuse to punish him for being disrespectful. He'd gotten approval from the Pope to publish; he would have been fine if he'd just been polite.
Obviously that's still petty and unjustified, but science denial wasn't the real reason for it.
veqq
> why Galileo was placed under house arrest
Galileo's friend Barberini became Pope and asked Galileo to write a book. But Barberini became paranoid about conspiracies and thought it had seditious, secretly-critical undertones.
teabee89
Reminds me of the Galileo podcast series in the Opinionated History of Mathematics by Viktor Blasjo: https://intellectualmathematics.com/opinionated-history-of-m...
wahern
> and had nothing whatsoever to do with Biblical literalism like the passage in Joshua about making the sun stand still.
The church is and was a large, often heterogeneous institution. For some, the issue was about conflict with literal interpretations of the Bible, not merely the predominant allegorical interpretations (a more widely held concern, at least as a pedagogic matter). AFAIU, while the pope wasn't of this mind, some of the clerics tapped to investigate were. See, e.g., the 1616 Consultant's Report,
> All said that this proposition is foolish and absurd in philosophy, and formally heretical since it explicitly contradicts in many places the sense of Holy Scripture, according to the literal meaning of the words and according to the common interpretation and understanding of the Holy Fathers and the doctors of theology.
https://www.vaticanobservatory.org/sacred-space-astronomy/in...
jack_h
As I’ve grown older and witnessed history in action I’ve begun to understand that reality is much, much more complicated than the simple narratives of history we lean on as a society.
Just think of how many different competing narratives are currently in existence surrounding this tumultuous point in history and realize that at some point some of these narratives will become dominant. Over time as the events leave social memory the key conclusions will likely be remembered but a lot of the reasoning behind them will not. As it exits living memory most of the nuance and context is lost. Over time we may change the narrative by reconsidering aspects that were forgotten, recontextualizing events based on modern concepts and concerns, misunderstanding what happened, or even surreptitiously “modifying” what happened for political ends. Or to put it more plainly, history is written by the victors and can be rewritten as time goes on and the victors change.
throwawayffffas
The important thing is not why they thought they were right but the fact that they could not tolerate being wrong, or even tolerate dissent on that one little inconsequential thing.
That's why you have people today pushing for flat earth and creationism.
Because their whole shtick is we are always right about absolutely everything.
zebomon
Very engaging look at a very difficult topic to approach analytically.
I'm reminded of something I learned about the founder of Stormfront, the internet's first white supremacist forum. His child went on to attend college away from home, her first time away from her family, and over a period of roughly two years, she attended dinners with a group of Jewish students who challenged each of her beliefs one at a time. Each time, as she accepted the evidence her friends presented to her about a particular belief, she nonetheless would integrate the new information with her racist worldview. This continued piece by piece until there was nothing left of her racist worldview at all.
It's both heartening and disheartening at the same time, because if this person can change her mind after almost two decades of constant indoctrination during her formative years, then surely anyone can change their mind. That's the heartening part: the disheartening part is, of course, that the effort it took is far from scalable at present and much more difficult to apply to someone who remains plugged into whatever information sources they are getting their current fix of nonsense from.
zahlman
It's also noteworthy that she was willing to sit with and listen to them in the first place.
mensetmanusman
AI chat bots in the future may be a part of ritual mind cleansing.
staph
Wait are you writing from the past?
pessimizer
I think this is just gloating. Children leaving home for college and quickly abandoning the belief systems of their family is almost more common than the opposite, where they maintain them. Especially if the belief system is something as unpopular as white supremacy mythology; not easy to make new friends at your new school if you don't give that up.
I'm sure she maintains many beliefs that many people would see as racist, along with her classmates. She hasn't been educated or fixed; she just left home.
hn_acc1
IIRC, the stats say that overwhelmingly, children will become a version of their parents, including beliefs, etc. This actually seems more like the exception than the rule.
ilaksh
I remember my first year in college as being the time when I solidified my own first worldview. Prior to that, I had some ideas like the existence of God (in some form) that I was ambivalent about or maybe deferring final judgement. That's when I decided that I was an atheist.
Coincidentally, around the same time my twin brother became a serious Christian. He was socially integrated into a group. He finished college. I did not.
Then years later, maybe late 20s or early 30s, I became convinced that I had been wrong about my government my whole life and that they were not trustworthy. 9/11 being a false flag (which I still believe) was evidence of that.
The interesting thing was at the time when I was in New York I had completely accepted the idea that those three buildings had all turned into dust because the jet hit them. I remember walking around lower Manhattan to pick up a check and the dust was just coating everything.
I had even done some word processing on one of the twin towers leases shortly before the event while temping at Wachtell Lipton. At the time I made no connection.
Anyway, I think an underappreciated aspect of belief graphs is their connection to social groups and identity. It was much easier for me to question institutions when I already felt more marginalized and actually partly blamed society for it being so hard for me to handle my needs and find a place in it.
Another aspect of group membership and beliefs is practical. When groups are competing strategically, they often do so in ways that are not particularly ethical. It's much easier to justify this if you think of the other group as being deeply flawed, evil, invaders, etc.
Although some of these demonizations of the other group do have some kernel of truth to them, they are largely oversimplifications in the belief graphs, leading to dangerous inaccuracies.
What are the practical structural and cultural differences that lead to the group divisions? They largely seem geographic, economic, ethnic.
Could a more sophisticated, better integrated, and more accurate belief system help? Or do the social structures and networks largely define the groups?
Are we just basically mammalian ant colonies? Brutally fighting each other for dominance any time there is a resource conflict?
If the other side seems to be trying to hog important resources any time they get a chance, you perceive that you are not playing a fair game. It's not a civil interaction. The other side doesn't play by the rules or tell the truth or allow any subtlety in discourse. So why should your group, unless it wants to get wiped out?
In my worldview the faint hope is that having more abundance of resources will somehow lead to more civility.
wjholden
To the author: I love this idea, but your blog has two problems that made it less enjoyable for me to read. The first is the pull quotes. I find them confusing and unnecessary, especially when they repeat sentences in the preceding paragraph. The second is that I got stuck on the moving graphs while scrolling on my phone. I suggest making them smaller with a different background color or simply make them static images.
staph
I really appreciate this feedback, I'll look into both of those before the next post. Just wanted to say thanks.
wintermutestwin
Just to add to this: I couldn’t read the text in the white boxes of the graphs. Very unfortunate choices of colors…
meowface
Some of the core ideas here seem good, but the node/edge distinction feels too fuzzy. The node "Climate Change Threat" is a claim. Is the node "Efficiency" a claim? Can one challenge the existence of Efficiency? If one instead challenges the benefit of Efficiency, isn't that an edge attack?
I could give a bunch of other examples where the nodes in the article don't feel like apples-to-apples things. I feel less motivated to try to internalize the article due to this.
mcswell
The edges are labeled by transitive verbs, where the arrow points from the subject of that verb to the direct object. (I'm counting particle verbs, like "leads to", as verbs.) The nodes are labeled by nouns. If you can change a noun to a verb, I guess you would be changing what is an edge and what is a node.
Example: In the article's first diagram, there is a node labeled "Innovation". This could be replaced by a node labeled "Capitalist" and a node labeled "Improvement", with an arrow from the first to the second labeled "innovates."
So yes, if you can replace a node by an edge (and vice versa, although I don't give an example), this node vs. edge thing is fuzzy.
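The duality can be made concrete with a toy sketch. Below is a minimal Python illustration (the labels are hypothetical, loosely based on the article's first diagram; this is not the article's own model), showing that the "Innovation" node and the "innovates" edge are interchangeable encodings of the same claim:

```python
# A belief graph as labeled triples: (subject_node, edge_label, object_node).
# Labels are illustrative, not taken verbatim from the article.
graph_with_node = {
    ("Capitalism", "drives", "Innovation"),
    ("Innovation", "leads to", "Improvement"),
}

# The same claim with the intermediate noun reified away: "Innovation"
# collapses into a single verb-labeled edge, as the parent comment suggests.
graph_with_edge = {
    ("Capitalism", "innovates toward", "Improvement"),
}

def reachable(graph, start, goal):
    """Can we get from one belief to another by following edges?"""
    frontier, seen = [start], {start}
    while frontier:
        node = frontier.pop()
        if node == goal:
            return True
        for subj, _, obj in graph:
            if subj == node and obj not in seen:
                seen.add(obj)
                frontier.append(obj)
    return False

# Both encodings support the same inference, which is why node-vs-edge
# is a modeling convention rather than a fact about the beliefs themselves.
print(reachable(graph_with_node, "Capitalism", "Improvement"))  # True
print(reachable(graph_with_edge, "Capitalism", "Improvement"))  # True
```

The practical upshot for the grandparent's complaint: which claims end up as nodes and which as edges is a choice the modeler makes, so "attack the node" vs. "attack the edge" is not a clean distinction.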
skybrian
I'm wary of making an "arguments are soldiers" assumption where facts are mostly useful for making arguments, in an attempt to change people's minds.
We should be curious about what's going on in the world regardless of what ideologies we might find appealing. Knowing what's going on in the world is an end in itself. An article with some interesting evidence in it is useful even if you disagree with the main argument.
Facts may not change minds, but we should still support people who do the reporting that brings us the facts.
staph
I just really wish most people had this same kind of attitude, but can't in good faith say that's what I'm observing.
tk90
If you found this interesting, I highly recommend reading "The Righteous Mind" by Jonathan Haidt. It's deeply impacted how I think of morality and politics from a societal and psychological point of view.
Some ideas in the book:
- Humans are tribal, validation-seeking animals. We make emotional snap judgments first and gather reasons to support those snap judgments second.
- The reason the political right is so cohesive (vs the left) is because they have a very consistent and shared understanding and definitions of what Haidt calls the 5 "moral taste receptors" - care, fairness, loyalty, authority, sanctity. Whereas the left trades off that cohesive understanding with diversity.
thelittlenag
I've really enjoyed Haidt's book, though its really a couple of different books in one. I need to read his other work.
To your point about left and right, an interesting point I heard recently is that the left is coalition-driven whereas the right is consensus-driven (at least in US politics). Mapping this back to Haidt, one of his findings is that the left tends to greatly emphasize one or two of the "moral taste receptors", with the right having a roughly equal emphasis between them. It isn't clear to me how these two points might explain each other, but I do wonder if there isn't some self-reinforcement there. If there is, I wonder how/if that might explain political systems more widely.
moate
>>The reason the political right is so cohesive (vs the left)
Citation excruciatingly needed. This feels like recency bias, imo. The Right (I'm assuming we're going US here?) is a coalition of people from all walks of life just as much as the left. I mean, right now large chunks of Trump voters are rioting over the Epstein non-release, and all the people who were in it for the tax breaks are trying to convince them to stop.
ngriffiths
In practice I think people often don't see the full structure of their own belief graph. Parts of it are clear but for 99% of important issues, it's more fuzzy than portrayed in the figures here. I still think this is an illuminating way of looking at it!
Another major factor is that while the graph may be fuzzy, the people we trust are clear. Only those people are allowed to "fill in" the missing pieces, and I think it takes a lot of work to do that, so it totally makes sense.
If the takeaway is "don't expect conflicting facts to convince your audience" I agree with that, but the reason is they don't trust you, not the conflicting graphs, and the trust is not really a consequence of the graph structure.
(Also, I was writing about similar stuff recently here: https://blog.griffens.net/blog/no-one-reads-page-28/)
fvdessen
In 'Thus Spoke Zarathustra' the argument is made that the most important cultural changes happen outside the debate, where new structures of thought are built without being noticed, since without a competing thought structure we are unable even to perceive the new structure. It is the dissonances and the debates that let us introspect our own ideas. Without the dissonance we do not notice new ideas taking hold of us and changing us, and it is only unnoticed that truly radical changes can take place.
PaulHoule
Feelings aren't facts but they are important for persuasion. The methods most able to create radical change are the gentlest
https://en.wikipedia.org/wiki/Rogerian_argument
I disagree with Rapoport's taxonomy, not least because "Chinese brainwashing" in the Korean War was not Pavlovian and was rather closer to the T-group method developed in Bethel, ME.
dgb23
Interesting. I often hear/read the term "steelmanning" instead, as in the opposite of constructing a straw man argument.
PaulHoule
Reminds me of the time that I was in a class on geoengineering and was supposed to have a debate with another student about "Should we fund BECCS in Brazil or fund efforts to protect the rainforest?" (I was advocating for the first)
Two days before the debate I went to a talk about the rainforest with my debating partner. Then, the night before, his friend was stabbed at a bar trying to break up a fight, and he rode up to Syracuse in an ambulance, so it turned out I was well prepared to give his presentation.
[1] https://en.wikipedia.org/wiki/Bioenergy_with_carbon_capture_... the economics of which are particularly good for Brazilian ethanol plants
This is a good blog post. Two thoughts about it:
- Contradictory facts often shouldn't change beliefs because it is extremely rare for a single fact in isolation to undermine a belief. If you believe in climate change and encounter a situation where a group of scientists were proven to have falsified data in a paper on climate change, it really isn't enough information to change your belief in climate change, because the evidence of climate change is much larger than any single paper. It's only really after reviewing a lot of facts on both sides of an issue that you can really know enough to change your belief about something.
- The facts we're exposed to today are often extremely unrepresentative of the larger body of relevant facts. Say what you want about the previous era of corporate-controlled news media; at least the journalists in that era tried to present the relevant facts to the viewer. The facts you are exposed to today are usually decided by an algorithm that is trying to optimize for engagement. And the people creating the content ("facts") that you see are usually extremely motivated/biased participants. There is zero effort by the algorithms or the content creators to present a reasonably representative set of facts on both sides of an issue.