Manufactured consensus on x.com

227 comments · April 24, 2025

casenmgreen

It looks like Twitter is suppressing posts until they are spammed by hate bots and then making those posts visible.

https://bsky.app/profile/willhaycardiff.bsky.social/post/3lk...

I've also seen evidence of posts Twitter likes (violent and hateful anti-immigration posts - literally a photo of a dummy tied to a chair being shot in the back of the head) being spammed by love bots.

Twitter seems to be a propaganda channel, run by Donald/Elon/et al.

Tireings

It's been a propaganda platform since Musk bought it.

I've been saying this for ages, and I was never joking.

Plenty of real things have happened, like blocking certain people, stopping fact checking, ending bot protection and detection, etc.

There is a reason Twitter needed more people before.

fourseventy

[flagged]

Llamamoe

Yeah I mean, if the only reason you said that was to "piss off the libs" and hate on transgender men, you really shouldn't be shocked that a platform doesn't want that. Being unable to say hateful things without consequence on social media isn't the same as it being a propaganda machine.

viraptor

I doubt you have any proof of that happening...

gruez

>It looks like Twitter is suppressing posts until they are spammed by hate bots and then making those posts visible.

>https://bsky.app/profile/willhaycardiff.bsky.social/post/3lk...

This could also very well be explained by a ranking algorithm that optimizes for "engagement". Getting spammed by hate bots = "engagement". This would be perfectly consistent with what the guy is experiencing, minus the accusation that the platform is suppressing pro-Ukraine posts, which is totally unsubstantiated.
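
As a rough illustration of why that explanation is plausible, here is a minimal sketch of an engagement-optimizing ranker; the Post fields, weights, and decay are hypothetical, not X's actual code:

    from dataclasses import dataclass

    @dataclass
    class Post:
        likes: int
        reposts: int
        replies: int      # bot replies count exactly the same as human ones
        age_hours: float

    def engagement_score(post: Post) -> float:
        """Naive 'optimize for engagement' ranking: more interactions means a
        higher score, decayed by age. A swarm of hate-bot replies inflates the
        score just as effectively as genuine discussion would."""
        interactions = post.likes + 2 * post.reposts + 3 * post.replies
        return interactions / (1 + post.age_hours) ** 1.5

    # A post that sat unseen for hours, then got swarmed by reply bots,
    # suddenly outranks an older post with steady organic engagement.
    organic = Post(likes=40, reposts=5, replies=10, age_hours=6)
    bot_swarmed = Post(likes=15, reposts=2, replies=300, age_hours=1)
    print(engagement_score(organic), engagement_score(bot_swarmed))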

casenmgreen

As I understand the timing, the post was suppressed until the hate bots spammed it.

Given the post was suppressed, how did the hate bots know about it to spam it?

It seems to me Twitter suppressed the post until they had time to spam it with hate posts.

Bear in mind here also this suppression did not happen for other posts - only for the pro-Ukraine post - so Twitter at the least is specifically suppressing pro-Ukraine posts.

gruez

>It seems to me Twitter suppressed the post until they had time to spam it with hate posts.

Twitter can't spam hate posts in real time? They're literally an AI company.

>Bear in mind here also this suppression did not happen for other posts - only for the pro-Ukraine post - so Twitter at the least is specifically suppressing pro-Ukraine posts.

There's scant evidence of this, besides the vague assertion that "I have 5k more followers there and regularly have posts which are viewed over 100k times". If he normally posts informative and substantive content that gets good engagement, is it really surprising that a generic "I support Zelensky" post would get poor engagement?

OmarShehata

Important notes from the essay; this is not unique to Twitter:

> And if you think this only happens on one social network, you’re already caught in the wrong attention loop.

> The most effective influence doesn’t announce itself. It doesn’t censor loudly, or boost aggressively. It shapes perception quietly — one algorithmic nudge at a time.

readhistory

FWIW, I don't think the person you are responding to said it only happens on Twitter. Just that it happens on Twitter.

awkwabear

This definitely happens on other platforms as well, but there is a key difference: Twitter is now privately owned by a single person who has shown themselves to be insecure and prone to lashing out publicly at critics.

I think Twitter is unique in how concentrated its influence is in its owner and in its willingness to do things so blatantly; other platforms need to at least pretend not to steer things so directly, so as not to upset shareholders.

numpad0

Yeah, artificial delays in content delivery are silently spreading. It's not just Twitter.

Edit: is this why 4chan was hit with the disruption - because there's no room for this delay mechanism?

kmeisthax

No, 4chan was hacked because Hiroyuki has terrible security hygiene and didn't update shit. Same thing happened to 2channel a decade prior.

hn1986

Billionaire buys social network for instant cultural and political influence, including amplifying his own posts. Yet there's hardly any alarm from the tech or mainstream media.

zoogeny

It is a bit chilling because of the compound interest that this kind of policy incentivizes. Once you have a handful of powerful X accounts, you have the ability to generate more. So not only can you work to silence others, you can work to increase your capacity to silence others by promoting like-minded allies.

We are at the early stages of this, so we are watching the capture of influence. There is some discussion that influence is the new capital. And we are replicating the systems that allow for the accumulation of capital in this new digital age.

jandrese

It's hard to see how this wasn't by design. Elon loudly released the source code to the algorithm so SEO engineers could optimize their systems to have total control over the narrative. Sure "anybody can read it", but realistically only propagandists are going to go to the trouble and then have the time and resources to act on it.

He basically handed the site over to the IRA and told them to go nuts.

tclancy

The ‘ra? Did I miss a step here?

razster

That dynamic of influence compounding certainly echoes the historical patterns we’ve seen with capital—those who have it can shape systems to acquire more. But it’s worth remembering that this only holds power if we choose to participate.

Personally, I’ve stepped away from anything associated with X.com or Elon Musk. I deleted my accounts, disconnected from the ecosystem entirely—and life is better for it. No doomscrolling, no algorithmic nudging, no subtle behavioral conditioning. Influence may be the new capital, but opting out is still a form of agency. Disengagement can be a powerful act.

We often forget: participation isn’t mandatory.

stevenAthompson

I was going to buy a Tesla. My brother had one and I coveted it. They make neat stuff.

Then Elon started taking testosterone (or whatever it was that jacked up his aggression), using psychedelics, and became incapable of keeping his mouth shut. To compound it he then got involved in politics.

Now I will never buy a Tesla, Starlink, or anything else he's involved in, because his behavior represents a real risk that any of those companies might cease to exist if Elon gets high and does something stupid, and then I'll be stuck without support.

Similarly, a social media account is an investment. I would never invest my time into building relationships on a platform like X. Even if it does survive Musk, the community is broken permanently.

jandrese

Many years ago I was really rooting for Tesla and Elon as they dragged the auto industry kicking and screaming towards electrification, and for how they focused on the underserved whole-home battery market. He even kept his manufacturing domestic, unlike most other big companies.

Some cracks started to form in this when he made a reckless Wall Street bet that he could make a million cars in a year or something and had his employees working double shifts in tents to get it done. In the end he won the bet and got an enormous payout. I remember calculating that if he divided the award in half and split that half evenly among every single Tesla employee, it would amount to about $40,000 per person, a life-changing amount of money for most people. Instead he kept it all for himself and gave a press conference about how big of a genius he is.

But the turning point was when there was a kid trapped in a cave, and he received some mild criticism over his ill-conceived rescue solution, and the result was to baselessly claim that the critic was a pedophile.

He's exactly the kind of guy who looks like a god when you only measure things in dollars. He takes big risks and they've paid off more often than not, but he's not someone anybody should really look up to.

PaulKeeble

Based on some of the videos of him, it looks like it's ketamine.

zoogeny

I think we should be careful of too much cynicism (although too little is bad as well). There is the old Aesop tale of the fox and the grapes. Unable to reach the grapes, the fox sulks away saying "they were probably sour".

There is a lot to gain for the powerful if they can convince those they wish to hold power over that the "grapes are sour", so to speak. That leaves fewer people fighting for the few grapes available, as we stretch this analogy to its breaking point.

No man is an island, and all that. If the holders of influence decide to start a war, you are in it whether you like it or not.

grey-area

There's no "probably" here, and it is healthy to avoid social media platforms run by people who perform Nazi salutes in public and attempt to destroy democracy.

archagon

Yes, but eventually normal people will just end up leaving.

Jordan-117

It reminds me of Voat.co, a social news aggregator that promoted itself as a free-speech haven in an attempt to pick up disaffected Redditors during a series of moderation crackdowns circa 2015. It was initially pretty normal:

https://web.archive.org/web/20150501033432/https://voat.co/

But then they instituted karma-based throttling on participation:

https://web.archive.org/web/20170520210511/https://voat.co/v...

That, plus the influx of racists and misogynists chased off of Reddit, led to a snowball effect where the bigots upvoted themselves into power-user status and censored anyone who stood against them, which discouraged normies from sticking around, which further entrenched the bigotry. Within a few years, virtually every single new post on the site was radically right-wing, blatantly racist/sexist/antisemitic neo-Nazi shit:

https://web.archive.org/web/20200610022710/https://voat.co/

The site shut down by the end of 2020 from lack of funding.

You can see basically the same thing happening on Xitter, it's just slower because the starting userbase was so much larger, and Elon (for now) can continue to bankroll it.

ceejayoz

AKA the “Nazi bar” problem.

https://en.wiktionary.org/wiki/Nazi_bar

kmeisthax

One problem I have with the Nazi Bar framing - or perhaps, how people read it - is that it assumes the behavior is accidental. That is, that the sites that have become Nazi bars did so purely out of a misguided sense of free speech absolutism that has been abused.

In practice, most Nazi bars are run by people actively choosing to kick people out: just the ones wearing the trans pride buttons instead of the ones wearing iron crosses. The kinds of online spaces run by free speech nutters or moderators asleep at the wheel tend to devolve into calling everything cringe, including the Nazis. Actually, Nazis are a particularly easy target for trolling and harassment, both because it is never unethical to laugh at Nazis and because critique makes them jump off the deep end.

During the Jack Dorsey era, Twitter was a dive bar. Problematic users rarely got removed from the platform, neither left nor right[0]. If people did get banned, it was for egregious offenses even Twitter management couldn't excuse. When Musk bought it, he changed it into a Nazi bar, making sure he and his favored far-right commentators got all the algorithmic boosts while left-wingers got shadowbanned.

Same with all the right-wing communities that forked out of Reddit. /r/The_Donald, Voat, etc. I bet you $10 they all had active policies to ban or bury left-wing content while actively screaming their heads off about "freedom of speech".

And there's a parallel with the actual rise of Hitler as well. I think a lot of Americans have this incorrect picture of a stupendously angry and racist German public, all voting in a landslide for the state-sponsored murder of six million Jews. The reality is that the people who owned the bar - both in Germany and abroad - were rallying behind Hitler since day one, in ways that persisted even beyond the fall of the Nazi state. They're the bits of the deep state[1] that ensured Hitler's insurrection against the Weimar Republic was given a light sentence and that Americans were kept in the dark about the nature of the Holocaust until it was undeniable. Nobody ever actually voted Hitler into office. He took advantage of a technicality and a frightened owner class to seize power for himself.

Yes, it is true that Nazis are malware[2]. Yes, Nazis can independently worm their way into a system and ruin it. However, more often than not, the people who own the Nazi bar don't merely tolerate Nazis, they accept and embrace them.

[0] Before you mention Donald Trump's ban in 2021, keep in mind Twitter had made a policy specifically to justify keeping Donald Trump on platform even when he was breaking rules.

[1] Informal ruling hierarchy parallel to the formalized one we vote for. This term usually also alleges that the informal hierarchy has subverted the formal one; but I'd argue that's almost never necessary for a deep state to exist. All states start deep, formal hierarchy is a transparency mechanism to make it shallow.

See also https://xkcd.com/898/

[2] Fun fact: if you fine-tune an AI to write malicious code unprompted, it becomes a Nazi. See https://www.emergent-misalignment.com/

mrdoops

Manufactured consensus is everywhere there is enough attention to incentivize such an effort. The worst by far is Reddit.

jampa

I've been using Reddit for 12 years. After the API fiasco, the quality dropped a lot. Most popular subreddits are now astroturfed, and every week there is a crusade against something (first it was banning Twitter links, now it is banning AI art).

Even in regular posts, Reddit has been a hive mind lately. If you scroll through the comments, most of them will have the same opinion over and over, with comments that add nothing to the discussion, like "I agree," getting hundreds of upvotes.

Aurornis

> I've been using Reddit for 12 years. After the API fiasco, the quality dropped a lot. Most popular subreddits are now astroturfed, and every week there is a crusade against something (first it was banning Twitter links, now it is banning AI art).

This didn’t start with the API change drama. The API change protests were their own crusade. The calls to ban Twitter links or AI art are just the next iterations of the same form of protest.

Many of the big subs were thoroughly astroturfed long before the API changes. The famous ones like /r/conservative weren't even trying to hide the fact that they curated every member and post.

gruez

>This didn’t start with the API change drama

The proximate cause IMO is that the protests (i.e. moderators shutting down their subreddits) resulted in some moderators being deposed, and new subreddits and moderators came into power which were easier to astroturf or whatever.

kridsdale3

I've been there for 17 (!) years, and I could have written pretty much the same message as you since around 2012. Dennis Kucinich was a huge campaign!

But I agree, since the API thing, it has sucked HARD.

lynndotpy

I agree, the API change was the last nail in the coffin, honestly. Reddit was always bad for several reasons, but it always had some availability of smart people that placed it alongside StackExchange and Hacker News. But 2022 and 2023 really saw a mass exodus of expertise from Reddit (and Twitter, etc.)

Lots of smart people left for Mastodon, at least.

polynomial

Missing the Kucinich connection here, what's the lore?

DustinBrett

Happy to see posts like this; I have the same experience. It fell apart a few years ago with the fiascos, and it's now a shell of what it was. Total echo chamber. Sadly it seems to be spreading to HN in some comment sections. And X has its problems in the other direction. There aren't many places left like it used to be, when up and down votes meant something.

chneu

Lol dude reddit has been heavily manipulated since like 2013, if not earlier.

I was heavily involved in buying/selling spam accounts for years on reddit. If you think it wasn't heavily manipulated, at least the frontpage, then lol you were buying it like everyone else.

andrepd

> Reddit has been a hive mind lately. If you scroll through the comments, most of them will have the same opinion over and over, with comments that add nothing to the discussion, like "I agree," getting hundreds of upvotes.

That has been the case for over 10 years now. It's absolutely not a new phenomenon.

DustinBrett

It got much worse a few years ago. I am a daily Reddit user and it was a big difference.

CSMastermind

I feel so old that I remember the post-2016 election, when Reddit started down this path. It's been particularly bad in the last few years, but I agree. Ever since the_donald and the admins' reactions to it, it's been bad.

null

[deleted]

unethical_ban

The API shutdown allowed a flood of bots, and it crippled third-party apps and the moderator tools that kept things clean.

But I don't think the "crusades" are always bot related. Movements get momentum.

gruez

>The API shutdown allowed a flood of bots, and it crippled third-party apps and the moderator tools that kept things clean.

I thought they backed down on the API changes for moderators?

os2warpman

> now it is banning AI art

AI art does not exist. There is only slop stolen from artists.

fkyoureadthedoc

Gatekeeping the definition of art probably doesn't help your cause. Even if you convince everyone to say, I don't know, "algorithmically generated images" instead, have you really improved the situation from the artists' perspective?

guywithahat

For all the negative things one can say about X, their fact checking (Community Notes) has actually gotten pretty good, which is something Reddit has yet to implement. Pew has also been ranking them as more politically centered than most social media sites, although I suppose that's subjective.

jandrese

I like the community notes as a concept, but they're often a day late and a dollar short. By the time the community note appears the post has been squeezed of all of its juice and was already on the way out. It's better than nothing, but the entire mechanism runs slower than the speed of propaganda.

guywithahat

While I agree, they still notify everyone who liked it of the community note, so there is some feedback after the fact.

PaulKeeble

They also don't seem to last. I don't know quite how it happens, but you see a lot of these community notes disappear 24 hours after they appeared. They act on the tail end of the post's exposure and are then removed for the long term, for when the news comes along and uses the post as a reference. But all the people who spotted the misinformation see the post with the community note, and so everyone walks away "happy".

_DeadFred_

This. It's technically a solution but not a solution at all. It's like giving a calorie count AFTER someone's eaten a meal (or in this case after the tweet has been viewed by the majority of people that are going to view it).

jimbob45

Reddit has stickied posts at the top of each thread. Well-moderated subreddits use them to great effect. Badly moderated subreddits just shadowban everything that doesn’t match with the mods’ politics.

tough

TikTok recently added Footnotes.

kridsdale3

And Mark pushed it through for FB and IG, at the same time he wound down the Fact Check system (which only hit like 0.0001% of contentious posts). Liberals reacted very negatively to this change.

ty6853

The most glaring example of this was how Reddit did a total 180 before/after the election. Before the election, questioning putting a candidate in without a primary was sacrilege. Afterwards it was a popularly supported reason for the loss. It was like watching an inflated balloon of propaganda deflate.

meroes

After the election, the amount of [Removed by Reddit] went from very little, to EVERYWHERE.

That's what did it for me, zero Reddit unless I can't find the information anywhere else, and even then it's for viewing a single post and then I'm gone.

null

[deleted]

cmdli

In the few days following the election, there was a flood of conservative posters all over the place. After about a week, they all disappeared and Reddit returned to its usual politics. I think the difference you are seeing is an atypical amount of conservatism, not the other way around. Most people who voted for Harris still do not think that the lack of a primary was the issue.

ty6853

Probably not, but as someone who didn't vote for either major party and isn't a conservative, it was glaringly obvious that ramming through someone who totally dive-bombed the prior primary skipped the sanity check that a primary is supposed to provide.

like_any_other

'Disappeared' of course means that they were banned.

ethagknight

I've noticed very clearly a material change even on this site, where a comment with a conservative viewpoint would get downvoted into oblivion, and now I seem to see far more diversified opinions. Which is great; I want that.

Klonoar

This slightly speaks to what subreddits a person reads, because I can tell you I had the exact opposite experience. People seemed still very pissed off about it.

alabastervlog

That's bizarre. Putting her at the top of the ticket was very clearly the better of two bad options (it was too late for the better options, by the time the call was made).

There exist people who think Biden had a better shot and replacing him with Harris was a mistake? Did they not look at his approval ratings earlier that year, then look up what that's historically meant for presidential re-elections? Dude was gonna lose, and by the time of the replacement he was likely gonna get crushed. The replacement probably helped down-ballot races, given how badly Biden was likely to perform, so it was a good idea even though she lost.

Like, yes, it was per se bad but people blaming that for the defeat is... confusing to me.

fkyoureadthedoc

No I don't think people are saying Biden was the better option. At first, as I recall, people were fairly outraged that they were left with two bad options.

The general tone very quickly shifted to Kamala's brat summer, Kamala is bae type shit.

Even after the fact nobody was questioning Kamala's qualifications. Why, at the 11th hour, were we left with demented grandpa and someone that couldn't win a primary the first time? Whose fault was this? What consequence did they suffer?

The dialogue was mostly around trying to figure out who to blame for people not voting for Kamala. Men? Black dudes? Mexicans? Misogynists? Anyone but whoever was actually responsible for the situation? Idk what it's like now though, I haven't used Reddit in months.

dmonitor

That's just hindsight being 20/20.

raffraffraff

Was just gonna say this. Reddit is dreadful. Anything remotely contentious has a single narrative, and if people try to present any alternative perspective, comments get locked. Disagreement = "hate".

viccis

Reddit is SO MUCH WORSE than most people understand. Ignoring for a moment that people's front-page "Best" sort has used engagement metrics rather than upvotes/downvotes since 2021, the moderators there have an iron grip on what is allowed.

r/redditminusmods used to track this. Every 12 hours they'd take a snapshot of the top 50 posts and then check ones from the previous 12 hour snapshot to see what percentage had been deleted. When it started, it was averaging 20% or so. By the end, it was at 50/50 or 49/50 deleted almost every single 12 hour period.
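
The bookkeeping behind that is simple to reproduce. Here is a rough sketch with the Reddit fetching stubbed out; the function names and toy data are illustrative, not the subreddit's actual code:

    from typing import Callable, Set

    def deletion_rate(previous_top_ids: Set[str],
                      is_removed: Callable[[str], bool]) -> float:
        """Fraction of the previous 12-hour snapshot's top posts that have
        since been removed. `is_removed` would normally query the Reddit API
        for each post ID; it is passed in here so the logic stays runnable."""
        if not previous_top_ids:
            return 0.0
        removed = sum(1 for post_id in previous_top_ids if is_removed(post_id))
        return removed / len(previous_top_ids)

    # Toy stand-in for two snapshots taken 12 hours apart: 25 of the earlier
    # top 50 posts no longer exist, i.e. the 50% deletion rate described above.
    earlier_top_50 = {f"post_{i}" for i in range(50)}
    still_alive = {f"post_{i}" for i in range(25)}
    print(deletion_rate(earlier_top_50, lambda pid: pid not in still_alive))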

Of course, Reddit couldn't allow this level of scrutiny, so they banned that subreddit for unstated reasons, and now the only good Google result for it actually leads back here. See for yourself how bad it was: https://news.ycombinator.com/item?id=36040282

That only goes up to two years ago. It feels like it's gotten even worse since then. And that's not even going into some subreddits (worldnews, politics, etc.) creating the illusion of consensus by banning anyone with an opinion outside of a narrow range of allowed ones.

jandrese

> r/redditminusmods used to track this. Every 12 hours they'd take a snapshot of the top 50 posts and then check ones from the previous 12 hour snapshot to see what percentage had been deleted. When it started, it was averaging 20% or so. By the end, it was at 50/50 or 49/50 deleted almost every single 12 hour period.

Is this "mods run amok", or is it the bots gaming the algorithm more effectively, so that they now account for nearly half of all new popular content?

In general my advice to anyone considering Reddit is to start with the default list of subreddits that you get when not logged in. Delete all of those from your list, and track down the small subreddits that interest you. The defaults are all owned by professional influence peddlers now, and what little actual content seeps through is not worth the effort to filter out.

viccis

In the past I would spot check them, and there were plenty of submissions that were neither bot-submitted nor obviously rule-breaking that were deleted. My best guess was that mods of sufficiently large subreddits just like to shape the content that's shown. In most places there seems to be neither the power-user nepotism of late-era Digg nor the East Germany-level narrative censorship of subs like worldnews. Rather, it just seems like a ton of cooks in the kitchen (huge modlists), with some of the mods seeming to take action for action's sake. Either way, the point is that users aren't really dictating the content.

Don't even get me started about local city subreddit busybody moderators with their online fiefdoms and their "Daily Discussion" post graveyards.

omneity

This would be such an interesting experiment to perform on other social platforms as well alongside some rough semantic analysis to understand which topics are being silenced.

I already have quite a lot of the data pipeline set up for this, so if anyone wants to collab, hit me up!

viccis

>alongside some rough semantic analysis to understand which topics are being silenced

You'd have to find somewhere on reddit that wasn't 100% deleted haha

richwater

> The worst by far is Reddit

The website is truly unusable unless you directly go to small niche subreddits and even then you roll the dice with unpaid mods with a power complex.

adeeds

The smaller niche subreddits dedicated to a hobby or type of product are actually some of the worst for astroturfing from what I've seen. It only takes a few shills to start building consensus.

There's a really interesting pattern where you'll see one person start a thread asking "Hey, any recs for durable travel pants?" Then a second comment chimes in: "No specific brands, just make sure you get ones that have qualities x, y, and z." Then a third user says "Oh, my Ketl Mountain™ travel pants have those exact traits!" Taken on their own, the threads look fairly legit and get a lot of engagement and upvotes from organic users (maybe after some bot upvotes to prime the pump).

Then if you dump the comments of those users and track what subreddits they've interacted on, they've had convos following the same patterns about boots in BuyItForLife, Bidets in r/Bidets, GaN USB chargers in USBCHardware, face wash in r/30PlusSkincare, headphones, etc. You can build a whole graph of shilling accounts pushing a handful of products.
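
A toy sketch of the kind of graph-building described here; the usernames and brand names are invented, and the subreddits are just two of the ones mentioned above:

    from collections import defaultdict
    from itertools import combinations

    # Toy comment dump: (username, subreddit, brand mentioned). In practice
    # this would be scraped from each suspect account's comment history.
    comments = [
        ("user_a", "BuyItForLife", "AcmeBoots"),
        ("user_b", "BuyItForLife", "AcmeBoots"),
        ("user_a", "USBCHardware", "VoltCharge"),
        ("user_b", "USBCHardware", "VoltCharge"),
        ("user_c", "USBCHardware", "VoltCharge"),
    ]

    # Group accounts by the (subreddit, brand) context they pushed, then add
    # an edge between every pair of accounts that co-promoted the same brand.
    by_context: dict[tuple[str, str], set[str]] = defaultdict(set)
    for user, sub, brand in comments:
        by_context[(sub, brand)].add(user)

    edge_weights: dict[tuple[str, str], int] = defaultdict(int)
    for (_sub, _brand), users in by_context.items():
        for a, b in combinations(sorted(users), 2):
            edge_weights[(a, b)] += 1

    # Pairs that repeatedly co-promote the same products across unrelated
    # subreddits are strong candidates for a coordinated shill network.
    for pair, weight in sorted(edge_weights.items(), key=lambda kv: -kv[1]):
        print(pair, weight)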

tengbretson

The worst part is that in a lot of niche communities knowing the "best" brand for a given activity then becomes a shibboleth, so it really only takes a few strategic instances of planting these seed crystals for the group opinion to be completely captured, and reinforced with minimal intervention.

thmsths

How is that not treated as fraud? As you pointed out, with a little bit of detective work (which is well beyond the means and motivation of a casual internet user, but well within reach of a consumer protection agency) it's fairly easy to expose these manipulative tactics. Commercial communication ought to be clearly labelled as such.

baq

This was as true when I joined ~15 years ago as it was on the day they made me quit cold turkey by taking the API away.

RankingMember

It's great for web searches for answers to very specific questions. "search term" + "reddit" typically gives me a good starting point if not the answer itself to the odd question I have.

raffraffraff

I detest having to keep an account, but unfortunately there are a bunch of different products that use it as a semi-official support forum.

raverbashing

And use old.reddit.com, the only interface not designed with the TikTok brain in mind.

(and the mobile app is just atrocious, RIF was way better in usability, etc)

Klonoar

It’s clear they know outright removal of it would kill off a portion of the user base, so they’ll just kill it piecemeal - starting with the old messages system and forcing people into the newer chat system.

austin-cheney

I deleted my Reddit account years ago because of echo chamber effect and other people intentionally using that to direct opinion. In all fairness though there is an inherent narcissistic incentive to influence popular opinion irrespective of evidence or consequences. This will continue to be true so long as people rely upon social acceptance as a form of currency.

ethagknight

Manufactured consensus is literally the name of the game for the big news networks. News is/was paid vast sums by the government to tell a certain story. That is manufactured consensus. Some countries do a better job making the news seem like a separate arm from the government. The entire point is to direct the populace. That is not the core focus of X, even though it is entirely susceptible to it, and it will be encountered on any such platform. Yes, Reddit is horrible, but I would say Wikipedia is even more dangerous because it presents itself as basic fact. With Reddit, at least you know it's some obscene username giving geopolitical strategy rants.

Important to note: I first saw this specific chart and claim of Musk's heavy-handed influence via X. Also, I see plenty of dissenting opinions on X (in a general sense, on Trump, tariffs, Musk, DOGE, etc.). Alternative views definitely have reach.

Also important to note: my posts, where I am very knowledgeable in my domain and will spend an unreasonable amount of time authoring posts to make various points, garner mere double-digit views, so when someone cries about no longer having millions of views for their uneducated hot takes... spare the tears.

seadan83

Outside of PBS, do you have evidence for this claim: "News is/was paid vast sums by the government to tell a certain story"?

> Alternative views definitely have reach.

Yes, but are we in a 1984 situation where that reach is managed behind the scenes? Reach, but perhaps not too much reach. With respect to the chart, how do we know that Twitter users are not largely partitioned? How representative is the fact that you saw something compared to other "communities" on X?

All the while, even if you saw a 'dissenting' chart, the fact that the chart exists is direct evidence of the power of a subtle shadow-ban effect. It's not about tears and whining; it's that a single act by 'powerful' accounts can control who gets visibility and who does not. The point is that it is not you, the community, that controls what is popular, but the powerful accounts that do. That is the issue.

alabastervlog

> Outside of PBS, do you have evidence for this claim: "News is/was paid vast sums by the government to tell a certain story"?

Yeah, they wouldn't have to rely so much on Madison Avenue if they were just paying the news agencies to report whatever they want.

Incidentally, I'm not sure I'd characterize even PBS' government funding as "vast sums", either absolutely or relatively (to the rest of their funding).

ethagknight

I get and agree that 'super accounts' like Musk or Taylor Swift or Barack Obama can have an outsized impact that is too powerful.

I strongly argue that TODAY there is far more diversity of thought being communicated on various media than in 2024. I disagree on being "in a 1984 situation"; the whole "Biden is sharp as a tack" -> replaced without a primary -> "Campaign of Joy" arc is as 1984 as you can get. There is very clear evidence of syndication occurring across various news outlets, and those syndicated stories don't happen for free. The hard evidence you request is thoroughly concealed and hard to follow, as it gets washed through nonprofits and NGOs. USASpending shows $2mm direct in 2024 to NYT as an example, but it's no stretch to assert indirect sources as well.

the_optimist

Reddit mgmt itself has significant concerns, according to anonymous sources. You heard it here first.

DFHippie

> Outside of PBS

How much influence do you imagine PBS wields and how much money do you suppose is in these vast sums they are paid?

PBS is mostly known for Sesame Street and nature documentaries. Their government funding has been whittled down to almost nothing over years of relentless attack from the Republicans.

Here's some discussion from PBS itself on the topic:

https://www.pbs.org/newshour/show/a-look-at-the-history-of-p...

A pull quote:

"The U.S. is almost literally off the chart for how little we allocate towards our public media. At the federal level, it comes out to a little over $1.50 per person per year. Compare that to the Brits, who spend roughly $100 per person per year for the BBC. Northern European countries spend well over $100 per person per year."

jandrese

> News is/was paid vast sums by the government to tell a certain story.

In the US it is not the government paying these sums, it is the billionaires who bought the media outlets. When you look for editorial bias in the US it's not pro-government, it's pro-wealth. Or more specifically pro-wealthy people.

> I would say Wikipedia is even more dangerous because it presents as basic facts.

Can you give some examples of political bias in Wikipedia articles?

smallmancontrov

Musk didn't just put a thumb on the scale in favor of far-right content, he sat his entire pre-ozempic ass on the scale.

MaxPock

X is once again full of bots selling crypto and financial services.

sojsurf

I went back recently. Maybe I'm in the wrong circles, but I'm seeing neither of these.

I _am_ still seeing lots of recycled content looking for clicks.

stetrain

It never really stopped. All of Elon's crying about bots stopped as soon as he took ownership.

2OEH8eoCRo0

He fixed the bot problem, the problem of Twitter banning Elon's bots.

arrowsmith

Hey, that's not fair! It's also full of porn bots and Holocaust denial.

jjeaff

Don't forget the graphic fight videos with comments full of racial undertones. I have literally never engaged with or watched more than a few seconds of those types of videos, yet my feed is full of them.

2OEH8eoCRo0

It's to radicalize people into becoming racists.

vvpan

Ah yes the "look at what brown people are doing to our cities" accounts... One of the main reasons I am not on there anymore.

dismalaf

Every platform has Holocaust denial because that's the one thing that the far-left and far-right both agree on...

RankingMember

The far-left denies the Holocaust? On what grounds?

josefritzishere

It's really degenerated into a trash heap. I quit years ago.

mmastrac

I pop in from time to time but I only ever see right-wing rage bait (??) and my old timeline is completely gone. I don't engage with any of it either, just scrolling until I finally catch a name I recognize and maybe dropping a like.

Shekelphile

And even on tweets that don't seem to be outwardly rage bait, the top replies will always be from blue-checked right-wing accounts, of course.

FredPret

There's also left-wing rage bait if you scroll down.

It's sad that these social media companies supplanted proper journalism, only to then rot into this.

What do we have now?

gruez

I'm surprised how many upvotes this got (40 points as of me writing this comment), given how little "meat" is actually in this article. The author presents a graph where views for a given user dropped precipitously after a "feud with musk". That's certainly suspicious, and was worth bringing up, but the rest of the blog is just pontificating about "social engineering" and "perception cascades", backed by absolutely nothing. Are people just upvoting based on title and maybe the first paragraph? This post could have been truncated to the graph and very little would be lost.

freehorse

Yeah, I also hoped that the article had some more backing for these arguments. The NYTimes article, which is cited and from which the first graph comes, is more interesting, as it also includes a couple more cases:

https://www.nytimes.com/interactive/2025/04/23/business/elon...

or from webarchive

https://web.archive.org/web/20250423093911/https://www.nytim...

janalsncm

EM directly manipulates the algorithm to suit his interests. Here’s one we know about: https://www.theguardian.com/technology/2023/feb/15/elon-musk...

ruleryak

This article does not offer any proof. It's hearsay, from the title saying he "reportedly" forced it, in turn citing a Platformer article that itself also provides no proof and instead accepts the stories from fired engineers as gospel. The Platformer article then goes on to say that views still fluctuate wildly and that this isn't in line with a supposed 1000x boost. The same Platformer article then says that they believe the supposed 1000x boost is no longer in effect, but they guess something else must be in place. The Guardian article doesn't bother to mention that part.

janalsncm

> This story is based on interviews with people familiar with the events involved and supported by documents obtained by Platformer.

I think you might want to check the article again. The interviews were not just based on fired engineers. EM did fire one engineer after he told Musk that interest in Musk was declining.

hashstring

Agree about meat, however, the article still made me think.

> What people see feels organic. In reality, they're engaging with what's already been filtered, ranked, and surfaced.

Naturally, I (and I think many humans have this too) often perceive the comments/content that I see as a backscatter of organic content reflecting some sort of consensus. Thousands of people replying the same thing surely gives me the impression of consensus. Obviously, this is not necessarily the truth (and it may be far from it). However, it remains interesting, because if more people perceive it as such, it may become consensus after all regardless.

Ultimately, I think it’s good to be reminded about the fact that it’s still algorithmic knobs at play here. Even if that’s something that is not news.

Fidelix

They are upvoting because they hate Elon Musk. It's not that deep.

627467

The article’s angst over X’s “manufactured consensus” is overblown. Influence has always been curated—editors, town criers, or whoever grabbed the mic were the analog algorithms. X’s sin isn’t some evil algo: it’s just running at planetary scale. We’ve ditched thousands of small communities for one global shouting match, so naturally mega-influencers steal the show. Algorithms are just the gears keeping this chaos moving because we crave instant, worldwide chatter. Some folks pretend a perfect algorithm exists (bsky, IG/fb) but it doesn’t come from one team, one database, or one set of criteria. The “perfect” system is a messy web of different algorithms in different spaces, barely connected, each with its own context. Calling out X’s code misses the mark. We signed up for this planetary circus and keep buying tickets.

throwaway7783

But there is no denying that there is a shift in narrative in X posts since its acquisition. So there is certainly more going on than just planetary scale. It was planetary scale before acquisition too. Algorithms have the power to nudge the narrative one way or another at planetary scale.

ljsprague

It was completely transparent and unbiased before its acquisition.

throwaway7783

I can't quite tell if this is sarcasm :). I'll assume it is.

watwut

It was a little right-wing biased. Now the bias has moved a lot further to the right.

EcommerceFlow

Yes, the natural order (that was mass censored for 10+ years) got uncensored. Look at who won the presidency.

throwaway7783

No way of telling one way or another, no? Unless you actually work at X and know exactly how it works.

Note: my original comment was not about whether now is worse than before or vice versa. It is just that narratives shifted in a different way, and that had nothing to do with scale. Algorithms are just as likely a cause as uncensoring (or censoring; how does one know?).

Swoerd

[dead]

w10-1

Is the title ironic? Is this helping?

"Manufacturing consent", the book by Chomsky and Herman, details techniques that are largely unused in this situation. Chomsky's book by disclosing the hidden editor works against the effect rather than for it.

Here it's closer to a state-run media outlet, with the exact ambiguity that implies: a known editor pretending to be objective, except here the editor only really cares about certain topics, and others are encouraged to roam freely (if traceably).

In Chomsky's case, the editor's power comes from being covert, but only if people are fooled, so the book works to diminish it. In this case, the power comes from the editor being known and unstoppable. You have to accept it and know yourself as accepting it, which means you have to buy in as a fan to avoid seeing yourself as herded, or out yourself as an outsider. Since most people take the default step of doing nothing, they get accumulating evidence of themselves as herded or as a fan. It's a forcing function (like "you're with us or against us") that conditions acceptance rather than manufacturing consent.

In this case, articles (showing what happens when you oppose the editor) and ensuing discussions like this (ending in no action) have the reinforcing effect of empowering the editor and increasing the self-censoring effects. They contribute to the aim and effect of conditioning acceptance, so they might not be helpful. (Better would be the abandonment of a platform when it fails to fulfill its fairness claims, but that's hard to engineer.)

qnleigh

Do we know that this is how the algorithm actually works? The article only shows one plot of one specific instance, and there could be more than one explanation for the sudden drop in viewership (especially given the involvement of Twitter's owner).

jsheard

> Do we know that this is how the algorithm actually works?

Funnily enough we should know that, since Elon promised to open source the algorithm in the name of transparency. But what actually happened is they dropped a one-time snapshot onto GitHub two years ago, never updated it, and never acknowledged it again. Many such cases.

jandrese

It was enough info that people who professionally post on X/Twitter can play the algorithm like a fiddle. They can get anything they want to the top, and often can even get Elon to re-tweet it.

hashstring

Yes, this 100%.

And never forget the isElon boolean var that would increase post visibility. lol, what a shame.

a2128

Below the chart there's a link to the NYTimes article it was sourced from, which has more plots of more instances of this.

janalsncm

Their “for you” feed is engagement bait. In other words, it appears to be running almost entirely on CTR. It seems to pull from a pool of posts that are engaged with by those you follow. Limit seems to be 24h.
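
For what it's worth, the behavior described here (a pool of posts from the last 24 hours that people you follow engaged with, sorted by predicted click-through rate) would look roughly like the sketch below. This is an outside guess at the shape of it, not X's actual code; all names and numbers are made up:

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    @dataclass
    class Candidate:
        post_id: str
        predicted_ctr: float       # output of some engagement model
        created_at: datetime
        engaged_by_followed: bool  # someone you follow liked/replied/reposted

    def for_you_feed(candidates: list[Candidate], now: datetime,
                     limit: int = 20) -> list[str]:
        """Keep posts from the last 24h that accounts you follow engaged with,
        then rank purely by predicted CTR. A guess at the described behavior,
        not the real algorithm."""
        cutoff = now - timedelta(hours=24)
        pool = [c for c in candidates
                if c.engaged_by_followed and c.created_at >= cutoff]
        pool.sort(key=lambda c: c.predicted_ctr, reverse=True)
        return [c.post_id for c in pool[:limit]]

    now = datetime.now(timezone.utc)
    demo = [
        Candidate("rage_bait", 0.31, now - timedelta(hours=2), True),
        Candidate("long_analysis", 0.04, now - timedelta(hours=3), True),
        Candidate("stale_post", 0.50, now - timedelta(hours=30), True),
    ]
    print(for_you_feed(demo, now))  # rage_bait ranks first; stale_post aged out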

It’s not a very sophisticated algorithm, likely because the best people aren’t super keen on working there for WLB reasons.

mcintyre1994

I wonder if the algorithm is affected by blocking the high profile account in these cases. Eg I blocked Musk ages ago because if you don’t then the algorithm just constantly pushes his ‘content’ at you. So does the algorithm still prioritise things he’s interacted with for me, or does it only do that for people who haven’t blocked him? I definitely get stuff recommended that I expect he might interact with, but I don’t know if that’s actually specifically why it’s being pushed at me.

numpad0

They've massively raised various penalization thresholds too, so that organically valued content cannot gain popularity. Unless you've blocked everything he sees, you're still affected.

fouc

Reading that I couldn't help but think there's parallels to HN. At least HN tries to be transparent about "the algorithm", and it's essentially a dumb algorithm compared to what X/FB/etc use.

joseda-hg

I think it's fundamentally different on HN: everyone sees the same posts/comments (barring settings/mod privileges), but on X everyone can get a slightly tailored feed.

Clubber

I'm sure there are coordinated efforts to up/downvote content on HN. The political articles are way up, and certain reasonable comments seem to be unreasonably downvoted. It seems like it's ramped up since the election. Either that or HN is becoming a natural echo chamber; I'm not quite sure.

If people don't already know, the internet is easily manipulated and people tend to get ideas and reassurances of ideas based on what their group's opinions are, and those opinions are manipulated. It's easy to create multiple accounts, easy to change IP addresses, easy to bot comments; anyone can do it and it's easy to automate.

The earliest example I can recall was manipulating the Amazon ratings system, now it's everywhere.

alabastervlog

"The Algorithm" on most social media platforms to which people apply that term is, crucially, personalized, and (usually) heavily driven by engagement metrics. That's what makes them dangerous and shitty, not just having a voting system or sorting by latest post ("technically an algorithm!" as some posters will helpfully point out in these kinds of discussions)

Philpax

Is it that transparent? I did not know about the vast majority of the items on this list until I encountered it: https://github.com/minimaxir/hacker-news-undocumented

null

[deleted]

_Algernon_

Certainly more transparent than some black box machine learning monstrosity.

2OEH8eoCRo0

I'd like HN to indicate somewhere on the page when mods override flags or make other moderation decisions.

They won't because it would reveal moderation biases and trends.

kuschkufan

On HN, does the algorithm boost/hide posts and comments based on the popularity of the user account that upvoted/downvoted? I thought all accounts had the same voting weight.

bikezen

You can't even downvote on HN until you pass some karma bar.

cadamsdotcom

In this modern era we heave under the weight of decades of exponential growth. In some cases it’s actually “only” compound growth - but the result has been the same (and very ironic): ossification through stratification.

There are behemoths living among us. There will soon be social media accounts with enough sway to manufacture truth.

What needs to be learned is society is like a national park. Left to its own devices it will end up trashed - people leave garbage, move in and use it for whatever they like. So, we fund a service that keeps parks maintained. We understand the benefits of the National Park Service because they are visible and we are visual creatures. But for some reason we have a more laissez-faire attitude to unchecked accumulation and its downstream effects.

It’s risky for power to be so concentrated. We’re forced to hope for benevolence and there’s no backup plan.

What more can be done to show the orders of magnitude of difference between the most and least powerful?

-__---____-ZXyw

Is there a curated and serious resource from any public body or private individual documenting specific cases of abuse by Twitter?

Any links greatly appreciated.