
YouTube says it'll bring back creators banned for Covid and election content

diego_sandoval

At the time, YouTube said: “Anything that would go against World Health Organization recommendations would be a violation of our policy” [1], which, in my opinion, is a pretty extreme stance to take, especially considering that the WHO contradicted itself many times during the pandemic.

[1] https://www.bbc.com/news/technology-52388586

danparsonson

> the WHO contradicted itself many times during the pandemic

Did they? I remember them revising their guidance, which seems like something one would expect during an emerging crisis, but I don't remember them directly contradicting themselves.

hyperhopper

The United States also said not to buy masks during the pandemic, claiming they were ineffective.

Placing absolute trust in these organizations and restricting freedom of speech based on that is a very bootlicking, anti-freedom stance.


sterlind

it was an extreme time, but yes, probably the most authoritarian action I've seen social media take.

misinformation is a real and worsening problem, but censorship makes conspiracies flourish, and establishes platforms as arbiters of truth. that "truth" will shift with the political tides.

IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons.

adiabatichottub

As I recall from my school days, in Social Studies class there was a set of "Critical Thinking" questions at the end of every chapter in the textbook. Never once were we assigned any of those questions.

cultofmetatron

I remember social studies class. We spent a day covering the Six-Day War without once mentioning the Nakba or the mass murder of Arabs preceding it. I don't think the teaching of critical thinking was an actual goal of theirs.

softwaredoug

I'm very pro-vaccines, I don't think the 2020 election was stolen. But I think we have to realize silencing people doesn't work. It just causes the ideas to metastasize. A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.

yongjik

I feel like we're living in different worlds, because from what I've seen, giving people platforms clearly doesn't work either. It just lets the most stupid and incendiary ideas spread unchecked.

If you allow crazy people to "let it ride" then they don't stop until... until... hell we're still in the middle of it and I don't even know when or if they will stop.

asadotzler

My refusing to distribute your work is not "silencing." Silencing would be me preventing you from distributing it.

Have we all lost the ability to reason? Seriously, this isn't hard. No one owes you distribution unless you have a contract saying otherwise.

ultrarunner

At some level these platforms are the public square and facilitate public discussion. In fact, Google has explicitly deprioritized public forum sites (e.g. phpBB) in preference to forums like YouTube. Surely there is a difference between declining to host and distribute adult material and enforcing a preferred viewpoint on a current topic.

Sure, Google doesn't need to host anything they don't want to; make it all Nazi apologia if they think it serves their shareholders. But doing so and silencing all other viewpoints in that particular medium is surely not a net benefit for society, independent of how it affects Google.

pfannkuchen

I think the feeling of silencing comes from it being a blacklist and not a whitelist.

If you take proposals from whoever and then only approve ones you specifically like, for whatever reason, then I don’t think anyone would feel silenced by that.

If you take anything from anyone, and a huge volume of it, on any topic and you don’t care what, except for a few politically controversial areas, that feels more like silencing. Especially when there is no alternative service available due to network effects and subsidies from arguably monopolistic practices.

sterlind

I'd certainly consider an ISP refusing to route my packets as silencing. is YouTube so different? legally, sure, but practically?

Jensson

> No one owes you distribution unless you have a contract saying otherwise.

Common carrier law says you have to for some things, so it makes sense to institute such a law for some parts of social media, as they are fundamental enough. It is insane that we give that much censorship power to private corporations. They shouldn't have the power to decide elections on a whim, etc.


timmg

It's interesting how much "they are a private company, they can do what they want" was the talking point around that time. And then Musk bought Twitter and people accuse him of using it to swing the election or whatever.

Even today, I was listening to NPR talk about the potential TikTok deal and the commenter was wringing their hands about having a "rich guy" like Larry Ellison control the content.

I don't know exactly what the right answer is. But given their reach -- and the fact that a lot of these companies are near monopolies -- I think we should at least do more than just shrug and say, "they can do what they want."

typeofhuman

Not OP, but we did learn the US federal government was instructing social media sites like Twitter to remove content it found displeasing. This is known as jawboning and is against the law.

SCOTUS, in Bantam Books, Inc. v. Sullivan, held that governments cannot coerce private entities into censoring speech they disfavor, even if they do not issue direct legal orders.

This was a publicly announced motivation for Elon Musk buying Twitter, and it's because of that purchase that we know the extent of this illegal behavior.

Mark Zuckerberg has also publicly stated Meta was asked to remove content by the US government.

justinhj

So you're saying that YouTube is a publisher and should not have section 230 protections? They can't have it both ways. Sure remove content that violates policies but YouTube has long set itself up as an opinion police force, choosing which ideas can be published and monetized and which cannot.

bee_rider

YouTube's business model probably wouldn't work if they were made responsible for all the content they broadcast. It would be really interesting to see a world where social media companies were treated as publishers.

Might be a boon for federated services—smaller servers, finer-grained units of responsibility…

sazylusan

Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know, the algorithm favors the dramatic, the controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.

cptnapalm

As I understand it, Twitter has something called Community Notes. So people can write things, but it can potentially have an attached refutation.

hn_throwaway_99

Glad to see this, was going to make a similar comment.

People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".

squigz

Online, sure. But online doesn't mean YouTube or Facebook.

sazylusan

Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.

The First Amendment was written in the 1700s...

electriclove

I agree and I'm pro vaccines, but want the choice on if/when to vaccinate my kids. I believe there were election discrepancies but am not sure if it was stolen. I felt the ZeroHedge article about the lab leak was a reasonable possibility. All these things were shut down by the powers that be (and this was not Trump's fault). The people shutting down discourse are the problem.

andy99

The more important point (and this is really like a high school civics debate) is that the government and/or a big tech company shouldn't decide what people are "allowed" to say. There's tons of dumb stuff online; the only thing dumber is the state dictating how I'm supposed to think. People seem to forget that sometimes someone they don't agree with is in power. What if they started banning Tylenol-autism-sceptical accounts?

mapontosevenths

> the government and/or a big tech company shouldn't decide what people are "allowed" to say.

That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.

Then again, Alphabet is now claiming they did want to host it and mean old Biden pressured them into pulling it, so if we buy that, maybe it doesn't matter.

> What if they started banning tylenol-autism sceptical accounts?

What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.

MostlyStable

It can simultaneously be legal/allowable for them to ban speech, and yet also the case that we should criticize them for doing so. The First Amendment only restricts the government, but a culture of free speech will also criticize private entities for taking censorious actions. And a culture of free speech is necessary to make sure that the First Amendment is not eventually eroded away to nothing.

briHass

The line should be what is illegal, which, at least in the US, is fairly permissive.

The legal process already did all the hard work of reaching consensus/compromise on where that line is, so just use that. At least with the legal system, there's some degree of visibility and influence possible by everyone. It's not some ethics department silently banning users they don't agree with.

mc32

The thing is that people will tell you it wasn't actually censorship, because for them it was only the government being a nosey busybody, telling the tech corps about a select number of people violating their terms (nudge nudge, please do something)… so I think the and/or is important.

JumpCrisscross

> the government and/or a big tech company shouldn't decide what people are "allowed" to say

This throws out spam and fraud filters, both of which are content-based moderation.

"Nobody moderates anything" unfortunately isn't a functional option. Particularly if the company has to sell ads.

asadotzler

No one in Big Tech decides what you are allowed to say; they can only withhold their distribution of what you say.

As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.

mitthrowaway2

No, they ban your account and exclude you from the market commons if they don't like what you say.

heavyset_go

This is just a reminder that we're both posting on one of the most heavily censored, big-tech-sponsored spaces on the internet, and arguably, that's what allows you to have your civics debate in earnest.

What you are arguing for is a dissolution of HN and sites like it.

mulmen

I have some ideas I want to post on your personal webpage but you have not given me access. Why are you censoring me?

zetazzed

Does Disney have a positive obligation to show animal cruelty snuff films on Disney Plus? Or are they allowed to control what people say on their network? Does Roblox have to allow XXX games showing non-consensual sex acts on their site, or are they allowed to control what people say on their network? Can WebMD decide not to present articles claiming that homeopathy is the ultimate cure-all? Does X have to share a "trending" topic about the refusal to release the Epstein files?

The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.

Aloha

I think it made sense as a tactical choice at the moment, just like censorship during wartime - I don't think it should go on forever, because doing so is incompatible with a free society.

llm_nerd

It didn't even make sense at the time. It put everything under a cloud, suggesting the official, accepted truth needed to suppress alternatives to win the battle of minds. It was disastrous, and it is astonishing seeing people (not you, but in these comments) still trying to paint it as a good choice.

It massively amplified the nuts. It brought them to the mainstream.

I'm a bit amazed seeing people still justifying it after all we've learned.

COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.

But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive from the moment we knew the vaccine had a negligible effect on spread. When platforms run by "good intentions" people started silencing the imbeciles, it handed them a megaphone and made the problem much worse.

And now we're living with the consequences, where we have a worm-addled halfwit directing medicine for his child-rapist pal.

api

Put simply, it seems we forgot the Streisand effect. If you try to suppress something, you amplify it.

I personally think deplatforming is a major reason Trump is president again. If he’s being banned he must be right.

It’s related to the power of martyrdom. The guy who shot Charlie Kirk made Kirk’s ideas and cause a hundred times more powerful.

To suppress something is to amplify it.

LeafItAlone

>It massively amplified the nuts. It brought it to the mainstream.

>COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

In theory, I agree, kind of.

But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be. Vaccine production and approval were well under way, brought to fruition in part due to the first Trump administration. The “nuts” had long been mainstream and amplified before this “silencing” began. Misinformation was rampant and people were spreading it at a quick speed. Most people I know who ultimately refused the vaccines made up their minds before Biden took office.

ioteg

[dead]

yojo

I think there's a difference between silencing people, and having an algorithm that railroads people down a polarization hole.

My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.

I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.

wvenable

> But I think we have to realize silencing people doesn't work.

Doesn't it though? I've seen this repeated like it's fact, but I don't think it's true. If you disallowed some randomly chosen conspiracy from YouTube and other mainstream platforms, I think it would stop being part of the larger public consciousness pretty quickly.

Many of these things arrived out of nothing and can disappear just as easily.

It's basic human nature that simply hearing things repeated over and over embeds them in your consciousness. If you're not careful and aware of what you're consuming, then that becomes a part of your world view. The most effective way to bring people back from conspiratorial thinking (like QAnon) is to unplug them from that source of information.

system7rocks

We live in a complicated world, and we do need the freedom to get things right and wrong. Never easy though in times of crisis.

Silver lining in this is that the conversation continued and will continue. I can see governments needing to try to get accurate and helpful information out in a crisis - and needing to pressure or ask more of private companies to do that. But I also like that we can reflect back and go: maybe that didn't work like we wanted, or maybe it was heavy-handed.

In many governments, the government can do no wrong. There are no checks and balances.

The question is - should we still trust YouTube/Google? Is YouTube really some kind of champion of free speech? No. Is our current White House administration a champion of free speech? Hardly.

But hopefully we will still have a system that can have room for critique in the years to come.

type0

> Is our current White House administration a champion of free speech? Hardly.

So after January 22, 2026, the US leaves the WHO and YouTube users will be able to contradict WHO recommendations.

electriclove

It is scary how close we were to not being able to continue the conversation.

cactusplant7374

> From President Biden on down, administration officials “created a political atmosphere that sought to influence the actions of platforms based on their concerns regarding misinformation,” Alphabet said, claiming it “has consistently fought against those efforts on First Amendment grounds.”

This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.

softwaredoug

It's in their interest now to throw Biden under the bus. There may be truth to this, but I'm sure it's exaggerated for effect.

HankStallone

It was. At the time, they felt like they were doing the right thing -- the heroic thing, even -- in keeping dangerous disinformation away from the public view. They weren't shy about their position that censorship in that case was good and necessary. Not the ones who said it on TV, and not the ones who said it to me across the dinner table.

For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.

dotnet00

To be fair, even if they were being honest about Biden twisting their arm (I don't buy it), the timing makes it impossible to believe their claim.

CSMastermind

Why wouldn't you buy it?

The Twitter files showed direct communications from the administration asking them to ban specific users like Alex Berenson, Dr. Martin Kulldorff, and Dr. Andrew Bostom: https://cbsaustin.com/news/nation-world/twitter-files-10th-i...

Meta submitted direct communications from the administration pressuring them to ban people as part of a congressional investigation: https://www.aljazeera.com/news/2024/8/27/did-bidens-white-ho...

It would be more surprising if they left Google alone.

lesuorac

2 years is a pretty long ban for conduct that isn't even illegal.

Although if they got banned at the start of Covid, during the Trump administration, then we're talking about 5 years.

asadotzler

No one owes them any distribution at all.

zug_zug

Absolutely. Especially when those election deniers become insurrectionists.

Simulacra

They went against a government narrative. This wasn't Google/YouTube banning so much as the government ordering private companies to do so.

LeafItAlone

And do you think the impetus behind this action happening now is any different? In both cases YouTube is just doing what the government wants.

JumpCrisscross

> wasn't Google/Youtube banning so much as government ordering private companies to do so

No, it was not. It's particularly silly to suggest this when we have a live example of such orders right now.

The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a "bully pulpit." But there were no orders, no credible threats, and plenty of companies didn't deplatform these folks.

spullara

They literally had access to JIRA at Twitter so they could file tickets against accounts.

starik36

That was certainly the case with Twitter. It came out during the congressional hearings. FBI had a direct line to the decision makers.

stronglikedan

[flagged]

3cKU

[flagged]

lupusreal

Prediction: nobody will be unbanned, because they'll all be found to have committed other bannable offenses. YouTube gives Trump a fake win while actually doing nothing.

bluedino

I'm banned from posting in a couple subreddits for not aligning with the COVID views of the moderators. Lame.

c-hendricks

Whenever someone says "I was banned from ...", take what they say with a huge grain of salt.

pinkmuffinere

Everybody here is a stranger online, so I think grains of salt are reasonable all around. That said, I'm not sure that people-who-were-banned deserve above-average scrutiny. Anecdotally, a lot of the RubyGems maintainers were banned a week ago. It seems really unfair to distrust people _just_ because a person-in-control banned them.

mvdtnz

Reddit (both admins and many subreddit moderators) are extremely trigger happy with bans. Plenty of reasonable people get banned by capricious Reddit mods.

alex1138

Stop excusing it. It's a very real, very serious problem with Reddit. They're very much abusive on this and many other topics

whycome

What exactly constituted a violation of a COVID policy?

PaulKeeble

A lot of channels had to avoid even saying the word Covid; I only saw it return to use at the end of last year. There were a variety of channels banned that shouldn't have been, such as some talking about Long Covid.

carlosjobim

Every opinion different from the opinion of "authorities". They documented it here:

https://blog.youtube/news-and-events/managing-harmful-vaccin...

From the two links in the post, Google fleshes it out in great detail, with many examples of forbidden thought.

miltonlost

[flagged]

someuser2345

> content that falsely alleges that approved vaccines are dangerous and cause chronic health effects

The J & J vaccine was approved at the time, but was later banned for causing chronic health effects.

> claims that vaccines do not reduce transmission or contraction of disease

Isn't that true of the covid vaccines? Originally, the proponents claimed that getting the vaccine would stop you from getting covid entirely, but later on, they changed the goal posts to "it will reduce your symptoms of covid".

roenxi

That policy catches and bans any scientist studying the negative health effects of vaccines who later turns out to be right.

1) YouTube doesn't know what is true. They will be relying on the sort of people they would ban to work out when the consensus is wrong. If I watched a YouTube video of someone spreading "vaccine misinformation" all the way through, there is a pretty good chance that the speakers have relevant PhDs or are from the medical profession - there is no way the YouTube censors are more qualified than that, and the odds are they're just random unqualified employees already working in the euphemistically named "Trust & Safety" team.

2) All vaccines are dangerous and can cause chronic health effects. That statement isn't controversial; the controversy is entirely over the magnitude. Every time I get a vaccine the standard advice is "you should probably hang around here for 5 minutes, these things are known to be dangerous in rare cases". I think in most countries you're more likely to get polio from a polio vaccine than in the wild. On the one hand, that is a success of the polio vaccine. On the other hand, the vaccine is clearly dangerous and liable to cause chronic health problems.

> This would include content that falsely says that approved vaccines cause ... cancer ...

Cancer is such a catch all that we can pretty much guarantee there will be some evidence that vaccines cause cancer. Everything causes cancer. Drinking tea is known to cause cancer.

3) All policies have costs and benefits. People have to be able to discuss the overall cost-benefit of a policy in YouTube videos even if they get one of the costs or benefits completely wrong.

TeeMassive

> This seems like good banning to me. Anti-vaxxer propaganda isn't forbidden thoughts. It's bad science and lies and killing people.

Any subject important enough in any public forum is potentially going to have wrong opinions that are going to cause harm. While some people could be wrong, and could cause harm, the state itself being wrong is far more dangerous, especially with no dissident voices there to correct its course.

Edit: I see you're getting downvoted for simply stating your honest opinion. But as a matter of principle I'm going to upvote you.

perihelions

According to Google's censorship algorithm: Michael Osterholm's podcast (he's a famous epidemiologist and was, at the time, a member of President Biden's own gold-star Covid-19 advisory panel).

https://x.com/cidrap/status/1420482621696618496 ("Our Osterholm Update podcast episode (Jul 22) was removed for “medical misinformation.”" (2021))

Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.

delichon

My wake-up moment was when they not only took down a Covid debate with a very well-qualified virologist, but also removed references to it from the Google search index, not just the YouTube link.

miltonlost

[flagged]

barbacoa

Google went so far as to scan people's private Google Drives for copies of the documentary 'Plandemic' and delete them.


jimt1234

[flagged]

zobzu

[flagged]

potsandpans

Saying lab leak was true

reop2whiskey

Is there any political censorship scheme at this large a scale in modern US history?

woeirua

It seems to me that a lot of people are missing the forest for the trees on misinformation and censorship. IMO, a single YouTube channel promoting misinformation, about Covid or anything else, is not a huge problem, even if it has millions of followers.

The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.

asadotzler

Why? Why is Google obligated to publish your content? Should Time Magazine also give you a column because they give others space in their pages? Should Harvard Press be required to publish and distribute your book because they do so for others?

These companies owe you nothing that's not in a contract or a requirement of law. That you think they owe you hosting, distribution, and effort on their algorithm is a sign of how far off course this entire discourse has moved.

CobrastanJorji

Yeah, there are two main things here that are being conflated.

First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.

Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.

stronglikedan

The problem is that misinformation has now become information, and vice versa, so who was anyone to decide what was misinformation back then, or now, or ever.

I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.

3cKU

[dead]

kypro

I've argued this before, but the algorithms are not the core problem here.

For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.

My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that; the issue is that most people don't. They want to hear what they already think.

So while I 100% support changing algorithms to encourage more diversity of views, I also think as a society we need to question why people don't naturally want to listen to more perspectives. Personally I get so bored when people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very significant minority.

woeirua

I might agree that the algos making recommendations on the sidebar might not matter much, but the algos that control which videos show up when you search for videos on Google, and also in YouTube search, absolutely do matter.

theossuary

The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.

squigz

I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.

hsbauauvhabzb

Algorithms that reverse the damage by providing opposing opinions could be implemented.

terminalshort

The algorithm doesn't push anyone. It just gives you what it thinks you want. If Google decided what was true and then used the algorithm to remove what isn't true, that would be pushing things. Google isn't and shouldn't be the ministry of truth.

woeirua

Exactly, they shouldn't be the ministry of truth. They should present balanced viewpoints on both sides of controversial subjects. But that's not what they're doing right now. If you watch too many videos on one side of a subject, it will just show you more and more videos reinforcing that viewpoint, because you're likely to watch them!

TremendousJudge

"what it thinks you want" is doing a lot of work here. why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? why would it give you exactly what you "want" instead of something aligned with some other value?

ironman1478

There isn't really a good solution here. A precedent for banning speech isn't a good one, but COVID was a real problem and misinformation did hurt people.

The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist though, because you're allowed to sue for libel and slander. We know that it's harmful, because people will believe lies about a person, damaging their reputation. It's not clear why it can't be generalized to things that we have a high confidence of truth in and where lying is actively harmful.

asadotzler

No speech was banned. Google didn't prevent anyone from speaking. They simply withheld their distribution. No one can seem to get this right. Private corporations owe you almost nothing and certainly not free distribution.

ironman1478

The article mentions that Google felt pressured by the government to take the content down, implying that they wouldn't have if it weren't for the government. I wasn't accusing Google of anything, but rather the government.

Maybe it's not banning, but it doesn't feel right? Google shouldn't have been forced to do that; really, the people who spread genuinely harmful disinformation - like injecting bleach, the ivermectin stuff, or the anti-vax stuff - should've faced legal punishment.

reop2whiskey

What if the government is the source of misinformation?

ironman1478

It's interesting you say that, because the government is saying Tylenol causes autism in infants when the mother takes it. The original report even says more verification is required and its results are inconclusive.

I wouldn't be surprised if some lawsuit is incoming from the company that manufactures it.

We have mechanisms for combatting the government through lawsuits. If the government came out with lies that actively harm people, I hope lawsuits come through or you know... people organize and vote for people who represent their interests.

alex1138

Virtually all of the supposed misinformation turned out not to be that at all. Period, the end. All the 'experts' were wrong; all those we banned off platforms (the actual experts) were right.

alex1138

[flagged]

rustystump

The problem with any system like this is that, due to scale, it will be automated, which means a large swath of people doing nothing wrong will be caught up in it.

This is why perma bans are bad. I'd rather have a strike system before a temp ban, to give people some breathing room to navigate the inevitable incorrect automation. Even then, if the copyright issue is anything to go by, this is going to hurt more than help.