Brazil's Supreme Court makes social media liable for user content
141 comments · June 12, 2025 · ufo
matheusmoreira
The real coup was perpetrated by the supreme court itself. Judges have always been "gods" in this country, and the supreme court in particular has been blatantly grabbing power since at least 2019.
JumpCrisscross
> real coup was perpetrated by the supreme court itself
The side that storms the capital sort of gives up its ability to call the other illegitimate in a democracy.
That doesn't mean they're right. Just that the situation in Brazil clearly escalated to the point where suspending the status quo is, if not merited, at least highly precedented historically.
matheusmoreira
Occupying government buildings is essentially the standard Brazilian protest. It's happened many times before and will happen again. This "storming" amounted to little more than a protest.
If only that had been enough to fix things. I watched those judges censor political speech as "fake news" throughout the entire 2022 election. Then they showboated in public about how they had been personally responsible for the former president's defeat.
cesarb
I always like going to the source for things like that. As the article mentioned, the voting hasn't concluded yet. This article has a link to the seven votes so far, with the full explanation for each one: https://noticias.stf.jus.br/postsnoticias/stf-avanca-em-anal...
cvjcvjcvj
You are wrong, sorry.
The STF vote stands at 6-1, and the result is now irreversible. With only 11 justices, even if the remaining four voted against, the majority would still be maintained (6 votes vs. 5).
But there’s only one more fascist judge left on the bench, so the final tally will end up 9 to 2. The decision is final.
cesarb
> The STF vote stands at 6-1, and the result is now irreversible.
Even then, the detailed justification for each vote is interesting. Just the first one is nearly 200 pages: it presents the whole history of the discussion, cites precedents not only from Brazil but also from related laws all over the world, explains the consequences of this particular article being ruled invalid (it's not a simple "make them liable"; it's the removal of a specific article that shielded them from liability, which means other articles still apply), and so on. This is much more detailed and nuanced than anything a single six-paragraph article can convey. That's why I prefer to go to the source for things like this.
ty6853
I have no idea how Brazilian courts work. In the US, the Supreme Court often issues 200-page opinions full of high-IQ hot air that all sound very nice and nuanced. But in practice they know damn well that what they're really doing is a long wink and a nod: they deny cert any time the lower courts do their real bidding, because they know everything they wrote was bullshit and they don't want to have to hold themselves to it. The point was just to keep up appearances while rolling out the red carpet.
timbit42
If they manipulate the feed (add, hide, reorder) then they should be held liable for user content.
akoboldfrying
It will be really interesting to watch how this unfolds. My hypothesis: Meta, X et al. will threaten to just wholesale IP-block the country (not worth the risk), and it will become a game of Chicken.
Side note: The density of ads on that page is almost impressive.
EasyMark
I always use an adblocker, and that page is a stubborn one; the blocker just keeps blocking, and after a couple of minutes the count rises to 100+.
cvjcvjcvj
Here’s an analogy: If a radio station or TV channel taught a child to cut themselves, self-harm, or drink toilet water, that broadcaster would be held accountable.
So how can we allow social media platforms to escape responsibility? Children are dying.
ty6853
The problem isn't that a child knows how to cut themselves; it's that they lack guidance in life teaching them why that is a dumb idea, or, if they're too young to learn that, the supervision to keep them from doing it.
matheusmoreira
That reaction would be glorious, but I doubt it's going to happen. Too much money at stake. They're likely to just accept it, just like they accept Chinese control.
Brazil wants to become like China. One of the judge-kings even declared his admiration for the Chinese and their control of communications. It's quite terrifying.
Donald Trump has been threatening sanctions against these people for a while now. What's he waiting for?
defrost
By that logic, the USA also wants to become like China. The current POTUS, Donald J. Trump, declared his admiration for the strong Chinese response to the Tiananmen Square protests.
* https://www.businessinsider.com/trump-praised-china-tiananme...
* https://www.theguardian.com/us-news/2016/mar/11/donald-trump...
(and numerous other reports since his 1990 comments)
dragonwriter
> By that logic, the USA also wants to become like China.
Yes it (or at least the presently ruling faction) absolutely wants that substantively, but with a White Christian nationalist rather than Communist rhetorical focus.
Not really the kind of “by that logic...” that is useful for rebutting the argument it targets.
matheusmoreira
He's a bit of a tyrant himself, isn't he? It's alarming but I suppose it's not surprising. Haven't seen him defend censorship yet though.
diego_moita
> Brazil wants to become like China.
Here's a clear sign of a typical "Bozotário" paranoia: reducing every sensible discussion to a leftist conspiracy.
> One of the judge-kings even declared his admiration for the Chinese and their control of communications.
Sure. And since you found that in your WhatsApp group then it has to be true. Because no one lies on WhatsApp, right?
matheusmoreira
> Here's a clear sign of a typical "Bozotário" paranoia
Nobody mentioned Bolsonaro. He's history.
> And since you found that in your WhatsApp group then it has to be true.
You don't have to take it from me. Here's a top Google search result:
https://www.terra.com.br/noticias/brasil/politica/somos-admi...
"We are admirers of the chinese regime."
Take it up with whoever wrote that article if you disagree. Either way, refrain from replying to my comments in the future. I really have no inclination to engage with sarcastic "WhatsApp uncle" arguments.
msgodel
Heh, good. "Social media" has always been worse than just running a personal website. Especially now that most social media platforms are walling themselves off from the rest of the web.
bethekidyouwant
Who decides what you are allowed to say or see?
HelloMcFly
They already have their finger on the scale with their algorithms. They're pushing these posts to people, not just letting people see posts from communities they have chosen to join. To me, the algorithmic nature of post visibility is what muddies the water here and puts them in the position of a promoter of speech rather than a neutral platform for speech.
No, I don't have a proposal or solution, I'm ultimately out of my depth here. But I do think it's a little more complex than it is sometimes made out to be.
EasyMark
The current government in power who would prefer that they stay in power forever.
mrtksn
It's usually the people you voted for, middle level managers in California, terminally online nerds who moderate communities, angry stalkers and algorithms.
diego_moita
You are confusing "being liable" with censorship.
They're not the same thing, if you care to know.
The first is about accountability, about being responsible for the consequences of one's acts.
The second is preemptive coercion and is indeed a tyrant's basic tool.
bethekidyouwant
Who decides which actions have said consequences for which someone is liable?
diego_moita
Any after-the-fact investigation and due process in courts of law. It will depend on the specifics of each case: libel, incitement to violence, etc.
That's how it works for books, magazines, newspapers. That's why they have legal advisors to define the boundaries of what and how they can publish.
thrance
Censorship is no longer the tool of choice of wannabe dictators. Instead, the focus is now on editorializing communication channels. Open up the front page of X in a private tab and see how everything is far-right propaganda. Who needs to censor anything anymore? Simply downrank whatever displeases you into oblivion.
Good for Brazil, maybe they can slow down the fascist disease that rots our democracies from the inside.
EasyMark
I think they can rely on the general lack of critical thinking skills in the populace as something more effective than constraining information. People have been taught to trust what they read unless it is dead obvious that what they're reading is fiction (like a novel or comic book). All they have to do is flood social media with propaganda and drown out any truth to be found there. It's a very popular tactic on X around election time.
amatheus
Well, if social media can profit from user content, it should be liable for it too, right?
cvjcvjcvj
Thank god.
close04
There's probably a lot of politics behind this judgement that shaped it. But to a certain degree it's fair. Why would social media companies only make money from what users do on their platform but be spared any accountability?
Everything a user does on a social media platform is visible to the company and is monetized. Is any other sector spared accountability when it knows the customer is breaking the law?
This will be expensive for the companies because they have to not just moderate, but do it under a patchwork of different countries' laws. But they were more than happy to make bank from this for years, so they can put all that money to good use. It will also create some opportunity for abuse, but then again, so does allowing anything and everything on the platform.
AnimalMuppet
> Everything a user does on a social media platform is visible to the company and is monetized.
Well, it's visible to the computers. It's potentially visible to a human, but no company has humans actually reading all that content.
And no, an AI is not adequate to filter it, either.
FirmwareBurner
>Why would social media companies only make money from what users do on their platform but be spared any accountability?
Here's the thing that confuses me. In every country I know of, TV and radio stations are liable for the content they broadcast, both in terms of programs and ads. I've seen several fines issued for such breaches, and my country is quite liberal and loosely regulated in this area. So why do we let social media broadcast whatever poison they want, plus all those scammy and predatory ads, to people and get away with it?
That's how big-tech social media oligarchies got so insanely rich compared to brick and mortar businesses. All the scale of the global internet, with none of the liability. Maybe it's time to change that?
saynay
This isn't entirely true in all cases. Consider something like a live broadcast of a sporting event. If some streaker runs naked across the field, are the stations held to account?
That is, in a way, similar to the problem of user-generated content. There is a limit to how much control a social media company will be able to have over the actions of its users. Unless you replace the system entirely with one where all posts are manually approved by a person before they go up, you will need to have at least some reduced liability for the platform owner.
FirmwareBurner
>If some streaker runs naked across the field, are the stations held to account?
If they made no reasonable attempts to move the camera away or cut the feed to something else, yes absolutely they are held accountable. That's why you have TV directors in the studio. Do you think you can get away with lengthy broadcasting of obscene nudity just because you're live?
Have you seen major sporting events like F1? Their broadcasting rooms look like NASA: dozens of people watching several camera feeds simultaneously and picking the best one. They'll definitely spot a naked man running on the field with his junk out soon enough and not share that feed. I assume it's the same for FIFA, NFL, NBA, NHL, golf, tennis, cricket and any other major sporting event.
JumpCrisscross
There is a legitimate argument that social media can be treated differently, as its content is user generated. But that argument falls to pieces given algorithmic boosting, which is clearly an editorial slant.
saynay
"Algorithmic boosting" is not (always?) the same as an editorial slant. Promoting the post with the longest title would be an "algorithmic boost", but clearly not editorial in any way. The most common forms of algorithms are just a function of the number of times people viewed it weighted against the age of the post; there is still no editorial slant there. Even recommendations algorithms like YouTube are mostly the same, with an additional weight based on how likely others who watched the same things as you were to view that video.
orbisvicis
Think about how this would apply to Hacker News. Not all algorithmic bias is evil. If you want to avoid an echo chamber, design an algorithm to boost interesting content that missed the front page. That shouldn't render you vulnerable to liability.
And would this allow Brazil to prosecute Hacker News for cyber security violations if a user posts content regarding Flipper Zero or Japanese IC cards, or data breaches of Brazilian companies...
I think the line should be drawn at ads... and maybe even all profit centers of social media companies.
AnimalMuppet
Depends on what you mean by "algorithmic boosting".
If you mean, "this user viewed A, B, and C. Other users who viewed them also viewed D; let's show D to this user", then no, that's not an editorial slant or an editorial choice. That's an unbiased algorithm driven by users' choices.
If you mean, "Let's identify posts that are related to position Y on subject Z, and boost them extra", then yes, I agree that is an editorial slant.
My impression is that, when people talk about "editorial control", they usually include the first kind, not just the second.
So: How much of the second kind is going on? More than none, I agree, but how much? Does anyone have data? If not, then we're left with impressions, and my impression is that it's fairly small. (Rather, that it's a quite small number of topics, but since it's probably fairly popular topics it may still amount to a fair number of user impressions.)
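For contrast with the second kind, here's a toy sketch of the first, purely behavior-driven kind of boosting; the users and items are invented for illustration only:

    from collections import Counter

    # Illustrative viewing history per user.
    history = {
        "u1": {"A", "B", "C", "D"},
        "u2": {"A", "B", "D"},
        "u3": {"A", "C"},
    }

    def recommend(user: str, k: int = 1) -> list[str]:
        # Score unseen items by how strongly their viewers' histories overlap with this user's.
        seen = history[user]
        scores: Counter[str] = Counter()
        for other, items in history.items():
            if other == user:
                continue
            overlap = len(seen & items)
            if overlap == 0:
                continue
            for item in items - seen:
                scores[item] += overlap
        return [item for item, _ in scores.most_common(k)]

    print(recommend("u3"))  # e.g. ['B'], driven entirely by co-viewing, not by topic

Nothing here inspects what the posts say; how much topic-targeted boosting happens on top of this is the data question raised above.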
FirmwareBurner
>as its content is user generated
Content on TV stations is also mostly third-party generated. Channel 5 didn't make The Punisher; they bought the rights to play it on air. Radio 2 didn't make In da Club, 50 Cent did; they're just playing it on air. Newspapers also publish letters and content from readers, yet unlike social media they're the ones responsible for that content.
And in many countries content is age-restricted by time of day, so TV and radio stations need to edit the songs and movies they play, cutting out various swear words, violence and sex scenes if they want to air them to general audiences, otherwise they get fined.
So why are social media companies allowed to wreak havoc?
BugheadTorpeda6
Yep, just another reason they should be held liable the same as everybody else.
phantomathkg
Are you and GP saying that all social media platforms should police content to the point that everything outlawed in any country should be banned?
mlinhares
Yes, that's how the law works. Social media platforms can't show Nazi content in Germany because that's the law there, and they're perfectly fine complying with it.
BugheadTorpeda6
Yes, I am saying that platforms should be policed to such an extent that they remove and ban all illegal content within a reasonable timeframe. And in the case of something like child porn, knowingly failing to do so quickly should lead to a lot of people going to jail (I think a lot of tech executives and employees belong in jail).
I don't find this to be too much to ask. Every other platform has to do this. The only reason you have a bunch of ridiculously wealthy anti-social people rampaging across the world "disrupting" whatever they see fit is because we decided the rules arbitrarily shouldn't apply to them, on the strength of some wide-eyed and ridiculous utopian bullshit about how "the internet is for free love and knowledge mannnn, and it should be like, freeee brooo". I'm not even slightly sympathetic to that argument, and I don't think the Internet has proven valuable enough, either culturally or productivity-wise, to justify even the slightest loosening of the rules that apply elsewhere. The whole thing starts to look like a rent-seeking scam that was used to destroy a lot of higher-quality information resources and businesses if you squint at it for even a second.
littlestymaar
Finally! I've been yelling at the clouds for years about how it makes zero sense to apply "hosting services" rules to social media when they haven't been doing mere content hosting for years: with their algorithmic feeds curating what everyone sees, they are literally acting as content editors, and as such it made no sense to me that they didn't have the same obligations as any other editors out there.
In fact, because of them the regulation on mere hosting services has increased sharply for no good reason, just because regulators wanted more control over social media.
diego_moita
Good! We need to go further in this direction.
If magazines, newspapers, movies and TV stations are liable for what they publish, why shouldn't social media also be?
The "but it is only transportation of information, like telephones" argument is just ridiculous. It is valid for email at best, it is not valid for social media. They already routinely practice filtering of what is posted.
We shouldn't expect the U.S. to advance this cause. Their congress is too deep in the pockets of lobbyists to be accountable to public interest. It has to come from the E.U. and responsible governments.
Edit: this is a repetition of history. There are plenty of examples of tragedies sparked by publications meant to extract profit from people's paranoia and fear. The most famous one is the witch hunt started by the book Malleus Maleficarum[1], which caused more than 30,000 deaths. We have created means to contain these abuses in new media before. The genocide of the Rohingya people could have been avoided if Facebook had been liable for it.
flenserboy
Yours is an argument to entirely wipe away user content and corporatize the whole of the Internet, a return to the days of TV and radio. Where, in your world, could citizens find a platform willing to let them share their opinions and creations and find the like-minded?
diego_moita
Wow! Did you really read all of that in my comment?
I'd like to know where I said that, because I can't find it.
flenserboy
Your position is clear: if responsible governments (lol) punish platforms for what commoners post on them, platforms wishing to survive will, sensibly, not allow commoners to post. Moderation is not enough, as unapproved things will always make it through the cracks.
The deep distrust of unmoderated, un-nannied communication is also apparent:
>The "but it is only transportation of information, like telephones"
>argument is just ridiculous. It is valid for email at best,
>it is not valid for social media. They already routinely practice
>filtering of what is posted.
The "at best" you use to describe the freedom of communication in email shows that your position has no bottom: everything and anything the masses express, even in private communications, must be policed. You also do not understand that freedom of speech in the US is not moderated or regulated by the government, and is something enjoyed by the people, not just interest or power groups, which adds to the authoritarian vibe of your post.
saynay
It is the end result if you expect the same level of liability as a newspaper or magazine. Every single thing you see in one of those was deliberately put there by a person (well... at least it used to be). If an agent of the print publication deliberately put something in the publication, then the liability falls on the publication and/or that person.
Social media is not the same. The content being posted is not vetted by any agent of the platform, so the liability at least in part falls on the person who posted it. You could argue that the platform should share some liability that is waived as long as they at least try "hard enough" to police their platform, with whatever definition of "hard enough" is chosen. But no automated filter will be perfect, so if you demand the same level of liability as a print publication, you are effectively outlawing social media entirely.
For context, it's useful to recall that after the last presidential election, Brazil withstood a coup attempt similar to the USA's January 6th. Since then, the courts have taken a tougher stance against social media, including a country-wide blockage of X/Twitter last year.
We can expect that social media companies will lobby against this. In Brazil they find allies in the far right, which is also interested in moderation-free social media. For instance, two weeks ago Meta and Google sponsored an event by Bolsonaro's PL party about social media and AI, including lectures training party members in how to effectively employ social media and AI tools. https://oglobo.globo.com/politica/noticia/2025/05/21/pl-anun...