The NO FAKES act has changed, and it's worse
119 comments · June 24, 2025
rootlocus
dspillett
> Sounds like the kind of system small companies can't implement and large companies won't care to implement.
Or the sort of thing bigger companies lobby for to make the entry barriers higher for small competition. Regulatory capture like this is why companies above a certain level of size/profit/other tend to swing in favour of regulation, when they were not in favour of it while initially “disrupting”.
a4isms
To adapt the words of composer Frank Wilhoit:
"Crony capitalism consists of but one principle: In-corporations who are protected by regulation, but not bound by it, alongside out-corporations who are bound by regulation, but not protected by it."
dredmorbius
I love Wilhoit's original, and want to note that he's been active lately on HN as well (<https://news.ycombinator.com/user?id=FrankWilhoit>).
His sentiments are presaged by others, including Adam Smith (1776):
Civil government, so far as it is instituted for the security of property, is in reality instituted for the defense of the rich against the poor, or of those who have some property against those who have none at all.
<https://en.wikisource.org/wiki/The_Wealth_of_Nations/>
Or some guy named Matthew, somewhat earlier:
For whosoever hath, to him shall be given, and he shall have more abundance: but whosoever hath not, from him shall be taken away even that he hath.
potato3732842
> Regulatory capture like this is why companies above a certain level of size/profit/other tend to swing in favour of regulation when they were not while initially “disrupting”.
Exactly. This is the big boy version of "even your petty backyard patio needs a PE stamp" type crap.
neilv
Large companies will, and it becomes a moat to smaller entrants.
But it sounds much worse than that: infrastructure for textbook tyranny.
Suppress speech of dissidents against the regime, take away their soapboxes and printing presses, demand that the dissident be identified and turned over to the regime, and put fear of sanctions into all who might be perceived by the regime as aiding dissidents through action or inaction.
imtringued
>Large companies will, and it becomes a moat to smaller entrants.
They won't care to implement it. They'll follow the letter of the law, which means they'll aggressively block anything that gets (falsely) reported, but they certainly won't follow its spirit.
bryanrasmussen
the spirit is to tell the powers that be who the powerless person is who crossed them, the big companies definitely will follow that spirit. It is the very spirit which moves them.
Tadpole9181
Because Big Tech doesn't capitulate with tyrants, of course. It's not like there were billionaires right there next to the President while Elon Musk hit a Nazi salute on stage.
coldtea
Also the kind of system that will absolutely be used by large companies where most of online discussion happens to squash whatever the government doesn't like.
spacecadet
Or a system intended to prevent and identify dissidence by labeling anything as "fake".
ajross
> Sounds like the kind of system small companies can't implement and large companies won't care to implement.
This is true. Which is of a piece with EFF's general bent these days. They stopped caring about big issues of internet freedom long ago and are now just a parade of Big Tech Bad headlines.
And in an era where (1) Big Tech has continued, after several decades, to be really quite a benign steward of society's information and (2) we have a bunch of unsupervised 20-something MAGA bros loading the entirety of the Federal government onto their Macbooks, that seems extremely tone deaf to me.
The tech privacy apocalypse is upon us. And the %!@#! EFF is still whining about Meta and ByteDance for its click stream, because like everyone else on the internet that's what they really care about.
cvz
I don't understand this comment. The fine article is about a proposed law that would allegedly require the implementation of half-baked censorship systems along the same lines as the DMCA. Are you saying that's not a real issue because the EFF also whines about big tech?
ajross
I'm saying I'm fed up with the EFF for their silence in the face of genuine disaster, and am treating them like the click farm they've become. Is NO FAKES a bad law? Probably. Do I trust the EFF to tell me that? Not anymore.
They were a genuine beacon of rationality and justice in the early internet. They're junk-tier blog spam now. And I find that upsetting, irrespective of the status of AI legislation.
dredmorbius
I'd argue quite strongly the opposite, that the EFF do care very deeply about internet freedom, but:
1. Have come to realise that Big Tech (including many substantial backers of the organisation, in the past if not presently) is itself a major threat and ...
2. That the problem is far more nuanced than early views (Barlow's "A Declaration of the Independence of Cyberspace", <https://www.eff.org/cyberspace-independence> and Gilmore's "The Net Interprets Censorship As Damage and Routes Around It", see <https://quoteinvestigator.com/2021/07/12/censor/> in particular) suggested. Absolute freedom turns out to not be freedom, for many, shades of Popper's Paradox of Tolerance.
ajross
The EFF has been yelling that Big Tech is a major threat for like three decades now. The dystopia seems not to have arrived. Until now, when it seems to be rolling out as official administration policy.
In point of fact, nonsense like "Big Tech Bad" became part of the general antiestablishment mania that got us into this mess! Google and Meta kept our stuff mostly private, it's the nuts we voted for who were the real threat, and we voted for them because the EFF and other thought leaders told us not to trust the people who turned out to have been the good guys.
And I've mostly had it. The EFF turns out to have been a click farm milking sincere but mostly irrelevant outrage into an engine for social destruction. No thanks.
kmeisthax
> Big Tech continues, after several decades, of being really quite a benign steward of society's information
> we have a bunch of unsupervised 20-something MAGA bros loading the entirety of the Federal government onto their Macbooks, that seems extremely tone deaf to me.
The latter is the direct and consequential result of the former. Elon Musk is Big Tech, and he specifically engineered Trump's second electoral victory to get back at Gavin Newsom not letting him run the California Tesla factory at full tilt during COVID. He also turned Twitter into a 24/7 far-right slop machine.
Before that, under the latter half of the Jack Dorsey regime, Twitter was a 9-to-5 liberal slop machine. And before that it was so painfully "neutral" that it let all these far-right nutjobs get a foothold on the Internet to begin with. Remember Jack Dorsey rolling out the "world leaders" policy just to justify not enforcing the rules on Trump? Or Reddit vetoing moderators of far-right subs from shutting down their own hellholes? Big Tech's stewardship of public forums ranges from "asleep at the wheel" at best to "actively malicious and incompetent" at worst.
And this is downstream of the broader disintegration of the liberal coalition. The operation of a large enterprise involves many inherently illiberal acts, and thus the business half of that coalition no longer acts as liberals. In fact, I would argue they fell away decades ago. We didn't notice because the business class had distracted us with a culture war, specifically using their stewardship of social media to amplify opposition and division.
I have plenty of complaints about the EFF, but their lobbying against Big Tech is not part of them.
stodor89
15 years ago that would've been outrageous. But at this point they're just kicking a dead horse.
Uvix
Maybe 25 years ago. But that ship sailed with PATRIOT act and friends post-2001.
spacecadet
Surprised/not surprised. When surveillance capitalism is the MO for generations, it just becomes the norm.
Personally, I find the world becoming more and more about fighting for survival while simultaneously fuck you I got mine...
coldtea
They got the left cheering to support censorship in the guise of fighting "fake news" when it had the upper hand, lest someone hurt someone's feelings.
Now it will be used against it, and against right-wing dissidents to the establishment fairy tales too.
input_sh
There's a giant difference between supporting changes to the law that allows further censorship and yelling at companies to be consistent about the enforcement of their own terms of service.
One's a law that everyone has to follow, while the other one's a made-up piece of text that the company can change at any point in time if they wish to do so.
bsenftner
Doesn't all this assume that any such media is being "social media" shared? The language of this strikes me as moot within private communities. Could this be the unrealized "thing we want" and that is the killing of social media?
steveselzer
Thing is we understand at an academic level that this is a platform design issue not some abstract problem about free speech or personal responsibility.
The solution is to demand governments force social media companies to implement algorithmic friction coefficients.
It’s simply that the economic incentives involved mean there is no political will to make it happen.
The ruling class benefits from the chaos, division and confusion as perpetuated by social media in its current form. They like it just fine the way it is now.
Some Research:
• Aral & Eckles (MIT, 2019): Introducing friction reduces misinformation spread without limiting freedom of expression.
• Mozilla & Stanford studies (2020–2022): Friction (such as unsharing prompts) reduces fake-news virality by up to 50%.
• Twitter's 2021 experiments: Users changed or deleted tweets 25% of the time when shown fact-check prompts before posting.
t0bia_s
Why not ban lying and set up ministry of truth?
Attempts to regulate lying are just cover for pushing certain narratives to favor political opinions.
harvey9
So all the images need to go through replica filters but ai makes it trivially easy to make substantially different images from a single prompt so now we need an ai to infer the 'meaning' of an image. It all sounds like great news for chip makers and power generators.
Melatonic
I'm not sure I agree with the author at all on this - they make a bunch of bold claims (comparing it to the DMCA which is totally different) without much proof.
Counterpoint:
https://www.recordingacademy.com/advocacy/news/no-fakes-act-...
fifteen1506
Such companies have the resources to sell a used snake skin to an elephant.
I can be easily fooled by such blog entries.
daft_pink
I would really like a simple english explanation of what this does without the lobbying/agitator catastrophizing.
Are they saying that this act allows companies to watermark their content to prevent other companies from generating secondary content on their content?
How exactly does this act work?
strix_varius
This has nothing to do with watermarking.
It allows anyone to provide a complaint without any evidence, about any content hosted anywhere, and puts the legal onus on the person hosting that content to prove the negative (that it isn't a "fake"), with the legal requirement to remove the content as soon as is technically feasible (ie, instantly).
So, similarly abusable to the DMCA, but with even broader requirements and even more-impossible-to-prove legitimacy, since it creates a whole new ill-defined class of legal ownership over content.
TLDR: it's a great reason to move hosting outside of the US
fifteen1506
I skimmed the text.
I think it is DMCA for images, tools and derivative works, along with all the logging needed to track the creator, the tool and the publisher.
As I said though, I just skimmed the text.
ProllyInfamous
Tennessee enacts several genAI laws July 1st; interestingly (perhaps due to state legislator misunderstanding of terminology?), these generic bans are sooo sweeping that they effectively ban owning any GPU.
>Oy': you got a license for that tensor core, mate?!
----
But hey at least we're outlawing marijuana beyond what the Federal 2018 Farm Bill authorizes, nationally /s
----
Our neighboring Friend down in Texas, the goodly-astute Gov-nuh, was wise to veto similar legislation in his own jurisdiction (due to perceived unconstitutionality / legal challenges).
jekwoooooe
[flagged]
coldtea
>It would mean the end of Reddit which would be simply glorious
Yeah, god forbid there's a place people talk somewhat freely. They might have ...bad opinions.
jekwoooooe
Talking freely and Reddit don’t belong in the same sentence. Go say something that goes against the zeitgeist. It doesn’t have to be political go express support for the wrong celebrity or cause or opinion on shoes even. It’s a cesspool of brainwashing and propaganda.
LocalH
No, they should just differentiate between curation done on behalf of and under direction by the user, and curation done to shove new content of the platform's choice in front of the user.
Platforms shouldn't be deciding for you what you see, outside of the obvious outright illegal things like CSAM
Spivak
It would also mean the end of HN.
jekwoooooe
No it wouldn’t. It would just remove the protections of 230. That means HN can’t hide behind 230 if they don’t act on illegal content (and that doesn’t seem to be a problem)
Anything else is just a slippery slope fallacy
devwastaken
that's also good. online services need to be held responsible for what they host. nowhere else do they get magic exceptions to well established law.
PaulDavisThe1st
what other thing is there that is equivalent to online services publishing user-generated content?
coldtea
>nowhere else do they get magic exceptions to well established law.
Except for all kinds of established laws broken by the state itself (e.g. extrajudicial deportations), or "fast and loose" startups like Uber and AirBnB, or any big enough company really.
And when they can't fit through exceptions to the law, they magically pay politicians and buy their own laws.
But god forbid people can talk freely about it. What if they don't have the right ideas about things?
Analemma_
Time to once again post the “You are Wrong About Section 230” article: https://www.techdirt.com/2020/06/23/hello-youve-been-referre...
In particular, see the very first bullet point: Section 230 makes no mention whatsoever of a publisher/platform distinction. People like you appear to have invented this dichotomy out of whole cloth and attached relevance to it which does not actually exist.
jekwoooooe
“ No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Seems pretty clear to me man
kmeisthax
[dead]
mschuster91
> The new version of NO FAKES requires almost every internet gatekeeper to create a system that will a) take down speech upon receipt of a notice; b) keep down any recurring instance—meaning, adopt inevitably overbroad replica filters on top of the already deeply flawed copyright filters; c) take down and filter tools that might have been used to make the image; and d) unmask the user who uploaded the material based on nothing more than the say so of person who was allegedly “replicated.”
You already need point a) to be in place to comply with EU laws and directives (DSA, anti-terrorism [1]) anyway, and I think the UK has anti-terrorism laws with similar wording, and the US with CSAM laws.
Point b) is required if you operate in Germany, there have been a number of court rulings that platforms have to take down repetitive uploads of banned content [2].
Point c) is something that makes sense, it's time to crack down hard on "nudifiers" and similar apps.
Point d) is the one I have the most issues with, although that's nothing new either, unmasking users via a barely fleshed out subpoena or dragnet orders has been a thing for many many years now.
This thing impacts gatekeepers, so not your small mom-and-pop startup but billion dollar companies. They can afford to hire proper moderation staff to handle such complaints, they just don't want to because it impacts their bottom line - at the cost of everyone affected by AI slop.
[1] https://eucrim.eu/news/rules-on-removing-terrorist-content-o...
[2] https://www.lto.de/recht/nachrichten/n/vizr6424-bgh-renate-k...
pjc50
This is one of those cases where the need to "do something" is strong, but that doesn't excuse terrible implementations.
Especially at a time when the US is becoming increasingly authoritarian.
marcus_holmes
The EU has a different approach to this kind of regulation than the USA [0]. EU regulations are more about principles and outcomes, while US regulation is more about strict rules and compliance with procedures. The EU tends to only impose fines if the regulations are deliberately being ignored, while the US imposes fines for any non-compliance with the regs.
So while you can compare the two, it's not an apples-to-apples comparison. You need to squint a bit.
The DMCA has proven to be way too broad, but there's no appetite to change that because it's very useful for copyright holders, and only hurts small content producers/owners. This looks like it's heading the same way.
> This thing impacts gatekeepers, so not your small mom-and-pop startup but billion dollar companies.
I don't see any exemptions for small businesses, so how do you conclude this?
[0] https://www.grcworldforums.com/risk/bridging-global-business... mentions this but I couldn't find a better article specifically addressing the differences in approach.
noirscape
Yeah one thing the EU as well as most local European courts care about is showing good faith, not just to the court but also to the other party. (Due to how US law works, this isn't quite the case in the US: respect for the court is enforced, respect for the other party isn't required.)
One of the big reasons why CJEU is handing out massive fines to the big tech companies is because of the blatant non-compliance to orders from local courts and DPAs by constantly demanding appeals to them and refusing to even attempt to do anything until CJEU affirms the local courts (and usually ramping up the fines a bit more at that). It's a deliberate perversion of the judicial process so that GAFAM can keep violating the law by delaying the proper final judgement. They'd not have gotten higher fines if they just complied with the local courts while the appeal was going on; that'd be a show of good faith, which European courts tend to value.
mschuster91
Ding ding ding. You are right on the money. Our entire legal system is based on first time offenders getting off with a slap on the wrist (if even that, you'll most likely get away with merely a warning if it's a new law or you were just incompetent or had bad luck) and only if you keep offending you'll get screwed eventually. Our aim is to have as much voluntary compliance as possible to the laws.
In contrast, American jurisdiction is "come down hard from the get-go if it's warranted, but the richer you are the better your chances are at just lawyering yourself out of trouble", and the cultural attitude is "it's better to ask for forgiveness than permission". That fundamentally clashes with our attitude, and not just the general public but especially the courts don't like it when companies blatantly ignore our democratic decisions just to make short term money (e.g. Uber and AirBnB).
johngladtj
None of which is acceptable
mschuster91
[flagged]
AnthonyMouse
The fallacy is in expecting corporations to play the role of the government.
Suppose someone posts a YouTube video that you claim is defamatory. How is Google supposed to know if it is or not? It could be entirely factual information that you're claiming is false because you don't want to be embarrassed by the truth. Google is not a reasonable forum for third parties to adjudicate legal disputes because they have no capacity to ascertain who is lying.
What the government is supposed to be doing in these cases is investigating crimes and bringing charges against the perpetrators. Only then they have to incur the costs of investigating the things they want to pass laws against, and take the blame for charges brought against people who turn out to be innocent etc.
So instead the politicians want to pass the buck and pretend that it's an outrage when corporations with neither the obligation nor the capacity to be the police predictably fail in the role that was never theirs.
johngladtj
[flagged]
anon0502
As I understand it, it proposes broad filters, so more content that should fall under "fair use" will now be taken down faster.
> not your small mom-and-pop startup
not sure why you said this, it's the artists / content makers that suffer.
privatelypublic
Slippery slope. See how far we've fallen.
JKCalhoun
And yet, why are we here? It would seem some bad actors have led us to the now familiar, "That's why we can't have nice things." (It's as though permissiveness and anonymity have a slippery slope as well.)
You can hate the legislation, but it would be nice to hear some alternative ideas that go beyond, "We just have to accept all the horrific stuff that the bad actors out there want to throw onto the internet."
adolph
It seems like one approach is to oppose such a law, but that is playing defense and will eventually lose.
Another approach would be to develop OSS that fulfilled the basic requirements in a non-tyrannical manner that supports people who create things. There was the example of Cliff Stoll on this site just yesterday. It is objectively wrong for someone to mistreat his creative work in that way.
What would httpd for content inspection look like? Plagiarism detection? Geo-encoded fair use?
> The new version of NO FAKES requires almost every internet gatekeeper to create a system that will a) take down speech upon receipt of a notice; b) keep down any recurring instance—meaning, adopt inevitably overbroad replica filters on top of the already deeply flawed copyright filters; c) take down and filter tools that might have been used to make the image; and d) unmask the user who uploaded the material based on nothing more than the say so of person who was allegedly “replicated.”
Sounds like the kind of system small companies can't implement and large companies won't care to implement.