
Forum with 2.6M posts being deleted due to UK Online Safety Act

mjburgess

What is the meaning of "illegal content" given in the OSA? What will social media platforms be forced to censor (or remove)? Let's take a look:

Table 1.1: Priority offences by category ( https://www.ofcom.org.uk/siteassets/resources/documents/onli... )

Discussion of offences related to: prostitution, drugs, abuse & insults, suicide, "stirring up of racial/religious hatred", fraud and "foreign interference".

So one imagines a university student discussing, say: earning money as a prostitute. Events/memories related to drug taking. Insulting their coursemates. Ridiculing the iconography of a religion. And, the worst crime of all, "repeating Russian propaganda" (e.g., the terms of a peace deal) -- which Russians said it, and whether it is true, are -- of course -- questions never asked nor answered.

This free-thinking university student's entire online life seems to have been criminalised in mere discussion by the OSA; there may have been zero actual actions involved (consider, though, that a majority of UK students at most prominent universities have taken class-A drugs).

This seems as draconian, censorious, illiberal, repressive and "moral panic"y as the highs of repressive christian moralism in the mid 20th C.

anal_reactor

We grew up with the internet being a fun place where fun things happen and you don't need to take it so seriously. It was the symbol of freedom. Then the internet evolved into a business center, where everything is taken extremely seriously -- don't you dare break the etiquette. It's a sad change to witness, but it is what it is.

RGamma

It was once in a lifetime. Some things are best when not everybody (but especially lawyers) is aware of them.

Maybe the future will be places guarded by real life trust.

whiteandnerdy

I'm no fan of this act but your characterisation is highly misleading.

To pick two examples from the document you linked:

Discussion of being a sex worker would not be covered. The only illegal content relating to sex work would be if you were actively soliciting or pimping. From the document:

* Causing or inciting prostitution for gain offence

* Controlling a prostitute for gain offence

Similarly, discussion of drug use wouldn't be illegal either per se, only using the forum to buy or sell drugs or to actively encourage others to use drugs:

* The unlawful supply, offer to supply, of controlled drugs

* The unlawful supply, or offer to supply, of articles for administering or preparing controlled drugs

* The supply, or offer to supply, of psychoactive substances

* Inciting any offence under the Misuse of Drugs Act 1971

That's very different to criminalising content where you talk about being (or visiting) a prostitute, or mention past or current drug use. Those things would all still be legal content.

mjburgess

Those are indeed against the law. The issue is what these platforms are required to censor on behalf of these other laws.

Recall that we just spent several years in which discussion of major political issues of concern to society was censored across social media platforms. Taking an extremely charitable interpretation of what government demands will be made here isn't merely naïve but empirically false.

And the reason I chose those kinds of illegal activities was to show that these very laws themselves are plausibly oppressive as-is, plausibly lacking in "deep democratic" support (i.e., perhaps surviving on very thin majorities) -- and so on.

And yet it is these laws for which mass interactive media will be censored.

This is hardly a list with murder at the top.

eikenberry

> [..] as the highs of repressive christian moralism in the mid 20th C.

What makes you pick the mid-20th century as the high point of repressive christian moralism? That doesn't seem even close to the high point if you look back further in history.

mjburgess

I was specifically thinking of the censorship of mass media which took place in the west from the 20s-90s, which enforced a "family values" kind of christian moralism. Prior to the 20s, mass media wasn't particularly censored (https://en.wikipedia.org/wiki/Pre-Code_Hollywood):

USA:

* https://en.wikipedia.org/wiki/Hays_Code

* https://en.wikipedia.org/wiki/Federal_Communications_Commiss...

UK : https://en.wikipedia.org/wiki/Lord_Chamberlain

> From 1737 to 1968, the Lord Chamberlain had the power to decide which plays would be granted a licence for performance; this meant that he had the capacity to censor theatre at his pleasure.

UK : https://en.wikipedia.org/wiki/Video_nasty

> To assist local authorities in identifying obscene films, the Director of Public Prosecutions released a list of 72 films the office believed to violate the Obscene Publications Act 1959.

bryanrasmussen

I'm supposing they mean

as the highs of (repressive christian moralism in the mid 20th C.)

and not

as the highs of (repressive christian moralism) in the mid 20th C.

shadowvoxing

It's a cheapshot at Christianity. That's all it is.

spiderfarmer

You're right. He should have mentioned the Victorian era (1837–1901) as a clear precedent of repressive moralism. Though there are still echoes of it today, where Christians want to ban any sort of criticism of their morality.

ThinkBeat

It is odd that stirring up hatred is fine, as long as it does not pertain to religion, race, sexual orientation

Frieren

This is because use of this data could create significant risks to the individual’s fundamental rights and freedoms. For example, the various categories are closely linked with:

- freedom of thought, conscience and religion;

- freedom of expression;

- freedom of assembly and association;

- the right to bodily integrity;

- the right to respect for private and family life; or

- freedom from discrimination.

ThinkBeat

- freedom of thought

- freedom of expression

- freedom of assembly and association

Then political views should be protected in the same manner?

Further, would this mean that even mentioning the following is forbidden: problems with child abuse in the Catholic church, or problems with LGBT rights in some Islamic groups?

Since both can be seen as spreading hatred of people based on religion.

What about spreading hatred about fat people? Why is that not included?

This response is only intended to point out problems with such censorship as this bill defines.

Spreading or receiving hateful harassment is wrong regardless of the why, so it should all be banned. Or else hateful harassment should be protected under freedom of speech.

To make a law that allows hateful harassment sometimes and makes it illegal in other cases is inherently unsustainable, since it will almost certainly have to keep expanding as other vulnerable groups are identified and thus deserve equal protection.

milesrout

Where does it say discussion of those offences is illegal content? It says "content that amounts to a relevant offence". Frustratingly that is nonsensical: content surely cannot "amount to an offence" in and of itself. Offences have elements, which fall into two categories: actus reus and mens rea. And "content" cannot be either. Perhaps posting some content or possessing some content is the actus reus of an offence but the content itself does not seem to me to sensibly be able to be regarded as "amounting to an offence" any more than a knife "amounts to an offence". A knife might be used in a violent offence or might be possessed as a weapons possession offence but it makes no sense to me to say that the knife "amounts to an offence".

Either way, the point of that document in aggregate seems to be that "illegal content" is content that falls afoul of existing criminal law already: (possession and distribution of) terrorist training material is already illegal and so it is illegal content. But saying that you committed an offence is not, in and of itself, an offence, so saying you took drugs at university doesn't seem to me like it could be illegal content. Encouraging people to do so might be, but it already is.

Maybe I missed the bit where it says discussing things is illegal, so correct me if I am wrong.

Not your lawyer not legal advice etc etc

josephg

> This free-thinking university student's entire online life seems to have been criminalised in mere discussion by the OSA

There's nothing illegal about hosting a forum. The problem is that you as the site operator are legally required to take down certain kinds of content if and when it appears. Small sites with no money or staff don't have the resources to pay for a full-time moderator. That cost scales with the number of users. And who knows what's in those 2.6M historical posts.

From TFA:

> The act will require a vast amount of work to be done on behalf of the Forums and there is no-one left with the availability to do it

Maybe an LLM can carry some of the load here for free forums like this to keep operating?
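
For what it's worth, a minimal sketch of what LLM-assisted triage could look like, assuming a hosted moderation endpoint such as OpenAI's (the model name, helper function, and flag-for-review policy are illustrative, not a compliance recipe):

    # Illustrative sketch only: triage forum posts with a hosted moderation
    # model, then queue anything flagged for a human moderator.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def needs_human_review(post_text: str) -> bool:
        """Return True if the post should go into a human review queue."""
        result = client.moderations.create(
            model="omni-moderation-latest",  # illustrative model choice
            input=post_text,
        )
        verdict = result.results[0]
        # The endpoint flags categories like hate, harassment, self-harm and
        # sexual content; it does not map 1:1 onto the OSA's "priority offences".
        return verdict.flagged

    if __name__ == "__main__":
        print(needs_human_review("example post body goes here"))

Even then, a human still has to review the queue and own the final decision.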

sva_

> Maybe an LLM can carry some of the load here for free forums like this to keep operating?

It can't give you any guarantees, and it can't be held liable for those mistakes.

edoceo

And it misses the point that the law seems to, or could be used to, criminalise the simple discussion of unpleasant (to some) topics.

Without free discourse...well, I think it'd be real bad

mullingitover

This seems to be what the anti-Section 230 folks are going for. The UK just...went ahead and did it?

crimsoneer

All you need to do is have a think about what reasonable steps you can take to protect your users from those risks, and write that down. It's not the end of the world.

handoflixue

1.36 Table 1.2 summarises the safety duties for providers of U2U services in relation to different types of illegal content. The duties are different for priority illegal content and relevant non-priority illegal content. Broadly they include:

a) Duties to take or use proportionate measures relating to the design or operation of the service to prevent individuals from encountering priority illegal content and minimising the length of time that such content is present on the service;

b) Duties to take or use proportionate measures relating to the design or operation of the service to design and operate systems in a way which mitigates and manages the risks identified in the service provider’s risk assessment;

c) A duty to operate the service using proportionate systems and processes designed to swiftly take down (priority or non-priority) illegal content when they become aware of it (the ‘takedown duty’); and

d) A duty to take or use proportionate measures relating to the design and operation of the service to mitigate and manage the risk of the service being used for the commission or facilitation of a priority offence

---

That's a bit more than "have a think"

like_any_other

That is false. The post you replied to virtuously linked directly to the UK government's own overview of this law. Just writing down "reasonable steps" [1] is insufficient - you also have the following duties (quoting from the document):

- Duties to take or use proportionate measures relating to the design or operation of the service to prevent individuals from encountering priority illegal content and minimising the length of time that such content is present on the service;

- Duties to take or use proportionate measures relating to the design or operation of the service to design and operate systems in a way which mitigates and manages the risks identified in the service provider’s risk assessment;

- A duty to operate the service using proportionate systems and processes designed to swiftly take down (priority or non-priority) illegal content when they become aware of it

- A duty to take or use proportionate measures relating to the design and operation of the service to mitigate and manage the risk of the service being used for the commission or facilitation of a priority offence.

- The safety duty also requires providers to include provisions in their terms of service specifying how individuals are to be protected from illegal content, and to apply these provisions consistently.

Even if the language of this law were specific, it requires so many invasive and difficult steps that no hobbyist, or even a small company, could reasonably meet them. But it's anything but specific - it's full of vague, subjective language like "reasonable" and "proportionate" that would be ruinous to argue in court for anyone but billion-dollar companies, and even for them, the end result will be that they are forced to accede to whatever demands some government-sanctioned online safety NGO will set, establishing a neverending treadmill of keeping up with what will become "industry standard" censorship. Because it's either that, or open yourself to huge legal risk that, in rejecting "industry standard" and "broadly recognized" censorship guidance to try to uphold some semblance of free discussion, you have failed to be "reasonable" and "proportionate" - you will be found to have "disregarded best practices and recognized experts in the field".

But, short of such an obvious breach, the rules regarding what can and can't be said, broadcast, forwarded, analysed are thought to be kept deliberately vague. In this way, everyone is on their toes and the authorities can shut down what they like at any time without having to give a reason. [2]

[1] Good luck arguing over what is "reasonable" in court if the government ever wants to shut you down.

[2] https://www.bbc.com/news/world-asia-china-41523073

roenxi

> "foreign interference"

That is a very tricky one to manage on an online forum. If an American expresses an opinion about UK policy, that is literally foreign interference. There isn't a technical way to tell propagandists from opinionated people. And the most effective propaganda, by far, is that which uses the truth to make reasonable and persuasive points - if it is possible to make a point that way then that is how it will be done.

The only way this works is to have a list of banned talking points from a government agency. I'd predict that effective criticism of [insert current government] is discovered to be driven mainly by foreign interference campaigns trying to promote division in the UK.

This runs into the same problem as all disinformation suppression campaigns - governments have no interest in removing the stuff everyone agrees is untrue - what is the point? the flat earthers are never going to gain traction and it doesn't matter if they do - the only topics worth suppressing are things that are plausible and persuasive. The topics most likely to turn out to be true in hindsight.

derefr

> The only way this works is to have a list of banned talking points from a government agency.

How so? The "obvious" solution to me, from the perspective of a politician, would be to 1. require online identity verification for signup to any forum hosted in your country, and then 2. using that information, only allow people who are citizens of your country to register.

(You know, like in China.)

roenxi

That won't stop foreign disinformation. They'll just pay some local to say it.

And China's system doesn't stop disinformation; it promotes disinformation. It is designed to make sure that only China-sponsored disinformation is available. If you want a system for that it is a solved problem; it just isn't a good idea.

bryanrasmussen

The British legal system is a common law one like the U.S., I believe, so it would be up to court interpretation.

Foreign interference would probably be interpreted as an organized campaign of interference being launched by a foreign power.

>This runs into the same problem as all disinformation suppression campaigns - governments have no interest in removing the stuff everyone agrees is untrue

at one time everyone agreed Anti-Vaxx was untrue, and now it's American government policy but still just as untrue.

ziddoap

Related post with a large discussion from someone who said:

"Lfgss shutting down 16th March 2025 (day before Online Safety Act is enforced)

[...] I run just over 300 forums, for a monthly audience of 275k active users. most of this is on Linode instances and Hetzner instances, a couple of the larger fora go via Cloudflare, but the rest just hits the server.

and it's all being shut down [...]"

For the same reasons.

https://news.ycombinator.com/item?id=42433044

femiagbabiaka

LFGSS was a legendary forum, sad to see it go.

mikrotikker

If it's hosted in the USA what's the problem?

nickdothutton

Anyone* would be crazy to run a UK-based or somewhat UK-centric forum today. Whether it be for a hobby, profession, or just social interaction. The government doesn’t perceive these sites as having any value (they don't employ people or generate corporation tax).

[*] Unless you are a multibillion $ company with an army of moderators, compliance people, lawyers.

whartung

Well I'm on a forum run by a UK company, hosted in the UK, and we've talked about this, but they're staying online. And, no, they're not a multibillion dollar company.

I don't see our moderators needing to do any more work than they're already doing, and have been doing for years, to be honest.

So we'll see how the dice land.

like_any_other

As long as they don't upset anyone with influence (government, media, etc.), they'll probably be fine. Otherwise, at best they'll be looking at a ruinously expensive legal battle to justify if what they did was "reasonable" or "proportionate" - the vague terms used by the law.

For my friends, everything; for my enemies, the law.

ChrisRR

At least they're a UK company though so presumably they've at least got some money to support this. If you're an individual running a hobby forum then you're SOL

DarkmSparks

more than just forums, it's basically a failed state now. I knew when I left (I was the last of my school year to do so) it was going to get bad once Elizabeth died, and that would be soon, but I never imagined it would get this bad.

The plan for April is to remove the need for police to obtain a warrant to search peoples homes - that bad.

I'd say "there will be blood on the streets", but there already is...

This video pretty much sums up what the UK is now. https://m.youtube.com/watch?v=zzstEpSeuwU

multjoy

No, the proposal is that there is a power of entry where the police have reasonable grounds to believe stolen property is on the premises and that this is supported by tracking data and that authority to enter is provided and recorded by a police inspector.

This is analogous to s18 PACE post-arrest powers, grafted onto s17 PACE.

The alternative is that we continue to require police to try and get a fast-time warrant while plotted up outside a premises; this is not a quick process, I've done it and it took nearly two hours.

>there will be blood on the streets

Oh, dry up.

DarkmSparks

The topic here is how they made running public forums a crime.

After making secure communications a crime.

And you think a state like that cares about the formalities? lol..

They're just doing what every other monarchy and dictatorship has done in a desperate bid to hold onto power while the state collapses due to inept leadership.

selfhoster11

I find it terrifying that you consider this to be legitimate grounds for a search, and a reasonable procedure for obtaining permission to do so. They should get in line and get permission from proper legal authorities, like all other law enforcement.

spacechild1

> The plan for April is to remove the need for police to obtain a warrant to search peoples homes - that bad.

This seems to be limited to stolen geo-tagged items: https://www.theguardian.com/uk-news/2025/feb/25/police-new-p...

I would agree that this law is a slippery slope, but at the same time we should not omit important facts.

DarkmSparks

It's not a slippery slope, it's carte blanche for a police force with a reputation for e.g. beating elderly people to death because they looked at them wrong (most famous being Ian Tomlinson, but it's fairly regular) to not have to hold back just simply because they run into a locked door.

And that is before you get into the court system, which if you need a quick primer, just look at the treatment of Julian Assange - and that's a "best case" for someone with millions of global supporters.

UK police have targets to hit; they can't hit those targets going after real criminals, so they predominantly target people naive enough to think they want to help them.

Of course they had to make running public forums a crime.

RLN

>I knew when I left (I was the last of my school year to do so) it was going to get bad once Elizabeth died

How small was your school year?! What does Elizabeth (presumably the 2nd) dying have to do with anything?

DarkmSparks

>What does Elizabeth (presumably the 2nd) dying have to do with anything?

Let's just say her replacement's brother is Andrew, and his best mate was Jimmy Savile. Should tell you all you need to know about her replacement, with less chance of me ending up like David Kelly.

Heads of state do matter, regardless of how much propaganda they push that they only matter in other countries. These laws are not something the labour voters asked for.

crimsoneer

[flagged]

JansjoFromIkea

"it was going to get bad once Elizabeth died"

What do you think she was doing?

crimsoneer

I'm sure she had massively strong views on the online safety act and encryption

ChrisRR

This comment has got Daily Mail reader written all over it

huang_chung

[flagged]

_fjg8

The opposite is true. The new law makes it considerably more risky for large companies because the law is specifically designed to hold them to account for conduct on their platforms. The (perceived) risk for small websites is unintended and the requirements are very achievable for them. The law is intended for and will be used to eviscerate Facebook etc. for their wrongs. We are far more likely to see Facebook etc. leave the UK market than we are to see any small websites suffer.

A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.

mschuster91

> A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.

Facebook can actually train AI to detect CSAM, and is probably already doing so in cooperation with NCMEC and similar organisations/authorities across the world.

Your average small website? No chance. Obtaining training material actively is seriously illegal everywhere, and keeping material that others upload is just as bad in most jurisdictions.

The big guys get the toys, the small guys have to worry all the goddamn time if some pedos are going to use their forum or whatnot.

CamperBob2

No, that is not how it works. Large companies can afford compliance costs. Smaller ones can't.

andrei_says_

I believe file uploading services like cloudinary have this capability already. It does have a cost, but it exists.

_fjg8

What are the compliance costs for this law that would apply to a small independent forum?

jonatron

HEXUS stopped publishing in 2021, and the company no longer exists. The forums were kept because they don't take much work to keep online. Now, there's a lot of work to do, like reading hundreds of pages of documents and submitting risk assessments. There's nobody to do that work now, so the idea was that it could go into read-only mode. The problem with that is that some users may want their data deleted if it becomes read-only. Therefore, the only option is to delete it.

ninininino

Sort of like burning down a library because you can't make it ADA compliant and install a wheelchair ramp.

zamadatix

I feel as though the "sort of" is doing a lot of work there.

thereisnospork

Iirc UC Berkeley(?) did exactly that to their YouTube library of recorded lectures due to an accessibility lawsuit.

moffkalast

I think a more accurate comparison would be burning down a library because you can't afford the manpower to check every single book for arbitrarily defined wrongthink.

beeflet

It's more like shutting down a library because you are unwilling to censor the books

jacooper

Why don't they just anonymize the users? Discourse does this, and it's apparently GDPR compliant.

londons_explore

gdpr compliance depends a lot on who you ask, and only a court can make the final decision.

Stripping all usernames out of a forum certainly makes it safer, but I don't think anyone can say there still won't be a few pissed off users who wrote things they now regret on there, and can be tracked back to individuals based on context/writing style alone.

ChrisRR

The Online Safety Act is different to GDPR

nerdile

Summary: The UK has some Online Safety Act; any website that lets users interact with other users has to police illegal content on its site and must implement strong age verification checks. The law applies to any site that targets UK citizens or has a substantial number of UK users, where "substantial number" is not defined.

I'm going to guess this forum is UK-based just from all the blimeys. Also the forum seems to have been closed to new users for some time, so it was already in its sunset era.

The admin could just make it read only except to users who manually reach out somehow to verify their age, but at the same time, what an oppressive law for small UK forums. Maybe that's the point.

zimpenfish

IANAL

> any websites that let users interact with other users has to police illegal content on its site and must implement strong age verification checks.

But I believe you only need age verification if pornography is posted. There's also a bunch of caveats about the size of user base - Ofcom have strongly hinted that this is primarily aimed at services with millions of users but haven't (yet) actually clarified whether it applies to / will be policed for, e.g., single-user self-hosted Fediverse instances or small forums.

I don't blame people for not wanting to take the risk. Personally I'm just putting up a page with answers to their self-assessment risk questionnaire for each of my hosted services (I have a surprising number that could technically come under OSA) and hoping that is good enough.

tremon

> I believe you only need age verification if pornography is posted

But if you let users interact with other users, you're not in control of whether pornographic material is posted, so it's safer to comply beforehand.

I commend you for keeping your site up and hoping for the best. I don't envy your position.

bostik

> Ofcom have strongly hinted that this is primarily aimed at services with millions of users but haven't (yet) actually clarified [...]

This has echoes of the Snooper's Charter and Apple's decision to withdraw ADP from all of UK.

It is not enough for regulators to say they don't anticipate enforcing the law against smaller operators. As long as the law is on the books, it can (and will) be applied to a suitable target regardless of their size.

I saw this same bullshit play out in Finland. "No, you are all wrong, we will never apply this to anything outside of this narrow band" -- only to come down with the large hammer less than two years later because the target was politically inconvenient.

jonathanstrange

I geo-block UK visitors on all of my websites. It's sad but the safest solution.

_bin_

Why? If you're located elsewhere you can literally just ignore UK/EU law. They don't have jurisdiction over you; the worst-case scenario is probably them ordering ISPs to block your site.

rixed

What if a large number of brits access your websites from a different country? :-/

mattlondon

It's for 7 million active UK users per month. https://www.ofcom.org.uk/siteassets/resources/documents/onli... - definition on page 64.

That's quite sizeable. How many sites can you name that have 7 million monthly active UK users? That's over one in ten of every man, woman and child in the UK using your site every month.

kimixa

Yes, the actual draft doesn't really add many requirements to non-"large" services: pretty much having some kind of moderation system, some way of reporting complaints to it, and a designated "contact" individual. I note it doesn't require the proactive internal detection of "harmful" content that many people here seem to assume, just acting on content they already have 'reason to believe' is illegal. Even hash-based CSAM detection/blacklisted URLs isn't required until you're a larger provider or a file-sharing product.

It just seems like an overly formalized way of saying "All forums should have a "report" button that actually goes somewhere", I'd expect that to be already there on pretty much every forum that ever existed. Even 4chan has moderators.

csense

Rather than shut it down, would it be possible to sell the forum to someone in the US for a little bit of money, like $20 or something?

Idea being the US-based owner migrates the DB with posts and user logins to servers hosted on US soil, then if the UK government comes knocking the former owners in the UK can say "Sorry it doesn't belong to us anymore, we sold it, here's the Paypal receipt." (Ideally they'd sell the domain too, but as long as you still have the DB you could always host the forum at a different domain.)

Any forum admins here willing to add another forum to their portfolio?

ryandrake

Or maybe open it up to scraping so someone can archive it -- if the content is that useful, surely some hobbyist outside the U.K. with a few GB of disk space would be willing to host it.

CursedUrn

The US owner would still be obliged to follow the UK rules, apparently. It's unclear how punishment will be enforced exactly.

throwaway48476

Just create a block page for UK IPs with VPN ads.

rich_sasha

It's awkward.

It's clear this law terribly affects bona fide grassroots online communities. I hope HN doesn't start geoblocking the UK away!

But then online hate and radicalization really is a thing. What do you do about it? Facebook seems overflowing with it, and their moderators can't keep up with the flow, nor can their mental health keep up. So it's real and it's going to surface somewhere.

At some level, I think it's reasonable that online spaces take some responsibility for staying clear of eg hate speech. But I'm not sure how you match that with the fundamental freedom of the Internet.

observationist

You don't. "Hate speech" is code for "the government knows better and controls what you say."

Yes, racism exists and people say hateful things.

Hate speech is in the interpretation. The US has it right with the first amendment - you have to be egregiously over the line for speech to be illegal, and in all sorts of cases there are exceptions and it's almost always a case-by-case determination.

Hateful things said by people being hateful are a culture problem, not a government problem. Locking people up because other people are offended by memes or shitposts is draconian, authoritarian, dystopian nonsense and makes a mockery of any claims about democracy or freedom. Europe and the UK seem hell-bent on silencing the people they should be talking with and to. The inevitable eventual blowback will only get worse if stifling, suppressing, and prosecuting is your answer to frustrations and legitimate issues felt deeply but badly articulated.

rich_sasha

I see no reason why hate speech should be given the benefit of the doubt. And no, it's not because my government told me so, I have my own opinion, which is that freedom of speech ends where threats of violence appear.

If you don't want it tolerated online, which I don't, you need some kind of legal statement saying so. Like a law that says, you can't do it, and websites can't just shrug their shoulders and say it's not their problem.

I don't like this legislation, as it seems to be excessive, but I disagree that the root issue it tries to address is a made-up problem.

EDIT it just struck me that in speech and otherwise, the US has a far higher tolerance for violence - and yes I do mean violence. Free speech is taken much further in the US, almost to the point of inciting violence. Liberal gun laws mean lots of people have them, logically leading to more people being shot. School shootings are so much more common, and it appears there is no widespread conclusion to restrict gun ownership as a result.

Maybe that's a core difference. Europeans genuinely value lower violence environments. We believe all reasonable things can be said without it. That doesn't make this legislation good. But at least it makes sense in my head why some people glorify extreme free speech (bit of a tired expression in this age).

david422

> I see no reason why hate speech should be given the benefit of the doubt

Because a lot of speech people don't like gets relabeled as hate speech - which it's not. Or a lot of discussion/debate topics that are sensitive get relabeled as hate.

mitthrowaway2

I agree that threats of violence cross a line, but I think that many countries interpret hate speech to be much broader than this, and there's certainly room for people to disagree, or for one person to say something in a neutral and non-hateful way that another person interprets as a hateful attack.

Some edge cases might include: arguing about interpretations of historical events (eg. Holocaust denial, colonialism, nuclear bombings); arguing about the economic effects of immigration policy; suggesting that one country or another is currently committing genocide; suggesting that one country or another is not currently committing genocide; expressing support for a country or political party that some consider to be committing genocide; arguing that travel restrictions should be imposed on certain countries to contain an epidemic; writing "kill all men" on reddit; publishing a satirical political cartoon depicting the prophet Mohammad; advocating political independence for some geographic region; expressing support for the police in an instance in which they took a state-authorized violent action; expressing support for a vigilante; expressing support for one's country during a violent conflict; expressing sympathy with the opposing side during a conflict; demanding stronger legal penalties for criminals (eg. supporting Singapore's death penalty for drug dealers); publishing a fiction novel in which the villain is a member of a minority group and acts in accordance with a stereotype.

Personally, while I think limits are necessary, the guidelines should be extremely specific and the interpretation extremely narrow to minimize any chilling effect on legitimate expression and discussion. Even where speech can verge into hurtful or offensive territory, I think it's important to allow it in the open, because I think dialogue builds more bridges than it burns. I am concerned that a lot of internet hate-speech legislation goes too far into leaving hatred open to interpretation, which results in conversation spaces being closed down because of the potential liability.

Asooka

The problem is that policing hate speech creates a police state worse than allowing hate speech to exist. The system you need to create to police the hate speech will result in more violence against people than letting the hate speech exist. To me, your very statement "freedom of speech ends where threats of violence appear" is a form of hate speech. You are hating on my principle of free speech. It actually makes me physically sick to read those words, because I know where they lead.

Generally on the Internet you would make use of existing tools to prevent people from talking to you if you find them hurtful. For example, I could just block you and not deal with you any more. Sometimes people get around those to harass others. That is definitely bad and we already have laws against harassment and ways for law enforcement to find those individuals without creating a full police state on the Internet. Posting your opinion once is not harassment, no matter how much it makes me want to puke. Or as we used to say in a more civilised time, I abhor your speech, but I will fight to the death for your right to speak it.

I don't know where you got your conclusion from - I am European and I don't mind violent speech. In fact I think we generally need a lot more freedom since many countries give their citizens barely more freedom than serfs had. School shootings have been a perennial favourite for your type to parade around so you can rule over a disarmed population, but e.g. Czechia lets you have a gun at home as easily as the USA and they do not have that problem. USA's problem is mostly societal.

Your opinion sounds like it was formed in the ivory tower of university with no connection with reality. Please get more varied life experience and reconsider your position.

psunavy03

> Liberal gun laws mean lots of people have them, logically leading to more people being shot.

Explain Czechia and Switzerland, then, please.

mrandish

> Free speech is taken much further in the US, almost to the point of inciting violence.

Yes, that's where we (here in the U.S.) draw the legal line. But almost inciting violence is not inciting violence. Since the U.S. made free speech the focus of the very first rule in the constitution, an enormous amount of jurisprudence and precedent has emerged around exactly how to make those tricky case by case judgements. Whether one agrees with it or not, it's easily the most evolved, detailed and real-world tested (over many decades) body of free speech law humanity has. Because it's deep, complex and controversial, there's also quite a bit of misunderstanding and misinformation about U.S. free speech law. I see incorrect assertions and assumptions quite often in mainstream media outlets who should know better. Here's a good primer on some of the most common misunderstandings: https://www.theatlantic.com/ideas/archive/2019/08/free-speec...

I've studied and read a lot about free speech and the first amendment as I find it fascinating. It took me quite a while to really understand how and why the U.S. implementation got to where it really is (and not the exaggerations and extrapolations that sometimes get amplified). In terms of free speech current practice and precedent, I now think the U.S. has got it just about right in the tricky balance between ensuring the open exchange of ideas (even unpopular ones) against preventing actually real and serious defamation, libel and incitement. To be sure, the U.S. system is based on the principle that it's not the job of the current government in power to force adults to be nice, reasonable or respectful in either words or tone. Freedom of speech means the freedom to be wrong, stupid, or mean, to be insulting or offensive - even to provoke or inflame should you choose to.

While the government won't send men with guns to force you to shut up, other citizens are also free to exercise their rights to tell you (and everyone else) you're an asshole, that you're wrong and exactly why. They are equally free to be rude, offensive and even hateful against your ideas and you. One of the key ideas behind the U.S. constitution is every fundamental right granted to all citizens comes with matching responsibilities for all citizens. In other words, no right is free - they have actual, personal costs for each citizen. In the case of the first amendment, the responsibilities include tolerating speech that's wrong, boorish, offensive or even hateful. As well as the responsibility to exercise your own good judgement on which speech to ignore, reject and/or counter. The open marketplace of ideas, like all markets, is two-sided. Another responsibility is accepting the consequences of exercising your free speech unwisely. Your fellow citizens are free to ignore, argue, yell back, openly mock or just laugh at you. Ultimately, the framers of the constitution believed the majority of citizens can figure out for themselves who's an idiot and who's worth listening to. Which ideas are worth considering and which are important to stand against.

thorncorona

What defines hate speech? Who defines hate speech? Does hate speech result from the speech or the actions of those against the speech? Should the speech of protestors have consequences for disturbing the peace? What consequences should the state force onto individuals for speech, or actors affected by speech?

Americans for lack of a better description grapple with violence of the state differently than Europeans, but it seems neither are without consequence.

crimsoneer

But this is a very us centric view. The rest of the world doesn't tolerate people going around being violent because of the constitution.

thrance

How would you feel about receiving daily credible death threats to you and your family? Should that be tolerated too in the name of the first amendment?

Point is, we must draw the line somewhere. It's never "everything goes". Tolerating intolerance always ends up reducing freedom of expression.

Look at the US, the government is doing everything it can to shove trans people back in the closet, their voices are silenced and government websites are rewritten to remove the T in LGBT. By the very same people who abused "the first amendment" to push their hateful rhetoric further and further until it's become basically fine to do nazi salutes on live TV.

"Free speech absolutism" is a mirage, only useful to hateful people who don't even believe in it.

dinkumthinkum

Death threats are not protected by free speech. I know you are trying to make a hyperventilating political point but it’s just not a genuine thing. I am a little surprised at the amount of those on HN that are against free speech. I mean, don’t you realize that without it, a government you don’t like could imprison you for “denying basic facts of biology” just as another country does for “denying historical events”? It’s madness.

energy123

Hate speech is the thing that plays on the radio station that directly causes the mass graves of the Rwandan genocide. The physical call to violence is just the very last step in a long chain of escalating hate speech, but it is no more culpable than the preceding hate speech that created the environment where that physical call to violence is acted on.

will4274

During the Rwandan genocide, the radio stations played incitement to violence. While "hate speech" is inclusive of speech that incites violence, the types of hate speech which people have contemporary political disagreements about (including this thread) do not include such incitement.

More importantly, causality doesn't erase culpability. The step that immediately preceded the [Charlie Hebdo shooting](https://en.wikipedia.org/wiki/Charlie_Hebdo_shooting) was publishing a cartoon in a newspaper. Those who create hateful environments may have some culpability, but those that act almost always have greater culpability than those who speak.

notatty

[flagged]

kypro

> But then online hate and radicalization really is a thing.

I'm not trying to be edgy, but genuinely why do you care if someone says or believes something you feel is hateful? Personally I'm not convinced this is even a problem. I'd argue this is something that the government has been radicalising people in the UK to believe is a problem by constantly telling us how bad people hating things is. Hate doesn't cause any real world harm – violence does. And if you're concerned about violence then there's better ways to address that than cracking down on online communities.

In regards to radicalisation, this is a problem imo. I think it's clear there is some link between terrorism and online radicalisation, but again, I'd question how big a problem this is and whether this is even right way to combat these issues... If you're concerned about things like terrorism or people with sexist views, then presumably you'd be more concerned about the tens of thousands of unvetted people coming into the country from extremist places like Afghanistan every year? It's not like online radicalisation is causing white Brits to commit terror attacks against Brits... This is obviously far more an issue of culture than online radicalisation.

So I guess what I'm asking is what radicalisation are you concerned with exactly and what do you believe the real world consequences of this radicalisation are? Do you believe the best way to stop Islamic terrorism in the UK is to crack down on content on the internet? Do we actually think this will make any difference? I don't really see the logic in it personally even if I do agree that some people do hold strange views these days because of the internet.

staticautomatic

Hate and radicalization are products of existential purposelessness. You can’t make them go away by preventing existentially purposeless people from talking to each other.

mschuster91

> You can’t make them go away by preventing existentially purposeless people from talking to each other.

At least you can limit the speed of radicalization. Every village used to have its village loon; he was known and ignored or ridiculed. But now all the loons talk to each other and constantly reinforce their bullshit, and on top of that they begin to draw in the normies.

rich_sasha

No, you can't, but also there's no reason why the law should allow these to stay up. Plenty of people have racist thoughts, and that's not illegal (thoughts in general aren't), but go print a bunch of leaflets inciting racist violence and that is illegal.

I see this as an internet analogy.

mjburgess

https://en.wikipedia.org/wiki/2023_Quran_burnings_in_Sweden

Does burning a religious book "incite violence" ? It causes it, for sure. Free expression brings about, in the fanatic, a great desire to oppress the speaker. That's why we have such a freedom in the first place.

thorncorona

It seems, though, that allowing a country which already has problems with “lawful free speech” to tamp down more on free speech would bring issues, no?

Without mentioning the oxymoron that lawful free speech is.

staticautomatic

Yes, incitement is illegal, but you haven't said what kind of speech you actually have in mind. Rather, you've made a tautological assertion that we can't allow incitement because incitement is illegal.

notatty

[flagged]

baggy_trough

Governmental attempts to reduce "online hate" (however defined, as it is entirely subjective) are just going to make our problems worse.

verisimi

> online hate and radicalization really is a thing

People have always had opinions. Some people think other people's opinions are poor. Talking online was already covered by the law (eg laws re slander).

Creating the new category of 'hate speech' is more about ensuring legal control of messages on a more open platform (the internet) in a way that wasn't required when newspapers and TV could be managed covertly. It is about ensuring that the existing control structures are able to keep broad control of the messaging.

notatty

[flagged]

mjburgess

Is it a thing?

I mean we had the holocaust, Rwandan genocide and the transatlantic slave trade without the internet.

The discovery, by the governing classes, that people are often less-than-moral is just as absurd as it sounds. More malign and insidious is that these governors think it is their job to manage and reform the people -- that people, oppressed in their thinking and association enough -- will be easy to govern.

A riot, from time to time -- a mob -- a bully -- are far less dangerous than a government which thinks it can perfect its people and eliminate these.

It is hard to say that this has ever ended well. It is certainly a very stupid thing in a democracy, when all the people you're censoring will unite, vote you out, and take revenge.

rich_sasha

It is a thing for sure. How often it happens, I don't know.

I read a number of stories about school children being cyber-bullied on some kind of semi-closed forum. Some of these ended in suicide. Hell, it used to happen a lot on Facebook in the early days.

I totally understand a desire to make it illegal, past a certain threshold. I can see how you start off legislating with this in mind, then 20 committees later you end up with some kind of death star legislation requiring every online participant to have a public key and court-attested age certificate, renewed annually. Clearly that's nonsense, but I do understand the underlying desire.

Because without it, you have no recourse if you find something like this online. For action to be even available, there has to be a law that says it's illegal.

potato3732842

> Clearly that's nonsense, but I do understand the underlying desire.

I wanna eat hamburgers like Peter Griffin in the stroke episode. But I don't because I'm an adult with logical thinking abilities and I know that there are consequences to my actions even if they are not immediate.

I have less, much less, than zero sympathy for people who advocate for doing things with law and government that the history textbooks are stuffed full of the horrific and nearly inevitable eventual consequences of.

Having benign motives doesn't absolve people for being stupid.

Not that any of this is in disagreement with your points.

mjburgess

Of course hatred, bullying, etc. is real -- what I was referring to is some special amount or abundance of it as caused by free discussion on the internet (rather than, say, revealed by it; or even, minimised by it).

We're not running the counter-factual where the internet does not exist, or was censored from the start, and where free expression and discussion has reduced such things.

The Salem witch trials are hardly a rare example of a vicious mob exploiting a moral panic to advance their own material interests -- this is something like the common case. It's hard to imagine running a genocide on social media -- more likely it would be banned as "propaganda" so that a genocide could take place.

We turned against the internet out of disgust at what? Was it the internet itself, or just an unvarnished look at people? And if the latter, are we sure the internet didn't improve most of them, and hasn't prevented more than it's caused?

I see in this moral panic the same old childish desire to see our dark impulses as alien, imposed by a system, and to destroy the system so that we can return to a self-imposed ignorance of what people are really thinking and saying. It's just Victorian moralism and hypocrisy all over again. Polite society is scandalised by The Picture of Dorian Gray, and we had better throw the author in jail.

notatty

I mean, is it impossible that the commodified web is a sufficient but not necessary condition for atrocities? "But we had the Holocaust without it!" Okay, nobody said the internet was THE cause of ALL atrocities, just that it's actively contributing to today's atrocities. I think your logic is a bit... wrong.

mjburgess

That's not quite my argument. A little more formally:

There's a base rate of human malevolence running in each society. We do not know this base rate, and we can only sample malevolence via mass media (police reports, etc.). If the mass media (including the internet) were a neutral measurement device then we could say for sure that what we're seeing is just the background conditions of society leading to, e.g., riots.

Because our measuring device isn't neutral, we have a problem: are the things we see caused by our measuring? Do we cause more malevolence by participating in social media, which also makes us aware of it?

My argument is that we are presently significantly over-estimating the effect of our participation in the internet as a cause. My view is that its effects at reducing bad-stuff are likely more potent than its effects at causing it, and the vast majority of what we see isn't caused by the internet at all.

One argument for this is that baseline malevolence (violence, etc.) seems to be significantly decreasing, was historically very high, and nothing we see via the internet is surprisingly above that historical baseline.

j8k99kuyr

[flagged]

mrtesthah

Online hate is skyrocketing in large part because billionaires and authoritarian regimes are pumping in millions of dollars to uplift it. Let’s address this issue at its source.

notatty

[flagged]

hexator

Feels more and more like we're at the end of an era when it comes to the internet.

kelnos

How so? This is just the UK. While the UK really does want to enforce this globally, they really have no enforcement power against non-UK citizens who do not reside in the UK.

Certainly it's possible (and perhaps likely!) that the EU and US will want to copycat this kind of law, but until that happens, I think your alarm is a bit of an overreaction.

nradov

A lot of people who travel internationally occasionally transit through UK jurisdiction, such as a connection at LHR. This potentially places forum operators in personal legal jeopardy. Would the UK authorities really go after some random citizen of another country for this? Probably not, but the risk isn't zero.

hexator

Similar laws are being written elsewhere, Section 230 may not last the next few years. It's not just the UK.

gotoeleven

Well the attacks on section 230 from the right are about removing censorship not adding censorship so I'm not sure section 230 is a good comparison.

Yizahi

USA has backdoor laws afaik. Sweden is targeting Signal to force them create a backdoor. And this is only from regular news, I'm not even reading infosec industry updates. All govts are targeting privacy tools and the clock is ticking for them. I'm only hoping that one day these fuckers will be targeted themselves via exploits they have forced on us.

ss64

First they came for the British
And I did not speak out
Because I was not British...

mbostleman

By Internet do you mean Western Civilization?

DoingIsLearning

I was gonna say looking at world affairs, it's starting to feel like the end of the Westphalian system.

TeaBrain

Any pretense of the Westphalian system in most of Europe ended with the European Union.

notatty

[flagged]

notatty

[flagged]

massifgreat

These UK laws might boost Tor usage... let's hope something good will come from the full-on censorship and political tyranny in Europe.

sedatk

If enough people switch to Tor, then Tor will get banned. Technical solutions don’t fix bad policies.

LAC-Tech

If you're in a struggle against a hostile regime, you don't refuse to use the weapons available to you because they're not what will bring you final victory. You use whatever you can.

sedatk

Don’t refuse of course, but any workaround will be unsustainable, and you will eventually run out of measures unless the issue gets addressed politically.

beeflet

Tor is pretty hard to block. I think that some sort of mixnet is pretty much the solution to all ISP/government spying and censorship on the web, as they make the law de facto unenforceable.

sedatk

It's not, really. All governments have to do is block all the IP addresses in exitnodes.txt, and suddenly only a handful of people who can bootstrap Tor using custom exit nodes remain.

pmdr

I doubt it. I think these laws were made to herd users towards big tech's established platforms that are 'policed' by community guidelines deemed 'appropriate' and where the content is never more than a takedown request away.

Welcome to the new internet.

(and it's funny how everyone's yelling 'fascist' at whatever happens in the US instead)

a0123

Two countries can be fascist at the same time.

And it's not like the UK and the US aren't known for exchanging the worst of the worst with each other all the time.

throw_m239339

Right, it is called Regulatory Capture, because big actors have the means to comply.

pembrook

Trust me, while the big social media sites love this, it wasn't their lobbying that made this happen.

The UK government has a long history of meddling in media coverage to achieve certain aims. Up until Covid, legacy media still had control over the narrative and the internet was still considered 'fringe,' so governments could still pull the tried-and-true levers at 1-3 of the big media institutions to shape opinion.

Post-Covid, everyone became internet nerds and legacy media in English-speaking countries fully lost control of the narrative.

This regulation is intended to re-centralize online media and bring back those narrative control levers by creating an extremely broad surface area of attack on any individual 'creator' who steps out of line.

aqueueaqueue

Leave this "vile" "unsafe" forum and go talk on ... er ... Twitter.

tehjoker

They were alarmed that they lost what used to be tight control of media narratives around e.g. the Gaza genocide, and are working overtime to concentrate control so it doesn't happen again.

Let it be known the UK used its carve-out territory in Cyprus to process bomb shipments to the IDF in furtherance of a genocide.

https://www.aljazeera.com/news/2024/1/15/uk-bases-in-cyprus-...

anigbrowl

The UK is not in Europe, which would otherwise impose human rights legal constraints on UK government legislation.

ksp-atlas

The UK is in Europe, it didn't suddenly break off and float away, it's just not part of the EU, there's a bunch of European countries that aren't in the EU

milesrout

The UK is in Europe. What other continent would it be in?

It isn't in the EU, but it is a member of the Council of Europe, which is why it is still a party to the European Convention on Human Rights and the European Court of Human Rights still hears appeals from the UK.

No international agreement can ever or has ever been capable of imposing legal constraints on the British Parliament because it is absolutely sovereign.

ta1243

The UK is a signatory to the European Convention on Human Rights (hell, it wrote it); despite what Farage and the Mail convinced you of in 2016, this was unrelated to the EU.

Freedom2

You know perfectly well that the UK is in Europe. Not part of the EU, but part of Europe as a continent, yes.

anigbrowl

Y'all know perfectly well this refers to the UK leaving the EU.

kittikitti

I have it on good authority that the majority of Tor nodes are compromised.

jlaporte

I sympathize with the operators of these forums of course -- the UK Online Safety Act is poorly conceived.

HOWEVER.

Deleting their forums? "The act will require a vast amount of work to be done on behalf of the Forums and there is no-one left with the availability to do it." [1]

This is a false dichotomy. Put Cloudflare in front of the site, block UK traffic [2], and you're done. 5 minute job.

[1] https://forums.hexus.net/hexus-news/426608-looks-like-end-he...

[2] https://developers.cloudflare.com/waf/custom-rules/use-cases...
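
For what it's worth, a minimal sketch of the application-side version of this, assuming the site sits behind Cloudflare with IP geolocation enabled so that requests carry a CF-IPCountry header (the Flask app and the 451 response are illustrative; the hosted WAF custom rule in [2] achieves the same thing with no code at all):

    # Illustrative sketch only: reject requests that Cloudflare has tagged as
    # coming from the UK via the CF-IPCountry header it adds when the
    # "IP Geolocation" setting is enabled. A real deployment would also verify
    # that the request actually arrived through Cloudflare.
    from flask import Flask, abort, request

    app = Flask(__name__)

    BLOCKED_COUNTRIES = {"GB"}  # ISO 3166-1 alpha-2 code for the United Kingdom

    @app.before_request
    def block_uk_visitors():
        country = request.headers.get("CF-IPCountry", "").upper()
        if country in BLOCKED_COUNTRIES:
            abort(451)  # "Unavailable For Legal Reasons"

    @app.route("/")
    def index():
        return "Forum content served to non-UK visitors."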

atherton33

I don't know the detail here, but in many of the discussions I've seen the operators themselves are based in the UK, and that changes the calculus.

zinekeller

Yeah, GP is, to put it charitably, not understanding the situation.

> About Us

> HEXUS.net is the UK’s number one independent technology news and reviews website.

notatty

[flagged]

chmorgan_

Wow, the UK has these crazy laws too? The German hate speech laws made headlines a week or so ago (https://www.cbsnews.com/news/germany-online-hate-speech-pros...). They'll confiscate your electronics if you insult someone, and they actively monitor the Internet for prohibited speech.

msie

So sites will geoblock the UK and users will use VPN software. Ugh. More software layers, more waste. Also a problem that is solved by a layer of indirection.