
Forum with 2.6M posts being deleted due to UK Online Safety Act

mjburgess

What is the meaning of "illegal content" given in the OSA? What will social media platforms be forced to censor (remove, etc.)? Let's take a look:

Table 1.1: Priority offences by category ( https://www.ofcom.org.uk/siteassets/resources/documents/onli... )

Discussion of offences related to: prostitution, drugs, abuse & insults, suicide, "stirring up of racial/religious hatred", fraud and "foreign interference".

So one imagines a university student discussing, say: earning money as a prostitute. Events and memories related to drug-taking. Insulting their coursemates. Ridiculing the iconography of a religion. And, the worst crime of all, "repeating Russian propaganda" (e.g., the terms of a peace deal) -- which Russians said it, and whether it is true, are -- of course -- questions never asked nor answered.

This free-thinking university student's entire online life seems to have been criminalised by the OSA for mere discussion; there may have been zero actual actions involved (consider, though, that a majority of UK students at most prominent universities have taken class-A drugs).

This seems as draconian, censorious, illiberal, repressive and "moral panic"-y as the highs of repressive Christian moralism in the mid-20th century.

ziddoap

Related post with a large discussion from someone who said:

"Lfgss shutting down 16th March 2025 (day before Online Safety Act is enforced)

[...] I run just over 300 forums, for a monthly audience of 275k active users. most of this is on Linode instances and Hetzner instances, a couple of the larger fora go via Cloudflare, but the rest just hits the server.

and it's all being shut down [...]"

For the same reasons.

https://news.ycombinator.com/item?id=42433044

jonatron

HEXUS stopped publishing in 2021, and the company no longer exists. The forums were kept because they don't take much work to keep online. Now, there's a lot of work to do, like reading hundreds of pages of documents and submitting risk assessments. There's nobody to do that work now, so the idea was it could go into read only mode. The problem with that was, some users may want their data deleted if it becomes read only. Therefore, the only option is to delete it.

nickdothutton

Anyone* would be crazy to run a UK-based or somewhat UK-centric forum today. Whether it be for a hobby, profession, or just social interaction. The government doesn’t perceive these sites as having any value (they don't employ people or generate corporation tax).

[*] Unless you are a multibillion $ company with an army of moderators, compliance people, lawyers.

whartung

Well I'm on a forum run by a UK company, hosted in the UK, and we've talked about this, but they're staying online. And, no, they're not a multibillion dollar company.

I don't see our moderators needing to do any more work than they're already doing, and have been doing for years, to be honest.

So we'll see how the dice land.

aimazon

The opposite is true. The new law makes it considerably more risky for large companies because the law is specifically designed to hold them to account for conduct on their platforms. The (perceived) risk for small websites is unintended and the requirements are very achievable for small websites. The law is intended for and will be used to eviscerate Facebook etc. for their wrongs. We are far more likely to see Facebook etc. leave the UK market than we are to see any small websites suffer.

A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.

mschuster91

> A small website operator can keep child pornography off their platform with ease. Facebook have a mountain to climb — regardless of their resources.

Facebook can actually train AI to detect CSAM, and is probably already doing so in cooperation with NCMEC and similar organisations/authorities across the world.

Your average small website? No chance. Obtaining training material actively is seriously illegal everywhere, and keeping material that others upload is just as bad in most jurisdictions.

The big guys get the toys, the small guys have to worry all the goddamn time if some pedos are going to use their forum or whatnot.

CamperBob2

No, that is not how it works. Large companies can afford compliance costs. Smaller ones can't.

aimazon

What are the compliance costs for this law that would apply to a small independent forum?

DarkmSparks

More than just forums, it's basically a failed state now. I knew when I left (I was the last of my school year to do so) it was going to get bad once Elizabeth died, and that would be soon, but I never imagined it would get this bad.

The plan for April is to remove the need for police to obtain a warrant to search people's homes - that bad.

I'd say "there will be blood on the streets", but there already is...

This video pretty much sums up what the UK is now. https://m.youtube.com/watch?v=zzstEpSeuwU

nerdile

Summary: The UK has the Online Safety Act; any website that lets users interact with other users has to police illegal content on its site and must implement strong age verification checks. The law applies to any site that targets UK citizens or has a substantial number of UK users, where "substantial number" is not defined.

I'm going to guess this forum is UK-based just from all the "blimey"s. Also the forum seems to have been locked to new users for some time, so it was already in its sunset era.

The admin could just make it read only except to users who manually reach out somehow to verify their age, but at the same time, what an oppressive law for small UK forums. Maybe that's the point.

zimpenfish

IANAL

> any websites that let users interact with other users has to police illegal content on its site and must implement strong age verification checks.

But I believe you only need age verification if pornography is posted. There's also a bunch of caveats about the size of user base - Ofcom have strongly hinted that this is primarily aimed at services with millions of users but haven't (yet) actually clarified whether it applies to / will be policed for, e.g., single-user self-hosted Fediverse instances or small forums.

I don't blame people for not wanting to take the risk. Personally I'm just putting up a page with answers to their self-assessment risk questionnaire for each of my hosted services (I have a surprising number that could technically come under OSA) and hoping that is good enough.

tremon

> I believe you only need age verification if pornography is posted

But if you let users interact with other users, you're not in control of whether pornographic material is posted, so it's safer to comply beforehand.

I commend you for keeping your site up and hoping for the best. I don't envy your position.

bostik

> Ofcom have strongly hinted that this is primarily aimed at services with millions of users but haven't (yet) actually clarified [...]

This has echoes of the Snooper's Charter and Apple's decision to withdraw ADP from all of UK.

It is not enough for regulators to say they don't anticipate enforcing the law against smaller operators. As long as the law is on the books, it can (and will) be applied to a suitable target regardless of their size.

I saw this same bullshit play out in Finland. "No, you are all wrong, we will never apply this to anything outside of this narrow band" -- only to come down with the large hammer less than two years later because the target was politically inconvenient.

jonathanstrange

I geo-block UK visitors on all of my websites. It's sad but the safest solution.
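For anyone wondering what that looks like in practice: if the site already sits behind Cloudflare (as some of the LFGSS forums mentioned above do), and Cloudflare's IP Geolocation feature is enabled so it adds a CF-IPCountry header, the block can be a few lines of middleware. Here's a minimal sketch in Go -- the handler name and block-page text are purely illustrative, and it assumes the Cloudflare header is present and trustworthy:

    package main

    import (
        "fmt"
        "net/http"
    )

    // blockUK returns HTTP 451 (Unavailable For Legal Reasons) to any request
    // that Cloudflare has geolocated to the United Kingdom ("GB"). The
    // CF-IPCountry header is only set when the site is proxied through
    // Cloudflare with IP Geolocation enabled; without it, all requests pass.
    func blockUK(next http.Handler) http.Handler {
        return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            if r.Header.Get("CF-IPCountry") == "GB" {
                w.WriteHeader(http.StatusUnavailableForLegalReasons)
                fmt.Fprintln(w, "This site is not available in the United Kingdom.")
                return
            }
            next.ServeHTTP(w, r)
        })
    }

    func main() {
        mux := http.NewServeMux()
        mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "forum home")
        })
        http.ListenAndServe(":8080", blockUK(mux))
    }

If you're not behind Cloudflare you'd do the same check against a local GeoIP database instead; either way it's a best-effort block, not a guarantee (VPNs and travellers will still get through).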

_bin_

why? if you're located elsewhere you can literally just ignore UK/EU law. they don't have jurisdiction over you; worst-case scenario is probably them ordering ISPs to block your site.

rixed

What if a large number of brits access your websites from a different country? :-/

mattlondon

It's for 7 million active UK users per month. https://www.ofcom.org.uk/siteassets/resources/documents/onli... - definition on page 64.

That's quite sizeable. How many sites can you name that have 7 million monthly active UK users? That's more than one in ten of every man, woman and child in the UK using your site every month.

hexator

Feels more and more like we're at the end of an era when it comes to the internet.

mbostleman

By Internet do you mean Western Civilization?

DoingIsLearning

I was gonna say looking at world affairs, it's starting to feel like the end of the Westphalian system.

kelnos

How so? This is just the UK. While the UK really does want to enforce this globally, they really have no enforcement power against non-UK citizens who do not reside in the UK.

Certainly it's possible (and perhaps likely!) that the EU and US will want to copycat this kind of law, but until that happens, I think your alarm is a bit of an overreaction.

nradov

A lot of people who travel internationally occasionally transit through UK jurisdiction, such as a connection at LHR. This potentially places forum operators in personal legal jeopardy. Would the UK authorities really go after some random citizen of another country for this? Probably not, but the risk isn't zero.

hexator

Similar laws are being written elsewhere, Section 230 may not last the next few years. It's not just the UK.

gotoeleven

Well the attacks on section 230 from the right are about removing censorship not adding censorship so I'm not sure section 230 is a good comparison.

rich_sasha

It's awkward.

It's clear this law terribly affects bona fide grassroots online communities. I hope HN doesn't start geoblocking the UK away!

But then online hate and radicalization really is a thing. What do you do about it? Facebook seems overflowing with it, and their moderators can't keep up with the flow, nor can their mental health keep up. So it's real and it's going to surface somewhere.

At some level, I think it's reasonable that online spaces take some responsibility for staying clear of eg hate speech. But I'm not sure how you match that with the fundamental freedom of the Internet.

observationist

You don't. "Hate speech" is code for "the government knows better and controls what you say."

Yes, racism exists and people say hateful things.

Hate speech is in the interpretation. The US has it right with the first amendment - you have to be egregiously over the line for speech to be illegal, and in all sorts of cases there are exceptions and it's almost always a case-by-case determination.

Hateful things said by people being hateful is a culture problem, not a government problem. Locking people up because other people are offended by memes or shitposts is draconian, authoritarian, dystopian nonsense and makes a mockery of any claims about democracy or freedom. Europe and the UK seem hell-bent on silencing the people they should be talking with and to. The inevitable blowback will only get worse if stifling, suppressing, and prosecuting is your answer to frustrations and legitimate issues felt deeply but badly articulated.

rich_sasha

I see no reason why hate speech should be given the benefit of the doubt. And no, it's not because my government told me so, I have my own opinion, which is that freedom of speech ends where threats of violence appear.

If you don't want it tolerated online, which I don't, you need some kind of legal statement saying so. Like a law that says, you can't do it, and websites can't just shrug their shoulders and say it's not their problem.

I don't like this legislation, as it seems to be excessive, but I disagree that the root issue it tries to address is a made-up problem.

EDIT it just struck me that in speech and otherwise, the US has a far higher tolerance for violence - and yes I do mean violence. Free speech is taken much further in the US, almost to the point of inciting violence. Liberal gun laws mean lots of people have them, logically leading to more people being shot. School shootings are so much more common, and it appears there is no widespread conclusion to restrict gun ownership as a result.

Maybe that's a core difference. Europeans genuinely value lower violence environments. We believe all reasonable things can be said without it. That doesn't make this legislation good. But at least it makes sense in my head why some people glorify extreme free speech (bit of a tired expression in this age).

thorncorona

What defines hate speech? Who defines hate speech? Does hate speech result from the speech or the actions of those against the speech? Should the speech of protestors have consequences for disturbing the peace? What consequences should the state force onto individuals for speech, or actors affected by speech?

Americans for lack of a better description grapple with violence of the state differently than Europeans, but it seems neither are without consequence.

Asooka

The problem is that policing hate speech creates a police state worse than allowing hate speech to exist. The system you need to create to police the hate speech will result in more violence against people than letting the hate speech exist. To me, your very statement "freedom of speech ends where threats of violence appear" is a form of hate speech. You are hating on my principle of free speech. It actually makes me physically sick to read those words, because I know where they lead.

Generally on the Internet you would make use of existing tools to prevent people from talking to you if you find them hurtful. For example, I could just block you and not deal with you any more. Sometimes people get around those to harass others. That is definitely bad and we already have laws against harassment and ways for law enforcement to find those individuals without creating a full police state on the Internet. Posting your opinion once is not harassment, no matter how much it makes me want to puke. Or as we used to say in a more civilised time, I abhor your speech, but I will fight to the death for your right to speak it.

I don't know where you got your conclusion from - I am European and I don't mind violent speech. In fact I think we generally need a lot more freedom since many countries give their citizens barely more freedom than serfs had. School shootings have been a perennial favourite for your type to parade around so you can rule over a disarmed population, but e.g. Czechia lets you have a gun at home as easily as the USA and they do not have that problem. USA's problem is mostly societal.

Your opinion sounds like it was formed in the ivory tower of university with no connection with reality. Please get more varied life experience and reconsider your position.

staticautomatic

Hate and radicalization are products of existential purposelessness. You can’t make them go away by preventing existentially purposeless people from talking to each other.

rich_sasha

No, you can't, but there's also no reason why the law should allow these to stay up. Plenty of people have racist thoughts, and that's not illegal (thoughts in general aren't), but go print a bunch of leaflets inciting racist violence and that is illegal.

I see this as an internet analogy.

mjburgess

https://en.wikipedia.org/wiki/2023_Quran_burnings_in_Sweden

Does burning a religious book "incite violence"? It causes it, for sure. Free expression brings about, in the fanatic, a great desire to oppress the speaker. That's why we have such a freedom in the first place.

thorncorona

It seems, though, that allowing a country which already has problems with "lawful free speech" to tamp down more on free speech would bring issues, no?

Not to mention the oxymoron that "lawful free speech" is.

mschuster91

> You can’t make them go away by preventing existentially purposeless people from talking to each other.

At least you can limit the speed of radicalization. Every village used to have its village loon; he was known and either ignored or ridiculed. But now all the loons talk to each other and constantly reinforce their bullshit, and on top of that they begin to draw in the normies.

verisimi

> online hate and radicalization really is a thing

People have always had opinions. Some people think other people's opinions are poor. Talking online was already covered by the law (eg laws re slander).

Creating the new category of 'hate speech' is more about ensuring legal control of messages on a more open platform (the internet) in a way that wasn't required when newspapers and TV could be managed covertly. It is about ensuring that the existing control structures are able to keep broad control of the messaging.

baggy_trough

Governmental attempts to reduce "online hate" (however defined, as it is entirely subjective) are just going to make our problems worse.

mjburgess

Is it a thing?

I mean, we had the Holocaust, the Rwandan genocide and the transatlantic slave trade without the internet.

The discovery, by the governing classes, that people are often less-than-moral is just as absurd as it sounds. More malign and insidious is that these governors think it is their job to manage and reform the people -- that people, oppressed in their thinking and association enough -- will be easy to govern.

A riot, from time to time -- a mob -- a bully -- are far less dangerous than a government which thinks it can perfect its people and eliminate these.

It is hard to say that this has ever ended well. It is certainly a very stupid thing in a democracy, when all the people you're censoring will unite, vote you out, and take revenge.

rich_sasha

It is a thing for sure. How often it happens, I don't know.

I read a number of stories about school children being cyber-bullied on some kind of semi-closed forum. Some of these ended in suicide. Hell, it used to happen a lot on Facebook in the early days.

I totally understand a desire to make it illegal, past a certain threshold. I can see how you start off legislating with this in mind, then 20 committees later you end up with some kind of death star legislation requiring every online participant to have a public key and court-attested age certificate, renewed annually. Clearly that's nonsense, but I do understand the underlying desire.

Because without it, you have no recourse if you find something like this online. For action to be even available, there has to be a law that says it's illegal.

mjburgess

Of course hatred, bullying, etc. is real -- what I was referring to is some special amount or abundance of it as caused by free discussion on the internet (rather than, say, revealed by it; or even, minimised by it).

We're not running the counter-factual where the internet does not exist, or was censored from the start, and where free expression and discussion has reduced such things.

The Salem witch trials are hardly a rare example of a vicious mob exploiting a moral panic to advance their own material interests -- this is something like the common case. It's hard to imagine running a genocide on social media -- more likely it would be banned as "propaganda" so that a genocide could take place.

We turned against the internet out of disgust at what? Was it the internet itself, or just an unvarnished look at people? And if the latter, are we sure the internet didn't improve most of them, and hasn't prevented more than it's caused?

I see in this moral panic the same old childish desire to see our dark impulses as alien, imposed by a system, and to destroy the system so that we can return to a self-imposed ignorance of what people are really thinking and saying. It's just Victorian moralism and hypocrisy all over again. Polite society is scandalised by The Picture of Dorian Gray, and we had better throw the author in jail.

mrtesthah

Online hate is skyrocketing in large part because billionaires and authoritarian regimes are pumping in millions of dollars to uplift it. Let’s address this issue at its source.

massifgreat

These UK laws might boost Tor usage... let's hope something good will come from the full-on censorship and political tyranny in Europe.

anigbrowl

The UK is not in Europe, which would otherwise impose human rights legal constraints on UK government legislation.

ksp-atlas

The UK is in Europe; it didn't suddenly break off and float away. It's just not part of the EU, and there are a bunch of European countries that aren't in the EU.

pmdr

I doubt it. I think these laws were made to herd users towards big tech's established platforms that are 'policed' by community guidelines deemed 'appropriate' and where the content is never more than a takedown request away.

Welcome to the new internet.

(and it's funny how everyone's yelling 'fascist' at whatever happens in the US instead)

a0123

Two countries can be fascist at the same time.

And it's not like the UK and the US aren't known for exchanging the worst of the worst with each other all the time.

aqueueaqueue

Leave this "vile" "unsafe" forum and go talk on ... er ... Twitter.

throw_m239339

Right, it is called Regulatory Capture, because big actors have the means to comply.

pembrook

Trust me, while the big social media sites love this, it wasn't their lobbying that made this happen.

The UK government has a long history of meddling in media coverage to achieve certain aims. Up until Covid, legacy media still had control over the narrative and the internet was still considered 'fringe,' so governments could still pull the tried-and-true levers at 1-3 of the big media institutions to shape opinion.

Post-covid, everyone became internet nerds and legacy media in english-speaking countries fully lost control of the narrative.

This regulation is intended to re-centralize online media and bring back those narrative control levers by creating an extremely broad surface area of attack on any individual 'creator' who steps out of line.

tehjoker

they were alarmed they lost what used to be tight control of media narratives around e.g. the gaza genocide and are working overtime to concentrate control so it doesn't happen again

let it be known the UK used its carve out territory in Cyprus to process bomb shipments to the IDF in furtherance of a genocide

https://www.aljazeera.com/news/2024/1/15/uk-bases-in-cyprus-...

kittikitti

I have it on good authority that the majority of Tor nodes are compromised.

aimazon

Headline: 2.6M posts

Reality: the forum has a net of negative 358 posts in the last month and negative ~2k posts over the last 12 months. The forum is so inactive that they're deleting posts faster than creating them. 8 people have created accounts in the last year.

The forum has been long dead.

pembrook

Apparently any piece of information older than a year has no value to you?

Thankfully you aren't writing the laws in my country.

Creating a law that makes internet creators want to delete all historical record for fear of potential prosecution under extremely broad terms -- doesn't seem like it's in the interest of the greater good.

aimazon

The law has absolutely nothing to do with historic content, it has no provisions for or relevance to content published decades ago. Even in the most cautious response to this law, there is no reason to take content offline.

ziddoap

I'm not sure why you're comparing total posts to monthly new posts. The tragedy here is that 2.6 million posts, potentially full of great content, are being deleted.

>The forum is so inactive that they’re deleting posts faster than creating them.

They've been in read-only mode, more or less, for a while. Primarily, again, due to the (at the time proposed, now passed) law.

Not to mention, this comment is missing the forest for the trees. This is not the only forum or website to shutter operations in the wake of the UK Online Safety Act.

aimazon

The forum has had less than 100k posts in the last 10 years.

Forums and small websites have been killed off by changing consumer behaviour, the shift to big social media platforms. Using big numbers to suggest that the UK Online Safety Act is responsible for killing off these smaller independent websites is disingenuous.

If you do the same exercise for the other forums, you’ll find they’re all long dead too.

ziddoap

I posted another example in this thread of someone running forums with 275k monthly active users that also decided to shut down. That does not qualify as "long dead".

That's just one other example. I can assure you that it is not just long-dead forums deciding to shut down, despite your preconceived notion.

_bin_

actual question, why bother? if they are domiciled in the UK, sell it to someone outside it or move the company elsewhere. let the britons kick and scream; the fun thing about the internet is they can't really do anything about it.

jonatron

The company no longer exists and it doesn't make any money so it isn't worth anything.

_bin_

someone owns the forum and it's still a big archive of stuff. backlinks pointing there, old info, can run ads so it should cashflow somehow.

omer9

So, what is it about the UK Online Safety Act that forces the forum to close?

jonatron

This list of requirements is excessive and nobody wants to read through endless documents and do endless risk assessments. https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...

Children's access assessments - 32 pages

Guidance on highly effective age assurance and other Part 5 duties - 50 pages

Protecting people from illegal harms online - 84 pages

Illegal content Codes of Practice for user-to-user services - 84 pages

dv_dt

What happens with cross-nation access? Will international sites start to refuse accounts to Brits?

yuriks

I believe lobste.rs is one site that's going to geoblock the UK as a precautionary measure at least

null

[deleted]

mystified5016

Because the UK refuses to elaborate on who qualifies under the act, and the only "safe" way to operate a website that might hypothetically be used by someone in the UK is to simply not.

The costs required to operate any website covered by this act (which is effectively all websites) are grossly excessive, and there are either NO exceptions, or the UK has refused to explain who is excepted.

hu3

Couldn't they wait for some kind of inquiry from the UK government and then close the forum reactively if it turned out to be an unreasonable financial burden?

zimpenfish

> The costs required to operate any website covered by this act (which is effectively all websites) is grossly excessive

That depends what you count as the costs. If you're a small site[0] and go through the risk assessment[1], that's the only cost you have (unless pornography is involved, in which case yes, you'll need the age verification bits).

[0] i.e. you don't have millions of users

[1] Assuming Ofcom aren't being deliberately misleading here.

lofaszvanitt

they don't want to reengineer the forum...

datadeft

Finally the true decentralized internet could start.