
Lobsters blocking UK users because of the Online Safety Act

jjcm

I definitely get the proactive response here, as I’ve considered the same for my small platform. The biggest issue is that the definition of who the majority of the requirements apply to is quite hard to find. It’s buried on page 65 of this PDF:

https://www.ofcom.org.uk/siteassets/resources/documents/onli...

It defines “large service” as “a service that has more than 7 million monthly active United Kingdom users”. This is 25% of the UK population. If your service isn’t a household name, it mostly doesn’t apply to you, but the language they use makes it seem like it applies to more.

johneth

7 million is about 10% of the UK population, not 25%.

cmacleod4

Many of the requirements apply to services of any size. This is why I will be blocking UK access to my little service (hosted in my home in the UK) when the act comes into force.

Red_Comet_88

They also block the Brave browser entirely. I tried reading into it, but it appears to be one of those "programmer personality quirks". The fellow who runs the site appears to think Brave is a scam of some sort, and just decided to block the entire browser.

Oh well, I'll stick to HN.

acaloiar

This is why Brave was blocked [1]. I'd have felt the same way, and people are entitled to their grudges. I think the grudge is reasonable.

[1] https://github.com/lobsters/lobsters-ansible/issues/45#issue...

maeil

There are millions of much stronger reasons to hold grudges against Google and Microsoft, yet their browsers are somehow fine? It's incredibly arbitrary. For what it's worth, I think crypto has been a scourge on society, so I'm not making this point for any crypto-shilling reasons.

acaloiar

Not only is it not "incredibly arbitrary" for lobsters to respond in kind to Brave's specific, targeted action against lobsters -- it's not arbitrary whatsoever.

blueflow

You can scroll through the lobste.rs moderation log to get an impression of how moderation on lobste.rs works:

https://lobste.rs/moderations

Also incidents like this:

https://lobste.rs/s/zp4ofg/lobster_burntsushi_has_left_site

This kind of stuff gives me hugbox vibes; I would not feel safe there. I'm fairly sure some of the moderators use the website as personal political leverage.

hagbard_c

I used to have an account on that site as well and left for the same reason: repeated messages telling me I had been flagged and needed to reconsider when I dared to venture too far outside the desired narrative. Many others have made or will make the same decision, which leaves sites like this with a population that is mostly ideologically cohesive. Maybe that is a good thing for those sites, and maybe the participants feel 'safe' in such an environment, but it is surely lacking in stimulating curiosity and widening one's intellectual horizons.

420_14_88_69

Sounds familiar, except here there's no notification when you get flagged/deaded/shadowbanned.

420_14_88_69

[flagged]

gaganyaan

What does 14 88 refer to in your username?


bb88

Is it me, or is brave more of a cryptocurrency platform that pretends to be a browser?

A lot of people don't really like the toxic discussions that crypto usually tends to devolve into. So it makes sense to block the browser if you don't want those people on your server.

ykonstant

I am no fan of Brave, but where is the logic here? Just because someone uses Brave, they will engage in toxic discussions on crypto? Am I missing something?

tonfreed

It's probably more of a "I don't like Brendan Eich" thing, but the maintainer can't really say that without sounding unhinged.

Then again, I actively go out of my way to be toxic on the internet, so maybe they have a point

bb88

The crypto world is full of toxicity.

secondcoming

I use Brave for YouTube and other streaming, and I've never encountered any crypto stuff. They have/had their own BAT token that dealt with paid advertising but I've not seen it mentioned in quite some time.

As far as I can tell it's just another browser that blocks a lot of internet crap.

sshine

It’s my impression that it’s mostly a browser. But I don’t use it, because who needs another WebKit clone.

TiredOfLife

It's you.

jsheard

Related, a list of other sites which are blocking the UK or shutting down altogether rather than deal with OSA:

https://onlinesafetyact.co.uk/in_memoriam/

jackjeff

As a person living in the UK, I really hope the rest of the world gives the middle finger to this pathetic extra territorial law by totally ignoring it.

They can ask ISPs to do the censorship if they really want to keep us “safe”.

tzs

I don't know anything about lobste.rs, but they mention lfgss, and when that was discussed on HN a couple of months ago, the person who runs lfgss mentioned these as things they would have to do to comply:

> 1. Individual accountable for illegal content safety duties and reporting and complaints duties

> 2. Written statements of responsibilities

> 3. Internal monitoring and assurance

> 4. Tracking evidence of new and increasing illegal harm

> 5. Code of conduct regarding protection of users from illegal harm

> 6. Compliance training

> 7. Having a content moderation function to review and assess suspected illegal content

> 8. Having a content moderation function that allows for the swift take down of illegal content

> 9. Setting internal content policies

> 10. Provision of materials to volunteers

> 11. (Probably this because of file attachments) Using hash matching to detect and remove CSAM

> 12. (Probably this, but could implement Google Safe Browser) Detecting and removing content matching listed CSAM URLs

A lot of those sound scary to deal with but on closer inspection don't actually seem like much of a burden. Here's what I concluded when I looked into this back then.

First, #2, #4, #5, #6, #9, and #10 only apply to sites that have more than 7 000 000 monthly active UK users or are "multi-risk". Multi-risk means being at medium to high risk in at least two different categories of illegal/harmful content. The categories of illegal/harmful content are terrorism, child sexual exploitation or abuse, child sex abuse images, child sex abuse URLs, grooming, encouraging or assisting suicide, and hate.
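The "multi-risk" test described above is simple enough to sketch as code. A minimal illustration (the category names and function are mine, not from the regulations; "medium to high risk" is treated as either "medium" or "high"):

```python
# Sketch of the "multi-risk" test: medium or high risk in at least
# two of the listed categories of illegal/harmful content.
# Category labels are illustrative shorthand, not Ofcom's exact terms.
CATEGORIES = [
    "terrorism", "child sexual exploitation or abuse", "CSAM images",
    "CSAM URLs", "grooming", "encouraging suicide", "hate",
]

def is_multi_risk(risk_levels: dict[str, str]) -> bool:
    """risk_levels maps category name -> "low" | "medium" | "high"."""
    elevated = [c for c in CATEGORIES
                if risk_levels.get(c, "low") in ("medium", "high")]
    return len(elevated) >= 2
```

So a forum at high risk for only one category would not be multi-risk, while medium risk in two categories would be.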

Most smaller forums that are targeting particular subjects or interests probably won't be multi-risk. But for the sake of argument let's assume a smaller forum that is multi-risk and consider what is required of them.

#1 means having someone who has to explain and justify to top management what the site is doing to comply.

#2 means written statements saying which senior managers are responsible for the various things needed for compliance.

#3 is not applicable. It only applies to services that are large (more than 7 000 000 active monthly UK users) and are multi-risk.

#4 means keeping track of evidence of new or increasing illegal content and informing top management. Evidence can come from your normal processing, like dealing with complaints, moderation, and referrals from law enforcement.

Basically, keep some logs and stats and look for trends, and if any are spotted bring it up with top management. This doesn't sound hard.

#5 You have to have something that sets the standards and expectations for the people who will be dealing with all this. This shouldn't be difficult to produce.

#6 When you hire people to work on or run your service you need to train them to do it in accord with your approach to complying with the law. This does not apply to people who are volunteers.

#7 and #8 These cover what you should do when you become aware of suspected illegal content. For the most part I'd expect sites could handle it like they handle legal content that violates the site's rules (e.g., spam or off-topic posts).

#9 You need a policy that states what is allowed on the service and what is not. This does not seem to be a difficult requirement.

#10 You have to give volunteer moderators access to materials that let them actually do the job.

#11 This only applies to (1) services with more than 7 000 000 monthly active UK users that have at least a medium risk of image-based CSAM, or (2) services with a high risk of image-based CSAM that either have at least 700 000 monthly active UK users or are a "file-storage and file-sharing service".

A "file-storage and file-sharing service" is:

> A service whose primary functionalities involve enabling users to:

> a) store digital content, including images and videos, on the cloud or dedicated server(s); and

> b) share access to that content through the provision of links (such as unique URLs or hyperlinks) that lead directly to the content for the purpose of enabling other users to encounter or interact with the content.

#12 Similar to #11, but without the "file-storage and file-sharing service" part, so only applicable if you have at least 700 000 monthly active UK users and are at a high risk of CSAM URLs or have at least 7 000 000 monthly active UK users and at least a medium risk of CSAM URLs.
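Purely as a reading aid, the threshold logic for #11 and #12 as described above can be encoded like this (function names and parameters are mine; the Ofcom codes are the authoritative text):

```python
# Applicability of hash matching (#11) and CSAM URL detection (#12),
# per the thresholds summarised above. Risk levels: "low"/"medium"/"high".

def hash_matching_applies(mau_uk: int, image_csam_risk: str,
                          is_file_sharing_service: bool) -> bool:
    """#11: hash matching to detect and remove CSAM."""
    if mau_uk > 7_000_000 and image_csam_risk in ("medium", "high"):
        return True
    # High-risk services qualify at a lower user threshold, or if they
    # are a "file-storage and file-sharing service" regardless of size.
    if image_csam_risk == "high" and (mau_uk >= 700_000
                                      or is_file_sharing_service):
        return True
    return False

def url_matching_applies(mau_uk: int, csam_url_risk: str) -> bool:
    """#12: detecting and removing content matching listed CSAM URLs."""
    if mau_uk >= 7_000_000 and csam_url_risk in ("medium", "high"):
        return True
    return mau_uk >= 700_000 and csam_url_risk == "high"
```

A small forum with a few thousand UK users that isn't a file-sharing service falls outside both, whatever its risk profile.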

Canada

I'd like to make a prediction:

The requirements will be modified to include a larger number of sites whenever the government finds this convenient. The MAU limit will be reduced, and/or the scope of "illegal/harmful content" will be expanded.

dang

Related ongoing thread: In memoriam - https://news.ycombinator.com/item?id=43152154

kmeisthax

As someone who self-hosts Mastodon, should I be geoblocking the UK as well?

For the record, I only host it for myself, so I'm pretty sure I wouldn't have received any of the legal protections that the OSA is now stripping away, and thus geoblocking the UK wouldn't matter. But if there's something else I'm missing, please let me know.

apple4ever

I would as a matter of principle, but also just to be safe.

ddtaylor

Block and route around failures.
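For what a geoblock might look like in practice, here is a minimal application-layer sketch. It assumes you maintain a list of UK CIDR ranges (e.g. exported from a GeoIP database); the ranges below are documentation placeholders, not real UK allocations, and in production you'd more likely do this at the reverse proxy or firewall.

```python
import ipaddress

# Placeholder CIDR ranges standing in for UK allocations from a GeoIP
# database export. Replace with real data before use.
UK_RANGES = [ipaddress.ip_network(c)
             for c in ("203.0.113.0/24", "198.51.100.0/24")]

def is_uk(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in UK_RANGES)

def handle_request(client_ip: str) -> int:
    # HTTP 451 "Unavailable For Legal Reasons" is the conventional
    # status code for legally motivated blocks.
    return 451 if is_uk(client_ip) else 200
```

Note that GeoIP data is approximate and VPN users will sail straight through, so this signals non-service to the UK rather than guaranteeing it.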

sepositus

Is a decentralized approach to communication an effective bulwark against this type of legislation? As is mentioned in the linked thread, the act itself is extremely hard to parse, so maybe it's not even possible to know the answer. Just curious if anyone has done research from that angle.

zimpenfish

Neil Brown[0] has been attending the Ofcom online sessions and asking them about Fediverse servers but they've been unhelpfully vague as to how/if/why/when they fall under OSA.

[0] https://onlinesafetyact.co.uk

do_not_redeem

ELI5: Why should a US-based site, hosted on US soil and run by a US citizen, care about laws in all the hundreds of other random countries located thousands of miles away?

randunel

Because he can face legal consequences for breaking the (local elsewhere) law. The US do the exact same, look up FATCA.

apple4ever

That's a bit different though. If it were similar, he would just be blocked from interacting with the US at all. FATCA doesn't require arrests of non-US people breaking the law; it just harshly punishes those who do by excluding them from the US and withholding 30% of any money. (Note: I oppose FATCA in its entirety, just pointing out the difference here.)

AdrianB1

Because they can be arrested and extradited to UK. Not a high chance, but not zero. And a realistic one is to be arrested while traveling in Europe and extradited to UK.

do_not_redeem

Are there any previous cases of the UK intercepting and arresting a US citizen traveling through Europe for something that isn't a crime in the US?

sshine

There are no documented cases of the UK intercepting and arresting a US citizen traveling through Europe for an act that is not a crime in the United States.

Extradition treaties, such as the UK–US Extradition Treaty of 2003, allow for extradition requests between the two countries.

They generally require that the alleged offense be a crime in both jurisdictions ("dual criminality") which ensures that individuals are not extradited for actions that are not considered crimes in their home country.

regularjack

Would you take the risk if it were you hosting the site?

puppycodes

Idk about you but I tend to avoid travel to police states and places where being rude is a crime.

Extraditing someone because their website uses encryption would force them to prosecute so many people and organizations it would be a joke. You would have to really piss them off.

LAC-Tech

Question: why pro-active block some authoritarian countries like the UK, but not others like China? Is it because only the UK passes legislation that threaten people outside of its borders? Or does China do it too and we ignore it?

ok_dad

China probably can’t get the USA to extradite an American citizen for breaking a Chinese law. The UK can.

ChocolateGod

The UK can't even get the USA to extradite an American woman who was driving on the wrong side of the road and killed a child.

The US-UK extradition treaties have always been one-way.

do_not_redeem

Are there any examples in history of the US extraditing someone with no UK ties to the UK?

mattlondon

If you commit a crime in the UK then I would expect extradition to be a genuine risk.

It's a connected world, so online activities are probably pretty grey areas. If you defraud someone (for example) but they're in another country, where did the crime happen?

LAC-Tech

Yeah that makes sense.

I thought extradition laws usually had some kind of clause like "only applies if it'd be a crime in both jurisdictions", but now that I think about it, I can't remember where I heard that or why I believe it, so I will admit complete ignorance.