Comment on Important News - Geoblocking of the UK
Emperor@feddit.uk 2 days ago
Do you know yet what compliance is going to look like for feddit?
Beyond what we are currently doing (NSFW filter on, Lemmy’s own systems, ways to contact Admins directly and AutoMod), so far I think we need:
- More contact email addresses
- A contact form
- A policy document on abuse
- A policy document outlining the risk assessment and mitigations
The law isn’t designed to be onerous for small website owners (and that is pretty much anyone whose membership isn’t in the upper hundreds of thousands), and all of the above should exceed the requirements. Even if it doesn’t, it demonstrates we’ve done our homework and improved our processes and documentation, which seems to be the law’s intent: it forces you to think about the issues and what you can do about them, when you may have been muddling through until now. So we are opening a dialogue with The Powers That Be, and if there is room for improvement they will let us know rather than having to get threatening.
If you see a significant enough level of migration, would you bite the bullet and just shut up shop for the UK anyway? What’s the point in investing in compliance methods if the core userbase decides to move away?
I really don’t see it coming to that. It should make no difference to the users of the site. It’s a bit of a time sink for Admins at the moment, but once it is done it should hopefully only require the odd tweak.
swizzlestick@lemmy.zip 2 days ago
Thanks for taking the time, very informative :)
I suppose a lot of it comes down to how NSFW is handled. If there are no means to access it on the home instance, then what you’re doing is probably A-OK.
That’s assuming the filter is locked on and any communities that fall into the target categories (or are likely to) are prevented from forming, or expunged where they already exist. No need for invasive age-verification methods if there’s nothing to verify for.
The only problem I can see would be other federated instances that may feed poorly filtered or flat-out unfiltered/untagged NSFW into yours. I imagine that’s going to be a decent chunk of the risk assessment, given that federating with others is the main point of Lemmy as a whole.
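Conceptually, that inbound check could look something like this minimal sketch (purely illustrative Python; the Post structure, field names and keyword list are my assumptions, not Lemmy’s actual code), with the residual risk being exactly the untagged content above:

```python
# Illustrative sketch of a locked-on NSFW filter applied to inbound
# federated posts. All names here are hypothetical, not Lemmy's API.
from dataclasses import dataclass

@dataclass
class Post:
    community: str   # e.g. "memes@lemmy.example"
    nsfw: bool       # the NSFW tag set by the originating instance

# Illustrative keywords for communities in the target categories.
BLOCKED_COMMUNITY_KEYWORDS = {"nsfw", "porn"}

def accept_inbound(post: Post) -> bool:
    """Reject anything tagged NSFW, plus communities whose names suggest
    they fall into the target categories. Untagged NSFW from poorly
    moderated instances can still slip through, hence the risk assessment."""
    if post.nsfw:
        return False
    name = post.community.lower()
    return not any(kw in name for kw in BLOCKED_COMMUNITY_KEYWORDS)
```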
Emperor@feddit.uk 2 days ago
It was the decision of the original Admin, but it has made life easier (as with the CSAM spam attacks: we don’t need to figure out whether it’s real or not; if it’s porn, it’s gone) and will make complying with the OSA simpler. I suspect there’s a way to navigate those waters, but I’m focused on our own needs to get this boxed off first, and then I’ll have a ponder on that topic. We’ve been discussing this with DGR for a while now and that is the main hurdle we’ve struggled with.
It’s like porn spam - it gets flagged up quickly and dealt with. One bonus of federation is that if it is removed from the home instance it is gone for us too, so it often gets sorted before we even realise there’s a problem.
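For context on why that works: in ActivityPub-style federation, the home instance sends a Delete activity to its subscribers and each receiving instance drops its cached copy. A rough sketch of the receiving side (hypothetical names, not Lemmy’s actual handler):

```python
# Rough sketch of why origin-side removal cleans up federated copies:
# the home instance federates a Delete activity and receivers drop
# their local cache. Hypothetical names, not Lemmy's actual handler.
local_copies: dict[str, dict] = {}  # object id/URL -> cached post

def handle_activity(activity: dict) -> None:
    # Per ActivityPub, Delete's "object" may be a bare id or an object.
    if activity.get("type") == "Delete":
        obj = activity.get("object")
        obj_id = obj if isinstance(obj, str) else obj.get("id")
        # Drop our cached copy; no local mod action needed.
        local_copies.pop(obj_id, None)
```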
The OSA document we’ll post is 50% risk assessment, 50% mitigation. The law accepts that, in spite of your best efforts, shit happens; the more important thing is what processes and systems are in place to deal with it quickly and efficiently when it does.
swizzlestick@lemmy.zip 2 days ago
Solid.
Lots of thought has clearly gone into it already, and I hope you do well.
I like my home here, nsfw and all, but I’ll still likely end up making an alt over on feddit.uk as per the invite way up there. Linked by bio both ways as per your rules ofc.
That’s it, there are no more question marks and you can have your peace now 😅
Emperor@feddit.uk 2 days ago
And that’s the key to the legislation. Small sites need only read the summary documents and then run through the online tools. What it is designed to do, at least for us, is make us think about our systems and processes, tighten things up (we needed more contact email addresses) and state more explicitly what we are doing.
The bulk of the legislation is aimed at the large social media firms to make sure they look after their users - TikTok is currently being sued by British parents because the algorithm seems to have promoted a dangerous challenge that killed their kids (the OSA gets a mention in some news articles about it).