I've written about the new #OnlineSafetyAct rules from Ofcom and what they mean for small website owners.
I've tried to be reasonable in my interpretation and my criticism. Feedback welcome!
-
@Edent While most of the measures seem reasonable for small sites, the legal liability for not moderating hard enough could run to millions of pounds. I don't blame LFGSS for closing. The owner has received death threats. Those idiots will soon be able to use a law that could financially destroy the owner of LFGSS, at basically zero cost to themselves.
The law designed to hold big tech accountable probably won't, but it will ensure only big tech survives.
-
@dmaonR sure, but the same could be said of GDPR. The fines there can be substantial.
-
@Edent I generally agree.
I can understand why people are concerned about the risks to them, but for the most part, this seems to be a bit like GDPR - most of the effort needed went into initial assessments/documenting compliance for things you were already doing.
-
@Edent do you think this will affect usenet groups?
-
@chaz6 does Usenet even still exist?
If so, yeah. The people who host or run those groups should probably make sure it isn't a harmful environment.
(That's probably why it died. Just a toxic pit of old skool griefers from what I remember.)
-
@samueljohnson @ben
That's a reasonable point.
-
Major Denis Bloodnok replied to Terence Eden
@Edent @chaz6 Beyond filtering out binaries, it's completely impractical for someone who runs a news server ("hosts") to check what's posted; it would be like requiring the Royal Mail to check every letter.
No-one "runs" most newsgroups - they are unmoderated.
(To a degree this is ofc why USENET imploded, but "The people who host or run those groups should probably make sure it isn't a harmful environment" is pure fantasy. One group can't do it and the other group doesn't exist.)
-
Terence Eden replied to Major Denis Bloodnok
@denisbloodnok @chaz6
You haven't read the guidance, have you?There's no requirement to proactively scan every piece of content. You need to provide a mechanism for people to report / complain about harmful content.
Then you can moderate as you see fit.
-
Major Denis Bloodnok replied to Terence Eden
-
Terence Eden replied to Major Denis Bloodnok
@denisbloodnok
If you can find me a news server which meets Ofcom's definition of a large service, I'll be astonished.
-
@Edent I’m interested/worried as to how it will affect devs of small apps that share data between users. Using the online checker tool, an app that shares items on a shopping list between users (and maybe messages to pick up the eggs) falls completely within scope. A lone dev deciding whether to comply with all the regulatory burdens or just leave the UK market has a fairly easy choice. We’re going to see a balkanisation of app markets; many apps will be removed from sale in the UK.
-
@ColGrenfell If they're small (under 7 million users), the burden is minimal. I've just gone through it and it's all basic stuff (albeit badly communicated).
Frankly, if an app developer can't put a "report content" button in their app, I'd question their ability to operate at all.
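For anyone wondering what that amounts to in practice, here's a rough sketch of the kind of "report content" endpoint a solo dev might wire a button up to. It assumes a Flask backend; the route and field names are hypothetical, not anything prescribed by Ofcom or taken from a real app.

```python
# Hypothetical sketch: a minimal "report content" endpoint behind an in-app button.
# Route and field names are illustrative only.
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)

# In a real app this would be a database table; a list keeps the sketch short.
reports = []

@app.route("/report", methods=["POST"])
def report_content():
    """Accept a complaint about a piece of user-generated content."""
    payload = request.get_json(force=True)
    report = {
        "content_id": payload.get("content_id"),    # the item being reported
        "reason": payload.get("reason", ""),        # free-text complaint
        "reporter_id": payload.get("reporter_id"),  # optional; anonymous reports are fine
        "received_at": datetime.now(timezone.utc).isoformat(),
        "status": "open",                           # awaiting a moderation decision
    }
    reports.append(report)
    # Acknowledge receipt; the moderation decision happens later, by a human.
    return jsonify({"status": "received"}), 202
```

That's the scale of the engineering: accept the complaint, keep a record, deal with it.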
-
@Edent it’s just something I’ve not had any experience of, and I know there’s a lot of catching up to do. So in my theoretical shopping app, when a user presses ‘report content’ for a private message sent between users, what does the solo dev then do? Send data to the police? To Ofcom? It’s easy to have one user block another, but what other legal obligations does the dev then have?
-
@ColGrenfell As with any UGC, they can either moderate, ban the user, tell the complainant they can block the user, or do nothing.
The guidance makes it clear that there is no legal obligation to report anything to the police / Ofcom (except under *very* limited circumstances).
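As a sketch of what those outcomes might look like when a report comes in (the action names are mine, not from the guidance, and nothing here is legally prescribed):

```python
# Hypothetical sketch of the possible outcomes for a report. Names are illustrative.
from enum import Enum

class ReportAction(Enum):
    REMOVE_CONTENT = "remove_content"  # moderate: take the item down
    BAN_USER = "ban_user"              # remove the offending account
    SUGGEST_BLOCK = "suggest_block"    # tell the complainant they can block the user
    NO_ACTION = "no_action"            # decide the content is fine and leave it up

def resolve_report(report: dict, action: ReportAction, note: str = "") -> dict:
    """Record a moderation decision against an open report."""
    report["status"] = "resolved"
    report["action"] = action.value
    report["note"] = note  # a short written record helps show the process works
    return report
```

The point is that the dev picks one of those and writes it down; there's no default pipeline to the police or Ofcom.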
-
We've done the #OnlineSafetyAct thing for @openbenches
It was pretty simple.
0. Assessed ourselves as small & low risk.
1. Put a "report" button on every page with UGC.
2. Have a page explaining our complaints policy.
3. …errr…
4. That's it. We think.
A small write-up at https://www.openbenches.org/blog/online-safety-act/
Thoughts?
-
@Edent it’s a shame you had to waste time doing this.
-
@ret I absolutely do not consider it a waste of time. It was really useful to think about how we keep our community safe.
-
@Edent @openbenches Thank you for sharing!