Looking through the #OnlineSafetyAct stuff for small providers and it seems… pretty reasonable?
-
I want to double-check that my interpretation aligns with the latest docs from Ofcom, but my starting point is that running a service for only one's own use is out of scope. (It makes no sense for it to be otherwise, IMHO.)
Add someone else as a (human) user? My view is that that service, and that admin, are now in scope of the core obligations.
-
@Edent And yet at the same time we have reports such as this:
https://social.treehouse.systems/@dee/113662184456889247 which take a very different point of view. Probably because park benches are not so controversial?
-
@solarisfire @neil
Single user, doesn't look like it.
https://www.ofcom.org.uk/siteassets/resources/documents/online-safety/information-for-industry/illegal-harms/overview-of-regulated-services.pdf?v=387540
But, be aware, if someone replies to you with an unwanted / illegal image, it will probably be stored in your cache and - depending on your configuration - be viewable on the web.
-
@Edent We processed 4.6M images on mastodon.social in the last month. My small instance with 15 active users processed 50k of them. There is no way to scan that volume without greatly increasing costs by using a third-party service. And you cannot do it yourself, because the CSAM databases are all under NDAs and require a lot of legal agreements before you can even get access to the algorithm (which is very closed source).
-
@jalal TBH, that sounds like someone who is burnt out and looking for a reasonable excuse to get out.
Or someone who only reads reactionary commentary without understanding what's going on.
Should a bike forum with lots of users have a "report dodgy content" button? Yeah, probably.
-
@renchap the hashing itself seems open source - https://github.com/facebook/ThreatExchange/tree/main/pdq - but, yes, the DBs look pretty locked down.
That said, I'm not sure I would want to run a service with that many images without some sort of protection - whether it was a legal requirement or not.
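For anyone wondering what the open part looks like mechanically, matching boils down to "compute a 256-bit perceptual hash of the image, then check its Hamming distance against a list of known hashes". Here's a minimal sketch, assuming the third-party pdqhash Python bindings plus Pillow and numpy; the hash list and threshold are illustrative placeholders, because the vetted databases are exactly the locked-down part:

```python
# Rough sketch of PDQ-style matching: hash the image, then compare the
# Hamming distance against a list of known hashes. Assumes the third-party
# `pdqhash` bindings (pip install pdqhash) plus Pillow and numpy.
import numpy as np
import pdqhash
from PIL import Image

HAMMING_THRESHOLD = 31  # commonly cited PDQ match distance (out of 256 bits)

# Placeholder list: a real deployment would load vetted hashes from a provider.
KNOWN_BAD_HASHES = [
    np.zeros(256, dtype=np.uint8),  # hypothetical entry, not a real hash
]

def hash_image(path: str) -> np.ndarray:
    """Return an image's 256-bit PDQ hash as a vector of 0s and 1s."""
    rgb = np.asarray(Image.open(path).convert("RGB"))
    hash_vector, quality = pdqhash.compute(rgb)
    return hash_vector

def matches_known_list(path: str) -> bool:
    """True if the image is within the Hamming threshold of any known hash."""
    h = hash_image(path)
    return any(int(np.sum(h != known)) <= HAMMING_THRESHOLD
               for known in KNOWN_BAD_HASHES)
```

Which is why the algorithm being open doesn't get you very far on its own; the gatekept hash lists are what make it useful.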
-
@Edent PDQ is something that Meta has been pushing, but from the information I got, you want PhotoDNA, and its algorithm is closed source (and under NDA).
I am not saying that the current situation is ideal (even if, in our experience, CSAM on well-moderated Mastodon servers is very infrequent), but mandating content scanning for every website with UGC is the death of 99% of them. And the "free quotas" that you mentioned do not help at all except for the tiniest sites.
-
@renchap very true. But the guidance seems to only mandate scanning if you assess the risk as needing it.
If your risk assessment shows that a well-moderated site has a low likelihood, then you won't need it.
(From my quick reading of the guidance.)
-
@solarisfire @Edent There is perhaps also a question of legislative intent / risk, and whether this is designed to tackle, or whether Ofcom will attempt to enforce against, solo users whose services can be abused in this way. FWIW, I'm unconvinced that this brings a solo-user instance in scope, but I can absolutely see why someone might want to address it anyway.
-
@neil @Edent I also don't believe it's just images from users who reply to you that get stored in the cache; I think it could extend to caching images posted to just about any timeline the instance is aware of. I only follow 446 people, my wife 55. We don't get replied to that often, yet the instance is using over 300GB in cache. According to the DB my instance is aware of 242,975 accounts and is storing 1,131,584 media attachments. That's not just from people replying to me!
-
@solarisfire I don't understand why you'd configure it to store that much media from people you don't know.
Why not set the cache to be zero (or a few hundred MB)?
There's no advantage in continually storing other people's content.
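(For anyone in the same boat, here's a rough sketch of keeping that cache small, assuming a fairly standard from-source install. `tootctl media remove` is Mastodon's own cache-pruning command, but the install path and retention period below are placeholders, and you could equally just run the command from cron directly.)

```python
# Rough sketch: prune Mastodon's cache of remote media on a schedule by
# calling `tootctl media remove --days N`. The path and retention period
# are assumptions; adjust them to your own setup.
import os
import subprocess

MASTODON_DIR = "/home/mastodon/live"  # hypothetical install location
RETENTION_DAYS = 3                    # days of remote media to keep cached

def prune_media_cache() -> None:
    """Ask tootctl to delete cached remote media older than the retention period."""
    subprocess.run(
        ["bin/tootctl", "media", "remove", f"--days={RETENTION_DAYS}"],
        cwd=MASTODON_DIR,
        env={**os.environ, "RAILS_ENV": "production"},
        check=True,
    )

if __name__ == "__main__":
    prune_media_cache()
```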
-
@Edent @jalal for sure, and for those of us old enough and grumpy enough to have been through this a few cycles, it's probably fine (let me show you my thick IR35 file).
But for the (majority of?) folk who aren't legally inclined, the choices appear to be (a) shut up shop, (b) spend significant money on a lawyer explaining wtf, or (c) risk terrifying fines on the basis that it *probably* won't come to pass.
I have complete sympathy for those who choose to walk away.
-
@Edent @jalal who said anything about large? AIUI you don't have a get-out-of-jail-free card just because you have a dozen users?
And there were plenty of people nervously eyeing the exit when GDPR came along. Fortunately (or not) all the "SEO experts" told them they could just stick up a consent banner and everything would be fine...
-
@ahnlak there's no point us continuing this discussion if you haven't read the guidance.
Muting this thread now. Feel free to come back when you've read and understood it.