Opened two new Mastodon issues with ideas that could improve moderation processes, would be interested in hearing what people think:
- https://github.com/mastodon/mastodon/issues/32962
- https://github.com/mastodon/mastodon/issues/32963

I've also been working on other admin/moderation pages, and that work will hopefully move forwards and land soon-ish.
-
@thisismissem oooh that second one is really good. Restorative af.
On the first one, is the message meant to convey that the moderators have been told and are deciding what to do? Because as written that message reads to me as “we know, so stop telling us, we may or may not do anything about it.” Maybe something like “this post/comment has been reported to the admins of thing.blob”
-
@RomanceReviews yeah, the former: “we're reviewing this post and handling it, no need to report it further”
e.g. account freezes aren't public information, so when we freeze someone's account, it doesn't make their content disappear. So if we freeze and request the user to take down the offending post, the post remains up during that process.
-
@RomanceReviews wording is obviously totally open for change, it's just a ticket to get the ball rolling. It's also important to note that this isn't automatic from something being reported, but rather manually applied by the moderator to specific posts.
-
@thisismissem Regarding 32962, I like that this also reduces the probability of something being reported multiple times and bloating reports. If public, it makes it possible to see how long mods take to handle something. That said, I wonder if there could be a baseline delay - if it's too quick after being reported, one could semi-guess who has filed a report (?), something which can also contribute to escalation or harassment.
Would it work for very long posts? The start of an affected thread?
-
@thisismissem @RomanceReviews I’m not familiar with how Mastodon handles moderation, so this may be a stupid question:
Is there value in posts being reported more than once? Is it just a binary ‘this has been reported’ signal or do moderators prioritise things that are reported by multiple people? For example, I can imagine a malicious person going and just reporting a hundred posts that they don’t like, but if a dozen people report a post then it’s more likely to be spam (assuming the dozen reports are not from sockpuppets). If something is already reported, is it useful to have a single click to say ‘I agree with the report’? And the ability to learn accounts that tend to agree with the instance moderators’ views that can be a quick-to-review list?
-
@MartinVuilleme the idea is that this is manually added by the moderators, not automatically based on reporting or report age.
-
@david_chisnall @RomanceReviews to be able to agree with a report would mean exposing the report contents publicly. So yes, we have to accept multiple reports.
In the report list, they are all grouped by reported account, such that you can see the multiple reports all together.
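The grouping described above can be sketched roughly like this (a minimal Python illustration; the field names and report records are hypothetical, not Mastodon's actual schema):

```python
from collections import defaultdict

# Hypothetical report records; field names are illustrative only.
reports = [
    {"id": 1, "reported_account": "@spammer@example.com", "comment": "spam"},
    {"id": 2, "reported_account": "@spammer@example.com", "comment": "more spam"},
    {"id": 3, "reported_account": "@troll@example.net", "comment": "harassment"},
]

def group_by_account(reports):
    """Group individual reports under the account they target,
    so a moderator sees all reports about one account together."""
    grouped = defaultdict(list)
    for report in reports:
        grouped[report["reported_account"]].append(report)
    return dict(grouped)

grouped = group_by_account(reports)
# Both reports against @spammer@example.com appear together:
print(len(grouped["@spammer@example.com"]))  # 2
```

This is just a sketch of the grouping idea, not how Mastodon's admin UI is implemented.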
-
@thisismissem tangential, I opened a PR a while ago about report-forwarding, and it's had nobody look at it in the months since. What do I need to do to get someone to take a look?
https://github.com/mastodon/mastodon/pull/31816
-
Emelia 👸🏻 replied to Greg:
@greg i've been working on it, I just had to pause work due to the 4.3 release and needing to have heart surgery.
Specifically I'm working on the admin side of report forwarding. I do think we're unlikely to add disabling of all report forwarding.
-
@thisismissem thanks for the update. I think forwarding is generally valuable, but yeah, the basic assumption of "every other admin is also reasonable" has proven false... Mine was a quick change; if something else is in the works (e.g. admin review of reports before auto-forward) then that would obviously be better.
-
@greg well, you can't just disable it completely, as otherwise it adds load on admins to find actually problematic reports and refile them manually, which can also only work for public posts.