Moderation:
-
@mighty_orbot
¯\_(ツ)_/¯ This conversation has reached my personal limit for aggressively missing the point.
-
Paul Cantrell replied to Sara Joy :happy_pepper: last edited by
There’s a counter-point here, and a strong one, about the value of not isolating people. I think of all the queer kids out there who suffered thinking they’re the only ones, and found hope and community online. I think of how, despite caring my whole life about racial justice, I completely missed huge swaths of the reality of racism in the US until I heard Black people discussing their lived experiences online. All of that, I want.
-
Completely agree with @Linza. This is a solvable problem that's not getting solved because companies don't want to solve it. They don't care and/or don't want it to influence the bottom line. Lack of moderation is making them money, so there's no incentive to take proper care of it. So incentivise them.
It's also definitely not philosophically intractable. Behavior and speech are moderated through laws and societal norms on a daily basis and we're pretty ok with that.
-
Sara Joy :happy_pepper: replied to Paul Cantrell last edited by
@inthehands @mainec oh totally agreeeeee
I have learned so much social good from the internet, and I think the younger generations are also more open minded and free of a lot of traditional societal bullshit because of it.
But while I love that I have been able to find *my* people, those of like mind, we have similar feelings and problems and loves and values - well, everyone else gets to do that too. And some of those people have very different values - some even harmful "values" - to mine.
-
@usmu @Linza
I might have said some of the same 10 years ago. My thinking has changed. Among other things, the moderation failures that well-meaning, hard-working, and definitely not profit-motivated instance admins have experienced right here on the Fedi suggest this “they just don’t want to solve it” view is naive. Not to excuse Meta et al for being awful. But I do think the problem is intrinsically difficult, and not pure negligence. Remember this: https://hachyderm.io/@inthehands/113043105185561409
-
Paul Cantrell replied to Sara Joy :happy_pepper: last edited by
-
@Linza @sandofsky
Please take a moment to read the article and think about whether higher pay would make it ethical to do to people what Meta did to these Kenyans.
-
@datarama
Fair, but also: https://hachyderm.io/@inthehands/113182972198853236
-
@inthehands @sandofsky I'm 99% certain you know what I mean but if you want to hang on to "labor will always be abused so there can be no humans employed ever" then weird hill to die on but OK.
-
@Linza
That is not at all what I’m saying, and is a ridiculous strawman. I’m saying some jobs are so toxic in their current form that they simply should not be done by humans. Asking people to, say, clean up nuclear waste before the invention of protective gear is not just a labor problem.
So no, I have no idea what the hell you’re saying.
-
@inthehands I also don't have answers.
But I think there must be a way for people (including vulnerable people) to find community *without* the "everything you say will be scrutinized by 150000 hostile strangers (and, now, robots)" model.
-
That's why moderation needs to be properly supported, both with tools that make moderation possible and with education on what proper moderation entails for everybody involved. This is not being done (enough). The more proper moderation gets normalised, the easier it gets for moderation to be done well, because it disincentivises unwanted behavior. Let's start by getting FB et al to take this seriously. By no means perfect, but it seems a way to make some quick gains.
1/2
-
@datarama
Yup.
-
To be honest, I find the insinuation that this is a "couldn't you just..." denigrating. What I'm saying is that if you're in a business that requires moderation, you need to take that requirement seriously. If you don't, you need to be forced to do so. This takes work on a lot of fronts, but let's start doing it instead of complaining that it's hard.
2/2
-
Paul Cantrell replied to usmu 🍋🏳️🌈🤟 last edited by [email protected]
@usmu @Linza I do agree that FB et al should be forced to take this seriously — and forced to back off their growth plans if they don’t have a moderation strategy that can support them.
I’m still •extremely• skeptical that “This is a solvable problem that's not getting solved because companies don't want to.” It may be solvable, but I don’t think •anyone• actually knows how to solve it at scale. Yet. There isn’t some suppressed cure for cancer here, as it were.