Developmental informatics hacker

  • 0 Posts
  • 6 Comments
Joined 1 year ago
Cake day: July 30th, 2023


  • Maybe it would be good to know more about this leakage. Are these isolated communities? I personally haven’t encountered any CSAM so far; the only thing I’ve seen was a transphobe, and they were banned quickly.

    And about subjecting moderators to bad stuff: is that really true? Why would anyone keep posting CSAM in a place where it is quickly removed and their accounts are banned?

    ATM it seems to me like these are isolated instances?


  • Perhaps this technical approach is the wrong way entirely. In a scale-free network it might seem like a good approach because of the seemingly infinite number of edges the hub nodes serve (YouTube, Twitter). The numbers are so large that you inevitably reach for a technical solution.

    However, a network can be laid out in a way that is more conducive to meaningful moderation. By “meaningful” I mean having people involved rather than algorithms. This requires small-world communities with influential core members or moderators.

    This allows for wider, more inclusive, and more nuanced moderation. For example, I assume that YouTube detects and removes CSAM, yet it still hosts CSAM-like content because that content is legal, even though a human moderator would likely filter it. Likewise, issues such as transphobia are not legal problems and thus are not properly moderated. On the flip side, content gets removed that has nothing wrong with it. When different communities create their own meaning through values, and principles based on those values, we get more diversity, and that allows for social progress in the long run.

    Lemmy’s federated structure might lend itself to this.

    Of course this ignores communities that break off, do their own thing, and polarize into more extreme forms. I feel that is a different problem that requires a different solution.

    Excuse me for being all over the place with this post, but I have to run :)


  • Thanks for the thought you put into your answer.

    I’ve been thinking: CSAM is just one of the many problems communities face. For example, YouTube is unable to moderate transphobia properly, which has significant consequences as well.

    Let’s say we had an ideal federated copy of the existing system. It would still not detect many other types of antisocial behavior. All I’m saying is that the existing approach by M$ feels like it’s based on moral tunnel vision, trying to solve complex human social issues with some kind of silver bullet. It lacks nuance. In fact, this is a community management issue.

    Honestly, I feel it’s really a matter of having manageable communities with strong moderation, plus the ability to report anonymously in case one becomes involved in something bad and wants out.

    Thoughts?