Not a good look for Mastodon - what can be done to automate the removal of CSAM?

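On the question in the title: the usual automated approach on large platforms is hash matching, where every upload is hashed and checked against lists of known abuse material distributed by clearinghouses such as NCMEC (production systems like PhotoDNA use perceptual hashes so re-encoded copies still match). The sketch below is a minimal, hypothetical illustration of the simpler exact-hash variant only; KNOWN_HASHES and the function names are placeholders, not a real moderation API, and exact hashing inherently misses novel or AI-generated images, which is part of what the comments below are arguing about.

```python
import hashlib
from pathlib import Path

# Hypothetical set of SHA-256 digests of known abuse material, e.g. synced
# from an industry hash list. How such lists are obtained and kept current
# is out of scope for this sketch.
KNOWN_HASHES: set[str] = set()


def sha256_of_file(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def should_remove(path: Path) -> bool:
    """Flag an upload for removal if its digest matches the known-hash list."""
    return sha256_of_file(path) in KNOWN_HASHES
```

A real deployment would hook a check like this into the media upload pipeline and queue matches for removal and reporting rather than deciding silently; perceptual hashing or classifier-based scanning would be needed to catch anything not already on a hash list.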
  • JBloodthorn@kbin.social (+17/−1) · 1 year ago

    The content in question is unfortunately something that has become very common in recent months: CSAM (child sexual abuse material), generally AI-generated.

    AI is now apparently generating entire children, abusing them, and uploading video of it.

    Or, they are counting “CSAM-like” images as CSAM.

    • docrobot@lemmy.sdf.org (+17/−4) · 1 year ago

      Of course they’re counting “CSAM-like” images in the stats; otherwise they wouldn’t have any stats at all. In any case, they don’t really care about child abuse. What they care about is that a platform exists that they haven’t been able to wrap their slimy tentacles around yet.