Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • HughJanus@lemmy.ml · 1 year ago

    I don’t know what you do about problem #1, though.

    Well, the simple answer is that content doesn’t have to be illegal for admins to remove it.

    The legal question is a lot harder, considering that AI image generation has reached a level that is almost indistinguishable from reality.

    • sugar_in_your_tea@sh.itjust.works · 1 year ago (edited)

      In which case, admins should err on the side of caution and remove anything that might be illegal.

      I personally would prefer to have nothing remotely close to CSAM, but as long as children aren’t being harmed in any conceivable way, I don’t think it would be illegal to post art depicting children. Communities should absolutely manage things however they think best, though.

      In other words, I don’t think #1 is a problem at all; imo, things should only be illegal if there’s a clear victim.