This is entirely the fault of the IWF and Microsoft, who create “exclusive” proprietary CSAM-prevention software and then license it only to big tech companies.

  • Anomander@kbin.social · 23 points · 1 year ago

    Putting the blame on Microsoft or the IWF is missing the point.

    People were responsible for moderating what showed up on their forums or servers for years before these tools existed, and people have kept doing so since. Neither the tool nor its absence is responsible for child porn getting posted to Fediverse instances. If those shards won’t take action against CSAM now, what good would the tool do? We can’t run it here and have it go delete content from someone else’s box.

    While those tools would make some enforcement significantly easier, the fact that enforcement isn’t meaningfully happening on all instances isn’t something we can pin on Microsoft.

  • 👁️👄👁️@lemm.ee · 15 points · 1 year ago

    The same people who are mad at Meta for scraping already-public information are now mad at Microsoft for not forcing its way into the fedi to scan all private and public content? Consistent viewpoints are hard!

  • OsrsNeedsF2P@lemmy.ml · 12 points · edited · 1 year ago

    Publishing a list of hashes would make it trivial for abusers to know when their images are being flagged. It would be better to get M$ to do the scanning work themselves.
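
    (A sketch of why a public list cuts both ways: once the hashes are out, anyone can run the very check a moderator would. Below, SHA-256 stands in for PhotoDNA’s proprietary perceptual hash and published_hashes is a hypothetical list; neither reflects how Microsoft’s actual service works.)

    ```python
    # Sketch only: SHA-256 stands in for PhotoDNA's proprietary perceptual
    # hash, and published_hashes is a hypothetical published list.
    import hashlib

    published_hashes = {
        # SHA-256 of b"test", included so the demo below has a match
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def is_flagged(image_bytes: bytes) -> bool:
        """Return True if this file's hash appears on the published list."""
        return hashlib.sha256(image_bytes).hexdigest() in published_hashes

    # A moderator can use this to block known files, but an abuser can run
    # the exact same check locally and re-encode until it returns False.
    print(is_flagged(b"test"))  # True
    ```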

    • HelixDab@kbin.social · 6 points · 1 year ago

      Bingo. It would also make it trivial to alter images just enough that they no longer match the hash, and then they can post shit that would need to be manually flagged and removed.

      I already see things like this with pirated media; pirates will include extraneous material bundled with the target media so that it’s not automatically flagged and removed.
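
      (To make that concrete: with an exact hash, changing a single byte of a file produces an unrelated digest, so naive matching is trivially dodged. Perceptual hashes such as PhotoDNA are designed to survive small edits, which is part of why the algorithm is kept under wraps. A minimal demonstration:)

      ```python
      # With an exact (cryptographic) hash, one flipped byte yields a
      # completely different digest; SHA-256 is used purely for illustration.
      import hashlib

      original = bytes(1000)           # 1000 zero bytes, standing in for an image file
      altered = bytes(999) + b"\x01"   # the same file with its last byte changed

      print(hashlib.sha256(original).hexdigest())
      print(hashlib.sha256(altered).hexdigest())  # bears no resemblance to the first
      ```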

    • BootlegHermit@kbin.social · 1 point · 1 year ago

      To me it seems like a push towards the whole “own nothing” idea. Whether it’s something like CSAM detection or even mundane SaaS, things are slowly shifting away from the end user having control over their “own” devices.

      I’m torn, because on the one hand, pedophiles and child abusers deserve the severest of consequences in my opinion; on the other hand, I also think that people should be able to do and/or say whatever they want so long as it’s not causing actual harm to another.

      • elscallr@kbin.social · 1 point · 1 year ago

        It’s much more likely a matter of keeping their detection technology out of the hands of people who would want to circumvent it.