Generative AI-powered tools are making it easy to create deepfakes, and lawyers are trying to seize the opportunity.

  • conciselyverbose@kbin.social · 10 points · 1 year ago

    The reality is that we’re past the point where videos constitute useful evidence unless you can strongly demonstrate the integrity of the source against manipulation. You have to have a credible witness say “I taped this” or a security camera you are extremely confident has not been breached, etc.

    Video on its own is not sufficient given where the technology is. You can still find artifacts in the output of most public tools, but it’s close enough that pixel-perfect fakes are entirely plausible, especially for lower-resolution or heavily compressed videos.

  • Saganastic@kbin.social · 5 points · edited · 1 year ago

    I think it would be great if we had digital signatures for videos, sort of like HTTPS, or a signed exe on Windows. A video could be cryptographically signed at the time of creation. There could then be some level of confidence that if you’re watching a signed video, it’s an unadulterated original copy created by, say, Peter’s iPhone on July 5, 2023 at 11 pm.
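
    A minimal sketch of that idea, assuming Python’s cryptography package and a device-held Ed25519 key; the file name and key handling are illustrative, not any existing camera API:

    ```python
    # Hypothetical per-clip signing: the capture device signs a digest of the
    # raw video bytes, and anyone with the device's public key can verify it.
    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def file_digest(path: str) -> bytes:
        """SHA-256 over the file, so we sign a short digest instead of the whole clip."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.digest()

    # At capture time (inside the device): sign the digest with the private key.
    device_key = Ed25519PrivateKey.generate()   # never leaves the device
    signature = device_key.sign(file_digest("clip.mp4"))

    # Later (anywhere): verify against the device's published public key.
    public_key = device_key.public_key()
    try:
        public_key.verify(signature, file_digest("clip.mp4"))
        print("valid: the bytes match what the device signed")
    except InvalidSignature:
        print("invalid: the file was altered or signed by a different key")
    ```

    The signature only proves the bytes are unchanged since signing; trusting the “Peter’s iPhone on July 5, 2023” part still requires the device’s key to be certified by someone, much like HTTPS certificate chains.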

    • bobtheowl2@kbin.social · 4 points · 1 year ago

      That will get tricky: the second you upload a video to YouTube or any cloud service, it gets transcoded into dozens of differently sized and formatted versions, which effectively strips all of that out. But I’m sure we could still come up with some solution that uniquely identifies the source somehow.
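
      One possible direction (purely my own illustration, not something byte-level signing gives you) is a perceptual fingerprint computed from the content itself, which tolerates re-encoding where a cryptographic hash does not. A sketch using an average hash over a single frame, assuming Pillow and placeholder frame images:

      ```python
      # Content-level fingerprint (average hash): downscale, grayscale, and
      # threshold against the mean, so mild transcoding barely changes the bits.
      from PIL import Image

      def average_hash(path: str, size: int = 8) -> int:
          """64-bit fingerprint of a frame: 1 bit per pixel above the mean brightness."""
          img = Image.open(path).convert("L").resize((size, size))
          pixels = list(img.getdata())
          mean = sum(pixels) / len(pixels)
          bits = 0
          for i, p in enumerate(pixels):
              if p > mean:
                  bits |= 1 << i
          return bits

      def hamming(a: int, b: int) -> int:
          """Number of differing bits; a small distance suggests the same content."""
          return bin(a ^ b).count("1")

      # The same frame before and after transcoding should stay within a few bits;
      # an edited or unrelated frame drifts much further apart.
      original = average_hash("frame_original.png")
      reencoded = average_hash("frame_transcoded.jpg")
      print(hamming(original, reencoded))
      ```

      Production fingerprinting systems are far more robust than this toy hash, but the idea is the same: identify the source by what the video shows rather than by its exact bytes.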

  • style99@kbin.social · 2 points · 1 year ago

    The standard for legal proof has always been stronger on the motive/opportunity side than on the method side. This just lends some wind to those sails.