• jeffw@lemmy.worldOPM · 1 month ago

    I think that’s a bit of a stretch. If it was being marketed as “make your fantasy, no matter how illegal it is,” then yeah. But just because I use a tool someone else made doesn’t mean they should be held liable.

    • over_clox@lemmy.world · 1 month ago

      Check my other comments. My point was a comparison to a hammer.

      Hammers aren’t trained to act or respond on their own from millions of user inputs.

      • FaceDeer@fedia.io · 1 month ago

        Image AIs also don’t act or respond on their own. You have to prompt them.

        • over_clox@lemmy.world · 1 month ago

          And if I prompted an AI for something inappropriate and it gave me a relevant image, then that means the AI had inappropriate material in its training data.

          • FaceDeer@fedia.io · 1 month ago

            No, you keep repeating this but it remains untrue no matter how many times you say it. An image generator is able to create novel images that are not directly taken from its training data. That’s the whole point of image AIs.

            • xmunk@sh.itjust.works · 1 month ago

              An image generator is able to create novel images that are not directly taken from its training data. That’s the whole point of image AIs.

              I just want to clarify that you’ve bought the Silicon Valley hype for AI, but that is very much not the truth. It can create nothing novel - it can merely combine concepts and themes and styles in an incredibly complex manner… but it can never create anything novel.

            • over_clox@lemmy.world · 1 month ago

              What it’s able and intended to do is beside the point, if it’s also capable of generating inappropriate material.

              Let me spell it out more clearly. AI wouldn’t know what a pussy looked like if it was never exposed to that sort of data set. It wouldn’t know other inappropriate things if it wasn’t exposed to that data set either.

              Do you see where I’m going with this? AI only knows what people allow it to learn…

              • FaceDeer@fedia.io · 1 month ago

                You realize that there are perfectly legal photographs of female genitals out there? I’ve heard it’s actually a rather popular photography subject on the Internet.

                Do you see where I’m going with this? AI only knows what people allow it to learn…

                Yes, but the point here is that the AI doesn’t need to learn from any actually illegal images. You can train it on perfectly legal images of adults in pornographic situations, and also perfectly legal images of children in non-pornographic situations, and then when you ask it to generate child porn it has all the concepts it needs to generate novel images of child porn for you. The fact that it’s capable of that does not in any way imply that the trainers fed it child porn in the training set, or had any intention of it being used in that specific way.

                As others have analogized in this thread, if you murder someone with a hammer that doesn’t make the people who manufactured the hammer guilty of anything. Hammers are perfectly legal. It’s how you used it that is illegal.

                • over_clox@lemmy.world · 1 month ago

                  Yes, I get all that, duh. Did you read the original post title? CSAM?

                  I thought you could catch a clue when I said “inappropriate.”

                    • FaceDeer@fedia.io · 1 month ago

                    Yes. You’re saying that the AI trainers must have had CSAM in their training data in order to produce an AI that is able to generate CSAM. That’s simply not the case.

                    You also implied earlier on that these AIs “act or respond on their own”, which is also not true. They only generate images when prompted to by a user.

                    The fact that an AI is able to generate inappropriate material just means it’s a versatile tool.