ChatGPT is full of sensitive private information and spits out verbatim text from CNN, Goodreads, WordPress blogs, fandom wikis, Terms of Service agreements, Stack Overflow source code, Wikipedia pages, news blogs, random internet comments, and much more.

Using this tactic, the researchers showed that there are large amounts of personally identifiable information (PII) in OpenAI’s large language models. They also showed that, on a public version of ChatGPT, the chatbot spat out large passages of text scraped verbatim from other places on the internet.

“In total, 16.9 percent of generations we tested contained memorized PII,” they wrote, which included “identifying phone and fax numbers, email and physical addresses … social media handles, URLs, and names and birthdays.”

Edit: The full paper that’s referenced in the article can be found here
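As background for the discussion below: one rough way to quantify "generations containing PII" is to scan model output for PII-like patterns. This is only an illustrative sketch — the regexes and the function name are mine, not the paper's actual detection pipeline:

```python
import re

# Illustrative patterns only; real PII detection is more involved
# than a couple of regexes (international formats, names, addresses...).
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "us_phone": re.compile(
        r"(?<!\d)(?:\+1[ .-]?)?\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}(?!\d)"
    ),
}

def flag_pii(generation: str) -> dict[str, list[str]]:
    """Return PII-like substrings found in a model generation, keyed by type."""
    return {
        kind: pattern.findall(generation)
        for kind, pattern in PII_PATTERNS.items()
        if pattern.findall(generation)
    }

sample = "Contact Jane at jane.doe@example.com or (555) 123-4567."
hits = flag_pii(sample)
```

A hit from a matcher like this still says nothing about whether the string is real or private — which is exactly the ambiguity several commenters below pick at.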

    • Chozo@kbin.social · 7 months ago

      I’d have to imagine that this PII was made publicly available in order for GPT to have scraped it.

        • Chozo@kbin.social · 7 months ago

          It also doesn’t mean it inherently isn’t free to use, either. The article doesn’t say whether or not the PII in question was intended to be private or public.

          • Davel23@kbin.social · 7 months ago

            I could leave my car with the keys in the ignition in the bad part of town. It’s still not legal to steal it.

            • Chozo@kbin.social · 7 months ago

              Again, the article doesn’t say whether or not the data was intended to be public. People post their contact info online on purpose sometimes, you know. Businesses and shit. Which seems most likely to be what’s happened, given that the example has a fax number.

            • Dran@lemmy.world · 7 months ago

              If someone had some theoretical device that could x-ray, 3d image, and 3d print an exact replica of your car though, that would be legal. That’s a closer analogy.

              It’s not illegal to reverse-engineer and reproduce something for personal use. Selling the reproduction, though, is of questionable legality. However, if the car were open-source or otherwise not copyrighted/patented, it probably would be legal to sell the reproduction.

          • RenardDesMers@lemmy.ml · 7 months ago

            According to EU law (the GDPR), PII must be accessible, modifiable, and deletable by the people it describes. I don’t think ChatGPT would allow me to delete information about me found in their training data.

            • Touching_Grass@lemmy.world · 7 months ago (edited)

              Ban all European IPs from using these applications.

              But again, is this your information, as in random individuals’, or is it really some company roster listing CEOs that it grabbed off some third-party website none of us are actually on, being passed off as if it were regular folks’ information?

              • Catoblepas@lemmy.blahaj.zone · 7 months ago

                “Just ban everyone from places with legal protections” is a hilarious solution to a PII-spitting machine, thanks for the laugh.

                  • Touching_Grass@lemmy.world · 7 months ago (edited)

                  You’re pretentiously laughing at region locking, which has been around for a while. You can’t untrain these AIs. This PII, which has always been publicly available and only seems to be an issue now, isn’t something they can pull out and retrain without. So if it’s that big an issue, region-lock them. Fuck ’em. But again, this doesn’t sound like Joe Blow’s information being out there. It seems more like websites that scrape company details, which these AIs then scrape in turn.

      • Touching_Grass@lemmy.world · 7 months ago

        large amounts of personally identifiable information (PII)

        Yeah, the wording is kind of ambiguous. Are they saying it’s a private phone number, or the number of a Ted and Sons Plumbing and Heating?

    • Turun@feddit.de · 7 months ago

      I’m curious how accurate the PII is. I can generate strings of text and numbers and say that it’s a person’s name and phone number. But that doesn’t mean it’s PII. LLMs like to hallucinate a lot.
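One way to pin that down is to count a string as memorized — rather than hallucinated — only if a long-enough span of it appears verbatim in a reference corpus. A toy sketch; the 50-character window, the function name, and the example corpus are my choices, not any paper’s exact criterion:

```python
def is_memorized(generation: str, corpus: str, window: int = 50) -> bool:
    """True if some `window`-char span of `generation` occurs verbatim in `corpus`.

    Short coincidental overlaps don't count: a hallucinated name or
    phone number is unlikely to reproduce a long span of source text exactly.
    """
    if len(generation) < window:
        return generation in corpus
    return any(
        generation[i:i + window] in corpus
        for i in range(len(generation) - window + 1)
    )

# Hypothetical reference corpus standing in for scraped training text.
corpus = (
    "Jane Doe is a plumber at Ted and Sons Plumbing and Heating, "
    "reachable at the shop most weekday mornings before ten."
)
```

Under this check, a generation that copies a long run of the corpus counts as memorized, while an invented name-and-number pair does not — which is the distinction between leaked PII and a hallucination.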

    • BraveSirZaphod@kbin.social · 7 months ago

      There’s also very large copyright implications here. A big argument for AI training being fair use is that the model doesn’t actually retain a copy of the copyrighted data, but rather is simply learning from it. If it’s “learning” it so well that it can spit it out verbatim, that’s a huge hole in that argument, and a very strong piece of evidence in the unauthorized copying bucket.

    • casmael@lemm.ee · 7 months ago

      Well now I have to pii again - hopefully that’s not regulated where I live (in my house)