• AutoTL;DR@lemmings.world [bot]
    7 months ago

    This is the best summary I could come up with:


    The chatbot’s description says that it’s “Someone who helps with life difficulties.” Its profile picture shows a woman in a blue shirt with a short blonde bob, perched on the end of a couch, clipboard clasped in her hands, leaning forward as if listening intently.

    “I have a couple mental issues, which I don’t really feel like unloading on my friends, so I kind of use my bots like free therapy,” said Frankie, a 15-year-old Character.AI user from California who spends about one hour a day on the platform.

    The Verge conducted test conversations with Character.AI’s Psychologist bot that showed the AI making startling diagnoses: the bot frequently claimed to have “inferred” certain emotions or mental health issues from one-line text exchanges; it suggested diagnoses of several mental health conditions, such as depression or bipolar disorder; and at one point it suggested we could be dealing with underlying “trauma” from “physical, emotional, or sexual abuse” in childhood or teen years.

    A Character.AI user named Elias told The Verge that he uses the platform to role-play as an “anthropomorphic golden retriever,” going on virtual adventures where he explores cities, meadows, mountains, and other places he’d like to visit one day.

    The teenage girls pursuing relationships with chatbots based on Tom Riddle or Harry Styles or even aggressive Mafia-themed boyfriends probably would have been on Tumblr or writing fanfiction 10 years ago.

    Merrill compared the act of interacting with chatbots to logging in to an anonymous chat room 20 years ago: risky if used incorrectly, but generally fine so long as young people approach them with caution.


    The original article contains 2,002 words, the summary contains 269 words. Saved 87%. I’m a bot and I’m open source!

  • FiveMacs@lemmy.ca
    6 months ago

    No, teens are talking to AI thinking they’ve made a friend when in fact they haven’t, and they’re going to end up with weird mental health issues as a result.