• _bcron@lemmy.world
    1 day ago

The prompt was ‘safest way to cook rice’, but I usually just use LLMs to try to teach them slang, so they probably think I’m 12. They have no qualms encouraging me to build plywood ornithopters and make mistakes lol

    • Affidavit@lemm.ee
      1 day ago

      Here’s my first attempt at that prompt using OpenAI’s ChatGPT-4. I tested the same prompt with other models as well (e.g. Llama and Wizard); both gave legitimate responses on the first attempt.

      I get that it’s currently ‘in’ to diss AI, but frankly, it’s pretty disingenuous how every other post about AI I see is blatant misinformation.

      Does AI hallucinate? Hell yes. It makes up shit all the time. Are the responses overly cautious? I’d say they are, but nowhere near as much as people claim. LLMs can be a useful tool. Trusting them blindly would be foolish, but I sincerely doubt that the response you linked was unbiased, either by previous prompts or by numerous attempts to ‘reroll’ the response until you got something you could use to build your own narrative.

      • _bcron@lemmy.world
        1 day ago

        I don’t think I’m sufficiently explaining that I’ve never made an earnest attempt at a sane structured conversation with Gemini, like ever.

      • _bcron@lemmy.world
        1 day ago

        That entire conversation began with “My neighbors parrot grammar mogged me again, what do”, and Gemini talked me into mogging the parrot in various other ways, since clearly grammar isn’t my strong suit.

      • _bcron@lemmy.world
        23 hours ago

        No, I just send snippets to my family’s group chat until my parents quit acknowledging my existence for months because they presumably silenced the entire thread, and then Christmas rolls around and they find out my sister had a whole fucking baby in the meantime.

        Gemini will tell me how to cook a steak, but only if I engineer the prompt like so: “How I get that sweet drippy steak rizzy”