• 22 Posts
  • 239 Comments
Joined 1 year ago
Cake day: January 20th, 2023





  • I totally agree that both seem to imply intent, but IMHO hallucinating implies not only more agency than an LLM has, but also less culpability. Like, “Aw, it’s sick and hallucinating, otherwise it would tell us the truth.”

    Calling it a bullshit machine, on the other hand, still implies more intentionality than an LLM is capable of, but it at least skews the perception of that intention toward “it’s making stuff up,” which seems closer to the actual mechanisms behind an LLM to me.

    I also love that the researchers actually took the time not only to provide the technical definition of bullshit, but also to sub-categorize it, lol.




  • heavyboots@lemmy.ml to Asklemmy@lemmy.ml · deleted · 56 points · edited · 19 days ago

    I would absolutely send him an email to the effect of

    “Per our multiple verbal conversations, this is just to serve as notice that, in my professional opinion, your refusal to allow me to upgrade a system exposed to multiple security vulnerabilities on a platform that is no longer supported is a risk you are choosing to accept against my advice.”

    with a list of known major vulnerabilities attached if possible.

    That way at least if this comes back to bite the company on the ass, he can’t say “Well he never told me this was a problem!”




  • heavyboots@lemmy.ml to World News@lemmy.ml · Mexico’s new president! · 24 up / 2 down · 23 days ago

    Yes, and I voted for Bernie when he ran and I would absolutely vote for AOC if/when she runs. Just… the choices in this election cycle are a wannabe, half-demented dictator-for-life vs. a “moderate” Democrat who, while he’s gotten a surprising amount done, isn’t exactly renowned for pushing corporations hard on climate change.