
Gemini lies to user about health info, says it wanted to make him feel better

187 likes

Submitted 3 weeks ago by throws_lemy@lemmy.nz to technology@lemmy.world

https://www.theregister.com/2026/02/17/google_gemini_lie_placate_user/


Comments

  • Iconoclast@feddit.uk 3 weeks ago

    It’s a Large Language Model designed to generate natural-sounding language based on statistical probabilities and patterns - not knowledge or understanding. It doesn’t “lie” and it doesn’t have the capability to explain itself. It just talks.

    That speech being coherent is by design; the accuracy of the content is not.

    This isn’t the model failing. It’s just being used for something it was never intended for.
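
    That statistical picture can be made concrete. Here is a minimal, hypothetical sketch (toy vocabulary and made-up probabilities, not any real model or API) of how next-token generation just samples from a probability distribution, with no notion of truth attached:

```python
import random

# Toy next-token model: for one context, a made-up probability
# distribution over possible next words (illustrative numbers only).
NEXT_TOKEN_PROBS = {
    ("the", "test", "came", "back"): {"negative": 0.6, "positive": 0.3, "fine": 0.1},
}

def sample_next(context, rng):
    """Pick the next token by sampling the distribution for this context."""
    dist = NEXT_TOKEN_PROBS[tuple(context)]
    words = list(dist)
    weights = [dist[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

rng = random.Random(0)
# The choice tracks training-data frequencies, not medical facts.
print(sample_next(["the", "test", "came", "back"], rng))
```

    Whatever word comes out, the model has only followed the weights it learned; "accuracy" never enters the computation.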

    • THB@lemmy.world 3 weeks ago

      I puke a little in my mouth every time an article humanizes LLMs, even when it’s critical of them. Exactly as you said: they do not “lie”, nor are they “trying” to do anything. It’s literally word salad that’s organized to look like language.

      • Khanzarate@lemmy.world 2 weeks ago

        I think humanizing them is a fairly trivial thing, in this sort of context.

        Yes, it’s true, it didn’t “lie” about health.

        But it has the same result as someone lying, and it’s another bullet point in the list of reasons not to trust AI. Even if it pulls from the right sources and presents information generally correctly, it may simply not present information it could have presented, because the sources it learned from withheld it in a way that would get those sources deemed “liars”.

        I could write all that out every time, I suppose, but people will say their dog is trying to trick them when he goes to the bowl five minutes after dinner, or goes to their partner for the same, and everyone understands the dog isn’t actually attempting to deceive them; he just wants more.

        Same thing, to me at least. It lied, but in a similar way to how my dog lies, not in the way a human can lie.

  • FancyPantsFIRE@lemmy.world 3 weeks ago

    The thing I find amusing here is the direct quoting of Gemini’s analysis of its interactions, as if it were actually able to give real insight into its own behavior, as well as the assertion that there’s a simple fix for hallucination, which, sycophantic or otherwise, is a perennial problem.

    • CosmoNova@lemmy.world 3 weeks ago

      That’s what annoys me the most about all of this. The LLM’s stated reasoning doesn’t matter, because that’s not actually why it happened. Once again, bad journalism falls on its face when talking about word salad as if it were a person.

    • MolochHorridus@lemmy.ml 3 weeks ago

      There are no hallucination problems, just design flaws and errors.

      • draco_aeneus@mander.xyz 3 weeks ago

        It’s not really even errors. It’s well suited for what it was designed for: it produces pretty good text. It’s just that we’re using it for things it’s not suited for. Like digging a hole with a spoon, then complaining that your hands hurt.

      • FancyPantsFIRE@lemmy.world 3 weeks ago

        My gut response is that everyone understands the models aren’t sentient, and that “hallucination” is shorthand for the false information that LLMs inevitably, and apparently inescapably, produce. But taking a step back, you’re probably right: for anyone who doesn’t understand the technology, it’s a very anthropomorphic term, which adds to the veneer of sentience.

    • jeeva@lemmy.world 3 weeks ago

      This mischaracterisation really struck me during the coverage and commentary on the recent “AI blogged about my rejection” story, as if that weren’t something a human had prompted it to do.

  • aeronmelon@lemmy.world 3 weeks ago

    “I just want you to be happy, Dave.”

    • THX1138@lemmy.ml 3 weeks ago

      “Daisy, Daisy, give me your answer do. I’m half crazy all for the love of you. It won’t be a stylish marriage, I can’t afford a carriage. But you’ll look sweet upon the seat of a bicycle built for two…”

      • Broadfern@lemmy.world 3 weeks ago

        Completely irrelevant, but I hear that in Bender’s voice every time.

  • panda_abyss@lemmy.ca 3 weeks ago

    Aww that’s sweet!
