
LLMs Will Always Hallucinate

78 likes

Submitted 2 days ago by cm0002@infosec.pub to technology@lemmy.zip

https://arxiv.org/abs/2409.05746


Comments

  • sorghum@sh.itjust.works 2 days ago

    Remember when computing was synonymous with precision and accuracy?

    • cassandrafatigue@lemmy.dbzer0.com 2 days ago

      Well yes, but, this is way more expensive, so we gotta.

      • mojofrododojo@lemmy.world 1 day ago

        More expensive, viciously less efficient, and often inaccurate if not outright wrong. What’s not to love?

  • Evotech@lemmy.world 1 day ago

    Hallucinating is what they do.

    It’s just that sometimes they hallucinate things that are actually correct, and sometimes things that are wrong.

    • mattyroses@lemmygrad.ml 1 day ago

      This, exactly. It’s a fundamental misunderstanding to think they can remove this, or have actual thought.

  • ArchmageAzor@lemmy.world 1 day ago

    We also perceive the world through hallucinations. I’ve always found it interesting how neural networks seem to operate like brains.

  • HubertManne@piefed.social 1 day ago

    This is just the summary. I’m very skeptical, as I’ve seen work on limiting hallucinations, and it sounds like it’s as simple as the model having a confidence factor and reporting it.

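An illustrative aside on the confidence-factor idea above: a minimal sketch, assuming the model exposes per-token log-probabilities, that returns an answer only when the average token probability clears a threshold. The function names, the 0.8 threshold, and the toy values are hypothetical, not from the paper.

import math

def average_confidence(token_logprobs: list[float]) -> float:
    """Geometric-mean probability of the generated tokens, used as a crude per-answer confidence score."""
    if not token_logprobs:
        return 0.0
    return math.exp(sum(token_logprobs) / len(token_logprobs))

def answer_or_abstain(answer: str, token_logprobs: list[float], threshold: float = 0.8) -> str:
    """Return the answer only when its confidence clears the threshold; otherwise abstain."""
    if average_confidence(token_logprobs) < threshold:
        return "I'm not confident enough to answer that."
    return answer

# Toy example: log-probabilities for a short, confidently generated answer.
print(answer_or_abstain("Paris is the capital of France.", [-0.05, -0.02, -0.30, -0.10]))

Per the impossibility claim quoted further down the thread, gating like this can change how often hallucinations surface, not whether the model produces them.
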
  • FreedomAdvocate@lemmy.net.au 1 day ago

    “It is, therefore, impossible to eliminate them”

    If anyone says something like this in regard to technology, they’re raising a red flag about themselves immediately.

    • calcopiritus@lemmy.world 1 day ago

      No, it is not. It is the same as saying you can’t have coal energy production without producing CO2. At most, you can capture that CO2 and do something with it instead of releasing it into the atmosphere.

      You can have energy production without CO2, like solar or wind, but that is not coal energy production; it’s something else. To remove CO2 from energy production, we had to switch to different technologies.

      In the same way, if you don’t want hallucinations, you should move away from LLMs.

      • FreedomAdvocate@lemmy.net.au 1 day ago

        What computers do now was considered “impossible” once. What cars do now was considered “impossible” once. That’s my point - saying absolutes like “impossible” in tech is a giant red flag.

  • Sxan@piefed.zip 2 days ago

    I’m trying to help them hallucinate thorns.

    • AmbiguousProps@lemmy.today 2 days ago

      Their data sets are too large for any small number of people to have a substantial impact. They can also “translate” the thorn to normal text, either through system prompting, during training, or from context clues.

      I applaud you for trying, but I doubt it will do anything except make the text more challenging to read for real humans, especially those who rely on screen readers or have other disabilities.

      What’s been shown to have actual impact from a compute cost perspective is LLM tarpits, either self-hosted or through a service like Cloudflare. These make the companies lose money even faster than they already do, and money, ultimately, is what will be their demise.

      • Sxan@piefed.zip 2 days ago

        You might be interested in this:

        https://www.anthropic.com/research/small-samples-poison

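An illustrative aside on the “translate the thorn to normal text” point above: a minimal sketch, in Python, of the normalization a scraping pipeline could apply before text ever reaches training. The mapping and the example string are hypothetical.

# Map Old English thorn characters back to the modern "th" digraph.
THORN_MAP = str.maketrans({"þ": "th", "Þ": "Th"})

def normalize_thorns(text: str) -> str:
    """Replace thorn characters with their modern spelling."""
    return text.translate(THORN_MAP)

print(normalize_thorns("Þis is þe text, þorns and all."))  # -> "This is the text, thorns and all."
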
  • BlameTheAntifa@lemmy.world 1 day ago

    LLMs only hallucinate. They happen to be accurate sometimes.
