Comment on LLMs Will Always Hallucinate
FreedomAdvocate@lemmy.net.au 2 days ago
It is, therefore, impossible to eliminate them
If anyone says something like this in regard to technology they’re raising a red flag about themselves immediately.
calcopiritus@lemmy.world 2 days ago
No, it is not. It’s the same as saying you can’t have coal energy production without producing CO2. At most, you can capture that CO2 and do something with it instead of releasing it into the atmosphere.
You can have energy production without CO2, like solar or wind, but that isn’t coal energy production. It’s something else. To get rid of the CO2, we had to switch to different technologies.
In the same way, if you don’t want hallucinations, you have to move away from LLMs.
FreedomAdvocate@lemmy.net.au 1 day ago
What computers do now was considered “impossible” once. What cars do now was considered “impossible” once. That’s my point: speaking in absolutes like “impossible” in tech is a giant red flag.
calcopiritus@lemmy.world 23 hours ago
I’ll remember this post when someone manages to make a human fly by tying a cow to their feet.
Kolanaki@pawb.social 23 hours ago
One word:
Trebuchet.
MajorasMaskForever@lemmy.world 1 day ago
Technological impossibilities exist all the time. They’re one of the biggest drivers, if not the biggest, behind engineering and design.
FreedomAdvocate@lemmy.net.au 7 hours ago
This isn’t one of those times. We’re just scratching the surface of AI. Anyone making an absolute claim, like saying it’s impossible for them not to hallucinate, is really saying “no one should listen to me”.