If an amateur mycologist picks and eats the wrong mushroom that an LLM said was fine to eat, is the LLM liable for the death legally and/or financially?
I mean, I know better than to pick random mushrooms and eat them, but I don’t really care for mushrooms anyway, though some have delightful effects when metabolized, lol. The only ones of THOSE I ever tried came from someone I knew personally: I saw the “operation” and reviewed his sources before trying one.
Call me paranoid, but I’m not blindly trusting a high-school dropout to properly identify mushrooms when even professionals make mistakes, to the point where any mycologist will tell you: DON’T TRUST PICS OR THE INTERNET.
It can be too difficult to tell from those sources alone, and I doubt the LLM and the human asking the questions are on the right wavelength to avoid producing misleading, if not entirely fabricated, results.
Melvin_Ferd@lemmy.world 9 months ago
But why not ask it for a source when the information is critical like this? It’s right far more often than it’s wrong, and it works as a great tool to speed up learning. I’m really interested in people sharing the prompts they used and the wrong answers they produced.
Pyr_Pressure@lemmy.ca 9 months ago
What’s the point of AI if you need to search for a source to make sure it’s right every time? Just skip a step and search for a source first thing.
Melvin_Ferd@lemmy.world 9 months ago
There are so many ways to answer this that I’m surprised it’s asked in the first place. AI is not some be-all and end-all of knowledge. It’s a tool like any other.