Comment on Man Gives Himself 19th Century Psychiatric Illness After Consulting With ChatGPT

yesman@lemmy.world ⁨2⁩ ⁨days⁩ ago

The thing that bothers me about LLMs is that people will acknowledge the hallucinations and lies LLMs spit out when they're discussing information the user is familiar with.

But that same person will somehow trust an LLM as an authority on subjects they're not familiar with. Especially on subjects at the edges of, or even outside, human knowledge.

Sure, I don't listen when it tells me to make pizza with glue, but its ideas about Hawking radiation are going to change the field.
