Comment on Chatbots Make Terrible Doctors, New Study Finds

plyth@feddit.org 5 weeks ago
They simply string words together according to statistical likelihood, without having any notion of what the words mean

What gives you the confidence that you don’t do the same?

pkjqpg1h@lemmy.zip 5 weeks ago
Because I don’t just look at the words: I know and feel their meaning, and I’m aware of the process (this is why all LLMs are vulnerable to prompt injection). And speaking of prompt injection, think about when ChatGPT confidently gives you advice that could kill you because of a prompt injection.
Digit@lemmy.wtf 5 weeks ago
human: je pense (“I think”)
llm: je ponce (“I sand”)