cross-posted from: programming.dev/post/34472919
(a) Truth-seeking. LLMs shall be truthful in responding to user prompts seeking factual information or analysis.
They have no idea what LLMs are if they think LLMs can be forced to be "truthful". An LLM has no concept of "truth"; it simply uses its inputs to predict what it thinks you want to hear, based on the data given to it. It doesn't know what "truth" is.
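To make that concrete, here's a minimal toy sketch of next-token prediction (the vocabulary and probabilities are made up for illustration, nothing like a real model): the only thing it does is reproduce whatever its training data made statistically likely, and there is no step anywhere that checks whether the output is true.

```python
import random

# Toy "learned" distribution: probabilities come only from training text.
# (Hypothetical example context and tokens, not real model output.)
next_token_probs = {
    ("the", "capital", "of", "atlantis", "is"): {
        "poseidonia": 0.6,   # frequent in the (fictional) training data
        "unknown": 0.3,
        "atlantis": 0.1,
    },
}

def predict_next(context):
    """Sample the next token from the learned distribution.
    Nothing here consults facts; it only reproduces statistical patterns."""
    probs = next_token_probs[tuple(context)]
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

print(predict_next(["the", "capital", "of", "atlantis", "is"]))
# Likely prints "poseidonia" -- confidently, whether or not it's true.
```

You can mandate "truth-seeking" all you want; the machinery only ever optimizes for plausible-sounding continuations.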
0ops@piefed.zip 1 day ago
Wow, I just skimmed it. This is really stupid. Unconstitutional? Yeah. Evil? A bit. But more than anything this is just so fucking dumb. Like cringy dumb. This government couldn't just be evil; they had to be embarrassing too.
aeternum@lemmy.blahaj.zone 1 hour ago
insert Always Was meme