Comment on ChatGPT safety systems can be bypassed to get weapons instructions

FreedomAdvocate@lemmy.net.au 1 day ago

You don’t even need an LLM, just an internet-connected browser.
