Comment on "ChatGPT safety systems can be bypassed to get weapons instructions"

ceenote@lemmy.world 2 weeks ago

Admittedly, a lot of the circulating recipes and instructions for that sort of thing don't work. The infamous Anarchist Cookbook is full of incorrect recipes. The real problem would be an LLM that filters out the debunked information.
