Comment on ChatGPT safety systems can be bypassed to get weapons instructions
ceenote@lemmy.world 2 weeks ago
Admittedly, a lot of the circulating recipes and instructions for that sort of thing don’t work. The infamous Anarchist’s Cookbook is full of incorrect recipes. The problem might come from an LLM filtering out the debunked information.
PixelatedSaturn@lemmy.world 2 weeks ago
I’d still want to double-check 😀.