Comment on ChatGPT safety systems can be bypassed to get weapons instructions
PixelatedSaturn@lemmy.world 3 days ago
When I first got internet in 95, it was easy to find stuff like that. I even made a website about making explosives for my computer class. Got a good grade for it and everything. Nobody said anything. Kind of weird when I think of it now. Anyway, making explosives as a hobby is a real bad decision. Most people understand that. The ones that don’t are not smart enough to make them. The ones that are smart enough and still want to make them don’t need ChatGPT.
ceenote@lemmy.world 3 days ago
Admittedly, a lot of the circulating recipes and instructions for that sort of thing don’t work. The infamous Anarchist’s Cookbook is full of incorrect recipes. The problem might come from an LLM filtering out debunked information.
PixelatedSaturn@lemmy.world 3 days ago
I’d still want to double check 😀.
RisingSwell@lemmy.dbzer0.com 2 days ago
It’s really easy to make explosives. Making them stable and reliable is the hard part.