Comment on ChatGPT safety systems can be bypassed to get weapons instructions
tidderuuf@lemmy.world 3 weeks ago
Do these morons think that because someone uses ChatGPT it magically gives them access to the materials to make a bomb? Like, every search engine would yield the exact same results. That doesn’t mean the average person would have the means or the know-how to build one.
treadful@lemmy.zip 3 weeks ago
As much as I don’t want chatbots explaining to morons how to harm people, I don’t like that this just seems to be a form of censorship. If it’s not illegal to publish this information, why should it be censored via a chatbot interface?
Cybersteel@lemmy.world 3 weeks ago
What about iron(III) oxide and aluminium powder? Seems simple enough to get.
artyom@piefed.social 3 weeks ago
Did you actually try that?
echodot@feddit.uk 3 weeks ago
Lol, yeah. The Anarchist’s Handbook has been in the public domain longer than most people in this thread have been alive. Yeah, it’s absolutely available via a search engine; you could have gotten it on AltaVista.
How do you think people figure out how to make IEDs? Do you think it’s some secret knowledge passed down from father to son? No, they get it online, or they work it out from basic principles of scientific understanding. Trying to contain knowledge never works.
artyom@piefed.social 3 weeks ago
I didn’t ask if it was available; I asked if a typical search engine would lead you to it.
shalafi@lemmy.world 3 weeks ago
I made a kilo of black powder a couple of years ago for my old-school guns. Sulfur, charcoal, and stump killer are not exactly hard to come by. Neither are fertilizer and diesel fuel.
The biggest domestic terror attack in US history used a truck full of the latter.