Comment on "ChatGPT safety systems can be bypassed to get weapons instructions"
CubitOom@infosec.pub 2 weeks ago
Remember, kids: if you want to look up something you don't want the government to know about, don't use the internet to do it.
Also, LLMs are not the best source for instructions on how to make things that explode.
einkorn@feddit.org 2 weeks ago
Uhm, why not go to the tried and trusted Wikipedia? TM 31-210 Improvised Munitions Handbook
CubitOom@infosec.pub 2 weeks ago