Comment on "ChatGPT safety systems can be bypassed to get weapons instructions"

CubitOom@infosec.pub, 2 weeks ago


Remember kids, if you want to look up something that you don’t want the government to know about, don’t use the internet to do it.

Also, LLMs are not the best source to ask about how to make things that explode.
