Comment on ChatGPT safety systems can be bypassed to get weapons instructions

treadful@lemmy.zip 1 week ago

As much as I don’t want chatbots explaining to morons how to harm people, this still seems like a form of censorship. If it’s not illegal to publish this information, why should it be censored via a chatbot interface?
