Comment on Jailbroken AI Chatbots Can Jailbreak Other Chatbots

echodot@feddit.uk ⁨1⁩ ⁨year⁩ ago

I’m sure there are some, but it doesn’t really matter, because the recipe is already publicly available on the internet. So if an AI chatbot can give you the information, that’s not a particular concern.

It’s not actually hard to make.
