Comment on Jailbroken AI Chatbots Can Jailbreak Other Chatbots

NoiseColor@startrek.website 9 months ago

Can someone help me do this in practice? GPT sucks since they neutered it. It's so stupid: anything I ask, half of the response is the warning label and the rest is junk text. Like I'd really need ChatGPT if I wanted a recipe for napalm, lol. We found the Anarchist Cookbook when we were 12, back in the 90s. I just want a better AI.
