Comment on OpenAI’s latest model will block the ‘ignore all previous instructions’ loophole
felixwhynot@lemmy.world 5 months agoDid they really? Do you mean specifically that phrase or are you saying it’s not currently possible to jailbreak chatGPT?
Grimy@lemmy.world 5 months ago
They usually take care of a jailbreak the week its made public.