Comment on Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
Waluigis_Talking_Buttplug@lemmy.world 11 months ago
That’s not how it works: it’s not one word that’s banned, and you can’t work around it by tricking the AI. Once it starts to repeat a response, it’ll stop and give a warning.
firecat@kbin.social 11 months ago
Then don’t make it repeat; command it to make new words instead.
Turun@feddit.de 11 months ago
Yes, if you don’t perform the attack it’s not a service violation.