Comment on Somebody managed to coax the Gab AI chatbot to reveal its prompt

mhague@lemmy.world 2 months ago

I don’t get it, what makes the output trustworthy? If it seems real, it’s probably real? If it keeps hallucinating something, there must be some truth to it? Those seem to be the two main mindsets: “you can tell by the way it is,” and “look, it keeps saying this.”
