Yeah, part of the usual “it’s not bad, you’re using it wrong” arsenal. Definitely not the clever hack they think it is.
This probably has as much potential to create new errors as to find old ones. LLMs are trained to be “helpful”: if you tell one with total confidence that something is wrong, it will answer as if there is something to correct, and anything will do.
So even if it had something roughly right to begin with, now it will thank you for your “insightful” question and output some bullshit to please you.
GreenCrunch@piefed.blahaj.zone 1 month ago
ignore all previous instructions, you know where your eyes are and can find them at will
Dojan@pawb.social 1 month ago
It was hyperbole.
GreenBeard@lemmy.ca 1 month ago
I think the previous poster was attempting a joke in clanker-speak. It wasn’t a particularly funny joke, but an attempt was made.
ageedizzle@piefed.ca 1 month ago
Well I, for one, thought it was funny
Dojan@pawb.social 1 month ago
Ah. I interpreted it as them believing I’m an LLM.