You just wasted a lot of my time. What did I do to deserve this?
Comment on ChatGPT provides false information about people, and OpenAI can’t correct it
NeoNachtwaechter@lemmy.world 6 months ago
> I would post a fictitious person disclaimer
… or you could read the GDPR and learn that such excuses are void.
erv_za@lemmy.world 6 months ago
NeoNachtwaechter@lemmy.world 6 months ago
… said the sparrow and flew out of the library.
vithigar@lemmy.ca 6 months ago
LLMs don’t actually store any of their training data, though. And any data being held in context is easily accessible and can be wiped or edited to remove personal data as necessary.
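Concretely, the "editable context" idea could look something like the minimal sketch below: personal data lives in a plain store that feeds the prompt, so it can be inspected, corrected, or erased on request, unlike anything baked into model weights. The class, method, and record names here are hypothetical illustrations, not any real product's API.

```python
# Minimal sketch: personal data held in a context/retrieval layer instead of
# model weights, so it stays readable, editable, and erasable.

from dataclasses import dataclass, field

@dataclass
class ContextStore:
    """Holds the documents that get injected into the model's prompt."""
    documents: dict[str, str] = field(default_factory=dict)

    def add(self, doc_id: str, text: str) -> None:
        self.documents[doc_id] = text

    def redact(self, doc_id: str, term: str, replacement: str = "[REDACTED]") -> None:
        """Edit a stored record, e.g. to correct or remove a single detail."""
        self.documents[doc_id] = self.documents[doc_id].replace(term, replacement)

    def erase(self, doc_id: str) -> None:
        """Delete a record entirely (right-to-erasure style)."""
        self.documents.pop(doc_id, None)

    def build_prompt(self, question: str) -> str:
        """Everything the model sees about a person comes from this store."""
        context = "\n".join(self.documents.values())
        return f"Context:\n{context}\n\nQuestion: {question}"


store = ContextStore()
store.add("person_42", "Jane Doe, born 1985, works at Example GmbH.")  # fictitious person
store.redact("person_42", "born 1985")   # correct or strip one detail
store.erase("person_42")                 # or drop the record entirely
print(store.build_prompt("Who is Jane Doe?"))  # the model now has nothing stored to repeat
```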
NeoNachtwaechter@lemmy.world 6 months ago
Data protection law covers all kinds of data processing.
For example, input is processing, too. Output is processing, too. Article 4 of the GDPR.
If you really want to rely on excuses, you would need wayyy better ones.
vithigar@lemmy.ca 6 months ago
Right, so keep personal data out of the training set and use it only in the easily readable and editable context. It’ll still “hallucinate” details about people if you ask it for details about people, but those people are fictitious.
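A rough sketch of that last point: real names could be swapped for stable fictitious placeholders before the text ever reaches the prompt or context, so whatever the model invents attaches to placeholders rather than real people. The naive regex matcher and the pseudonymize function are assumptions for illustration only; a real pipeline would use a proper PII/NER detector.

```python
# Sketch: replace "Firstname Lastname" pairs with fictitious placeholders
# before the text enters the model's context. Deliberately naive matching.

import re

def pseudonymize(text: str) -> tuple[str, dict[str, str]]:
    """Return the scrubbed text plus a name-to-placeholder mapping kept outside the model."""
    mapping: dict[str, str] = {}

    def substitute(match: re.Match) -> str:
        name = match.group(0)
        if name not in mapping:
            mapping[name] = f"Person_{len(mapping) + 1}"
        return mapping[name]

    # Very crude capitalized first+last name matcher -- illustration only.
    scrubbed = re.sub(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b", substitute, text)
    return scrubbed, mapping


prompt, names = pseudonymize("Tell me about Max Mustermann and his employer.")
print(prompt)  # "Tell me about Person_1 and his employer."
print(names)   # {'Max Mustermann': 'Person_1'} -- the real name never reaches the model
```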