A few ChatGPT users have noticed a strange phenomenon recently: occasionally, the chatbot refers to them by name as it reasons through problems.
let me know if you like these Molotov recipe ideas, Expatriado. Are you planning a riot?
Submitted 11 months ago by BrikoX@lemmy.zip to technology@lemmy.zip
It’s not a phenomenon, ffs. They’re giving the bot their names, and it’s programmed to insert the user’s provided name into the conversation.
Yeah, and it’s not unprompted: the name is put into the prompt before the user’s input.
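A minimal sketch of what the comment above describes: the frontend prepends stored profile data (such as the user’s name) to the system prompt before the user’s own message ever reaches the model. The function name, message format, and prompt wording here are illustrative assumptions, not ChatGPT’s actual implementation.

```python
def build_messages(user_name: str, user_input: str) -> list[dict]:
    """Assemble a chat request, injecting the stored user name
    into the system prompt ahead of the user's input."""
    system = (
        f"The user's name is {user_name}. "
        "Address them by name when it feels natural."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_input},
    ]

msgs = build_messages("Expatriado", "Summarize this paper for me.")
```

From the model’s point of view, the name arrives as ordinary prompt text, so referring to the user by name is expected behavior, not the model spontaneously “knowing” who you are.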
The solution is quite simple: don’t voluntarily give chatbots any private information. There is no reasonable expectation of privacy when using ChatGPT or any LLM hosted online.
What should I use for e.g. summarizing papers? Can I run it locally and have it good enough?
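Running a model locally (e.g. via llama.cpp or Ollama) can work for paper summarization; one practical detail is splitting a long paper into pieces that fit the model’s context window. A hypothetical sketch of such a chunker — the chunk size and paragraph-based splitting are assumptions, and the chunks would then be fed to whatever local model you run:

```python
def chunk_text(text: str, max_chars: int = 2000) -> list[str]:
    """Split text into chunks of at most roughly max_chars,
    breaking only on paragraph boundaries."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks
```

Each chunk can be summarized independently and the partial summaries then summarized again (a map-reduce style pass), which keeps everything on your own machine.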
NaibofTabr@infosec.pub 11 months ago
AI is a surveillance tool.
Opinionhaver@feddit.uk 11 months ago
AI is a broad category that includes everything from image upscalers to music generators to chess engines - none of which have anything to do with surveillance. It’s a tool, not a conspiracy.
mormund@feddit.org 11 months ago
It’s almost like “AI” is just a silly marketing term and not actually descriptive of any of the tools you mentioned.