Oh no! Anyway
Submitted 1 day ago by BrikoX@lemmy.zip to technology@lemmy.zip
https://www.darkreading.com/threat-intelligence/employees-sensitive-data-genai-prompts
The propensity for users to enter customer data, source code, employee benefits information, financial data, and more into ChatGPT, Copilot, and others is racking up real risk for enterprises.
gravitas_deficiency@sh.itjust.works 1 day ago
I just think it’s funny that companies are surprised by this
possiblylinux127@lemmy.zip 1 day ago
I can’t wait for some company to end up violating the GPL because some developer used ChatGPT
verdigris@lemmy.ml 1 day ago
Honestly, chances are slim that this hasn’t already happened, given how liberally the bots will spit out verbatim code snippets with zero attribution.