As of April 24 you’ll be feeding the Octocat unless you opt out
To opt out, GitHub users should visit /settings/copilot/features and disable “Allow GitHub to use my data for AI model training” under the Privacy heading.
Submitted 7 hours ago by BrikoX@lemmy.zip to technology@lemmy.zip
https://www.theregister.com/2026/03/26/github_ai_training_policy_changes/
Do you guys really believe these opt-out buttons do anything?
Not /s btw, genuine question.
Maybe. But I wonder how it would apply: are my contributions to another user’s repository still used for training if that user didn’t opt out?
Placebo buttons to boil the frog.
Color me shocked! Jk, everybody saw that coming… It was probably hinted at first to gauge reactions, then they went ahead when people didn’t bitch too much.
The data GitHub wants includes:
- Model outputs that have been accepted or modified;
- Model inputs including code snippets shown;
- Code context surrounding your cursor position;
- Comments and documentation you’ve written;
- File names and repo structure;
- Interactions with Copilot features (e.g. chats); and
- Feedback (e.g. thumbs up/down ratings)…

As the FAQs explain: “If a Copilot user has their settings set to enable model training on their interaction data, code snippets from private repositories can be collected and used for model training while the user is actively engaged with Copilot while working in that repository.”
People still won’t leave 🤷
I’m home sick today. Gonna start migrating my shit away from there
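FWIW, a `--mirror` clone carries over all branches and tags, so migrating is mostly mechanical. Here’s a rough sketch that just builds the git commands for a list of repos — the repo names and the Codeberg target URL are placeholders, swap in your own:

```python
def migration_commands(repos, target_base):
    """Build shell command pairs that mirror-clone each GitHub repo
    and push the mirror to a new remote under target_base."""
    cmds = []
    for full_name in repos:  # e.g. "alice/myproject"
        name = full_name.split("/")[1]
        # Mirror clone grabs every ref (branches, tags, notes).
        cmds.append(
            f"git clone --mirror https://github.com/{full_name}.git {name}.git"
        )
        # Push the whole mirror to the new host.
        cmds.append(
            f"git -C {name}.git push --mirror {target_base}/{name}.git"
        )
    return cmds

# Example with placeholder names:
for cmd in migration_commands(["alice/myproject"], "https://codeberg.org/alice"):
    print(cmd)
```

You’d still need to create the empty repos on the new host first (or use its import feature), and issues/PRs don’t come along with git data.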
I stand corrected. Luckily 👍
driving_crooner@lemmy.eco.br 6 hours ago
What if people start pushing malicious code to GitHub, poisoning the AI model?