New development policy: code generated by a large language model or similar technology (e.g. ChatGPT, GitHub Copilot) is presumed to be tainted (i.e. of unclear copyright, not fitting NetBSD’s licensing goals) and cannot be committed to NetBSD.
Lots of stupid people asking “how would they know?”
That’s not the fucking point. The point is that if they catch you, they can block future commits and review your past commits for poor-quality code. They’re setting a quality standard and establishing consequences for violating it.
If your AI-generated code isn’t setting off red flags, you’re probably fine. But if something stupid slips through and the maintainers believe it to be the result of generative AI, they will remove your code from the codebase and you from the project.
It’s like laws against weapons. If you have a concealed gun on your person and enter a public school, chances are that nobody will know and you’ll get away with it over and over again. But if anyone ever notices, you’re going to jail, you’re getting permanently trespassed from school grounds, and you’re probably not going to be allowed to own guns for a while.
And it’s a message to everyone else quietly breaking the rules that they have something to lose if they don’t stop.
omgitsaheadcrab@sh.itjust.works 5 months ago
Ok but how is anyone meant to know if you generated your docstrings using Copilot?
Terces@lemmy.world 5 months ago
How do they know that you wrote it yourself and didn’t just steal it?
This is a rule to protect themselves. If there is ever a legal case around this, they can push the blame onto the person who committed the code for breaking that rule.
sirico@feddit.uk 5 months ago
This is the only reason rules exist: not to stop people from doing a thing, but to be able to enforce or deflect responsibility when they do.
mannycalavera@feddit.uk 5 months ago
They’ll use AI to detect it… obviously. ☺️
Zos_Kia@lemmynsfw.com 5 months ago
I’m saddened to use this phrase, but it is literally virtue signalling. They have no way of knowing lmao
best_username_ever@sh.itjust.works 5 months ago
It’s actually simple to detect: if the code sucks or is clearly written by a bad programmer, yet the docstrings are perfect, it’s AI. I’ve seen this more than once, and it never fails.
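As a contrived illustration of that tell (a hypothetical Python snippet, not taken from any real commit): the implementation below is correct but careless, while the docstring is suspiciously polished.

```python
# Hypothetical example: an immaculate, textbook-style docstring
# stapled onto clumsy code. The mismatch in care between the two
# is the red flag described above.

def count_vowels(s):
    """Count the vowels in a string.

    Args:
        s: The input string to scan. May be empty.

    Returns:
        The number of characters in ``s`` that are ASCII vowels
        (a, e, i, o, u), matched case-insensitively.

    Example:
        >>> count_vowels("NetBSD")
        1
    """
    # Works, but rebuilds the vowel list on every iteration and
    # uses a needless flag variable and an unidiomatic index loop.
    total = 0
    for i in range(0, len(s)):
        found = False
        for v in ["a", "e", "i", "o", "u", "A", "E", "I", "O", "U"]:
            if s[i] == v:
                found = True
        if found == True:
            total = total + 1
    return total
```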
Venator@kbin.social 5 months ago
It's also probably to make things slightly simpler from a legal perspective.
SMillerNL@lemmy.world 5 months ago
Are they long, super verbose, and often incorrect?
Xantar@lemmy.dbzer0.com 5 months ago
Magic, I guess?