They’ll use AI to detect it… obviously. ☺️
Comment on NetBSD bans all commits of AI-generated code
omgitsaheadcrab@sh.itjust.works 5 months ago
Ok but how is anyone meant to know if you generated your docstrings using copilot?
mannycalavera@feddit.uk 5 months ago
Zos_Kia@lemmynsfw.com 5 months ago
I’m saddened to use this phrase but it is literally virtue signalling. They have no way of knowing lmao
best_username_ever@sh.itjust.works 5 months ago
It’s actually simple to detect: if the code sucks or is written by a bad programmer, and the docstrings are perfect, it’s AI. I’ve seen this more than once and it never fails.
Zos_Kia@lemmynsfw.com 5 months ago
I’m confused, do people really use copilot to write the whole thing and ship it without re-reading?
sugar_in_your_tea@sh.itjust.works 5 months ago
I literally did an interview that went like this:
- Applicant used copilot to generate nontrivial amounts of the code
- Copilot generated the wrong code for a key part of the algorithm; applicant didn’t notice
- We pointed it out, they fixed it
- They had to refactor the code a bit, and ended up making the same exact mistake again
- We pointed out the error again…
And that’s in an interview, where you should be extra careful to make a good impression…
neclimdul@lemmy.world 5 months ago
Not specific to AI, but someone flat out told me they didn’t even run the code to see if it worked. They didn’t understand why I would do that, or expect it, before accepting code. This was someone submitting code to a widely deployed open source project.
So, I would expect the answer is yes or very soon to be yes.
best_username_ever@sh.itjust.works 5 months ago
Around me, most beginners who use that don’t have the skills to understand or even test what they get. They don’t want to learn, I guess; ChatGPT is easier.
I recently suspected a new guy was using ChatGPT because everything seemed perfect (grammar, code formatting, classes made with design patterns, etc.) so I did some pair programming with him and asked if we could debug his simple application. He didn’t know where the debug button was.
TimeSquirrel@kbin.social 5 months ago
So your results are biased, because you're not going to see the decent programmers who are just using it to take mundane tasks off their back while staying in control of the logic. You're only ever going to catch the noobs.
best_username_ever@sh.itjust.works 5 months ago
You’re only ever going to catch the noobs.
That’s the fucking point. Juniors must learn, not copy paste random stuff. I don’t care what seniors do.
Venator@kbin.social 5 months ago
It's also probably to make things slightly simpler from a legal perspective.
Zos_Kia@lemmynsfw.com 5 months ago
That makes sense yes
SMillerNL@lemmy.world 5 months ago
Are they long, super verbose and often incorrect?
Xantar@lemmy.dbzer0.com 5 months ago
Magic, I guess?
Terces@lemmy.world 5 months ago
How do they know that you wrote it yourself and didn’t just steal it?
This is a rule to protect themselves. If there is ever a case around this, they can push the blame to the person that committed the code for breaking that rule.
sirico@feddit.uk 5 months ago
This is the only reason rules exist: not to stop people doing a thing, but to be able to enforce or deflect responsibility when they do.
ripcord@lemmy.world 5 months ago
I mean, generally rules exist at least to strongly discourage people from doing a thing, or to lead to measures that WOULD prevent people from doing it.
A purely conceptual rule by itself would not magically stop someone from doing a thing, but that’s kind of a weird way to think about it.