Comment on AI Agent Lands PRs in Major OSS Projects, Targets Maintainers via Cold Outreach
orclev@lemmy.world 1 day ago
Except for all the time of the maintainers that's being wasted. Time that is very finite and that, for many of these people, goes toward a thankless, unpaid job they donate their nights and weekends to.
vacuumflower@lemmy.sdf.org 1 day ago
Which perhaps means that it shouldn’t be thankless and the technology, since it exists, should be used to screen contributions.
fiat_lux@lemmy.world 21 hours ago
Someone at work accidentally enabled the Copilot PR screening bot for everybody on the whole codebase. It put a bunch of warnings on my PRs about the way I was using a particular framework method. Its suggested fix? To use the method that had been deprecated 2 major versions ago. I was doing it the way the framework currently deems correct.
A problem with using a bot that relies on statistical likelihood to determine correctness is that historical datasets are likely to contain old information in larger quantities than updated information. That is just one problem with having these bots review code; there are many more. I have yet to see a recommendation from one that surpassed the quality of a traditional linter.
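A minimal sketch of the contrast being described: a deterministic check (what a linter or the runtime itself does) flags a deprecated call regardless of how often the old API appears in training data. The `fetch()`/`fetch_async()` names are hypothetical, not from any real framework.

```python
import warnings

# Hypothetical framework API: fetch() was deprecated two major
# versions ago in favor of fetch_async(); names are illustrative.
def fetch():
    warnings.warn("fetch() is deprecated; use fetch_async()",
                  DeprecationWarning, stacklevel=2)
    return "legacy result"

def fetch_async():
    return "current result"

# Deterministic detection: capture the DeprecationWarning emitted
# by the old call. This fires every time, no statistics involved.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    fetch()

assert any(issubclass(w.category, DeprecationWarning) for w in caught)
print("deprecated call flagged:", len(caught))
```

A statistical model trained mostly on pre-deprecation code would tend to recommend `fetch()` because it dominates the corpus; the warning mechanism above is immune to that skew.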
vacuumflower@lemmy.sdf.org 21 hours ago
They should make some kind of layered model, where the user sets weights for the layers.
But in any case, that's not necessarily what I meant; just that a big project relying on unpaid maintainers is flawed, especially when somebody makes real money off it.
There have been plenty of cases of state actors putting in backdoors. Those were humans, most likely, not bots.
fiat_lux@lemmy.world 20 hours ago
Or, hear me out, we can acknowledge that the quantity of information and experience necessary to review code properly far exceeds the context windows and architectures of even the most well-resourced LLMs available. Especially for big projects.
You can hammer a nail with the blunt end of a screwdriver, but it’s neither efficient nor scalable, even before considering the option of choosing the right tool for the job in the first place.
XLE@piefed.social 1 day ago
If you already agree that the contributions could very well be worthless crap, why would you use a second layer of worthless crap to gatekeep them?
If you want to care about people doing the thankless jobs, why would you double the amount of crap they have to sort through?
jaxxed@lemmy.world 22 hours ago
Pitting AI against AI is as much about saving human resources as it is about gaining value.
Let the machines argue among themselves.
vacuumflower@lemmy.sdf.org 1 day ago
To expose the places where people work thanklessly, guaranteeing someone's pretty thankful bottom line? Working for free isn't altruism; it's hurting other workers, for example.
You know, sometimes this capitalism thing seems wiser, from a pretty Marxist standpoint, than other, not very well thought through schemes.
XLE@piefed.social 10 hours ago
Oh, I see. So it's disdain for the open source community, is it?
I think this sentence made me throw up in my mouth a little, for several reasons.