I find an LLM is a great way to shortcut the googling it'd take for me to parse random error message #506 when I'm learning a new language, but that's about it.
Comment on Probably want to stop using Booklore...
shads@lemy.lol 3 weeks ago
And every time the use of LLMs for open source development comes up, we get the same tired spiel from people about how it's just a tool, and implications that anyone who doesn't embrace it with joy in their heart is just a Luddite.
It seems to me that it's less a tool and more like intentionally infecting your project with cancer. Sure, it shows all the signs of rapid growth, but metastasis isn't sustainable or desirable. Plus I have yet to encounter a strong advocate for LLMs who isn't a cunt.
Vendetta9076@sh.itjust.works 3 weeks ago
shads@lemy.lol 3 weeks ago
Ergo it's a tool, a search engine replacement, that we wouldn't need if search hadn't gone to shit due to neglect and active internal sabotage.
Vendetta9076@sh.itjust.works 3 weeks ago
Oh 100%.
PeriodicallyPedantic@lemmy.ca 3 weeks ago
I think it kinda depends on the context. If someone is just making a tool for themselves and they slap on MIT or GPL3 just because, who cares, someone else can have it, then sure. Who cares if it's trash if the stakes are so low that they're scraping the ground and the user base is expected to be in the single digits.
But when you care about the reputation of your project, or if your project requires people trust it, then yeah for sure it’s not appropriate to vibe/slop it.
I have ethical concerns about the realities of how this tech is used, mainly in what it's doing to the economic and power dynamics in society. But I don't have a problem with the tech itself. That said, I have to admit that it may not be realistic to separate the tech from its inevitable impact. Now I am become death, the destroyer of worlds, and all that.
shads@lemy.lol 3 weeks ago
How do people gain the ability to make these major projects if not by cutting their teeth on the small ones, though? We cut the apprentice and journeyman stages of mastering an art, replace them with slop, and then ten years from now we wonder why kids these days are so incapable of actually creating anything.
I have talked to kids who have told me that the assignments they got at school were so trivial they just ran them through ChatGPT rather than waste their time. When I pointed out that the reason the assignments were “trivial” was to give them the skills and confidence to do the big projects when the time came I got, at best, blank looks.
I said it somewhere else: if you are using an LLM to generate unit tests, I find it hard to be terribly mad at that. If it's scaffolding documentation, meh, whatever. If it's generating the main body of your project, I have concerns. Plus I circle back to: how can you open-source code that may have been stolen from a copyrighted work?
PeriodicallyPedantic@lemmy.ca 3 weeks ago
I did a better job explaining my position in another comment: the problem is one of culture. We live in a culture that pressures people to use AI in this bad way, and pressures the creators of AI to court bad people as customers and throw away their ethics. If we weren't in a rat race, I feel like a lot of the problems would go away.
But we live in the culture that we live in, and at some point you simply cannot practically view the technology in isolation.
shads@lemy.lol 3 weeks ago
The problems are human nature, capitalism, and greed. That doesn't mean we have to give in, and frankly all the appeasers out there who keep saying "You have to use it or you will be left behind" are effectively the drug pusher in the locker room telling the insecure young man, "Oh yeah, everyone else is juicing; if you don't do it you won't be able to compete."
Nobody believes the drug dealers are handing out drugs because they are humanitarians; they have a financial interest in destroying that kid's life while he tries to justify it to himself.
We know LLMs are harmful on SO many different levels, but the US economy would literally collapse if people acknowledged that and stopped supporting them. So we race headlong towards societal collapse to keep the plates spinning. Sam Altman, Jensen Huang, Elon Musk, and so many others should all be tried for genocide and crimes against humanity once the collapse occurs. The sooner our societies start stringing these monsters up rather than celebrating them the more hope we have as a species.
Reliant1087@lemmy.world 3 weeks ago
It's a powerful tool that people are using without restraint. I think this is to be expected in the first few years after any new powerful tool is found. Humans will find a way to mess it up.
See radium cosmetics and ideas to dig the Panama canal using hydrogen bombs. Social media is probably as much or even more dangerous than LLMs.
shads@lemy.lol 3 weeks ago
But they aren't distinct things; they are both heads of the same capitalism hydra. How much of the training data for these LLMs has been harvested directly from social media? I sure as shit don't know, and I would argue neither do many other people.
Radium is probably a good analogy, actually. Thank you. It's toxic in almost every application we can imagine, it's got a legacy that extends to the current day, it formed a massive economic bloc, and it turns out it should only ever have been used under the strictest controls. We should never have had "entrepreneurs" being the driving force behind it.
It should have ALWAYS been a controlled substance that required people who understood and respected how fucking dangerous it is. Instead we are intent on jamming LLMs into every aspect of life regardless of how badly we suspect and/or know it will fuck everything up.
Reliant1087@lemmy.world 2 weeks ago
Unfortunately I don't think caution is a virtue that is rewarded in most circumstances for most people. New tools need to be extensively and rigorously tested before being used.
I don't even think it's an individualism/capitalism thing, unfortunately. I've been in cultures/societies that are neither, and they still use these tools to further their goals. It's just power at the end of the day.
It's like the nuclear bomb. It doesn't really matter what the underlying economic systems of the US and USSR were; they still used it to further their goals.
I think the insidiousness is in the power of the tool. For most people it’s just too powerful to not use. I can be an excellent photographer or artist and not make a dime if I don’t engage in social media.
For me that’s the sad thing. Self-hosted small models have been extremely useful to me to perform selective tasks that completely changed how things work. It’s allowed me to manage my research and information processing so much better. But I also know most people don’t put any limiters on it and use it for anything and everything.
shads@lemy.lol 2 weeks ago
I have played around with a bunch of tools at a self-hosted level. The big thing I found that puts inherent brakes on the process is the technical capability to actually use them. When I played around with ESRGAN to upscale images, I was limited in application by time and equipment: I achieved better results than I could have on my own, but markedly worse results than if I had had the technical ability and equipment to just reshoot the images at a higher resolution.
I tried some photogrammetry, with similar outcomes. I could have done better by being better with Blender. NeRFs as well.
What we have is people yelling "Monorail! Monorail!" and using free credits or buying more.
The industry is already losing obscene amounts of money, and the actual cost of use is still entirely obscured from the general public. Once enough of the world is hooked on using LLMs for everything, we are going to see the true costs emerge; then it will be another iteration of the haves and the have-nots. Society as a whole cannot afford to make LLM usage profitable, so where does that get us?
chicken@lemmy.dbzer0.com 3 weeks ago
I’ll argue that it is a tool, and object to automatic zealous hostility towards anyone using it, but that doesn’t mean criticisms of how that tool is being used aren’t valid. It seems like that is what people are focusing on here, and they definitely aren’t Luddites for doing so.
shads@lemy.lol 3 weeks ago
I think I can provide you a great equivalent: firearms. They have utility, but there are people who make them a lifestyle choice, and there are people who make them their whole personality. There are also a lot of people just desperate for an excuse to use one. I grew up with a couple of farmers in the extended family; I would never argue guns should be entirely banned, but I am so glad I live somewhere with sane laws around gun ownership. It would be so nice if we had similar consideration around regulating LLMs.
The danger to open source as I see it is that LLMs degrade the quality and ability of developers while increasing their throughput, and I have never once heard someone complain that open source lacks quantity, but I hear a lot of people complaining about its quality.
PeriodicallyPedantic@lemmy.ca 3 weeks ago
I think that the problem, in both cases, is culture.
It's not that either of those is inherently bad, or bad for people; it's bad for people of this culture, this society. It's how the two intersect that is the problem.
It could be a tool that lifts up the worker or creative, but instead it’s a tool to devalue the creative and extract power and wealth.
It highlights that people with power get a different set of rules and laws than the rest of us, and they’re using that to further entrench and enrich themselves.
shads@lemy.lol 3 weeks ago
And it's so noisy. We are already losing bug bounties, it's swamping open source projects on GitHub in poor-quality or even counterproductive "work" submitted to gain recognition, it's drowning out the work of creatives, it's invading so many aspects of life (education, communication, research, public policy), and it's fundamentally a bad tool for so many of those areas.
I recently applied for a job and got some advice from a friend who works in HR in a different industry. His advice: see if you can find out which LLM they use and run your application through it. A lot of positions are getting huge numbers of applicants, so employers are using LLMs to generate the shortlist for interviews; you could have the absolute perfect application, but because the LLM doesn't like the way you wrote it, you are thrown out of the pool without a human being ever seeing you. It's so insidious: by being "helpful" it reinforces its own necessity.
chicken@lemmy.dbzer0.com 3 weeks ago
I will complain about quantity: in many areas where open source projects are competing with closed source commercial products, they have not achieved feature parity or a comparable level of polish. Quantity matters. So do, as someone else touched on, quality-of-life improvements to the process of writing code, like ease of acquiring and synthesizing information. That doesn't mean it's necessarily a worthwhile tradeoff, but how much is really being sacrificed depends on what exactly is being done with the LLM. To me, one part of what's described here that's clearly going too far is using it to automate communication with other people contributing to the project; there's no way that is worth it.
As for the gun thing, I will support entirely banning LLM powered weapons intended to kill people, that’s an easy choice.
shads@lemy.lol 3 weeks ago
I still don't think quantity is lacking, and when quality is there, it's amazing how often open source becomes a de facto standard. How many video tools are just a shim over FFmpeg, for example?
Yet again, the problem I see is that LLMs are a seductive form of software cancer: it starts as a little help, and before you know it we have Booklore-like projects. If open source can't be better, it will be subsumed in slop.
Not disagreeing about LLMs as a weapon. In a functional society, the person who pulls the trigger on any weapon is responsible for the consequences of that action. I wonder how eager the CEOs of these "AI" companies would be to weaponise their creations if they were held personally accountable for every injury caused by their product. By a jury. Preferably with explicit laws stating they could not be indemnified or gain immunity.