The topic is: using AI for game dev.
I’m just going to be upfront: AI haters don’t know how this shit actually works, except that by existing, LLMs drain oceans and create more global warming than the entire petrol industry; and AI bros are filling their codebases with junk code that’s going to explode in their faces anywhere between 6 months and 3 years from now.
Wild to see you call for a "sane take" when you strawman the actual water problem into "draining the oceans."
Local residents with nearby data centers aren't being told to take fewer showers with salt water from the ocean.
Is that a problem with the existence of LLMs as a technology, or with shitty corporations working with corrupt governments to starve local people of resources and turn a quick buck?
If you are allowing a data center to be built, you need to make sure you have the power etc. to run it without negatively impacting the local people. It’s not the LLM’s fault that they fucked this up.
Are you really gonna use the “guns don’t kill people, people kill people” argument to defend LLMs?
Let’s not forget that the first ‘L’ stands for “large”. These things do not exist without massive, power- and resource-hungry data centers. You can’t just say “Blame government mismanagement! Blame corporate greed!” without acknowledging that LLMs cease to exist without those things.
And even with all of those resources behind it, the technology is still only marginally useful at best. LLMs still hallucinate, they still confidently distribute misinformation, they still contribute to mental health crises in vulnerable individuals, and no one really has any idea how to stop those things from happening.
What tangible benefit is there to LLMs that justifies their absurd cost? Honestly?
Making up for deficiencies in your own artistic and linguistic skills, and getting easy starting points for coding solutions.
LLMs still hallucinate,
Emergent behaviour can be useful for coming up with new ideas you weren’t expecting and areas to explore.
they still confidently distribute misinformation,
yeah, that’s been a problem since language existed; if you want a comparison closer to the topic at hand, the printing press.
they still contribute to mental health crises in vulnerable individuals, and no one really has any idea how to stop those things from happening.
so does the fucking internet.
lime@feddit.nu 1 day ago
as someone who has studied ml since around 2015, i’m still not convinced. i run local models, i train on CC data, i triple-check everything, and it’s just not that useful. it’s fun, but not productive.