The biggest issue with generative AI, at least to me, is the fact that it’s trained on human-made works whose original authors didn’t consent to, or even know about, their work being used to train the AI. Are there any initiatives to address this issue? I’m thinking something like an open source AI model and training data store that only contains works that are public domain or under highly permissive no-attribution licenses, as well as original works submitted by the open source community and explicitly licensed to allow AI training.
I guess the hard part is moderating the database and ensuring all works are licensed properly and people are actually submitting their own works, but does anything like this exist?
trxxruraxvr@lemmy.world 1 day ago
I’d say the biggest issue with generative AI is the energy use and the fact that it’s increasing the rate at which we’re destroying the climate and our planet.
masterspace@lemmy.ca 1 day ago
If they pay to power it with sustainable energy then it doesn’t. Simple as that. Energy use is really not a problem.
AI’s biggest problem is that it accelerates the effects of capitalism, including wealth concentration, and our societies are not set up to handle that or able to adapt particularly quickly.
sanguinepar@lemmy.world 1 day ago
It is if doing so means taking up existing capacity in sustainable energy.
If they were always adding new sustainable capacity specifically for their data centres, that would be one thing, but if all they do is pay for the use of existing capacity, that potentially just pushes the issue down the road a bit.
If/when there’s enough capacity to supply all homes and businesses then this issue would disappear, but I don’t know how close that is.
Yermaw@lemm.ee 1 day ago
We’re boned
Ceedoestrees@lemmy.world 1 day ago
Do we know how energy usage of AI compares to other daily tasks?
Like: rendering a minute of a fully animated film, flying from L.A. to New York, watching a whole series on Netflix, scrolling this site for an hour, or manufacturing a bottle of tylenol?
How does asking an AI “2+2” compare to generating a three-second animation in 1080p? There has to be a wide gamut of energy use per task.
And then the impact would depend on where your energy comes from. Which is a whole other thing, we should be demanding cleaner, more efficient energy sources.
A quick search on energy consumption by AI brings up a list of articles repeating the mantra that it’s substantial, but the sources are vague or non-existent. None provide enough detail to confidently answer any of the above questions.
That’s not to say AI doesn’t consume significant power; it’s that most people don’t regulate their lives by energy consumption.
kadup@lemmy.world 1 day ago
We do have fairly precise numbers of how much energy it takes to train the models using the best GPUs available, and slightly less precise but also reasonable estimates on how much it costs to run servers for users to toy around with.
It’s extremely high, but not different from what it would be like if these were cloud gaming or 3D rendering servers.
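The kind of estimate being described here is basically GPU count × power draw × training time, scaled up for data centre overhead. A minimal sketch, where every input number (GPU count, wattage, duration, PUE) is a made-up illustration rather than a reported figure for any real model:

```python
# Back-of-envelope estimate of training energy.
# All input values below are illustrative assumptions, not reported figures.

def training_energy_mwh(num_gpus: int, watts_per_gpu: float,
                        hours: float, pue: float = 1.2) -> float:
    """Estimate total energy in MWh: GPUs * power * time,
    scaled by the data centre's power usage effectiveness (PUE)."""
    watt_hours = num_gpus * watts_per_gpu * hours * pue
    return watt_hours / 1e6  # Wh -> MWh

# Hypothetical run: 10,000 GPUs at 700 W each, for 30 days straight.
energy = training_energy_mwh(10_000, 700.0, 30 * 24)
print(f"~{energy:,.0f} MWh under these assumptions")  # ~6,048 MWh
```

The same arithmetic applies to a cloud gaming or rendering farm of similar size, which is why the per-facility footprint looks comparable; the real uncertainty is in the inputs, not the formula.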
The main question is usually “is it worth it?”, and that’s highly subjective.