Is it a conspiracy? For months, YouTubers have been quietly griping that something looked off in their recent video uploads. Following a deeper analysis by a popular music channel, Google has now confirmed that it has been testing a feature that uses AI to artificially enhance videos. The company claims this is part of its effort to “provide the best video quality,” but it’s odd that it began doing so without notifying creators or offering any way to opt out of the experiment.
Tinkering in this way with someone's work should be regarded as copyright infringement regardless of what their TOS say. It shouldn't be allowed, and YouTube should pay up big time for this theft. But of course this is probably not how it's going to be handled, because Google is mega rich and somehow the law does not apply to AI.
Chozo@fedia.io 3 weeks ago
It's actually a big difference. "AI" is an almost meaningless term without specifying what type of AI it is. ChatGPT is an AI, Sora is an AI, the "magic eraser" in your photos app is an AI, the AOL chatbot "SmarterChild" was also AI. "AI" can mean almost anything even remotely adjacent to "machine learning" right now. Just calling a tool "AI" says literally nothing about what the tool is or what it does. This sort of reductive, dismissive attitude in tech articles toward anything the author doesn't understand is getting really worrying lately.
Flagstaff@programming.dev 3 weeks ago
Real “AI” doesn’t exist anyway. We may as well call it Algorithmic Idiocy.
cecilkorik@piefed.ca 2 weeks ago
And that's why it's become a meaningless term. If that's what we're being told we're supposed to call it, then that's what we're going to call it. Blame greedy companies' marketing for that, not the journalists and people trying to make sense of the endless stream of meaningless garbage pouring out of these so-called "AI" companies.
Soup@lemmy.world 2 weeks ago
Like when all the music “radio stations” effectively changed nothing but started calling their algorithms “AI” to follow the hype.