Comment on Avoiding AI is hard – but our freedom to opt out must be protected
underline960@sh.itjust.works 3 weeks ago
I doubt we’ll ever be offered a real opt-out option.
Instead, I’m encouraged by the development of poison pills for the AI models that are non-consensually harvesting human art (Glaze and Nightshade) and music (HarmonyCloak).
Loduz_247@lemmy.world 3 weeks ago
But do Glaze, Nightshade, and HarmonyCloak really work to prevent that information from being used? They may be effective at first, but then companies will find ways around those barriers, the software will have to be updated, and only the one with the most money will win.
underline960@sh.itjust.works 3 weeks ago
It’s not guaranteed.
AI is a venture capital money pit, and they are struggling to monetize before the hype dies out.
If the poison pills work as intended, investors will stop funding “creative” AI when the new models stop getting better (and sometimes get worse) because they’re running out of clean content to steal.
Loduz_247@lemmy.world 3 weeks ago
AI has been around for many years, dating back to the 1960s. It’s had its AI winters and AI summers, but now it seems we’re in an AI spring.
But the amount of poisoned data is minuscule compared to the data that isn’t poisoned. And what data are we referring to: everything in general, or just data that a human can understand?
Zenith@lemm.ee 3 weeks ago
I’ve deleted pretty much all social media; I’m down to only Lemmy. I only use my home PC for gaming (Civ, Cities: Skylines) and for search engines for things like travel plans. I’m trying to be as offline as possible, because I don’t believe there’s any other way to opt out, and I don’t believe there ever will be. Just as opting out of the internet is practically impossible, AI will get to that point as well.
T156@lemmy.world 3 weeks ago
Remind me in 3 days.
Although poison pills are only so effective, since it’s a cat-and-mouse game: they only really work against a specific version of a model, and other models work around them.