Comment on Tool preventing AI mimicry cracked; artists wonder what’s next
pennomi@lemmy.world 8 months ago
Glaze has always been fundamentally flawed and a short-term bandage. There’s no way you can make something appear correctly to a human and incorrectly to a computer over the long term - researchers will simply retrain on the new data.
admin@lemmy.my-box.dev 8 months ago
Agreed. It was fun as a thought exercise, but this failure was inevitable from the start. Ironically, the existence and usage of such tools will only hasten their obsolescence.
The only thing that would really help is GDPR-like fines (assessed as a percentage of revenue, not profits) for any company that trains, or willingly uses, models trained on data without explicit consent from its creator.
FaceDeer@fedia.io 8 months ago
That would "help" by basically introducing the concept of copyright to styles and ideas, which I think would likely have more devastating consequences to art than any AI could possibly inflict.
admin@lemmy.my-box.dev 8 months ago
No, just the concept of getting a say in who can train AIs on your creations.
So yes, that would leave room for a loophole where a human could recreate your creation (without just making a copy), and they could then train their model on that. It isn’t watertight. But it doesn’t need to be, just better than what we have now.
FaceDeer@fedia.io 8 months ago
That's the same thing. Whatever you want to call it, "copyright" or some other word, the end result is that you want to give people the right to control other people's ability to analyze the things they see on public display, and to control what general concepts other people put into future works.
I really don't see how going in that direction is going to lead to a better situation than we have now. Frankly it looks more like a path to a nightmarish corporate-controlled dystopia to me.