Comment on Data poisoning: how artists are sabotaging AI to take revenge on image generators
cm0002@lemmy.world 10 months ago
"they create statistical models based on input data."
"Any output from a model trained on material that they don’t have copyright for is a violation of copyright"
There’s no copyright violation. You said it yourself: any output is just the result of a statistical model, and anything derived from the original art would fall under fair use as a derivative work (if it falls under copyright at all)
BURN@lemmy.world 10 months ago
Considering most models can spit out training data, that’s not a true statement. Training data may not be explicitly saved, but it can be retrieved from these models.
Existing copyright law can’t be applied here because it doesn’t cover something like this.
It 100% should be a copyright infringement for every image generated using the stolen work of others.
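As a minimal sketch of the memorization claim above (not from the thread): one way to probe whether an image model has effectively stored a training image is to prompt it with a caption known to be in its training set and compare the output to the original using a perceptual hash. The model id, prompt, and file path below are placeholders, and the Hugging Face diffusers and imagehash libraries are assumed to be installed.

```python
# Hypothetical memorization check: prompt a diffusion model with a caption
# believed to be in its training data and measure how close the output is
# to the original image. Model id, prompt, and path are placeholders.
import torch
from diffusers import StableDiffusionPipeline
from PIL import Image
import imagehash

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint; swap in the model under test
    torch_dtype=torch.float16,
).to("cuda")

prompt = "a caption copied verbatim from the training set"  # hypothetical
generated = pipe(prompt, num_inference_steps=30).images[0]

original = Image.open("training_image.png")  # hypothetical local copy of the training image

# Hamming distance between perceptual hashes: a small value means the output
# is a near-duplicate of the training image rather than a novel composition.
distance = imagehash.phash(generated) - imagehash.phash(original)
print(f"perceptual hash distance: {distance}")
```

A distance near zero would support the "spits out training data" claim for that particular prompt; most prompts will not reproduce a training image this directly, which is roughly where the two commenters disagree.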
cm0002@lemmy.world 10 months ago
You can get it to spit out something very close, maybe even an exact copy, depending on how much of your art was used in the training (because that would make your style influence the weights and the model more)
But that’s no different than me tracing your art, or taking samples of your art to someone else and paying them to make an exact copy; in that case that specific output is a copyright violation. Just because it can do that doesn’t mean every output is suddenly a copyright violation.
BURN@lemmy.world 10 months ago
However, since it’s required to use all of the illegally obtained and unlicensed work to create it, it is a copyright violation, just as tracing over something would be. Again, existing copyright law cannot be applied here because this technology works in a vastly different way than a human artist does.
A hard line has to be drawn that will protect artists. I’d prefer it go even further in protecting individual copyright while weakening overall copyright for corporate owners.