You’re not a business whose sole purpose is to sell/license images. If you read the article, it explains that their models are trained using only images from their library, which seems like a sensible approach to avoiding copyright issues.
p03locke@lemmy.dbzer0.com 1 year ago
There are no copyright issues to avoid. Stable Diffusion doesn’t suddenly become illegal based on the images it trains on. It is a 4GB file of weights and numbers, not a many-petabyte database of images.
Furthermore, Shutterstock cannot copyright its own AI-generated images, no matter how much it wants to sell them back to people. That’s already been decided in the courts. So even if the AI is trained on their own images, if an image was fully generated by their own AI, anybody is free to yank it from their site and use it anywhere they want.
This is a dying industry trying desperately to hold on to its profit model.
TwilightVulpine@lemmy.world 1 year ago
Here we get the very crucial definition between “legal” and “moral”.
It is not currently illegal to build a “database of weights and numbers” by crawling arts and images without permission, attribution or compensation, for the express purpose of creating similar works to replace the work of the artists whose artworks were used to train it and which they rely on to make a living.
That doesn’t mean that it shouldn’t be legislated.
Really not a fan of this “dying industry” talk in light of this.
Even_Adder@lemmy.dbzer0.com 1 year ago
It is morally right to be able to use copyrighted material without permission for analysis, criticism, research, satire, parody and artistic expression like literature, art, music. In the US, fair use balances the interests of copyright holders with the public’s right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. It would be awful for everyone if IP holders could take down any review, reverse engineering, or indexes they didn’t like. That would be the dream of every corporation, bully, troll, or wannabe autocrat. It really shouldn’t be legislated.
AI training isn’t only for mega-corporations. After we go through and gut all of our protections like too many people want to do, we’ll hand corporations a monopoly on a public technology by making it prohibitively expensive for us to keep developing our own models. Mega-corporations will still have all their datasets, and the money to buy more. They might just make users sign predatory ToS too, allowing them exclusive access to user data, effectively selling our own data back to us. People who could have had access to a corporate-independent tool for creativity, education, entertainment, and social mobility would instead be worse off, with fewer resources and rights than they started with.
I recommend reading this article by Kit Walsh, a senior staff attorney at the EFF, if you haven’t already. The EFF is a digital rights group that most recently won a historic case: border guards now need a warrant to search your phone.
You should also read this open letter by artists that have been using generative AI for years, some for decades. I’d like to hear your thoughts.
TwilightVulpine@lemmy.world 1 year ago
I have read that article and I have found it sorely insufficient at addressing the concerns of the artists who are having to deal with this new situation. The EFF is usually great but I cannot agree with them on this stance.
You speak of “IP holders” and “corporations”, seemingly to give a connotation of overbearing nameless organizations to any attempt at legislation, but you don’t have a single word to say about the independent artists who are being driven out of their artistic careers by this. It doesn’t sound like you even considered what their side is like, just that you decided that it’s “morally right” to have free access to everyone’s works for AI training.
How fair is the “Fair Use” that lets artists get replaced by AIs trained on their own works? Way too often AI proponents argue from current legal definitions as if this were merely a matter of philosophical mind games rather than people’s lives. The law exists to ensure people’s rights and well-being. It’s not sufficient for something to fit the letter of the law if we want to judge it as just.
I did read the open letter, although I already wasn’t expecting much, and I can only find it sappy, shallow and disingenuous. They may say that they don’t care about using AI to replicate others’ works, but not only is that insufficient to prevent it, it doesn’t address all the artists’ works that were still used without permission, attribution or compensation, even if the resulting AI is used to produce works that don’t resemble any other work in particular.
But this has already failed. AI has already been developed and released irresponsibly. Corporations are already using it to exploit artists’ labor. Many major models are themselves an exploitation of artists’ labor. These are hollow words that don’t even suggest a way to address the matter.
There is only one thing I want to hear from AI advocates if they intend to justify it. Not legal wording, or technical details, or philosophical discussions about the nature of creativity, because ultimately those don’t address the material issues. Rather: how do they propose that the artists whose works they relied on ought to be supported? Because to scrape all their work and then turn around and say they are fated to be replaced, as many AI proponents do, is horribly callous, ungrateful, and potentially more damaging to culture than any licensing requirement would be.
Zehzin@lemmy.world 1 year ago
If that’s correct, then it’s even more understandable why they wouldn’t want pictures anyone can use for free on a service whose whole business is selling pictures.