Comment on This new data poisoning tool lets artists fight back against generative AI

MamboGator@lemmy.world 1 year ago

This is cool. I think generative AI is great, but the way it’s being trained right now, largely without consent from the artists or subjects, is unequivocally unethical. Until the law catches up with the technology, people need ways of protecting themselves.

9thSun@midwest.social 1 year ago

How is training AI with art on the web different to a person studying art styles? I’d say if the AI is being monetized in some capacity, then sure, maybe there should be laws in place. I’m just hard-pressed to believe that anyone can have sole control of anything once it gets on the Internet.

Zeth0s@lemmy.world 1 year ago
I work in AI and I believe it is different. Society is built to distribute wealth, so that everyone can live a decent life. People and AI should be treated differently before the law, and non-commercial, open-source AI should be treated differently from commercial or closed-source models.
vidarh@lemmy.stad.social 1 year ago
Society is built to distribute wealth, so that everyone can live a decent life.
As a goal, I admire it, but if you intend this as a description of how things are, it’d be boundlessly naive.
Zeth0s@lemmy.world 1 year ago
That’s absolutely not how it is now, just the goal we should set for ourselves. A goal we should consider when regulating AI.
realharo@lemm.ee 1 year ago
How is training AI with art on the web different to a person studying art styles?
Human brains clearly work differently than AI, how is this even a question?
The term “learning” in machine learning is mainly a metaphor.
vidarh@lemmy.stad.social 1 year ago
Human brains clearly work differently than AI, how is this even a question?
It’s not all that clear that those differences are qualitatively meaningful, but that is irrelevant to the question they asked, so this is entirely a strawman.
Why would the way an AI learns versus the way the brain learns make training AI with art different to a person studying art styles? Both learn to generalise features that allow them to reproduce them. Both can do so without copying specific source material.
The term “learning” in machine learning is mainly a metaphor.
How do the way they learn differ from how humans learn? They generalise. They form “world models” of how information relates. They extrapolate.
Also, laws are written with a practical purpose in mind - they are not some universal, purely philosophical construct and never have been.
This is the only uncontroversial part of your answer. The main reason why courts will treat human and AI actions differently is simply that AIs are not human. For the foreseeable future it will have little to do with whether the processes are similar enough to how humans do it.
realharo@lemm.ee 1 year ago
Now you’re just cherry picking some surface-level similarities.
You can see the difference in the process in the results, for example in how some generated pictures will contain something like a signature in the corner, simply because it resembles the training data. Or how it is at least possible to get the model to output something extremely close to the training data - gizmodo.com/ai-art-generators-ai-copyright-stable….
That at least proves that the process is quite different to the process of human learning.
The question is how much those differences matter, and which similarities you want to focus on.
FooBarrington@lemmy.world 1 year ago
I agree that the training isn’t fundamentally different, but the monetization of the output has to be controlled. The big difference between AI and humans is the speed with which they create: you have to employ an army of humans to match the output of a couple of GPUs. For noncommercial projects this is amazing. For commercial projects, it destroys artists’ livelihoods.
But this simply means that training shouldn’t be controlled, inference in commercial contexts should be.
rhombus@sh.itjust.works 1 year ago
The real issue comes in ownership of the AI models and the vast amount of labor involved in the training data. It’s taking what is probably hundreds of thousands of hours of labor in the form of art and converting it into a proprietary machine, all without compensating the artists involved. Whether you can make a comparison to a human studying art is irrelevant, because a corporation can’t own an artist, but they can own an AI and not have to pay it.
regbin_@lemmy.world 1 year ago
Disagree. It’s only unethical if you use it to generate the artist’s piece and claim it as yours.
MamboGator@lemmy.world 1 year ago

[deleted]

9thSun@midwest.social 1 year ago
I don’t see how AI training couldn’t be considered transformative as the whole idea is to consume input, break it down into data, and output something new. The way I’m understanding what you’re saying is like this: Instead of only paying royalties when I try to monetize a cover song, I’d have to pay every time I practiced it.
ElectroVagrant@lemmy.world 1 year ago
Until the law catches up with the technology, people need ways of protecting themselves.
I agree, and I wonder if the law might be kicked into catching up quicker as more companies try to adopt these tools and inadvertently infringe on other companies’ copyrighted material. 😅
0xD@infosec.pub 1 year ago
I don’t see a problem with it training on all materials, fuck copyright. I see the problem in it infringing on everyone’s copyright and then being proprietary, monetized bullshit.
If it trains on an open dataset, it must be completely and fully open. Everything else is peak capitalism.
Smoogs@lemmy.world 1 year ago
You’re not owed nor entitled to an artist’s time and work for free.
Turun@feddit.de 1 year ago
Of course not.
Technically the issue only arises because these images are accessible on the internet in the first place, so in a way the artist made the choice to make the image public. This does not grant a license to everyone who looks at it, but whether a license is required to train a model is unclear and is currently being discussed in court.
kayrae_42@lemmy.world 1 year ago
The problem is that the only way for artists to get people to see and eventually buy their art or commissions is to post some of their work publicly. Historically you would go out on the street and set up a stall; now social media is our digital street. Galleries don’t take everyone, and even getting a meeting with one is difficult without the right connections. Most artists are never successful enough to live entirely off their art; if they can make any money at all, that’s great for them.

Then along comes an AI model that takes their work because it’s on the internet, scrapes it into its training set, and now any chance they had in an oversaturated market is even smaller, because hey, I can just do this with AI. This idea that copyright and IP shouldn’t exist at all is kind of absurd.

Would you go through a street art walk, take high-res photos of every picture on display, not take any business cards, and when they ask what you are doing, go “it’s ok, I’m training an AI data model so people can just make work that looks exactly like this. They shouldn’t have to ever buy from you. Capitalism is a joke. Bye!”? The art walk was free, but it was also a sales pitch, because that’s how the art world works. You are hoping to get seen, that someone likes it enough to buy, and maybe buy more.
barsoap@lemm.ee 1 year ago
I am perfectly entitled to type random stuff into google images, pick out images for a mood board and some as reference, regardless of their copyright status, thank you.
It’s what every artist does, it’s perfectly legal, and what those models do is actually even less infringing, because they’re not directly looking at your picture of a giraffe and my picture of a zebra when drawing a zebra-striped giraffe; they’re doing it from memory. At least in the text2img case; img2img is a different matter.
Smoogs@lemmy.world 1 year ago
Art takes effort. You’re not entitled to that for free.