Comment on Users dodge A.I. filters to generate pictures of pop-culture characters performing terrorism
Fisk400@feddit.nu 1 year ago
How do the AIs even know what a Kirby is? Kirby is copyrighted and I doubt Microsoft got permission from Nintendo to include images of Kirby in their training data.
I_Clean_Here@lemmy.world 1 year ago
How does it “know” anything? Same for the Kirb Meister.
Fisk400@feddit.nu 1 year ago
Yes, that is the problem that I am highlighting with the question.
Also, the people generating these aren’t writing “Kirby doing 9/11”. They are writing “Kirby sitting in a cockpit with two skyscrapers outside the window”.
There are stock photos of all those things that you can buy, but there are no legal stock photo assets of Kirby.
bioemerl@kbin.social 1 year ago
You shouldn't need permission to include images in training data.
Fisk400@feddit.nu 1 year ago
You should if you plan on making money on it. I assume that Microsoft does.
bioemerl@kbin.social 1 year ago
You should have to actually pay for works that cost money, but authors don’t have the right to nitpick and fine-tune the ways their stuff is used, so long as those uses aren’t copying and redistributing their work.
Authors have tried to use copyright for excessive control many times and have been shot down every single time, with things like web scraping and search engines. Demanding payment for AI training specifically is a massive grift and overreach from a small group of people trying to hold back progress so they can make a quick buck.
Fisk400@feddit.nu 1 year ago
It’s fun to see that the only way for AI tech bros to approach the copyright problem is to claim that it’s a Jewish conspiracy that shouldn’t exist.
EternalNicodemus@lemmy.world 1 year ago
Based as hell ngl
vector_zero@lemmy.world 1 year ago
Why not? If you fed the entirety of a given IP into a model (say, every frame of every Star Wars film or show), you could train an AI to produce imagery derived exclusively from copyrighted material.
NounsAndWords@lemmy.world 1 year ago
I think the final product and the ideas and concepts that it holds are the important aspect for copyright.
If I cut up a Star Wars poster into 1,000,000 tiny pieces, and then reassemble them into a self portrait with no reference whatsoever to Star Wars and sell it, would I have committed copyright infringement?
If I did the same thing but made a stormtrooper out of the pieces, is the copyright issue with the source material, or the final product?
Flaky@iusearchlinux.fyi 1 year ago
You raise a good point, but I really don’t care about Nintendo’s feelings.
Whirling_Cloudburst@lemmy.world 1 year ago
This is probably the most valid concern really. I remember people drawing Yogi Bear wearing a Gestapo uniform and other such absurdities back in high school art class. The only thing that has changed is that you don’t need the same level of skill to be an attention seeker in a place where the public can be found.
I’m more concerned about people making convincing false narratives on cable news than fictional characters doing things that are in very poor taste.
Hubi@feddit.de 1 year ago
DALL-E was intentionally trained on a ton of pop culture and celebrity imagery. Copyright laws are not ready to deal with these scenarios, and the current approach of most companies is to offload the legal responsibility for generated images to their users.
Zoldyck@lemmy.world 1 year ago
Unofficial fan-made things perhaps?
Fisk400@feddit.nu 1 year ago
Nintendo still owns the trademark on those and the creator owns the copyright.
BreadstickNinja@lemmy.world 1 year ago
Copyright law does not explicitly cover inclusion in a training data set, though that will be tested by a number of court cases currently underway.
Copyright historically is exactly what it says in its name: the right to copy or reproduce an image for commercial purposes. Because an AI doesn’t reproduce the images in its training data set, and because AI generation models do not include the image data of their training set, it’s not explicitly covered.
My personal opinion is that copyright law would need to be updated to cover the training data case, but the courts could circumvent that and declare it covered under existing law. That would be based on a misunderstanding of how image generation works, but courts don’t always necessarily act based on technical understanding.
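For a rough sense of why a generation model can’t be storing its training images verbatim, here is a back-of-the-envelope sketch in Python. The figures are assumed ballpark numbers (roughly Stable-Diffusion/LAION scale: a few gigabytes of weights, a couple of billion training images), not anything stated in this thread:

```python
# Back-of-the-envelope: could a diffusion model "contain" its training images?
# Assumed ballpark figures, not numbers from the thread:
model_bytes = 4 * 10**9        # ~4 GB of model weights
num_images = 2 * 10**9         # ~2 billion training images
avg_image_bytes = 500 * 10**3  # ~500 KB per source image on average

bytes_per_image = model_bytes / num_images
print(f"Weights available per training image: {bytes_per_image:.1f} bytes")
print(f"Average source image size: {avg_image_bytes:,} bytes")
print(f"Compression verbatim storage would require: {avg_image_bytes / bytes_per_image:,.0f}x")
```

Under those assumptions the model has about 2 bytes of weights per training image, so whatever it learns, it is not a copy of the images themselves.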
Buttons@programming.dev 1 year ago
And if the courts rule in a way that limits US companies’ ability to train AIs, then I hope they practice their shocked Pikachu faces for when people start using AIs created in countries that don’t care about US law.