Comment on Former Meta exec says asking for artist permission will kill AI industry
Even_Adder@lemmy.dbzer0.com 5 days ago
But the law is largely the reverse. It only denies use of copyrighted works in certain ways. Using things “without permission” forms the bedrock on which artistic expression and free speech are built.
AI training isn’t only for mega-corporations. Setting up barriers that benefit only the ultra-wealthy will end with corporations gaining a monopoly on a public technology by making it prohibitively expensive and cumbersome for regular folks. What the people writing this article want would mean the end of open access to competitive, corporate-independent tools and would jeopardize research, reviews, reverse engineering, and even indexing information. They want you to believe that analyzing things without permission somehow goes against copyright, when in reality, fair use is part of copyright law and the reason our discourse isn’t wholly controlled by mega-corporations and the rich.
I recommend reading this article by Kit Walsh and this one by Tory Noble, both staff attorneys at the EFF, this one by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries, and these two by Cory Doctorow.
deur@feddit.nl 5 days ago
despoticruin@lemm.ee 5 days ago
Yeah, anyone who thinks stealing content explicitly for financial gain is fair use needs their head checked.
tane6@lemm.ee 4 days ago
Insanely weak they deleted this
tane6@lemm.ee 4 days ago
The fact that this is upvoted is so funny but unsurprising given the types who visit this site
ICastFist@programming.dev 4 days ago
Ok, but is training an AI so it can plagiarize, often verbatim or with extreme visual accuracy, fair use? I see the first two articles argue that it is, but they don’t mention the many cases where the crawlers and scrapers ignored rules set up to tell them to piss off. That would certainly invalidate several cases of fair use.
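(For context, the “rules” being ignored here are presumably robots.txt directives. Below is a minimal sketch of the check a well-behaved crawler would make before fetching a page, using Python’s standard urllib.robotparser; the site URL and the user-agent name are hypothetical placeholders, not any real crawler.)

```python
# Hypothetical sketch: a polite crawler consults robots.txt before fetching.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # downloads and parses the site's robots.txt

user_agent = "ExampleTrainingBot"  # placeholder crawler name
page = "https://example.com/gallery/artwork-123"

if rp.can_fetch(user_agent, page):
    print("robots.txt allows fetching", page)
else:
    # The complaint in this thread is that some scrapers skip this check,
    # or ignore its answer, and fetch the page anyway.
    print("robots.txt disallows fetching", page)
```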
Instead of charging for everything they scrape, the law should force them to release all their data and training sets for free. “But they spent money and time and resources!” So did everyone who created the stuff they’re using for their training, so they can fuck off.
The article by Tory also says these things:
I’d wager 99.9% of the art and content created by AI could go straight to the trashcan and nobody would miss it. Comparing AI to the internet is like comparing writing to doing drugs.
Even_Adder@lemmy.dbzer0.com 4 days ago
You can plagiarize with copy & paste on a computer too. That doesn’t change the fact that computers have legitimate non-infringing use cases.
I agree
But 99.9% of the internet is stuff that no one would miss. Things don’t have to have value to you to be worth having around. That trash could serve as inspiration for your 0.1%, or garner feedback that helps people improve.
ICastFist@programming.dev 4 days ago
The apparent main use for AI thus far is spam and scams, which is what I was thinking about when dismissing most content made with it. While the internet was already chock full of that before AI, its availability is increasing those problems tenfold.
Yes, people use it for other things, like “art”, but most people using it for “art” are trying to make a quick buck before customers get too smart to fall for it. Writers already had a hard time getting by; now they have to deal with a never-ending deluge of AI books, plus the risk of a legally-distinct-enough copy of their work showing up the next day.
To put it another way, the major use of AI thus far is “I want to make money without effort.”
Even_Adder@lemmy.dbzer0.com 4 days ago
It definitely seems that way depending on what media you choose to consume. You should try to balance the doomer scroll with actual research and open source news.
skulblaka@sh.itjust.works 4 days ago
I don’t really disagree with your other two points, but
They sure do, but that is not one of them. That’s de facto copyright infringement or plagiarism, especially if you then turn around and sell that product.
8uurg@lemmy.world 4 days ago
The key point being made is that if you commit de facto copyright infringement or plagiarism by creating a copy, it shouldn’t matter whether that copy was made through copy-paste, by re-compressing the same image, or by using an AI model. The product here is the copy-paste operation, the image editor, or the AI model, not the (copyrighted) image itself. You can still sell computers with copy-paste (despite some attempts from large copyright holders with DRM), and you can still sell image editors.
However, unlike copy-paste and the image editor, an AI model can memorize and emit training data even when the input doesn’t reference the copyrighted work. (This excludes cases where the image itself, or a highly detailed description of it, was provided as input, since then it is clearly the user who is at fault and intended this to happen.)
At the same time, exact replication of training data isn’t desirable in any case, and online services for image generation could include an image similarity check against training data; many probably do this already.
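As a rough illustration of what such a check could look like, here is a minimal sketch using a simple perceptual average hash built on Pillow. The 8x8 hash size, the distance threshold, and the function names are illustrative assumptions, not how any particular service actually implements it.

```python
# Hypothetical sketch: flag generated images that are near-duplicates of
# training images, using a simple 64-bit average hash (aHash).
from PIL import Image  # pip install Pillow


def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size grayscale, then set a bit for each pixel
    brighter than the mean. Similar images produce similar bit patterns."""
    img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for i, p in enumerate(pixels):
        if p > mean:
            bits |= 1 << i
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes (0 = identical hashes)."""
    return bin(a ^ b).count("1")


def looks_memorized(generated: str, training_hashes: list[int], threshold: int = 5) -> bool:
    """Return True if the generated image is within `threshold` bits of any
    known training-image hash. The threshold value is illustrative."""
    gen_hash = average_hash(generated)
    return any(hamming_distance(gen_hash, h) <= threshold for h in training_hashes)


# Usage sketch: precompute hashes for the training set once, then check
# each generated image before returning it to the user.
# training_hashes = [average_hash(p) for p in ["train/cat.png", "train/dog.png"]]
# if looks_memorized("output.png", training_hashes):
#     ...  # regenerate, or block the output
```

In practice a service would more likely use learned embeddings rather than a plain average hash, since the hash only catches near-pixel-identical copies, but the precompute-then-compare structure would be similar.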