Comment on The Irony of 'You Wouldn't Download a Car' Making a Comeback in AI Debates
Floey@lemm.ee 2 months ago
While I agree that using copyrighted material to train your model is not theft, text that model produces can very much be plagiarism and OpenAI should be on the hook when it occurs.
overload@sopuli.xyz 2 months ago
Exactly, there are blatant examples of direct plagiarism spat out by these LLMs.
freeman@sh.itjust.works 2 months ago
Operating systems have been used to commit copyright infringement far more effectively, and on a far larger scale, by copying copyrighted material verbatim.
OS vendors are not liable; the people who make and distribute the copies are. The same applies to word processors, image editors, etc.
You are arguing for a massive expansion of the scope of copyright, one that would limit the freedoms of the general public, not just AI corps or tech corps.
protist@mander.xyz 2 months ago
Using your logic, the one making the copy in a word processor is the person typing, and the one making the copy in an LLM is OpenAI.
freeman@sh.itjust.works 2 months ago
Nope. The output is based on the user’s input in both cases.
protist@mander.xyz 2 months ago
No, the output of a word processor is explicitly user input, whereas the output of an LLM is based on user input plus the training data OpenAI scraped.
leftzero@lemmynsfw.com 2 months ago
OS vendors aren’t selling¹ what users copy into the clipboard.
¹ Well, Microsoft probably is, especially with that Recall bullshit, and I don’t trust Google and Apple not to do it either… but if any of them is doing it, they should get fined into bankruptcy.
freeman@sh.itjust.works 2 months ago
Neither are AI vendors. We have locally hosted AI models, and they don’t contain what they output. You can tell by their size alone.
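A rough back-of-the-envelope sketch makes the size argument concrete. All figures below are illustrative assumptions (a hypothetical 70B-parameter model trained on ~15 trillion tokens), not measurements of any particular model:

```python
# Sketch: a model's weights are orders of magnitude smaller than its
# training corpus, so it cannot contain that corpus verbatim.
# All numbers are assumed, illustrative figures.

params = 70e9            # assumption: 70 billion parameters
bytes_per_param = 2      # assumption: 16-bit weights
model_size_tb = params * bytes_per_param / 1e12

tokens_trained = 15e12   # assumption: ~15 trillion training tokens
bytes_per_token = 4      # assumption: ~4 bytes of text per token
data_size_tb = tokens_trained * bytes_per_token / 1e12

print(f"model weights: {model_size_tb:.2f} TB")  # 0.14 TB
print(f"training data: {data_size_tb:.2f} TB")   # 60.00 TB
print(f"ratio: {data_size_tb / model_size_tb:.0f}x")  # 429x
```

Under these assumptions the weights are hundreds of times smaller than the training text, which is why verbatim storage of the corpus is implausible (though short memorized passages can still surface).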
Floey@lemm.ee 2 months ago
Those analogies don’t make any sense.
Anyway, as a publisher, if I cannot get OpenAI/ChatGPT to sign an indemnity agreement making them liable for plagiarism, then their tool is effectively useless, because it is really hard to prove that something is not plagiarism. That makes ChatGPT pretty sus for creatives to use. So who is going to pay for it?
freeman@sh.itjust.works 2 months ago
Yes they do.
Which is why you would want an agreement making them liable for copyright infringement (plagiarism is not itself a crime).
You would have to pay for distributing copyright infringing material whether created by AI or humans or just straight up copied.
I don’t care if AI will be used, commercially or otherwise.
I am worried about further limitations being placed upon the general public (not “creatives”, publishers, or AI corps), whether by reinterpretation of existing laws, amendment of existing laws, or legislation of brand-new rights (for copyright holders/creators, not the general public).
I don’t even care who wins, the “creatives” or tech/AI, just that we don’t get further shafted.
Floey@lemm.ee 2 months ago
Something like Microsoft Word or Paint is not generative.
It is standard for publishers to make indemnity agreements with creatives who produce for them, because like I said, it’s kinda difficult to prove plagiarism in the negative so a publisher doesn’t want to take the risk of distributing works where originality cannot be verified.
I’m not arguing that we should change any laws, just that people should not use these tools for commercial purposes if the producers of these tools will not take liability, because if they refuse to do so their tools are very risky to use.
I don’t see how my position affects the general public who aren’t using these tools; it’s purely about the relationship between creatives and publishers using AI tools, and what they should expect and demand.