Comment on Grisham, Martin join authors suing OpenAI: “There is nothing fair about this”
ryathal@sh.itjust.works 1 year ago
AI training is taking facts, which aren't subject to copyright, not actual content that is. The original work or a derivative isn't being distributed or copied. While it may be possible for a user to recreate copyrighted material with sufficient prompting, the fact that it's possible is no more relevant than it is for a copy machine. It's the same as an aspiring author reading all of Martin's work for inspiration. They can write a story set in a vaguely medieval England full of rape and murder without paying Martin a dime. What they can't do is call it Westeros, or name the main character Eddard Stark.
There may be an argument that a copy needs to be purchased to extract the facts, but that doesn't require any special license; a used copy of the book would be sufficient.
AI isn't doing anything that humans haven't already been doing for hundreds of years; it's just doing it faster.
BraveSirZaphod@kbin.social 1 year ago
Legally, I think you're basically right on.
I think what will eventually need to happen is society deciding whether this is actually the desired legal state of affairs or not. A pretty strong argument can be made that "just doing it faster" makes an enormous difference on the ultimate impact, such that it may be worth adjusting copyright law to explicitly prohibit AI creation of derivative works, training on copyrighted materials without consent, or some other kinds of restrictions.
I do somewhat fear that, in our continuous pursuit for endless amounts of convenient "content" and entertainment to distract ourselves from the real world, we'll essentially outsource human creativity to AI, and I don't love the idea of a future where no one is creating anything because it's impossible to make a living from it due to literally infinite competition from AI.
ryathal@sh.itjust.works 1 year ago
I think that fear is overblown. AI models are only as good as their training material, so it still takes humans creating new content to keep the models improving. Training AI on AI-generated content doesn't work out well.
Models aren't good enough yet to fully create quality content on their own, and it's not clear that ability is imminent; maybe one day it will be. Right now these tools are really only good for helping a creator with drafts, or for identifying weak parts of a story.
damndotcommie@lemmy.basedcount.com 1 year ago
Which is why I really hate the fact that they and the media have dubbed this "intelligence". Bigger programs and more data don't automatically make something intelligent.
Zormat@lemmy.blahaj.zone 1 year ago
This is such a weird take, imo. We've been calling agent behavior in video games "AI" forever, but suddenly everyone has an issue when the term is applied to LLMs.