Comment on [deleted]

Jomega@lemmy.world 1 year ago

I understand that AI is a complex program and not just pressing buttons. That’s not the issue I have with it. My issue is, what happens when the technology improves significantly? It’s my understanding that LLMs keep improving as they continue to train on (often unethically acquired) data. In its present form, sure, maybe we don’t have to worry. But give it 10 years or so, and how much more competent will it be?

Let’s look at just the film industry for a second. We already have a huge problem with Hollywood churning out franchise films at the expense of everything else. But even these cash cows are made from the vision of someone whose name is attached to them. Somebody got paid to write Halloween 36: The Final Halloween for Real This Time. That person may or may not have given a shit about writing a good story, or they may have just wanted a paycheck. Either way, that paycheck could be used to fund something they care about much more. Once AI reaches the point where it can spit out a passable script, what incentive does Mr. Bigshot the Hollywood producer have to involve a writer at all? And with no writers receiving paychecks, fewer risks get taken in general, because risks don’t guarantee profit.

I might just be letting my anxieties get the better of me, and I really hope I am. I just can’t seem to move past the bad feeling I’m getting from this.
