You know what the difference is; trying to act otherwise is just being obtuse.
How is this different from a human doing an impersonation?
Laticauda@lemmy.ca 9 months ago
TheGrandNagus@lemmy.world 9 months ago
Can you seriously not answer that question yourself?
tillimarleen@feddit.de 9 months ago
well, you seem to have trouble doing it
stopthatgirl7@kbin.social 9 months ago
You could argue it’s not different, which means that in US law at least, it’s settled and they could be sued.
echodot@feddit.uk 9 months ago
There was a difference between complete duplication and impersonation for the purposes of satire.
Fordiman@programming.dev 9 months ago
Largely? The lack of convincing emotional range.
RizzRustbolt@lemmy.world 9 months ago
Can’t fake timbre.
photonic_sorcerer@lemmy.dbzer0.com 9 months ago
Because it can be done fast, reliably and at scale.
ArmokGoB@lemmy.dbzer0.com 9 months ago
Our entire society would collapse if we couldn’t do things fast, reliably, and at scale.
idiomaddict@feddit.de 9 months ago
Yes, but if “things” is replaced by scamming artists, that’s a shitty society
ArmokGoB@lemmy.dbzer0.com 9 months ago
Artists aren’t being scammed. They’re being replaced by automated systems. It’s the same thing that happened to weavers and glassblowers. The issue isn’t that their job is being automated. It’s that people replaced by automation aren’t compensated. Blame the game, not the players.
BraveSirZaphod@kbin.social 9 months ago
I don't think it's a particularly mentally challenging concept to understand that we're not upset about the general concept of doing things at scale, and that it depends on what the thing in question is.
For instance, you'd probably not be terribly upset about me randomly approaching you on the street once - mildly annoyed at most. You'd probably be much more upset if I followed you around 24/7 every time you entered a public space and kept badgering you.
photonic_sorcerer@lemmy.dbzer0.com 9 months ago
Yes, but this is a new tool with new implications.
cloudy1999@sh.itjust.works 9 months ago
This, and it’s not a human. All these analogies trying to liken a learning algorithm to a learning human are not correct. An LLM is not a human.