cloudy1999@sh.itjust.works 1 year ago
Because it can be done fast, reliably and at scale.
This, and it’s not a human. All these analogies trying to liken a learning algorithm to a learning human are not correct. An LLM is not a human.
ArmokGoB@lemmy.dbzer0.com 1 year ago
Our entire society would collapse if we couldn’t do things fast, reliably, and at scale.
idiomaddict@feddit.de 1 year ago
Yes, but if “things” is replaced by scamming artists, that’s a shitty society
ArmokGoB@lemmy.dbzer0.com 1 year ago
Artists aren’t being scammed. They’re being replaced by automated systems. It’s the same thing that happened to weavers and glassblowers. The issue isn’t that their job is being automated. It’s that people replaced by automation aren’t compensated. Blame the game, not the players.
gregorum@lemm.ee 1 year ago
It’s much closer to having glass-blowing artists’ designs perfectly replicated in an automated fashion, and at scale. I would argue that is tantamount to being scammed.
BraveSirZaphod@kbin.social 1 year ago
I don't think it's a particularly mentally challenging concept to understand that we're not upset about the general concept of doing things at scale, and that it depends on what the thing in question is.
For instance, you'd probably not be terribly upset about me randomly approaching you on the street once - mildly annoyed at most. You'd probably be much more upset if I followed you around 24/7 every time you entered a public space and kept badgering you.
photonic_sorcerer@lemmy.dbzer0.com 1 year ago
Yes, but this is a new tool with new implications.