cross-posted from: lemmy.ml/post/35078393
AI has made the experience of language learners way shittier because now people will just call them AI.
Submitted 7 months ago by HiddenLayer555@lemmy.ml to showerthoughts@lemmy.world
Why? Somebody learning a new language would make mistakes, whereas an AI would make none.
I wouldn’t say no mistakes, just different types of mistakes.
Who fucking cares what other people think…
Every single person on earth. There’s nothing more inherently human than caring about what others think.
I get what you mean though - I also try not to let other people’s judgment affect what I do, but I’d be lying to myself if I claimed it doesn’t, and especially if I claimed I don’t care.
Anyone who has to communicate with other people.
So, everyone.
Exclusively if it’s people you care about or you depend on. Anyone else can royally fuck themselves with their opinions.
As a person who learned English as a second language, I would say probably not. If anything, a human’s grammar/conjugations might be off if they’re learning a new language. A machine, as others have pointed out, would have proper grammar but might be nonsensical.
The way LLMs write compared to people who are just learning a second language is probably going to be significantly different, so I doubt that will be the case. Unless the person learns the second language from an LLM, then yeah.
It’s much more likely that we sound like children than we sound like an LLM.
Children aren’t obsessed with technically correct punctuation, whereas LLMs are. For example, they just love to use em dashes — like this.
Nah. If I’m learning a new language, I’m going to speak like a toddler at first. I’m more likely to be accused of that than of being an LLM that writes long paragraphs with minimal accuracy.
Nah, there’s a difference between chatbot and not knowing a language.
It’s predictive text, so grammar is actually one of its strong suits. The problem is the words don’t actually mean anything.
Someone online would likely spend a little time trying to understand it, run it through a translator, realize it was slop, and move on relatively quickly.
Flax_vert@feddit.uk 7 months ago
Wouldn’t someone make mistakes unlike AI?
x00z@lemmy.world 7 months ago
What if you use AI to translate some text because you can’t express yourself well enough yet?
Flax_vert@feddit.uk 7 months ago
I think translation machines have always used similar language models
Apytele@sh.itjust.works 7 months ago
faythofdragons@slrpnk.net 7 months ago
What if we had two circulatory systems, one for blood, and one for pesto?
Flax_vert@feddit.uk 7 months ago
Haven’t machine translators always been some form of language model? Pretty sure the technique was invented by training a system on pairs of human translations of the same document/work.