Comment on What could go wrong?
theneverfox@pawb.social 1 day ago
This isn’t a new thing; people have been going off alone on this kind of nonsensical journey for a while now
The time cube guy comes to mind
There’s also TempleOS, written in HolyC; its creator was close to some of the stuff in the article
And these are just two people functional and loud enough to be heard. This is a thing that happens. Maybe LLMs exacerbate a pre-existing condition, but people were going off the deep end like this long before LLMs came into the picture
DeceasedPassenger@lemmy.world 1 day ago
Your point is not only valid but significant, and I feel it stands in addition to mine, not in contradiction. These people now have something to continuously bounce ideas off: a conversational partner that never says no. A perpetual yes-man. The models are heavily biased toward the affirmative simply by nature of what they are, predicting what comes next. You may or may not know that improv acting has a principle called “yes, and,” which serves to keep a scene always moving forward. These models exist in that state, in perpetuity.
Previously, people with ideas like these would experience near-universal rejection from those around them (unless they had the charisma to start a cult), which resulted in a (relatively, imo) small number of extreme cases. I fear the presence of such a perpetual yes-man will only accelerate every kind of damage that can emerge from nonsensical thinking.
theneverfox@pawb.social 17 hours ago
I agree, it’s certainly not going to help people who are losing touch. But that’s not what worries me - that’s a small slice of the population, and models are beginning to get better at rejection/assertion
What I’m more worried about is the people using it almost codependently to make decisions. It’s always there, and it’ll always give you advice. Usually it’s even somewhat decent advice. And talking through decisions with someone is a normal thing to do
The problem is people are offloading their thinking to AI. It’s always there, it’s always patient with you… You can literally have it make every life decision for you.
It’s not emotional connection or malicious AI I worry about. You can now walk around with a magic eight ball that can guide you through life reasonably well, and people are starting to trust it above their own judgement