Yeah, those also can’t think, and that won’t change soon.
The real problem, though, isn’t whether an LLM can think or not; it’s that people will interact with it as if it can, and will let it do the decision making even when it’s not far from throwing dice.
realitista@lemmus.org 2 days ago
I don’t think people primarily want to use it for decision making. For me it just turbocharges research: compiling stuff quickly from many sources, writing code for small modules quite well, generating images for presentations, doing more complex data munging from spreadsheets, etc. It even saved me a bunch of time by near-perfectly converting a 50-page handwritten ledger to Excel. None of that requires decision making, but it saves a bunch of time. Honestly I’ve never asked it to make a decision, so I have no idea how it would perform… I suspect it would describe the pros and cons rather than actually try to decide something.