I think you’ve misunderstood the purpose of a rubber duck: The point is that by formulating your problems and ideas, either out loud or in writing, you can better activate your own problem-solving skills. This is a very well-established method for reflecting on and solving problems when you’re stuck, and it’s a concept far older than chatbots, because the point isn’t the response you get but the process of formulating your own thoughts in the first place.
prole@lemmy.blahaj.zone 17 hours ago
Right, but a rubber duck isn’t a sycophantic chatbot that isn’t capable of conceptualizing anything.
thebestaquaman@lemmy.world 16 hours ago
That is correct. However, an LLM and a rubber duck have in common that they are inanimate objects that I can use as targets when formulating my thoughts and ideas. The LLM can also respond to things like “what part of that was unclear”, which helps keep my thoughts flowing. NOTE: The point of asking an LLM “what part of that was unclear” is NOT that it gives a qualified answer, but rather that its (entirely unqualified) reply prompts me to explain a part of the process more thoroughly.
This is a very well-established process, whether you use an actual rubber duck, your dog, a blog post or personal memo (I do the last quite often), or a friend who isn’t in the field at all. The point is to have some kind of process that keeps your thoughts flowing and touches on topics you might not have thought were crucial, thus helping you find a solution. The toddler that answers every explanation with “why?” can be ideal for this, and an LLM can emulate it quite well in a workplace environment.
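To make that concrete, here’s a rough sketch of the loop I mean. ask_duck() is a hypothetical stand-in, not any real API: an actual LLM call, or honestly just a canned list of questions, would slot in there, because the value is in being nudged to keep explaining, not in the quality of the nudge.

```python
# Minimal "rubber duck" loop: the duck only asks content-free follow-up
# questions; all the actual problem solving happens in the user's answers.

NUDGES = [
    "Why is that the case?",
    "What part of that was unclear, if any?",
    "What are you assuming there?",
]


def ask_duck(explanation: str, turn: int) -> str:
    """Return a generic follow-up question. A hypothetical LLM call could
    replace this, but a fixed rotation of nudges already does the job."""
    return NUDGES[turn % len(NUDGES)]


def rubber_duck_session() -> None:
    """Keep asking the user to re-explain until they type 'done'."""
    turn = 0
    explanation = input("Explain your problem: ")
    while explanation.strip().lower() != "done":
        print(ask_duck(explanation, turn))
        turn += 1
        explanation = input("> ")


if __name__ == "__main__":
    rubber_duck_session()
```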
prole@lemmy.blahaj.zone 16 hours ago
True, I remember seeing so many articles about computer engineers getting psychosis and killing themselves after talking to a toddler.
If you really cannot see the difference between what an LLM does and the other processes you described, then I don’t know what to tell you. Good luck with the brain rot.
thebestaquaman@lemmy.world 16 hours ago
It really seems like you’re wilfully misinterpreting what I’m writing.