“it’s your fault that it just made up tools that don’t exist” is a bold statement, bro.
Comment on Lutris now being built with Claude AI, developer decides to hide it after backlash
CompassRed@discuss.tchncs.de 3 days ago
The symptoms you describe are caused by bad prompting. If an AI is providing over-complicated solutions, 9 times out of 10 it’s because you didn’t constrain your problem enough. If it’s referencing tools that don’t exist, then you either haven’t specified which tools are acceptable or you haven’t provided the context required for it to find the tools. You may also be wanting too much out of AI. You can’t expect it to do everything for you. You still have to do almost all the thinking and engineering if you want a quality project - the AI is just there to write the code. Sure, you can use an AI to help you learn how to be a better engineer, but AIs typically don’t make good high-level decisions. Treat AI like an intern, not like a principal engineer.
Bronzebeard@lemmy.zip 3 days ago
CompassRed@discuss.tchncs.de 3 days ago
No, it’s not. It doesn’t have intention. It’s literally just a tool. If you don’t get the results you expect with a tool when other people do get those results, then the problem isn’t the tool.
Bronzebeard@lemmy.zip 19 hours ago
If the tool can’t be consistent in its output, it’s not a reliable or worthwhile tool to use.
There is such a thing as a bad tool.
CompassRed@discuss.tchncs.de 19 hours ago
Good thing that’s not the case then.
Zos_Kia@jlai.lu 2 days ago
The junior analogy comes to mind. If you hire a fresh face and they ship code that doesn’t work, it’s definitely on you, bro.
oneofmany@lemmy.world 3 days ago
“It can’t be that stupid, you must be prompting it wrong.”
CompassRed@discuss.tchncs.de 3 days ago
It’s not about stupid or smart. It’s a tool, not a person. If you don’t get the same results that other people get with the same tool, then what could possibly be the problem other than how the person is using the tool?