Comment on ChatGPT has a style over substance trick that seems to dupe people into thinking it's smart, researchers found

sj_zero 1 year ago

When it gets stuff wrong, though, it doesn't just get it wrong, it makes it up entirely. I've seen it invent entire APIs, generate legal citations out of whole cloth along with laws that don't exist, and very confidently tell me to run a command that clearly doesn't work (and if it did, I wouldn't have been asking the question in the first place).

But I don't think the real alternative to ChatGPT is even Stack Overflow; it's an expert. Given the choice between the two, you'd want the expert every time.
