to be fair, fucking up maths problems is very human-like.
I wonder if it could also be trained on a great deal of mathematical axioms that are computer generated?
Cabrio@lemmy.world 1 year ago
It doesn’t calculate anything though. You ask ChatGPT what 5+5 is, and it gives you the most statistically likely response based on its training data. Given how many moronic and intentionally belligerent answers there are on the Internet, the probability of it getting any mathematical equation right drops exponentially with complexity, and it never even approaches 100% certainty on the simplest equations, because 1+1 = window.
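To caricature that point: a model that just echoes the most frequent answer in its training data isn't doing arithmetic at all. Here's a toy sketch (the "training answers" are made up for illustration, not real scraped data):

```python
from collections import Counter

# Hypothetical answers scraped for "what is 5+5?", jokes and all.
training_answers = ["10", "10", "10", "window", "10", "55", "window", "10"]

def most_likely_answer(answers):
    """Return the most frequent answer: pure prediction, no calculation."""
    return Counter(answers).most_common(1)[0][0]

print(most_likely_answer(training_answers))
```

This happens to print "10", but only because correct answers dominate this particular sample; nothing was ever computed.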
dbilitated@aussie.zone 1 year ago
I know it doesn’t calculate; that’s why I suggested having known-correct calculations in the training data to offset noise in the signal.
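Sticking with the same frequency-model caricature, the suggestion amounts to this: if computer-generated, verified calculations outnumber the noisy web answers, the statistically likely answer shifts toward the correct one (the counts below are invented purely to show the effect):

```python
from collections import Counter

# Hypothetical scraped answers for "17*23": noisy and mostly wrong.
scraped = ["391"] * 3 + ["400"] * 4 + ["banana"] * 2

# Computer-generated ground truth, as suggested: verified 17*23 examples.
generated = [str(17 * 23)] * 20

def mode(answers):
    """The statistically most likely answer under a pure frequency model."""
    return Counter(answers).most_common(1)[0][0]

print(mode(scraped))              # noise dominates: "400"
print(mode(scraped + generated))  # correct examples outweigh noise: "391"
```

A real LLM generalizes rather than tallying literal strings, but the intuition carries over: flooding the training set with verified calculations pushes the learned distribution toward correct answers.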