Comment on A courts reporter wrote about a few trials. Then an AI decided he was actually the culprit.

vrighter@discuss.tchncs.de 3 days ago

here’s that same conversation with a human:

“why is X?” “because Y!” “you’re wrong” “then why the hell did you ask me if you already know the answer?”

What you’re describing will train the network to get the wrong answer and then apologize better. It won’t train it to get the right answer.

source