Comment on Waymo issued a recall after two robotaxis crashed into the same pickup truck
bstix@feddit.dk 9 months ago
You drive a car and can’t quite figure out what is happening in front of you.
Do you:
- A: Turn up the music and plow right through.
- B: Slow down (potentially to a full stop) and assess the situation.
- C: Slow down, close your eyes, and continue driving slowly into the obstacle.
- D: Sound the horn and flash the lights.
From the description offered in the article, the car chose C, which is wrong.
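(A minimal sketch of what option B might look like in code, assuming a hypothetical perception output with a classification confidence. `Detection`, `plan_speed`, and the 0.8 threshold are invented for illustration and have nothing to do with Waymo's actual software.)

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: names and thresholds are made up for this comment.

@dataclass
class Detection:
    confidence: float   # how sure perception is about what the object is (0..1)
    is_static: bool     # object is not moving out of the lane

def plan_speed(detection: Optional[Detection],
               current_speed: float,
               cruise_speed: float) -> float:
    """Option B as code: if we can't tell what's ahead, shed speed and reassess."""
    if detection is None:
        return cruise_speed                    # nothing ahead: carry on
    if detection.confidence < 0.8:             # "can't quite figure out what's happening"
        return max(0.0, current_speed - 5.0)   # slow down, potentially to a full stop
    if detection.is_static:
        return 0.0                             # known obstacle blocking the lane: stop
    return cruise_speed

# An unclassifiable object ahead at 20 m/s -> 15 m/s, then 10, ... down to 0.
print(plan_speed(Detection(confidence=0.3, is_static=True), 20.0, 25.0))
```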
Chozo@kbin.social 9 months ago
I wasn't asking about the car's logic algorithm; we all know that the SDC made an error, since it [checks notes] hit another car. We already know it didn't do the correct thing. I was asking how else you think the developers should be working on the software other than one thing at a time. That seemed like a weird criticism.
bstix@feddit.dk 9 months ago
Sorry, I didn’t answer your question. Consider the following instead:
Your self-driving car has crashed into a goddamn tow truck carrying a backwards-facing pickup truck.
Do you:
- A: Program your car to deal differently with fucking backwards-facing pickups on tow trucks.
- B: Go back to question one and make your self-driving car pass a simple theory test.
According to the article, the company chose A, which is wrong.
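(To make the A-versus-B contrast concrete, here is a hedged sketch; every label and function name below is invented for illustration, not taken from any real codebase. Option A accumulates one patch per surprising object, while option B is a single general rule that already covers them.)

```python
# Option A: patch each surprising object class after it gets hit.
# (Invented labels; a caricature, not anyone's real code.)
def react_option_a(label: str) -> str:
    if label == "backwards_facing_pickup_on_tow_truck":
        return "slow_down"   # the patch shipped after this recall
    if label == "mattress_on_roof_rack":
        return "slow_down"   # the patch after the next recall...
    return "proceed"

# Option B: one general rule, no per-object patches needed.
def react_option_b(confidence: float) -> str:
    return "proceed" if confidence >= 0.8 else "slow_down"

print(react_option_a("backwards_facing_pickup_on_tow_truck"))  # slow_down
print(react_option_b(0.3))  # slow_down, without ever naming the object
```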
HeyThisIsntTheYMCA@lemmy.world 9 months ago
I mean that’s machine learning for ya
lengau@midwest.social 9 months ago
Given the millions of global road deaths annually, I think B is probably the least popular answer.
Tetsuo@jlai.lu 8 months ago
Honestly, slowing down too much can easily create an accident that didn't exist in the first place.
Not every situation can be handled by slowing down.
If that's the default behavior on a high-speed road, it could be deadly for the car behind you.
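(A rough sketch of that trade-off, with invented thresholds and a deliberately simplistic rear-gap model: how hard the car brakes could depend on whether a following driver has room to react.)

```python
# Illustrative only: the 1.5 s reaction time and braking rates are made up.
def choose_deceleration(speed_mps: float, rear_gap_m: float) -> float:
    """Cap braking so a following car at the same speed can still react.

    If the gap behind is shorter than the distance the following driver
    would cover during their reaction time, brake gently instead of hard.
    """
    reaction_distance = speed_mps * 1.5
    if rear_gap_m < reaction_distance:
        return 2.0    # gentle braking (m/s^2): avoid causing a rear-end crash
    return 6.0        # firm braking is safe: nobody close behind

print(choose_deceleration(speed_mps=33.0, rear_gap_m=20.0))  # highway, car close behind -> 2.0
```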