Comment on AI Code Commits
spankmonkey@lemmy.world 3 days ago
Right now they do, through a combination of extra oversight, generally travelling at slow speeds, and being restricted in area. Kind of like how children are less likely to die in a swimming pool with lifeguards than in rivers and at beaches without lifeguards.
Once they are released into the wild I expect a number of high-profile deaths, but I also assume those fatalities will be significantly lower than the human average because the systems are tuned to be overly cautious. I do expect them to have a high rate of low-speed collisions when they encounter confusing or absent road markings in rural areas.
MangoCats@feddit.it 3 days ago
Not self-driving, but “driver assist” on a rental we had recently would see skid marks on the road and swerve to follow them - every single time. That’s going to be a difference between the automated systems and human drivers: humans do some horrifically negligent and terrible things, but… most humans tend not to repeat the same mistake too many times.
With “the algorithm” controlling thousands or millions of vehicles, when somebody finds a hack that causes one to crash, they’ve got a hack that will cause all similar ones to crash. And I doubt we’re anywhere near safe “learn from their mistakes” self-recoding on these systems yet; that has the potential for even worse and less predictable outcomes.
JordanZ@lemmy.world 3 days ago
[Image: intersection with erroneously painted lane lines]
Watched a Tesla do weird things at this intersection because the lines are painted erroneously. It stopped in the left-turn lane way back from where a sane driver would. I can only presume it was because the car in the center lane had its tires ‘over the line’, even though the line itself is mispainted. There was plenty of room, but it got confused and just stopped a full ~3 car lengths back from the light, where the road is narrower because of the messed-up line.