Comment on Tesla Robotaxis Reportedly Crashing at a Rate That's 4x Higher Than Humans
halcyoncmdr@piefed.social 2 weeks ago
I don’t think it’s necessarily about cost. They were removing sensors both before costs rose and before supply became more limited with things like the tariffs.
Too many sensors also causes issues, adding more is not an easy fix. Sensor Fusion is a notoriously difficult part of robotics. It can help with edge cases and verification, but it can also exacerbate issues. Sensors will report different things at some point. Which one gets priority? Is a sensor failing or reporting inaccurate data? How do you determine what is inaccurate if the data is still within normal tolerances?
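The arbitration problem above can be made concrete with a toy sketch. This is a hypothetical illustration with made-up numbers and names, not Tesla’s actual logic: two sensors report distance-to-obstacle, both within their normal tolerances, and the fused decision flips depending entirely on which arbitration rule you pick.

```python
# Hypothetical sensor-fusion sketch: two sensors measure distance to the
# nearest obstacle ahead. Both readings are plausible on their own, yet
# the braking decision depends entirely on the arbitration rule chosen.

RADAR_READING_M = 12.0    # radar thinks something is 12 m ahead
CAMERA_READING_M = 80.0   # vision thinks the road is clear for 80 m
BRAKE_THRESHOLD_M = 30.0  # brake if an obstacle is closer than this

def fuse_pessimistic(a: float, b: float) -> float:
    """Trust whichever sensor reports the nearer danger."""
    return min(a, b)

def fuse_average(a: float, b: float) -> float:
    """Split the difference between the two sensors."""
    return (a + b) / 2

def should_brake(fused_distance_m: float) -> bool:
    return fused_distance_m < BRAKE_THRESHOLD_M

pess = fuse_pessimistic(RADAR_READING_M, CAMERA_READING_M)  # 12.0
avg = fuse_average(RADAR_READING_M, CAMERA_READING_M)       # 46.0

print(should_brake(pess))  # True: pessimistic rule slams the brakes
print(should_brake(avg))   # False: averaging sails past a possible obstacle
```

Neither rule is obviously right, which is the point: with disagreeing sensors that are each inside normal tolerances, there is no arbitration rule that is safe in every scenario.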
More on topic though… My question is why is the robotaxi accident rate different from the regular FSD rate? Ostensibly they should be nearly identical.
Which one gets priority?
The one that says there’s a danger.
Alright, so the radar is detecting a large object in front of the vehicle while travelling at highway speeds. The vision system can see the road is clear.
So with your assumption of listening to whatever says there’s an issue, it slams on the brakes to stop the car. But it’s actually an overpass, or overhead sign that the radar is reflecting back from while the road is clear. Now you have phantom braking.
Now extend that to a sensor or connection failure. The radar or a wiring harness is failing and sporadically reporting back close contacts that don’t exist. More phantom braking, and this time with no obvious cause.
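The failure mode described above can be sketched in a few lines. This is a hypothetical toy model (made-up glitch rate and distances, not real radar behavior): a flaky radar sporadically injects phantom close contacts into an otherwise-clear stream, and a “brake whenever any sensor reports danger” rule brakes on every glitch.

```python
import random

# Hypothetical sketch: a failing radar occasionally reports a bogus 5 m
# contact on a road that is actually clear. Under a danger-wins fusion
# rule, every glitch becomes a phantom braking event.

random.seed(0)  # fixed seed so the sketch is reproducible
BRAKE_THRESHOLD_M = 30.0

def radar_stream(n: int, glitch_rate: float = 0.1):
    """Clear road (100 m) with occasional spurious 5 m returns."""
    for _ in range(n):
        yield 5.0 if random.random() < glitch_rate else 100.0

camera_always_clear_m = 100.0  # vision correctly sees an open road

phantom_brakes = sum(
    1
    for radar_m in radar_stream(1000)
    if min(radar_m, camera_always_clear_m) < BRAKE_THRESHOLD_M  # danger wins
)
print(phantom_brakes)  # roughly 100 needless brake events per 1000 frames
```

At a 10% glitch rate the danger-wins rule brakes on roughly a tenth of all frames even though the camera is right every single time, which is the phantom-braking-with-no-obvious-cause scenario.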
Now you have phantom braking.
Phantom braking is better than Wile E. Coyote-ing into a wall.
and this time with no obvious cause.
Again, better than not braking because another sensor says there’s nothing ahead. I would hope flaky sensors would cause the vehicle to show a “needs service” light or something. But even without that, if your car keeps phantom braking, I’d hope you’d take it in.
But consider your scenario without radar and with only a camera sensor. The vision system “can see the road is clear”, and there’s no radar sensor to tell it otherwise. Turns out the vision system is buggy, or the lens is broken, or the camera got knocked out of alignment, or whatever. Now it’s claiming the road ahead is clear when in fact there’s a train currently in the crossing directly ahead. Boom, now you hit the train. I’d much prefer phantom braking and having multiple sensors each trying to detect dangers ahead.
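The redundancy argument above reduces to a one-liner. Again a hypothetical sketch with invented numbers: a faulty camera always reports “clear”, and only a second, independent sensor gives the danger-wins rule anything to catch.

```python
# Hypothetical sketch: a broken camera always reports "clear" (999 m).
# With a second sensor, a danger-wins rule still catches the obstacle;
# a camera-only setup has no cross-check and never brakes.

BRAKE_THRESHOLD_M = 30.0
TRUE_OBSTACLE_M = 10.0     # a train is actually 10 m ahead

broken_camera_m = 999.0    # buggy/misaligned camera: "road is clear"
radar_m = TRUE_OBSTACLE_M  # independent radar still sees the train

camera_only_brakes = broken_camera_m < BRAKE_THRESHOLD_M             # False
with_radar_brakes = min(broken_camera_m, radar_m) < BRAKE_THRESHOLD_M  # True

print(camera_only_brakes, with_radar_brakes)  # False True
```

So the same pessimistic rule that causes phantom braking in the earlier scenario is also the one that saves you here; the trade-off cuts both ways.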
FYI, the fake wall test was not reproducible on the latest hardware; that test was done on an older HW3 car, not the cars operating as robotaxis. The new hardware existed at the time, but the tester chose to use outdated software and hardware.
NotMyOldRedditName@lemmy.world 2 weeks ago
Regular FSD rate has the driver (you) monitoring the car, so there will be fewer accidents IF you stay properly attentive.
The FSD rides with a safety monitor (in the passenger seat) gave the monitor a button to stop the ride.
The driverless and no monitor cars have nothing.
So you get more accidents as you remove that supervision.
73ms@sopuli.xyz 2 weeks ago
The unsupervised cars are very unlikely to be involved in these crashes yet because, according to the Robotaxi tracker, only a single one of those was operational, and only for the final week of January.
As you suggest, though, there’s a difference in how much a passenger-seat monitor can really do about FSD misbehaving compared to a driver in the driver’s seat. On the other hand, in California they’re still forced to have the monitor behind the wheel, so you wouldn’t expect a difference in accident rate based on that there; it would be interesting to compare.
NotMyOldRedditName@lemmy.world 2 weeks ago
There are multiple unsupervised cars around now. It was only the one before the earnings call; then, a few days after earnings, they came back and weren’t followed by chase cars. There’s a handful of videos over many days out there now if you want to watch any. The latest gaffe video I’ve seen is from last week, where one drove into a construction zone that wasn’t blocked off.
I would still expect a difference between California and people like you and me using it.
My understanding is that in California, they’ve been told not to intervene unless necessary, but when someone like us is behind the steering wheel what we consider necessary is going to be different than what they’ve been told to consider necessary.
So we would likely intervene much sooner than the safety driver in California, which would mean we were letting the car get into fewer situations we perceive to be dicey.
73ms@sopuli.xyz 2 weeks ago
Yeah, I’ve seen that video and another where they went back and forth for an hour in a single unsupervised Tesla. One thing to note is that they are all geofenced to a single extremely limited route that spans about a 20-minute drive along Riverside Dr and S Lamar Blvd, with the ability to drive on short sections of some of the crossing streets there; that’s it.