Comment on Tesla is banned from driving schools because of new turn signals
Voroxpete@sh.itjust.works 10 months ago
It regularly kills people. It can’t be used on a lot of road types (but people still do, because Tesla makes no effort to prevent it). It’s still marketed as Full Self Driving despite the fact that Tesla has stated on the record that it is, and I quote, “Not capable of driving itself.”
They’re trying to have their cake and eat it too. Any time it benefits them, they claim that their cars are completely autonomous vehicles powered by the most advanced AI. Any time they get their wrists slapped, they claim that it’s an assistive feature like cruise control that cannot and will not ever replace the human behind the wheel.
psud@lemmy.world 10 months ago
Could you link an article saying so? I couldn’t find anything about people being killed by Tesla FSD with a quick Google search.
Zink@programming.dev 10 months ago
Maybe search for “killed while on autopilot”?
psud@lemmy.world 10 months ago
That’s all the people who were asleep on the highway or driving at very high speed in town.
The recent versions don’t allow either of those behaviours now, so those crashes aren’t happening anymore.
And the deaths I’m interested in are the ones caused by FSD, not by lane keeping and cruise control. Loads of brands offer lane keeping and cruise control and implement them no better than Tesla does.
Zink@programming.dev 10 months ago
But does FSD change the logic for the lane keeping and the speed & distance?
Isn’t one of the features “Navigate on Autopilot”?
NotMyOldRedditName@lemmy.world 10 months ago
Just keep in mind that FSD is only as safe as they claim because it’s supervised.
I would hope that even a reasonably well-functioning system, with a human vigilantly watching it, would be safer than a human driving unassisted.
The system would have to be really bad to be worse than that.
Voroxpete@sh.itjust.works 10 months ago
I don’t need to provide you with evidence that FSD has caused crashes. There’s plenty; if you can’t find it you’re not looking.
As for your point about accident statistics, that’s responding to a different argument than the one I was making. I didn’t say that it kills people more often than they kill themselves (through dangerous, inattentive or reckless driving). I just said that it regularly kills people. There’s potentially some hyperbole there; you can quibble over the definition of “regularly” if you want to be a pedant, but I really don’t care.
The point is that when it does go wrong, it often goes spectacularly wrong, such as this case where a Tesla plowed into a truck or this thankfully low-speed example of a very confused Tesla driving into oncoming traffic.
Could a human make these errors? Absolutely. But would you, as a human, want to trust yourself to a vehicle that is capable of making these kinds of errors? Are you happy with the idea of possibly dying because the machine you’re in made one critical error? Perhaps an error that you yourself would not have made under the same circumstances?
A lot of people will answer “yes” to that, but for me personally any autopilot that requires constant supervision to make sure it doesn’t kill me is more of a negative than a positive. Even if you try to pay attention, automation blindness will inevitably kick in. And really, what is even the point of self-driving if you have to be paying attention? If it’s not freeing you up to focus on other things, then it might as well not be there at all.