Well, because 99% of the time, it’s fairly decent. That 1%'ll getchya tho.
Buffalox@lemmy.world 2 weeks ago
The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.
What I don’t get is how this false advertising for years hasn’t caused Tesla bankruptcy already?
NikkiDimes@lemmy.world 1 week ago
ayyy@sh.itjust.works 1 week ago
To put your number into perspective: if it failed only 1 time in every hundred miles, it would try to kill you multiple times a week at the average commute distance.
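A quick sanity check of that claim (the commute figure below is an assumed US average, not from the thread):

```python
# Back-of-envelope check: 1 failure per 100 miles vs. an average commute.
# The commute distance is an assumed rough US average, used only for illustration.

FAILURES_PER_MILE = 1 / 100          # the hypothetical 1-in-100-miles rate
ROUND_TRIP_COMMUTE_MILES = 41        # assumed average US round-trip commute
COMMUTE_DAYS_PER_WEEK = 5

weekly_miles = ROUND_TRIP_COMMUTE_MILES * COMMUTE_DAYS_PER_WEEK
expected_failures_per_week = weekly_miles * FAILURES_PER_MILE

print(f"Weekly commute miles: {weekly_miles}")
print(f"Expected failures per week: {expected_failures_per_week:.1f}")
```

At those assumptions the math comes out to roughly two failures per week, which matches the "multiple times a week" claim.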
Buffalox@lemmy.world 1 week ago
Many Tesla owners are definitely dead many times, on the inside.
NikkiDimes@lemmy.world 1 week ago
…It absolutely fails miserably fairly often, though, and would likely crash at that rate without human intervention. Not to the extent shown here, where there wasn't even time for human intervention, but I frequently had to take over back when I used it (post v13).
echodot@feddit.uk 1 week ago
Even with the distances I drive (and I barely drive my car anywhere since covid), I'd probably only last about a month before the damn thing killed me.
Even ignoring fatalities and injuries, I would still have to deal with the fact that my car randomly wrecked itself, which has to be a financial headache.
echodot@feddit.uk 1 week ago
That's probably not the actual failure rate, but even a 1% failure rate is several thousand times higher than what NASA would consider an abort-risk condition.
Let's say the risk is only 0.01%: that's still thousands of crashes per year across the fleet. Even if we could guarantee that all of them would be non-fatal and would involve no bystanders such as pedestrians, the cost of replacing all of those vehicles every time they crashed, plus fixing the damage to whatever they crashed into (lamp posts, shop windows, etc.), would be so high that it would exceed any benefit of the technology.
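For scale, here's a back-of-envelope sketch of that 0.01% figure. The fleet size and annual mileage below are illustrative assumptions, not real Tesla numbers, and "0.01% risk" is read as a per-mile failure chance:

```python
# Back-of-envelope fleet arithmetic (all figures are illustrative assumptions,
# not real Tesla data; "0.01% risk" is interpreted as a per-mile failure chance).

FLEET_SIZE = 100_000                # assumed number of cars actively using FSD
MILES_PER_CAR_PER_YEAR = 10_000     # assumed rough annual mileage per car
FAILURE_RATE_PER_MILE = 0.0001      # the hypothetical 0.01% figure

fleet_miles = FLEET_SIZE * MILES_PER_CAR_PER_YEAR
expected_crashes = fleet_miles * FAILURE_RATE_PER_MILE

print(f"Fleet miles per year: {fleet_miles:,}")
print(f"Expected crashes per year: {expected_crashes:,.0f}")
```

Even with these conservative assumptions the count lands in the tens of thousands per year, so "thousands of crashes" is if anything an understatement.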
It wouldn't be as bad if this were prototype technology that was constantly improving, but Tesla has made it very clear they're never going to add lidar scanners, so it's literally never going to get any better. It's always going to be this bad.
FreedomAdvocate@lemmy.net.au 1 week ago
Saying it’s never going to get better is ridiculous and demonstrably wrong. It has improved in leaps and bounds over generations. It doesn’t need LiDAR.
The biggest thing you're missing is that with FSD the driver is still supposed to be paying attention at all times, ready to take over the way a driving instructor does when a learner is doing something dangerous. Just because it's in FSD (Supervised) mode doesn't mean you should just sit back and watch it drive you off the road into a lake.
echodot@feddit.uk 1 week ago
You're saying this on a video where it drove into a tree and flipped over. There isn't time for a human to react. That's like saying we don't need emergency stops on chainsaws, the operator just needs to not drop it.
FreedomAdvocate@lemmy.net.au 1 week ago
What false advertising? It’s called “Full Self Driving (Supervised)”.
Buffalox@lemmy.world 1 week ago
For many years the “supervised” part was not included; AFAIK Tesla was forced to add it.
FreedomAdvocate@lemmy.net.au 1 week ago
The driver isn’t supposed to wait for the car to tell them to take over lol. The driver is supposed to take over when necessary.
SkyezOpen@lemmy.world 1 week ago
The attention required to prevent these types of sudden crashes negates the purpose of FSD entirely.
Buffalox@lemmy.world 1 week ago
No. Look at Waymo as an example: their cars are actually autonomous, and they ask for assistance in situations they are “unsure” how to handle.
But even if your claim were true, in what way was this a situation where the driver could deem it necessary to take over? The road ahead was clear, with nothing in view to indicate any kind of problem, when the car made a sudden left almost into a tree.
echodot@feddit.uk 1 week ago
Because the US is an insane country where you can straight up break the law, and as long as you're rich enough you don't even get a slap on the wrist. If some small startup had done the same thing, they'd have been shut down.
What I don’t get is why teslas aren’t banned all over the world for being so fundamentally unsafe.
Buffalox@lemmy.world 1 week ago
I've argued this point for the past year: there are obvious safety problems with Tesla, even without considering FSD.
Like the blinker controls being on the steering wheel, manual door handles that are hard to find in emergencies, and common operations being buried in menus on the screen instead of having directly accessible buttons. On Autopilot they also tend to brake for no reason, even on the autobahn with a clear road ahead, which can also create dangerous situations.