Comment on Tesla In 'Self-Drive Mode' Hit By Train After Turning Onto Train Tracks
spankmonkey@lemmy.world 13 hours ago
Tesla has constantly lied about their FSD for a decade. We don’t trust them because they are untrustworthy, not because we don’t like them.
AA5B@lemmy.world 12 hours ago
They promote it in ways that make people sometimes trust it too much. But when it comes to releasing telemetry in particular, I don’t remember that ever being an accusation.
ayyy@sh.itjust.works 11 hours ago
It’s more about when they don’t release it/only selectively say things that make them look good and staying silent when they look bad.
BlueLineBae@midwest.social 13 hours ago
I have no sources for this so take it with a grain of salt… But I’ve heard that Tesla turns off self driving just before an accident so they can say it was the driver’s fault. In this case, though, it would have been on while the car drove onto the tracks, which I would think proves it was Tesla’s faulty self driving, plus human error for not correcting it. Either way it would be partly Tesla’s fault if it was on at the time.
meco03211@lemmy.world 13 hours ago
Pretty sure they can tell the method used when disengaging FSD/AP. So they would know if it was manually turned off or if the system lost enough info and shut itself down. They should be able to reconstruct the order of events accurately to within a few seconds. I can’t imagine a scenario, that wouldn’t be blatantly obvious, where the Tesla was able to determine an accident was imminent and shut off FSD/AP with enough time to “blame it on the driver”. What might be possible is that the logs show FSD shut off a millisecond before impact and then someone merely reported that FSD was not engaged at the time of the accident. Technically true, and Tesla lawyers might fight like hell to maintain that theory, but if an independent source is able to review the logs, I don’t see that being a possibility.
pixeltree@lemmy.blahaj.zone 12 hours ago
Of course they know, they’re using it to hide the truth. Stop giving a corporation the benefit of the doubt where public safety is concerned, especially when they’ve been shown to abuse it in the past
AA5B@lemmy.world 12 hours ago
They supposedly also have a threshold, like ten seconds - if FSD cuts out less than that threshold before the accident, it’s still FSD’s fault
SoleInvictus@lemmy.blahaj.zone 13 hours ago
That would require their self driving algorithm to actually detect an accident. I doubt it’s capable of doing so consistently.
spankmonkey@lemmy.world 13 hours ago
On a related note, getting unstuck from something like train tracks is a pretty significant hurdle. The only real way out is to back up, IF the turn onto the tracks didn’t involve a drop of the same depth as the rails. Someone who is caught off guard isn’t going to be able to turn a passenger car off the tracks, because the rails are tall and there’s no way to get the wheels at an angle to climb over them.
So while in a perfect world the driver would have slammed on the brakes before the car got onto the tracks, once even the front wheels were on the tracks because they weren’t fast enough, it may have been impossible to recover, and going forward might have been their best bet. Depends on how the track crossing is built.
ayyy@sh.itjust.works 11 hours ago
If you’re about to be hit by a train, driving forward through the barrier is always the correct choice. It will move out of the way, and you stay alive to fix the scratches in your paint.
spankmonkey@lemmy.world 11 hours ago
Maybe you should read the article.
roguetrick@lemmy.world 12 hours ago
I guess I’m a train now.