Tesla recalls nearly all vehicles sold in US to fix system that monitors drivers using Autopilot: Tesla is recalling nearly all vehicles sold in the U.S., more than 2 million, to update software and fix a defective driver-monitoring system used with Autopilot.
I live in a major metropolitan area, drive a model 3, and almost never use autopilot.
I am lucky enough to rarely be in stop-and-go traffic, but when I am, I don’t even use cruise control, because it’s too reactive to the car in front of me and consequently too jerky for my preference.
As for autopilot, I was on a relatively unpopulated freeway in the second lane from the right, when a small truck came around a cloverleaf to merge into the right lane next to me. My car flipped out and slammed on the brakes. The truck wasn’t even coming into my lane; he was just merging. Thankfully there was a large gap behind me, and I was paying enough attention to immediately jam on the accelerator to counteract it, but it spooked me pretty badly. And this was on a road that it’s designed for.
Autopilot (much less FSD) can’t really think like our brains can. It can only “see” so far ahead and behind. It can’t look at another driver’s behavior and assess that they might be distracted or drunk. We’re not there yet. We’re FAR from there.
givesomefucks@lemmy.world 11 months ago
This is the important part: it’s not just Tesla not giving a shit about their customers’ safety, it’s dangerous to literally anyone on the same road as a Tesla.
CmdrShepard@lemmy.one 11 months ago
This case went to trial already and the jury sided with Tesla, because the driver was holding his foot on the accelerator to override cruise control, ignored the vehicle’s warnings, drove through numerous flashing lights, crashed through a stop sign, and then hit the couple, later stating, “I expect to be the driver and be responsible for this… I was highly aware that was still my responsibility to operate the vehicle safely.” Any vehicle is capable of doing this with a reckless driver behind the wheel.
givesomefucks@lemmy.world 11 months ago
You may be confused; there are a lot of Tesla court cases to keep track of tho, I should have been specific.
arstechnica.com/…/tesla-fights-autopilot-false-ad…
DreadPotato@sopuli.xyz 11 months ago
Idiot drivers do idiot things, hardly unique to Tesla drivers. The whole autopilot feature set is pretty clearly marked as beta on the screen, with a clear warning, which you have to acknowledge to enable it, that it is considered a beta feature and that you should stay attentive when using it.
I agree that the FSD feature set is advertised with capabilities it in no way possesses, but everyone seems to forget that Autopilot and FSD are two separate things. Autopilot is only TACC + lane assist and isn’t advertised as a fully self-driving technology.
givesomefucks@lemmy.world 11 months ago
Because Tesla made it as confusing as possible on purpose and misleads consumers…
Which is why they’re getting sued about it as I type this…
Gork@lemm.ee 11 months ago
The system should nevertheless be designed to handle all types of road and weather conditions. It’s a safety-related system. To not do so, regardless of the reason (probably cost savings), is negligence on the part of Tesla.
AdamEatsAss@lemmy.world 11 months ago
You would think a road with few drivers would be easier for the autopilot? But maybe the road lacked lines and markings? But wouldn’t you want the car to default to human control, not just keep going? Any car I’ve had with lane keep assist turns off if it can’t find the lines. It’s a pretty simple failsafe. I’d rather have the driver a little annoyed than injured.
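That failsafe is dead simple in principle. A minimal sketch (all names and thresholds here are hypothetical, not from any real vehicle's software):

```python
# Hypothetical sketch of a lane-keep-assist failsafe: if the lane lines
# can't be detected with enough confidence, disengage and alert the
# driver instead of guessing where the lane is.

def lane_keep_step(left_conf: float, right_conf: float,
                   min_conf: float = 0.6) -> str:
    """Return the action the assist system should take this control cycle."""
    if left_conf < min_conf or right_conf < min_conf:
        # Lines unclear on at least one side: hand control back to the driver.
        return "disengage_and_alert"
    return "keep_assisting"

print(lane_keep_step(0.9, 0.2))  # one side unreadable -> disengage_and_alert
print(lane_keep_step(0.9, 0.9))  # both lines clear -> keep_assisting
```

The annoying-but-safe part is the alert: the driver gets a chime and a dash message every time it drops out, which beats silently drifting out of the lane.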
luthis@lemmy.nz 11 months ago
TIL