Comment on: Lawsuits test Tesla's claim that drivers are solely responsible for crashes
autotldr@lemmings.world [bot] 6 months ago
This is the best summary I could come up with:
SAN FRANCISCO — As CEO Elon Musk stakes the future of Tesla on autonomous driving, lawyers from California to Florida are picking apart the company’s most common driver assistance technology in painstaking detail, arguing that Autopilot is not safe for widespread use by the public.
Evidence emerging in the cases — including dash-cam video obtained by The Washington Post — offers sometimes-shocking details: In Phoenix, a woman allegedly relying on Autopilot plows into a disabled car and is then struck and killed by another vehicle after exiting her Tesla.
Late Thursday, the National Highway Traffic Safety Administration launched a new review of Autopilot, signaling concern that a December recall failed to significantly improve misuse of the technology and that drivers are misled into thinking the “automation has greater capabilities than it does.”
The company’s decision to settle with Huang’s family — along with a ruling from a Florida judge concluding that Tesla had “knowledge” that its technology was “flawed” under certain conditions — is giving fresh momentum to cases once seen as long shots, legal experts said.
In Riverside, Calif., last year, a jury heard the case of Micah Lee, 37, who was allegedly using Autopilot when his Tesla Model 3 suddenly veered off the highway at 65 mph, crashed into a palm tree and burst into flames.
Last year, Florida Circuit Judge Reid Scott upheld a plaintiff’s request to seek punitive damages in a case concerning a fatal crash in Delray Beach, Fla., in 2019 when Jeremy Banner and his Tesla in Autopilot failed to register a semi truck crossing its path.
The original article contains 1,850 words, the summary contains 263 words. Saved 86%. I’m a bot and I’m open source!
NeoNachtwaechter@lemmy.world 6 months ago
Even when the driver is fully responsible, the assistance software must work properly in all situations, and it must be fully tested.
If the software makes a severe mistake without warning, normal drivers may not have a chance to regain control. Normal drivers are not trained test drivers.
hoshikarakitaridia@lemmy.world 6 months ago
My morality says both are accountable: the driver and Tesla. Tesla for damage caused by its system, and the driver if he does not retake control of the vehicle when given the chance.
umami_wasbi@lemmy.ml 6 months ago
But does the driver have a reasonable chance, with an adequate timeframe, to regain control?
Like what happened in the Boeing 737 MAX MCAS incidents, which expected the pilot to disengage the trim motor within a mere 4 seconds, which, according to one pilot, is "a lot to ask in an overwhelming situation" or something similar.
Normal people in a soon-to-crash situation are likely to freeze for a second or two as fear kicks in. How the driver reacts next is hard to predict. Yet at the speeds most US drivers love to go (I've seen 70+ mph as the norm on freeways), the time available for them to make a well-thought-out decision is, I'd guess, quite short.
hoshikarakitaridia@lemmy.world 6 months ago
You made me think about this for a second.
In my head, the reason is not specifically to punish the driver, but to make drivers stay aware and ready to take control again. Yes, 100 people will have 1000 different ways to react to such a software error, but you need people to pay attention, and in law the only way to enforce that is punishment. Obviously this needs to be well calibrated, but either you have multiple lines of defense (the software, the driver, maybe even additional safety features) or you have to remove the autonomous system.
NeoNachtwaechter@lemmy.world 6 months ago
Imagine you are driving along a straight road, not too much traffic, the speed limit is high and you are enjoying it. Suddenly your assistance software decides to turn the steering wheel hard to the left.
You will have no chance.
What have you done wrong? What exactly are you accountable for?
AA5B@lemmy.world 6 months ago
For mine
So did the car think there was an impending collision? That should be obvious from the logs, and it would be the only reason for a sudden maneuver.
AA5B@lemmy.world 6 months ago
The article keeps calling it "Autopilot", which is different from "Full Self Driving".
If they are correct, then it's all on the driver. Autopilot is just a nicer adaptive cruise control and should be treated as such. Many cars have one, even non-smart vehicles. Even my seven-year-old Subaru had something similar (much dumber, but similar).
That being said, people seem to confuse the names of these different functionalities all the time, including throughout this thread. However, even if they were confused and meant FSD, my car gives feedback requiring your hands on the wheel, so I don't understand how you can claim ignorance.
NeoNachtwaechter@lemmy.world 6 months ago
No. That difference is meaningless, since both systems provide Level 2 autonomy. The responsibilities are exactly the same.