Comment on Electrek: "Tesla FSD Beta tried to kill me last night"

Ocelot@lemmies.world · 1 year ago

I’m sure you’re just going to downvote this and move on without reading but I’m going to post it anyway for posterity.

First, a little about me. I am a software engineer by trade with a lot of experience in cloud and AI technologies. I have been an FSD beta tester since late 2021, with tens of thousands of incident-free miles logged on it.

I’m familiar with all of these incidents. It’s great that they’re in chronological order; that will be important later.

I need to set some context and history, because many people get confused when referring to the capabilities of Autopilot and FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capable of running FSD.

The second historical point is that FSD did not have any public release until mid-2022, with some earlier waves of releases going to the safest drivers starting in mid-2021. Prior to that, it was exclusive to Tesla employees and a select few trusted beta testers in specific areas. Any of the incidents in this article from before mid-2021 are therefore irrelevant to FSD.

Tesla’s Autopilot system is an LKAS (lane keep assist system). This is the same class of system offered by Honda (Honda Sensing), Nissan (ProPILOT Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is a glorified cruise control and has always been presented as such.

Tesla has never advertised this as any sort of “hands-off” system where the driver does not need to pay attention. They do not allow the driver to take their attention off the road in FSD either, requiring hands on the wheel with constant torque, as well as eyes on the road (monitored via an interior camera), in order to work. If you are caught not paying attention enough times, the system will disengage, and with enough violations it will kick you out of the program entirely.

OK, now that being said, let’s dig in:

November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge

April 22, 2022: Model Y in “summon mode” tries to drive through a $2 million jet

February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system’s safety

December 6, 2021: Tesla accused of faking 2016 Full Self Driving video

March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car

June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck

March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes

May 7, 2016: First known fatality involving Tesla’s Autopilot system

So, there we go. FSD has been out to the public for a while now across a massive fleet of vehicles, collectively driving millions upon millions of miles, and this is the best we’ve got in terms of a list showing how “dangerous” it is? That is pretty remarkable.
