Comment on Electrek: "Tesla FSD Beta tried to kill me last night"
silvercove@lemdro.id 1 year ago
Are you kidding me? YouTube is full of Tesla FSD/Autopilot doing batshit crazy things.
Ocelot@lemmies.world 1 year ago
So can you provide a link to an accident caused by FSD?
naeemthm@lemmy.world 1 year ago
Your posts here show you’re not interested in reality, but I’ll leave a link anyway
motortrend.com/…/tesla-fsd-autopilot-crashes-inve…
Excited to see your response about how this is all user error.
Ocelot@lemmies.world 1 year ago
I’m sure you’re just going to downvote this and move on without reading, but I’m going to post it anyway for posterity.
First, a little about me. I am a software engineer by trade with a lot of experience in cloud and AI technologies. I have been an FSD beta tester since late 2021, with tens of thousands of incident-free miles logged.
I’m familiar with all of these incidents. It’s great that they’re in chronological order; that will be important later.
I need to set some context and history, because many people confuse the capabilities of Autopilot and FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capable of running FSD.
The second historical point is that FSD did not have any public release until mid-2022, with some earlier waves of releases going to the safest drivers starting in mid-2021. Prior to that it was exclusive to Tesla employees and a select few trusted beta testers in specific areas. Any issues in this article from before mid-2021 are completely irrelevant to the topic.
Tesla’s Autopilot system is an LKAS (lane keep assist system). This is the same class of system offered by Honda (Honda Sensing), Nissan (Pro Pilot Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is a glorified cruise control and has always been presented as such. Tesla has never advertised it as any sort of “hands-off” system where the driver does not need to pay attention. They do not allow the driver to take their attention off the road with FSD either, requiring hands on the wheel with constant torque, as well as eyes on the road (via an interior camera), for the system to work. If you are caught not paying attention enough times, the system will disengage, and with enough violations it will kick you out of the program entirely.
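To make that strike mechanism concrete, here is a minimal illustrative sketch in Python. Every concrete number in it (the torque threshold, the warning and strike limits) is an assumed placeholder, since Tesla does not publish these parameters; it only models the escalation behavior described above.

```python
# Illustrative sketch only, NOT Tesla's implementation or actual values:
# warn on missing steering torque or off-road gaze, force a disengagement
# after repeated warnings, and remove the driver from the beta after
# repeated disengagements ("strikes").

from dataclasses import dataclass

@dataclass
class AttentionMonitor:
    torque_threshold_nm: float = 0.5   # assumed threshold; real value not public
    max_warnings: int = 3              # assumed warnings before forced disengage
    max_strikes: int = 5               # assumed strikes before beta removal
    warnings: int = 0
    strikes: int = 0
    engaged: bool = True

    def check(self, wheel_torque_nm: float, eyes_on_road: bool) -> str:
        """Called periodically while the system is engaged."""
        if not self.engaged:
            return "disengaged"
        if wheel_torque_nm >= self.torque_threshold_nm and eyes_on_road:
            self.warnings = 0          # attentive: reset the warning counter
            return "ok"
        self.warnings += 1
        if self.warnings < self.max_warnings:
            return "warning: apply steering torque / watch the road"
        self.engaged = False           # too many warnings: forced disengage
        self.strikes += 1
        if self.strikes >= self.max_strikes:
            return "strike recorded: removed from beta program"
        return "strike recorded: system disengaged"

# Example: a driver who ignores three consecutive checks earns a strike.
monitor = AttentionMonitor()
for _ in range(3):
    print(monitor.check(wheel_torque_nm=0.0, eyes_on_road=False))
```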
OK, now that being said, let’s dig in:
November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge
April 22, 2022: Model Y in “summon mode” tries to drive through a $2 million jet
February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system’s safety
December 6, 2021: Tesla accused of faking 2016 Full Self Driving video
March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car
June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck
March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes
May 7, 2016: First known fatality involving Tesla’s Autopilot system
So, there we go. FSD has been out to the public for a few years now across a massive fleet of vehicles, collectively driving millions upon millions of miles, and this is the best we’ve got in terms of a list showing how “dangerous” it is? That is pretty remarkable.
naeemthm@lemmy.world 1 year ago
Interesting, you wrote an entire dissertation on why you think this is all a false flag about Full Self-Driving, but it seems to be mostly anecdotal, or just what you think is happening. Being a software engineer by trade doesn’t excuse you from facing the facts: something fishy is 100% going on with Tesla’s Autopilot system.
“The last time NHTSA released information on fatalities connected to Autopilot, in June 2022, it only tied three deaths to the technology. Less than a year later, the most recent numbers suggest 17 fatalities, with 11 of them happening since May 2022. The Post notes that the increase in the number of crashes happened alongside a rapid expansion of Tesla’s ‘Full Self-Driving’ software from around 12,000 vehicles to almost 400,000 in about a year.”
caranddriver.com/…/report-tesla-autopilot-crashes…
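For scale, the arithmetic behind those quoted figures can be sketched directly. Treating the two snapshots as directly comparable (same reporting criteria, same per-vehicle exposure) is an assumption made here for illustration, not something the report establishes.

```python
# Figures as quoted above from the Car and Driver / Washington Post report.
deaths_jun_2022 = 3       # fatalities NHTSA tied to Autopilot as of June 2022
deaths_latest = 17        # fatalities in the most recent numbers
fleet_before = 12_000     # approximate FSD fleet size before the expansion
fleet_after = 400_000     # approximate FSD fleet size about a year later

print(f"Fatalities grew ~{deaths_latest / deaths_jun_2022:.1f}x")   # ~5.7x
print(f"FSD fleet grew ~{fleet_after / fleet_before:.1f}x")         # ~33.3x
print(f"Deaths per 100k vehicles, before: {deaths_jun_2022 / fleet_before * 1e5:.2f}")
print(f"Deaths per 100k vehicles, after:  {deaths_latest / fleet_after * 1e5:.2f}")
```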
You claim the timeline is important here, and these numbers are all post-2022.
zeppo@lemmy.world 1 year ago
Musk just did a 20-minute video that ended with it trying to drive into traffic.
Ocelot@lemmies.world 1 year ago
This one? Where does it drive into traffic? youtu.be/aqsiWCLJ1ms?si=D9hbZtbC-XwtxjpX
zeppo@lemmy.world 1 year ago
The video ended when he made an “intervention” at a red light. I’m not watching whatever link that is because I’m not a masochist.
silvercove@lemdro.id 1 year ago
One of many, many examples: businessinsider.com/tesla-stops-tunnel-pileup-acc…
Ocelot@lemmies.world 1 year ago
See my huge post above about that very accident. Do you have any other “many, many examples”?
silvercove@lemdro.id 1 year ago
Here is more: motortrend.com/…/tesla-fsd-autopilot-crashes-inve…
How many do you want?