Musk just did a 20-minute video that ended with it trying to drive into traffic.
Comment on Electrek: “Tesla FSD Beta tried to kill me last night”
Ocelot@lemmies.world 1 year ago
So can you provide a link to an accident caused by FSD?
zeppo@lemmy.world 1 year ago
Ocelot@lemmies.world 1 year ago
This one? Where does it drive into traffic? youtu.be/aqsiWCLJ1ms?si=D9hbZtbC-XwtxjpX
zeppo@lemmy.world 1 year ago
The video ended when he made an “intervention” at a red light. I’m not watching whatever link that is because I’m not a masochist.
Ocelot@lemmies.world 1 year ago
The video didn’t end there; that was at the beginning. What you’re referring to is a regression, specific to the HW3 Model S, that failed to recognize one of the red lights. Now I’m sure that sounds like a huge deal, but here’s the thing…
This was a demo of a very early alpha release of FSD 12 (the current public release is 11.4.7), representing a completely new and more efficient method of utilizing the neural network for driving, and the issue has already been fixed. That build is not released to anyone outside a select few Tesla employees. Other than that, it performed flawlessly for over 40 minutes in a live demo.
silvercove@lemdro.id 1 year ago
One of many, many examples: businessinsider.com/tesla-stops-tunnel-pileup-acc…
Ocelot@lemmies.world 1 year ago
See my huge post about that very accident. Do you have any other “many, many examples”?
silvercove@lemdro.id 1 year ago
Here is more: motortrend.com/…/tesla-fsd-autopilot-crashes-inve…
How many do you want?
Ocelot@lemmies.world 1 year ago
FFS, that is the exact same article again. Please read my other comment (the huge one) and let me know if anything doesn’t make sense or you find anything factually inaccurate.
naeemthm@lemmy.world 1 year ago
Your posts here show you’re not interested in reality, but I’ll leave a link anyway
motortrend.com/…/tesla-fsd-autopilot-crashes-inve…
Excited to see your response about how this is all user error.
Ocelot@lemmies.world 1 year ago
I’m sure you’re just going to downvote this and move on without reading, but I’m going to post it anyway for posterity.
First, a little about me. I am a software engineer by trade with a lot of experience in cloud and AI technologies. I have been an FSD beta tester since late 2021, with tens of thousands of incident-free miles logged.
I’m familiar with all of these incidents. It’s great that they’re in chronological order; that will be important later.
I need to set some context and history, because many people get confused about the capabilities of Autopilot and FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capability to run FSD.
The second historical point is that FSD did not have any public release until mid-2022, with some waves of earlier releases going to the safest drivers starting in mid-2021. Prior to that, it was exclusive to Tesla employees and a select few trusted beta testers in specific areas. Any of the issues in this article prior to mid-2021 are completely irrelevant to the topic.
Tesla’s Autopilot system is an LKAS (lane keep assist system). This is the same class of feature offered by Honda (Honda Sensing), Nissan (ProPILOT Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is a glorified cruise control and has always been presented as such.
Tesla has never advertised this as any sort of “hands-off” system where the driver does not need to pay attention. They do not allow the driver to stop paying attention in FSD either, requiring hands on the wheel (with constant torque) as well as eyes on the road (via an interior camera) in order to work. If you are caught not paying attention, the system will disengage, and with enough violations it will kick you out of the program entirely.
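To make the “glorified cruise control” point concrete, here is a toy sketch of the kind of control loop such a system runs. Every name, constant, and gain below is invented for illustration; this is an assumption-laden sketch, not Tesla’s actual software.

```python
# Toy sketch of an LKAS-style control tick. All names, gains, and
# thresholds here are invented for illustration, not Tesla's real code.

ATTENTION_STRIKE_LIMIT = 3  # assumed strike threshold before forced disengage

def lkas_step(lane_offset_m, gap_s, target_gap_s, strikes,
              eyes_on_road, hands_on_wheel):
    """One tick: steer toward lane center, hold a time gap, check attention."""
    # Driver monitoring: accumulate strikes, disengage past the limit
    if not (eyes_on_road and hands_on_wheel):
        strikes += 1
    if strikes >= ATTENTION_STRIKE_LIMIT:
        return None, None, strikes  # disengage and hand control back

    # Proportional steering correction toward the lane center
    steering_cmd = -0.1 * lane_offset_m
    # Speed adjustment to hold the target time gap to the lead car
    accel_cmd = 0.5 * (gap_s - target_gap_s)
    return steering_cmd, accel_cmd, strikes

# Example tick: drifting 0.3 m left of center, 1.5 s behind the lead car
print(lkas_step(-0.3, 1.5, 2.0, 0, True, True))
# roughly (0.03, -0.25, 0): steer slightly right, ease off the accelerator
```

The point of the sketch is the scope: a system like this reacts only to lane position and a following gap. Navigation, turns, and sign recognition are simply not inputs.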
OK, with all that said, let’s dig in:
November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge
April 22, 2022: Model Y in “summon mode” tries to drive through a $2 million jet
February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system’s safety
December 6, 2021: Tesla accused of faking 2016 Full Self Driving video
March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car
June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck
March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes
May 7, 2016: First known fatality involving Tesla’s Autopilot system
So, there we go. FSD has been out to the public for a few years now, on a massive fleet of vehicles collectively driving millions upon millions of miles, and this is the best we’ve got in terms of a list showing how “dangerous” it is? That is pretty remarkable.
naeemthm@lemmy.world 1 year ago
Interesting. You wrote an entire dissertation on why you think this is all a false flag about Full Self-Driving, but it seems to be mostly anecdotal, or just what you think is happening. Being a “software engineer” by trade isn’t enough to wave away the fact that something fishy is 100% going on with Tesla’s Autopilot system.
“The last time NHTSA released information on fatalities connected to Autopilot, in June 2022, it only tied three deaths to the technology. Less than a year later, the most recent numbers suggest 17 fatalities, with 11 of them happening since May 2022. The Post notes that the increase in the number of crashes happened alongside a rapid expansion of Tesla’s ‘Full Self-Driving’ software from around 12,000 vehicles to almost 400,000 in about a year.”
caranddriver.com/…/report-tesla-autopilot-crashes…
You claim the timeline is important here, and this is all post-2022.
Ocelot@lemmies.world 1 year ago
Correlation does not equal causation. Tesla sold far more vehicles in the past two years than ever before. Also, in 2019, 2020, and part of 2021, not a lot of people were driving due to the pandemic.
Again, Teslas come with a factory-installed 360° dashcam. Where are all of the videos of these FSD-related incidents?
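For what it’s worth, here is a crude back-of-envelope comparison of the figures quoted above. The fleet sizes and fatality counts are taken from the Car and Driver quote; the calculation ignores miles driven and the differing time windows, so treat it as a rough comparison only.

```python
# Per-vehicle fatality rates implied by the quoted figures.
# Counts and fleet sizes are from the quote above; the time windows
# differ, so this is only a rough comparison.

fleet_before, deaths_before = 12_000, 3     # around June 2022
fleet_after, deaths_after = 400_000, 17     # about a year later

rate_before = deaths_before / fleet_before  # ~2.5e-4 fatalities per vehicle
rate_after = deaths_after / fleet_after     # ~4.3e-5 fatalities per vehicle
print(rate_before, rate_after)
```

By this crude per-vehicle measure, the rate fell even as the fleet grew roughly 33×, which is why raw counts alone do not settle the causation question.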
adeoxymus@lemmy.world 1 year ago
Tbh the other side is also anecdotal. There’s no stats here.
CmdrShepard@lemmy.one 1 year ago
What’s fishy about it? You realize 40,000 people die every year from car accidents, meaning about 110 die every single day, and yet you’re referencing 17 fatalities spread out over a few years as some big crisis. This tech (from any manufacturer) isn’t going to prevent 100% of accidents, and there’s not much you can do when drivers willingly drive their car into the side of a semi, just like they did before this technology existed.
I won’t argue that AP, FSD, or any other system doesn’t have its issues, but most of these responses are overblown sensationalism.
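For scale, the arithmetic behind those figures is below. The three-year window for the 17 fatalities is an assumption based on the quoted timeline, not an exact number.

```python
# Sanity check on the baseline figures cited above. The three-year
# window for the 17 fatalities is an assumed reading of the timeline.

us_road_deaths_per_year = 40_000
print(us_road_deaths_per_year / 365)        # ~110 deaths per day

autopilot_linked_deaths = 17
print(autopilot_linked_deaths / (3 * 365))  # ~0.016 deaths per day
```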