I saw the videos of them running over infants in strollers. Does that count?
Comment on Electrek: "Tesla FSD Beta tried to kill me last night"
Ocelot@lemmies.world 1 year ago
Electrek has a long history of anti-Tesla clickbait. Take this with a grain of salt.
Teslas come factory-equipped with a 360-degree dashcam, yet we never see any footage of these alleged incidents.
Hotdogman@lemmy.world 1 year ago
Ocelot@lemmies.world 1 year ago
On FSD? Link please.
CmdrShepard@lemmy.one 1 year ago
The ones from that guy who runs his own competing autonomous driving company who also refused to allow anyone else to perform the test with the car (which was all proven to be bullshit later because he was hitting the accelerator pedal)? There’s a lot of misinformation and FUD floating around out there.
Ocelot@lemmies.world 1 year ago
Dan O’Dowd of Green Hills Software. You should listen to the podcast he did with Whole Mars Catalog, trying to explain himself. It’s really wild.
Tesla took him to court and won
kinther@lemmy.world 1 year ago
Wishful thinking that Tesla would publicly distribute footage of an accident caused by one of their cars…
Ocelot@lemmies.world 1 year ago
It’s saved onto a thumb drive. Any user can pull the footage off and use or post it anywhere. It never gets uploaded to Tesla; only snapshots and telemetry do.
kinther@lemmy.world 1 year ago
But is it technically the user’s data, or is there some clause in Tesla car ownership that says it is Tesla the company’s data?
Forgive me I’m ignorant of the fine details. I purchased a Chevy Bolt but had been looking into a Tesla as an alternative until Elon tried to be the super-cool Twitter guy.
Ocelot@lemmies.world 1 year ago
It’s definitely the user’s data. There are a few Tesla dashcam channels out there loaded with footage of other drivers acting like idiots.
DingoBilly@lemmy.world 1 year ago
Given your posts and rampant Tesla fanboyism, I honestly wouldn’t be surprised if you’re Elon himself just anxiously trying to save face.
Then again, Elon would just publicly spout misinformation about it all, so it probably isn’t. Still, it’s surprising that people are so obsessed with Tesla that they can’t take the bad with the good.
Ocelot@lemmies.world 1 year ago
All I’m asking for is some evidence of the bad. Nobody can provide it. It really shouldn’t be that hard.
DingoBilly@lemmy.world 1 year ago
You were provided evidence, and you disregard it and make excuses for it. It’s hard to have a discussion if you just dismiss all the evidence.
Think of it another way, you’re saying there’s absolutely no way that FSD has ever failed in its publicly available software, even with hundreds of thousands of cars on the road? Use a logic test on yourself and ask if that’s realistic.
Kage520@lemmy.world 1 year ago
FSD makes a TON of mistakes. I’ve had the beta since the first public release. I don’t trust it to do anything more than lane holding and cruise control, with maybe some supervised lane changes. But it’s a beta. I understand that I am helping to test beta software.
FSD in its current form should not be given to everyone. Tesla had it right when they gave it only to proven drivers (okay, it would have been better to test with paid employees, but I digress).
FSD right now is like handing the keys to your 15-year-old child and going to sleep in the back while they drive you home.
CmdrShepard@lemmy.one 1 year ago
Can you point to this evidence as I don’t see it anywhere?
Also, busting out a strawman argument one reply into the discussion isn’t a good sign for the strength of your argument.
CaptainAniki@lemmy.flight-crew.org 1 year ago
[deleted]
Ocelot@lemmies.world 1 year ago
Please let me know where I stated anything inaccurate.
drdabbles@lemmy.world 1 year ago
Bud, we’ve seen literally thousands of videos of this happening, even from the Tesla simps. You’re seven years behind on your talking points.
Ocelot@lemmies.world 1 year ago
Can you link a few? Something where FSD directly or indirectly causes an accident?
drdabbles@lemmy.world 1 year ago
You’re working very hard in this thread to remain in the dark. You could take two seconds to look for yourself, but it seems like you won’t. Hell, they performed a recall because it was driving through stops. Something it’ll still do, of course, but they performed a recall.
Astroturfed@lemmy.world 1 year ago
Elon literally had to hit the brakes manually in a livestream of the self-driving tech as the car was about to go straight through a red light. Like less than a week ago… SOOOO safe, all the news stories of it killing people are fake!
Ocelot@lemmies.world 1 year ago
The recall was most definitely not for “driving through stops”. It was to fix the behavior of doing a “rolling stop”, which is something 99.5% of drivers do, which is how it learned to do it. Where do you see that it still fails to make a complete stop at stop signs?
I’m not trying to remain in the dark here, I’m just presenting facts. I’m very open to changing my mind on this situation entirely; just give me the facts.
CmdrShepard@lemmy.one 1 year ago
FSD has only been out for less than 3 years.
drdabbles@lemmy.world 1 year ago
The first public release was much later than the smaller beta, which I had access to.
Astroturfed@lemmy.world 1 year ago
Ah yes, there’s no readily available footage of the dead bodies flying into the street or being crushed under the wheels so it’s made up. Of course.
Ocelot@lemmies.world 1 year ago
Not all accidents are that violent. I would even accept a simple fender bender. Those should be pretty common if FSD is as dangerous as a lot of people are implying, right?
Astroturfed@lemmy.world 1 year ago
Look man, I don’t like children either, but wanting more child-mowing cars out on the road is pretty twisted.
FlyingSquid@lemmy.world 1 year ago
Wait, are you now suggesting you won’t accept that Teslas ever get into accidents without video evidence?
LibertyLizard@slrpnk.net 1 year ago
Hilariously I’ve also seen them accused of a pro-Tesla bias. Personally I think they are pretty balanced.
Ocelot@lemmies.world 1 year ago
They do whatever gets them clicks. Facts do not matter.
drdabbles@lemmy.world 1 year ago
They are for sure not balanced. Fred might have become more realistic about Elon and his bullshit once it became clear he would never get his Roadster. That doesn’t mean he’s balanced.
silvercove@lemdro.id 1 year ago
Are you kidding me? Youtube is full of Tesla FSD/Autopilot doing batshit crazy things.
Ocelot@lemmies.world 1 year ago
So can you provide a link to an accident caused by FSD?
naeemthm@lemmy.world 1 year ago
Your posts here show you’re not interested in reality, but I’ll leave a link anyway:
motortrend.com/…/tesla-fsd-autopilot-crashes-inve…
Excited to see your response about how this is all user error.
Ocelot@lemmies.world 1 year ago
I’m sure you’re just going to downvote this and move on without reading but I’m going to post it anyway for posterity.
First, a little about me. I am a software engineer by trade with a lot of experience in cloud and AI technologies. I have been an FSD beta tester since late 2021, with tens of thousands of incident-free miles logged.
I’m familiar with all of these incidents. It’s great that they’re in chronological order; that will be important later.
I need to set some context and history because it confuses many people when they refer to the capabilities of autopilot and FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capability to run FSD.
The second historical point is that FSD did not have any public release until mid-2022, with some waves of earlier releases going to the safest drivers starting in mid-2021. Prior to that it was exclusive to Tesla employees and a select few trusted beta testers in specific areas. Any of the incidents in this article prior to mid-2021 are completely irrelevant to the topic.
Tesla’s Autopilot system is an LKAS (lane keep assist system), the same category of feature offered by Honda (Honda Sensing), Nissan (ProPilot Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is a glorified cruise control and has always been presented as such.

Tesla has never advertised this as any sort of “hands-off” system where the driver does not need to pay attention. They do not allow the driver to lose attention from the road in FSD either, requiring hands on the wheel and constant torque, as well as eyes on the road (via an interior camera), in order to work. If you are caught not paying attention enough times, the system will disengage and, with enough violations, even kick you out of the program.
OK, now that being said, let’s dig in:
November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge
April 22, 2022: Model Y in “summon mode” tries to drive through a $2 million jet
February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system’s safety
December 6, 2021: Tesla accused of faking 2016 Full Self Driving video
March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car
June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck
March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes
May 7, 2016: First known fatality involving Tesla’s Autopilot system
So, there we go. FSD has been out to the public for a few years now on a massive fleet of vehicles, collectively driving millions upon millions of miles, and this is the best we’ve got in terms of a list showing how “dangerous” it is? That is pretty remarkable.
zeppo@lemmy.world 1 year ago
Musk just did a 20-minute video that ended with it trying to drive into traffic.
Ocelot@lemmies.world 1 year ago
this one? Where does it drive into traffic? youtu.be/aqsiWCLJ1ms?si=D9hbZtbC-XwtxjpX
silvercove@lemdro.id 1 year ago
One of many many examples: businessinsider.com/tesla-stops-tunnel-pileup-acc…
Ocelot@lemmies.world 1 year ago
See my huge post about that very accident. Do you have any other “many many examples”?