Electrek: "Tesla FSD Beta tried to kill me last night"
Submitted 1 year ago by silvercove@lemdro.id to technology@lemmy.world
https://electrek.co/2023/09/01/tesla-fsd-beta-tried-to-kill-me-last-night/
Comments
lucidinferno@lemmy.world 1 year ago
“Some of you may die, but it’s a risk I’m willing to take.” - Lord Farquaad and Musk
bezerker03@lemmy.bezzie.world 1 year ago
I mean… They opted into a beta. Beta means this may happen.
grue@lemmy.world 1 year ago
Even if those dipshits “opted in,” the rest of us sharing the road sure as Hell didn’t!
ours@lemmy.film 1 year ago
This isn’t just some email web app that may have a few bugs, it’s putting lives at risk on the road. They shouldn’t be able to just label it a beta, overpromise its capabilities, and neglect any responsibility.
BCat70@lemmy.world 1 year ago
I guarantee that the other drivers on that road didn’t opt for a “beta”.
abcxyz@lemm.ee 1 year ago
I just can’t understand how regulators all over the world allow these things on the road. How the fuck do you allow the release of potentially deadly (for everyone involved, not just for the user) software en masse for the public to beta test for you… This is not Diablo IV…
foo@programming.dev 1 year ago
Beta only means “buggy piece of shit” to people who use software, and even then mostly gamers. In industries where prototypes can kill people, a “beta” product is one that is safe for the intended use. For example, if you invented a new way to do internal scans of people, before you could even test it on humans you would have done extensive testing on animals to know what works, what doesn’t, and what gives them cancer, and done the modelling to have a strong understanding of whether it is safe for humans.
Nobody would tolerate a scanner that gave people cancer, oops
Blum0108@lemmy.world 1 year ago
Musquaad
Scotty_Trees@lemmy.world [bot] 1 year ago
“Run, run, run, as fast as you can. You can’t catch me. I’m the gingerbread man!”
FlyingSquid@lemmy.world 1 year ago
I’m not especially sympathetic to the Tesla drivers this might kill.
I’m worried about everyone else.
PsychedSy@sh.itjust.works 1 year ago
I consider the suicide attempts a feature. I’ll test for you, Tesla.
Asudox@lemmy.world 1 year ago
It shouldn’t have even been released for normal people to use in daily life, on real roads full of other cars. This poses a big risk to life if you ask me. I hope countries start banning this feature soon; otherwise many more deaths will happen, and Elon will somehow get away with them. What’s so hard about driving a real car manually? Did you all become fatass lazy people that don’t even have the willpower to drive a car? Ridiculous. ML is experimental, and for a machine it’s amazing, but it isn’t as good as a human YET, which is why it causes life-threatening accidents.
dufr@lemmy.world 1 year ago
It can’t be used in the EU. It would need to pass a review; Elon has claimed they are close to getting it through, but Elon says a lot of things.
echodot@feddit.uk 1 year ago
Self-driving cars are actually only legal in a few countries. And those countries have tests.
It’s only the United States that just lets anyone do whatever on earth they want, even if it’s insanely dangerous.
Everywhere else, any car company espousing self-driving tech would actually have to prove that it is safe, and only a few companies have managed to do this. Even then, the cars are limited to predefined areas where they are sure they’re not going to come across difficult situations.
tony@lemmy.hoyle.me.uk 1 year ago
In its current state it has basically no chance IMO.
If they’d concentrated on making AP/highway driving smarter first they might have got that through… there are already rules for that… but cities? I’d love to see the autonomous car that could drive through London or Manchester.
Ocelot@lemmies.world 1 year ago
Humans did not evolve to drive cars. ML did. It drives consistently with no distractions. It is never tired or drunk, and it never experiences road rage. It has superhuman reaction time and can see a full 360 degrees. It is not about being a lazy fatass, it is about safety. Hundreds of people in the US were killed in car accidents just today, and none of them were from self-driving cars.
Zummy@lemmy.world 1 year ago
The article listed two life-threatening near-accidents that were only prevented because the person behind the wheel took over and kicked FSD out. Read the article and then comment.
Chocrates@lemmy.world 1 year ago
Self-driving is not there yet, and it may never get there, but you are right. We can save so many lives if we get this right.
I don’t know if Musk is responsible enough to be the one to get us there, though.
sdf05@lemmy.world 1 year ago
This is like that show “Upload”; the guy literally gets killed by a car
FlyingSquid@lemmy.world 1 year ago
That was a really good show.
III@lemmy.world 1 year ago
You should finish watching that first episode before making such bold statements.
Ocelot@lemmies.world 1 year ago
I mean, I think it’s still a valid point. The car in the show was sabotaged, and that is definitely something that might be a thing once all cars self-drive, especially once they remove controls like steering wheels.
There hasn’t been a Tesla FSD hack yet, but it would take spoofing a software update (and spoofing the authentication, certs, etc.)… The attacker would need access to a pretty massive supercomputer to make their own custom self-driving software, and today getting the certs and everything right is next to impossible… but even then it’s only next to impossible, not impossible.
jabjoe@feddit.uk 1 year ago
Well hold on there, he survived the crash, and would probably have been ok. It was the upload that killed him.
sdf05@lemmy.world 1 year ago
Yeah, my bad 🤣 I meant that the car technically endangered him so he didn’t live longer 😔
Ocelot@lemmies.world 1 year ago
Electrek has a long history of anti-Tesla clickbait. Take this with a grain of salt.
Teslas are factory-equipped with a 360-degree dashcam, yet we never see any footage of these alleged incidents.
silvercove@lemdro.id 1 year ago
Are you kidding me? YouTube is full of Tesla FSD/Autopilot doing batshit crazy things.
Ocelot@lemmies.world 1 year ago
So can you provide a link to an accident caused by FSD?
Hotdogman@lemmy.world 1 year ago
I saw the videos of them running over infants in strollers. Does that count?
Ocelot@lemmies.world 1 year ago
on FSD? link please
CmdrShepard@lemmy.one 1 year ago
The ones from that guy who runs his own competing autonomous driving company who also refused to allow anyone else to perform the test with the car (which was all proven to be bullshit later because he was hitting the accelerator pedal)? There’s a lot of misinformation and FUD floating around out there.
kinther@lemmy.world 1 year ago
Wishful thinking that Tesla would publicly distribute footage of an accident caused by one of their cars…
Ocelot@lemmies.world 1 year ago
It’s saved onto a thumb drive. Any user can pull the footage off and use or post it anywhere. It never gets uploaded to Tesla, only snapshots and telemetry.
DingoBilly@lemmy.world 1 year ago
Given your posts and rampant Tesla fanboyism, I honestly wouldn’t be surprised if you’re Elon himself just anxiously trying to save face.
Then again, Elon would just publicly spout misinformation about it all, so it probably isn’t. Still, it’s surprising that people are so obsessed with Tesla they can’t take the bad with the good.
Ocelot@lemmies.world 1 year ago
All I’m asking for is some evidence of the bad. Nobody can provide it. It really shouldn’t be that hard.
drdabbles@lemmy.world 1 year ago
Bud, we’ve seen literally thousands of videos of this happening, even from the Tesla simps. You’re seven years behind on your talking points.
Ocelot@lemmies.world 1 year ago
Can you link a few? Something where FSD directly or indirectly causes an accident?
CmdrShepard@lemmy.one 1 year ago
FSD has only been out for less than 3 years.
Astroturfed@lemmy.world 1 year ago
Ah yes, there’s no readily available footage of the dead bodies flying into the street or being crushed under the wheels so it’s made up. Of course.
Ocelot@lemmies.world 1 year ago
Not all accidents are that violent. I would even accept a simple fender bender. Those should be pretty common if FSD is as dangerous as a lot of people are implying, right?
LibertyLizard@slrpnk.net 1 year ago
Hilariously I’ve also seen them accused of a pro-Tesla bias. Personally I think they are pretty balanced.
Ocelot@lemmies.world 1 year ago
They do whatever gets them clicks. Facts do not matter.
drdabbles@lemmy.world 1 year ago
They are for sure not balanced. Alfred might have become more realistic about Elon and his bullshit once it was guaranteed he would never get his Roadster. That doesn’t mean he’s balanced.
Mockrenocks@lemmy.world 1 year ago
Frankly, it speaks incredibly poorly of the NHTSA that this kind of behavior is allowed. “Beta testing” a machine-learning driving-assistance feature on active highways at 70+ miles an hour is a recipe for disaster. Calling it Full Self-Driving while not having guardrails on its behavior is false advertising, as well as just plain dangerous.
fosforus@sopuli.xyz 1 year ago
I like my Tesla but there’s no way I’ll be switching that thing on.
megalodon@lemmy.world 1 year ago
FFS. He was testing a beta update at 73 miles per hour. Is he really expecting sympathy?
spezz@lemmynsfw.com 1 year ago
Maybe it shouldn’t be released for real-world use with such major bugs then. Don’t give me the crap that iTs DiFfErEnT because Tesla is a “technology company” either. It’s a car; safety features on it should work damn near 100% of the time before it is released.
megalodon@lemmy.world 1 year ago
What crap am I giving? I’m just saying it’s a stupid idea to beta test self driving technology on the highway.
SomeRandomWords@lemmy.blahaj.zone 1 year ago
I thought all FSD updates were beta updates? Did I miss the announcement of FSD going GA and being stable?
If that’s the case, then yeah I probably wouldn’t test run a new update on the highway first. But I also have no idea if this issue happens at lower speeds as well.
megalodon@lemmy.world 1 year ago
Isn’t that the issue? He’s using something that’s still in beta on the highway.
coffeebiscuit@lemmy.world 1 year ago
Autopilot beta? People are willing to test betas for cars? Are you insane? Insurance is going to have a field day.
zeppo@lemmy.world 1 year ago
What bothers me is, I have to drive on the road with people running some braindead Elon Musk software?
Ocelot@lemmies.world 1 year ago
Have you seen how humans drive? It’s not a very high bar to do better.
elxeno@lemm.ee 1 year ago
From what I read, Autopilot (AP) is just to keep you in your lane, while Full Self-Driving (FSD) just switches lanes into oncoming traffic.
Nomad@infosec.pub 1 year ago
Funny how George Hotz of Comma.ai predicted this exact same issue years ago: “if I were Elon Musk I would not have shipped that lane change”.
This issue likely arises because the car’s sensors cannot look “far enough ahead” in the lane it changes to. That can lead to crashes from behind due to much faster cars, and in this case to lane confusion, as the car cannot see oncoming traffic.
drdabbles@lemmy.world 1 year ago
Even better, several people have died using it or killed someone else. It also has a long history of driving underneath semi truck trailers. Only Europe was smart enough to ban this garbage.
Ocelot@lemmies.world 1 year ago
FSD has never driven under a truck; that was Autopilot, which is an LKAS system. The incident happened a year prior to “Navigate on Autopilot”, so the car in question was never even able to change lanes on its own. The driver deliberately instructed the car to drive into the trailer.
FSD beta is currently available in most of Europe and has been for several months.
MrSqueezles@lemm.ee 1 year ago
The craziest part of the article is just how much effort the author put into collecting data, filing feedback, and really, really hoping that Tesla could pull the videos (they can), and then went on to actively try, and succeed, at recreating the problem at high speed next to another car.
meco03211@lemmy.world 1 year ago
Not Autopilot (AP). There’s a difference between FSD and AP. AP will just keep you between the lane lines and pace the car in front of you. It can also change lanes when told to. There’s also Enhanced Autopilot (EAP). EAP was supposed to bridge the gap between AP and FSD: it would go “on-ramp to off-ramp”, so it could switch lanes as needed and get to exit ramps. FSD is the mode where you shouldn’t need to touch it outside of answering the nag (the frequent nag to “apply force to the steering wheel” to tell it you are still alive and paying attention).
coffeebiscuit@lemmy.world 1 year ago
It’s the beta part that scares me the most; the type of assistance isn’t really relevant. People shouldn’t be driving around in betas. These aren’t phones.
ripcord@kbin.social 1 year ago
I'm not getting what this reply has to do with the comment you appear to have replied to.