Move fast, break shit. Fake it till you sell it, then move the goal posts down. Shift human casualties onto individual responsibility, a core libertarian theme. Profit off the lies because it’s too late, money already in the bank.
Tesla’s Autopilot and Full Self-Driving linked to hundreds of crashes, dozens of deaths
Submitted 6 months ago by jorge@feddit.cl to technology@lemmy.world
Comments
tearsintherain@leminal.space 6 months ago
over_clox@lemmy.world 6 months ago
They just recalled all the Cybertrucks, because their ‘smort’ technology is too stupid to realize when an accelerator sensor is stuck…
simplejack@lemmy.world 6 months ago
The accelerator sensor doesn’t get stuck, the pedal does. The face of the accelerator pedal slides off and wedges the pedal into the down position.
Granite@kbin.social 6 months ago
Pedal, not petal.
Not trying to be an asshole, just a nudge to avoid misunderstandings (although the context is clear in this case)
over_clox@lemmy.world 6 months ago
I realize it’s the pedal that gets stuck, but the computer registers the state of the pedal via a sensor.
The computer should be smart enough to realize something ain’t right when it registers that both the accelerator and brake pedals are being pressed at the same time. And in that case, the brake should always take priority.
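The brake-priority rule described here can be sketched in a few lines (the function name, threshold, and percentages are made up for illustration; real pedal firmware is far more involved):

```python
def throttle_command(accel_pct: float, brake_pct: float) -> float:
    """Return throttle to apply given raw pedal readings (0-100%).

    If the brake reads as pressed while the accelerator also reports
    input, assume the accelerator pedal or sensor is stuck and let
    the brake win by cutting throttle entirely.
    """
    BRAKE_PRESSED = 5.0  # % travel treated as "pressed" (assumed value)
    if brake_pct > BRAKE_PRESSED:
        return 0.0  # brake always overrides the accelerator
    return accel_pct

# A stuck accelerator (100%) plus any real brake input cuts power:
print(throttle_command(100.0, 60.0))  # 0.0
```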
TypicalHog@lemm.ee 6 months ago
It only matters if the autopilot kills more people than an average human driver over the same distance traveled.
NIB@lemmy.world 6 months ago
If the cars run over people while going 30kmh because they use cameras and a bug crashed into the camera and that caused the car to go crazy, that is not acceptable, even if the cars crash “less than humans”.
Self driving needs to be highly regulated by law and required to have some bare minimum of sensors, including radars, lidars, etc. Camera-only self driving is beyond stupid. Cameras can’t see in snow or dark or whatever. Anyone who has a phone knows how fucky the camera can get under specific light exposures, etc.
No one but Tesla is doing camera-only “self driving”, and they are only doing it to cut down the cost. Their older cars had more sensors than their newer cars. But Musk is living in his Bioshock uber-capitalistic dream. Who cares if a few people die in the process of developing vision-based self driving.
TypicalHog@lemm.ee 6 months ago
What are you? Some kind of lidar shill? Camera only should obviously be the endgame goal for all robots. Also, this article is not even about camera only.
Geobloke@lemm.ee 6 months ago
No it doesn’t. Every life stolen matters, and if it could be found that Tesla failed to replicate industry best practice that would have saved more lives, just so they could sell more cars, then that is on them.
PresidentCamacho@lemm.ee 6 months ago
This is the actual logical way to think about self driving cars. Stop down voting him because “Tesla bad” you fuckin goons.
gallopingsnail@lemmy.sdf.org 6 months ago
Tesla’s self driving appears to be less safe and causes more accidents than their competitors.
NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”
iamtrashman1312@lemmy.world 6 months ago
So your stance is literally “human lives are a worthy sacrifice for this endeavor”
doubtingtammy@lemmy.ml 6 months ago
It’s not logical, it’s ideological. It’s the ideology that allows corporations to run a dangerous experiment on the public without their consent.
And where’s the LIDAR again?
Tja@programming.dev 6 months ago
But… Panel gaps!
mortemtyrannis@lemmy.ml 6 months ago
Knock knock
“Who is it?”
“Goons”
“Hired Goons”
TypicalHog@lemm.ee 6 months ago
Let them, people are dumb!
mojofrododojo@lemmy.world 6 months ago
this is bullshit.
A human can be held accountable for their failure, bet you a fucking emerald mine Musk won’t be held accountable for these and all the other fool self drive fuckups.
sabin@lemmy.world 6 months ago
So you’d rather live in a world where people die more often, just so you can punish the people who do the killing?
TypicalHog@lemm.ee 6 months ago
Where did I say that a human shouldn’t be held accountable for what their car does?
SirEDCaLot@lemmy.today 6 months ago
This is 100% correct. Look at the average rate of crashes per mile driven with autopilot versus a human. If the autopilot number is lower, they’re doing it right and should be rewarded and NHTSA should leave them be. If the autopilot number is higher, then yes by all means bring in the regulation or whatever.
flerp@lemm.ee 6 months ago
Humans are extremely flawed beings and if your standard for leaving companies alone to make as much money as possible is that they are at least minimally better than extremely flawed, I don’t want to live in the same world as you want to live in.
kava@lemmy.world 6 months ago
Is the investigation exhaustive? If these are all the crashes they could find related to the driver-assist / self-driving features, then it is probably much safer than a human driver. 1,000 crashes out of 5M+ Teslas sold in the last 5 years is actually a very small amount.
I would want an article to try to find the rate of accidents per 100,000, group it by severity, and then compare and contrast that with human-caused accidents.
Because while it’s clear by now Teslas aren’t the perfect self driving machines we were promised, there is no doubt at all that humans are bad drivers.
We lose over 40k people a year to car accidents. And fatal car accidents are rare, so multiply that by like 100 to get the total number of car accidents.
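The comment’s back-of-the-envelope numbers can be laid out explicitly (all inputs are the commenter’s rough approximations, not verified statistics):

```python
tesla_crashes = 1_000          # driver-assist crashes cited in the report
teslas_sold = 5_000_000        # approximate Teslas sold over ~5 years
fatalities_per_year = 40_000   # rough annual US traffic deaths
accidents_per_fatality = 100   # commenter's rough multiplier

crash_share = tesla_crashes / teslas_sold
total_accidents = fatalities_per_year * accidents_per_fatality
print(f"{crash_share:.2%} of Teslas in a reported crash")  # 0.02%
print(f"~{total_accidents:,} US accidents per year")       # ~4,000,000
```

Without per-mile exposure data, though, a per-vehicle share like this can’t be compared directly against human crash rates.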
Blackmist@feddit.uk 6 months ago
The question isn’t “are they safer than the average human driver?”
The question is “who goes to prison when that self driving car has an oopsie, veers across three lanes of traffic and wipes out a family of four?”
Because if the answer is “nobody”, they shouldn’t be on the road. There’s zero accountability, and because it’s all wibbly-wobbly AI bullshit, there’s no way to prove that the issues are actually fixed.
Trollception@lemmy.world 6 months ago
So it’s better to put more lives in danger so that there can be someone to blame?
dream_weasel@sh.itjust.works 6 months ago
The answer is the person behind the wheel.
Tesla makes it very clear to the driver that you still have to pay attention and be ready to take over at any time. Full self driving engages the in-cabin nanny cam to enforce that you pay attention, above and beyond the frequent reminders to apply turning force to the steering wheel.
Now, once Tesla goes Mercedes and says you don’t have to pay attention, it’s gonna be the company that should step in. I know that’s a big old SHOULD, but right now that’s not the situation anyway.
ipkpjersi@lemmy.ml 6 months ago
The question isn’t “are they safer than the average human driver?”
How is that not the question? That absolutely is the question. Just because someone is accountable for your death doesn’t mean you aren’t already dead, it doesn’t bring you back to life. If the death rate for self-driving vehicles is really that much lower, you are risking your life that much more by trusting in human drivers.
kava@lemmy.world 6 months ago
Because if the answer is “nobody”, they shouldn’t be on the road
Do you understand how absurd this is? Let’s say AI driving results in 50% less deaths. That’s 20,000 people every year that isn’t going to die.
And you reject that for what? Accountability? You said in another comment that you don’t want “shit happens sometimes” on your headstone.
You do realize that’s exactly what’s going on the headstones of those 40,000 people that die annually right now? Car accidents happen. We all know they happen and we accept them as a necessary evil. “Shit happens”
By not changing it, ironically, you’re advocating for exactly what you claim you’re against.
Maddier1993@programming.dev 6 months ago
I don’t agree with your argument.
Making a human go to prison for wiping out a family of 4 isn’t going to bring back the family of 4. So you’re just using deterrence to hopefully make drivers more cautious.
Yet, year after year… humans cause more deaths by negligence than tools can cause by failing.
The question is definitely “How much safer are they compared to human drivers”
It’s also much easier to prove that the system has those issues fixed compared to training a human hoping that their critical faculties are intact. Rigorous Software testing and mechanical testing are within legislative reach and can be made strict requirements.
John_McMurray@lemmy.world 6 months ago
The driver. Your whole statement is a total straw man.
slumberlust@lemmy.world 6 months ago
The question for me is not what margins the feature is performing on, as they will likely be better than human error rates, but how irresponsibly they market the product.
TypicalHog@lemm.ee 6 months ago
Well, it should obviously be the owner of the car.
machinin@lemmy.world 6 months ago
I was looking up info for another comment and found this site. It’s from 2021, but the information seems solid.
www.flyingpenguin.com/?p=35819
This table was probably most interesting, unfortunately the formatting doesn’t work on mobile, but I think you can make sense of it.
Car                  2021 Sales So Far   Total Deaths
Tesla Model S        5,155               40
Porsche Taycan       5,367               ZERO
Tesla Model X        6,206               14
Volkswagen ID        6,230               ZERO
Audi e-tron          6,884               ZERO
Nissan Leaf          7,729               2
Ford Mustang Mach-e  12,975              ZERO
Chevrolet Bolt       20,288              1
Tesla Model 3        51,510              87
So many cars with zero deaths compared to Tesla.
It isn’t whether Tesla’s FSD is safer than humans, it’s whether it’s keeping up with the automotive industry in terms of safety features. It seems like they are falling behind (despite what their marketing team claims).
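Normalizing the raw totals in that table by sales volume makes the contrast clearer (2021 figures only, and per-model mileage isn’t accounted for, so treat this as a crude signal, not a proper rate):

```python
# Deaths per 1,000 units sold, from the table above.
sales_and_deaths = {
    "Tesla Model S": (5_155, 40),
    "Porsche Taycan": (5_367, 0),
    "Tesla Model X": (6_206, 14),
    "Nissan Leaf": (7_729, 2),
    "Chevrolet Bolt": (20_288, 1),
    "Tesla Model 3": (51_510, 87),
}
for car, (sales, deaths) in sales_and_deaths.items():
    print(f"{car}: {1000 * deaths / sales:.2f} deaths per 1,000 sold")
```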
dream_weasel@sh.itjust.works 6 months ago
That’s kind of a tough article to trust if I’m being honest. It may in fact be true, but it’s an opinion piece.
I find it a little weird to look only within sales for the year and also not to discuss the forms of autopilot or car use cases.
For example, are we talking about highway only driving, lane keeping assist, end to end residential urban, rural unmarked roads? Some of these are harder problems than others. How about total mileage as well? I’m not sure what the range is on a Nissan leaf, but I think comparing it to a Taycan or mach e seems disingenuous.
All that being said, yeah Tesla has a lot of deaths comparatively, but still way less than regular human drivers. I worry that a truly autonomous experience will not be available until and unless a manufacturer like Tesla pushes the limits on training data and also the fed responds by making better laws. Considering Elon douchiness, I’m also kinda happy Tesla is doing that and catching flak, but paving the way for more established manufacturers.
We were early adopters of Tesla, and trust me the cars are made cheap and the “autopilot” drives like shit even now, but it’s amazing the progress that has been made in the last 6 years.
ipkpjersi@lemmy.ml 6 months ago
I think people are just trying to hate on Tesla because it’s Elon (and fair enough) rather than self-driving itself. Although there’s also the side of things that self-driving vehicles are already safer than human-driven ones, have lower rates of accidents, etc but people expect there to be zero accidents whatsoever with self-driving which is why I think self-driving may never actually take off and become mainstream.
TypicalHog@lemm.ee 6 months ago
Their competitors? Which of their competitor has self-driving at the same level as Tesla (so we can compare)?
ChaoticEntropy@feddit.uk 6 months ago
I would highlight that not all Teslas will be being driven in this mode on a regular basis, if ever.
NikkiDimes@lemmy.world 6 months ago
For example, I don’t really trust mine and mostly use it in slow bumper-to-bumper traffic, or so I can adjust my AC on the touchscreen without swerving around in my lane.
suction@lemmy.world 6 months ago
Only Elon calls his level 2 automation “FSD” or even “Autopilot”. That alone proves that Tesla is more guilty of these deaths than other makers are who choose less evil marketing terms
axo@feddit.de 6 months ago
According to the math in this video:
- 150,000,000 miles have been driven with Tesla’s “FSD”, which equals
- 375 miles per Tesla purchased with FSD capabilities
- 736 known FSD crashes with 17 fatalities
- equals 11.3 deaths per 100M miles of Tesla’s FSD
Doesn’t sound too bad, until you hear that a human produces 1.35 deaths per 100M miles driven…
It’s rough math, but holy moly, that already is a completely different class of deadly than a non-FSD car
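Spelling out that rough math (the inputs are the video’s estimates, not audited figures):

```python
fsd_miles = 150_000_000   # estimated miles driven on FSD
fsd_deaths = 17           # known FSD-linked fatalities
human_rate = 1.35         # deaths per 100M human-driven miles

fsd_rate = fsd_deaths / (fsd_miles / 100_000_000)
print(round(fsd_rate, 1))               # 11.3 deaths per 100M miles
print(round(fsd_rate / human_rate, 1))  # roughly 8.4x the human rate
```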
dufkm@lemmy.world 6 months ago
a human produces 1.35 deaths per 100M miles driven
My car has been driven around 100k miles by a human, i.e. it has produced 0.00135 deaths. Is that like a third of a pinky toe?
Llewellyn@lemm.ee 6 months ago
Yeah, another 900k, and you’ll be ded.
NotMyOldRedditName@lemmy.world 6 months ago
That number is like 1.5 billion now.
curiousPJ@lemmy.world 6 months ago
If Red Bull can be successfully sued for false advertising from their slogan “It gives you wings”, I think it stands that Tesla should too.
set_secret@lemmy.world 6 months ago
Verge articles seem to be getting worse over the years; they’ve almost reached Forbes level. Yes, this does raise some valid safety concerns. No, Tesla isn’t bad just because it’s Tesla.
It doesn’t really give us the full picture. For starters, there’s no comparison with Level 2 systems from other car makers, which also require driver engagement and have their own methods to ensure attention. This would help us understand how Tesla’s tech actually measures up.
Plus, the piece skips over extremely important stats that would give us a clearer idea of how safe (or not) Tesla’s systems are compared to good old human driving.
We’re left in the dark about how Tesla compares in scenarios like drunk, distracted, or tired driving—common issues that automation aims to mitigate. (probably on purpose).
It feels like the article is more about stirring up feelings against Tesla rather than diving deep into the data. A more genuine take would have included these comparisons and variables, giving us a broader view of what these technologies mean for road safety.
I feel like any opportunity to jump on the Elon hate wagon is getting tiresome. (and yes i hate Elon too).
WormFood@lemmy.world 6 months ago
a more genuine take would have included a series of scenarios (e.g. drunk/distracted/tired driving)
I agree. they did tesla dirty. a more fair comparison would’ve been between autopilot and a driver who was fully asleep. or maybe a driver who was dead?
and why didn’t this news article contain a full scientific meta analysis of all self driving cars??? personally, when someone tells me that my car has an obvious fault, I ask them to produce detailed statistics on the failure rates of every comparable car model
mojofrododojo@lemmy.world 6 months ago
a driver who was fully asleep. or maybe a driver who was dead?
why does it need to become a specious comparison for it to be valid in your expert opinion? because those comparisons are worthless.
HeavyRaptor@lemmy.zip 6 months ago
I lost faith in the verge after how they handled the whole PC build fiasco
Snapz@lemmy.world 6 months ago
“And yes, I hate elon too, but”
PersnickityPenguin@lemm.ee 6 months ago
A couple of my criticisms with the article, which is about “autopilot” and not fsd:
- conflating Autopilot and FSD numbers; they are not interchangeable systems. They are separate code bases with different functionality.
- the definition of “autopilot” seems to have been lifted from the aviation industry. The term is used to describe a system that controls the vector of a vehicle, i.e. the speed and direction. That’s all. That does seem like a correct description of what the Autopilot system does. “FSD”, on the other hand, does not live up to expectations, not being a true level 5 driving system.
Eheran@lemmy.world 6 months ago
Any time now it will be released. Like 7 years ago the taxis.
froh42@lemmy.world 6 months ago
“If you’ve got, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let’s say, half the accident rate of a human-driven car, I think that’s difficult to ignore,” Musk said.
That’s a very problematic claim - and it might only be true if you compare completely unassisted vehicles to L2 Teslas.
Other brands also have a plethora of L2 features, but they are marketed and designed in a different way. The L2 features are active but designed to keep the driver engaged in driving. For example, lane keeping in my car - you don’t notice it when driving, it is just below your level of attention. But when I’m unconcentrated for a moment the car just stays in the lane, even on curving roads. It’s just designed to steer a bit later than I would.
Adaptive speed control is just sold as adaptive speed control - though I did notice it uses radar AND the cameras once, as it considers my lane free as soon as the car in front of me clears the lane markings with its wheels (when changing lanes).
It feels like the software in my car could do a lot more, but its features are undersold.
The combination of a human driver and the driver assist systems in combination makes driving a lot safer than relying on the human or the machine alone.
In fact the braking assistant has once stopped my car in tight traffic before I could even react, as the guy in front of me suddenly slammed their brakes. If the system had failed and not detected the situation then it would have been my job to react in time. (I did react, but can’t say if I might have been fast enough with reaction times)
What Tesla does with technology is impressive, but I feel the system could be so much better if they didn’t compromise safety in the name of marketing and hyperbole.
If Tesla’s Autopilot was designed from the ground up to keep the driver engaged, I believe it would really be the safest car on the road.
I feel they are rather designed to be able to show off “cool stuff”.
ForgotAboutDre@lemmy.world 6 months ago
Tesla’s autopilot isn’t the best around. It’s just the most deployed and advertised. People creating autopilot responsibly don’t beta test them with the kind of idiots that think Tesla autopilot is the best approach.
nek0d3r@lemmy.world 6 months ago
I love to hate on musky boi as much as the next guy, but how does this actually compare to vehicular accidents and deaths overall? CGP Grey had the right idea when he said they didn’t need to be perfect, just as good as or better than humans.
machinin@lemmy.world 6 months ago
Grey had the right idea when he said they didn’t need to be perfect, just as good as or better than humans.
The better question - does Tesla’s FSD cause drivers to have more accidents than other driver-assist technologies? It seems like a yes from this article and other data I’ve linked elsewhere in this thread.
JackbyDev@programming.dev 6 months ago
CGP Grey also seems to believe self driving cars with the absence of traffic lights is the solution to traffic as opposed to something like trains.
KingThrillgore@lemmy.ml 6 months ago
A comment above puts it at 11.3 deaths per 100M miles - roughly 8x the human fatality rate
phoenixz@lemmy.ca 6 months ago
Yeah, and that’s the problem: they’re nowhere near “as good”
Snapz@lemmy.world 6 months ago
“Hey, you guys know that I love… but…”
PlexSheep@infosec.pub 6 months ago
Fuck cars, those ones specifically
dependencyinjection@discuss.tchncs.de 6 months ago
When I see this comment it makes me wonder, how do you feel when you see someone driving a car?
Should I feel guilty for owning a car. I’m 41 and I got my first car when I was 40, because I changed careers and it was 50 miles away.
I rarely used it outside of work and it was a means to get me there. I now work remote 3 days so only drive 2.
I don’t have social media or shop with companies like Amazon. I have just been to my first pro-Palestine protest.
Am I to be judged for using a car?
Numberone@startrek.website 6 months ago
Is it linked to excess deaths? Technically it could be saving lives at a population scale. I doubt that’s the case, but it could be. I’ll read the article now and find out.
TypicalHog@lemm.ee 6 months ago
As I said! People in this thread are dumb (IMO). If they read the article they would see that most of these crashes were because of autopilot misuse. I’m highly confident that even with these deaths, there would be more than this if there were no autopilot at all and these people were driving manually. I’ve got no data on this, but that’s just my hunch.
NikkiDimes@lemmy.world 6 months ago
Well, did you find out?
unreasonabro@lemmy.world 6 months ago
Obviously the time to react to the problem was before the system told you about it, that’s the whole point, THE SYSTEM IS NOT READY. Cars are not ready to drive themselves. But fuck it let’s do it anyway, sure, and while we’re at it we can do away with the concept of the driver’s license because nothing matters any more who gives a shit we’re all obviously fucking retarded.
alienanimals@lemmy.world 6 months ago
Tesla has very misleading marketing surrounding the “autonomy” of their vehicles. Mercedes-Benz is the first (and only) company in the US with a certified Level 3 system - conditional automation, not full self-driving.
catch22@programming.dev 6 months ago
What!!! I thought Elon had it all figured out, No Way!
twitter.com/elonmusk/status/1744821656990675184
\s
Landsharkgun@midwest.social 6 months ago
Stop. Using. Cars.
letsgo@lemm.ee 6 months ago
OK.
Question: how do you propose I get to work? It’s 15 miles, there are no trains, the buses are far too convoluted and take about 2 hours each way (no I’m not kidding), and “move house” is obviously going to take too long (“hey boss, some rando on the internet said “stop using cars” so do you mind if I take indefinite leave to sell my house and buy a closer one?”).
Toes@ani.social 6 months ago
This is speculation, but were most of them from people who disabled the safety features?
root@precious.net 6 months ago
There are some real Elon haters out there. I think they’re ugly as sin but I’m happy to see more people driving vehicles with all the crazy safety features, even if they aren’t perfect.
You’re in control of a massive vehicle capable of killing people and destroying property, you’re responsible for it.
NutWrench@lemmy.world 6 months ago
“self-driving cars” are not going to be a thing within our lifetimes. It’s a problem that requires MUCH smarter AIs than we currently have.
autotldr@lemmings.world [bot] 6 months ago
This is the best summary I could come up with:
In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds,” according to a federal investigation published today.
The Tesla driver was using Autopilot, the automaker’s advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars.
NHTSA was prompted to launch its investigation after several incidents of Tesla drivers crashing into stationary emergency vehicles parked on the side of the road.
Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.
Tesla issued a voluntary recall late last year in response to the investigation, pushing out an over-the-air software update to add more warnings to Autopilot.
The findings cut against Musk’s insistence that Tesla is an artificial intelligence company that is on the cusp of releasing a fully autonomous vehicle for personal use.
The original article contains 788 words, the summary contains 158 words. Saved 80%. I’m a bot and I’m open source!
bitwolf@lemmy.one 6 months ago
I just read on LinkedIn a post from a Tesla engineer laid off.
He said “I checked my email while auto piloting to work”.
The employees know its capabilities better than anyone, and they still take the same stupid risk.
kikutwo@lemmy.world 6 months ago
This just breaking, cars linked to thousands of crashes and deaths.
Socsa@sh.itjust.works 6 months ago
I guess we should ban autopilot so we can go back to nobody having accidents in cars.
antlion@lemmy.dbzer0.com 6 months ago
These are spanning from the earliest adopters, up until August of last year. Plenty of idiots using a cruise control system and trusting their lives to beta software. Not the same as the current FSD software.
Your own car insurance isn’t based on your driving skill when you had your learners permit. When Tesla takes on the liability and insurance for CyberCab, you’ll know it’s much safer than human drivers.
NarrativeBear@lemmy.world 6 months ago
Cars linked to hundreds of crashes, dozens of deaths.
EmperorHenry@discuss.tchncs.de 6 months ago
and the pedestrian emergency brake on Tesla cars, and many other cars with that feature, will sometimes malfunction, causing people behind you to rear-end you.
werefreeatlast@lemmy.world 6 months ago
It’s just a dozen! You know how many people COVID took? And everyone wanted COVID! …it spreads of the air? Where’s my fabric non filtering 😷 mask with added holes baby!? So you know…how cool would it be if you’re riding a ordinary car and someone else is driving it into a wall or semi, except it’s actually not a sentient being but an algorithm? It would be pretty cool right?
magnetosphere@fedia.io 6 months ago
Why does the FTC allow it to be marketed as “Full Self-Driving”? That’s blatant false advertising.
reddig33@lemmy.world 6 months ago
As is “autopilot”. There’s no automatic pilot. You’re still expected to keep your hands on the wheel and your eyes on the road.
halcyoncmdr@lemmy.world 6 months ago
I am so sick and tired of this belief because it’s clear people have no idea what Autopilot on a plane actually does. They always seem to assume it flies the plane and the pilot doesn’t do anything apparently. Autopilot alone does not fly the damned plane by itself.
“Autopilot” in a plane keeps the wings level at a set heading, altitude, and speed. It’s literally the same as cruise control with lane-centering, minus the altitude part, since altitude isn’t an issue on a road.
There are more advanced systems available on the market that can be installed on smaller planes and in use on larger jets that can do things like auto takeoff, auto land, following waypoints, etc. without pilot input, but basic plain old autopilot doesn’t do any of that.
That expanded capability is similar to how things like “Enhanced Autopilot” on a Tesla can do extra things like change lanes, follow highway exits on a navigated route, etc. Or how “Full Self-Driving” is supposed to follow road signs and lights, etc. but those are additional functions, not part of “Autopilot” and differentiated with their own name.
Autopilot, either on a plane or a Tesla, alone doesn’t do any of that extra shit. It is a very basic system.
The average person misunderstanding what a word means doesn’t make it an incorrect name or description.
Catoblepas@lemmy.blahaj.zone 6 months ago
It’s not even the closest thing to self driving on the market, Mercedes has started selling a car that doesn’t require you to look at the road.
Bell@lemmy.world 6 months ago
Only works under 40 mph. Only available in 2 states. Not available until the end of this year.
caranddriver.com/…/2024-mercedes-benz-eqs-s-class…
Thorny_Insight@lemm.ee 6 months ago
You can literally type in an address and the car will take you there with zero input on the driver’s part. If that’s not full self driving then I don’t know what is. What FSD was capable of a year ago and how it performs today is completely different.
Not only do these statistics include the far less capable older versions, they also include accidents caused by Autopilot, which is a different system than FSD. The article also fails to mention how the accident rate compares to human drivers. If we replaced every single car in the US with a self-driving one that’s a 10x safer driver than your average human, you’d still get over 3,000 deaths a year from traffic accidents. That’s 10 people a day. If one wants to ban these systems because they’re not perfect, that means they’d rather have 100 people die every day instead of 10.
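The 10x arithmetic in that comment, made explicit (the baseline is the commenter’s round number, not an official statistic):

```python
deaths_per_year = 36_500   # rough US annual traffic deaths assumed here
safety_factor = 10         # hypothetical "10x safer" autonomous fleet

today_per_day = deaths_per_year / 365
automated_per_day = today_per_day / safety_factor
print(today_per_day)      # 100.0 deaths per day now
print(automated_per_day)  # 10.0 per day with the safer fleet
```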
Turun@feddit.de 6 months ago
That may be because Tesla refuses to publish proper data on this, lol.
Yeah, they claim it’s ten times better than a human driver, but none of their analysis methods or data points are available to independent researchers. It’s just marketing.
machinin@lemmy.world 6 months ago
Who is responsible if there is an accident, you or Tesla? That is the difference from true FSD and regular driver assistance features.
Regarding driving regulations -
If we had better raw data, I’m sure we could come up with better conclusions. Knowing the absolutely tremendous amount of BS that Musk spews, we can’t trust anything Tesla reports. We’re left to speculate.
At this point, it is probably best to compare statistics for other cars with similar technologies. For example, Volvo reported that they went 16 years without a fatal accident in their XC90 model. That was a couple of years ago, I don’t know if they have been able to keep that record up. With that kind of record that has lasted for so long, I think we have to ask why Tesla is so bad.