Tesla Whistleblower Says ‘Autopilot’ System Is Not Safe Enough To Be Used On Public Roads::“It affects all of us because we are essentially experiments in public roads.”
Submitted 11 months ago by L4s@lemmy.world [bot] to technology@lemmy.world
https://jalopnik.com/tesla-whistleblower-says-autopilot-system-is-not-safe-e-1851077068
No shit
These were my first words upon just seeing the title 😂
Is it really whistleblowing if what they’re leaking is already common knowledge?
The proof comes from internal information and data.
From the NYT: “The data leaked by Krupski included lists of Tesla employees, often featuring their Social Security numbers, in addition to thousands of accident reports and internal Tesla communications. Handelsblatt and others have used these internal memos and emails as the basis for stories on the dangers of Autopilot and the reasons for the three-year delay in Cybertruck deliveries.”
How does any of that prove the claim? Surely independent crash data would show these vehicles are involved in many more accidents than other vehicles if it’s true, but that doesn’t seem to be the case.
Not common enough.
Some even still believe Elmo is a genius.
Elmo
My first thought was about the muppet and I was like “He is tho.”
Is it really whistleblowing
It is, and it is important.
Employees are usually bound by loyalty and contract not to tell any internals. But public knowledge often needs confirmation, otherwise it is only rumours.
Isn’t this already an established fact?
Unfortunately this is one of those things that you can’t significantly develop/test on closed private streets. They need the scale, and the public traffic, and the idiots and the drunkards and the kids speeding. The only thing that’s going to stop them from working on autopilot will be that it’s no longer financially reasonable to keep going. Even a couple handfuls of deaths aren’t going to stop them.
Unfortunately this is one of those things that you can’t significantly develop/test on closed private streets.
Even if we hold this to be true (and I disagree in large part), the point is that Tesla’s systems aren’t at that stage yet. Failing to recognize lights correctly during live demos and such are absolutely things you can test and develop on closed streets or in a lab. Teslas shouldn’t be allowed on roads until they’re actually at a point where there are no glaring flaws. And even then, they should only be allowed in smaller numbers at first.
Do you really think they didn’t test that before they got to this point?
I’m willing to bet they had been through that intersection before hundreds of times and never seen this. It’s not like it can’t detect a stoplight and they’re just out there randomly running through them all.
Of the millions of variables that were around them, something blinded it to the light this time. The footage from that run has probably been reviewed ad nauseam at this point, and has done more to help them find the problem than they could have done sitting in a closed warehouse making guesses while the car never fails to detect a red light.
That’s true, but I think the issue people have with “AutoPilot” is about marketing.
Tesla brands their cars’ solution as a full replacement for human interaction, and word from Musk, other Tesla employees, media personalities close to Tesla, and fanboys all makes it out like the car drives itself and the only reason you need a driver in place is to satisfy laws.
It’s bullshit. They know exactly what they’re doing when they do the above, when they call their system “AutoPilot”, when Musk makes claims his cars can travel from one side of the US to the other without human interaction (only to never actually do it, of course!), and sells car upgrades as Full Self Driving support.
If they branded it as Assisted Driving, Advanced Cruise Control, Smart Cruise, or something along those lines, like all the other carmakers do with their similar systems, I’d be less inclined to blame Tesla when there’s an unfortunate incident.
But Tesla markets and encourages, both officially and unofficially, that their cars have the ability to drive themselves, look after themselves, and that you’re safe when using the system. It’s a lie and I’m absolutely astounded they’ve had little more than a series of slaps on the wrist.
100% accurate.
They want people to use it so they get data from it. Accidents and deaths will happen… honestly, they’ll always happen… they happen now without it, it’s just more acceptable because it’s human error. Road safety is absolutely awful.
The reason they get away with it is Lobbying, Money and Political favors. They got where they are by greasing a whole shit ton of wheels with dumptrucks of money.
Shitty means, but pretty righteous ways.
My pet theory is that people are so erratically different as drivers that the data they’re collecting is too variable to make it useful. Throw in people who aren’t paying attention or are on their phones and Teslas might teach themselves bad behavior.
Tbf, Teslas are the only cars that actually know you are on your phone and/or not paying attention.
The fact is that most technology that we take for granted today went through a similar evolutionary phase with public use before they became as safe as they are now, especially cars themselves. For well over a century, the automobile has made countless leaps and bounds in safety improvements due to data gathered from public use studies.
We learn by doing.
Well, they got rid of their public relations department because Elon’s Twitter is all you need.
I don’t get it…
Autopilot averages an airbag deployment every five million miles.
The average driver in the U.S. averages one every 600,000 miles.
Idk. Doesn’t seem like it works perfectly, but it does seem to work pretty well.
The comparison is a little flat when you consider autopilot has minimum viable weather and road condition requirements to activate, no snow or hail, etc, while human drivers must endure and perform optimally in all road and weather conditions.
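The selection effect can be made concrete with some made-up numbers. A sketch only: the 70/30 easy/hard split and the per-condition deployment rates below are invented for illustration, not taken from any Tesla or NHTSA data.

```python
# Hypothetical illustration of why the 5M-vs-600k comparison is skewed:
# Autopilot miles are logged only in "easy" conditions, while the human
# figure blends easy and hard miles. All numbers below are made up.

easy_share, easy_rate = 0.7, 1 / 2_000_000  # human deployments/mile, good conditions
hard_share, hard_rate = 0.3, 1 / 240_000    # human deployments/mile, bad conditions

# The blended human rate works out to roughly the reported 1 per ~600k miles.
blended = easy_share * easy_rate + hard_share * hard_rate
print(f"blended human rate: 1 per {1 / blended:,.0f} miles")  # 1 per 625,000

# The fairer comparison is Autopilot vs. humans on easy miles only.
autopilot_rate = 1 / 5_000_000
print(f"Autopilot advantage vs. all miles:  {blended / autopilot_rate:.1f}x")   # 8.0x
print(f"Autopilot advantage vs. easy miles: {easy_rate / autopilot_rate:.1f}x") # 2.5x
```

Under these invented numbers, the headline 8x advantage shrinks to 2.5x once you condition on comparable miles. The real split is unknown, which is exactly the point.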
Man, that’s some interesting brigading we have going on here. You throw facts at them and they just explode.
He says this yet they’re already out on the roads logging millions of miles. Just because we get sensational headlines about a driver behind the wheel who crashed into the side of a semi, doesn’t mean they’re any more dangerous than any other car. AFAIK Tesla still has far fewer wrecks than many others. These driving aids have a lot of room for improvement but they only need to be better than an average driver in order to reduce accidents.
I think it’s very likely that Tesla especially would go ahead with technology that is dangerous in certain situations, as long as it only fails rarely.
We all know what kind of a man Elon is.
It’s not just about what he’s saying. It’s about the internal data he’s leaking to back up the claims.
Nah it’s because they decided to use cameras instead of LiDAR and then try to make it autonomous instead of driver aid.
AI is at its best when it’s opening up productivity and freedom to think critically or leisurely, the same way sticky notes help someone study.
Autopilot is just advanced cruise control. I think you’re conflating it with FSD which is their autonomous driving feature.
I mean, are humans really any better?
I know it’s not the answer you’re looking for but, what is safer for pedestrians, cyclists and other drivers, is to have less cars on the roads. Buses can move dozens of people with a single trained professional driver. Trains can move hundreds. It’s illogical to try to push for autonomous cars for individuals when we already have “self driving” technologies that are much much safer and much more efficient.
I agree. That’s why I don’t own a car.
You anti car people find any way to insert your views into a conversation. Let me guess, you also do Crossfit?
Depends on the Autopilot feature.
I was test driving a Model 3, and Summon almost ran over a little kid in the parking lot until my wife ran in front of the car.
At least when my car’s collision sensors misread something, my eyeballs are there for redundancy.
Someone paying proper attention, probably. But a huge chunk of accidents happen because idiots are looking at their phones or falling asleep at the wheel, and at least a self-driving car won’t do that.
No, they just relinquish control to a sleepy driver without a warning whenever they are about to crash.
What? No. Of course not.
Gee, thanks for reporting on the obvious, Jalopnik.
We knew this. And even this whistleblower report is old.
What a garbage news outlet.
Oh look; more FUD.
Elon is definitely short on your investment my dude.
That’s your response? Claiming that I think he’s going to pay me?
cm0002@lemmy.world 11 months ago
I lost all trust in their ‘Autopilot’ the day I read Musk said (paraphrasing) “All we need are cameras; there’s no need for secondary/tertiary LIDAR or other expensive setups.”
Like TFYM? No backups?? Or backups to the backups?? On a life fucking critical system?!
Ottomateeverything@lemmy.world 11 months ago
As much as I lost trust in his bullshittery a long time ago, his need to mention the cost of critical safety systems is what stuck out to me the most here. That’s how you know the priorities are backwards.
TheRealKuni@lemmy.world 11 months ago
Also, my robot vacuum has LiDAR. It’s not expensive relative to a car.
frozen@lemmy.frozeninferno.xyz 11 months ago
Skimping on cost is how disasters happen. Ask John Hammond. “Spared no expense,” my ass; hire more than 2 programmers, you cheap fuck.
mosiacmango@lemm.ee 11 months ago
The crazier and stupider part was that his justification was that “people drive and they only have eyes. We should be able to do the same.”
tcely@fosstodon.org 11 months ago
That's terrifying for showing how little he understands about the problem he is attempting to solve.
Humans use up to four senses at times to accomplish the task of driving.
Akasazh@feddit.nl 11 months ago
Reminds me of Mao not brushing his teeth, because tigers didn’t brush theirs either.
JohnEdwa@sopuli.xyz 11 months ago
Ah, but you see, his reasoning is that what if the camera and lidar disagree, then what? With only a camera based system, there is only one truth with no conflicts!
Like when the camera sees the broad side of a truck as clear skies and slams right into it, there was never any conflict anywhere, everything went just as it was suppo… Wait, shit.
brbposting@sh.itjust.works 11 months ago
RIP Joshua Brown.
DreadPotato@sopuli.xyz 11 months ago
This (sensor fusion) is a valid issue in mobile robotics. Adding more sensors doesn’t necessarily improve stability or reliability.
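For what it’s worth, “what if the camera and lidar disagree?” has a textbook answer: weight each measurement by its confidence. Here is a minimal inverse-variance fusion sketch with illustrative numbers, nothing Tesla-specific:

```python
def fuse(z1, var1, z2, var2):
    """Fuse two noisy measurements of the same quantity by inverse-variance
    weighting (the static, one-dimensional core of a Kalman update)."""
    w1, w2 = 1 / var1, 1 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1 / (w1 + w2)  # always lower than either input variance
    return fused, fused_var

# Camera estimates an obstacle at 50 m but is noisy (variance 25 m^2);
# lidar says 40 m and is precise (variance 1 m^2).
est, var = fuse(50.0, 25.0, 40.0, 1.0)
print(f"fused: {est:.1f} m (var {var:.2f})")  # pulled almost entirely toward lidar
```

The math itself isn’t the hard part; the hard part in practice is knowing when a sensor’s own confidence estimate is wrong, which is why adding sensors doesn’t automatically add reliability.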
Gargantu8@lemmy.world 11 months ago
What brand of car has better autopilot with other sensors?
ImFresh3x@sh.itjust.works 11 months ago
Uhhhh…
…any Level 4 car, actually, according to the federal government and all the agencies who regulate this stuff.
Navya, Volvo/Audi, Mercedes, Magna, Baidu, Waymo.
Tesla isn’t even trying to go past Level 3 at this point.
chakan2@lemmy.world 11 months ago
All of them.
chitak166@lemmy.world 11 months ago
To be fair, humans have proven all you need are visual receptors to navigate properly.
Maalus@lemmy.world 11 months ago
To be fair, current computers / AI / whatever marketing name you call them aren’t as good as human brains.
0xD@infosec.pub 11 months ago
Visual receptors… And 3-dimensional vision with all the required processing and decision making behind that based on the visual stimuli, lol.
cm0002@lemmy.world 11 months ago
And how many vehicle accidents and deaths are there today? If anything, that proves humans suck at driving.
No we don’t; we use sight, sound, and touch/feeling to drive at a minimum.
lefaucet@slrpnk.net 11 months ago
Not to be a hard-on about it, but if the cameras have any problem, Autopilot ejects gracefully and hands it over to the driver.
I ain’t no Elon dick rider, but I got FSD, and the radar would see manhole covers and freak the fuck out. It was annoying as hell and pissed my wife off. The optical depth estimation is now far more useful than the radar sensor.
Lidar has severe problems too. I’ve used it many times professionally for mapping spaces. Reflective surfaces fuck it up. It delivers bad data frequently.
Cameras will eventually be great! Really, they already are, but they’ll get orders of magnitude better. Yeah, 4 years ago the AI failed to recognize a rectangle as a truck, but it ain’t done learning yet.
That driver really should have been paying attention. The car fucking tells you to all the time.
If a camera has a problem, the whole system aborts.
In the future this will mean the car will pull over, but it’s, as it makes totally fucking clear, in beta. So for now it aborts and passes control to the human who is paying attention.
BaronDoggystyleVonWoof@lemmy.world 11 months ago
So I drive a Tesla as well. Quite often I get the message that the camera is blocked by something (like sun, fog, heavy rain).
You can’t have a reliable self driving system if that is the case.
Furthermore, isn’t it technically possible to train the lidar and radar with AI as well?
NeoNachtwaechter@lemmy.world 11 months ago
Gracefully? LMAO
You can come back when it gives at least 3 minutes warning time in advance, so that I can wake up, get my hands out of the woman, climb into the driver seat, find my glasses somewhere, look around where we are, and then I tell that effing autopilot that it’s okay and it is allowed to disengage now!
elephantium@lemmy.world 11 months ago
This is exactly the problem. If I’m driving, I need to be alert to the driving tasks and what’s happening on the road.
If I’m not driving because I’m using autopilot, … I still need to be alert to the driving tasks and what’s happening on the road. It’s all of the work with none of the fun of driving.
Fuck that. What I want is a robot chauffer, not a robot version of everyone’s granddad who really shouldn’t be driving anymore.
chakan2@lemmy.world 11 months ago
Just in time to slam you into an emergency vehicle at 80…but hey…autopilot wasn’t on during the impact, not Musk’s fault.
noxy@yiffit.net 11 months ago
good thing regular cameras aren’t affected by reflective surfaces
oh wait