Tesla whistleblower calls cars with Autopilot “experiments in public roads”
Submitted 11 months ago by DannyMac@lemmy.world to technology@lemmy.world
Comments
nicolascage@lemmy.world 11 months ago
Considering 68.25% of all US crashes involving driver assist systems were due to Tesla Autopilot, I agree it’s an experiment.
EvacuateSoul@lemmy.world 11 months ago
This is one of those times you should realize how misleading statistics can be. Can you think of what might be a more informative measurement if we are actually after the truth?
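For example (purely hypothetical numbers below, just to show the shape of it): a rate like crashes per million miles driven with the system engaged controls for fleet size and how much the feature actually gets used, while a raw share of reported crashes does not. A quick sketch:

```python
# Hypothetical numbers for illustration only -- not real NHTSA or Tesla data.
# A raw share of reported ADAS crashes mostly reflects fleet size and usage;
# a per-mile rate with the system engaged is closer to what we actually care about.

def crashes_per_million_miles(crashes: int, engaged_miles: float) -> float:
    return crashes / (engaged_miles / 1_000_000)

# Brand A: huge fleet, lots of miles on the system (made-up figures)
rate_a = crashes_per_million_miles(crashes=800, engaged_miles=4_000_000_000)
# Brand B: small fleet, far fewer engaged miles (made-up figures)
rate_b = crashes_per_million_miles(crashes=200, engaged_miles=200_000_000)

print(f"Brand A: {rate_a:.2f} crashes per million engaged miles")  # 0.20
print(f"Brand B: {rate_b:.2f} crashes per million engaged miles")  # 1.00
# Brand A accounts for 80% of the reported crashes here, yet its per-mile
# rate is 5x lower -- exactly the kind of thing a raw percentage hides.
```

That rate still has its own problems (who reports what, highway vs. city miles, etc.), but it’s a lot closer to the truth than a headline percentage.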
BleatingZombie@lemmy.world 11 months ago
The number of inches of Elon’s dick is in your throat?
burgersc12@sh.itjust.works 11 months ago
100% of the crashes were caused by autopilot
nicolascage@lemmy.world 11 months ago
Okay mate, why don’t you show us all what the “more informative measurement” is for this?
FinalRemix@lemmy.world 11 months ago
My parents have that Lidar cruise control on their Toyota. It was active, though not actually engaged, one day when I was driving, and the damned car started freaking out, BRAKE BRAKE BRAKE, thinking I was about to plow into a parked car because there was a gentle curve in the road.
RagingRobot@lemmy.world 11 months ago
My car likes to slam on the brakes when I back out of my driveway sometimes. Super annoying
BB69@lemmy.world 11 months ago
FSD, maybe. But autopilot operates fine and is no different than what most major manufacturers offer.
Ghostalmedia@lemmy.world 11 months ago
The last time I tried Autopilot was 4 years ago, so I imagine things have gotten better. That said, on a test drive on a rainy day, auto lane change did some frightening stuff. It thought the lanes were clear, learned they weren’t, then violently ripped the car back into the original lane in conditions that were prime for hydroplaning.
My wife and I were scared shitless, and the woman from Tesla, who was also in the car, tried to reassure us by saying “it’s ok, this is normal.”
Then we returned the car to the parking lot, and auto park almost took out a kid in an enclosed parking structure.
I imagine it’s gotten better in 4 years, but how that was street legal baffles me.
BB69@lemmy.world 11 months ago
None of what you mentioned is in basic autopilot. Autopilot is lane keep and traffic aware cruise control only.
helenslunch@feddit.nl 11 months ago
Auto lane change is not a function of Autopilot
AtmaJnana@lemmy.world 11 months ago
My vehicle can do almost all the same stuff as “autopilot”, but it turns the autosteering and cruise off if I don’t touch the wheel every 30 seconds. It’s all the same types of sensors, etc. And mine isn’t even a luxury brand, just the higher-end trim package of a budget vehicle.
BB69@lemmy.world 11 months ago
Autopilot also shuts off with no driver input. Faster than 30 seconds too.
autotldr@lemmings.world [bot] 11 months ago
This is the best summary I could come up with:
“In late 2021, Lukasz realised that—even as a service technician—he had access to a shockingly wide range of internal data at Tesla,” the group’s prize announcement said.
Krupski was also featured last month in a New York Times article titled, “Man vs. Musk: A Whistleblower Creates Headaches for Tesla.”
But Krupski now says that “he was harassed, threatened and eventually fired after complaining about what he considered grave safety problems at his workplace near Oslo,” the NYT report said.
Krupski “was part of a crew that helped prepare Teslas for buyers but became so frustrated with the company that last year he handed over reams of data from the carmaker’s computer system to Handelsblatt, a German business newspaper,” the report said.
The data Krupski leaked included lists of employees and personal information, as well as “thousands of accident reports and other internal Tesla communications.”
Krupski told the NYT that he was interviewed by the NHTSA several times, and has provided information to the US Securities and Exchange Commission about Tesla’s accounting practices.
The original article contains 705 words, the summary contains 172 words. Saved 76%. I’m a bot and I’m open source!
spudwart@spudwart.com 11 months ago
Non-consensual human experimentation is a war crime.
lando55@lemmy.world 11 months ago
It’s peacetime though, so it doesn’t qualify
/s
lud@lemm.ee 11 months ago
It’s consensual if you buy it though.
Calling it a war crime is slightly extreme.
spudwart@spudwart.com 11 months ago
Except the other drivers on the road aren’t all in Teslas, yet they are non-consensually, and possibly even unknowingly, part of this experiment.
there1snospoon@ttrpg.network 11 months ago
If you hit another motorist or pedestrian, it’s no longer consensual.
War crime is a tad much, sure. Let’s just make it a felony.
rsuri@lemmy.world 11 months ago
Random question I’ve always wondered about. My understanding is that autopilot relies on optical sensors exclusively. And image recognition tends to rely on getting loads of data to recognize particular objects. But what if there’s an object not in the training data, like a boulder in a weird shape? Can autopilot tell anything is there at all?
captainjaneway@lemmy.world 11 months ago
Yeah, obstructions can be generalized to “the road ahead is blocked.” Object recognition includes recognizing the shape of an object via curves, shadows, depth, etc. You don’t need to know it’s a boulder to know a large object is in the road.
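A toy sketch of that idea (made-up depth values and thresholds, nothing to do with Tesla’s actual stack): instead of classifying the object, you just flag anything in the drive path that is much closer than the road surface should be.

```python
import numpy as np

# Generic obstacle check on a per-pixel depth estimate (meters) for the lane
# ahead. Hypothetical numbers throughout -- this only shows the general concept.
def has_obstacle(depth_map: np.ndarray, expected_road_depth: np.ndarray,
                 margin_m: float = 2.0, min_pixels: int = 500) -> bool:
    # Enough pixels significantly closer than the expected flat-road depth
    # means "something is there", whether it's a boulder, a couch, or a
    # weirdly shaped anything.
    closer_than_road = depth_map < (expected_road_depth - margin_m)
    return int(closer_than_road.sum()) >= min_pixels

h, w = 120, 160
# Empty road: depth runs from ~60 m at the top of the image to ~5 m at the bottom.
expected = np.linspace(60.0, 5.0, h)[:, None] * np.ones((1, w))

scene = expected.copy()
scene[40:70, 60:100] = 12.0   # an oddly shaped "something" about 12 m ahead

print(has_obstacle(scene, expected))     # True  -> slow down / alert
print(has_obstacle(expected, expected))  # False -> road looks clear
```

The harder question is whether the depth you can estimate from cameras alone is reliable enough for this in rain, glare, darkness, etc., and that’s a separate argument.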
topinambour_rex@lemmy.world 11 months ago
What about something like snow? The vehicle has no idea what to do or where to go.
helenslunch@feddit.nl 11 months ago
I’ve put like 10k miles on AutoPilot. It’s not an experiment.
nicolascage@lemmy.world 11 months ago
Ah yes, “I’ve personally done this and I’m the most important, therefore it’s not true. It’s the experts and engineers who are wrong.”
Besides “everybody look at me!!”, what is your point?
helenslunch@feddit.nl 11 months ago
It’s not that I know better than them. It’s that they are blowing things out of proportion.
tsonfeir@lemm.ee 11 months ago
The more important person to punish is the one who let them do it