The probe homes in on one of Tesla’s most eyebrow-raising driver-assistance decisions: its insistence on relying exclusively on cameras instead of the LiDAR and radar its competitors use, sensors CEO Elon Musk has long derided as a “crutch.”
In 2022, the company went all-in on cameras, ditching ultrasonic sensors in its vehicles altogether. That decision could prove to be a major mistake: Tesla is struggling to catch up with its competition, and it has now promised robust self-driving capabilities to owners whose cars may lack the necessary sensor hardware.
stsquad@lemmy.ml 1 week ago
I can see the argument that visible light should be enough, given we humans can drive with just two eyes and a few mirrors. However, that argument probably misses the millions of years of evolution our neural networks have gone through while hunting and tracking threats, which happens to make predicting where other cars will be mostly fine.
I have a feeling regulators aren’t going to be happy with a claim of merely driving better than the average human. FSD should be aiming to be at least 10x better than the best human drivers, and we’re a long way off from that.
DarkSurferZA@lemmy.world 1 week ago
This is one of the claims Elon Musk repeats a lot: that humans drive with their eyes. But it’s untrue. We actually have a wide array of sensory systems that help us drive. Firstly, we use our ears, eyes and sense of body motion. Secondly, unlike a fixed camera mounted on a car, our heads are in constant motion. This means we cover blind spots better than a fixed camera, and we can tell whether it’s a small deer really close by or a large deer really far away. Our brains take multiple 3D images and stitch them together to determine size, distance and speed.
The best way to expose the “driving with your eyes” fallacy is to try piloting an FPV RC car and notice how much sensory information you’ve been robbed of.
NotMyOldRedditName@lemmy.world 1 week ago
Nothing you said there can’t be done by cameras, other than sound, and the car has a microphone inside.
All it really means is maybe the car needs more cameras and more microphones.
Comparing images of the same object from multiple angles can give you accurate distances.
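[For illustration: the commenter is describing stereo triangulation. A minimal sketch of the underlying math, assuming an idealized rectified stereo pair; the focal length, baseline, and disparity values below are made up, not from any real camera rig:]

```python
# Idealized rectified stereo pair: depth follows from similar triangles,
#   depth = focal_length * baseline / disparity
# All numbers here are illustrative only.

def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to a point seen by both cameras of a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical rig: 800 px focal length, cameras 30 cm apart.
# A feature shifted 8 px between the two images is ~30 m away.
print(stereo_depth(800.0, 0.30, 8.0))  # 30.0 (meters)
```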
Doom@ttrpg.network 1 week ago
Self-driving cars should be flawless, and personally, as someone who does not have any professional experience in tech, I do not understand why you’d ever rely on human senses to act as sensors for a machine. We can’t even see all the colors, so why would you not give this thing like four different ways of detecting things? If I have a self-driving car I want a fuckin radar on my windshield that knows where every object within 40 feet is.
NotMyOldRedditName@lemmy.world 1 week ago
Except radar doesn’t know where every object is. Traditional automotive radar can’t reliably detect stopped objects while you’re traveling at high speed: it filters out returns that aren’t moving relative to the ground, so that signs and overpasses don’t trigger constant false alarms, and a stopped car gets filtered right along with them.
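[For illustration: a toy sketch of that stationary-clutter filtering. The threshold, speeds, and function are hypothetical, meant only to show why a stopped vehicle can be rejected as clutter:]

```python
# Toy model of why legacy automotive radar can miss stopped vehicles:
# a return closing at exactly the ego speed is "stationary" relative to
# the ground and gets discarded as clutter (signs, overpasses, guardrails).
# Threshold and speeds are illustrative only.

EGO_SPEED_MPS = 30.0          # ~108 km/h
CLUTTER_THRESHOLD_MPS = 2.0   # hypothetical stationary-rejection band

def is_kept(radial_velocity_mps: float) -> bool:
    """Keep a radar return only if it moves relative to the ground."""
    ground_speed = radial_velocity_mps + EGO_SPEED_MPS
    return abs(ground_speed) > CLUTTER_THRESHOLD_MPS

print(is_kept(-30.0))  # False: a stopped car is filtered like a road sign
print(is_kept(-5.0))   # True: a slower-moving car ahead is tracked
```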
Im_old@lemmy.world 1 week ago
Because tech (any tech) is even more fallible than humans.