Self-driving cars should be flawless, and personally, as someone who does not have any professional experience in tech, I do not understand why you’d ever rely on human senses as the model for a machine’s sensors. We can’t even see all the colors, so why would you not give this thing like four different ways of detecting things? If I have a self-driving car I want a fuckin radar on my windshield that knows where every object within 40 feet is.
Comment on “Elon Musk May Have Made a Huge Mistake on Full Self-Driving That It’s Too Late to Correct.”
stsquad@lemmy.ml 3 weeks ago
I can see the argument that visible light should be enough, given we humans can drive with just two eyes and a few mirrors. However, that argument probably misses the millions of years of evolution our neural networks have gone through while hunting and tracking threats, which happens to make predicting where other cars might be mostly fine.
I have a feeling regulators aren’t going to be happy with a claim of driving better than the average human. FSD should be aiming to be at least 10x better than the best human drivers and we’re a long way off from that.
NotMyOldRedditName@lemmy.world 3 weeks ago
Except the radar doesn’t know where every object is. It can’t reliably detect stationary objects while traveling at high speed.
pupbiru@aussie.zone 3 weeks ago
the large majority of current self driving cars have radar, lidar, ultrasonic, and cameras. their detection sets overlap and complement each other, so together they can see a wide array of things no single sensor can. focusing on one sensor and saying “it doesn’t see X” is a very poor argument when the others see those things just fine
NotMyOldRedditName@lemmy.world 3 weeks ago
The point is that when vision is the only fail-safe, reliable sensor, vision MUST work for the vehicle to be truly autonomous.
You can’t rely on radar without vision or lidar because it can’t see stopped vehicles at high speed.
You can’t rely on lidar in rain/fog/snow/dust because the light bounces off the particles and gives bad data, plus it can’t tell you anything about what an object is or might intend to do, only that it’s there.
Only vision can do all of those; it’s just a matter of the number of cameras, camera quality, and AI processing capability.
If vision can do all those things perfectly, maybe you don’t need those other sensors after all?
Im_old@lemmy.world 3 weeks ago
Because tech (any tech) is even more fallible than humans.
DarkSurferZA@lemmy.world 3 weeks ago
This is one of the claims Elon Musk makes a lot when he says humans drive with their eyes, but it’s untrue. We actually have a wide array of sensory systems that help us drive. Firstly, we use our ears, eyes, and body motion to drive. Secondly, unlike a fixed camera mounted on a car, our heads are in constant motion. This means we cover blind spots better than a fixed camera, and we are able to tell whether it’s a small deer really close by or a large deer really far away. Our brains take multiple 3D images and stitch them together to determine size, distance, and speed.
The best way to explain the “driving using your eyes” fallacy is to look at FPV RC cars and see how much sensory information you’ve been robbed of while trying to pilot the vehicle.
NotMyOldRedditName@lemmy.world 3 weeks ago
Nothing you said there can’t be done by cameras, other than sound, and the car has a microphone inside.
All it really means is maybe the car needs more cameras and more microphones.
Comparing images of the same scene from multiple angles can provide accurate distances.
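(For anyone curious how that works: with a rectified stereo pair, depth falls out of simple triangulation, depth = focal length × baseline ÷ disparity. A toy sketch below, assuming idealized pinhole cameras and known calibration; the numbers are purely illustrative, not from any real car.)

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point seen by two rectified cameras.

    focal_px     -- focal length in pixels (from camera calibration)
    baseline_m   -- distance between the two camera centers, in metres
    disparity_px -- horizontal pixel shift of the same point between the images
    """
    if disparity_px <= 0:
        # zero disparity means the point is effectively at infinity
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative: 1000 px focal length, 12 cm baseline, 30 px disparity
print(stereo_depth(1000.0, 0.12, 30.0))  # → 4.0 (metres)
```

The same geometry is why nearby objects shift more between the two views than distant ones, which is essentially what our two eyes (and a moving head) exploit.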
pupbiru@aussie.zone 3 weeks ago
you’re not wrong, but that’s also a fantasy with current technology. meanwhile, cars are dangerous, heavy, hard boxes travelling around at high speed while we “get the technology right”, and that’s unacceptable