Radar. Only a small handful of cars have LIDAR.
Comment on Tesla Vision fails as owners complain of Model 3 cameras fogging up in cold weather
NataliePortland@lemmy.ca 1 year ago
Every other car uses LIDAR and Elon thinks he’s such a forward thinker for shunning it. So dumb
floofloof@lemmy.ca 1 year ago
The driving assist features of my Honda CR-V also stop working whenever it’s snowy. It’s not just a Tesla issue.
Nudding@lemmy.world 1 year ago
You shouldn’t be out of park while there’s snow or ice on your vehicle. Clean off your fucking vehicles.
floofloof@lemmy.ca 1 year ago
Of course, and I do. I’m talking about the snow or ice that accumulates around the front bumper and grille while driving down the highway. At some point the car will notify me that driver assistance features are no longer available due to the sensor being obstructed.
CmdrShepard@lemmy.one 1 year ago
I mean the stuff builds up on the front of your car while driving. Moisture sticks to the front and then refreezes.
Aleric@lemmy.world 1 year ago
As yet another guy who is sick and tired of clumps of snow flying off of people’s vehicles onto my windshield, thank you for your service.
pineapplelover@lemm.ee 1 year ago
His argument makes sense. Human vision is not too different from just a camera. I see the argument for lidar but it can also be a bit more expensive to accomplish the same task. I’m open to listening to your argument as to why lidar technology would be a better path for self driving cars.
nephs@lemmy.world 1 year ago
The eye is the whole fucking argument for stupid creationism. The most complex piece of machinery in the human body and all that.
That man thinks he’s god, able to create similar functionality.
Has he fucking tried to keep his eyes open in fucking cold weather?
Why not just use human eyes outside of earth’s atmosphere?!
He’s just so fucking stupid. Rich and stupid. The shit he spends his “hard earned” money on would be spent so much better and more efficiently by almost anyone else.
poopkins@lemmy.world 1 year ago
That’s ironic, because the evolution of the human eye is a good example of a case where iterative change has led to a clunky and suboptimal design.
I suppose that analogy is oddly appropriate for Tesla Vision.
cricket98@lemmy.world 1 year ago
Yeah, I’m sure all these Lemmy know-it-alls could produce a functional self-driving car
FlyingSquid@lemmy.world 1 year ago
It seems to me that we want to make self-driving cars safer than human drivers. And to make them safer, you want them to use every kind of sensor that is practical to avoid accidents. LIDAR alone isn’t the path. Neither is vision alone.
Also, suggesting that a car with cameras is equivalent to a human with a human brain that has eyes attached to it is a little silly.
accideath@lemmy.world 1 year ago
Humans don’t just use their eyes when driving. Sound and touch also play big roles, for example when it comes to hearing approaching ambulances or feeling road conditions. And we have a really good sense of depth and distance, which is much harder to replicate with just cameras. And even humans aren’t allowed to drive with headphones on (at least here in Germany), because it’s dangerous to limit the amount of sensors available to us.
Besides that, even our sight is faaaar from perfect, and quite a lot of accidents are caused by drivers simply not seeing another driver or some other obstacle. Our vision is pretty good, yes, but the amount of guessing our brain has to do for us to actually see what we do isn’t exactly small.
I don’t know about you, but I would prefer a self driving vehicle to be safer than a human. Because if it isn’t, why bother? And how could it be safer, if it uses less information than humans, who are shit drivers already?
And yes, lidar is more expensive, but so what? It’s cheap enough to add to phones. Expensive phones, yes, but in the grand scheme of things they’re still quite a bit cheaper than a car, and Teslas aren’t exactly cheap cars either. And Tesla used to include radar in their cars, until they didn’t. And the cars didn’t get that much cheaper…
And to give a positive example: Mercedes-Benz was the first to launch a Level 3 autonomous vehicle. And guess what? It uses lidar, audio sensors, road condition sensors, etc. and actually achieved L3 autonomy, while Tesla’s FSD consistently tests as one of the worst performing Level 2 systems in the industry, despite their claims of greatness…
Socsa@sh.itjust.works 1 year ago
Lidar is not just more expensive; it’s also extremely fragile in a vehicle that’s bouncing around at highway speeds.
accideath@lemmy.world 1 year ago
Well, that doesn’t seem to bother any other car manufacturer much. Probably because the benefits outweigh the complexity disadvantages.
loutr@sh.itjust.works 1 year ago
The obvious argument is that eyes are far from perfect and fail us all the time, especially when going fast. We are quite good at making up for it, but saying “We have eyes so my self driving cars will have eyes too” is pretty fucking dumb.
ItsMeSpez@lemmy.world 1 year ago
We also recognized that we need to keep our windshields clear of fog in order for our eyes to work properly.
aesthelete@lemmy.world 1 year ago
> Human vision is not too different from just a camera.
Oh yeah, human vision also causes people to mistake a blue truck for the sky and drive right into it. /s
AngryCommieKender@lemmy.world 1 year ago
It was a white/gold truck, not a blue/black truck…
aesthelete@lemmy.world 1 year ago
Hah even worse
Socsa@sh.itjust.works 1 year ago
Literally yes? Humans hit way dumber shit every single day.
aesthelete@lemmy.world 1 year ago
Sure but usually because they weren’t looking or couldn’t see it…not because they mix up birds and stop signs or some of the other dumb shit computer vision algorithms do.
CmdrShepard@lemmy.one 1 year ago
That’s literally what happened.
GoodEye8@lemm.ee 1 year ago
That argument doesn’t make sense because human vision isn’t that great either. When it’s dark or raining or snowing or foggy our vision is pretty shit.
I’m not saying LIDAR is better, but rather pointing out that you actually want different types of sensors to accurately assess the traffic, because just one type of sensor isn’t likely to cut it. If you look at other manufacturers, they’re not using only LIDAR or only cameras. Some use LIDAR + camera, some use RADAR + camera, some use LIDAR, RADAR, and camera. And I’m pretty sure that as manufacturers aim for higher SAE levels they will add even more sensors to their cars. It’s only Tesla who thinks they can somehow do more with less.
Socsa@sh.itjust.works 1 year ago
People here have no idea what they are talking about, or how absurdly difficult it is to actually deploy lidar to a consumer vehicle. There’s a reason why Tesla is shipping more advanced driver assist tech than anyone else, and it’s because they went against the venture capitalist Lidar obsession which is holding everyone back. There’s a reason why there are basically zero cars shipping with lidar today.
You don’t need mm-precision depth maps to do self driving. Not that you get that from lidar on rough roads anyway.
pineapplelover@lemm.ee 1 year ago
There are some test cars with lidar. It has the spinny thing on top and looks pretty interesting. I believe those cars are pretty successful. I don’t think they’re being mass produced though, because the costs might be a little prohibitive.
learningduck@programming.dev 1 year ago
The “most advanced” system isn’t even at autonomy Level 3. It’s funny that Mercedes was the first to get Level 3 approval in California, and they don’t even boast about it that much.
That aside, a secondary sensor that helps verify whether the vision got it right would be nice. It could be just a radar or whatever. Imagine the vision failing to recognize a boy in a Halloween costume as a person; at least the secondary sensor would tell the car to stop due to the contradicting perception.
pineapplelover@lemm.ee 1 year ago
I think it’s undeniable that the combination of camera and lidar will be the best solution. I just hope it can be cost effective. Maybe over time we can adapt and improve the technology and make it more economical, so that our roads are safer.
learningduck@programming.dev 1 year ago
Think of that Coyote and Road Runner cartoon. If there’s graffiti that looks like a tunnel, the coyote may run into it based on vision alone, but a secondary sensor will help tell that there’s a wall.
IRL, if the vision fails to recognize that there’s something on the road, at least a secondary sensor will protest.
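The “secondary sensor protests” idea is basically a conservative OR over independent sensors: if either one reports an obstacle, the car stops. A toy sketch (hypothetical function and names, not any real manufacturer’s logic):

```python
# Minimal sketch of a conservative fusion rule: a secondary sensor
# (e.g. radar) can veto the camera's "all clear". Purely illustrative.

def should_brake(camera_sees_obstacle: bool, radar_sees_obstacle: bool) -> bool:
    """Brake if either sensor reports an obstacle (logical OR).

    A disagreement between sensors is treated as danger: the camera
    being fooled by a painted "tunnel" doesn't matter if the radar
    still gets a solid return from the wall ahead.
    """
    return camera_sees_obstacle or radar_sees_obstacle

# The painted-tunnel case: vision is fooled, radar is not.
print(should_brake(camera_sees_obstacle=False, radar_sees_obstacle=True))  # True
```

The trade-off, of course, is that an OR rule also inherits every sensor’s false positives (phantom braking), which is part of why fusion in real systems is much more involved than this.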
HERRAX@sopuli.xyz 1 year ago
You can also try driving in direct sunlight without sunglasses or the sun visor. You get notifications and beeping noises whenever you drive in clear weather, making the lane assist (I refuse to call it autopilot) quite unreliable in most weather… It’s actually worse for me than driving in cold weather.
poopkins@lemmy.world 1 year ago
While I disagree with you that you think his argument makes sense, I’m upvoting your comment because it encourages discourse and provides more insight and depth to this topic. I wish more people on Lemmy did the same.
Socsa@sh.itjust.works 1 year ago
Very few production vehicles have lidar. Like almost zero
eskimofry@lemmy.world 1 year ago
Think of it this way: He is such a visionary that fog is blinding his vehicles.