Comment on "Things at Tesla are worse than they appear"
AA5B@lemmy.world 3 days ago
"Cameras alone are not sufficient enough for autonomous driving."
I disagree with this assertion. They're correct that the only beings that can currently drive (humans) rely on vision, so vision alone is sufficient for driving.
But autonomous driving really hasn’t succeeded yet. We still have no idea what is required for autonomous driving or whether we can do it at all, regardless of sensors.
So you're implying that we can definitely do autonomous driving but just can't do it the way humans do, whereas I say we won't know the requirements until some approach actually succeeds, and we may never get there.
tfm@europe.pub 3 days ago
Yeah, sure. If you want the same bad results that humans deliver in terms of crash rates, then it's possible. I wouldn't trust it. Also, human vision and processing are completely different from computer vision and processing.
AA5B@lemmy.world 3 days ago
Presumably we have the intelligence to set requirements before something can be called self-driving - that's usually what the fuss is about: whether the marketing is claiming it's something it's not.
If they fail with their approach, I'm fine with that, just like I'm fine if Waymo fails with theirs. Obviously there's a problem if it runs over some old lady at a stop sign and drags her down the street, but that's clearly a failure on their part.
tfm@europe.pub 3 days ago
We already have that: www.sae.org/blog/sae-j3016-update
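For reference, J3016 defines six levels of driving automation. Here's a rough sketch of those levels as an enum (the class name and the shorthand comments are mine, not the standard's wording):

```python
from enum import IntEnum

class SAEJ3016Level(IntEnum):
    """Rough shorthand for the six SAE J3016 driving automation levels."""
    NO_AUTOMATION = 0            # human does all the driving
    DRIVER_ASSISTANCE = 1        # steering OR speed support (e.g. lane centering or adaptive cruise)
    PARTIAL_AUTOMATION = 2       # steering AND speed support; driver must supervise at all times
    CONDITIONAL_AUTOMATION = 3   # system drives in limited conditions; driver must take over on request
    HIGH_AUTOMATION = 4          # system drives in limited conditions; no takeover expected
    FULL_AUTOMATION = 5          # system drives everywhere, in all conditions

# The marketing fuss above is roughly about a Level 2 system
# being sold as if it were Level 4 or 5.
print(SAEJ3016Level.PARTIAL_AUTOMATION)
```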
AA5B@lemmy.world 3 days ago
Yes, we have the definitions, but I haven't read about whether they're effectively required. Is there a test, a certification authority, rules for liability or revocation? Have we established a way to actually enforce them?
I hope we wouldn't let manufacturers self-certify, although historical data is important evidence. And I hope we don't prop up manufacturers' profitability by either limiting their liability or creating a path to justice that's doomed to fail.
AA5B@lemmy.world 3 days ago
The thing is, humans are horrible drivers, taking a huge toll in lives and property every year.
We may already be at the point where we need to weigh the ethics of inadequate self-driving causing too many accidents versus humans causing even more. We can clearly see the shortcomings of all self-driving technology so far, but is it ethical to block immature technology if it saves lives overall?