The thing is, humans are terrible drivers, exacting a huge toll in lives and property every year.
We may already be at the point where we need to weigh the ethics of imperfect self-driving systems causing some accidents against human drivers causing more. We can clearly see the shortcomings of all self-driving technology so far, but is it ethical to block immature technology if it saves lives overall?
AA5B@lemmy.world 3 days ago
Yes, we have the definitions, but I haven’t read about whether they’re effectively required. Is there a test, a certification authority, or rules for liability and revocation? Have we established a way to actually enforce it?
I hope we wouldn’t let manufacturers self-certify, although historical data is important evidence. I also hope we don’t prop up manufacturers’ profitability by limiting their liability or by creating a path to justice that’s doomed to fail.
tfm@europe.pub 2 days ago
This stuff is highly regulated …wikipedia.org/…/Regulation_of_self-driving_cars
Mercedes has the first autonomous car (L3) you can buy, which you can only activate at low speeds on certain roads in Germany. It’s only possible because of lidar sensors, and when it’s activated you are legally allowed to look at your phone, as long as you can take over within 10 seconds or so.
You aren’t allowed to do this in a Tesla, since the government doesn’t categorize Teslas as autonomous vehicles, which requires L3 or above.
No car manufacturer can sell truly autonomous vehicles without government approval. Tesla FSD is just marketing BS; others are far ahead in autonomous driving tech.