No. If you look at Waymo as an example, their cars are actually autonomous, and they ask for assistance in situations they are “unsure” how to handle.
But even if your claim were true, in what way was this a situation where the driver could deem it necessary to take over? The road ahead was clear, with nothing in view to indicate any kind of problem, when the car made a sudden, abrupt left almost into a tree.
FreedomAdvocate@lemmy.net.au 1 week ago
They can’t stop and ask for assistance at 100km/h on a highway.
Buffalox@lemmy.world 1 week ago
According to the driver it was on FSD, and it was using the latest software update available.
www.reddit.com/user/SynNightmare/
Maybe the point is then, that Tesla FSD shouldn’t be legally used on a highway.
But it probably shouldn’t be used anywhere, because it’s faulty as shit.
And why can’t it slow down to let the driver take over in a timely manner, when it can brake for no reason?
It was tested in Germany on the Autobahn, where it did that 8 times within 6 hours!!!
FreedomAdvocate@lemmy.net.au 1 week ago
According to the driver, with zero evidence backing up the claim.
Buffalox@lemmy.world 1 week ago
There have been other similar cases lately, which clearly indicate problems with the car.
The driver has put up the footage from all the cameras of the car, so he has done what he can to provide evidence.
www.reddit.com/r/TeslaFSD/…/1328_fsd_accident/
It’s very clear from the comments that some have personally experienced similar things, and others have seen reporting of it.
This is not an isolated incident. It just has better footage than most.