Comment on It Turns Out That When Waymos Are Stumped, They Get Intervention From Workers in the Philippines
Zwuzelmaus@feddit.org 2 weeks ago
And these foreign crowd workers know the local traffic rules? Maybe they even have regular driver's licenses?
Chozo@fedia.io 2 weeks ago
This used to be my job. They're not controlling the cars. They're basically completing real-time CAPTCHAs, telling the car whether the cameras see a stop sign, a bicycle, temporary barriers, etc. If the car can't identify an object that could possibly cross its path, it pulls over and stops until an operator can do a sanity check on whatever the car is confused by. They only need to be able to identify objects on the road, not know the rules of the road.
whereIsTamara@lemmy.org 2 weeks ago
Can you imagine the lawsuits?
Zwuzelmaus@feddit.org 2 weeks ago
No. I am not from there.
whereIsTamara@lemmy.org 2 weeks ago
So if a business has an AI drive a car, but then the AI hands it over to a human who has no driver's license in that jurisdiction, they are essentially allowing someone to operate a vehicle without a license, someone who isn't even inside the country. If that car crashes into someone, Waymo has to explain why they let someone wildly unqualified and unlicensed operate for them. That's millions in damages for gross negligence.
criticon@lemmy.ca 2 weeks ago
Here's a short video of someone receiving help. They explain briefly that they provide instructions to the vehicle; they don't do the actual driving.
NotMyOldRedditName@lemmy.world 2 weeks ago
This is how it generally behaves, but they are capable of taking direct control in more difficult situations. It's only very slow maneuvers though; it's not like they would be driving it down the street.
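As a rough illustration of that kind of constraint (the numbers and function below are purely hypothetical; nothing here reflects Waymo's actual limits), a remote-maneuver gate might cap both speed and distance:

```python
# Hypothetical limits for a remotely commanded maneuver; real values unknown.
MAX_REMOTE_SPEED_MPS = 2.0      # roughly walking pace, not street driving
MAX_REMOTE_DISTANCE_M = 30.0    # enough to pull out of a lane, nothing more

def remote_maneuver_allowed(speed_mps: float, distance_m: float) -> bool:
    """A remote operator may only command very slow, very short moves."""
    return speed_mps <= MAX_REMOTE_SPEED_MPS and distance_m <= MAX_REMOTE_DISTANCE_M

print(remote_maneuver_allowed(1.5, 10.0))    # slow nudge off to the side: allowed
print(remote_maneuver_allowed(13.0, 500.0))  # street-speed driving: refused
```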
Zwuzelmaus@feddit.org 2 weeks ago
at a very low speed over a very short distance.
LOL, so when they get into a situation in a tunnel that is 10 or 20 km long (ok, you don't have those in the USA, but we have them here), they first drive it at 10 km/h and then give up after 300 m? Because the rules are the rules?
NotMyOldRedditName@lemmy.world 2 weeks ago
From the description, it's really not meant to solve that. In a situation like that they'd have to send someone, but they would be able to get the car out of the middle of the lane and off to the side, even if that only gives an extra foot or two of space.
snooggums@piefed.world 2 weeks ago
That is like the person steering to avoid a collision while cruise control and lane assist are on, it isn’t actually fully autonomous.
Perspectivist@feddit.uk 2 weeks ago
I think the interventions here are more like: “that’s a trash can someone pushed onto the road - let me help you around it” rather than: “let me drive you all the way to your destination.”
It’s usually not the genuinely hard stuff that stumps AI drivers - it’s the really stupid, obvious things it simply never encountered in its training data before.
Cherry@piefed.social 2 weeks ago
Feels like the robot hoovers when they encounter an unexpected poo.
fartographer@lemmy.world 2 weeks ago
Ancient texts show that robot hoovers did not have a means of intervention
MoffKalast@lemmy.world 2 weeks ago
Saw this blog post recently about Waymo's sim setup, and they really do seem to be generating pretty much everything in existence. The level of generalization in the model they seem to be using is either extremely low, or they abort immediately at the earliest sign of high perplexity.
Kushan@lemmy.world 2 weeks ago
I’m guessing it’s the latter, they need to keep accidents to a minimum if they’re ever going to get broad legislation to legalise them.
Every single accident is analysed to death by the media and onlookers alike, with a large group of people wanting it to fail.
This is a prime example: we've known about the human intervention for a while now, but people still seem surprised that those workers are in another country.
Zwuzelmaus@feddit.org 2 weeks ago
Hm. Interesting. But that makes them look even more incapable than I feared.
Perspectivist@feddit.uk 2 weeks ago
Broadly speaking, an AI driver getting stumped means it’s stuck in the middle of the road - while a human driver getting stumped means plowing into a semi truck.
I’d rather be inconvenienced than killed. And from what I’ve seen, even our current AI drivers are already statistically safer than the average human driver - and they’re only going to keep getting better.
They’ll never be flawless though. Nothing is.
MrScottyTay@sh.itjust.works 2 weeks ago
AI drivers have run over and crushed people slowly before too, though, because they didn't see the person as an "obstacle" to be avoided, or because the person was on the ground and the car didn't see them.
Zwuzelmaus@feddit.org 2 weeks ago
As long as they use Level 3 autonomous cars and then cheat with remote operators instead of using real Level 5 cars, such statistics remain quite meaningless. They do, however, tell you something about the people who use them as arguments.