Comment on AI shouldn’t make ‘life-or-death’ decisions, says OpenAI’s Sam Altman
captainastronaut@seattlelunarsociety.org 1 year ago
But it should drive cars? Operate strike drones? Manage infrastructure like power grids and the water supply? Forecast tsunamis?
Too little too late, Sam. 
But it should drive cars?
Oh, definitely. Humans are shit at that. Get bored when we have to concentrate for 10 minutes.
As advanced cruise control, yes. Strike drones, no, though in practice it doesn't change a thing, as humans can bomb civilians just fine themselves. Yes and yes.
If we're not talking about LLMs, which are basically computer slop made up of books and sites pretending to be a brain, then using a tool for statistical analysis on a shitload of data (optical, acoustic, and mechanical data to assist driving, or seismic data to forecast tsunamis) is a bit of a no-brainer.
pearsaltchocolatebar@discuss.online 1 year ago
Yes on everything but drone strikes.
A computer would be better than humans in those scenarios. Especially driving cars, which humans are absolutely awful at.
Deceptichum@kbin.social 1 year ago
So if it looks like it’s going to crash, should it automatically turn off and go “Lol good luck” to the driver now suddenly in charge of the life-and-death situation?
pearsaltchocolatebar@discuss.online 1 year ago
I’m not sure why you think that’s how they would work.
Deceptichum@kbin.social 1 year ago
Well, it's simple: who do you think should make the life-or-death decision?
LWD@lemm.ee 1 year ago
Have you seen a Tesla drive itself? Never mind ethical dilemmas, they can barely navigate downtown without hitting pedestrians.
pearsaltchocolatebar@discuss.online 1 year ago
Teslas aren’t self driving cars.
LWD@lemm.ee 1 year ago
According to their own website, they are:
www.tesla.com/autopilot