Comment on AI shouldn’t make ‘life-or-death’ decisions, says OpenAI’s Sam Altman
pearsaltchocolatebar@discuss.online 1 year ago
Teslas aren’t self-driving cars.
LWD@lemm.ee 1 year ago
According to their own website, they are
www.tesla.com/autopilot
pearsaltchocolatebar@discuss.online 1 year ago
Well, yes. Elon Musk is a liar. Teslas are by no means fully autonomous vehicles.
LWD@lemm.ee 1 year ago
Ah, the trusty “no true autopilot” defense.
wikibot@lemmy.world [bot] 1 year ago
Here’s the summary for the Wikipedia article you mentioned in your comment:
No true Scotsman, or appeal to purity, is an informal fallacy in which one attempts to protect their generalized statement from a falsifying counterexample by excluding the counterexample improperly. Rather than abandoning the falsified universal generalization or providing evidence that would disqualify the falsifying counterexample, a slightly modified generalization is constructed ad-hoc to definitionally exclude the undesirable specific case and similar counterexamples by appeal to rhetoric. This rhetoric takes the form of emotionally charged but nonsubstantive purity platitudes such as “true”, “pure”, “genuine”, “authentic”, “real”, etc. Philosophy professor Bradley Dowden explains the fallacy as an “ad hoc rescue” of a refuted generalization attempt.