pglpm@lemmy.ca 1 year ago
“Bayesian analysis”? What the heck has this got to do with Bayesian analysis? Does this guy have an intelligence, artificial or otherwise?
cygnosis@lemmy.world 1 year ago
He’s referring to the fact that the Effective Altruism / LessWrong crowd seems focused almost entirely on preventing an AI apocalypse at some point in the future. They use a lot of obscure math and logic to argue that this is far more important than dealing with war, homelessness, climate change, or any of the other issues that are causing actual humans to suffer today, or that are certain to cause suffering in the near future.
pglpm@lemmy.ca 1 year ago
Thank you for the explanation! – it puts that sentence into perspective. I think he phrased it in an unfortunate, easily misunderstood way.
Mahlzeit@feddit.de 1 year ago
It’s likely a reference to Yudkowsky or someone along those lines. I don’t follow that crowd.
intensely_human@lemm.ee 1 year ago
It’s hard to say for sure. He might.
Transporter_Room_3@startrek.website 1 year ago
Big word make sound smart