He’s referring to the fact that the Effective Altruism / Less Wrong crowd seems focused almost entirely on preventing a hypothetical future AI apocalypse, and they use a lot of obscure math and logic to argue that this matters far more than dealing with war, homelessness, climate change, or any of the other issues that are causing actual humans to suffer today, or that are certain to cause suffering in the near future.
pglpm@lemmy.ca 11 months ago
Thank you for the explanation! – it puts that sentence into perspective. I think he phrased it in an unfortunate and easily misunderstood way.