Multiplexer@discuss.tchncs.de ⁨18⁩ ⁨hours⁩ ago

I think the point is not that it is really going to happen at that pace, but to show that it very well might happen within our lifetime. Also, the authors have since adjusted the earliest possible onset of a hard-to-stop runaway scenario to 2028, afaik.

Kind of like the atomic doomsday clock, which has oscillated between a quarter to twelve and a minute to twelve over the last decades, depending on active nukes and current politics. It helps to illustrate an abstract but nonetheless real risk with the maximum possible impact (annihilation of mankind - not fond of the idea…).

Even if it looks like AI has been hitting some walls for now (which I am glad about) and is overhyped, it might not stay that way. So although AGI seems unlikely at the moment, taking the possibility into account, and perhaps slowing down to make sure we are not recklessly risking triggering our own destruction, is still a good idea, which is exactly the authors' point.

Kind of like how scanning the sky with telescopes and running DART-style asteroid deflection missions is still a good idea, even though the probability of an extinction-level impact event is low.
