Comment on: In a 38-page essay, Dario Amodei warns of civilization-level damage from superintelligent AI, questioning whether humanity has the maturity to handle such power

Perspectivist@feddit.uk 5 days ago

It’s perfectly valid to discuss the dangers of AGI whether LLMs are the path there or not. I’ve been concerned about AGI and ASI for far longer than I’ve even known about LLMs, and people were worried about exactly the same stuff back then as they are now. This is precisely the kind of threat you should try to find a solution for before we actually reach AGI - because once we do, it’s way, way too late.

Also:

“There is factually 0 chance we’ll reach AGI with the current brand of technology.”

You couldn’t possibly know that with absolute certainty.
