Comment on Microsoft Needs So Much Power to Train AI That It's Considering Small Nuclear Reactors

j4k3@lemmy.world 1 year ago

Organic technology is hard. If you can figure out how to grow a compute system, you will take human technology hundreds of years into the future. Silicon tech is the stone age of compute.

The brain has a slow clock rate to keep within its power limitations, but it is a parallel computational beast compared to current models.
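For scale, here's a crude back-of-envelope sketch; every number in it is a rough, order-of-magnitude assumption on my part (neuron count, firing rate, synapse fan-out, GPU specs), not a measurement:

```python
# Very rough order-of-magnitude sketch; every figure below is an assumption.
brain_neurons = 86e9          # ~86 billion neurons
synapses_per_neuron = 1_000   # low-end estimate of connections per neuron
avg_firing_rate_hz = 10       # neurons spike on the order of 1-100 Hz
brain_power_w = 20            # the whole brain runs on roughly 20 W

gpu_flops = 80e12             # ~80 TFLOPS FP32, a current high-end card (assumed)
gpu_power_w = 450             # board power of that card (assumed)

brain_synaptic_ops = brain_neurons * synapses_per_neuron * avg_firing_rate_hz

print(f"brain: ~{brain_synaptic_ops:.0e} synaptic events/s "
      f"-> {brain_synaptic_ops / brain_power_w:.0e} per joule")
print(f"GPU:   ~{gpu_flops:.0e} FLOP/s "
      f"-> {gpu_flops / gpu_power_w:.0e} per joule")
```

Even with these crude numbers, the brain comes out a couple of orders of magnitude ahead per joule, despite each neuron "clocking" millions of times slower than a GPU core.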

In the current era, it takes around ten years for new hardware to really take shape, and AI hasn't established what direction it is going in yet. The open-source offline model is the likely winner, which means the hardware design and scaling factors are still unknown. We probably won't see a good solution for years; for now we are patching video hardware as a stopgap until AI-specific hardware is readily available.
