Comment on Might not be efficient, but at least it... Uhhh, wait, what good does it provide again?

fonix232@fedia.io ⁨20⁩ ⁨hours⁩ ago

Even that won't be anywhere close to the efficiency of neurons.

And actual neurons are not comparable to transistors at all. For starters, their behaviour is completely different: they're closer to complex logic gates built from many transistors, they're multi-pathway, AND they don't behave anywhere near as binary as transistors do.
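To make the "not binary" point concrete, here's a toy Python sketch (all names and numbers made up for illustration): a gate built from transistors snaps to 0 or 1, while even the crude artificial neuron used in neural networks produces a graded output from many weighted inputs, and real biological neurons are messier still.

```python
import math

# Transistor-style behaviour: a NAND gate maps binary inputs to a binary output.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

# Artificial-neuron-style behaviour (a rough sketch): many weighted inputs
# are summed and squashed through a smooth activation, so the output is
# graded (anywhere between 0 and 1) rather than binary.
def artificial_neuron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid activation

print(nand(1, 1))  # binary output: 0
print(artificial_neuron([0.2, 0.9, 0.4], [0.5, -1.2, 0.8], 0.1))  # graded output
```

And this artificial neuron is itself only a "close enough" caricature of a biological one, which is part of why simulating lots of them is so expensive.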

Which is why AI technology needs so much power. We're basically virtualising a badly understood version of our own brains. Think of it like, say, PlayStation 4 emulation - it kinda works, but most details are unknown and therefore don't work well, or at best get a "close enough" approximation of the behaviour, at the cost of more resource usage. And virtualisation will always be costly.

Or, I guess, a better example would be one of the many currently trending translation layers (e.g. SteamOS's Proton, macOS' Rosetta, whatever Microsoft was cooking up for Windows for the same purpose, and also kinda FEX and Box86/Box64) versus full virtual machines. The latter is the better approximation of how AI relates to our brains (and by AI here I mean neural-network-based AI applications, not just LLMs).
