To a billion-parameter matrix inverter? Probably not too hard, though maybe not at those speeds.
To a GPU, or even just the functions used in GenAI? We don’t even know if those are possible with analog computers to begin with.
fcalva@cyberplace.social 1 day ago
@TheBlackLounge @kalkulat LLM inference is definitely theoretically possible on analog chips. They just may not scale :v
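For a rough sense of why "theoretically possible but may not scale": transformer inference is dominated by matrix-vector products, which an analog crossbar can compute in one shot, but every analog read adds noise that compounds across layers. The Python sketch below is a toy illustration only; the noise model, layer sizes, and the `analog_matvec` helper are made-up assumptions, not any real device or chip spec.

```python
import numpy as np

# Toy model: an "analog" matrix-vector multiply where each crossbar read
# adds Gaussian noise scaled to the output range. Purely illustrative.

def analog_matvec(W, x, noise_std=0.01, rng=None):
    """Ideal matmul plus additive read noise, a stand-in for a crossbar array."""
    rng = np.random.default_rng() if rng is None else rng
    y = W @ x
    return y + rng.normal(0.0, noise_std * np.abs(y).max(), size=y.shape)

rng = np.random.default_rng(0)
d = 512                        # hidden size of a toy layer (assumed)
x0 = rng.standard_normal(d)

# Chain several "layers": the analog error tends to grow with depth,
# which is one reason scaling to deep LLM-sized stacks is the hard part.
for depth in (1, 8, 64):
    x_exact, x_analog = x0.copy(), x0.copy()
    for _ in range(depth):
        W = rng.standard_normal((d, d)) / np.sqrt(d)
        x_exact = np.tanh(W @ x_exact)
        x_analog = np.tanh(analog_matvec(W, x_analog, rng=rng))
    rel_err = np.linalg.norm(x_analog - x_exact) / np.linalg.norm(x_exact)
    print(f"depth={depth:3d}  relative error={rel_err:.3f}")
```

Nothing here rules analog inference out; it just shows, under these assumed numbers, where the precision budget goes as depth and width grow.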