Comment on China scientists develop flash memory 10,000× faster than current tech

schema@lemmy.world 5 days ago

I don’t think it would make much difference if it lasted longer. I could be wrong, but afaik the actual transformer inference for AI runs in VRAM, while staging and preprocessing happen in regular RAM. Putting anything else in between wouldn’t really make sense speed- and bandwidth-wise.
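
To illustrate the split I mean, here’s a minimal PyTorch sketch (toy model and shapes are hypothetical, just to show where the tensors live): the weights and the forward pass sit in VRAM, the batch is staged and preprocessed in host RAM, and only the prepared data gets copied across.

```python
import torch
import torch.nn as nn

# Hypothetical toy transformer layer, just to show where tensors live.
model = nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True)

if torch.cuda.is_available():
    model = model.to("cuda")  # weights now reside in VRAM

# Staging and preprocessing happen in ordinary host RAM on the CPU...
batch = torch.randn(16, 32, 512)          # (batch, seq, d_model), lives in RAM
batch = (batch - batch.mean()) / batch.std()  # toy preprocessing step

# ...and only the prepared batch is copied into VRAM for the forward pass.
if torch.cuda.is_available():
    batch = batch.to("cuda", non_blocking=True)

with torch.no_grad():
    out = model(batch)  # compute runs on the GPU, reading weights from VRAM
```

Flash, however fast, would only ever feed that RAM staging step, which is why longer endurance alone wouldn’t change the picture much.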
