Comment on Opera is testing letting you download LLMs for local use, a first for a major browser

Bandicoot_Academic@lemmy.one 2 months ago

Most people probably don’t have a dedicated GPU, and an iGPU is probably not powerful enough to run an LLM at a decent speed. A decent model also needs something like 20 GB of RAM, which most people don’t have.
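For a rough sense of where a figure like 20 GB comes from, here is a minimal back-of-the-envelope sketch: weight memory is roughly parameter count times bytes per parameter, plus some overhead for activations and the KV cache. The parameter counts, precisions, and 20% overhead factor below are illustrative assumptions, not something from the comment itself.

```python
def estimate_ram_gb(params_billions: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Approximate RAM/VRAM to hold a model's weights, with ~20% overhead (assumed)."""
    return params_billions * bytes_per_param * overhead

for name, params in [("7B", 7), ("13B", 13)]:
    fp16 = estimate_ram_gb(params, 2.0)   # 16-bit weights
    q4 = estimate_ram_gb(params, 0.5)     # 4-bit quantized weights
    print(f"{name}: ~{fp16:.1f} GB at fp16, ~{q4:.1f} GB at 4-bit")
```

Under these assumptions, a 13B model at fp16 lands around 30 GB, which is roughly in line with the "like 20 GB of RAM" figure for a decent model, while 4-bit quantization brings the same model down to well under 10 GB.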
