Comment on Opera is testing letting you download LLMs for local use, a first for a major browser

T156@lemmy.world 1 month ago

Unlikely, at least on non-Nvidia hardware. Even on AMD, only the latest four chips support it; anything older isn’t going to cut it.

You also need a fairly large amount of VRAM for models like that: 4 GB is the minimum for the common kinds, which is more than typical integrated graphics has, or about 8 GB of system memory. You can get by on system RAM, but performance will be quite bad, since you’re either relying on the CPU or adding the latency of shuttling data between system RAM and the GPU.
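To see where that ~4 GB floor comes from, here’s a minimal back-of-the-envelope sketch in Python. The model size (7B parameters), quantization level (4-bit), and overhead figure are illustrative assumptions, not exact numbers for any particular runtime:

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead_gb: float = 1.0) -> float:
    """Rough VRAM estimate: weights take params * bits / 8 bytes;
    overhead_gb is an assumed allowance for KV cache and activations."""
    weight_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb + overhead_gb

# A common local model: 7B parameters quantized to 4 bits.
# Weights alone are ~3.5 GB, so ~4 GB of VRAM is the practical floor.
print(f"{estimate_vram_gb(7, 4):.1f} GB")  # ~4.5 GB with overhead
```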

source