Comment on Consumer GPUs to run LLMs

RagingHungryPanda@lemm.ee 1 month ago

The coder model only comes in that one size. The ones bigger than that are 20GB+, and my GPU has 16GB. I've only tried two models, but the size seems to balloon past that point, so that may be the biggest model I can run.
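For anyone wondering how the sizes shake out: here's a rough back-of-the-envelope sketch (my own rule of thumb, not from any specific tool; the bytes-per-weight figures are typical approximations for common quantization levels) of whether a model's weights fit in VRAM:

```python
# Rough VRAM estimate: parameter count x bytes per weight, plus
# ~20% overhead for KV cache and runtime buffers (an assumption).
BYTES_PER_WEIGHT = {
    "fp16": 2.0,    # unquantized half precision
    "q8": 1.0,      # ~8-bit quantization
    "q4": 0.6,      # ~4.5-bit quantization, approximate
}

def estimate_vram_gb(params_billions: float, quant: str, overhead: float = 1.2) -> float:
    """Approximate GB needed to load the model fully on the GPU."""
    return params_billions * BYTES_PER_WEIGHT[quant] * overhead

for size in (7, 14, 32):
    print(f"{size}B @ q4 ~ {estimate_vram_gb(size, 'q4'):.1f} GB")
```

By this math a ~14B model at 4-bit lands around 10 GB (fits in 16GB with room for context), while a ~32B model lands over 20 GB, which matches why the bigger ones won't fit.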
