Comment on Consumer GPUs to run LLMs

RagingHungryPanda@lemm.ee 4 days ago

The coder model only has that one size that fits. The next sizes up are 20GB+, and my GPU has 16GB. I’ve only tried two models, but the sizes seem to balloon after that, so that may be the biggest model I can run.
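For anyone doing the same math: a rough sketch of the fit check implied here, assuming (my numbers, not the commenter's) about 2 GB of headroom for KV cache and runtime buffers on top of the model file itself:

```python
# Back-of-envelope VRAM check: the model file has to fit in VRAM
# along with KV cache and runtime buffers, so a 16 GB card tops
# out at model files noticeably smaller than 16 GB.

def fits_in_vram(model_file_gb: float, vram_gb: float = 16.0,
                 overhead_gb: float = 2.0) -> bool:
    """True if the model file plus assumed overhead fits in VRAM."""
    return model_file_gb + overhead_gb <= vram_gb

for size_gb in (9.0, 14.0, 20.0):
    print(f"{size_gb:>4} GB model on a 16 GB card: {fits_in_vram(size_gb)}")
```

By that estimate a 20GB+ file is well past what a 16GB card can hold fully on-GPU, which matches the experience above; partial CPU offload can still run it, just slowly.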
