Comment on Selfhost an LLM
ragingHungryPanda@piefed.keyboardvagabond.com 1 week ago
Not for LLMs. I have a 16GB card, and even what I can fit in there just isn’t really enough to be useful. It can still do things, and quickly enough, but I can’t fit models large enough to be useful.
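For context, here is a rough back-of-envelope sketch of why 16 GB is tight: weights-only memory is roughly parameter count times bits per weight, plus some overhead for the KV cache and runtime buffers. The overhead figure and the parameter/quantization combinations below are illustrative assumptions, not measurements.

```python
# Rough VRAM estimate for loading an LLM's weights, assuming
# quantized weights plus a fixed overhead for KV cache and buffers
# (real usage varies with context length and runtime).
def estimate_vram_gb(params_billions: float, bits_per_weight: float,
                     overhead_gb: float = 2.0) -> float:
    weights_gb = params_billions * bits_per_weight / 8  # GB for weights alone
    return weights_gb + overhead_gb

for params in (7, 13, 30, 70):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit: ~{estimate_vram_gb(params, bits):.1f} GB")
```

By this estimate, a 16 GB card handles a 7B or 13B model at 4-bit comfortably, but 30B-class models at 4-bit already overflow it, which matches the "can't fit models large enough" complaint.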
eleitl@lemmy.zip 1 week ago
The GPU used to work, but they dropped ROCm support for the Radeon V and VII some time ago. Have to look at that Strix/AI thing, I guess.