Comment on Selfhost an LLM
Is a Radeon V with 8 GB HBM worth using today?
ragingHungryPanda@piefed.keyboardvagabond.com 1 week ago
Not for LLMs. I have a 16 GB card, and even what I can fit in there just isn't really enough to be useful. It can still run things, and quickly enough, but I can't fit models large enough to be useful.
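For rough context on why 16 GB is limiting: a model's weight footprint is approximately parameter count times bytes per parameter, plus some overhead for the KV cache and activations. Here's a back-of-the-envelope sketch (the 20% overhead factor and model sizes are illustrative assumptions, not measurements):

```python
# Rough VRAM estimate: weights = params * bytes_per_param,
# plus ~20% overhead for KV cache/activations (illustrative assumption).

def vram_gb(params_billions: float, bits_per_param: float, overhead: float = 1.2) -> float:
    """Approximate VRAM needed to load a model, in GB."""
    weight_bytes = params_billions * 1e9 * (bits_per_param / 8)
    return weight_bytes * overhead / 1e9

for name, params in [("7B", 7.0), ("13B", 13.0), ("70B", 70.0)]:
    for bits in (16, 8, 4):
        print(f"{name} @ {bits}-bit: ~{vram_gb(params, bits):.1f} GB")
```

By this estimate a 7B model at FP16 already overflows 16 GB (~16.8 GB), while 4-bit quantization brings 13B-class models down to roughly 8 GB, which matches the "runs things, but nothing large" experience.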
eleitl@lemmy.zip 1 week ago
The GPU used to be supported, but they dropped ROCm support for the Radeon V and VII some time ago. I'll have to look at that Strix/AI thing, I guess.
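If you want to check whether a ROCm build still sees the card, a minimal sketch with PyTorch (assuming a ROCm build of PyTorch is installed; on those builds the torch.cuda API is backed by HIP):

```python
import torch

# On ROCm builds of PyTorch, the CUDA API is routed through HIP,
# so torch.cuda.* reports the AMD GPU if ROCm still supports it.
if torch.cuda.is_available():
    print("GPU visible:", torch.cuda.get_device_name(0))
    print("HIP version:", torch.version.hip)  # None on CUDA builds
else:
    print("No ROCm-supported GPU detected.")
```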