Comment on Consumer GPUs to run LLMs
AMillionMonkeys@lemmy.world 1 week ago
I would prefer to have GPUs for under $600 if possible
Unfortunately not possible for a new Nvidia card (you want CUDA) with 16GB VRAM. You can get them for ~$750 if you're patient. This deal was available for a while earlier today:
…msi.com/…/GeForce-RTX-5070-Ti-16G-SHADOW-3X-OC
Or you could try to find a 16GB 4070 Ti Super like I got. It runs DeepSeek 14B and stuff like Stable Diffusion no problem.
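For a rough sense of why a 14B model fits in 16GB, here's a back-of-the-envelope VRAM estimate. This is just a sketch assuming a 4-bit quant (the default Ollama tags are usually around Q4) and it ignores KV cache growth with long contexts:

```python
# Rough VRAM estimate for quantized model weights (illustrative only).
def weight_vram_gb(params_billion: float, bits_per_weight: float) -> float:
    bytes_per_weight = bits_per_weight / 8
    return params_billion * 1e9 * bytes_per_weight / 1024**3

# ~14B parameters at a 4-bit quant: roughly 6.5 GB of weights,
# leaving headroom on a 16 GB card for KV cache and runtime overhead.
print(f"{weight_vram_gb(14, 4):.1f} GB")   # ~6.5 GB
print(f"{weight_vram_gb(14, 16):.1f} GB")  # ~26 GB at FP16 -- would not fit
```

Real quant formats like Q4_K_M average a bit more than 4 bits per weight, so the actual file tends to land closer to 8–9 GB, still comfortable on a 16GB card.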
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
I'm OK with either Nvidia or AMD, especially if Ollama supports it. That said, I've heard that AMD takes some manual effort while Nvidia is easier. Depends on how difficult ROCm is.
swelter_spark@reddthat.com 1 week ago
With Ollama, all you have to do is copy an extra folder of ROCm files. Not hard at all.
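If you want to sanity-check that the ROCm-enabled install is actually answering, here's a minimal smoke test through the official ollama Python client. It assumes `pip install ollama`, a running Ollama server on the default port, and that you've already pulled a model; the `deepseek-r1:14b` tag is just an example, swap in whatever you have:

```python
import ollama

# Send one prompt to the local Ollama server (http://localhost:11434 by default).
# The model tag below is an example; substitute a model you have pulled.
response = ollama.chat(
    model="deepseek-r1:14b",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(response["message"]["content"])
```

If that prints a reply and `ollama ps` shows the model loaded on the GPU rather than CPU, the ROCm files are being picked up.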