Comment on What budget friendly GPU for local AI workloads should I aim for?

snekerpimp@lemmy.world 1 week ago

Everyone is mentioning Nvidia, but AMD's ROCm has improved tremendously in the last few years, making a 6900 XT 16 GB an attractive option for me. I currently have a 6700 XT 12 GB that works no problem with Ollama and ComfyUI, and an Instinct MI25 16 GB that works with some fiddling as well. From what I understand, an MI50 32 GB requires less fiddling. However, the Instinct line is passively cooled, so having to rig up a fan for them might be a reason to stay away.
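For anyone wondering what "works no problem" versus "some fiddling" looks like in practice, here is a minimal sketch of checking that a ROCm build of PyTorch actually sees the card. It assumes a ROCm PyTorch install; the `HSA_OVERRIDE_GFX_VERSION=10.3.0` value is the workaround commonly cited for the RX 6700 XT (not officially listed by ROCm), and the right value, or whether you need one at all, depends on your specific card.

```python
# Sketch: confirm a ROCm-enabled PyTorch build can see the AMD GPU.
# HSA_OVERRIDE_GFX_VERSION=10.3.0 is the commonly used workaround for
# the RX 6700 XT; adjust or drop it for other cards.
import os

os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

import torch  # ROCm builds of PyTorch expose the same CUDA-style API

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print("GPU:", torch.cuda.get_device_name(0))
    print("VRAM (GiB):", round(props.total_memory / 2**30, 1))
else:
    print("No ROCm device visible; check the driver install and override value.")
```

If this prints your card and its VRAM, tools built on PyTorch (ComfyUI, for example) should generally pick it up the same way.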
