Comment on Consumer GPUs to run LLMs
hedgehog@ttrpg.network 2 days ago
I recommend a used 3090: it has 24 GB of VRAM and can generally be found for $800ish or less (at least when I last checked, in February). It’s much cheaper than a 4090, and while admittedly pricier than Nvidia’s inexpensive 24 GB Tesla card (the P40?), it also has much better performance and CUDA support.
I have dual 3090s so my performance won’t translate directly to what a single GPU would get, but it’s pretty easy to find stats on 3090 performance.
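Why 24 GB matters comes down to fitting the model weights on the card. As a rough back-of-envelope check (my own sketch, not from the comment above; the overhead factor is a loose guess to cover KV cache and activations):

```python
# Rough estimate of whether a quantized model's weights fit in VRAM.
# Rule of thumb: params (in billions) * bits-per-weight / 8 = GB of weights.
# The overhead multiplier is a hypothetical fudge factor for KV cache etc.

def fits_in_vram(params_b: float, bits_per_weight: float,
                 vram_gb: float, overhead: float = 1.2) -> bool:
    """Return True if the weights (plus a fudge factor) fit in VRAM."""
    weight_gb = params_b * bits_per_weight / 8  # GB for weights alone
    return weight_gb * overhead <= vram_gb

# A 13B model at 8-bit is ~13 GB of weights; with overhead it fits in 24 GB.
print(fits_in_vram(13, 8, 24))  # True
# A 70B model at 4-bit is ~35 GB of weights; too big for one 24 GB card.
print(fits_in_vram(70, 4, 24))  # False
```

That second case is roughly why people run dual 3090s: 48 GB combined gives headroom for bigger models at moderate quantization.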
marauding_gibberish142@lemmy.dbzer0.com 2 days ago
The 7900 XTX was $1000 when it launched; I wouldn’t mind a used one either.