Comment on What's the bang-for-the-buck go-to for AI image generation and LLM models?

hendrik@palaver.p3x.de 2 months ago

Buy the cheapest graphics card with 16 or 24 GB of VRAM. In the past people bought used Nvidia RTX 3090 cards. You can also buy a GPU from AMD; they're cheaper, but ROCm is a bit more difficult to work with. Or if you own a MacBook or any Apple device with an M2 or M3, use that. And hopefully you paid for enough RAM, since on Apple Silicon the models run out of the unified memory.
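For what it's worth, here's a minimal sketch (assuming you're using PyTorch) of how you'd check which of these backends your machine actually exposes. ROCm builds of PyTorch reuse the CUDA API, and Apple Silicon shows up as the MPS backend:

```python
import torch

# ROCm builds of PyTorch expose AMD GPUs through the same CUDA API,
# so torch.cuda.is_available() covers both Nvidia and AMD cards.
if torch.cuda.is_available():
    device = torch.device("cuda")
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, VRAM: {props.total_memory / 1024**3:.0f} GB")
elif torch.backends.mps.is_available():
    # Apple Silicon (M-series) runs via the Metal (MPS) backend;
    # models live in unified RAM, so system memory is the limit.
    device = torch.device("mps")
    print("Using Apple Silicon (MPS) backend")
else:
    device = torch.device("cpu")
    print("No GPU backend found, falling back to CPU")
```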
