Comment on What type of computer setup would one need to run ai locally?

panda_abyss@lemmy.ca 1 day ago

High RAM for MoE models, high VRAM for dense models, and the highest GPU memory bandwidth you can get.
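A back-of-envelope way to see why bandwidth matters so much: token generation is roughly memory-bound, so each token has to stream all active weights from memory once. A minimal sketch of that ceiling (the parameter counts and bandwidth figures below are illustrative assumptions, not benchmarks):

```python
def max_tokens_per_sec(active_params_b: float, bytes_per_param: float,
                       bandwidth_gb_s: float) -> float:
    """Rough upper bound for memory-bandwidth-bound generation:
    every token streams all active weights from memory once."""
    active_gb = active_params_b * bytes_per_param
    return bandwidth_gb_s / active_gb

# Dense 70B model at 4-bit (0.5 bytes/param) on a ~1000 GB/s GPU:
print(max_tokens_per_sec(70, 0.5, 1000))   # ~28 tok/s ceiling

# MoE with ~3B *active* params at 4-bit from ~250 GB/s system RAM:
print(max_tokens_per_sec(3, 0.5, 250))     # ~166 tok/s ceiling
```

This is why an MoE model with few active parameters stays usable out of slower system RAM, while a dense model really wants VRAM-class bandwidth.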

For stable diffusion models (ComfyUI), you want high VRAM and bandwidth. Diffusion is a GPU-heavy, memory-intensive operation.

Software/driver support is very important for diffusion models and ComfyUI, so you'll have the best experience with Nvidia cards.

I think realistically you need 80 GB+ of RAM for things like Qwen-Image quants (40 GB for the model, 20-40 GB for LoRA adapters in ComfyUI to get output).
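The 40 GB model figure is consistent with a roughly 20B-parameter model at 16-bit weights (the parameter count here is my assumption for illustration). A quick weight-only sizing sketch, which ignores activations, the text encoder, and VAE overhead:

```python
def model_size_gb(params_billion: float, bits_per_param: int) -> float:
    """Weight-only memory footprint: params * bits / 8 bits-per-byte.
    Ignores activations, text encoder, VAE, and runtime overhead."""
    return params_billion * bits_per_param / 8

# A hypothetical ~20B-parameter image model:
print(model_size_gb(20, 16))  # 40.0 GB at BF16
print(model_size_gb(20, 8))   # 20.0 GB at 8-bit quant
print(model_size_gb(20, 4))   # 10.0 GB at 4-bit quant
```

Quantizing buys back a lot of headroom, which is why quants plus LoRA adapters still fit in the 80 GB+ budget above.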

I run a 128 GB AMD Ryzen AI Max+ 395 rig; Qwen-Image takes 5-20 minutes per 720p result in ComfyUI. Batching offers an improvement, and reducing the number of iterations during prototyping makes a huge difference. I haven't tested since the fall, though, and the newer models are more efficient.
