Comment on What type of computer setup would one need to run ai locally?

KairuByte@lemmy.dbzer0.com 3 days ago

It really comes down to what kind of speed you want. You can run some LLMs on older hardware “just fine,” and many models will run without a dedicated GPU. The problem is that the time taken to generate a response gets to be crazy.

I ran DeepSeek on an old R410 for shits and giggles a while back, and it worked. It just took multiple minutes to actually give me a complete response.
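As a rough back-of-the-envelope sketch of why that happens: generation speed is measured in tokens per second, and total latency is just response length divided by decode speed. The throughput numbers below are illustrative assumptions, not benchmarks of any particular machine:

```python
# Rough latency estimate for local LLM inference.
# The tokens/sec figures are assumptions for illustration only.

def response_time_seconds(response_tokens: int, tokens_per_second: float) -> float:
    """Time to generate a response at a given decode speed."""
    return response_tokens / tokens_per_second

# A GPU setup might decode on the order of 50 tokens/s;
# an old CPU-only server might manage ~1 token/s.
gpu_time = response_time_seconds(500, 50.0)  # 10 seconds
cpu_time = response_time_seconds(500, 1.0)   # 500 seconds, over 8 minutes

print(f"GPU-ish: {gpu_time:.0f}s, old CPU-only: {cpu_time:.0f}s")
```

That ~500x-versus-10 gap is why the same model can feel instant on one box and take multiple minutes on another.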
