Comment on (A)lbert e(I)nstein

ArsonButCute@lemmy.dbzer0.com 1 week ago

A Relatively recent gaming-type setup running LocalAI or llama.cpp is what I’d recommend.

I do most of my AI stuff with an RTX 3070, but I also have a Ryzen 7 3800X with 64 GB of RAM for heavy models, where I don’t care so much how long inference takes but need the high parameter count for whatever reason, for example MoE models or agentic behavior.
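
For reference, here’s a rough sketch of how that GPU/CPU split might look with the llama-cpp-python bindings. The model paths, layer counts, and thread counts are just placeholders, not anything from my actual setup, so tune them to whatever you’re running:

```python
# Rough sketch with llama-cpp-python; paths and numbers are illustrative only.
from llama_cpp import Llama

# Mid-size model: offload as many layers as fit in ~8 GB of VRAM (e.g. an RTX 3070).
gpu_llm = Llama(
    model_path="models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical file
    n_gpu_layers=32,   # tune to whatever your VRAM allows
    n_ctx=4096,
)

# Heavy / MoE model: keep it entirely in system RAM and accept slower tokens per second.
cpu_llm = Llama(
    model_path="models/mixtral-8x7b-instruct.Q4_K_M.gguf",  # hypothetical file
    n_gpu_layers=0,    # CPU only, lives in the 64 GB of system RAM
    n_ctx=4096,
    n_threads=8,       # e.g. one per physical core on a Ryzen 7 3800X
)

out = gpu_llm("Q: What is general relativity? A:", max_tokens=64)
print(out["choices"][0]["text"])
```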
