Comment on "What budget-friendly GPU for local AI workloads should I aim for?"

irmadlad@lemmy.world 6 days ago

<giggle> I’ve self-hosted a few of the bite-sized LLMs. The thing that’s keeping me from having a full-blown, self-hosted AI platform is that my little GeForce 1650 just doesn’t have the ass to really do it up right. If I’m going to consult with AI, I want the answers within 3 or 4 minutes, not hours. LOL
