Comment on What budget friendly GPU for local AI workloads should I aim for?
irmadlad@lemmy.world 6 days ago
<giggle> I've self-hosted a few of the bite-sized LLMs. The thing that's keeping me from having a full-blown, self-hosted AI platform is that my little GeForce 1650 just doesn't have the ass to really do it up right. If I'm going to consult with AI, I want the answers within 3 or 4 minutes, not hours. LOL
Diplomjodler3@lemmy.world 6 days ago
Quite so. The cheapest card that I’d put any kind of real AI workload on is the 16GB Radeon 9060XT. That’s not what I would call budget friendly, which is why I consider a budget friendly AI GPU to be a mythical beast.
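For anyone wondering why 16GB is roughly the floor, here's a back-of-envelope sketch of model VRAM needs. The numbers are assumptions, not benchmarks: roughly 2 bytes per weight at FP16, roughly 0.5 bytes per weight at 4-bit quantization, plus a hypothetical ~20% overhead for KV cache and activations (real overhead varies with context length and runtime):

```python
def vram_gb(params_billion, bytes_per_weight, overhead=1.2):
    """Rough VRAM estimate in GB: weight storage plus ~20% assumed
    overhead for KV cache and activations. A sketch, not a benchmark."""
    return params_billion * 1e9 * bytes_per_weight * overhead / 1e9

# 7B model, 4-bit quantized (~0.5 bytes/weight): about 4.2 GB
print(round(vram_gb(7, 0.5), 1))
# 13B model at FP16 (2 bytes/weight): about 31.2 GB, beyond any 16 GB card
print(round(vram_gb(13, 2.0), 1))
```

By this estimate a 4 GB card like the GTX 1650 only squeezes in small quantized models, while 16 GB comfortably holds a 4-bit 13B-class model, which lines up with the 16GB-minimum sentiment above.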
irmadlad@lemmy.world 6 days ago
It would be super cool tho, to have a server dedicated to a totally on-premises AI with no connectivity to external AI. I'm just not sure whether I can justify several thousand dollars of equipment. Because if I did, true to my nature, I'd want to go all in.