Comment on: What budget-friendly GPU for local AI workloads should I aim for?
Diplomjodler3@lemmy.world 6 days ago

Quite so. The cheapest card I'd put any kind of real AI workload on is the 16GB Radeon RX 9060 XT. That's not what I'd call budget friendly, which is why I consider the budget-friendly AI GPU to be a mythical beast.
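
To put the 16GB figure in perspective, here's a rough back-of-envelope sketch. The function name and the flat 2 GB "overhead" are my own illustrative assumptions (real overhead from the KV cache, activations, and runtime buffers depends on context length and the inference stack), but it shows why a 4-bit model in the ~20B+ range starts to need a 16GB card:

```python
# Rough VRAM estimate for running a quantized LLM locally.
# These are illustrative numbers, not benchmarks.

def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                     overhead_gb: float = 2.0) -> float:
    """Weights plus a flat guess at KV cache / runtime overhead."""
    weight_gb = params_billion * bits_per_weight / 8  # GB for the weights alone
    return weight_gb + overhead_gb

for model_b in (7, 13, 24):
    for bits in (4, 8):
        print(f"{model_b}B @ {bits}-bit: ~{estimate_vram_gb(model_b, bits):.1f} GB")
```

A 7B model at 4-bit fits comfortably on an 8GB card, but anything you'd call a "real" workload (13B at 8-bit, ~24B at 4-bit) lands right around that 14 to 16 GB mark.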
irmadlad@lemmy.world 6 days ago
It would be super cool, though, to have a server dedicated to a totally on-premises AI with no connectivity to external AI services. I'm just not sure whether I can justify several thousand dollars of equipment, because if I did, true to my nature, I'd want to go all in.
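
For a sense of what "no external connectivity" looks like in practice, here's a minimal sketch. It assumes something like Ollama already serving a model on the same box at its default port, and the model name is just a placeholder; nothing here ever leaves localhost:

```python
# Query a fully local model server; assumes an Ollama-style server is
# already running on this machine (default port 11434).
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",  # placeholder: whatever model you've pulled locally
    "prompt": "Why does 16GB of VRAM matter for local inference?",
    "stream": False,
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # local endpoint, no cloud involved
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```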