Comment on (A)lbert e(I)nstein
GeneralDingus@lemmy.cafe 5 days ago
I’m not sure what you mean by ideal. Like, run any model you ever wanted? Probably the latest Nvidia AI chips.
But you can get away with a lot less for smaller models. I have a mid-range AMD card from 4 years ago (I forget the model off the top of my head) and can run 8B-sized text models without issue.
ptu@sopuli.xyz 5 days ago
I’m sorry, I use ChatGPT for writing MySQL queries and DAX formulas, so that would be the use case.
GeneralDingus@lemmy.cafe 5 days ago
You’d have to go looking for a specific model, but huggingface.co has a model for nearly anything you’d want. You just have to set up your local machine to run it.
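For a concrete starting point: a common way to run an 8B text model locally is Ollama, which handles the download and serving for you. A minimal sketch (the model tag here is just an example; browse Ollama's library or huggingface.co for others):

```shell
# Install Ollama (Linux/macOS), then pull and run an 8B model.
# Model name is an example, not a recommendation.
ollama pull llama3.1:8b

# One-shot prompt from the command line:
ollama run llama3.1:8b "Write a MySQL query that returns the top 10 customers by total order value."
```

On a mid-range GPU (or even CPU-only, slowly), 8B models in quantized form fit comfortably; for SQL/DAX-style generation that size is usually enough to be useful.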