Comment on (A)lbert e(I)nstein

GeneralDingus@lemmy.cafe 5 days ago

I’m not sure what you mean by "ideal." If you mean being able to run any model you’d ever want, then probably the latest AI-focused NVIDIA chips.

But you can get away with a lot less for smaller models. I have a mid-range AMD card from about 4 years ago (I forget the exact model off the top of my head) and can run 8B-parameter text models without issue.
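
If it helps, here’s roughly what "run an 8B model" looks like in practice, as a minimal sketch using llama-cpp-python. The model filename and settings are placeholders, not the exact setup I use:

```python
# Minimal sketch: load a quantized 8B text model with llama-cpp-python
# (pip install llama-cpp-python). The GGUF filename below is a placeholder;
# point it at whatever quantized model file you actually downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical local GGUF file
    n_gpu_layers=-1,  # offload all layers to the GPU (AMD cards need a ROCm build)
    n_ctx=4096,       # context window; smaller values use less VRAM
)

out = llm("Explain special relativity in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```

A 4-bit quant of an 8B model is only a few GB, which is why a mid-range card from a few years back handles it fine.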
