Like, make a query and then go make yourself a sandwich while it spits out a word every other second. That slow.
There are very small models that can run on mid-range graphics cards and all, but it's not something you'd look at and say "Yeah, this does most of what ChatGPT does."
Evono@lemmy.dbzer0.com 1 week ago
Basically I can run 9B models on my 16GB GPU mostly fine, getting responses of, let's say, 10 lines in a few seconds.
Bigger models, if they don't outright crash, take like 5x or 10x longer for the same task. So long it isn't even useful anymore.
So, much worse.
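The rough math behind why a 9B model fits on a 16GB card while bigger ones choke can be sketched like this (a back-of-envelope estimate, not from the thread — it assumes the weights dominate VRAM use and tacks on a made-up ~20% overhead for the KV cache and runtime):

```python
# Back-of-envelope VRAM estimate for local LLM inference.
# Assumption: weights dominate memory, so
#   VRAM needed ~= parameter count * bytes per parameter,
# plus a flat ~20% overhead (hypothetical) for KV cache and runtime.

def vram_needed_gb(params_billions: float, bits_per_param: int, overhead: float = 0.2) -> float:
    """Approximate GB of VRAM to hold the model weights plus overhead."""
    weight_gb = params_billions * bits_per_param / 8  # 1B params at 8-bit ~= 1 GB
    return weight_gb * (1 + overhead)

def fits(params_billions: float, bits_per_param: int, vram_gb: float = 16) -> bool:
    """Does a model of this size/quantization fit in the given VRAM?"""
    return vram_needed_gb(params_billions, bits_per_param) <= vram_gb

print(fits(9, 8))    # 9B model at 8-bit quant on a 16 GB card -> True (~10.8 GB)
print(fits(70, 8))   # 70B model at 8-bit -> False (~84 GB)
print(fits(70, 4))   # even at 4-bit, 70B needs ~42 GB -> False
```

Once the model no longer fits, layers get offloaded to system RAM and every token has to cross the PCIe bus, which is where the 5x–10x slowdown (or the crash) comes from.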