Comment on The hidden cost of self-hosting
MTK@lemmy.world 1 week ago
I know there is all of that AI hate, which I'm all for. But taking models and running them locally does not benefit the AI companies. If anything, this is the way to make something actually good out of that hot mess.
diegantobass@lemmy.world 1 week ago
You’re right, but I’d need a graphics card < money.tar.gzip
Jason2357@lemmy.ca 1 week ago
I used phi3:mini-4k for tagging all my bookmarks and don’t think it was any worse than a big model for that kind of job. It will run on a 10-year-old CPU and a few GB of RAM.
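For reference, a minimal sketch of that kind of tagging job, assuming the model is served locally through Ollama on its default port (the prompt wording and helper name here are illustrative, not the setup the commenter actually used):

```python
# Send each bookmark to a small local model served by Ollama and
# collect the suggested tags. Assumes Ollama is running on localhost:11434
# with the phi3:mini-4k model pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "phi3:mini-4k"

def suggest_tags(title: str, url: str) -> list[str]:
    prompt = (
        "Suggest 3 short, lowercase tags for this bookmark.\n"
        f"Title: {title}\nURL: {url}\n"
        "Reply with a comma-separated list only."
    )
    payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())["response"]
    return [t.strip() for t in reply.split(",") if t.strip()]

if __name__ == "__main__":
    print(suggest_tags("Self-hosting a photo library", "https://example.com"))
```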
MTK@lemmy.world 1 week ago
Yeah, personally I just looked for second-hand high-VRAM GPUs and waited. I got two Titan Xp cards (12 GB VRAM each) for only $180 apiece.
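A quick way to sanity-check what a second-hand setup like that actually exposes, assuming a working PyTorch + CUDA install (just one possible check, not part of the original post):

```python
# List each CUDA device and its total VRAM, e.g. two Titan Xp cards
# should each report roughly 12 GiB.
import torch

for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB VRAM")
```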