Comment on The hidden cost of self-hosting
MTK@lemmy.world 1 month ago
You could use an LLM with an MCP server for the local filesystem and hope it can do it for you
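For anyone curious, a minimal sketch of what that setup might look like: the official `@modelcontextprotocol/server-filesystem` MCP server, wired into an MCP-capable client config. The path is a placeholder you'd swap for your own directory.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/dir"
      ]
    }
  }
}
```

The server only exposes the directories you list, so the "hope it can do it for you" part is at least sandboxed.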
diegantobass@lemmy.world 1 month ago
Or I could not. Ever.
MTK@lemmy.world 1 month ago
I know there is a lot of AI hate, which I'm all for. But taking models and running them locally does not benefit the AI companies. If anything, this is the way to make something actually good out of that hot mess.
diegantobass@lemmy.world 1 month ago
You're right, but I'd need a graphics card < money.tar.gzip
Jason2357@lemmy.ca 1 month ago
I used phi3:mini-4k for tagging all my bookmarks and don't think it was any worse than a big model for that kind of job. It will run on a 10-year-old CPU and a few GB of RAM.
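A rough sketch of that kind of tagging job, assuming a local Ollama server on the default port with the model pulled under the name `phi3:mini-4k` (the exact tag may differ on your install). The prompt wording and the 5-tag limit are my own choices, not anything phi3-specific.

```python
# Tag bookmarks with a small local model via Ollama's /api/generate endpoint.
# Assumes `ollama serve` is running on localhost:11434 and the model is pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default address
MODEL = "phi3:mini-4k"  # assumed model tag; adjust to what `ollama list` shows


def build_prompt(title: str, url: str) -> str:
    """Ask the model for a bare comma-separated tag list, nothing else."""
    return (
        "Suggest up to 5 short tags for this bookmark, "
        "as a comma-separated list and nothing else.\n"
        f"Title: {title}\nURL: {url}"
    )


def parse_tags(text: str) -> list[str]:
    """Normalize the model's comma-separated reply into clean lowercase tags."""
    return [t.strip().lower() for t in text.split(",") if t.strip()]


def tag_bookmark(title: str, url: str) -> list[str]:
    """One round-trip to the local model; returns a list of tags."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(
            {"model": MODEL, "prompt": build_prompt(title, url), "stream": False}
        ).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_tags(json.load(resp)["response"])
```

Small models are fine here because the output is tiny and easy to sanity-check; if the reply is junk, `parse_tags` just gives you junk tags to fix by hand.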
MTK@lemmy.world 1 month ago
Yeah, personally I just looked for second-hand high-VRAM GPUs and waited. I got two Titan Xp cards (12 GB VRAM) for only $180 each.