Comment on New to self-hosting
dan@upvote.au 2 days ago
I haven’t looked into paperless-ai yet, but I hope my machine would be beefy enough for this task
You need a GPU with a decent amount of VRAM to get LLMs working well locally. I don’t have a new enough GPU to be useful: my server just has the Intel iGPU, and my desktop PC only has a GTX 1080, which predates the Tensor cores Nvidia added for AI workloads.
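As a rough back-of-envelope check of what "a decent amount of VRAM" means, the weights alone take (parameter count × bits per parameter ÷ 8) bytes; real usage adds the KV cache and runtime overhead on top. A minimal sketch (the helper name and the example model sizes are illustrative, not from any specific tool):

```python
# Back-of-envelope VRAM estimate for just the weights of an LLM.
# Hypothetical helper; actual memory use is higher (KV cache, activations, overhead).
def estimate_vram_gb(params_billions: float, bits_per_param: int) -> float:
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1024**3

# A 7B-parameter model in fp16 needs roughly 13 GB for weights alone,
# which is why it won't fit comfortably on a GTX 1080's 8 GB.
print(round(estimate_vram_gb(7, 16), 1))  # ~13.0 GB
# The same model quantized to 4 bits shrinks to roughly 3.3 GB.
print(round(estimate_vram_gb(7, 4), 1))   # ~3.3 GB
```

This is why quantized models are the usual route on older or smaller GPUs.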
BennyInc@feddit.org 2 days ago
Thanks, I’ll look into it. For completionists: This is the article about how to properly archive paper: peelarchivesblog.com/…/how-do-archivists-package-…