Comment on: Would you use a self-hosted, AI-powered search engine for your favorite sites?
T156@lemmy.world 1 week ago
I can't imagine self-hosting an LLM-based search engine would be very viable. The hardware demands, even for a relatively small quantised model, are considerable. Doubly so if you don't have a GPU to accelerate with.
JeremyHuntQW12@lemmy.world 6 days ago
You can run Deepseek on a Raspberry Pi.
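For concreteness, a minimal sketch of what "running it" can look like with llama-cpp-python and a small quantised GGUF build; the model file name here is a placeholder, not a specific recommendation:

```python
# Minimal sketch (not from the thread): loading a small quantised model
# with llama-cpp-python on CPU only. The model path is hypothetical; any
# small GGUF quantisation that fits in RAM works the same way.
from llama_cpp import Llama

llm = Llama(
    model_path="deepseek-r1-distill-qwen-1.5b-q4_k_m.gguf",  # hypothetical file
    n_ctx=2048,    # modest context window to keep memory use low
    n_threads=4,   # Raspberry Pi class CPU: a few cores, no GPU offload
)

out = llm("Summarize: search engines index pages so queries can be answered.",
          max_tokens=64)
print(out["choices"][0]["text"])
```

It runs, but throughput on a Pi-class CPU is a few tokens per second at best, which is the crux of the disagreement below.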
Senal@programming.dev 5 days ago
At a level you'd need for a search engine?
CameronDev@programming.dev 1 week ago
Yeah, absolutely. And running a GPU 24/7 to occasionally search is just a waste of power. I'm not convinced that Google's and Bing's AI search makes financial sense either. Google dropped live search (where the results updated in real time as you typed) because it was too expensive, so how does LLM search end up cheaper than live search?!
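For a rough sense of the power argument, a back-of-envelope sketch; every number below is an assumption, not a measurement:

```python
# Back-of-envelope sketch (all figures are assumptions, not from the thread):
# cost of keeping a discrete GPU powered 24/7 for occasional searches.
IDLE_W = 30          # assumed idle draw, watts
LOAD_W = 250         # assumed draw while generating, watts
PRICE_KWH = 0.15     # assumed electricity price, $/kWh

hours_loaded = 0.25                 # assume ~15 minutes of inference per day
hours_idle = 24 - hours_loaded

kwh_per_day = (IDLE_W * hours_idle + LOAD_W * hours_loaded) / 1000
print(f"~{kwh_per_day:.2f} kWh/day, ~${kwh_per_day * PRICE_KWH * 30:.2f}/month")
# prints roughly 0.78 kWh/day and ~$3.49/month; under these assumptions,
# idle draw dominates when usage is only occasional
```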
Jakeroxs@sh.itjust.works 6 days ago
You realize the GPU sits idle when not actively being used, right?
It'd be cheaper if you host it locally: essentially just your normal electricity bill, which is the entire point of what OP is saying lol.
CameronDev@programming.dev 6 days ago
Idle is low power, not zero power. And it won't be idle when it's scraping and parsing the sites, so depending on how much scraping it's doing, that could be significant non-idle energy usage.
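If you'd rather measure than guess, a small sketch that samples the card's actual draw (assumes an NVIDIA GPU with nvidia-smi on the PATH):

```python
# Sketch: sample real GPU power draw to compare idle vs. scraping/inference.
# Assumes an NVIDIA GPU; the nvidia-smi query flags below are standard.
import subprocess
import time

def gpu_watts() -> float:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])  # first GPU only

samples = []
for _ in range(5):
    samples.append(gpu_watts())  # take a reading
    time.sleep(1)                # one sample per second
print(f"average draw over 5 s: {sum(samples) / len(samples):.1f} W")
```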
Jakeroxs@sh.itjust.works 6 days ago
The GPU is already running because it's in the device. By this logic I shouldn't have a GPU in my homelab until I want to use it for something; rip Jellyfin and Immich I guess.
I get the impression you don't really understand how local LLMs work. You likely wouldn't need a very large model to run basic scraping; it would just depend on what OP has in mind, or what kind of schedule it runs on. You should consider the difference between a megacorp's server farm and some rando using this locally on consumer hardware (which seems to be OP's intent).
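A minimal sketch of that occasional, scheduled pattern, assuming Ollama is serving a small model on its default port; the site list, model name, and prompt are placeholders:

```python
# Sketch of the "occasional, scheduled" pattern (assumptions: Ollama running
# locally on its default port; SITES and the model name are hypothetical).
# Run this from cron or a systemd timer, e.g. hourly, so the GPU only
# spins up briefly and idles the rest of the time.
import requests

SITES = ["https://example.com/feed"]            # hypothetical site list
OLLAMA = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def extract_keywords(text: str) -> str:
    r = requests.post(OLLAMA, json={
        "model": "llama3.2:3b",  # any small local model would do
        "prompt": f"Extract searchable keywords from:\n{text[:4000]}",
        "stream": False,
    }, timeout=300)
    r.raise_for_status()
    return r.json()["response"]

for url in SITES:
    page = requests.get(url, timeout=30).text   # scrape one page
    print(url, "->", extract_keywords(page))    # index/store these in practice
```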