You can self-host that too ;)
OpenWebUI + Ollama + SearxNG. OpenWebUI can do LLM web search using the engine of your choice (even a self-hosted SearxNG!). From there it’s easy to set the default prompt to always give you the top (10, 20, whatever) raw results so you’re not confined to the AI’s answer. It’s not quite as slick as duck.ai, but I think I can get there with some more tinkering.
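Roughly how the pieces talk to each other, if you wanted to skip OpenWebUI and wire it up by hand: a minimal Python sketch, assuming SearxNG is on localhost:8080 with its JSON output format enabled and Ollama is on the default localhost:11434 with a model already pulled (the ports, model name, and prompt wording are just placeholders, not the exact OpenWebUI behavior):

```python
# Minimal sketch: query a local SearxNG instance, then hand the top raw
# results to a local Ollama model, asking it to keep the raw list visible.
# Assumes SearxNG on :8080 with "json" in its enabled formats and Ollama
# on :11434 with a model already pulled.
import requests

SEARXNG_URL = "http://localhost:8080/search"       # adjust to your instance
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3"    # whatever you've pulled with `ollama pull`
TOP_N = 10          # how many raw results to keep

def web_search(query: str) -> list[dict]:
    """Ask SearxNG for results and return the top N."""
    resp = requests.get(SEARXNG_URL, params={"q": query, "format": "json"})
    resp.raise_for_status()
    return resp.json().get("results", [])[:TOP_N]

def ask(query: str) -> str:
    """Build a prompt that always includes the raw results, then ask Ollama."""
    results = web_search(query)
    listing = "\n".join(
        f"{i + 1}. {r.get('title', '')} - {r.get('url', '')}"
        for i, r in enumerate(results)
    )
    prompt = (
        f"Web results for '{query}':\n{listing}\n\n"
        "Answer using the results above, and repeat the result list "
        "verbatim at the end so I can browse it myself.\n\n"
        f"Question: {query}"
    )
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask("best self hosted search engines"))
```

OpenWebUI does the equivalent of this for you behind its web search toggle; the sketch just shows why tweaking the default prompt is enough to keep the raw results in every answer.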
Ohoho? That’s interesting. I don’t have the horsepower to self-host an AI, but that’s good to know!
Thanks for the pointer!!!
dubyakay@lemmy.ca 14 hours ago
Is there a guide on how to do this on Linux + 16GB Radeon?
sunstoned@lemmus.org 13 hours ago
I mean, I could write one! I kind of just pieced it together from guides for the three individual projects