Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone

tal@lemmy.today 17 hours ago

> Yes, I have Ollama on my Windows rig.
>
> TBH, I’m not sure if LibreChat has a web UI.

Okay, gotcha. I don’t know whether Ollama ships a native Web UI of its own; if it does, I haven’t used it. I do know that it can act as a backend for various front-end chat applications. I also know that kobold.cpp can both operate as an LLM backend and serve a limited built-in Web UI, so at least some backends do bundle one. You said that you’ve already used Ollama successfully. Was that via some Web-based UI that you’d like to use on your phone, or via some other program (LibreChat?) running natively on the Windows machine?
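For what it’s worth, when Ollama acts as a backend it exposes an HTTP API (on port 11434 by default), and front ends just POST JSON to it. A minimal sketch of what such a request body looks like, assuming Ollama’s documented `/api/generate` endpoint and a hypothetical model name `llama3`:

```python
import json

# Ollama listens on localhost:11434 by default. To reach it from a phone
# on the same LAN, you'd point at the PC's LAN IP instead of localhost
# (and set OLLAMA_HOST=0.0.0.0 so Ollama binds to all interfaces).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> str:
    """Build the JSON body a front end would POST to /api/generate."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

body = build_request("Why is the sky blue?")
print(body)
```

This is just a sketch of the request shape, not a full client; any real front end (LibreChat included) handles streaming, chat history, and so on for you.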
