Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone

DrDystopia@lemy.lol 3 days ago

3000 is the OpenWebUI port. I never got it reachable when binding to 127.0.0.1 or localhost, only when binding to 0.0.0.0. Ollama's port 11434 on 127.x worked fine, though.
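A quick way to see which interfaces a service is actually listening on is to probe each address from the rig itself. This is just a minimal diagnostic sketch; the LAN IP 192.168.1.50 is a placeholder for whatever your machine's address is, and the ports come from the setup described above (3000 for OpenWebUI, 11434 for Ollama):

```python
import socket

# Placeholder targets -- swap 192.168.1.50 for your rig's actual LAN IP.
# 3000 is the OpenWebUI port; 11434 is Ollama's default.
TARGETS = [
    ("127.0.0.1", 3000),     # loopback; fails if OpenWebUI isn't bound here
    ("192.168.1.50", 3000),  # LAN address; works if bound to 0.0.0.0
    ("127.0.0.1", 11434),    # Ollama on loopback, reported working
]

def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for host, port in TARGETS:
        status = "open" if reachable(host, port) else "closed/unreachable"
        print(f"{host}:{port} -> {status}")
```

If loopback fails but the LAN address works, the service is bound to a specific interface (or is inside a container publishing only to that interface) rather than to all of them, which matches the 0.0.0.0-only behaviour described above.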

"You don't want to be punching a tunnel from whatever can talk to your portable device to the LLM machine."

Fair point.
