Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone

DrDystopia@lemy.lol 22 hours ago

Just do like me - Install Ollama and OpenWebUI, install Termux on Android, connect through Termux with port forwarding.

ssh -L 3000:127.0.0.1:3000 user@YOURSERVERIP

Then access OpenWebUI at 127.0.0.1:3000 in your phone's browser. Or SSH forward the Ollama port (11434 by default) to use the Ollama Android app. This requires you to be on the same LAN as the server. If you port forward SSH through your router, you can also access it remotely via your public IP (if so, I'd recommend allowing login only with keys/certs, or rate-limiting SSH login attempts).
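Putting the above together, a sketch of the Termux-side commands — the username, `YOURSERVERIP`, and ports are placeholders to swap for your own setup (3000 and 11434 are the usual OpenWebUI and Ollama defaults):

```shell
# Run on the phone, inside Termux. -N: forward only, no remote shell.

# Tunnel OpenWebUI, then browse to http://127.0.0.1:3000 on the phone:
ssh -N -L 3000:127.0.0.1:3000 user@YOURSERVERIP

# Or tunnel the Ollama API for the Ollama Android app
# (point the app at http://127.0.0.1:11434):
ssh -N -L 11434:127.0.0.1:11434 user@YOURSERVERIP
```

If you do expose SSH to the internet, setting `PasswordAuthentication no` in `/etc/ssh/sshd_config` and running something like fail2ban covers the key-only login and rate-limiting suggestions above.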

But what are the chances that you run the LLM on a Linux machine and connect from an Android, like me, rather than a Windows machine and an iPhone? You tell me. No specs posted…
