Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone
DrDystopia@lemy.lol 22 hours ago
Just do like me - Install Ollama and OpenWebUI, install Termux on Android, connect through Termux with port forwarding.
ssh -L 0.0.0.0:3000:YOURPUBLICIP:3000 USER@YOURSERVER
And access OpenWebUI at 127.0.0.1:3000 in your phone browser. Or SSH-forward the Ollama port to use the Ollama Android app. This requires you to be on the same LAN as the server. If you port-forward SSH through your router, you can access it remotely through your public IP. (If so, I’d recommend only allowing login through certs, or adding a rate limiter for SSH login attempts.)
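The cert-only part roughly comes down to two lines in /etc/ssh/sshd_config on the server (restart sshd after editing; fail2ban is one common way to rate-limit failed logins):

# /etc/ssh/sshd_config: allow key/cert login only, no passwords
PasswordAuthentication no
PubkeyAuthentication yes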
But what are the chances that you run the LLM on a Linux machine and use an Android phone to connect, like me, rather than a Windows machine and an iPhone? You tell me. No specs posted…
BlackSnack@lemmy.zip 21 hours ago
Bet, I’ll try that when I get home tonight. If I don’t have success, can I message you directly?
tal@lemmy.today 22 hours ago
ssh -L 0.0.0.0:3000:YOURPUBLICIP:3000 USER@YOURSERVER
If you can SSH to the LLM machine, I’d probably recommend
ssh -L 127.0.0.1:11434:127.0.0.1:11434 <remote hostname>
instead. If for some reason you don’t have, or inadvertently bring down, a firewall on your portable device, you don’t want to be punching a tunnel from whatever can talk to your portable device through to the LLM machine. (Using 11434 instead of 3000, as it looks like that’s Ollama’s port.)
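Once a tunnel like that is up, the phone can talk to Ollama as if it were local. A quick sanity check from Termux (the /api/tags endpoint just lists the models installed on the server):

# Should return JSON describing your local models if the tunnel is working
curl http://127.0.0.1:11434/api/tags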
DrDystopia@lemy.lol 22 hours ago
3000 is the OpenWebUI port; I never got it to work using either 127.0.0.1 or localhost, only 0.0.0.0. Ollama’s port 11434 on 127.x worked fine, though.
you don’t want to be punching a tunnel from whatever can talk to your portable device to the LLM machine.
Fair point.
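(For anyone following along: the difference is the bind address in the first field of -L. A minimal comparison, assuming OpenWebUI listens on the server’s loopback, with user@server as a placeholder:)

# Binds the phone’s loopback only; just the phone’s own browser can reach it
ssh -L 127.0.0.1:3000:127.0.0.1:3000 user@server
# Binds all of the phone’s interfaces; anything that can reach the phone can use the tunnel
ssh -L 0.0.0.0:3000:127.0.0.1:3000 user@server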
BlackSnack@lemmy.zip 21 hours ago
Oh! Also, I’m using Windows on my PC, and my phone is an iPhone.
I’m not using Linux yet, but that’s on my to-do list for the future, once I get more comfortable with the basics of self-hosting!
tal@lemmy.today 21 hours ago
Okay, that’s a starting place. So if this is Windows, and if you only care about access on the wireless network, then I suppose that it’s probably easiest to just expose the stuff directly to other machines on the wireless network, rather than tunneling through SSH.
You said that you have Ollama running on the Windows PC. I’m not familiar with LibreChat, but it has a Web-based interface? Are you wanting to access that from a web browser on the phone?
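One wrinkle with the direct-exposure route: Ollama only listens on 127.0.0.1 by default, so other machines can’t reach it even with the firewall open. A rough sketch of widening that on Windows (OLLAMA_HOST is Ollama’s bind-address setting; restart Ollama afterwards):

# Make Ollama listen on all interfaces instead of just loopback
setx OLLAMA_HOST 0.0.0.0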
BlackSnack@lemmy.zip 20 hours ago
Yes, exactly! I would love to keep it on my network for now. I’ve read that “exposing a port” is something I may have to do in my Windows firewall options.
Yes, I have Ollama on my Windows rig, but I’m down to try out a different one if you suggest so. TBH, I’m not sure if LibreChat has a web UI. I think accessing the LLM on my phone via web browser would be easiest, but there are apps out there like Reins and Enchanted that I could take advantage of.
For right now I just want to do whatever is easiest so I can get a better understanding of what I’m doing wrong.
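(For reference, the firewall part usually comes down to a single inbound rule; a sketch assuming Ollama’s default port 11434, run from an elevated prompt:)

netsh advfirewall firewall add rule name="Ollama" dir=in action=allow protocol=TCP localport=11434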
tal@lemmy.today 20 hours ago
Okay, gotcha. I don’t know whether Ollama has a native web UI; if it does, I haven’t used it. I know that it can act as a backend for various front-end chat applications. I do know that kobold.cpp can operate both as an LLM backend and serve a limited web UI, so at least some backends do have web UIs built in. You said that you’ve already used Ollama successfully. Was this via some web-based UI that you would like to use on your phone, or just some other program (LibreChat?) running natively on the Windows machine?
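(As far as I know, stock Ollama just serves an HTTP API rather than a chat page; a raw generation request against its default port looks roughly like this, from a Unix-like shell, with the model name as a placeholder:)

curl http://127.0.0.1:11434/api/generate -d '{"model": "llama3", "prompt": "Hello"}'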