Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone
BlackSnack@lemmy.zip 19 hours ago
Oh! Also, I’m using Windows on my PC, and my phone is an iPhone.
I’m not using Linux yet, but that’s on my to-do list for the future, once I get more comfortable with the basics of self-hosting.
tal@lemmy.today 18 hours ago
Okay, that’s a starting place. So if this is Windows, and you only care about access on your own wireless network, it’s probably easiest to expose the service directly to other machines on that network, rather than tunneling through SSH.
You said that you have Ollama running on the Windows PC. I’m not familiar with LibreChat, but it has a web-based interface? Do you want to access that from a web browser on the phone?
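In case it helps with the “exposing” part: by default, Ollama only listens on 127.0.0.1 (port 11434), so nothing else on the network can reach it. The usual recipe, as I understand it, is to set the `OLLAMA_HOST` environment variable to `0.0.0.0` on the Windows box, restart Ollama, and allow inbound TCP 11434 through Windows Defender Firewall (an inbound rule in the firewall GUI, or `netsh advfirewall firewall add rule name="Ollama" dir=in action=allow protocol=TCP localport=11434` from an admin prompt). Then you can sanity-check it from any other machine on the network; a minimal sketch, assuming your PC’s LAN address is 192.168.1.50:

```python
import urllib.request

# 192.168.1.50 is an assumed address; substitute your PC's actual LAN IP.
# A running Ollama instance answers its root URL with "Ollama is running".
url = "http://192.168.1.50:11434/"
print(urllib.request.urlopen(url, timeout=5).read().decode())
```

If that prints “Ollama is running”, the phone’s browser and apps should be able to reach the same address.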
BlackSnack@lemmy.zip 18 hours ago
Yes exactly! I would love to keep it on my network for now. I’ve read that “exposing a port” is something I may have to do in my Windows firewall options.
Yes, I have Ollama on my Windows rig, but I’m down to try a different one if you suggest it. TBH, I’m not sure if LibreChat has a web UI. I think accessing the LLM on my phone via web browser would be easiest, but there are apps out there like Reins and Enchanted that I could take advantage of.
For right now I just want to do whatever is easiest so I can get a better understanding of what I’m doing wrong.
tal@lemmy.today 17 hours ago
Okay, gotcha. I don’t know whether Ollama has a native web UI; if it does, I haven’t used it. I do know that it can act as a backend for various chat front ends, and that kobold.cpp can operate both as an LLM backend and run a limited web UI, so at least some backends have one built in. You said that you’ve already used Ollama successfully. Was that via some web-based UI that you’d like to use on your phone, or just some other program (LibreChat?) running natively on the Windows machine?
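To make “backend” concrete: any front end, whether that’s Reins, Enchanted, or a web UI, is ultimately just making HTTP calls to Ollama’s API on port 11434. A minimal sketch of one round trip, assuming your rig is at 192.168.1.50 and you’ve pulled a model named llama3.2 (both are placeholders):

```python
import json
import urllib.request

# Assumed values: swap in your PC's LAN IP and a model you've actually pulled.
OLLAMA = "http://192.168.1.50:11434"
payload = {"model": "llama3.2", "prompt": "Say hello in five words.", "stream": False}

req = urllib.request.Request(
    OLLAMA + "/api/generate",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=120) as resp:
    print(json.loads(resp.read())["response"])
```

A chat front end is essentially doing this in a loop, sending the conversation history along with each request.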
BlackSnack@lemmy.zip 17 hours ago
Backend/front end. I see those terms a lot, but I never got an explanation of them. In my case, the backend would be Ollama on my rig, and the front end would be me using it on my phone, whether that’s with an app or a web UI. Is that correct?
I will add kobold.cpp to my list of AIs to check out in the future. Thanks!
Ollama has an app (or maybe interface is a better term for it) on Windows, right? That’s what I download models to, and then I can use that app to talk to the models. I believe Reins: Chat for Ollama is the iPhone app that would let me use my phone to chat with the models on the Windows rig.
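If Reins works like most Ollama clients, it just needs the server address, which with the setup above would be `http://<the rig’s LAN IP>:11434`. `ipconfig` on the Windows machine shows that IP, or here’s a quick Python sketch (8.8.8.8 is only used to pick the outbound interface; nothing is actually sent):

```python
import socket

# Run on the Windows rig. Connecting a UDP socket sends no packets;
# it just selects the local interface/IP used to reach the internet.
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(("8.8.8.8", 80))
print(s.getsockname()[0])  # e.g. 192.168.1.50 -> enter http://192.168.1.50:11434 in the app
s.close()
```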