BlackSnack
@BlackSnack@lemmy.zip
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 1 week ago:
Tried it on the Pixel and no luck…
Apparently it’s not `apk install telnet`; I need to type `pkg` instead. I did that and got the response below (not exact, obviously):
- Get: 1 http.termux.net
- Get: 2 (similar to above)
- Fetched 246 kB
- 58 packages can be upgraded. Run ‘apt list --upgradable’ to see them.
- ERROR: unable to locate package telnet
So I ran the command `apt list --upgradable` (is that what it’s called, a command?) and I got a bunch of text back. Most of it is something like this: `dpkg/stable #.##.# aarch64 [upgradable from #.##.#]`
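For what it’s worth, if Termux won’t install telnet, you don’t strictly need it: all telnet was doing here is testing whether the rig’s port answers. Termux can install Python (`pkg install python`), and a few lines do the same reachability check. The 10.0.0.5 address below is a placeholder, not your real rig IP:

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, etc.
        return False

# Placeholder address -- substitute the rig's real LAN IP from ipconfig:
print(port_open("10.0.0.5", 11434))
```

If this prints True from the phone, the network path to the rig is fine and any remaining problem is in the app, not the firewall.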
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 1 week ago:
Sorry for the delay! I had a fun weekend…
The iSH app seems dope. Looks like it could be useful, but unfortunately I’m not able to get it to work the way we want either. The “install” prompt doesn’t work; it told me to use `--help` for more info. I did that, and it said that to install I should use “upgrade” instead. I did that, but got back ‘package telnet not found’. 🥲
I appreciate the help with iOS, but maybe switching to Android would be best? My long-term goal was to switch to Android/Pixel anyway, because I heard those are best for security/privacy concerns. And lucky me, I have a Pixel 3 I can switch everything to. I see you made another comment about how to try it on Android… I’m going to give that a shot rn!
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
Hell ya! I would definitely appreciate some hand-holding thru this process! This self-hosting stuff is going to take a bit longer and require more learning than I anticipated.
- The opening-a-port process makes sense. It seems like if I have a backend on my rig, I’m going to need to open a port to access that backend from a front end on my phone. Or possibly even access that same backend on the phone via a mirror?
- It seems like it would be easier if I could connect to the rig via an Android phone instead of an iPhone. My end goal is to use Linux, but I’m not ready for that step. Android seems like an adequate stepping stone, especially if we have to go through all this trouble with the iPhone. Shall we try on the Android instead? If not, I’ll follow the directions you put above and report back on Saturday.
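As a side note, the backend/front-end split above can be sketched as a toy, assuming nothing about Ollama itself: a minimal HTTP server stands in for the backend, and a plain HTTP client stands in for the front end. The only real difference in the actual setup is that the client is on another device, so the server must listen on the LAN address instead of 127.0.0.1:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Toy "backend": answers every GET with a fixed message,
# the way Ollama answers API calls on port 11434.
class Backend(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello from the backend"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), Backend)  # port 0 = pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Toy "front end": any client that can reach that host:port.
url = f"http://127.0.0.1:{server.server_port}/"
print(urlopen(url).read().decode())  # prints: hello from the backend
server.shutdown()
```

“Opening a port” in the firewall is just giving the front end permission to reach the backend’s port from another machine.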
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
Hate to say it, but it didn’t work. I listed below the things I double-checked. I really appreciate you helping me troubleshoot this, but it seems like I may have bitten off more than I can chew. I chose Ollama because it was supposed to be one of the easier local AIs to set up. Do you have any recommendations for alternatives? Or do you think I should incorporate Docker or Open WebUI, as some others have said?
- When I went to the Ollama app and entered http://10.#.#.#:11434, it didn’t work. I also tried the Enchanted app, and that didn’t work either.
- I double-checked the rule I made to make sure it was entered properly: 10.0.0.0/24 for the local and remote IP addresses.
- The sanity check went well. The IPv4 said closed; the IPv6 said failed.
- I checked the netstat -abn thing, and 0.0.0.0:11434 is still listening.
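One thing worth ruling out from the checklist above: a firewall rule scoped to 10.0.0.0/24 only matches addresses 10.0.0.1 through 10.0.0.254 (a /24 is the same thing as a 255.255.255.0 mask). If either device is on, say, 10.0.1.x, the rule silently never applies. A quick sketch with placeholder addresses (substitute the real ones from ipconfig on the rig and the phone’s Wi-Fi details):

```python
from ipaddress import ip_address, ip_network

scope = ip_network("10.0.0.0/24")  # the firewall rule's local/remote scope

# Placeholder addresses -- substitute the real ones:
rig = ip_address("10.0.0.5")
phone = ip_address("10.0.1.23")

for name, addr in [("rig", rig), ("phone", phone)]:
    status = "inside" if addr in scope else "OUTSIDE"
    print(f"{name}: {addr} is {status} {scope}")
```

If either address lands OUTSIDE the scope, widening the rule (or correcting the scope to the subnet both devices actually share) is the fix to try first.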
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
There are 3 lines with :11434 in them. No brackets or anything like that.
- The 1st has 0.0.0.0 in front.
- The 2nd has 10.#.#.# in front, and has a foreign address that is something other than 0.0.0.0.
- The 3rd is like the 2nd, but with a slightly different foreign address.
The iPhone does have a 10.#.#.# IP address that is slightly different from the PC’s.
The subnet mask is 255.255.255.0.
I have taken a pause here while we troubleshoot the subnet mask. We’re getting close!!
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
Dope! This is exactly what I needed! I would say that this is a very “hand holding” explanation which is perfect because I’m starting with 0% knowledge in this field! And I learned so much already from this post and your comment!
So here’s where I’m at:
- A backend is where all the weird C++ language stuff happens to generate a response from an AI.
- A front end is a pretty app or webpage that takes that response and makes it more digestible to the user.
- Agreed. I’ve seen in other posts that exposing a port in Windows Defender Firewall is the easiest (and safest?) way to go for specifically what I’m looking for. I don’t think I need to forward a port, as that would be for remote access.
- I went to the whatismyipaddress website. The IPv6 was identical to one of the ones I have; the IPv4 was not. (But I don’t think that matters moving forward.)
- I ran ipconfig in the Command Prompt terminal to find the info, and my IPv4 is 10.blahblahblah.
- I ran netstat -abn (this is what worked to display the necessary info). I’m able to see 0.0.0.0 before the :11434! I had to go into the settings in the Ollama backend app to enable “Expose Ollama to the network”.
I’m ready for the next steps!
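With 0.0.0.0:11434 listening, a quick next step is to confirm the API actually answers over HTTP: Ollama replies “Ollama is running” at its root URL. Any HTTP client works (curl, or a browser pointed at the rig’s address); here is a small Python sketch, where the address is a placeholder for the rig’s real LAN IP:

```python
from urllib.request import urlopen
from urllib.error import URLError

def check(base_url: str) -> str:
    """GET the server root; a reachable Ollama replies 'Ollama is running'."""
    try:
        return urlopen(base_url + "/", timeout=5).read().decode()
    except (URLError, OSError) as exc:
        return f"no answer: {exc}"

# Placeholder -- substitute the rig's LAN address from ipconfig:
# print(check("http://10.0.0.5:11434"))
```

Running it on the rig itself first, then from the phone, tells you whether a failure is in Ollama or in the network path between the two devices.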
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
Bet. Looking into that now. Thanks!
I believe I have 11 GB of VRAM, so I should be good to run decent models, from what I’ve been told by the other AIs.
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
Server is my rig, which is running Windows. The phone is an iPhone.
Exposing the port is something I’ve tried to do in the past with no success! When you say “change the bind address,” do I do that in Windows Defender Firewall, in the inbound rules section?
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
Backend/front end. I see those a lot, but I never got an explanation for them. In my case, the backend would be Ollama on my rig, and the front end would be me using it on my phone, whether that’s with an app or a web UI. Is that correct?
I will add Kobold to my list of AIs to check out in the future. Thanks!
Ollama has an app (or maybe “interface” is a better term for it) on Windows that I download models to. Then I can use said app to talk to the models. I believe Reins: Chat for Ollama is the iPhone app that allows me to use my phone to chat with my models that are on the Windows rig.
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
Yes, exactly! I would love to keep it on my network for now. I’ve read that “exposing a port” is something I may have to do in my Windows firewall options.
Yes, I have Ollama on my Windows rig, but I’m down to try out a different one if you suggest so. TBH, I’m not sure if LibreChat has a web UI. I think accessing the LLM on my phone via a web browser would be easiest, but there are apps out there like Reins and Enchanted that I could take advantage of.
For right now I just want to do whatever is easiest, so I can get a better understanding of what I’m doing wrong.
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
Oh! Also, I’m using Windows on my PC, and my phone is an iPhone.
I’m not using Linux yet, but that is on my to-do list for the future! After I get more comfortable with some more basics of self-hosting.
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
Bet, I’ll try that when I get home tonight. If I don’t have success, can I message you directly?
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
Bet. I believe what you mentioned is best for accessing my LLM no matter where I am in the world, correct? If so, I will try this one after I try what the other person suggested.
Thank you!
- Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone 2 weeks ago:
lol I have! They all say similar things, but it’s just not working for me.
- Submitted 2 weeks ago to selfhosted@lemmy.world | 43 comments
- Comment on First Time Self Hoster- Need help with Radicale 3 weeks ago:
I thought I was stuck there, but I misspoke. I made an edit to the original post. Thanks for the insight tho! I’m sure that’ll be helpful once I actually get there.
- Comment on First Time Self Hoster- Need help with Radicale 3 weeks ago:
I made an error in my original post. Please see the edit I made.
But I think I’m understanding a bit! I need to literally create a file named “/etc/radicale/config”. Then I need to copy/paste the configuration text into said file. Once I do that, I should be able to move on to authentication and then addresses.
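Roughly, yes: /etc/radicale/config is a plain INI text file, and the configuration lines go inside that file (it’s a file, not a folder). A minimal sketch of what it often looks like; the user-file path and the bcrypt choice below are assumptions, not required values:

```ini
# /etc/radicale/config  (INI format; section names matter)

[server]
# Listen on all interfaces, on Radicale's default port
hosts = 0.0.0.0:5232

[auth]
# htpasswd-style user file; this path is an assumption
type = htpasswd
htpasswd_filename = /etc/radicale/users
htpasswd_encryption = bcrypt

[storage]
filesystem_folder = /var/lib/radicale/collections
```

Once the file exists with an `[auth]` section like this, the authentication step is mostly about creating the users file it points at.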
- Comment on First Time Self Hoster- Need help with Radicale 3 weeks ago:
I misspoke earlier. I wasn’t having issues with the addresses part; I didn’t even make it that far. I’m stuck on authentication. I updated the original post and added a pic for clarity.
- Submitted 3 weeks ago to selfhosted@lemmy.world | 17 comments