nagaram
@nagaram@startrek.website
- Comment on See no evil 2 days ago:
Blood libel
Hey wait a minute…
- Comment on Dinner is ready! 2 days ago:
G easily.
Southern comfort food, Mexican, Caribbean, and sushi rolls. RIP pizza, but I didn’t need it
- Comment on Wobble wobble 4 days ago:
Folks reading way too much into this lol.
reads too much into it
The joke is they died!
- Comment on When your brain is below your waist the ONE Pro is good enough 6 days ago:
Move “hits me” over and that’s my ex
- Comment on Do you think conservative feel the same need to burn it all down as everyone else felt when trump won again? 1 week ago:
I like your vague language. It really conveys the sense of mystery and intrigue you’re going for.
So Khazars are a real, active modern faction? Is it like Kabbalah-practicing Gentile converts?
- Comment on Do you think conservative feel the same need to burn it all down as everyone else felt when trump won again? 1 week ago:
Khazar is an interesting new identity to me.
Cursory google search says it was a trade empire that lasted 200 years and converted to Judaism.
Off topic to the rest of the post, but I’m now deeply curious what a modern day Khazar is and what it means to you. Please enlighten me!
- Comment on 'Borderlands 4 is a premium game made for premium gamers' is Randy Pitchford's tone deaf retort to the performance backlash: 'If you're trying to drive a monster truck with a leaf blower's motor, you're going to be disappointed' 1 week ago:
Considering Randy REALLY wants you to pay $130 USD for this game, I’m not shocked his performance advice was “be less poor”
- Comment on The Left is Infighting? At least We Don't Shoot Each Other! 1 week ago:
You’re right. I should get in their DMs. See what’s going on. And if their behavior doesn’t change in a month, we’ll put their ban from the discord to a vote as any good anarchist would.
- Comment on The Left is Infighting? At least We Don't Shoot Each Other! 1 week ago:
I’m so glad we just ban each other from book clubs and defederate.
- Comment on Nightmare blunt rotation... or killer rotation? 1 week ago:
Cop vibes
- Comment on When real life generates the shitpost 1 week ago:
Miss conception. It was actually a synth
- Comment on Blursed hygiene 2 weeks ago:
Mmmmmm… Foot yeast makes the best spicy grape juice!
- Comment on AI in Education: Doomed? 2 weeks ago:
A few months ago now, Arizona? Arkansas maybe? Some state legalized “AI powered” home schooling systems. But it was mostly clickbait, and the system is less like ChatGPT and more like YouTube’s recommendation algorithm: machine learning that takes into account the stuff students do well at and lets them advance beyond “grade level” limitations, while also learning how to present problem areas in ways the student responds to.
I had asked my home schooled AI researcher buddy his thoughts and he obviously liked it. I like the idea too, but my hang up was on socializing kids. That to me is the more important role of schools.
I wouldn’t trust an LLM in this setup though. A human tutor would still need to step in for questions outside of a FAQ, IMO. I love working with an LLM by giving it all the manuals, guides, and config files I used and then asking where I went wrong, because it can usually give me a good enough interpretation to see where to go next. But that’s just a rubber duck. My mind and skills are developed. A kid learning math for the first time can’t do that.
- Comment on "Very dramatic shift" - Linus Tech Tips opens up about the channel's declining viewership 2 weeks ago:
Oh the rossman video.
I hate how obsessed with dumb shit he gets. The man is legitimately doing great work usually, and then he takes something minor that an otherwise ally says or does and blows it out of proportion.
This man would have made a great tankie. Unfortunately, he made a whole 20 minute video on why AOC is stupid for saying unskilled labor doesn’t exist, and then explained exactly the points she was making.
I legitimately love this man’s work and I wanna support him, but man is he petty.
- Comment on That one Pokémon 3 weeks ago:
Softshell turtle
- Comment on 1U mini PC for AI? 3 weeks ago:
Honestly if you’re not gaming or playing with new hardware, there is absolutely no point.
I’ve considered swapping this computer over to Fedora for a hot minute, but it really is a gaming PC and I should stop trying to break it.
- Comment on 1U mini PC for AI? 3 weeks ago:
True, but I have an addiction and that’s buying stuff to cope with all the drawbacks of late stage capitalism.
I am but a consumer who must be given reasons to consume.
- Comment on 1U mini PC for AI? 3 weeks ago:
The Lenovo ThinkCentre M715qs were $400 total after upgrades. I fortunately had 3 32 GB kits of RAM from my work’s e-waste bin, but if I had to add those it would probably be $550 ish. The rack was $120 from 52Pi. I bought 2 extra 10 inch shelves for $25 each. The Pi cluster rack was also $50 (shit, I thought it was $20. Not worth). Patch panel was $20. There’s a UPS that was $80. And the switch was $80.
So in total I spent $800 on this set up
To fully replicate from scratch you would need to spend $160 on raspberry pis and probably $20 on cables
So $1000 theoretically
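If you want to sanity-check the math (prices as I listed them above; the Pi and cable figures are the from-scratch estimate):

```python
# Rough tally of the rack build, using the prices listed above.
costs = {
    "M715q tinies (after upgrades)": 400,
    "52Pi 10in rack": 120,
    "2 extra shelves": 2 * 25,
    "Pi cluster rack": 50,
    "patch panel": 20,
    "UPS": 80,
    "switch": 80,
}
spent = sum(costs.values())
print(spent)  # 800

# From-scratch extras: Raspberry Pis ($160) and cables ($20).
from_scratch = spent + 160 + 20
print(from_scratch)  # 980, so ~$1000
```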
- Comment on 1U mini PC for AI? 3 weeks ago:
Ollama and all that runs on it; it’s just the firewall rules and opening it up to my network that are the issue.
I cannot get ufw, iptables, or anything like that running on it. So I usually just ssh into the PC and do a CLI only interaction. Which is mostly fine.
I want to use OpenWebUI so I can feed it notes and books as context, but I need the API which isn’t open on my network.
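For what it’s worth, a sketch of what exposing that API usually takes, assuming a systemd install of Ollama and ufw as the firewall (11434 is Ollama’s default port; the subnet is an example, adjust to your LAN):

```shell
# Make Ollama listen on all interfaces instead of just 127.0.0.1.
# OLLAMA_HOST is Ollama's standard env var for this.
sudo systemctl edit ollama.service
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Open the port to the local subnet only, not the whole world.
sudo ufw allow from 192.168.1.0/24 to any port 11434 proto tcp

# Check from another machine on the network (lists installed models).
curl http://<server-ip>:11434/api/tags
```

Then OpenWebUI can point at `http://<server-ip>:11434` as its Ollama backend.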
- Comment on 1U mini PC for AI? 3 weeks ago:
I was thinking about that now that I have Mac Minis on the mind. I might even just set a mac mini on top next to the modem.
- Comment on 1U mini PC for AI? 3 weeks ago:
Ollama + Gemma/Deepseek is a great start. I have only run AI on my AMD 6600 XT and that wasn’t great; everything I know says AMD is fine for gaming these days but not really for LLM or gen AI tasks.
An RTX 3060 12 GB is the easiest and best self-hosted option in my opinion. New for under $300, and used for even less. However, I was running with a GeForce 1660 Ti for a while and that’s under $100.
- Comment on 1U mini PC for AI? 3 weeks ago:
A mac is a very funny and objectively correct option
- Comment on 1U mini PC for AI? 3 weeks ago:
I think I’m going to have a harder time fitting a threadripper in my 10 inch rack than I am getting any GPU in there.
- Comment on 1U mini PC for AI? 3 weeks ago:
I do already have a NAS. It’s in another box in my office.
I was considering replacing the Pis with a JBOD and passing that through to one of my boxes via USB and virtualizing something. I compromised by putting 2 TB SATA SSDs in each box to use for database stuff and then backing that up to the spinning rust in the other room.
How do I do that? Good question. I take suggestions.
- Comment on 1U mini PC for AI? 3 weeks ago:
With an RTX 3060 12 GB, I have been perfectly happy with the quality and speed of the responses. It’s much slower than my 5060 Ti, which I think is the sweet spot for text based LLM tasks. A larger context window provided by more VRAM or a web based AI is cool and useful, but I haven’t found the need for that yet in my use case.
As you may have guessed, I can’t fit a 3060 in this rack. That’s in a different server that houses my NAS. I have done AI on my 2018 Epyc server CPU and it’s just not usable. Even with 109 GB of RAM, not usable. Even clustered, I wouldn’t try running anything on these machines. They are for docker containers and Minecraft servers. Jeff Geerling probably has a video on trying to run an AI on a bunch of Raspberry Pis. I just saw his video using Ryzen AI Strix boards and that was ass compared to my 3060.
But for my use case, I am just asking AI to generate simple scripts based on manuals I feed it, or some sort of writing task. I either get it to take my notes on a topic and make an outline that makes sense and I fill it in, or I feed it finished writings and ask for grammatical or tone fixes. That’s fucking it, and it boggles my mind that anyone is doing anything more intensive than that. I am not training anything, and 12 GB of VRAM is plenty if I wanna feed it like 10-100 pages of context. Would it be better with a 4090? Probably, but for my uses I haven’t noticed a difference in quality between my local LLM and the web based stuff.
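That “feed it a manual, then ask” workflow is really just one prompt with the document pasted in as context. Against Ollama’s HTTP API it looks roughly like this; the model name, file contents, and question here are placeholders, not anything from my actual setup:

```python
import json
from urllib import request

def build_generate_payload(model: str, question: str, context: str) -> dict:
    """Build a request body for Ollama's POST /api/generate endpoint."""
    return {
        "model": model,
        "prompt": (
            f"Using only this documentation:\n\n{context}\n\n"
            f"Question: {question}"
        ),
        "stream": False,  # get one JSON response instead of a token stream
    }

payload = build_generate_payload(
    model="gemma2",                      # placeholder model name
    question="Where did my config go wrong?",
    context="...pasted manual text...",  # placeholder: the doc you feed it
)

# Uncomment on a machine that can actually reach the Ollama server:
# req = request.Request(
#     "http://localhost:11434/api/generate",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(json.loads(request.urlopen(req).read())["response"])
```

OpenWebUI does essentially this for you, plus keeping the documents around between chats.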
- Comment on 1U mini PC for AI? 3 weeks ago:
That’s fair and justified. I have the label maker right now in my hands. I can fix this at any moment and yet I choose not to.
I’m the man feeding orphans to the orphan crushing machine. I can stop this at any moment.
- Comment on 1U mini PC for AI? 3 weeks ago:
Oh, and my home office setup uses Tiny-in-One monitors, so I configured these by plugging them into my monitor, which was sick.
I’m a huge fan of this all in one idea that is upgradable.
- Comment on 1U mini PC for AI? 3 weeks ago:
These are M715q ThinkCentres with a Ryzen 5 PRO 2400GE
- Comment on 1U mini PC for AI? 3 weeks ago:
Not a lot of thought went into the rack choice, really. I wanted something smaller and more powerful than the several OptiPlexes I had.
I also decided I didn’t want storage to happen here anymore because I am stupid and only knew how to pass through disks for Truenas. So I had 4 truenas servers on my network and I hated it.
This was just what I wanted at a price I was good with, at like $120. There’s a 3D printable version but I wasn’t interested in that. I do want to 3D print racks, and I want to make my own custom ones for the Pis to save space.
But this set up is way cheaper if you have a printer and some patience.
- Comment on 1U mini PC for AI? 3 weeks ago:
Not much. As much as I like LLMs, I don’t trust them for more than rubber duck duty.
Eventually I want to have a Copilot at Home set up where I can feed a notes database and whatever manuals and books I’ve read so it can draw from that when I ask it questions.
The problem is my best GPU is my gaming GPU, a 5060 Ti, and it’s in a Bazzite gaming PC, so it’s hard to get the AI out of it because of Bazzite’s “No, I won’t let you break your computer” philosophy, which is why I chose it. And my second best GPU is a 3060 12 GB, which is really good, but if I made a dedicated AI server, I’d want it to be better than my current server.