Comment on GPU prices are coming to earth just as RAM costs shoot into the stratosphere - Ars Technica
hoshikarakitaridia@lemmy.world 3 weeks ago
AI or servers probably. I have 40GB and that’s what I would need more RAM for.
I’m still salty because I had the idea of going CPU & RAM sticks for AI inference literally days before the big AI companies did. And my stupid ass didn’t buy them in time before the prices skyrocketed. Fuck me I guess.
I’m often using 100GB of RAM for AI.
Earlier this year I was going to buy a bunch of used servers with 1TB of RAM, and I wish I had.
Damn
Yeah, used RAM is probably where it’s at. Maybe you can get it used later on from data centers…
Yep, used DDR3 or DDR4 ECC server RAM is basically being thrown out. Unfortunately most consumer mainboards do not support ECC.
This is exactly the reason I’m about to order a Dell PowerEdge R630 with an Intel Xeon 2680 v4 from Alibaba.
Also I’ve never ordered from Alibaba before, so we’ll see if I get scammed xd
NotMyOldRedditName@lemmy.world 3 weeks ago
It does work, but it’s not really fast. I upgraded from 32GB to 96GB, and being able to play with the bigger models was fun, but it’s not something I could do anything productive with; it was so slow.
possiblylinux127@lemmy.zip 3 weeks ago
You’re bottlenecked by memory bandwidth.
You need DDR5 with lots of memory channels for it to be useful.
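Rough back-of-the-envelope sketch of why bandwidth is the ceiling (illustrative numbers, assuming peak theoretical DRAM bandwidth and a dense model, not benchmarks): during decode, essentially all of a dense model’s weights get streamed from RAM for every token, so tokens/sec tops out around bandwidth divided by model size.

```python
# Rough decode-speed ceiling for dense LLMs on CPU: each generated token
# reads ~all weights from RAM, so throughput is memory-bandwidth-bound.
# All numbers below are illustrative assumptions, not measurements.

def bandwidth_gbps(channels: int, mt_per_s: int, bus_bytes: int = 8) -> float:
    """Peak DRAM bandwidth in GB/s: channels * 8-byte bus * transfer rate."""
    return channels * bus_bytes * mt_per_s / 1000

def max_tokens_per_s(model_gb: float, bw_gbps: float) -> float:
    """Upper bound on decode speed: bandwidth / bytes read per token."""
    return bw_gbps / model_gb

model_gb = 40  # e.g. a ~70B-parameter model at 4-bit quantization, ~40 GB

for name, channels, speed in [
    ("consumer dual-channel DDR5-5600", 2, 5600),
    ("server 8-channel DDR5-4800", 8, 4800),
]:
    bw = bandwidth_gbps(channels, speed)
    print(f"{name}: ~{bw:.0f} GB/s -> at most {max_tokens_per_s(model_gb, bw):.1f} tok/s")
```

That’s roughly why two consumer DIMM channels land in “slow but it works” territory, while multi-channel server platforms (or GPUs) pull well ahead.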
hoshikarakitaridia@lemmy.world 3 weeks ago
I always thought average DDR5 speeds with like 64GB in sticks on consumer boards were passable. Not great, but passable.
tal@lemmy.today 3 weeks ago
You can have applications where wall clock time is not all that critical but large model size is valuable, or where a model is very sparse, so it does little computation relative to the size of the model, but for the major applications, I think that’s correct.
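To illustrate the sparse-model point with made-up numbers: in a mixture-of-experts model the whole thing has to sit in RAM, but each token only streams the active experts’ weights, so the bandwidth ceiling is set by the active parameter count rather than the full model size.

```python
# Toy illustration of the sparse/MoE case (all figures hypothetical):
# total weights must fit in RAM, but each token only reads the active slice,
# so the bandwidth-bound tokens/sec ceiling is much higher than a dense
# model of the same total size would allow.

bw_gbps = 89.6           # dual-channel DDR5-5600, peak theoretical
total_weights_gb = 50    # hypothetical MoE at 4-bit, fits in 64 GB of RAM
active_weights_gb = 8    # only a couple of experts touched per token

dense_ceiling = bw_gbps / total_weights_gb    # if it were dense: ~1.8 tok/s
sparse_ceiling = bw_gbps / active_weights_gb  # sparse/MoE: ~11 tok/s ceiling

print(f"dense-equivalent ceiling: {dense_ceiling:.1f} tok/s")
print(f"sparse (MoE) ceiling:     {sparse_ceiling:.1f} tok/s")
```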
NotMyOldRedditName@lemmy.world 3 weeks ago
Ya, that’s fair. If I was doing something I didn’t care about time on, it did work.