I just had a look: on the 2nd of April I paid 67,000 KRW for one 16 GB stick. Now the same one (XPG DDR5 PC5-48000 CL30 LANCER BLADE White) is only sold in pairs, and a pair costs 470,000 KRW in the same shop, so 235,000 KRW per 16 GB stick. That is a price increase of 250%, god damn.
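For reference, the increase works out like this (a quick sanity check in Python; the prices are the ones quoted above):

```python
old_price = 67_000        # KRW per 16 GB stick, paid 2nd of April
new_price = 470_000 / 2   # KRW per stick now, sold only in pairs

# Percentage increase relative to the old price
increase_pct = (new_price - old_price) / old_price * 100
print(f"{increase_pct:.0f}% increase")  # -> 251% increase
```

So roughly 3.5x the old price, i.e. about a 250% increase.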
Comment on GPU prices are coming to earth just as RAM costs shoot into the stratosphere - Ars Technica
jeena@piefed.jeena.net 5 hours ago
This is very unfortunate. About a year ago I built my PC and only put in 32 GB of RAM. It was double what I had on my laptop, so I thought it would be enough for the beginning and I could buy more later.
After just 2 months I realized I could do so much more in parallel because of the fast CPU, but suddenly the amount of RAM became the bottleneck. When I looked at RAM prices it didn't seem quite worth it, so I waited. That backfired, because since then the prices have never gone down, only up.
jeena@piefed.jeena.net 2 hours ago
tal@lemmy.today 3 hours ago
Last I looked, a few days ago on Google Shopping, you could still find some retailers that had stock of DDR5 (I was looking at 2x16GB, and you may want more than that) and hadn't jacked their prices up. But if you're going to buy, I would not wait any longer, because if they haven't been cleaned out by now, I expect they will be soon.
roofuskit@lemmy.world 2 hours ago
Switch to Linux, double your available RAM for free.
jeena@piefed.jeena.net 2 hours ago
I’ve been on Linux since 2002.
VaalaVasaVarde@sopuli.xyz 9 minutes ago
Then use a distribution from 2002, it uses less RAM.
/s
NotSteve_@piefed.ca 3 hours ago
What are you running that needs more than 32 GB? I'm only just barely being bottlenecked by my 24 GB when running games at 4K.
jeena@piefed.jeena.net 2 hours ago
Two browsers full of tabs, but that's not the problem. Once I start compiling AOSP (which I sometimes want to do for work at home instead of in the cloud, because it's easier and faster to debug), it eats up all the RAM immediately and I have to give it 40 more GB of swap, and then the swapping becomes the bottleneck. While that is running the computer can't really do anything else, even the browser struggles.
hoshikarakitaridia@lemmy.world 3 hours ago
AI or servers probably. I have 40 GB and that's what I would need more RAM for.
I'm still salty because I had the idea of going CPU & RAM sticks for AI inference literally days before the big AI companies. And my stupid ass didn't buy them in time before the prices skyrocketed. Fuck me I guess.
NotMyOldRedditName@lemmy.world 3 hours ago
It does work, but it's not really fast. I upgraded from 32 GB to 96 GB, and being able to play with the bigger models was fun, but it was so slow I couldn't do anything productive with it.
possiblylinux127@lemmy.zip 3 hours ago
You're bottlenecked by memory bandwidth.
You need DDR5 with lots of memory channels for it to be useful.
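Rough back-of-envelope for why bandwidth dominates: during token generation, every active weight has to be streamed through memory once per token, so tokens/sec is bounded by roughly bandwidth divided by model size. A sketch (the bandwidth and model-size numbers below are illustrative assumptions, not benchmarks):

```python
def tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound on decode speed when memory-bandwidth bound:
    generating each token requires reading all active weights once."""
    return bandwidth_gb_s / model_size_gb

# Illustrative numbers (assumptions, not measurements):
desktop_ddr5 = 80    # GB/s, ballpark for dual-channel DDR5
server_ddr5 = 460    # GB/s, ballpark for a 12-channel DDR5 server
model = 40           # GB, e.g. a large model at ~4-bit quantization

print(f"desktop: ~{tokens_per_sec(desktop_ddr5, model):.1f} tok/s")
print(f"server:  ~{tokens_per_sec(server_ddr5, model):.1f} tok/s")
```

With those assumed numbers the desktop tops out around 2 tok/s while the many-channel server manages ~11, which matches the "lots of memory channels" point above.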
tal@lemmy.today 3 hours ago
You can have applications where wall-clock time is not all that critical but large model size is valuable, or where a model is very sparse and so does little computation relative to its size, but for the major applications, I think that's correct.
panda_abyss@lemmy.ca 1 hour ago
I'm often using 100 GB of RAM for AI.
Earlier this year I was going to buy a bunch of used servers with 1 TB of RAM, and I wish I had.