Comment on GPU prices are coming to earth just as RAM costs shoot into the stratosphere - Ars Technica

hoshikarakitaridia@lemmy.world 2 hours ago
AI or servers, probably. I have 40GB, and that's what I would need more RAM for.

I'm still salty because I had the idea of going CPU & RAM sticks for AI inference literally days before the big AI companies. And my stupid ass didn't buy them in time before the prices skyrocketed. Fuck me, I guess.

NotMyOldRedditName@lemmy.world 1 hour ago
It does work, but it's not really fast. I upgraded from 32GB to 96GB, and being able to play with the bigger models was fun, but it was so slow I couldn't do anything productive with it.
possiblylinux127@lemmy.zip 1 hour ago
You're bottlenecked by memory bandwidth.

You need DDR5 with lots of memory channels for it to be useful.
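A quick back-of-envelope sketch of that bottleneck (my own illustrative numbers, not from this thread): during token generation a dense model streams roughly all of its weights from RAM once per token, so bandwidth divided by model size gives a hard ceiling on tokens per second.

```python
# Hypothetical sketch: decode-speed ceiling for a dense model.
# Every generated token reads ~all weights once, so the memory bus,
# not the CPU, sets the pace. Bandwidth figures are ballpark.

def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound on decode speed, ignoring compute, caches, and KV-cache reads."""
    return bandwidth_gb_s / model_size_gb

# Dual-channel DDR4-3200 (~51 GB/s) with a 40 GB quantized model: ~1.3 tok/s
print(max_tokens_per_sec(51, 40))

# 8-channel DDR5-4800 (~307 GB/s), same model: ~7.7 tok/s
print(max_tokens_per_sec(307, 40))
```

That ratio is why an 8- or 12-channel server board helps far more than a faster CPU.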
tal@lemmy.today 1 hour ago
You can have applications where wall-clock time is not all that critical but large model size is valuable, or where a model is very sparse and so does little computation relative to its size, but for the major applications, I think that's correct.
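The sparse case can be put in the same back-of-envelope terms (again, ballpark figures of my own): a mixture-of-experts model keeps all of its weights in RAM but only reads the active experts per token, so the effective size in the bandwidth formula is the active parameter count, not the total.

```python
# Hypothetical sketch of the sparse-model case: a mixture-of-experts model
# holds all weights in RAM but only streams the active experts per token,
# so the effective "model size" in the bandwidth formula shrinks.

def moe_max_tokens_per_sec(bandwidth_gb_s: float, active_gb: float) -> float:
    """Decode ceiling when only active parameters are read per token."""
    return bandwidth_gb_s / active_gb

# e.g. a Mixtral-style MoE: ~47B params total (~47 GB at 8-bit) but only
# ~13B active per token, on ~307 GB/s of 8-channel DDR5: ~24 tok/s
print(moe_max_tokens_per_sec(307, 13))
```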
NotMyOldRedditName@lemmy.world 1 hour ago
Ya, that’s fair. If I was doing something I didn’t care about time on, it did work.
panda_abyss@lemmy.ca 27 minutes ago
I'm often using 100GB of RAM for AI.

Earlier this year I was going to buy a bunch of used servers with 1TB of RAM, and I wish I had.