Comment on Framework’s first desktop is a strange—but unique—mini ITX gaming PC

slacktoid@lemmy.ml 1 year ago
I understand the memory constraints, but it does feel weird for Framework, is all I have to say. That also seems to be the general trajectory of computing, though. I really want LPCAMM to catch on!

Scholars_Mate@lemmy.world [bot] 1 year ago
Apparently Framework did try to get AMD to use LPCAMM, but it just didn’t work from a signal integrity standpoint at the kind of speeds they need to run the memory at.
grue@lemmy.world 1 year ago
Sounds like it doesn’t bode well for the future of DIMMs at all, TBH.
avidamoeba@lemmy.ca 1 year ago
My AM5 system doesn’t POST with 128GB of 5600 DDR5 at higher than 4400 at JEDEC timings and voltage. 2 DIMMs are fine. 4 DIMMs… rip. So I’d say the present of DIMMs is already a bit shaky. DIMMs are great for lots of cheap RAM. I paid a lot less than what I’d have to pay for the equivalent amount of RAM in a Framework Desktop. Of course, there’s a significant difference in speed.
SpaceNoodle@lemmy.world 1 year ago
You have a DIMM view of the future.
brucethemoose@lemmy.world 1 year ago
Eventually most system RAM will have to be packaged anyway. Physics dictates that you pay a penalty for sending signals over pins and motherboard traces, and it gets more severe with every advancement.
It’s possible that external RAM will eventually evolve into a “2nd tier” of system memory for background processes, spillover, inactive programs/data, things like that.
slacktoid@lemmy.ml 1 year ago
That would be fine, as long as the system can use it as RAM and not just a staging ground.
brucethemoose@lemmy.world 1 year ago
Keep in mind that it would be pretty slow, as it doesn’t make sense to burn power and die area on a wide secondary bus.
leisesprecher@feddit.org 1 year ago
It’s already fourth tier after L1, L2, L3 caches.
Maybe something like Optane will make a comeback. Having 16 GB of soldered RAM and 500 GB of relatively slow but inexpensive Optane RAM would be great.
brucethemoose@lemmy.world 1 year ago
DRAM is so cheap and ubiquitous that they will probably keep using that, barring any massive breakthroughs. The “persistence after power-off” is nice to have, but not strictly needed.