Comment on 16GB of RAM Could Be the New Minimum in Apple's Upcoming M4 Macs
lengau@midwest.social 2 months agoMy Linux machine has 64 GiB of RAM, which is like 128 GiB of Mac RAM. It’s still not enough
areyouevenreal@lemm.ee 2 months ago
Serious question: what are you using all that RAM for? I'm having a hard time justifying upgrading one of my laptops to 32 GiB, never mind 64 GiB.
lengau@midwest.social 2 months ago
For me in particular: I'm a software developer who works on developer tools, so I have a lot of tests running in VMs so I can test on different operating systems. I just finished running a test suite that used over 50 GB of RAM.
InvertedParallax@lemm.ee 2 months ago
Same, 48c/96t with 192 GB of RAM.
make -j is fun; htop triggers epilepsy.
A few VMs, but tons of LXC containers. It's like having one machine that runs 20 systems in parallel, and really fast.
I have containers for dev, for browsing, for wine. The dream finally made manifest.
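For anyone trying this at home: an unbounded make -j starts every job at once and is exactly the kind of thing that eats all that RAM. A common guard (an illustrative sketch, not something from the comment above) is to cap jobs at the core count:

```shell
# Unbounded `make -j` spawns one compiler per target simultaneously and
# can exhaust RAM; capping at the core count keeps the build parallel
# but bounded. nproc is GNU coreutils; sysctl is the macOS/BSD fallback.
jobs="$(nproc 2>/dev/null || sysctl -n hw.ncpu)"
echo make -j"${jobs}"
```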
Mistic@lemmy.world 2 months ago
For games, modding uses a lot. It can get to the point of needing more than 32 GB, but rarely.
Usually, you'd want 64 GB or more for things like video editing, 3D modeling, running simulations, LLMs, or virtual machines.
areyouevenreal@lemm.ee 2 months ago
I use virtual machines and run local LLMs. LLMs need VRAM rather than CPU RAM, though; you shouldn't be doing it on a laptop without a serious NPU or GPU, if at all. I don't know if I will be using VMs heavily on this machine or not, but that would be a good reason to have more RAM. Even so, 32 GiB should be enough for a few VMs running concurrently.
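A back-of-envelope calculation (illustrative numbers, not from the comment) shows why memory is the bottleneck for local LLMs: just holding the weights takes parameter count times bytes per parameter, before any KV cache or runtime overhead.

```python
# Rough memory needed just to hold an LLM's weights.
# Ignores KV cache, activations, and runtime overhead, so real
# usage is higher; figures are illustrative assumptions.
def weights_gib(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 2**30

fp16 = weights_gib(7, 2)     # 7B model at fp16: ~13 GiB
q4 = weights_gib(7, 0.5)     # same model, 4-bit quantized: ~3.3 GiB
print(f"{fp16:.1f} GiB at fp16, {q4:.1f} GiB at 4-bit")
```

Quantization is what makes a 7B model fit on a 4-8 GiB GPU at all; at fp16 it already outgrows most laptop VRAM.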
Mistic@lemmy.world 2 months ago
That's fair. I put it there as more of a possible use case than something you should be doing.
Although an iGPU can perform quite well when given a lot of RAM.
tal@lemmy.today 2 months ago
Honestly, I think that for many people, if they're using a laptop or phone, doing LLM stuff remotely makes way more sense. It's just too power-intensive to do a lot of that on battery. That doesn't mean not controlling the hardware: I keep a machine with a beefy GPU connected to the network and can use it remotely. Something like Stable Diffusion normally requires only pretty limited bandwidth to use remotely.
If people really need to do a bunch of local LLM work, like they have a hefty source of power but lack connectivity, or maybe they’re running some kind of software that needs to move a lot of data back and forth to the LLM hardware, I think I might consider lugging around a small headless LLM box with a beefy GPU and a laptop, plug the LLM box into the laptop via Ethernet or whatnot, and do the LLM stuff on the headless box. Laptops are just not a fantastic form factor for heavy crunching; they’ve got limited ability to dissipate heat and tight space constraints to work with.
Reverendender@sh.itjust.works 2 months ago
Photo Editing, Video Transcoding.
datelmd5sum@lemmy.world 2 months ago
k8s
tal@lemmy.today 2 months ago
Any memory that’s going unused by apps is going to be used by the OS for caching disk contents. That’s not as significant with SSD as with rotational drives, but it’s still providing a benefit, albeit one with diminishing returns as the size increases.
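On Linux you can see this directly: much of what looks "used" is page cache that the kernel drops on demand when applications need the memory. A minimal sketch that reads /proc/meminfo (Linux-only; it returns None elsewhere):

```python
# Show how much RAM the kernel is using as page cache.
# /proc/meminfo exists only on Linux; values are reported in kiB.
from pathlib import Path

def meminfo_kib():
    path = Path("/proc/meminfo")
    if not path.exists():
        return None
    info = {}
    for line in path.read_text().splitlines():
        key, _, rest = line.partition(":")
        info[key] = int(rest.split()[0])  # first field is the kiB value
    return info

info = meminfo_kib()
if info:
    cached = info.get("Cached", 0) + info.get("Buffers", 0)
    print(f"page cache: {cached / 2**20:.1f} GiB "
          f"of {info['MemTotal'] / 2**20:.1f} GiB total")
```

This is why "MemAvailable" rather than "MemFree" is the number to watch: free-looking RAM is mostly cache doing useful work.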
areyouevenreal@lemm.ee 2 months ago
Outside of storage servers and ZFS, no one is buying RAM specifically to use as disk cache. You will also find that Windows laptops are designed to be left in sleep rather than hibernate.