Written by someone who apparently has no understanding of virtual memory. Chrome may claim 500MB per tab but I’ll eat my hat if the majority of that isn’t shared between tabs and paged out.
If I’m misunderstanding, then how the fuck is Chrome with its 35+ open tabs functioning on my 16GB M1 machine (with a full other application load, including IDEs and Docker with 8GB allocated)?
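Back-of-the-envelope, with made-up numbers (the 500MB/400MB split below is an assumption for illustration, not a measurement): per-tab "memory usage" typically reports RSS, which counts shared pages (the browser binary, frameworks, caches) in full for every tab. A PSS-style accounting divides each shared page among the processes using it, which is why 35 tabs don’t actually need 35 × 500MB.

```python
def pss_mb(private_mb, shared_mb, n_sharers):
    """Proportional set size: private pages plus an even share of shared pages."""
    return private_mb + shared_mb / n_sharers

tabs = 35
per_tab_rss = 500   # MB each tab *appears* to use (shared pages counted in full)
shared = 400        # MB of that assumed shared across tabs (hypothetical figure)
private = per_tab_rss - shared

naive_total = tabs * per_tab_rss                      # 17500 MB -- the scary number
actual_total = tabs * pss_mb(private, shared, tabs)   # 35*100 + 400 = 3900 MB

print(naive_total, round(actual_total))  # -> 17500 3900
```

Under these assumed numbers the real footprint is under a quarter of the naive sum, before the OS even starts paging cold tabs out.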
Shadywack@lemmy.world 6 months ago
Looks like you didn’t read the article either.
Earlier it’s mentioned that they have 15 tabs open. I don’t like a lot of things they do in “gaming journalism” but on this article they’re spot on. Apple is full of shit in saying 8GB is enough by today’s standards. 8GB is a fuckin joke, and you can’t add any RAM later.
ABCDE@lemmy.world 6 months ago
That doesn’t make sense. I have the 8GB M2 and don’t have any issues with 20+ tabs, video calling, torrents, Luminar, Little Snitch, etc open right now.
Shadywack@lemmy.world 6 months ago
15 tabs of Safari, which many consider the better browser for its efficiency and available privacy configuration options. But what if you prefer Chrome or Firefox?
I will argue in Apple’s defense that their stack includes very effective libraries that intrinsically make applications on macOS better in many regards, but 8GB is still 8GB, and an SoC isn’t upgradeable. The competition has far cheaper 16GB options, and Apple is back to looking like complete assholes again.
ABCDE@lemmy.world 6 months ago
I’m using Chrome.
disguy_ovahea@lemmy.world 6 months ago
That’s because PC people try to equate specs across dissimilar architectures, running an OS that isn’t written explicitly to utilize that hardware. They haven’t read enough about it or experienced it in practice to have an informed opinion. We can get downvoted together on our “substandard hardware” that works wonderfully. lol
pivot_root@lemmy.world 6 months ago
The only memory-utilization advantage gained by sharing memory between the CPU and GPU is zero-copy operations between the two. The occasional texture upload and framebuffer access is nowhere near enough to make 8 GiB the functional equivalent of 16 GiB.
If you want to see something “written explicitly to utilize [a unified memory] architecture,” look no further than the Nintendo Switch. The operating system and applications are designed specifically for the hardware, and even first-party titles are choked by the hardware’s memory capacity and bandwidth.
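To put a rough number on that zero-copy saving (illustrative figures, not measurements — the 600MB/6000MB split is assumed): unified memory saves at most one duplicate copy of the GPU-resident data, which is a small slice of a desktop working set.

```python
# Made-up numbers: how much RAM does zero-copy actually save?
gpu_resident_mb = 600      # textures, framebuffers, vertex data (assumed)
app_working_set_mb = 6000  # everything else: heaps, page cache, browser tabs (assumed)

# Discrete GPU, worst case: the CPU keeps a permanent staging copy of GPU data.
discrete_total = app_working_set_mb + 2 * gpu_resident_mb   # 7200 MB
# Unified memory: CPU and GPU share the single allocation.
unified_total = app_working_set_mb + gpu_resident_mb        # 6600 MB

saving = discrete_total - unified_total
print(saving)  # -> 600; bounded by GPU-resident data, nowhere near 8192 MB
```

Even granting the GPU-side saving in full, it doesn’t come close to closing an 8 GiB gap.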
magiccupcake@lemmy.world 6 months ago
Oh no, I read the article; I just don’t consider that testing.
It’s not really apt to measure RAM usage of a browser on one computer and extrapolate it to another; there’s a lot of complicated RAM and cache management happening in the background.
Testing would involve getting an 8GB Mac and using it for common tasks to see if you can measure poorer performance, be it lag, stutters, or frame drops.
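One concrete way to do that on macOS is to watch the `vm_stat` counters (pageouts and swap activity) before and after a workload. A minimal parser sketch, assuming the usual `Label:   12345.` line format that `vm_stat` prints (the sample text below is abridged, not real output):

```python
import re

def parse_vm_stat(text):
    """Parse vm_stat-style output ("Label:   12345.") into a dict of counters."""
    stats = {}
    for line in text.splitlines():
        m = re.match(r"\s*(.+?):\s+(\d+)\.?\s*$", line)
        if m:
            stats[m.group(1)] = int(m.group(2))
    return stats

# Abridged sample; on a Mac you'd capture subprocess.check_output(["vm_stat"]).
sample = """\
Pages free:                               12345.
Pages active:                            678901.
Pageouts:                                   250.
Swapins:                                      0.
"""
counters = parse_vm_stat(sample)
# A rising "Pageouts"/"Swapins" delta between two samples taken under load
# is direct evidence of memory pressure, unlike eyeballing per-tab numbers.
print(counters["Pageouts"])  # -> 250
```

Pair that with a frame-time capture and you have an actual 8GB-vs-16GB test rather than a screenshot of Activity Monitor.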
Shadywack@lemmy.world 6 months ago
You do have a point, but I think the intent of the article is to convey the common understanding that Apple is leaning on sales tactics to convince people of a thing that anyone with technical acumen sees through immediately. Regardless of how efficient Mach/Darwin is, it’s still an apples-to-apples comparison (pun intended): 8GB fills up quickly in 2024. Those who need a fully quantitative performance measurement between 8GB and 16GB, with enough applications loaded to demonstrate the thrashing that starts happening, aren’t really the audience. THAT audience is busy reading about gardening tips, lifestyle, and celebrity gossip.