That’s a huge generalization, and it depends on what you use your system for. Some people might be on old Threadripper workstations that work fine, for instance, and can just slap in a second GPU. Or maybe someone needs more cores for work; they can just swap their CPU out. Maybe your 4K gaming system can make do with an older CPU.
I upgraded RAM and storage just before the RAMpocalypse, and that’s not possible on many laptops. And I can stuff a whole bunch of SSDs into the case and use them all at once.
…That being said, there are a lot of trends going against people, especially for gaming:
- There’s “initial build FOMO,” where buyers max out their platform at the start, even when that’s financially unwise and they miss out on later sales/deals.
- We just went from DDR4 to DDR5, on top of some questionable segmentation from AMD/Intel. So yeah, sockets aren’t the longest-lived.
- Time gaps between generations are growing as silicon gets more expensive to design.
- …Buyers are collectively stupid and bandwagon. See: the crazy low-end Nvidia GPU sales when they have every reason to buy AMD/Intel/used Nvidia instead. So they’re rewarding bad behavior from companies.
- On the other hand, individual parts are more repairable. If my 3090 dies, for instance, I can send it to a repairperson and have a good chance of saving it.
- And you can still keep your PSU, case, CPU cooler, storage and such. It’s a drop in the bucket cost-wise, but it’s not nothing.
IMO things would be a lot better if GPUs were socketable, with LPCAMM memory on the motherboard.
claymore@pawb.social 8 minutes ago
Don’t forget about PCIe expansion. Just yesterday I got a FireWire PCIe card for 20€ to transfer old DV tapes to digital with no quality loss. Plug the card in and you’re done. To get the same result on a laptop I’d need a Thunderbolt port and two adapters, one of which isn’t manufactured anymore and goes for 150€+ on secondhand marketplaces.
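In case it helps anyone doing the same: here’s a minimal sketch of the capture side on Linux, assuming the dvgrab tool is installed and the kernel sees the FireWire card (firewire_ohci driver). The `capture_tape` function and the tape name are placeholders I made up, not part of any particular workflow:

```python
# Minimal sketch: drive dvgrab from Python to dump a DV tape bit-for-bit.
# Assumes dvgrab is installed and the FireWire card is visible to the
# kernel. capture_tape and the filename prefix are hypothetical,
# just for illustration.
import subprocess

def capture_tape(basename: str) -> None:
    """Rewind the tape, then capture the raw DV stream with no re-encoding."""
    subprocess.run(
        [
            "dvgrab",
            "--rewind",         # rewind to the start of the tape first
            "--format", "raw",  # raw DV stream: lossless, no transcoding
            "--size", "0",      # 0 = don't split the capture into chunks
            f"{basename}-",     # dvgrab appends a number + extension
        ],
        check=True,             # raise if dvgrab exits with an error
    )

capture_tape("dv-tape-01")
```

The point of raw capture is that DV is already compressed on the tape, so grabbing the stream as-is is what keeps it lossless; you can always remux or transcode a copy later.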