Comment on Nvidia might not have any new gaming GPUs in 2026 — and could be 'slashing production' of existing GeForce models

balsoft@lemmy.ml ⁨14⁩ ⁨hours⁩ ago

It’s nicer to develop anything on a beefy machine, I was rocking a 7950X until recently. The compile times are a huge boon, and for some modern bloated bullshit (looking at you, Android) you definitely need a beefy machine to build it in a realistic timeframe.

However, we can totally solve a lot of real-world problems with old, cheap, crappy hardware; we just never wanted to, because it was “cheaper” for some poor soul in China to build a new PC every year than for a developer to spend an extra week thinking about efficiency. That appears to be changing now, especially if your code will be running on consumer hardware.

My dad used to “write” software for basic aerodynamic modelling on punchcards, on a mainframe that had about as much computing power as some modern microcontrollers. You wouldn’t even consider it a potato by today’s standards. I’m sure that if we have our wits about us, we can optimize our stacks to compile code on a friggin 3.5GHz 10-core CPU (which is 10 years old now).
