Stuff designed for much higher peak usage tends to have a lot more waste.
For example, a 400W power supply (which is probably what's in the original PC of your example) will waste more power than a lower-wattage one (unless it's a very expensive one), so in that example of yours it should be replaced by something much smaller.
Even beyond that, everything in there - another example, the motherboard - will have a lot more power leakage than something designed for a low power system (say, an ARM SBC).
Unless it’s a notebook, that old PC will always consume more power than, say, an N100 Mini-PC, much less an ARM based one.
WhyJiffie@sh.itjust.works 1 week ago
in my experience power supplies are most efficient near 50% utilization. be quiet! PSUs have charts about it
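That ~50% sweet spot can be sketched numerically. The curve below is an assumed, illustrative shape (roughly how published efficiency charts look, with efficiency climbing from light load to a peak near half load); the actual numbers for any given unit come from its datasheet:

```python
# Illustrative sketch only: an ASSUMED efficiency curve shaped like typical
# published PSU charts (low at light load, peaking near 50% utilization).
# The specific percentages are made up for demonstration, not measured data.

def efficiency(load_w: float, rated_w: float) -> float:
    """Assumed efficiency at a given load for a PSU of the given rating."""
    util = load_w / rated_w
    if util <= 0.2:
        return 0.70 + util * 0.5                    # climbs from ~70% toward 80%
    if util <= 0.5:
        return 0.80 + (util - 0.2) / 0.3 * 0.10     # 80% rising to ~90% at half load
    return 0.90 - (util - 0.5) * 0.08               # slowly drops past 50%

def wasted_watts(load_w: float, rated_w: float) -> float:
    """Input power drawn from the wall minus power actually delivered."""
    return load_w / efficiency(load_w, rated_w) - load_w

# A ~30W idle desktop load sits at 7.5% utilization on an oversized 400W
# unit, but at 20% on a 150W unit, closer to its efficient region.
for rated in (400, 150):
    print(f"{rated}W PSU at 30W load: ~{wasted_watts(30, rated):.1f}W wasted")
```

Under these assumed numbers the oversized 400W unit wastes noticeably more at the same idle load, which is the point being made: the rating you should buy depends on where your typical load lands on the curve, not just on peak draw.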
Aceticon@lemmy.dbzer0.com 1 week ago
The way one designs hardware is to optimize for the most common usage scenario with enough capacity to account for the peak use scenario (and with some safety margin on top).
However, specifically for power supplies, handling more power means using, for example, larger capacitors and switching MOSFETs, and those have more leakage, hence more baseline losses. Mind you, with more expensive components one can get higher-power parts with less leakage, but that's not going to happen outside specialist power supplies specifically designed for high peak use AND low baseline power consumption, and I'm not even sure there's a genuine use case for such a design that justifies paying the extra cost for high-power, low-leakage components.