How does the power consumption compare between two performance-equivalent setups? (Genuine question; it’s something I’m trying to determine for my self-hosting use cases.)
My first RPi is for Joplin to replace OneNote. My current server runs 24/7 and costs about $1/day in power (it provides other services too). I haven’t measured the Pi’s power consumption yet, but it runs on a 2.5-watt power supply vs. my server’s 700-watt supply (of course, both of those are peak ratings, not actual draw).
Given that my self-hosted stuff will spend 99% of its time idle, the Pi seems to have a massive advantage. But of course that all depends on how things are used and set up.
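For a rough comparison, average draw converts to a daily cost like this. The electricity rate and the idle-draw figures in the sketch are assumptions for illustration, not measurements:

```python
# Back-of-envelope: daily energy cost from average power draw.
# ASSUMPTIONS: $0.15/kWh rate; ~100 W server idle draw; the Pi
# figure reuses the 2.5 W supply rating as a worst case.
RATE_PER_KWH = 0.15  # $/kWh (assumed)

def daily_cost(avg_watts: float, rate: float = RATE_PER_KWH) -> float:
    """Cost per day for a device drawing avg_watts continuously."""
    kwh_per_day = avg_watts * 24 / 1000
    return kwh_per_day * rate

server = daily_cost(100)  # assumed idle draw, not the 700 W peak
pi = daily_cost(2.5)
print(f"server: ${server:.2f}/day, pi: ${pi:.3f}/day")
```

The point of the sketch is that idle draw, not the supply’s peak rating, drives the daily cost, which is why a kill-a-watt style measurement is worth doing before committing either way.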
bustrpoindextr@lemmy.world 1 year ago
The Raspberry Pi, like most RISC chips, uses much less power.
In fact, the supercomputer Summit runs on powerpc64 (POWER9), a RISC architecture; that’s a big reason why its power consumption is so low for a supercomputer.
BearOfaTime@lemm.ee 1 year ago
I hadn’t considered the RISC angle. Does RISC consistently use less power than CISC at a given performance level (MFLOPS, for example), or is there a better way to compare power consumption against performance?
I realize this is kind of esoteric for my use cases, but it would be useful for projecting whether spending X dollars on Y Pis recoups the investment, in power savings alone, over a given period.
E.g., if I can cut my power consumption by 70% by switching to three RPis, I can recoup their cost in 2-3 years. Since my server needs replacing anyway, this seems like a no-brainer.
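That payback estimate can be sketched as a one-liner. The per-Pi cost below is an assumed figure (board plus PSU, storage, and case), not a quote; only the $1/day and 70% numbers come from the thread:

```python
# Hypothetical payback calculation: days until power savings from
# replacing the server with Pis cover the Pis' purchase price.
server_cost_per_day = 1.00   # $/day, from the figure quoted above
savings_fraction = 0.70      # assumed 70% reduction in power cost
pi_count = 3
pi_unit_cost = 200.0         # ASSUMED $ per node (board, PSU, storage, case)

daily_savings = server_cost_per_day * savings_fraction   # $0.70/day
payback_days = pi_count * pi_unit_cost / daily_savings
print(f"payback: {payback_days:.0f} days (~{payback_days / 365:.1f} years)")
```

With those assumptions the savings cover the hardware in roughly 2.3 years, consistent with the 2-3 year estimate; the answer is quite sensitive to the per-node cost and the actual measured savings.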