Comment on [deleted]
razorcandy@discuss.tchncs.de 5 days ago
They’re great, aren’t they?
Heat pumps normally use more electricity to heat than to cool, so it’s surprising that the opposite was happening for you.
My bills have always been lower in the summer. The only time the bills have been unusually high was when there was a problem with the connection between the main electrical supply and the pump, causing it to keep restarting and use more energy.
deranger@sh.itjust.works 5 days ago
Why would they use more energy in one direction versus the other? This doesn’t really make sense to me. Heating and cooling are just a matter of swapping which element is the condenser and which is the evaporator.
IcedRaktajino@startrek.website 5 days ago
Heat pumps move heat. In the summer, it’s pulling heat from inside and moving it outside and the opposite of that in the winter.
Basically, the temperature differential is what makes the difference. The larger the differential, the more energy it has to use.
In the winter, when it’s 30 degrees (F) outside, and you want it to be 70 inside, that’s 40 degrees it has to move. In the summer when it’s 90 degrees outside, and you want it at 70, that’s only 30 degrees.
Air source heat pumps, as the name implies, pull heat from (and exhaust heat to) the ambient air. When it’s really cold in the winter, there’s less ambient heat to move inside, so it has to run longer. Some (all?) heat pumps also have an auxiliary resistive heating element to make up the difference which lowers efficiency quite a bit.
Granted, newer heat pumps can work well down to lower temperatures than the older ones I’m familiar with, but in a nutshell, that’s why they can potentially use less energy in the summer.
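To put rough numbers on the differential argument, here is a minimal sketch using the ideal (Carnot) limit on the coefficient of performance. Real units only reach a fraction of these COPs, but the trend with outdoor temperature is the same:

```python
# Ideal (Carnot) coefficient of performance for the temperatures above.
# COP = units of heat moved per unit of electricity; higher is better.

def f_to_k(temp_f):
    """Fahrenheit to Kelvin."""
    return (temp_f - 32) * 5 / 9 + 273.15

def cop_heat(inside_f, outside_f):
    """Carnot heating COP: T_hot / (T_hot - T_cold), hot side indoors."""
    return f_to_k(inside_f) / (f_to_k(inside_f) - f_to_k(outside_f))

def cop_cool(inside_f, outside_f):
    """Carnot cooling COP: T_cold / (T_hot - T_cold), cold side indoors."""
    return f_to_k(inside_f) / (f_to_k(outside_f) - f_to_k(inside_f))

print(f"Cooling, 90F out / 70F in: COP ~ {cop_cool(70, 90):.0f}")  # ~26
print(f"Heating, 30F out / 70F in: COP ~ {cop_heat(70, 30):.0f}")  # ~13
print(f"Heating,  0F out / 70F in: COP ~ {cop_heat(70, 0):.0f}")   # ~8
```

Even in the ideal case, holding 70F indoors against a 30F day costs about twice the electricity per unit of heat as the 90F cooling case, and it keeps getting worse as the outdoor temperature drops.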
A_norny_mousse@feddit.org 5 days ago
You mean like an electrical resistance heater, basically? I’m pretty sure this one doesn’t have one, and I was told to additionally use the provided wall heaters in the depths of winter, as the heat pump’s efficiency decreases.
deranger@sh.itjust.works 5 days ago
I get that, but outside of using aux heat, it seems like identical temperature differentials should be identically efficient. The heat pump doesn’t use more energy to heat than to cool; it uses more energy in winter because there’s a larger difference to overcome.
howrar@lemmy.ca 5 days ago
Besides the temperature differential that everyone else mentioned, there’s also sometimes the need to defrost the outside bits, which means running the heat pump in reverse and undoing a bit of the heating it already did.
seathru@lemmy.sdf.org 5 days ago
The line pressures are much higher when heating, so the compressor has to work harder. Plus, some units have resistive heaters in the compressors that come on at low temps.
partial_accumen@lemmy.world 5 days ago
If nothing else, the temperature differential needed is very different between cooling in the summer and heating in the winter. Apologies to my Celsius friends. I think most humans consider 70 degrees to be comfortable. If it’s 80 degrees outside, the differential is only 10 degrees (80 - 70 = 10). For most people the hottest outside temperature they may see is 100 to 110 degrees, so we’re looking at a differential of 30 to 40 degrees the heat pump would need to maintain.
Now let’s look at winter. During the coldest months where I am, 0 degree days are pretty common and -10 to -30 can happen occasionally. So the normal differential is 70 degrees! And the uncommon differential can be as bad as 100 degrees! Further, the effort doesn’t just grow linearly with the differential: a heat pump’s coefficient of performance falls as the differential widens, so each extra degree costs more electricity per unit of heat than the last. The farther the target is from the outdoor temperature, the harder it is to reach, and winter is much farther away (a larger differential) than summer.
I know for my home’s heat pump, it draws between 2 kW and 4 kW for normal cooling (it’s a variable speed compressor in mine), while in the depths of winter it’s usually around 4 kW and gets as high as 8 kW when it’s really cold outside (in pure heat pump mode). Because the differential is so much larger in the winter, I’m asking it to do much more work.
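As an illustration of how the input power climbs with the differential, here is a minimal sketch of a hypothetical unit delivering a fixed 10 kW of heat at an assumed constant 45% of the Carnot limit (both numbers are made up for illustration, and a real house also loses heat faster as it gets colder, which compounds the effect):

```python
# Hypothetical: electricity needed to deliver a fixed 10 kW of heat,
# assuming the unit always achieves 45% of the Carnot heating COP.

CARNOT_FRACTION = 0.45  # assumed efficiency relative to the ideal limit

def input_kw(heat_load_kw, inside_c, outside_c):
    """Electrical input (kW) to deliver heat_load_kw of heat indoors."""
    t_hot_k = inside_c + 273.15
    cop = CARNOT_FRACTION * t_hot_k / (inside_c - outside_c)
    return heat_load_kw / cop

for outside_c in (5, -10, -25):
    kw = input_kw(10, 21, outside_c)
    print(f"{outside_c:>4} C outside: ~{kw:.1f} kW of electricity")  # ~1.2 / ~2.3 / ~3.5
```

The exact numbers depend on the house and the unit, but the shape matches the 2 kW to 8 kW spread described above.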
splendoruranium@infosec.pub 5 days ago
The gradient determines that. Moving heat energy from inside ambient 25°C to outside ambient 30°C is easier than moving heat energy from outside ambient 5°C to inside ambient 20°C, for example.
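Plugging those example temperatures into the ideal (Carnot) limits makes the asymmetry concrete; these are idealized upper bounds, not real-unit figures:

```latex
\mathrm{COP}_\text{cool} = \frac{T_\text{cold}}{T_\text{hot} - T_\text{cold}}
  = \frac{298\,\mathrm{K}}{5\,\mathrm{K}} \approx 60,
\qquad
\mathrm{COP}_\text{heat} = \frac{T_\text{hot}}{T_\text{hot} - T_\text{cold}}
  = \frac{293\,\mathrm{K}}{15\,\mathrm{K}} \approx 20
```

The 15 °C lift costs roughly three times as much electricity per unit of heat as the 5 °C one.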
BurgerBaron@piefed.social 5 days ago
And for heating, the gain indoors eventually drops off to essentially nothing once it’s cold enough outside.
Successful_Try543@feddit.org 5 days ago
Like the other commenters have already stated, the conditions (temperature difference) in winter and summer are different. However, if the temperature differences are the same, only reversed, heating requires less energy than cooling: the (electric) compressor power is itself turned into heat, which in heating mode is useful heat delivered indoors, while in cooling mode it adds to the heat that has to be discharged outdoors.
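That argument can be made precise with the energy balance: the compressor work W always ends up as heat somewhere, so at a given operating point the heat delivered indoors is the heat extracted outdoors plus the work. A minimal derivation:

```latex
Q_\text{hot} = Q_\text{cold} + W
\quad\Rightarrow\quad
\mathrm{COP}_\text{heat} = \frac{Q_\text{hot}}{W}
  = \frac{Q_\text{cold} + W}{W}
  = \mathrm{COP}_\text{cool} + 1
```

For example, a machine that cools at a COP of 5 would heat at a COP of 6 at the same operating point; that “plus one” is where a figure like the 20% mentioned below comes from.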
A_norny_mousse@feddit.org 5 days ago
Do you think your example is realistic? That would be 20% - a relevant difference. Not ALL of the electrical power consumption will be turned into indoor heat, I guess?
Successful_Try543@feddit.org 5 days ago
That’s for the ideal case. In a real machine, there are some losses directly at the compressor’s motor.
deranger@sh.itjust.works 5 days ago
There’s the answer I was looking for.
razorcandy@discuss.tchncs.de 5 days ago
Heat pumps first extract heat from the outside air before moving it indoors (I don’t know the exact mechanics of it). The colder it is outside, the more energy is needed to do this. When it’s cooling, it only moves the heat from inside to outside.