Electrical engineer here. There is almost no difference.
The cost of streaming video from a server to your computer is pretty small, basically just transferring the bytes from a hard drive to a network card. This happens in a datacenter on a big server designed to be efficient at it, and serve a ton of people at once. Your own electricity consumption on your viewing device is likely much higher than that. You can calculate your electricity consumption using a Kill-A-Watt or similar device, but here are some averages of measurements I’ve made on my devices:
PC with 27" LCD monitor: 150W
50" TV: 300W
Laptop with internal 14" screen: 40W
Phone with 5" screen: 10W roughly, but it’s complicated
Phone with screen off, speaker only: 2W (guessing here)
Handheld FM radio: less than 1W
If you look at your computer’s CPU usage while watching video, it’s mostly idle. So most of the power consumption goes to the screen and its backlight.
Assuming worst-case coal power releasing 0.4kg of carbon per kWh, a large 300W TV, and say 10% overhead for the server’s energy cost, that works out to roughly 0.13kg of carbon per hour. So don’t worry about it.
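If you want to plug in your own numbers, here’s that back-of-the-envelope math as a small Python sketch. The wattage, 10% server overhead, and 0.4kg/kWh figures are just the assumptions from above, not measured values:

```python
# Rough estimate of carbon per hour of viewing.
# All constants are assumptions taken from the post above.
DEVICE_WATTS = 300           # large 50" TV; swap in your own device's wattage
SERVER_OVERHEAD = 0.10       # assume the server side adds ~10% on top of the device
CARBON_KG_PER_KWH = 0.4      # worst-case coal power

def carbon_per_hour(device_watts: float) -> float:
    """Kilograms of carbon emitted per hour of viewing."""
    kwh_per_hour = device_watts / 1000 * (1 + SERVER_OVERHEAD)
    return kwh_per_hour * CARBON_KG_PER_KWH

print(f"{carbon_per_hour(DEVICE_WATTS):.2f} kg of carbon per hour")  # ~0.13
```

Swap DEVICE_WATTS for 150 (PC + monitor), 40 (laptop), or 10 (phone) from the list above to see how little the smaller devices add up to.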
whyNotSquirrel@sh.itjust.works 6 months ago
I don’t think the problem comes from the client devices.
I really don’t see how broadcasting could consume the same amount of energy as downloading. When you broadcast something, it doesn’t matter how many clients are watching the content, but for streaming or downloading, the more clients there are, the more energy you need to support the load.
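To illustrate the scaling argument with a toy sketch (the transmitter power and per-viewer figures below are made up purely for illustration, not real measurements): a broadcast transmitter draws roughly the same power whether one person or a million people tune in, while unicast streaming energy grows linearly with the audience.

```python
# Toy comparison of broadcast vs. unicast scaling; illustrative numbers only.
BROADCAST_TRANSMITTER_KW = 10.0    # hypothetical transmitter power, independent of audience
STREAMING_WH_PER_VIEWER_HOUR = 5   # hypothetical server + network energy per viewer-hour

def broadcast_kwh(viewers: int, hours: float = 1.0) -> float:
    # Broadcast cost does not depend on how many people are watching.
    return BROADCAST_TRANSMITTER_KW * hours

def streaming_kwh(viewers: int, hours: float = 1.0) -> float:
    # Unicast streaming cost scales with the number of viewers.
    return viewers * STREAMING_WH_PER_VIEWER_HOUR * hours / 1000

for n in (1_000, 100_000, 10_000_000):
    print(f"{n:>10} viewers: broadcast {broadcast_kwh(n):.0f} kWh, streaming {streaming_kwh(n):.0f} kWh")
```

With these made-up numbers, streaming overtakes broadcast somewhere in the millions of simultaneous viewers, which is the crossover the whole comparison hinges on.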