What is the idea behind 144? It seems too particular a number to be arbitrary. 24, 60 and 120 seem to be based on other technologies and related media.

Lojcs@lemm.ee 10 months ago

I think it’s because 120Hz + overclock can reach 144, so someone probably started selling factory-overclocked 120Hz panels at 144Hz and then it caught on. Then someone did the same to native 144Hz panels and we got 165Hz. I’m more curious about why 165 was chosen; it’s not a nice number like 144.
Jumi@lemmy.world 10 months ago
I honestly have no idea, but so far I’ve never really reached 144fps or 4K, much less both simultaneously.
Darthjaffacake@lemmy.world 10 months ago
I found people online saying it’s because it’s 24 frames (the standard movie frame rate) higher than 120, meaning it can be used to watch movies with integer scaling (a 1:6 frame-rate ratio rather than 1:5.5 or something strange). Take that with a massive grain of salt, though, because lots of people say there are other reasons.
Humanius@lemmy.world 10 months ago
If consuming media with integer scaling were the main concern, then 120Hz would be better than 144Hz: it divides by 5 to make 24Hz (for movies) and by 2 or 4 to make 60/30Hz (for TV shows).
144Hz only cleanly divides into 24Hz, by dividing by 6. To get to 60Hz you would need to divide by 2.4, which is not an integer.
And with either refresh rate, 25/50Hz PAL content still isn’t divisible by a nice round integer.
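The divisibility argument above is easy to check. A minimal sketch in plain Python (nothing here is specific to any particular display, just the arithmetic from the comment):

```python
# Check which common content frame rates divide evenly into a
# 120Hz vs. 144Hz display refresh rate.
content_rates = [24, 25, 30, 50, 60]  # film, PAL, and NTSC-derived rates

for refresh in (120, 144):
    for fps in content_rates:
        ratio = refresh / fps
        kind = "integer" if refresh % fps == 0 else "fractional"
        print(f"{refresh}Hz / {fps}fps = {ratio:g} ({kind})")
```

Running this shows 120Hz dividing evenly by 24, 30 and 60, while 144Hz only divides evenly by 24, and neither handles 25/50Hz PAL content cleanly.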
Darthjaffacake@lemmy.world 10 months ago
Yeah, as I said, take what I said with a massive grain of salt. Some people are saying it’s because of a limit on how much data HDMI can send, so it could be that.
Squizzy@lemmy.world 10 months ago
Oh man, those maths didn’t click with me. Of course, it’s just another 24 frames.
Darthjaffacake@lemmy.world 10 months ago
Me neither, to be honest. 24 is kind of a weird number to be adding up.