Yeah, legitimate 8K use cases are ridiculously niche, and I mean… really only have value if you’re talking about an utterly massive display, probably around 90 inches or larger, and even then in a pretty small room.
The best use cases I can think of are for games where you’re already using DLSS, and can just upscale from the same source resolution to 8K rather than 4K? Maybe something like an advanced CRT filter that can better emulate a real CRT with more resolution to work with, where a pixel art game leaves you with lots of headroom for that effect? Maybe there’s value in something like an emulated split screen game, to effectively give 4 players their own 4K TV in an N64 game or something?
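(To make the DLSS point concrete, here's a quick sketch using the commonly cited render scales — the exact factors are my assumption, NVIDIA adjusts them per mode and per title. The takeaway is that 8K Ultra Performance renders internally at roughly the same ~1440p as 4K Quality.)

```python
# Back-of-the-envelope: internal render resolution for a few DLSS modes.
# The scale factors below are the commonly cited ones -- treat them as assumptions.
modes = {"Quality": 2 / 3, "Performance": 1 / 2, "Ultra Performance": 1 / 3}
targets = {"4K": (3840, 2160), "8K": (7680, 4320)}

for target, (w, h) in targets.items():
    for mode, scale in modes.items():
        print(f"{target} {mode}: ~{round(w * scale)}x{round(h * scale)} internal")
# 4K Quality and 8K Ultra Performance both land around 2560x1440 internally.
```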
But uh… yeah, all use cases that are far from the average consumer. Most people I talk to don’t even really appreciate 1080p->4K, and 4X-ing your resolution again is a massive processing power ask in a world where you can’t just… throw together multiple GPUs in SLI or something. Even if money is no object, 8K in mainline gaming will require some ugly tradeoffs for the next several years, and probably even forever if devs keep pushing visuals and targeting upscaled 4K 30/60 on the latest consoles.
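For anyone who hasn't done the raw-pixel math on what "4X-ing again" actually means per frame:

```python
# Pixel counts per frame -- each step is roughly 4x the pixels to shade.
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} million pixels")
# 1080p ~2.1 MP, 4K ~8.3 MP, 8K ~33.2 MP
```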
Illecors@lemmy.cafe 3 days ago
A couple of things - every jump like that in resolution is about a 10% increase in size at the source level. So 2K is ~250GB, 4K is ~275GB. Haven’t had to deal with 8K myself yet, but it would land around ~300GB. And then you compress all that for places like Netflix and the size goes down drastically. Add to that codec improvements over time (like x264 -> x265) and you might actually end up with an identical compressed size while carrying 4x more pixels.
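(Rough sketch of how the compressed side works out: size is basically bitrate × runtime. The bitrates below are made-up ballpark streaming figures, not anything Netflix has published.)

```python
# Compressed size = bitrate * runtime. Bitrates are illustrative guesses only.
def size_gb(bitrate_mbps: float, hours: float) -> float:
    return bitrate_mbps * hours * 3600 / 8 / 1000  # Mbit/s over N hours -> GB

runtime_hours = 2.0
examples = {
    "1080p stream, older codec (~8 Mbps)": 8,
    "4K stream, newer codec (~16 Mbps)": 16,
}
for label, mbps in examples.items():
    print(f"{label}: ~{size_gb(mbps, runtime_hours):.1f} GB")
# 4x the pixels, but a more efficient codec means nowhere near 4x the file size.
```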
HDMI is digital. It doesn’t start failing because of increased bandwidth; there’s nothing consumable. It either works or it doesn’t.
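(If anyone's curious why the bandwidth question comes up at all, quick back-of-the-envelope for the raw pixel payload, ignoring blanking and link overhead:)

```python
# Raw video payload: width * height * fps * bits per pixel (10-bit RGB assumed).
def payload_gbps(w: int, h: int, fps: int, bits_per_channel: int = 10) -> float:
    return w * h * fps * bits_per_channel * 3 / 1e9

print(f"4K60: ~{payload_gbps(3840, 2160, 60):.1f} Gb/s")  # ~14.9
print(f"8K60: ~{payload_gbps(7680, 4320, 60):.1f} Gb/s")  # ~59.7
# 8K60 at 10-bit overshoots even HDMI 2.1's 48 Gb/s link, hence DSC -- but the
# cable either trains at that rate or it doesn't; nothing wears out over time.
```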