Only press who previewed the RTX 5060 under Nvidia’s test conditions are getting review drivers, reports claim
Submitted 1 day ago by ZippyBot@lemmy.zip [bot] to gaming@lemmy.zip
LouNeko@lemmy.world 23 hours ago
I’m going to play a little off-topic devil’s advocate here.
I recently had to choose between a 5070 Ti and a 9070 XT for my new setup.
If I were to listen to the reviews, the general consensus would be that Nvidia’s 50XX series is a steaming pile of dogshit that will burn your house down and kill your dog.
Every single one of those reviews boils down to: AMD slightly faster but actually worse at everything. But because price lower number, AMD better. PS: Nvidia bad.
The RTX series is primarily a gaming card, and gaming performance is all that matters, not all the additional fluff that reviewers dish out to stretch a 7 min video into a 20 min video.
Nvidia has a clear R&D advantage for new technologies, and older titles barely matter in this discussion because everything from the pre-DLSS era is going to run at 200+ FPS at 4K anyway (with the exception of 32-bit PhysX titles, dick move, Nvidia).
Newer titles, on the other hand, are an absolute wildcard of performance because nobody cares about optimization anymore. I used a 1080 Ti for 8 years and even some of the newest titles still ran at around 60 FPS maxed out. But here’s the catch: the discrepancy between the highest and lowest settings has gotten minuscule, to the point where you’d get 60 FPS on lowest and 50 FPS on max settings with a 1080 Ti. My personal reason for an upgrade, though, was ray tracing and generative AI performance, both of which AMD sucks at.
“Don’t blame the Card, blame the Game.”
Nowadays performance is more about the rendering technique than the visual settings. And when it comes to technique, Nvidia is clearly the winner. Now that I have a reference between DLSS and FSR, I can confidently say that in 2 years AMD will be where Nvidia is now.
Let’s talk about price. People are still living in the fantasy that GPUs are going to cost 400€ again like it’s 2014 (which they didn’t even do back then). 800€ is the new baseline for a midrange GPU - that’s just how it is now. I paid around 900€ for a 1080 8 years ago, and now I paid around 900€ for a 5070 Ti and get about triple the performance. So what’s the problem?
I think there’s a small but obnoxiously loud minority of “must have the newest thing” babies that didn’t get the memo when we hit the technical limits of Moore’s Law about 10 years ago. You won’t get double the performance in 2 years for the same price. No reason to whine about it. Want to get your money’s worth? Keep your old GPU for another 2 years, there, problem solved.
Also, something not a single reviewer has ever mentioned is the significantly higher power draw that AMD has compared to Nvidia.
Let’s do a simple calculation:
Price for a 9070 XT: ~800€
Price for a 5070 Ti: ~900€
Difference in favor of AMD: 100€
Power draw 9070 XT: 300W
Power draw 5070 Ti: 250W
Difference in favor of Nvidia: 50W
(Based on tests of the same games with equal settings)
My regional price per 1 kWh is ~0.40€
100€ / 0.40€ per kWh = 250 kWh
250 kWh / 50 W = 5000 hours
My average game time per day is maybe 2-3 hours, make it 2.5 h.
5000 h / 2.5 h per day = 2000 days ≈ 5.5 years
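For anyone who wants to plug in their own numbers, here’s the same back-of-the-envelope math as a tiny Python snippet. The figures are just my regional electricity price and my own usage, not universal constants, so treat it as a sketch:

```python
# Break-even sketch for the price-vs-power-draw comparison above.
# All inputs are my own numbers; swap in your own prices and habits.

gpu_price_delta_eur = 100       # 9070 XT (~800€) vs 5070 Ti (~900€)
power_delta_w = 50              # ~300 W vs ~250 W in the same games/settings
electricity_eur_per_kwh = 0.40  # my regional price per kWh
gaming_hours_per_day = 2.5      # my average play time

# How much extra energy the cheaper card can burn before its 100€
# price advantage is eaten by the electricity bill.
break_even_kwh = gpu_price_delta_eur / electricity_eur_per_kwh    # 250 kWh
break_even_hours = break_even_kwh * 1000 / power_delta_w          # 5000 h
break_even_years = break_even_hours / gaming_hours_per_day / 365  # ~5.5 years

print(f"{break_even_kwh:.0f} kWh -> {break_even_hours:.0f} h -> {break_even_years:.1f} years")
```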
Since electricity prices are constantly on the rise, and I intend to keep my current setup for maybe another 8 years, the price difference between Nvidia and AMD doesn’t really matter in the long run.
The funny thing is I paid about 2200€ in 2017 for a high-end build that could play everything I threw at it, and now I paid 2000€ for another high-end build that can play everything I throw at it. People forget to mention that while GPU prices have gone up, everything else has gone down. RAM and HDDs/SSDs are dirt cheap nowadays.
While yes, Nvidia is a greedy and shitty company, let’s not pretend like their product doesn’t do what’s advertised - that is, playing games really well. And AMD is not a saint either; their GPUs aren’t exactly cheap.