Once the 9070 dropped, all arguments for Nvidia stopped being worth considering outside of very niche/fringe needs.
Comment on NVIDIA is full of shit
just_another_person@lemmy.world 3 days ago
My mind is still blown on why people are so interested in spending 2x the cost of the entire machine they are playing on AND a hefty power utility bill to run these awful products from Nvidia. Generational improvements are minor on the performance side, and fucking AWFUL on the product and efficiency side. You’d think people would have learned their lessons a decade ago.
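A rough sketch of the power-bill point: the board-power figures below are the vendors' published specs (RTX 5090 ~575 W TGP, RX 9070 XT ~304 W TBP), but the hours of gaming per week and the electricity price are illustrative assumptions, not numbers from this thread.

```python
# Back-of-envelope yearly electricity cost under load.
# 575 W and 304 W are vendor board-power specs; 20 h/week and
# $0.15/kWh are assumed values, adjust for your own usage/rates.

def annual_energy_cost(watts: float, hours_per_week: float, price_per_kwh: float) -> float:
    """Yearly electricity cost for a part drawing `watts` while gaming."""
    kwh_per_year = watts * hours_per_week * 52 / 1000
    return kwh_per_year * price_per_kwh

HOURS_PER_WEEK = 20   # assumption
PRICE_PER_KWH = 0.15  # assumption, $/kWh

cost_5090 = annual_energy_cost(575, HOURS_PER_WEEK, PRICE_PER_KWH)
cost_9070xt = annual_energy_cost(304, HOURS_PER_WEEK, PRICE_PER_KWH)
print(f"5090: ${cost_5090:.2f}/yr, 9070 XT: ${cost_9070xt:.2f}/yr, "
      f"delta: ${cost_5090 - cost_9070xt:.2f}/yr")
```

Under these assumptions the gap is a few tens of dollars a year; heavier use or pricier electricity scales it linearly.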
RazgrizOne@piefed.zip 3 days ago
CheeseNoodle@lemmy.world 3 days ago
Got my 9070XT at retail (well, retail + VAT, but that's retail for my country) and my entire PC costs less than a 5090.
RazgrizOne@piefed.zip 3 days ago
Yeah I got a 9070 + 9800x3d for around $1100 all-in. Couldn’t be happier with the performance. Expedition 33 running max settings at 80-90fps
FreedomAdvocate@lemmy.net.au 3 days ago
But your performance isn’t even close to that of a 5090…
tormeh@discuss.tchncs.de 3 days ago
If you’re on Windows it’s hard to recommend anything else. Nvidia has DLSS supported in basically every game. For recent games there’s the new transformer DLSS. Add to that ray reconstruction, superior ray tracing, and a steady stream of new features. That’s the state of the art, and if you want it you gotta pay Nvidia. AMD is about 4 years behind Nvidia in terms of features. Intel is not much better. The people who really care about advancements in graphics and derive joy from that are all going to buy Nvidia because there’s no competition.
just_another_person@lemmy.world 3 days ago
First, DLSS is supported on Linux.
Second, DLSS is kinda bullshit. The article goes into details that are fairly accurate.
Lastly, AMD is at parity with Nvidia with features. You can see my other comments, but AMD’s goal isn’t selling cards for gamers. Especially ones that require an entire dedicated PSU to power them.
JTskulk@lemmy.world 3 days ago
Don’t you mean NVidia’s goal isn’t selling cards for gamers?
just_another_person@lemmy.world 3 days ago
No. AMD. See my other comments in this thread. Though they are in every major gaming console, the bulk of AMD sales are aimed at the datacenter.
FreedomAdvocate@lemmy.net.au 3 days ago
Nvidia cards don’t require their own dedicated PSU, what on earth are you talking about?
just_another_person@lemmy.world 3 days ago
Low rent comment.
First: corsair.com/…/rtx-5090-5080-and-5070-series-gpus-…
Second: you apparently are unaware, so just search up the phrase, but as this article very clearly explains…it’s shit. It’s not innovative, interesting, or improving performance; it’s a marketing scam. Games would run better and more efficiently if you just lowered the requirements. It’s like saying you want food to taste better, and then they serve you a vegan version of it. AMD’s version is technically more useful, but it’s still a dumb trick.
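The "just lower the requirements" point is essentially what upscalers do internally: DLSS/FSR render at a fraction of the output resolution and upscale. The per-axis scale factors below are Nvidia's commonly published DLSS quality-mode values (FSR's presets are near-identical); this is an illustrative sketch, not anything from either vendor's SDK.

```python
# Internal render resolution per upscaler quality mode.
# Scale factors are the published per-axis DLSS values (assumption
# that they still hold for the current SDK release).

SCALE = {
    "quality": 0.667,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Resolution the GPU actually renders before upscaling to the output size."""
    s = SCALE[mode]
    return (round(out_w * s), round(out_h * s))

# At 4K output, "Performance" mode renders only a quarter of the pixels:
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

So a "4K DLSS Performance" frame is a 1080p render plus reconstruction, which is why both sides of this argument can point at the same feature and call it either free performance or a lowered requirement.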
Static_Rocket@lemmy.world 3 days ago
Well, to be fair, the 10 series was actually an impressive improvement over what was available. I’ve since switched to AMD for better SW support, and I know the improvements have dwindled since then.
just_another_person@lemmy.world 3 days ago
AMD is at least running the smart game on their hardware releases with generational leaps instead of just jacking up power requirements and clock speeds as Nvidia does. Hell, even Nvidia’s latest lines of Jetson are just recooked versions from years ago.
FreedomAdvocate@lemmy.net.au 1 day ago
AMD is at least running the smart game on their hardware releases with generational leaps instead of just jacking up power requirements and clock speeds as Nvidia does.
AMD could only do that because they were so far behind. GPU manufacturers, at least nvidia, are approaching the limits of what they can do with current fabrication technology other than simply throwing “more” at it. Without a breakthrough in tech all they can really do is jack up power requirements and clock speeds. AMD will be there soon too.
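The "all they can really do is jack up power and clocks" point has a textbook basis. Dynamic switching power in CMOS scales roughly as

```latex
P_{\text{dyn}} \approx \alpha \, C \, V^{2} f
```

where $\alpha$ is the activity factor, $C$ the switched capacitance, $V$ the supply voltage, and $f$ the clock frequency. Since reaching higher clocks generally also requires raising voltage, power grows much faster than linearly with frequency, which is why clock-chasing without a process-node or architecture win shows up directly in power draw.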
Shizu@lemmy.world 3 days ago
Cause numbers go brrrrrrrrr
eager_eagle@lemmy.world 3 days ago
they pay because AMD (or any other for that matter) has no product to compete with a 5080 or 5090
just_another_person@lemmy.world 3 days ago
Because they choose not to go full idiot, though. They could make their top-line cards compete if they slammed enough into the pipeline and required a dedicated PSU, but that’s not where their product line intends to go. That’s why it’s smart.
For reference: AMD has the most deployed GPUs on the planet as of right now. There’s a reason why it’s in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn’t just be making a product that churns out results at the cost of everything else; it should be to be cost-effective and efficient. Nvidia fails at that on every level.
eager_eagle@lemmy.world 3 days ago
this openai partnership really stands out, because the server world is dominated by nvidia, even more than in consumer cards.
SheeEttin@lemmy.zip 3 days ago
Yup. You want a server? Dell just plain doesn’t offer anything but Nvidia cards. You want to build your own? The GPGPU stuff like ZLUDA is brand new and not really supported by anyone. If you want to participate in the development community, you buy Nvidia and use CUDA.
just_another_person@lemmy.world 3 days ago
Actually…not true. Nvidia recently became bigger in the DC because of their terrible inference cards being bought up, but AMD overtook Intel on chips with all major cloud platforms last year, and their Xilinx chips are slowly overtaking the sales of regular CPUs for special-purpose processing. By the end of this year, I bet AMD will be the most deployed brand in datacenters globally. FPGA is the only path forward in the architecture world at this point for speed and efficiency in single-purpose processing. Nvidia doesn’t have a competing product.
Ulrich@feddit.org 3 days ago
Then why does Nvidia have so much more money?
iopq@lemmy.world 20 hours ago
Because of vendor lock in
just_another_person@lemmy.world 3 days ago
See the title of this very post you’re responding to. No, I’m not OP lolz
ctrl_alt_esc@lemmy.ml 3 days ago
Unfortunately, this partnership with OpenAI means they’ve sided with evil and I won’t spend a cent on their products anymore.
IronKrill@lemmy.ca 3 days ago
Enjoy never using a computer again, I guess?
Chronographs@lemmy.zip 3 days ago
That’s exactly it, they have no competition at the high end
Naz@sh.itjust.works 3 days ago
I have overclocked my AMD 7900XTX as far as it will go on air alone.
Undervolted every step on the frequency curve, cranked up the power, 100% fan duty cycles.
At its absolute best, it’s competitive or trades blows with the 4090D, and is 6% slower than the RTX 4090 Founder’s Edition (the slowest of the stock 4090 lineup).
The fastest AMD card is equivalent to a 4080 Super, and the next gen hasn’t shown anything new.
AMD needs a 5090-killer. Dual socket or whatever monstrosity which pulls 800W, but it needs to slap that greenbo with at least a 20-50% lead in frame rates across all titles, including raytraced. Then we’ll see some serious price cuts and competition.
bilb@lemmy.ml 3 days ago
And/or Intel. (I can dream, right?)
Pirate@feddit.org 3 days ago
What do you even need those graphics cards for?
Even the best games don’t require those, and if they did, I wouldn’t be interested in them, especially if it’s an online game.
Probably only a couple people would be playing said game with me.