I’ll give a different perspective on what you said: DX12 basically moved half of the complexity that would normally be managed by the driver onto the game / engine devs, who already have too much to do: making the game. The idea is that “the game dev knows best how to optimize for their specific usage”, but in reality the game dev has no time to deal with hardware complexity, and this is the result.
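To make that concrete: one of the responsibilities DX12/Vulkan push onto the app is resource-state tracking and barriers, which older APIs handled in the driver. Here is a toy Python sketch of the difference. This is not real D3D code; the class and state names are purely illustrative.

```python
class Dx11StyleDriver:
    """DX11-style model: the driver tracks each resource's state and
    silently inserts transition barriers on the app's behalf."""
    def __init__(self):
        self.states = {}           # resource name -> current state
        self.barriers_emitted = []

    def use(self, resource, needed_state):
        current = self.states.get(resource, "COMMON")
        if current != needed_state:
            # The driver does the bookkeeping the app never sees.
            self.barriers_emitted.append((resource, current, needed_state))
            self.states[resource] = needed_state


class Dx12StyleApp:
    """DX12-style model: the API records exactly what the app declares.
    A forgotten transition is now the app's bug, not the driver's job."""
    def __init__(self):
        self.states = {}
        self.barriers_emitted = []

    def transition(self, resource, before, after):
        # The app must know the 'before' state itself; mistracking it
        # is exactly the class of bug DX12 engines have to hunt down.
        assert self.states.get(resource, "COMMON") == before, "app mistracked state"
        self.barriers_emitted.append((resource, before, after))
        self.states[resource] = after

    def use(self, resource, needed_state):
        # No safety net: on real hardware a wrong state means
        # corruption or a GPU hang, not a helpful assert.
        assert self.states.get(resource, "COMMON") == needed_state


# Old model: just use the texture; the driver inserts both barriers.
driver = Dx11StyleDriver()
driver.use("shadow_map", "RENDER_TARGET")
driver.use("shadow_map", "SHADER_RESOURCE")

# New model: the engine must declare every transition explicitly.
app = Dx12StyleApp()
app.transition("shadow_map", "COMMON", "RENDER_TARGET")
app.use("shadow_map", "RENDER_TARGET")
app.transition("shadow_map", "RENDER_TARGET", "SHADER_RESOURCE")
app.use("shadow_map", "SHADER_RESOURCE")
```

Same two barriers either way; the difference is whose code has to get them right, which is the point of the comment above.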
Comment on Open source community figures out problems with performance in Starfield
Blackmist@feddit.uk 1 year ago
This is how games and drivers have been for decades.
There are huge teams at AMD and Nvidia whose job it is to fix shit game code in the drivers. That’s why (a) they’re massive and (b) you need new drivers all the time if you play new games.
I read an excellent post a while ago here, by Promit.
www.gamedev.net/forums/topic/…/5215019/
It’s interesting to see that in the 8 years since he wrote it, the SLI/Crossfire solution has simply been to completely abandon it, and that we still seem to be stuck in the same position for DX12. Your average game devs still have little idea how to get the best performance from the hardware, and hardware vendors are still patching things under the hood so they don’t look bad on benchmarks.
mattreb@feddit.it 1 year ago
frododouchebaggins@lemmy.world 1 year ago
Yes they do. We know they do because current-gen consoles are frequently providing better fidelity and better stability than PC games. Not because PCs have inferior hardware, but because optimization is actually incredibly hard when your customer base is all running different hardware AND different drivers. So even when the hardware is “the same”, it’s not.
This has been true forever. It just took 30 years for high performance computing to be affordable enough to put in consoles. 30 years was a long time for PC gamers to feel superior. Now they enjoy humble pie and make comments like this on the internet to explain why things are so “bad”.
PC games are still great. Don’t let this bother you more than it should.
stonedemoman@lemmy.world 1 year ago
To attribute this most recent failure to an overabundance of hardware variety is a joke. This issue persists on all Nvidia and Intel cards. Why? Because it’s an oversight pertaining to the one thing they all have in common: their interaction with DirectX.
Let me repeat myself for the people in the back. The number of items they had to account for with this failure is one. One driver.
emax_gomax@lemmy.world 1 year ago
This sounds more like hardware manufacturers haven’t provided a good enough abstraction layer across their devices, or they did (Vulkan) but everyone is just stuck on bad APIs that don’t properly map to the hardware’s abstractions. Or, even more likely, the publishers cheaped out and pushed something to release before it was ready, like they have forever.
Shadywack@lemmy.world 1 year ago
It’s also a lack of specialized talent. There’s lots of great talent at game studios and even middleware studios. There’s just not much great talent that deals with renderers and API development. The vast majority of devs just lean on the middleware developer to push out the renderer codebase. In a situation like Bethesda running their own studio engine, they just don’t have the right people for it. This plagued the ’90s, when people were trying to code for Glide, OGL, and DX5, 6, 7, 8, and 9. Many studios folded because they couldn’t get their tech to work with hardware acceleration.
uis@lemmy.world 1 year ago
*for current wages
Redditiscancer789@lemmy.world 1 year ago
Lol
Redredme@lemmy.world 1 year ago
PC gaming is and forever will be way better than games on consoles.
Why?
I’ve 3 letters for you.
R G B
( ͡° ͜ʖ ͡°)
tbf PC gaming was always a fight for performance. I never felt superior back in the day, fighting with QEMM, IRQs for the SoundBlaster, or Glide; it’s always been a shitshow. It was a super shitshow in the nineties, it was a bit better in the 2000s, and nowadays it has again become a tad better.
But somehow I enjoyed that shitshow. Still do.