You heard him 4090 users, upgrade to a more powerful GPU.
The missing part is that the complaining user with a 4090 had a CPU from 2017 🥴
Submitted 1 year ago by Edgelord_Of_Tomorrow@lemmy.world to games@lemmy.world
Yeah, I’m not buying that either. I’m on a 2014 i7 and a 3060 playing on ultra. My sole issue was not running on an SSD which I resolved yesterday. That kid is clearly playing on a potato and lying.
I’m shocked at how many PC users are still running HDDs given that SSDs have been standard on consoles for three years now.
Playing on ultra on a 3060? So you’re getting 20-30 fps? Because that’s what it gets on mine with a much newer CPU.
Considering that this thing runs great on a Series S (which has a decent CPU but a weak graphics card), that makes so much more sense.
Lol, dude used up all his money to get a GPU.
Gotta love the Bethesda fanboys upvoting this one cherry-picked comment. There are like 70 comments in there with all different combos of system specs complaining about performance.
People really need to put the nostalgia goggles down… back in the day, nobody played Crysis with full details and a steady framerate.
You were at 1024x768 with everything turned down just to play the game at barely 30 fps, and you know what, it was still dope as fuck.
Crysis was built by a company specialising in building a high-fidelity engine. It was, by all accounts, meant primarily as a tech demo. This is absolutely not the case with Starfield - first, the game doesn’t look nearly good enough for that compared to Crysis, and second, it’s built on an engine that simply can’t do a lot of the advanced stuff.
The game could be playable on max settings on many modern computers if it was optimised properly. It isn’t.
complains about others wearing nostalgia goggles
calls Crysis dope
After all this time I don’t think I’ve ever heard anything about how Crysis plays or what the story is and such. People only talk about how hard it was to run and how fancy the graphics were. Doesn’t make it sound all that great.
Wtf Crysis 1 was awesome… At least the first part without the aliens… And not because of the graphics
Except this time, even at 1024x768 and lowest settings, you can barely break 60 FPS due to the huge CPU overhead.
And that’s with a Ryzen 7 5800X.
I have the same processor and no issues. 1440p, 80-125 fps, high details, 100% resolution scale with FSR2.
It’s system by system. I have the same CPU and do fairly well, admittedly with it boosting to 4.5 GHz. My wife has the same CPU and it struggles on her machine. It feels like the game just wasn’t tested well.
There will always be that game that pushes the boundary between current gen and next gen. Sometimes even more. Crysis is the perfect example from the past. Starfield seems to do a decent job right now, even if it’s probably not even close to what Crysis did. When people spend a lot of money, they feel entitled; that’s only natural. No one did anything wrong, so no need to point a finger anywhere.
Please explain in detail how Starfield is pushing the edge graphically in any way that’s comparable to Crysis.
But it didn’t though, it looks shit and hogs more resources compared to other games like Cyberpunk, which is probably a better example of next-gen graphics.
I don’t know what your problem is, guys. When Skyrim was released, NVIDIA had the GTX 5xx series. Skyrim barely ran at 40 FPS on Ultra at 1080p on a GTX 560. Today, according to Gamers Nexus, Starfield runs at 60 FPS average on an RTX 4060.
So, Starfield is better optimised than Skyrim was. Go buy a new GPU.
www.youtube.com/watch?v=uCGD9dT12C0
Get a new game engine, Todd. Bethesda owns id Software. id Tech is right there.
Exactly this. It was only two generations ago that id Tech was an open-world engine. id can and has made it do whatever they want, and to suggest that despite Bethesda money (let alone MICROSOFT money) id couldn’t make a better engine with development workflows similar to Creation’s is just dishonest.
It’s a shame idTech is no longer released publicly. It would’ve been amazing to see what people could do with the beast of an engine that powered DOOM Eternal, especially modders.
I assume you’re talking about Rage, which had an open-world map, but nowhere near the level of simulation systems of a Bethesda game. In fact, I remember most of us saying back at the time that the map was pointless, as it was just a way to travel between levels with nothing to do in it.
You are completely talking out of your ass.
You realise custom engines are built for specific game types, right? id Tech is great for creating high-fidelity FPS games with linear levels and little environment interactivity. That’s not what Bethesda make, though.
They could do everything they usually do but better if they used Unreal. They don’t need a custom engine.
id Tech is nowhere near flexible enough for something like Starfield or even Skyrim. That inflexibility is partially why it’s so efficient. It simply isn’t fit for the task.
And the Bethesda developers are intimately familiar with Creation Engine, achieving the same level of productivity with something new will take a long time. Switching the engine is not an easy thing.
Not to say that Creation Engine isn’t a cumbersome mess. It has pretty awful performance and stability and is full of bugs, but on the other hand it’s extremely flexible, which has allowed its games to have massive mod communities.
If Bethesda can’t take the time to do it then who can? People act like they’re some small time developer but they’re not. They simply refuse to expand their dev team to do things like a redesign.
Creation engine is not going to hold up well for another 6 years, there’s no way their cell loading system will be considered acceptable by the time ES6 comes out. The amount of loading screens in Starfield is insane for a modern game. This company needs new talent badly.
I know they don’t want to switch, but it would be worth it to make the swap to something like Unreal, even if it takes a few years of customization to get the open-world stuff right. Creation Engine just feels so old.
Yeah, we optimized. We didn’t do it well, but it happened!
I’m a game developer and I’m ashamed by this.
When chip production halts because of the climate, you’ll see programmers optimizing their code again.
Jeez I hope this economy crashes.
Do you guys not have better PCs?
I understood that reference.
I have a 3080 ti, and a 12700k, and 32 gigs of ddr5, and a 2 terabyte ssd. It runs great for me. I don’t understand the problem. /s
So, this system runs it fine? Good to know. I was worried that my computer would not be able to run it smoothly, but now no worries at all.
I've got an 8086K and 3080, running on a 4K screen - with Ultra settings and FSR at 80% I'm getting 35-40fps, which honestly doesn't feel too bad (quick math on the render scale below). It's noticeable sometimes but it's a lot smoother than the numbers suggest.
Because my CPU is a little long in the tooth, I've probably gone a bit hard on the visuals, but my framerate didn't improve much when I lowered them. The engine itself has never really liked going past 60fps, so I don't know why people expected to be able to run 100+ frames at 4K or something.
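For anyone curious, here's the quick arithmetic on what a render scale like that means, assuming (as is typical for FSR-style upscalers) the percentage is a linear per-axis scale on the output resolution:

```cpp
#include <cstdio>

// Quick arithmetic on "FSR at 80%": the game renders internally at 80%
// of the output resolution per axis, then upscales to the screen.
int main() {
    const int outW = 3840, outH = 2160;  // 4K output
    const double scale = 0.80;           // 80% render scale
    const int inW = static_cast<int>(outW * scale);
    const int inH = static_cast<int>(outH * scale);
    std::printf("internal render target: %dx%d (%.0f%% of the pixels)\n",
                inW, inH, scale * scale * 100.0);
    // -> internal render target: 3072x1728 (64% of the pixels)
}
```

So at 80% the GPU is only shading about two-thirds of the pixels, which is why the numbers feel smoother than raw 4K would suggest.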
I've got that but with a 4080 - no issues.
I admittedly feel like I went full retard on my build and seriously hope these specs aren't what's necessary...
You had me in the first sentence, and then I realized it was sarcasm. 🤪 I’m running a similar rig, but it’s primarily for rendering work, etc., so for juuust a second there, I wondered if it was falling behind. 😅🤓
Honestly, what do you expect someone to say when asked a question like that? There’s no answer there.
“We have worked a lot on PC performance. We wanted to reach performance parity with consoles at release on similar hardware, and we achieved that. However, our teams will continue working on improvements and integrating technologies like FSR and DLSS in the future.”
Umm… honesty. Games used to run on the bleeding edge of performance. Not Bethesda games, but just games in general. Now they release half-broken blatant cash grabs and think no one’s gonna call them out for it.
They don’t think that. They just know that the people will pay up anyway, bringing in the profits for shareholders and the C-suite, and that’s all that matters.
The DLCs, cosmetics, MTX, etc. are all pretty much alive and well despite everything, just because enough people shell out, so why change their ways?
AAA gaming is a big industry, and big industries are anything but wholesome.
Seriously? Just say that we’re always trying to optimize our games and we’ll continue working on it. It’s such an easy question to tackle. I refuse to believe you can’t see that.
That’s not an answer that people would have accepted either and no matter what answer was said, it would have been dissected and criticized by the syllable.
The point I’m trying to make here is that “optimize your game” doesn’t help anybody. Especially not as an interview question. You might as well have asked “why didn’t you make your game fun?”
Runs great on my 5000 series AMD CPU and 3000 series Nvidia GPU; those came out 2 years ago now, and that’s averaging about 50fps on a 4K monitor.
If that isn’t optimized, idk what is. Yes, it was high-end stuff 2 years ago, but now it’s solidly mid-range.
People are so damn entitled. There used to be a time in PC gaming where if you were more than a year out of date, you’d have to scale down to windowed 640x480. If you want “ultra” settings you need an “ultra” PC, which means flipping out parts every few years. Otherwise be content with High settings at 1080p, a very valid option.
I mean, this was also before video cards cost as much as some used cars or more than a month’s rent for some people.
I’m not saying it’s not an expensive hobby, it is. PC gaming on ultra is an incredibly expensive hobby. But that’s the price of the hobby. Saying that a game isn’t optimized because it doesn’t run ultra settings on hardware that came out 4+ years ago is nothing new, and to me it’s a weird thing to demand. If you want ultra, you pay ultra prices. If you don’t want to/can’t, that’s 100% acceptable, but then just be content to play on High settings, maybe 1080p.
If PC gaming is too expensive in general that’s why consoles exist. You get a pretty great experience on a piece of hardware that’s only a few hundred dollars.
yea idk if used cars or rent are good comparisons.
People are entitled because they don’t want to spend thousands of dollars on components only for them to be outdated within a fraction of the lifecycle of a console?
How about all the people that have the minimum or recommended specs and still can’t run the game without constant stuttering? I meet the recommended specs and I’m playing on low everything with upscaling turned on and my game turns into a laggy mess and runs at 15fps if I have the gall to use the pause menu in a populated area. I shouldn’t have to save and reload the game just to get it to run smoothly.
Bethesda either lied about the minimum/recommended requirements or they lied about optimization. Let’s not forget about their history of janky PC releases, dating back to Oblivion, which was 6 games and 17 versions of Skyrim ago.
And no one is saying they have to; that’s my point that keeps getting overlooked. If someone wants to play at sick 4K 120fps, that’s awesome, but you’re going to pay a premium for that. If people are upset because they can’t play ultra settings on hardware that came out 5 years ago, to me that’s snobby behavior. The choice is either pay up for top-of-the-line hardware, or be happy with medium settings and maybe go back in a few years and play it on ultra.
If the game doesn’t play at all on lower hardware (like Cyberpunk on release), then that is not fair and needs to be addressed. That game plain did not work on lower-end hardware, and that’s not fair at all; it wasn’t about how well it played, it’s that it didn’t play.
Consoles don’t even last their whole lifetime anymore; both machines required Pro models to keep up with performance last gen, and rumour has it Sony is gearing up for one this gen too.
You’re missing the point.
There are a lot of games that look much better AND run much better.
It’s not about how often you upgrade.
I mean, yeah, but also by what metric? There are a thousand things that can affect performance, and not just what we see. We know Starfield has a massive drive footprint, so most everything is probably high-end textures, shaders, etc. Then the world sizes themselves are large. I don’t know, how do you directly compare two games that look alike? Red Dead 2 still looks amazing, but at 5 years old it’s already starting to show its age; then again it had a fixed map size, and it got away with a few things, etc. etc. Every game is going to have differences.
My ultimate point is that you can’t expect to get ultra settings on a brand new game unless you’re actively keeping up on hardware. There are no rules saying you have to play on 4K ultra settings, and people getting upset about that are nuts to me. It’s a brand new game; my original comment was me saying that I’m surprised it runs as well as it does on last-generation hardware.
I played Borderlands 1 on my old ATI card back in 2009 in windowed mode, at 800x600, on Low settings. My card was a few years old and that’s the best I could do, but I loved it. The expectation that a brand new game has to work flawlessly on older hardware is a new phenomenon to me, it’s definitely not how we got started in PC gaming.
I have an AMD 3800X and an RTX 2070 and I am barely seeing 30fps on the lowest settings at 1080p and 1440p.
DOOM Eternal runs just fine at 144fps on High and looks miles better.
It’s just not optimised.
Doom Eternal also came out 3.5 years ago now, and your card is nearly 5 years old. That’s the performance I would expect from a card that old playing a brand new game that was meant to be a stretch.
I’m sorry, but this is how PC gaming works. Brand new cards are really only awesome for about a year, then good for a few years after that, then you start getting new releases that make you think it’s about time. I’ve had the 3000 series, the 1000 series, before that I was an ATI guy with a Sapphire card, and before that the ATI 5000 series. It’s just how it goes in PC gaming; this is nothing new.
Curious if you can name one thing Starfield is doing that wasn’t possible in a game from 2017.
I mean, there isn’t one thing you can point to and say “ah ha, that’s causing all the lag”; things just take up more space, more compute power, more memory as they grow. As hardware capabilities grow, software will find a way to utilize them. But if you want a few things
I don’t know, what do you want? Like a list of everything that’s happened since then? Entire engines have come and gone in that time. Engines we used back then are on at least a new version compared to then, Starfield included. I mean, I don’t understand what you’re asking, because to me it comes off as “Yeah, well, Unreal 5 has the same settings as 4 to me, so it’s basically the same.”
I’m running it on a Ryzen 1600 AF and a 1070. NOT a Ti. 1440p at 66% resolution. Mix of mostly low, some medium. 100% GPU and 45% CPU usage. 30 fps solid in cities. I won’t complain at all. I’m just happy it runs solidly at all under minimum spec.
This is a great way to view it, and I think you’re getting excellent specs for that card. Kudos to you for getting it running!
Why do people use “entitled” like it’s a bad thing? Why wouldn’t consumers be entitled, as opposed to spending money as though it’s an act of charity? Pretty weird how the mindset of gamers has shifted over the years to the point that the fact that they are consumers has been forgotten.
I say entitled because gamers should just be happy, be happy with the hardware you have even if it can’t put out 4k, turn off the FPS counter, play the game. If you’re enjoying it, who cares if it occasionally dips down to 55? The entitlement comes from expecting game makers to produce games that run flawlessly at ultra settings on hardware that’s several years old. If you want that luxury, you have to spend a shitload of money on the top of the line gear, otherwise just be happy with your rig.
I'm running it on a Ryzen 5 2600 and an RX 570, and it seems to run relatively well other than CTD every hour or so.
Runs great on my 5000 series AMD CPU and 3000 series Nvidia GPU
Just specifying the series doesn’t really say much. Based on that and the release year you could be running a 5600X and RTX3060 or you could be running a 5950X and RTX3090. There’s something like a ~2.5x performance gap between those.
PC gamers enjoyed a bit of a respite from constantly needing to upgrade during the PS4/Xbone era. Those machines were fairly low end even at launch and with them being the primary development formats for most games, it was easy to optimize PC ports even on old hardware.
Then the new consoles came out that were a genuine jump in tech, as consoles used to be, and now PCs need to be upgraded to keep up, and people who got used to the last decade on PC are upset they can’t rock the same hardware for years anymore.
"We optimized it for the very high end of computers. The issue is your wallet."Kek mf’ing w
Given my experience playing the game on an unsupported GPU and still getting a solid 60 fps as long as no NPCs are in the vicinity, I don’t think it’s the GPU side of things that needs optimization. It’s whatever uses the CPU.
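That tracks with how a naive per-frame simulation loop behaves: the cost scales with the number of active NPCs no matter how fast the GPU is. A toy sketch of the effect (hypothetical, nothing like Bethesda's actual code):

```cpp
#include <chrono>
#include <cstdio>
#include <vector>

// Toy illustration of why crowds hammer the CPU: with a naive per-frame
// update, simulation cost grows linearly with NPC count, regardless of GPU.
struct Npc { float x = 0, y = 0; };

void updateNpc(Npc& n) {
    n.x += 0.001f;  // stand-in for pathfinding / AI / animation work
    n.y += 0.002f;
}

int main() {
    for (int count : {100, 10000, 1000000}) {
        std::vector<Npc> npcs(count);
        auto t0 = std::chrono::steady_clock::now();
        for (auto& n : npcs) updateNpc(n);
        auto dt = std::chrono::steady_clock::now() - t0;
        long long us =
            std::chrono::duration_cast<std::chrono::microseconds>(dt).count();
        // npcs[0].x is printed so the optimizer can't drop the loop entirely.
        std::printf("%8d NPCs -> %6lld us per frame (x=%.3f)\n",
                    count, us, npcs[0].x);
    }
}
```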
If there’s an Xbox One version, then there’s really no excuse for it not to load on a PC with similar or better CPU/memory/graphics specs.
First game to have constant crashes on my seven-year-old RX 480, which is great since otherwise the game runs completely fine. Support doesn’t seem to want my crash reports either; I guess in Todd’s world I should just throw the thing in the trash for a game that does literally nothing special in the tech department.
Since negative opinions travel fast, I’m just gonna say my GPU is actually below the minimum requirements, though admittedly I upgraded my CPU last year. The game’s minimum is a GTX 1070 Ti; I just have a regular GTX 1070.
In my case, it’s doing a LOT of dynamic resolution and object blurring nonsense to get the game to run smoothly, but it does run smoothly. I get to see the character faces during conversations, I can see what I’m doing, there’s no hitching, etc. New Atlantis looks ugly, but that might change if I get a new GPU.
It’s perfectly optimized. I’m getting a rock solid 30fps. /s
Their idea of optimization on console was to cap the frame rate at 30, even on the Series X. So you can wonder what that means for PC.
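A 30 fps cap is just a fixed time budget per frame. Roughly this, as a minimal sketch of a generic limiter (not whatever Bethesda actually does):

```cpp
#include <chrono>
#include <thread>

// Minimal sketch of a 30 fps frame cap: each frame gets a ~33.3 ms budget,
// and the unused remainder is slept off before the next frame starts.
int main() {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::microseconds(1'000'000 / 30);
    auto next = clock::now() + budget;
    for (int frame = 0; frame < 300; ++frame) {
        // simulate + render would happen here
        std::this_thread::sleep_until(next);  // burn the leftover budget
        next += budget;
    }
}
```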
lol, no they didn’t. They didn’t even test adequately, more than a few GPUs that meet the requirements didn’t work when early access launched.
Damn, glad I didn’t buy it on day 1. Got Baldur’s Gate instead and am enjoying that.
That’s why I will never PC game. You spend thousands on your gaming PC and it can’t play a game that will come out in a year
Running on i5-8400, 3080ti. Runs good to great depending on whether I’m in New Atlantis.
Wish they had asked him why the console version isn’t optimized either, running at only 30.
WHERE IS MY CLIMBABLE LADDER, TODD?
I’ve got a mid-grade PC and haven’t had any issues except the potato people with weird speaking animations and that ugly green filter over everything. Used a mod to take care of the second thing. While I am enjoying the game quite a bit, I’ve long wished that Bethesda could up their art style and animations. Thank fuck they got more than 5 voice actors now.
Every time this dude opens his mouth, I get an urge to wear an eyepatch.
I have a 4060 Ti and it crashes constantly on low. Runs fine between crashes, so not a performance issue…
lustyargonian@lemm.ee 1 year ago
Damn, this is a pathetic response. He could’ve said “we’ve tried our best to make it as polished as possible before launch, and are working towards further optimising it to give you the best experience.” Even if they did jackshit, it would not come out as condescending and snarky.
Edgelord_Of_Tomorrow@lemmy.world 1 year ago
It’s not even about graphics alone.
They’re clearly building their games in an extremely inefficient way. Starfield does not have anything going on in it that other games with much lower requirements haven’t also done.
You see evidence of this in their previous games. One of the major performance issues with Fallout 4, for example, was that instead of building their cities in performant ways, they literally plonked every building as an individual asset into the world, which thrashed the CPU for no reason. Modders just had to merge them all into one model to significantly improve performance. Their games are full of things like this, and Starfield will be no different.
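The merge trick is basically draw-call batching: concatenate the static meshes so the renderer submits one combined buffer instead of thousands of individual ones. A minimal sketch of the idea (hypothetical Mesh/Vertex types, not actual Creation Engine code):

```cpp
#include <cstdio>
#include <vector>

// Hypothetical sketch of draw-call batching: merge many static meshes into
// one "precombined" mesh so the renderer issues 1 draw call instead of N.
struct Vertex { float x, y, z; };

struct Mesh {
    std::vector<Vertex> vertices;
    std::vector<unsigned> indices;
};

// Concatenate meshes; index values are offset so they still point at the
// right vertices inside the combined buffer.
Mesh combineStaticMeshes(const std::vector<Mesh>& meshes) {
    Mesh combined;
    for (const Mesh& m : meshes) {
        const unsigned base = static_cast<unsigned>(combined.vertices.size());
        combined.vertices.insert(combined.vertices.end(),
                                 m.vertices.begin(), m.vertices.end());
        for (unsigned idx : m.indices)
            combined.indices.push_back(base + idx);
    }
    return combined;
}

int main() {
    // 1000 separate buildings rendered individually = 1000 draw calls.
    std::vector<Mesh> buildings(
        1000, Mesh{{{0, 0, 0}, {1, 0, 0}, {0, 1, 0}}, {0, 1, 2}});
    Mesh city = combineStaticMeshes(buildings);
    // One mesh, one draw call, same geometry.
    std::printf("%zu verts, %zu indices in a single batch\n",
                city.vertices.size(), city.indices.size());
}
```

The trade-off is that a merged mesh can no longer be moved or edited piecemeal, which is exactly why mods that touch individual buildings have to break the combined version.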
Chailles@lemmy.world 1 year ago
Unless I’m completely mistaken here, modders didn’t combine the buildings together; that’s how they are by default. Mods, however, sometimes needed to break said system, which resulted in massively degraded performance.
Silverseren@kbin.social 1 year ago
Why would he? Todd hates everyone who plays his games and cares only about separating money from pockets. Fallout 76 made that quite clear to everyone.
MonkeyKhan@feddit.de 1 year ago
If he gave a standard appeasing PR statement without following it up at all, that would somehow be preferable? This may be snarky, but at least you know what to expect.
lustyargonian@lemm.ee 1 year ago
I mean, yeah, I guess this does help temper expectations that they’re done optimising, so maybe you’re right; being blunt is probably for the best.