this post was submitted on 07 Sep 2023
434 points (93.9% liked)
PC Master Race
I'll believe they actually optimized their PC port when their PC ports start showing some signs of effort at being actual PC ports
No FOV slider after FO76 had one is, to me, a great sign of how little Bethesda actually cares about the platform that keeps them popular (thanks to mods)
This is the mentality they want you to have. And it's a shit one. PCs should be able to run any game well when it comes out.
I'm gonna make a game where the graphics are basically the same at all presets, but with different filters. Crank the bloom and add a sepia filter, call it pro max or something.
“But this one goes to eleven”
The industry kind of did it to themselves. We had a really long period where 1080p was the default resolution and games didn't really try to push graphics at all. Things kind of plateaued post-Crysis for about 10 years before a game came out that I felt looked significantly better than Crysis did.
So a lot of people have gotten used to being able to hit ultra settings day 1 because their entire gaming life that's been possible.
On a 10 year old potato
If you’ve got a 5 year old pc, sorry you shouldn’t expect to play on max, let alone anything over medium.
People need to temper their expectations about what a PC can actually do over time.
We are talking modern hardware; nobody is expecting a 5 year old PC to run maxed-out games anywhere near as well as the latest hardware. People are just more and more willing to bend over and accept shit given to them. There's no reason Starfield couldn't be running better; they certainly had the capability at Bethesda to make it so.
Read the comment chain again, because you missed the person's original point…
They talk about old and modern hardware, you can’t just ignore half their point.
I think you are imagining modern hardware to just be like a 4090. "Modern hardware" here means current-generation GPUs/CPUs. They should be able to run at max settings, yes. The performance match-ups of low-to-mid range hardware of this generation overlap with mid-to-high of the last generation (and so on), so just talking years here doesn't really translate.
People holding on to their 1080tis obviously shouldn't be expecting max settings, but people with 3080s, 4070s, 6800XTs (even 2080ti/3070s), should absolutely be able to play on max settings. Especially games like Starfield that are not exactly cutting edge, there's a lot older games that had a lot of work put into performance from the start and they look and run a lot better than this.
I have an i9 9900k and a 4070ti and can play it butter smooth at max settings in 4k at 100% rendering. The CPU is definitely starting to show its age, but I haven't had any complaints about Starfield's performance.
That said I can't fucking stand consoles. I get that companies would be stupid to not sell something to as many people as possible, but I'm so sick and tired of seeing good games handicapped because they need to run on a rotten potato Xbox from 10 years ago or whatever...
Like 40-45fps? I've seen a couple people say this now, but every outlet I have seen benchmark performance contradicts it. I don't consider 40fps smooth at all, but I guess consoles even have to suffer with 30fps in some cases, so a lot of people are okay with it.
Consoles dictate a lot of triple-A games; that's where the biggest profit is, and it's why PC is an afterthought like it was here.
I actually never pulled up an FPS meter as it has been so smooth I never felt the need to check. I'll see when I get home later what it actually is in neon or somewhere "busy."
Do you have motion blur on?
I'll never understand why developers add stuff that make the game look so much worse...
Looking at you chromatic aberration, motion blur, film grain, vignette...
The first thing I do with a new game is check graphics settings and nuke that extra garbage lol
Yup, like, sure, add it, but at least disable it all by default. Motion blur does make low fps look better, if you can put up with the blur, that is (I can't); it's used heavily on consoles for that reason.
Modern literally means the most recent release. And games should be pushing those to the limits on max settings. I semi-agree that even the next release of GPUs should be able to get more out of the game, i.e. design the game for the future.
If you’re expecting a 2080 to run a game on max, what limits are we pushing with every new gen? You’d be hampering yourself and leave a bunch of dead weight on modern and semi modern GPUs.
Which I explained would mean the 4000 series/7000 series GPUs and the 13th Gen/Zen 4 CPUs, but the worst one from one of these is not better than the best of the previous generation, so it's not as cut and dry as 'modern/old'.
Starfield is pushing no limits, that's the point. It's just built like shit, so it runs like it. I could maybe be swayed a bit on the matter if it were absolutely groundbreaking, but it isn't. It's Fallout 4 in space with less stuff going on at any one time.
Remember when "could it run Crysis" was a good thing to understand? Now everyone acts like max settings should run on 5 year old GPUs, and complains about devs instead.
We're on PCs guys, there's a shit load of variables as to why a game might run poorly on your device, there is absolutely no way a game dev can (or should) be able to account for all those variables. If you want a standard gaming experience with preset settings that run fine, get a console.
"Could it run Crysis" was a pro for your computer, but it was also always a bit of a dig at the fact that the game was largely unplayable for a lot of people.
It was because it pushed the boundaries on what was possible with current gen hardware at the time, that didn't make it unoptimized or a bad game, but that concept seems to be lost on a lot of people.
Are you seriously suggesting that Starfield pushes any boundary? The game still uses the god forsaken "boxes" from Oblivion, and every slice of world pales in comparison to both the size and quality of, like, all modern open worlds of comparable budget.
There literally are settings lmfao
I'm not super interested in how Starfield will play 5 years from now. I didn't even play Skyrim past its 5th release…
But seriously, modders have shown they're up to the task of upgrading the engine and visuals over time.
I like that your last sentence is open-ended, like a Mad-lib
How would you expect them to develop a game targeted towards hardware they can't test on due to it not existing? Latest PC hardware should be able to run max or near max at release.
People would rather be angry than know how things work.
Yeah. But at least you can use the console command (~ tilde, as usual) to change FOV. Defaults are 85 first person and 70 third person; the range is 70-120.
When you're happy with what you got, issue
Or you can just edit
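If you'd rather make it stick across sessions, something like this in StarfieldCustom.ini (in Documents\My Games\Starfield) is the usual approach. The key names here are my assumption, based on how community guides for earlier Bethesda titles describe the camera settings, so double-check them:

```ini
; StarfieldCustom.ini (key names assumed from community guides)
[Camera]
fFPWorldFOV=100  ; first-person FOV (default 85)
fTPWorldFOV=100  ; third-person FOV (default 70)
```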
The game, without using DLSS mods, runs at 30fps with some stutters on my system, using the optimized settings from Hardware Unboxed (linked), on a 4060ti. If I install the DLSS-FG mods I immediately get 60-90 fps in all areas; that alone should tell you everything you need to know.
Here's the rub: I'm not an FPS whore. It's generally a good experience for this game at 30 FPS, assuming you use a gamepad/Xbox controller. With KB+M it gets really jittery and there's input lag. The game was clearly playtested only with a gamepad. The reactivity of a mouse for looking is much different, and the lower FPS the game is optimized for becomes harder to digest.
I have also tested on my 1650 Ti Max-Q and a 1070. Both the 4060ti and the 1070 were on an eGPU.
My system has an 11th-gen i7-1165G7 and 16 GB of DDR4 RAM. I play at 1080p in all cases.
For the 1650ti and the 1070 the game runs fine IF I do the following
Even on the 4060ti, it says it's using 100% of the GPU, but it's only pulling like 75-100 W, when it would normally pull 150-200 W under load, easy.
TL;DR - this game isn't optimized, at least for NVIDIA cards. They should acknowledge that.
I can at least change the FOV with an ini file edit, but there's no way to adjust the horrible 2.4 gamma lighting curve they have... It's so washed out on my display, it's crazy!
There is actually https://www.nexusmods.com/starfield/mods/281
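For anyone curious why a 2.4 gamma curve reads as "washed out": here's a rough sketch of the math (illustrative only, not Starfield's actual pipeline). Encoding with a higher gamma exponent lifts midtones more than the usual ~2.2, so on a display expecting 2.2 everything comes out brighter and flatter:

```python
# Rough sketch: how an encoding gamma of 2.4 vs the usual ~2.2
# shifts midtones. Illustrative only, not Starfield's actual pipeline.

def encode(linear: float, gamma: float) -> float:
    """Simple power-law display encoding: out = in ** (1/gamma)."""
    return linear ** (1.0 / gamma)

mid_gray = 0.18  # a typical "middle gray" linear value

v22 = encode(mid_gray, 2.2)
v24 = encode(mid_gray, 2.4)

print(f"gamma 2.2 -> {v22:.3f}")  # ~0.459
print(f"gamma 2.4 -> {v24:.3f}")  # ~0.489, brighter midtones => washed out on a 2.2 display
```

Same linear input, noticeably higher encoded value at 2.4, which is exactly the flat, low-contrast look being complained about.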