Makes my decision to not buy it even easier.
I own plenty of games I impulse-bought only to find out I didn’t care for them. That gets expensive.
Now I’m much more selective, and tend to wait until the game’s been out long enough to get patches, updates, and reviews.
Add my lack of interest in any Todd Howard product until ES6, which I may not live long enough to play (boomer puke here), as well as the offhanded arrogance of his ‘upgrade your PC’ statement, and that about covers why I’ve decided not to buy Starfield.
It’s BS though. People with top-of-the-line hardware are having issues. Those systems don’t underperform because the game is advanced or anything like that – the game underperforms because it’s a new release that is poorly optimized. That’s also to be expected, because it’s running on a senior citizen of a game engine that likely needs a few more nudges.
Todd Howard forgets that PC users see this shit all the time, and it’s pretty obvious with this one. Hoping to see talk of optimization in a coming patch instead.
Edit: a good example – not hitting 60fps in New Atlantis while, at the same time, CPU usage sits in the 50s and GPU usage in the 70s. That’s a sign of poor optimization.
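For what it’s worth, the rough logic behind reading those numbers can be sketched out. This is just my own heuristic – the `classify_bottleneck` name and the 90% saturation threshold are assumptions, not anything pulled from a real profiler:

```python
# Rough bottleneck heuristic: if the frame rate is below target but
# neither the CPU nor the GPU is near saturation, the limit is probably
# the engine itself (main-thread stalls, draw-call overhead, I/O), not
# the hardware. Threshold of 90% is an arbitrary assumption.
#
# Caveat: aggregate CPU % can hide a single pinned core – a game stuck
# on one thread of a 16-thread CPU may show well under 50% overall.
def classify_bottleneck(cpu_util: float, gpu_util: float,
                        threshold: float = 90.0) -> str:
    """cpu_util and gpu_util are utilization percentages (0-100)."""
    if gpu_util >= threshold:
        return "gpu-bound"
    if cpu_util >= threshold:
        return "cpu-bound"
    return "engine/optimization-bound"  # neither component saturated

# The New Atlantis numbers from the comment above (~55% CPU, ~75% GPU):
print(classify_bottleneck(55, 75))  # engine/optimization-bound
```

On real hardware you’d feed this from a monitoring tool (Task Manager, MSI Afterburner, `nvidia-smi`, etc.) rather than hardcoded numbers, but the reasoning is the same.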
I’m starting to think that maybe, just maybe, brute-forcing a 26-year-old engine that makes Skyrim have a stroke if you try to play above 30fps isn’t a good idea
What game engine is 26 years old other than the Unreal engine?
Edit: stepped on some toes i guess lmfao
No, I’m not a fan of the game personally, but a quick search shows they’re using Creation Engine 2, which is a newer version of their engine.
While I’m no fan of paid sponsorships holding back good games, this is untrue.
Neither nvidia nor amd block their partner devs from supporting competing tech in their games. They just won’t help them get it working, and obviously the other side won’t either, since that dev is sponsored. There are some games out there that support both, some of them even partnered.
So yes, it’s bullshit. But it’s not “literally paid” bullshit. Bethesda could have gone the extra mile, and didn’t.
My friend and I were just discussing the likelihood that some hardware producers pay game devs to purposely output bad optimizations so users are encouraged to spend more on upgrades.
In this case, you get Starfield free with the purchase of select AMD CPUs or GPUs.
But it’s weird for Todd Howard to come out with this push now, because it’s in response to those already playing the game.
I mean, that’s probably why he would make the push. The bait’s in the mouth (people have the game), then comes the pull of the hook (they have to upgrade to try to handle its poor optimization, fulfilling the benefit of AMD backing them). And Beth doesn’t lose anything if it’s too frustrating and people stop playing over it, because they already have the money.
EDIT: Admittedly I keep forgetting that Game Pass is a thing, but maybe even that doesn’t really matter to Microsoft if the game got people to sign up for Game Pass in the first place? That makes my earlier point a bit shakier.
I have an i9-13900K, a Radeon 7900 XTX, and 64GB RAM, and I had to refund it on Steam because it kept crashing to desktop every few minutes. Sometimes I wouldn’t even get past the Bethesda intro logo before crashing. Very frustrating experience to say the least.
I have an i7-10700K / 32GB RAM / 3080 Ti – playing the game at 4K with all settings maxed (without motion blur, ofc), and with almost 80 hours into the game I have yet to hit a single crash or performance issue.
Only realized people were having issues when I saw posts and performance mods popping up.
I haven’t played Starfield yet, but many of the recent headliner releases have been performance hogs. It’s not unreasonable to expect people to either play at lower settings or upgrade if they want to run the best possible setup. That’s why there are performance sliders in most games. It’s when you need a 3080 just to run minimum settings that you start running into trouble (👀 KSP 2)
Man, that’s why armored core blew me away. Completed the whole game, at launch, maximum settings and I don’t recall a single frame drop. 3060, with very mediocre other hardware. I know there’s a lot to be said about map sizes and instanced missions, but with as fantastic as that game looks and plays…
Same happened with Doom Eternal. The graphics were a showstopper when the game came out, and the game didn’t even stutter. It’s so well optimized that I’m told you can even play it on integrated graphics.
It’s almost like having a giant open world comes with some massive drawbacks. I’m pretty fatigued over open world games tho so that may just be me.
Ridiculous statement. I’ve got an RX 7900 XTX and a Ryzen 7 7700X with 64 gigs of RAM @ 5600MHz, and the fucking game barely ever hits 144fps. Usually it sits around 100–110fps, which is playable for sure, but literally every other game I’ve played on this rig has had no problem staying nailed at 144fps. This is at low–medium settings, BTW (for Starfield).