Don’t worry though, Todd Howard himself said that Bethesda definitely did a lot of work on optimizing Starfield. This is all still the fault of the end users, who just need to “upgrade their hardware.” Just ignore the decrepit Gamebryo engine that still has all the same old bugs and quirks that it’s had for nearly two decades.
Indeed. “This would have the fewest bugs any Bethesda game ever shipped with,” according to MS.
Which is probably the platonic ideal of “damning with faint praise.”
That’s like when you call your insurance company and get sent a survey, “Did we exceed your expectations?”
Well, my expectations were so low that actually speaking to a person means you exceeded them. So sure, I guess.
No! It’s Creation Engine 2! Ignore that it’s just Gamebryo with yet another new coat of paint!
Unreal Engine 5 is just Unreal Engine 1 with yet another new coat of paint; what’s your point?
Just because the initial release date of an engine is decades old doesn’t mean the actual engine is. Game engines get updated and fitted with new features and capabilities on top of what’s already there; devs don’t waste time rewriting the engine from scratch for each new version.
The issue with Gamebryo is more that key bugs just aren’t getting fixed. How long have NPCs walked away mid-conversation? How long has enemy AI gotten itself stuck without taking any action? How long have companions been getting stuck in doorways and blocking your path?
The frustrating part isn’t that they’re reusing the same engine. It’s that they’re endlessly tacking on more shiny baubles without ever fixing issues that have been present across all of their games. Hell, the newest bauble they added, the new facial animations, is already breaking down in bizarre ways.
At this point, it feels like there’s so much technical debt from half-implemented features that starting from scratch or licensing an engine from someone else might be the best route forward.
In this thread, people who understand very little about technology and how it works
Sounds like about 80% of Starfield discussion at the moment.
If anyone wasn’t aware, there is a mod to replace FSR2 with DLSS, and it is INCREDIBLE for performance if your system supports it. I went from everything on minimum at 40-50 fps to well over my 144 fps target on medium (indoors), and it runs okay (60 with some drops) in tough scenes outdoors.
Running on a 2070.
I tried that and saw no difference unfortunately.
I’m just confused because I have an i9-9900K and a 4070 Ti, and neither is even remotely close to being worked hard, yet I can’t break 50fps. Like half VRAM usage and maybe 10% CPU/GPU usage. I thought I had it running smoothly, but it’s a smooth 30fps…
Yikes… is your monitor plugged into your GPU (and not your motherboard)?
Lol yeah it’s all hooked up correctly. Just don’t get why it seems to not be utilizing my system efficiently. :/
Finally took the time to check my performance. I’ve got a 3070ti and a Ryzen 9 3900x, CPU is at ~55% utilization and GPU is ~95% at maybe 40 fps average. That’s at ultra on all settings, with the DLSS mod and a render scale of 80%.
Have the recent driver & game updates helped at all?
I’m an idiot: for the entire year I’ve had my 4K monitor, I never enabled the setting on the monitor that allows it to run at 60Hz, so I was always locked at 30fps with vsync…
I couldn’t say if the recent update helped, as I only discovered this new level of my idiocy yesterday lol
I have a 4070 Ti as well and it struggles a bit in New Atlantis. Probably under twenty FPS in the most crowded places.
AMD being a “partner” is business speak for “AMD paid us a bunch of money, because having their brand on our product gives them a much larger advertising reach than they could accomplish on their own.”
That performance is better on AMD is in no way “bizarre”… it’s exactly what would be expected.
It’s unexpected for nvidia users, who have grown used to games being optimised for them rather than AMD users.
This whole thing is just “dog bites man” news.
Well, maybe. It’s optimised for the Xbox which runs AMD hardware.
You can force it to use resizable BAR and get more fps. It just needs to be enabled, and it’s such an easy thing for the Bethesda devs to do, yet people need Nvidia Profile Inspector to enable it. For no reason.
“force it to use resizable BAR and get more fps”
If this is true, it means the game is designed around a UMA architecture, i.e. the Xbox. Nobody in their right mind tries to map more than 256MB of GPU memory into CPU address space concurrently for a single frame. Either that, or the engine is completely shit at resource streaming (also characteristic of console-first games) and is relying on the OS to demand-page random resources as needed.
You mean to tell me that enabling ReBAR in the BIOS doesn’t automatically enable it for every game?
The BIOS setting enables the bus feature in hardware, but the driver also needs to support it, and Nvidia only switches it on per game via driver profiles (which is what Profile Inspector edits).
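If you want to sanity-check whether ReBAR is actually live end to end (not just flipped on in the BIOS), here’s a rough sketch that parses nvidia-smi’s memory report. It assumes an Nvidia card with nvidia-smi on your PATH; with ReBAR active, the BAR1 aperture the driver reports should be roughly your full VRAM size rather than the legacy 256 MiB window:

```python
import re
import subprocess

# Ask the driver for its memory report; the BAR1 section describes the
# CPU-visible VRAM window that resizable BAR enlarges.
report = subprocess.run(
    ["nvidia-smi", "-q", "-d", "MEMORY"],
    capture_output=True, text=True, check=True,
).stdout

# With ReBAR active, BAR1 Total should be roughly the full VRAM size;
# without it, you get the legacy 256 MiB window.
match = re.search(r"BAR1 Memory Usage[\s\S]*?Total\s*:\s*(\d+)\s*MiB", report)
if match is None:
    print("No BAR1 info in the report; the driver may be too old.")
else:
    total_mib = int(match.group(1))
    if total_mib > 256:
        print(f"BAR1 aperture is {total_mib} MiB: resizable BAR looks active.")
    else:
        print(f"BAR1 aperture is {total_mib} MiB: stuck on the legacy window.")
```

Even with the aperture open, the per-game opt-in still lives in the driver profile, which is the part Profile Inspector lets you flip for Starfield.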
I want to know how the hell I am lucky enough to not have any real performance or graphical issues…
I’m not even using a supported GPU (1660 Super) and it’s still very playable with the lowest fps being 27 and the highest being about 70.
Outside is on the low end. Interiors are higher, with empty interiors (i.e. no NPCs) being the fastest. Just dropping a single NPC into a space where I get 72 fps knocks the frame rate down to 50. NPCs aren’t handled by the GPU; they’re CPU-bound.
My CPU is a Ryzen 5 3600X, the exact AMD chip Bethesda lists as recommended. In fact, other than my GPU, the rest of my system meets the recommended requirements.
Edit: I kinda wonder if it’s simply how things are tested in QA. For years I’ve seen users claiming to have high-end systems report tons of problems across various games, and I’m starting to think that, unless they’re simply lying about their specs (which seems an odd thing to do if you want real support), their hardware is just too new and the testing focus was on the hardware more users actually run. Going by Steam hardware survey stats, most people have pretty old stuff, while only a small fraction are on super high-end systems.
Yeah, I second that. The game runs perfectly playably for me at low-to-medium settings, and I have a barely better GPU, a 1660 Ti, with a 10th-gen laptop i7.
It is? Am I doing something wrong? Because I get a solid 60-70 fps at all times on a 3070ti
lemmy guess, you’re less than 8 hours into the game?
A little over 36 hours. Every setting on the highest it can go.
Starfield doesn’t even run on Arc GPUs.