The Outer Worlds 2 Performance Benchmark Review - 30+ GPUs Tested
Posted by hannopal@reddit | hardware | View on Reddit | 89 comments
mrfixitx@reddit
A 5090 getting 48 FPS at 4k very high with RT off is not great. There is also no GPU that will break 100 FPS at 1080p on a 9800x3d.
Seems to be heavily CPU bound if nothing can break 100fps.
VenditatioDelendaEst@reddit
Very High is the highest preset here, right? So there should be no expectation that it hits 60 on a GPU that you could buy before the game came out. And in the screenshots it's basically indistinguishable from High.
Vb_33@reddit
Very High Lumen scales badly and becomes exceedingly expensive to run. Run on High at most, unless you need to fix the shadows bug by playing on Very High shadows.
NeroClaudius199907@reddit
This game was made to push fg/mfg
Farfolomew@reddit
What a ringing endorsement of the game by the article's author:
"Everything felt much too wordy, and I felt like I was forced to read AI-generated text for the sake of it. Maybe the humor isn't mine, but the story didn't grab me either. Overall I'm happy to be done with the game and doubt I will ever touch it again"
Ploddit@reddit
The reviewer is massively exaggerating stutter problems in my experience. With a 7900X3D and 5080 at 1440p DLSS Quality I get locked 60fps with very consistent frame times. RT is definitely a performance disaster, but doesn't look good enough to be worth using anyway.
cp5184@reddit
november 2025, the month of 1080p gaming! (or is dlss quality 1440p lower than 1080p?)
Ploddit@reddit
960p, apparently?
I care about two things - frame time consistency and how does the game look when you're actually playing it. I'm not pixel peeping blades of grass to see if I can spot rendering problems. DLSS Quality is good enough that I don't notice.
kuddlesworth9419@reddit
I'm managing to play it pretty well on my 1070 at 1080p. It was actually playable at 4k with XeSS at the lowest setting. Not great but playable. At 1080p and quality upscaling it runs pretty well which so far for me is the best UE5 result.
Plank_With_A_Nail_In@reddit
Starfield didn't have named NPCs move around either, only the generic crowd NPCs that all stare at you.
kuddlesworth9419@reddit
It's been a long time since I last played it. I never want to play it again anyway but that sounds like a huge downgrade over their previous games.
bastardsword2D@reddit
They do have radiant AI on Cydonia but not anywhere else, it's weird
kuddlesworth9419@reddit
Ohh, I haven't got there yet. I'm still messing around on the first island place.
letsgoiowa@reddit
Internal res of 540p or below then? That's dire.
kuddlesworth9419@reddit
Whatever ultra quality is on XeSS for 1080p. At 4k I was at the ultra performance setting or whatever that one is called.
xspacemansplifff@reddit
Yeah that weird sizzling thing is something I haven't seen before. Can't recall the settings I ended up with but it looks ok. Clean at least.
RxBrad@reddit
I had to turn off all of the Post Processing effects in Clair Obscur because everything blurry in the background looked like re-warmed pixelated butt on my 9070XT.
Seems like a UE5 thing.
Logical-Database4510@reddit
It's called noise. It's inherent in how RT based GI solutions work, and yeah that includes software lumen.
It's removed by a software solution called "denoising". Right now their denoiser is busted on a lot of things, particularly shadows.
Cheap-Plane2796@reddit
Nah, this is by far the worst example of rt lighting
Alan wake 2 , indiana jones, cyberpunk, doom, dying light 2, metro exodus etc all look way more stable while running better too
Plank_With_A_Nail_In@reddit
Have an upvote, the guy downvoting you hasn't played the game they are just regurgitating something they heard elsewhere and trying to pass it off as fact.
Plank_With_A_Nail_In@reddit
It's not that; the game uses special textures that expect to be temporally aliased, so if they aren't they flicker weirdly. Outer Worlds had it too.
kuddlesworth9419@reddit
I have to play with the Global Illumination setting on medium I think and the shadow setting at Very High. Otherwise you get some really weird lighting glitches like reflections moving around and sizzling.
The image never seems to be very stable to be honest. Like it's always doing weird shit when it really shouldn't. I've noticed this in a lot of modern games though mostly UE5 games. In reviews I've seen it with a lot of Ray Tracing games as well but I don't think that is related as I'm not doing RT on a 1070.
xspacemansplifff@reddit
Ue5 fun as usual.
MumrikDK@reddit
I had the same issue with Avowed at much higher settings on a 4070. Sometimes nature looks like they forgot about proper shadows, even at maxed out shadows.
kuddlesworth9419@reddit
I do a fair amount of photography and it's just being realistic. When it's cloudy outside at midday the sun's light is diffused and a lot of the time you won't even see shadows anymore. It's why I'm not a fan of realistic lighting in games: a lot of the time, real-life lighting is crap. It's why I go out in the early morning or late afternoon when the sun is low with a speckling of clouds. Not enough clouds and the sky is boring; too many and the light is diffused too much and the sky looks boring anyway.

The thing is, games aren't limited to real-life light, so why would you ever want to be? You can get good photos in diffused light, just not of landscapes and other things where you want some nice natural lighting. I shoot natural light only, so it's pretty limiting. You can get good macro shots, though you really need a crap-load of sunlight if you want to get real close, and you can still get some good shots with minimal light if you're a bit creative.

I would also like to see some more interesting lighting effects with DOF. It wouldn't be realistic to human vision, but some cool lens effects would be nice, like lens flaring and bokeh. That could be interesting, but I don't think any games really do that other than mods for Skyrim.
Outer Worlds 2 is a sci-fi game set on a different planet; they could go wild with the lighting. And it's not just OW2, it's pretty much every sci-fi game I've played: they can go mental with the lighting but they never bloody do. You could have crazy weather events with drastically different light for all I care. The glowing spore things in the game glow, which is nice, but they aren't used for their lighting much, other than in a mysterious cave which actually looks really nice. They have some spotlights in there as well which flicker; whoever did the lighting there did a nice job. I've noticed some of the white flowers seem to give off light in the dark, which is cool. Glow-in-the-dark flora and fauna would be nice, or even some animals that do it. Even in real life some animals and plants do that.
r_z_n@reddit
I don't have problems with stutter but on my system, 5800X3D and a 3090, I am significantly CPU bound, even with RT off.
I'll be sitting at 55-60 fps with my GPU pulling only 260-280W.
I don't really understand how this game is that CPU-heavy. Again, hardware RT is disabled.
BlueGoliath@reddit
CPU bound on a 5800X3D is insane.
r_z_n@reddit
It's understandable with hardware RT on, or in big 64 player multiplayer games like Battlefield 6. I don't quite understand what's going on with TOW2.
BlueGoliath@reddit
Wasn't Battlefield 4 doing 64 players in 2014?
letsgoiowa@reddit
BF3 was doing it in 2011 on dual cores!
fmjintervention@reddit
BF3 also had 10hz servers. Yeah the game ran on a dual core, but the standards for what was considered an acceptable experience in terms of hit registration and server responsiveness were completely different in 2011 to where they are now. If BF6 had 10hz servers the playerbase would go nuclear
letsgoiowa@reddit
Oh fair. So bf4 and bf1 are better benchmarks because those did have good hit registration eventually
r_z_n@reddit
Yes, and it was one of the few games that would also scale well beyond 4 cores at the time. Battlefield games are CPU heavy but they are usually well optimized technically.
Logical-Database4510@reddit
It's because they're using a technique that's relatively new to the AAA scene: asynchronous shader compilation. What this means is that instead of a very long precompilation step before first run, the game uses any spare threads it can to compile shaders in real time while you're running around in the game world.
Silent Hill F and Borderlands 4 also use this and have similar issues.
Async comp first became big in the emulation scene with Dolphin and RPCS3, and was hyped for years as a potential magic bullet for shader comp issues in modern games. As you can see, though, there's no such thing as a free lunch: modern shaders are simply too large, and current CPUs don't have enough threads to make it viable.
It's not a bad idea on its face; CPUs just aren't quite there yet in terms of core count to make it worth it.
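The idea described above can be sketched in a few lines. This is a toy illustration, not UE5's (or any engine's) actual implementation: "compiling" is faked with a sleep, and the fallback value stands in for drawing with a cheap placeholder material until the real shader is ready.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical shader "compilation": in a real engine this is an expensive
# driver call that can take tens of milliseconds per pipeline.
def compile_shader(name):
    time.sleep(0.05)  # stand-in for the expensive driver work
    return f"compiled:{name}"

class AsyncShaderCache:
    """Compile shaders on spare threads; serve a fallback until ready."""
    def __init__(self, workers=2):
        self.pool = ThreadPoolExecutor(max_workers=workers)
        self.futures = {}

    def get(self, name):
        # Kick off compilation the first time a shader is requested.
        fut = self.futures.setdefault(name, self.pool.submit(compile_shader, name))
        if fut.done():
            return fut.result()   # real shader, no hitch
        return "fallback:flat"    # draw something cheap in the meantime

cache = AsyncShaderCache()
first = cache.get("grass_wind")   # still compiling -> fallback material
time.sleep(0.3)                   # a few frames later...
later = cache.get("grass_wind")   # -> "compiled:grass_wind"
```

The trade-off the comment describes falls out directly: the frame loop never blocks, but the worker threads steal CPU time from the game, and until `done()` flips you're looking at the fallback — which is exactly the "missing effects" players report.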
Strazdas1@reddit
Sigh, I wish they would just let me pre-compile everything on game start and get on with it. All these attempts to compile in real time just make things worse every time. I don't care if I have to wait 10 minutes on first launch.
Logical-Database4510@reddit
The only game I've ever seen do that was The Last of Us Part I, which took roughly 30 minutes on my 7700X at the time.
Strazdas1@reddit
Never seen more than 15 minutes happen (and wasn't there a bug in TLOU where it took longer than it should that got patched?), but I'd be fine with 30 minutes if it meant no shaders need to be compiled while playing.
The_Axumite@reddit
They are. Unreal is just not good at utilizing them. Multiple cores sit at a standstill, and there's no reason not to use them other than the complexity of their state management, which is why they're going back to functional programming for the next engine. Plus, there should be some kind of standard with video card makers that allows precomputed shaders to be shipped with the game, like on consoles.
Vb_33@reddit
The standard will be Microsoft Advanced Shader Delivery. Intel recently talked about their implementation of it.
Problem is I don't expect 100% game coverage, a game like TOW2 sure but a smaller indie UE5 game like the original Ark Survival Evolved? Big doubt.
r_z_n@reddit
But with async compilation you should see the CPU usage drop after the shaders are compiled, right? I'll double-check next time I play, but I haven't seen any real improvement in the CPU bottleneck even after running around Paradise Island for a long time now. Some scenes just seem CPU-intensive regardless of how long I've been there.
Vb_33@reddit
Yes until you reach a new area and Async starts compilation again.
Logical-Database4510@reddit
It does. Check out Daniel Owen's video on the game in the first few mins you can see him discover this accidentally as he runs up to the first town on the 5800 he was testing on. CPU gets obliterated for a bit as he approaches the town, then settles back to normal as he reaches it.
Vb_33@reddit
Asynchronous compilation does not preclude (long or short) precompilation load screens prior to gameplay. Usually you get both.
The problem with the execution in TOW2 and Silent Hill f is that the shaders you need at the beginning aren't the ones precompiled before gameplay starts. And for asynchronous compilation, the shaders you want async-compiled are the ones for later parts of the game, so they're ready when you get there. Instead, the SH f and TOW2 devs used async compilation for shaders you're about to need right now, which results in missing shader effects because the shaders haven't been compiled yet.
On top of this, the CPU toll of async compilation costs the game fps, so it will run worse until compiling is done. That said, async is great when you have a modern CPU and it's used properly: include the early-game shaders in the precomp step, and the later ones in the async step, so all shaders are available when they're needed. Sadly, UE5 devs have not achieved this yet.
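The split being described can be sketched as a simple partition. Everything here (area names, shader names, the `first_area` field) is hypothetical, just to make the scheduling idea concrete: shaders first needed in early areas go into the blocking precompile pass, everything else into the background queue.

```python
# Areas the player hits in the first hour of the game (illustrative names).
EARLY_AREAS = {"ship_interior", "paradise_island"}

# Each shader tagged with the first area where it is actually drawn.
shaders = [
    {"name": "hero_skin",  "first_area": "ship_interior"},
    {"name": "grass_wind", "first_area": "paradise_island"},
    {"name": "lava_flow",  "first_area": "third_planet"},
]

# Blocking pass before gameplay: the player waits once at a loading screen,
# but nothing early-game is ever missing.
precompile = [s["name"] for s in shaders if s["first_area"] in EARLY_AREAS]

# Compiled in the background while playing the early game, ready by the
# time the player reaches the later areas.
async_queue = [s["name"] for s in shaders if s["first_area"] not in EARLY_AREAS]

print(precompile)   # ['hero_skin', 'grass_wind']
print(async_queue)  # ['lava_flow']
```

The failure mode described above is the inverse partition: putting early-game shaders in `async_queue`, so they're still compiling while the player is already looking at them.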
comelickmyarmpits@reddit
If I'm remembering correctly, Dying Light 2 also had asynchronous shader compilation, but that game is very optimized and runs fine even on Pascal cards.
Keulapaska@reddit
GPU power alone isn't the greatest CPU-bottleneck indicator, especially at a low render resolution, since that will lower GPU power draw, sometimes by a fair bit. The combination of GPU power plus GPU usage is a decent indicator: if the usage has dropped as well, then you're CPU-bottlenecked for sure. At least the hardware RT option does seem to be kinda CPU-heavy, and slightly broken, looking at the DF video.
Now, I have no clue how power-heavy the game is, as I haven't seen a fixed-clock power draw benchmark of it, but there are games that are very power-light and don't use all the fancy-schmancy stuff on the GPU. Starfield was (no idea if it still is) one of them, hovering around 40-80W less power on my 3080 at the time, for example.
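The rule of thumb above can be written down as a tiny heuristic. The thresholds here are illustrative guesses, not vendor guidance: low utilization *together* with low power relative to the card's board power is a decent sign of a CPU bottleneck, while either signal alone is ambiguous.

```python
def likely_cpu_bound(gpu_usage_pct, gpu_power_w, gpu_tdp_w):
    """Crude CPU-bottleneck check: both usage AND power well below max.

    Thresholds (90% usage, 80% of board power) are made-up round numbers
    for illustration; real tools look at frame-time graphs instead.
    """
    low_usage = gpu_usage_pct < 90
    low_power = gpu_power_w < 0.8 * gpu_tdp_w
    return low_usage and low_power

# A ~350W-class card sitting at 80% usage and 270W: likely CPU bound.
print(likely_cpu_bound(80, 270, 350))   # True
# Fully loaded at 99% usage and 340W: GPU bound.
print(likely_cpu_bound(99, 340, 350))   # False
```

This matches the 3090 numbers quoted elsewhere in the thread (55-60 fps at 260-280W on a card that normally pulls 350W+): both signals low at once.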
r_z_n@reddit
I'm running at 3840x1600 (ultrawide) so it's not a low render resolution. The GPU will sit around 80-90%, so it's not being fully utilized either.
Most games at this resolution will push this GPU well over 300W, unless I cap the frame rate.
Keulapaska@reddit
Oh yeah, definitely CPU-bound then. Kinda weird though, as Digital Foundry was showing footage of a 3600 hitting around 40-60 (depending on low/high settings), so I'd think a 5800X3D would be quite a lot better than that, especially as the 9800X3D does scale as it "should".
Guess we gotta wait for some CPU benchmarks to see what's going on with CPUs in this game, or maybe the game just hates AM4 for whatever reason and even more cache can't save it.
Also, for power: I totally missed that the Daniel Owen video does have power draw stats on a stock 3080. It doesn't show voltage, so it's harder to say how light or heavy the draw is, but even without that info it's clearly not Starfield levels of power-lightness, and probably closer to an average power draw game.
Vb_33@reddit
Stutter is there regardless even if you don't notice it. Check the DF review of TOW2 on a 9800X3D and Ryzen 3600.
r_z_n@reddit
To be clear, what I meant was that I'm not getting continuous stutter, or at least that's not the issue I'm complaining about. If I just leave my character looking out over the expanse of the wilderness on Paradise Island, long after the game has finished compiling the relevant shaders, I'll be sitting at 55-60 fps with my GPU only at 280W. Clear CPU bottleneck. My 3090, with the voltage curve I have, should be hitting 360-380W at peak usage.
feanor512@reddit
Time for AMD to release the 9950X3D2.
The_Axumite@reddit
Software RT is on by default. Still CPU-intensive, and it doesn't really take advantage of multiple cores, being Unreal Engine. I use a 6900 XT with a 5800X3D and I average about 65 fps with hardware RT until I go outside, and then I'm down to 50.
ZubZubZubZubZubZub@reddit
Shadows are very demanding in this game with software RT.
Setting everything to High but keeping shadows on Very High increases fps by 50%. The same High settings with shadows turned down to Medium increases fps by 100%.
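To put those percentages in concrete terms (the baseline figure is made up for illustration), both gains are relative to the all-Very-High baseline, which also implies the Very High to Medium shadow step alone is worth about a third more fps:

```python
baseline = 40.0                          # fps with everything on Very High (example figure)

high_with_vh_shadows = baseline * 1.5    # +50%  -> 60 fps
high_with_med_shadows = baseline * 2.0   # +100% -> 80 fps

# Cost of Very High shadows by themselves, at otherwise-High settings:
shadow_step = high_with_med_shadows / high_with_vh_shadows - 1  # ~0.33, i.e. ~33%
```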
The_Axumite@reddit
Turning down shadows and global illumination increased fps drastically, even with hardware RT.
Cheap-Plane2796@reddit
It runs like ass (a 7800X3D and 4080S need DLSS and frame gen to stay over 60 fps at 1440p on mostly Very High, with lighting and shadows on High), and the second planet is the only one that looks OK visually.
I thought avowed was a decent looking game but this one is fucking HIDEOUS a lot of the time, especially the first and third planets and the story area right before the third planet.
Many of the environments look like hastily slapped together low budget crap. Endless repeating assets, poorly fitted together meshes, ugllllly textures.
UE5 is especially shitty here, as shadows and lighting constantly break. Rarely looks good, often looks hideous.
Never looks anywhere near what the performance would justify.
This runs worse than path-traced Cyberpunk, and that game looks ten times better.
Avowed was more technically and visually competent. This looks like it was made in a third of the time on a tenth of the budget.
And yet they have the gall to ask 70 euros for this garbage.
My last month of Game Pass is the only reason I played this. I'll never touch another Obsidian game. They're shitting out games faster than Ubisoft and Infinity Ward did at their worst.
The_Axumite@reddit
I think the interior of the game looks very good, mostly due to hardware ray tracing making even your standard object look more real and not just flat
Vb_33@reddit
Very high is extremely demanding and looks barely any better than high on software lumen. This is a classic UE5 thing.
Cheap-Plane2796@reddit
Read before replying. I said lighting is on high, and shadows too.
xcaelix@reddit
Getting over 90 fps on 4070 and 265k, DLSS balanced + no framegen. I think something is broken with AMD cpus in that game.
Morningst4r@reddit
I'm getting really decent performance (90+ fps) on my 7700, so I'm not sure why people are struggling with 9800X3Ds. I have a 9070 XT so maybe AMD is a little better on the CPU bottleneck, but not by that much surely.
LowMoralFibre@reddit
Hmm, I get no stutter, well, not enough to notice. And I'm not someone who just doesn't notice: I gave up on VTM Bloodlines 2 after 30 minutes because I couldn't bear the constant stutter, and the stuttery Dead Space and Silent Hill 2 remakes were a slog.
A couple of settings turned down and the performance is great.
Definitely not the best looking game ever but runs perfectly fine.
Busy_Bison_5951@reddit
Summary: trash optimization, trash RT, nothing new.
Seanspeed@reddit
No, this is a GPU comparison test first and foremost.
You can play with settings to get more performance, as usual. Digital Foundry showed you can actually get some pretty huge performance savings this way.
Busy_Bison_5951@reddit
How does that make my statement wrong? The game has trash optimization whether you like it or not, and the RT performance is also trash whether you like it or not. It does indeed compare the GPUs, but testing GPUs on a game that only just came out and has little to no optimization shows very little information.
I'd rather test the GPUs on titles that are actually fairly optimized, to show what the GPUs are capable of; that would provide an extremely large amount of info compared to this test.
Seanspeed@reddit
I dont think there's a more overused and less understood term among PC gamers than 'unoptimized'.
Busy_Bison_5951@reddit
Adding to what I said: if it's a "GPU comparison test", there would be tests on multiple games. This is literally a game performance test; idk if you even read it.
Seanspeed@reddit
When you only run one game and one scenario with basically just one set of settings, then it's a GPU comparison test first and foremost.
That performance will differ in that game with different scenarios and different settings. It's absolutely NOT a reliable way to indicate what kind of performance you should expect.
And no, it doesn't need multiple games. It's a GPU comparison test within this specific game. That's still valid information. Still interesting to see how different architectures and vendors are doing with specific games.
TheIndecisiveBastard@reddit
My 5080 seems to handle the game pretty well on high 4K with RT and 3x FG DLSS P.
Only problems are the odd, grainy lighting and HDR screwing with frame gen to the point it’s an unplayable blobfest, but for all I know, it could just be the game pass copy.
derider@reddit
The grainy lighting is because they don't have any denoiser running when RT is on, due to it being Lumen...
Homerlncognito@reddit
What a shit show. Very bad performance for a game that looks like a modded Creation Engine game.
AnechoidalChamber@reddit
https://tpucdn.com/review/the-outer-worlds-2-performance-benchmark/images/upscaling-performance.png
Funny and sad to see the 9070XT have the same pre-upscaling performance as the 5090 and better performance once upscaling is taken into account.
Something's fracked with this game, they severely fracked up.
theholylancer@reddit
Well, that's what going to 4K gets you, right? The 4090 is at 4K while the 9070XT is done at 1440p.
It's way lighter to run at 1440p than 4K.
Reasonabledwarf@reddit
Seeing the 9070XT bumping up against the 4090 and 5090 in the RT test is crazy. Also interesting that the AMD control panel seemed to be causing crashes.
fulthrottlejazzhands@reddit
I have a 5090 and a 9070 XT (and a 7800X3D) here that I've been testing and playing on. At 1080p the cards are indeed close; even at 1440p they're much closer than normal. It's definitely the CPU that's pulling them together.
IshTheFace@reddit
What I got out of the chart is that RT is unusable at any resolution. Imagine buying a 5090 and only getting 65 fps at 1440p 😂
RxBrad@reddit
"Unusable" might be a bit of an overstatement. >60fps native in an RPG seems fine to me.
And this is also the type of game where I wouldn't be afraid to use DLSS/FSR. The "quality" preset gets you >60fps on the 5070Ti & >70fps on 9070XT without frame gen.
cheesecaker000@reddit
“seems fine”
Hell of an endorsement for a game running like crap on the most expensive hardware you can buy.
Logical-Database4510@reddit
The game scales very well. RT cuts your fps in half. On top of that, you get 50% higher perf from High vs Very High with very little hit to visuals.
Daniel Owen posted a video a few days ago playing the game on a 1070 and it was playable.
IshTheFace@reddit
I'm one of those people (unfortunately) who is sensitive to fps. Unfortunately, because hardware is expensive and I just wouldn't feel satisfied with low fps.
Lower fps to me make games feel floaty. Like aim wise. And it also gets blurry if it's fast paced. But yeah. The less fast paced a game is, the easier it is for me to accept a lower frame rate.
I don't even know my frame rate in this game, but it's acceptable for the most part. Granted, I don't run RT; the game just crashed the one time I tried turning it on to see how it looked.
jtj5002@reddit
That really just means it's CPU-bound at 1080p and the 5090 is running at like 30% utilization.
AreYouAWiiizard@reddit
If I had to guess, the game possibly doesn't like the overlay, so it might be fixed by disabling that instead of resorting to a driver-only install?
bestnovaplayerever@reddit
"Seeing the 9070XT bumping up against the 4090 and 5090 in the RT test is crazy." In 1080p...
TheNiebuhr@reddit
Yeah it's the same framerate at 720p (Kryzzp/zworm video), with gpu usage close to 40%. It's completely broken.
ShadowRomeo@reddit
The Ray Tracing on this game is very broken and very CPU intensive, and doesn't look great compared to other RT games, I wouldn't use this as a basis of Ray Tracing / Path Tracing performance between different GPUs like the way Cyberpunk 2077, Alan Wake 2 currently does.
Logical-Database4510@reddit
It's not because of RT itself, but because they're doing RT with asynchronous shader comp on top. RT has its own set of shaders on top of everything else, so you get double-mollywhopped when you turn on hardware RT: the already increased CPU overhead needed for RT, plus the new shaders needing to be compiled.
Logical-Database4510@reddit
They're CPU bound
s3rgioru3las@reddit
The winner for the least optimized UE5 game (so far)
BetweenThePosts@reddit
I’m running fsr4 balanced on 4k/very high/RT/hdr and honestly it looks great. And I’m a snob when it comes to visuals.
Dreamerlax@reddit
Literally forgot this game came out. I'm one of the "few" that actually liked the first one so I might check this out.