Does the cpu really matter?
Posted by LowChildhood8604@reddit | buildapc | 69 comments
I've watched a few benchmarks and it seems like a 9060 XT 16GB (the card I'm building with) still doesn't get CPU-bottlenecked in most games at 1080p with a 3600 or a 5500. I don't see why people recommend anything over that, if even a 5500 gets the 9060 XT to 100% usage.
Video reference: https://www.youtube.com/watch?v=ZDGl2H8XUwI
AlfaPro1337@reddit
Yes, as you should not be getting any of those dead, dated and unusable CPUs by AMD and PCMR standards; you should be getting the latest GOAT gaming king CPU, the 9950X3D2.
FazeGreen1223@reddit
Your PC is only as good as the weakest component/link is my philosophy
Skywers@reddit
Yes.
Some games make much heavier use of the CPU. Sometimes even more than the GPU. So it doesn’t matter if you’ve got a decent graphics card; you’re still going to struggle to get a smooth gameplay experience if the CPU isn’t up to scratch
The perfect example for me is World of Warcraft. It's a game that relies heavily on the CPU because it has to handle the effects/attacks of MANY players at the same time.
Warcraft_Fan@reddit
Also, the core of Warcraft was coded over 20 years ago, when the CPU usually did the heavy lifting and the GPU just made it look pretty on the screen. Back then a GPU had around 64 or 128GB RAM.
beirch@reddit
WoW does not use the same engine it did over 20 years ago. It's still fairly CPU-dependent because there are a ton of NPCs and other players, but it's not nearly as reliant as it was when it first released.
Sephurik@reddit
Nah, it's still about as reliant on CPU and RAM speed as it was then. I went from a 5900X to a 9800X3D last year, and on mythic Sprocketmonger (lots of effects and such) I went from averaging 20 FPS on the 5900X to 60-70 FPS on the 9800X3D (both used a 3070 at the time).
Like I get what you're trying to say but gaining upwards of triple performance in raid encounters is not a small thing.
BurritoSupreeeme@reddit
I feel like they made some progress with Midnight on that front. In Dornogal I averaged like 40 FPS at a busy time of day. In Silvermoon I get my capped 100 FPS basically all the time. Almost the same PC, slightly higher-clocked RAM on a 14600K.
conaii@reddit
I played WoW on a 4c/4t i5-6500 with 275mg of RAM. I upgraded to that from a 2c/4t Core 2 Duo that was a beast when it launched. Until I overclocked and got better cooling for the i5, Ironforge was about 10 FPS, so I made Darnassus my capital city; but the rest of the game, where there were never more than 40 players (in raids), was fine in those days.
Whatever you're playing, it's probably got zero of the same code from those days.
Kotschcus_Domesticus@reddit
Dude, that thing is ancient. Just like COD games are still running on good old id Tech 3 from Quake 3 Arena deep under the hood.
beirch@reddit
The base game is ancient for sure, but the engine and code has been reworked several times.
When I first started playing in 2005, I could easily get 60+ fps at max settings on a Pentium 4 and no dedicated GPU.
You're not doing that with an integrated GPU today; the only exception being something like the AMD 8060S.
Kotschcus_Domesticus@reddit
It was reworked, but it's still badly optimized due to its age; it has a lot of limitations.
KFC_Junior@reddit
MB* lol, this subreddit wishes we had 128GB of VRAM lol
Sephurik@reddit
MB not GB. Also CPUs were single core back then.
TheWaspinator@reddit
Yeah, it depends heavily on the game.
Protoclown98@reddit
More modern GPU features, like ray tracing and frame gen, require a beefier CPU too. It's certainly not as much of an issue if you're targeting 50-60 FPS, but if you want much higher, you need a CPU that can keep up with those things.
mashdpotatogaming@reddit
Frame gen doesn't require a "beefy CPU"; in fact, frame gen is the one thing that can get you more "performance" with an older CPU.
lotsof_freetime@reddit
Still, though, if you can't manage over 60 FPS, frame gen is a terrible experience.
DistinctCellar@reddit
Tip for those WOW players: turn down object detail before anything else for huge frame increases.
I have a very good system, but in big cities or areas with lots of players, even my rig gets cooked a bit. Turning down object detail to about 7-8 drastically improves frames. It's a CPU-affected setting, of course.
Rusko_2@reddit
Lmao bruh... depends what you play. I play CS mainly, so it's CPU-bound. With the same settings and a 9060 XT 16GB, a 5700X at 4:3 was barely doing 280 FPS with dips to 150 in the lows, while my new 9800X3D with the same GPU and settings was pushing 600-700 FPS with dips to 450. On a 280Hz monitor, the 5700X was a slideshow...
Ono_Palaver@reddit
You're not going to see a CPU bottleneck on a monitoring graph.
LowChildhood8604@reddit (OP)
How's that?
ArmoredAngel444@reddit
100% matters with a high end gpu
No_Guarantee7841@reddit
Turns out if you cherry pick games that are not cpu bound you may end up with results that show even a 4770k not bottlenecking a 9060xt, who knew.
L1teEmUp@reddit
That is why I only follow the big tech reviewers.
People might hate on these channels, but at least Tech Jesus, Paul's Hardware, Hardware Unboxed, Digital Foundry, JayzTwoCents, Vex, and Daniel Owen are reliable for me when it comes to game benchmarks.
Errorr404@reddit
Just don't go to LTT for good results. They had a recent video showing the difference RAM speeds can make, but they only included top-of-the-line CPUs like the 9800X3D and 14900K, instead of also throwing in a 7600X or 14600K, which have much less cache. Less cache means RAM speed matters more.
this_dudeagain@reddit
RAM speeds aren't going to make up the performance hit of not having enough cache directly on the chip.
Errorr404@reddit
Yeah, it's not gonna match the performance of a huge CPU cache, but in games that love fast RAM it can be up to a 20% difference in FPS if you, for instance, run a 7600X with JEDEC-spec DDR5 compared to a 6000MT/s CL30-36 kit.
CodeRoyal@reddit
People hate on those channels because they actually test hardware properly.
wsteelerfan7@reddit
I think it's mostly hate for Gamers Nexus because they only hear about his videos exposing stuff in the tech industry and think he's corny when he's actually right about those situations.
CocaBam@reddit
Some people dont think that Paul is "excellent"
wsteelerfan7@reddit
I think Paul occupies about the same space as Jayz2cents
wsteelerfan7@reddit
zWORMz Gaming is surprisingly solid, too. He also runs specific game benchmarks at a mix of settings with his GPU stock, but he's running proper benchmarks as well and explaining things while doing it. Solid channel that looks like bad clickbait on the surface.
Primus_is_OK_I_guess@reddit
Yeah, I love that guy. He's basically the only one who still benchmarks games with a 5090. I get why they don't do it, most people have more practical GPUs, but I make bad financial decisions and I want to see how my overpriced toy is going to perform before I buy a game.
wsteelerfan7@reddit
It's more that he'll properly test GPUs with CPUs that don't bottleneck and the same way for the CPUs. He shows FG while also pointing out the caveats with latency and clarity. He also tests PT and RT settings separately when available when TechPowerUp and TomsHardware only list a generic custom RT scene.
Primus_is_OK_I_guess@reddit
Agreed. He also shows upscaling, which is great. That was just the main reason I started watching. I watch all his videos now because I'm curious how different hardware performs.
wsteelerfan7@reddit
I also like watching to find out who the next Bob is gonna be
Primus_is_OK_I_guess@reddit
Who hates on Daniel Owen and can they fight?
notrealtedtotwitter@reddit
It does; I'll tell my part. I have an RTX 2060 and a Ryzen 5 3500, and I'm actively running into CPU bottlenecks, especially if I want to stream games to my Steam Deck or do a few more things with my PC. If I had a 3600 I would be in a much better place, but my RTX 2060 is actually performing okay for 1080p gaming.
zensentsu@reddit
You want to be GPU-bottlenecked. If your CPU becomes the bottleneck at any point, you'll notice it immediately in the frametimes.
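A rough way to sanity-check which side is limiting you from a monitoring overlay (a sketch only; the 90% threshold and the example numbers are assumptions, not a rule, and real data would come from a tool like PresentMon or an Afterburner overlay):

```python
def likely_bottleneck(gpu_util_pct: float, fps: float, target_fps: float) -> str:
    """Very rough heuristic for reading a monitoring overlay.

    If the GPU still has headroom (utilization well below 100%)
    while FPS sits under your target, the limit is usually the CPU
    (or RAM, or an engine cap), not the GPU.
    """
    if fps >= target_fps:
        return "none"  # hitting target: no bottleneck that matters
    return "cpu" if gpu_util_pct < 90 else "gpu"

# e.g. a GPU idling at 60% utilization while FPS is stuck below a 144 FPS target:
print(likely_bottleneck(60, 80, 144))   # -> cpu
print(likely_bottleneck(99, 50, 60))    # -> gpu
```

This matches the anecdotes elsewhere in the thread: a GPU sitting at ~60% utilization under the target framerate usually points at the CPU.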
HankThrill69420@reddit
I wish people would stop plugging the 5500 when the 3600 can be had for less and has more cache, plus PCIe 4.0.
jhaluska@reddit
Well yes it does, but it brings up a good point about why balancing hardware is so difficult. It really comes down to the games you play and in what amount.
mashdpotatogaming@reddit
Yes, CPUs matter. You're probably watching misleading content. I'd say even a 5600X or 5700X will bottleneck the 9060 XT, even at 1440p in a lot of cases.
Being at 100% doesn't tell you the whole story. You can run Cyberpunk with path tracing at 1080p at absolutely abysmal performance, and it'll be GPU-bound, sure. But if you want to play it at decent framerates by dropping some settings, you'll quickly find that a 3600X has FPS drops quite often in CPU-heavy areas. In a lot of games you can use settings that are more than your GPU can handle, which puts more pressure on the GPU, but then your experience is compromised without much in return. You should look at the CPU as the limiting factor on how high your framerates can go. Your GPU might let you scale settings down to a point where the game looks and runs the way you want, but your CPU might cap you at a performance level you don't find good enough.
Play something like Helldivers 2 on higher difficulties and tell me the CPU isn't a bottleneck, when framerates drop all the way to 30-something FPS on a 3600X.
pwnrzero@reddit
Helldivers 2, Stellaris, FAF, Factorio, and Sins of a Solar Empire 2 all benefit from good CPUs.
These are all games I played on a potato of an outdated computer and now on an almost cutting-edge system. I saw tremendous benefits from upgrading my CPU.
MrTytanis@reddit
I previously had an RX 6600 XT + Ryzen 5 5500, and even with that combo my CPU bottlenecked my GPU in some titles at 1080p (for example Battlefield V and Cyberpunk 2077), so I upgraded to a Ryzen 5 5600 and that solved the issue. Then I upgraded my GPU to an RX 9060 XT 16GB, and at 1080p the bottleneck was huge, so I pulled the trigger and bought myself a proper 1440p panel, then overclocked my Ryzen, and that almost solved the issue completely. The bottleneck still occurs in some titles like Path of Exile 2, but it's not that common. So yeah, the 5500 is not good enough for an RX 9060 XT.
itchygentleman@reddit
i'm a simple man. i see the word "bottleneck" and i ignore the thread
lleyton05@reddit
You’ll often notice better 1% lows with a better cpu pairing
NewestAccount2023@reddit
At 4K the CPU doesn't really matter as much. In various games, a better CPU gives you better 1% lows at 4K while the average stays the same (which is still desirable); in other games, even the 1% lows are the same. Once you start using aggressive upscaling or lower resolutions (basically the same thing), the CPU matters more and more. The first-hand playable experience in some of those games would be noticeably different between a 3600 and a 5800X3D: despite average framerates being about the same, the 5800X3D will have fewer micro stutters and better 1% lows in general.
Euphoric_Lynx_6664@reddit
The CPU does matter. A better CPU will give you better 1% lows, which means your game will run smoother.
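For anyone unsure what "1% lows" means concretely, here is a minimal sketch. It assumes you have frametimes in milliseconds (e.g. exported from a capture tool), and it uses one common convention, the average FPS over the slowest 1% of frames; capture tools differ slightly in their exact definition:

```python
def fps_stats(frametimes_ms):
    """Average FPS and '1% low' FPS from a list of frametimes in ms.

    '1% low' here = average FPS over the slowest 1% of frames,
    which is one common convention; tools vary slightly.
    """
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)
    n = max(1, len(slowest) // 100)          # worst 1% of frames
    low_1pct_fps = 1000 * n / sum(slowest[:n])
    return avg_fps, low_1pct_fps

# A steady 10 ms frametime with one 50 ms stutter per 100 frames:
avg, low = fps_stats([10.0] * 99 + [50.0])
print(round(avg, 1), round(low, 1))   # -> 96.2 20.0
```

Note how a single stutter per hundred frames barely moves the average (~96 FPS) but drags the 1% low down to 20 FPS, which is exactly why averages alone hide a weak CPU.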
Funny-Carob-4572@reddit
Yes ...
I bought a 9800X3D for future-proofing. I have a 9070 XT at the moment, and when the 70-series and the AMD equivalent come out, the processor still won't be anywhere near bottlenecking the newer-gen cards.
Also, some games make use of the 3D V-Cache shenanigans magic stuff...
PsychologicalGlass47@reddit
Absolutely
Your GPU selection plays no part in your CPU load or whether you'll be pinned on draws. That's purely down to your RAM.
Primus_is_OK_I_guess@reddit
Nonsense. Higher frame rates = increased CPU load. A better GPU produces higher frame rates.
Also, what the hell does "pinned on draws" mean?
PsychologicalGlass47@reddit
Higher frame rates are obviously going to require more draws, but a 9060 XT is in no way going to reach numbers that anything short of an i7-7700 will struggle with.
Do you have no idea what draw calls are? DPS? The hell are you doing here?
Primus_is_OK_I_guess@reddit
That is complete bullshit. Even at 1440p an 11600k can bottleneck a 9060XT in some games. You're talking out your ass. https://youtu.be/NqRTVzk2PXs?t=724&si=MN36d_fIklpJEsa0
I have never heard anyone use "DPS" as a metric for gaming performance. I know what a draw call is, but each draw call can have wildly different overhead, so it would be an absolutely useless metric. That's probably why nobody actually uses it.
Stealthality@reddit
Those are very GPU-heavy games. I could put on CS and Valorant and ask "does the GPU really matter?" When I solely played CSGO (before the more graphics-intensive CS2), I upgraded my CPU three times before I upgraded my RX 470; at one point I had an i5-13600K with an RX 470, and it worked fine at 1280x960 low resolution (the most common pro resolution).
Similarly, though, I want to point out one thing that actually does mitigate a weak CPU: frame generation. Your CPU can push lower frames and the GPU can push higher with frame generation. So if "good enough" works for you and you don't mind the latency, you can use frame gen.
Jupiter-Tank@reddit
It really depends on the context. If you're playing competitive FPS or something else that requires a high refresh rate, or something like BG3/Anno/Stellaris, then you may be better served by a strong 5000-series AM4 chip, a moderate AM5 chip, or the equivalent Intel chip. In addition, though 4K and high graphical fidelity loads typically max the GPU before the CPU, there are often particle or population effects that can weigh heavily on the CPU.
Niko_Bellic99@reddit
IMO, a 5600/12400-level CPU is the minimum for that card.
Plenty-Industries@reddit
The CPU still matters when you're using upscaling tech like FSR/DLSS, which leans on the CPU even more (the higher framerates it unlocks mean more work per second for the CPU).
Just because you saw a few tests that show a GPU bottleneck doesn't mean there aren't games that are more reliant on CPU performance.
HayesBrewery@reddit
It's more about future-proofing. A 5600X won't bottleneck a 5070 Ti at 1440p, but you might notice it at 1080p. Then a year from now you'll notice it more.
A 9600X, on the other hand, will serve you well for many years.
Whiskeypants17@reddit
Yes. My 9060 XT/9600X seems to be getting CPU-bound at 1080p in both Baldur's Gate 3 and Space Marine 2, which, along with the Spider-Man games, seem to be the CPU-bound testing games these days. I can hit 200+ FPS in ARC Raiders with all the frame-gen gimmicks turned up, but in BG3 and SM2 it will fluctuate from 80 FPS to 200 FPS depending on background CPU loads.
XenourXS@reddit
I have an R5 3600; it's definitely not enough for 60 FPS single-player gaming, or 120 FPS multiplayer.
creativejoe4@reddit
It really depends on what you're using your PC for. Most games don't need a high-end CPU; for gaming, you'd usually want to focus more on clock speed than core count. For other purposes like digital design or programming, you may need something higher-end, but it really depends on the kind of work you're doing. Personally, I have an i9-14900K; I need both the cores and the speed for compiling heavy computational workloads, and it speeds up the process enormously. For example, what would normally take a mid-range CPU around 8 hours, the high-end CPU does in 30 minutes to 1.5 hours, allowing more work to be done in a shorter time frame.
Important_Sea_1136@reddit
The Ryzen 5 5500 is not a good CPU. I had it, and it caused constant stuttering in CPU-heavy titles like KCD1.
ShadowsGuardian@reddit
Like everything in life, the answer is that it depends.
Play a cinematic movie kinda game and you may not notice any difference.
Try an MMO, simulator or competitive game and it would be a whole different story.
Mr_Hyper_Focus@reddit
I experienced an insane CPU bottleneck. I kept old hardware for a long time and just replaced my i7-6700K with a 9800X3D, keeping my 2070. My FPS in ARC Raiders went up significantly; my GPU used to sit at like 60 percent utilization, and now it's maxed at 99.
It’s definitely real.
Urdnot_Flexx@reddit
That video doesn’t really tell the whole story. Also the 1% lows aren’t unimportant at all. There’s a reason reviewers test CPU’s at 1080p with 5090’s.
TheDreadfulSagittary@reddit
That video almost only tests GPU heavy games. It's likely a different story for esports or strategy games.
DuuhEazy@reddit
Depends on the game and on your target FPS. If you target 60 FPS you're probably fine; now say you want 90 real frames: the 5500 can't deliver that in most new games. Look at Cyberpunk: it can't go past 75 with RT and 90 without, and it's way worse in the most CPU-intensive areas (crowds, explosions, etc.), plus the 1% lows suffer.
CPUs don't bottleneck GPUs per se; they define the FPS ceiling. If you turn all the GPU-only graphics settings up to the point where the GPU can't go above 60 FPS, the 5500 won't bottleneck even a 5090.
That video isn't really testing CPU limits.
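The ceiling idea above can be sketched as a toy model (the numbers here are made up purely for illustration):

```python
# Toy model: the CPU caps how many frames per second it can prepare,
# the GPU caps how many it can render; you see whichever is lower.
def effective_fps(cpu_cap: int, gpu_cap: int) -> int:
    return min(cpu_cap, gpu_cap)

cpu_cap = 90  # hypothetical CPU-side limit for some game
# Lowering GPU-side settings raises the GPU cap, but FPS plateaus
# at the CPU cap once the GPU stops being the limit:
for gpu_cap in (60, 90, 144, 240):
    print(gpu_cap, "->", effective_fps(cpu_cap, gpu_cap))
```

This is why a benchmark run with heavy GPU settings (a low GPU cap) can make any CPU look "good enough": the CPU ceiling is never reached.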
agente4242@reddit
I have a slightly weaker GPU (5060 8GB) and still get bottlenecked in quite a few games on a Ryzen 5600G (similar to a 5500).
If you tinker with the graphics, you can find a middle ground where the GPU pushes as much as it can, which is probably what that person is doing in the video to minimize bottlenecks.
The thing is, though, there are many situations where there's simply nothing you can do. Competitive shooters, for example: I can max my graphics and still only be using 40-50% of my GPU, losing half my performance to a bottleneck.
Open-world games or unoptimized Unreal Engine ones as well: always losing 10-20% performance because the CPU can't handle physics and NPCs in time for the GPU to work properly.
That said, I always consider a stronger GPU a higher priority than a CPU. If you can get the 9060 XT now, absolutely go for it. You can always upgrade your CPU later, and still have great performance in the meantime.
WanderingGenesis@reddit
I personally experienced some bottlenecking of the 9060 XT by my 3700X, so I upgraded to a 5800XT. It's not horrific bottlenecking, but it was def noticeable in games like Deep Rock Galactic, Monster Hunter Wilds, and, surprisingly, Elden Ring.