What's the point of upgrading CPU for gaming?
Posted by Cumcentrator@reddit | buildapc | 66 comments
So if I'm using an AM5 7600 and it's around 30% to 50% utilization in the games I play, what benefits would a 7800X3D or 9800X3D provide for me?
Better 1% lows? Or fewer of them?
More fps? If so, by how much?
Cheger@reddit
I can tell you that I have an i5 8600K overclocked to 4.5 GHz, and my fps in my main game, which is LoL, didn't change when I got a 6750 XT to replace my GTX 1080. So yeah, your CPU can be the bottleneck in many regards. Btw I get somewhere between 100 and 144 fps in LoL with that setup, which is abysmal.
RoawrOnMeRengar@reddit
The benefits will differ depending on the resolution you play at:
1080p: big difference, pretty big performance gap between bottom-of-the-line and top-of-the-line CPUs
1440p: lesser difference; there is one, but it's just less obvious and impactful
4K: as long as you have at least a Ryzen 5000 or Intel 12th gen or later, the difference will range from negligible to nonexistent.
Suchgallbladder@reddit
You are only telling half the story: what resolution do you play at? The further above 1080p you go, the more games will be GPU bound instead of CPU bound. I play at 4K and my CPU usage never climbs above 30% or so, but my GPU is always at or near 100%. The CPU absolutely helps fps at every resolution, but above 1080p the difference will be negligible unless you're using a very old CPU. I upgraded from an Intel 9900K to a current-generation AMD CPU and I'm seeing maybe a 1% improvement in gaming.
asdjklghty@reddit
I don't think the difference is negligible at a resolution higher than 1080p. CPU-intense games still run at their finest with a good CPU.
For a test I used Cyberpunk 2077 2.1. I tested with a Ryzen 7 5700X. The 1% lows were in the high 50s.
I tested again with a Ryzen 7 5700X3D and the 1% lows were in the high 60s. Quite a major improvement. The 1% lows are closer to the average framerate which leads to a more enjoyable experience.
That's the thing most gamers don't understand. A good CPU doesn't necessarily give you higher average frame rates, but it evens out the frame rates so you have a smooth experience.
TipTopMuffin@reddit
To play cpu bound games like Rust or Tarkov
BlightlingJewel@reddit
I went from an R5 3600X to an R7 5800X3D and my FPS in CPU-heavy games went up a lot with the same setup. LoL 120 → 300 fps, Valorant 150 → 400 fps, CS2 120 → 300 fps, PoE waaay better 1% and 0.1% lows.
Supertobias77@reddit
Some games require better CPUs. For example, Cities: Skylines 2.
Archernar@reddit
I upgraded from a Xeon v123 or so (can't remember the exact name) to a 7800X3D (among other things, obviously) because Path of Exile kept stuttering due to the CPU.
Other than bottlenecking, I would never upgrade my CPU as it usually is not that relevant in games. Depends on the genre obviously, most 4X are exactly the opposite.
ntlong@reddit
Faster computation is good, everything is snappier. Some games with heavy computations will benefit. Games with multiple objects moving on the field… will have less lag.
It depends on the games you play
szczszqweqwe@reddit
If you want a rough idea from a single number, check your GPU utilization.
ForThePantz@reddit
I invest in CPU and mainboard for longevity and efficiency. I spend what I need on GPU for performance.
liaminwales@reddit
The simplest way I can put it:
Think of the CPU as a limit on max FPS; think of the GPU as a limit on max graphics quality (and FPS, but to a lesser extent).
HUB just did a video that explains the topic well: https://youtu.be/O3FIXQwMOA4?si=ynVv2jBa-ObUvrVP&t=207
I'll also point out their CPU scaling videos; they show the same CPU with a range of GPUs.
Example: CPU & GPU Scaling Benchmark, Ryzen 5 7600 vs. Ryzen 5 5600: Is Zen 4 Worth It?
-UserRemoved-@reddit
30-50% utilization doesn't mean much considering not all games will use all your cores. If you have half your cores pegged at 100%, then overall usage will be 50% despite the CPU being the obvious limitation.
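To see this for yourself, here's a minimal sketch (assuming Python with the psutil package installed) that prints per-core load next to the overall average, so a couple of pegged cores can't hide behind a low total:

```python
# Minimal sketch: per-core vs. overall CPU utilization (assumes the psutil package).
# A game pinning 2 of 12 cores at 100% can still show a modest "overall" number.
import psutil

# Sample utilization over a 1-second window.
per_core = psutil.cpu_percent(interval=1, percpu=True)
overall = sum(per_core) / len(per_core)

for i, load in enumerate(per_core):
    print(f"core {i:2d}: {load:5.1f}%")
print(f"overall: {overall:5.1f}%  |  busiest core: {max(per_core):5.1f}%")
```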
A better CPU can provide higher max performance, better 1% lows, and more CPU overhead. In general for gaming, the CPU tells the GPU what to render and the GPU does all the rendering, so having an appropriate CPU can make a big difference.
It entirely depends on the exact game and settings. This is what benchmarks are for; gaming CPU benchmarks generally test more CPU-dependent games, since those games will show the largest margins for comparing differences.
GamingKink@reddit
Is an i7 6700K at 4 GHz holding back an RTX 2080 8GB by any chance?
Moscato359@reddit
Yes. But not by a massive amount.
GamingKink@reddit
Thanks.
JakeRay@reddit
Depends on the type of game and the performance you're aiming for. In most cases, I'd estimate that a 6700K would hold back a 2080.
But, if you're seeing GPU utilization at 95-100%, you're fine for now.
dafulsada@reddit
So even with a 10-year-old CPU, as long as the GPU is at 99%, I'm fine? Are you sure?
SupFlynn@reddit
That's where frame-time consistency and 1% lows come in, and how fluid an experience you need. Obviously at 4K ultra settings your CPU won't be a bottleneck for any of these, but at 1080p those are the graphs you should look at. Even with your GPU at 100% utilization, you can see your frame times and 1% lows fluctuating and not being what you'd expect; that's a CPU bottleneck. GPU utilization won't tell you anything on its own.
secretreddname@reddit
Nope. In WoW the GPU was always at 90%+, but going from a 10600K to a 9800X3D was a massive increase in FPS and 1% FPS mins.
myrlin98@reddit
You will always be bottlenecked by something. If your GPU is at 99%, that's your bottleneck. If you upgrade it (or play something less GPU-intensive on low settings), then you will likely start being more CPU-bound. If you then upgrade your CPU, the cycle repeats.
dafulsada@reddit
So I may have 99% GPU usage in one game but 80% GPU usage in another, more CPU-intensive game, am I right?
Gregardless@reddit
Ye
dafulsada@reddit
Skylake? Yes, it's a bottleneck for that GPU.
weqoeqp323@reddit
Depends on game, resolution, and framerate. If you're curious you can look up benchmarks of games you play on better CPUs.
chaddledee@reddit
On top of this, a GPU bottleneck is preferable to a CPU one because GPU load is fairly consistent frame to frame but CPU load can change dramatically, so having a better CPU leads to more consistent frame times.
CrateDane@reddit
In addition, CPU load can also fluctuate over time, so a core that shows 50% load may just be pegged at 100% load (and limiting performance) 50% of the time.
GingerB237@reddit
What this person said: if you have it available, track max core usage. I have a readout of all my water cooling temps and fan speeds etc., and it tells me which core is at the highest utilization and what that is. So in games I'll often be at 100% on the busiest core but 30% reported by Windows Task Manager.
vaQ-AllStar@reddit
1080 Ti + i7 7700K to the same 1080 Ti with an AM5 Ryzen 9 7900X3D: a performance increase of 30-60 fps depending on the game. What's the point? Probably that. I feel like most people don't understand how CPU-heavy most games are. A better GPU only matters for ray tracing or higher resolutions like 8K; in my case, the only time the GPU struggled was when I played racing games on triple 1440p monitors, i.e. a resolution of 7680x1440, and then it dropped into the low 35-45 fps range.
NewestAccount2023@reddit
Games can be CPU limited even when no cores are above 70%; CPU utilization is much more complicated than GPU utilization. The typical way to tell if you are CPU limited is by looking at GPU usage, because the GPU idles until the CPU hands it frame data to render; most often, if a GPU isn't at 98% or higher, it's because the CPU isn't giving it frame data quickly enough. A "CPU bottleneck" could be from slow RAM, or the CPU architecture not lining up well with the game's architecture, or the CPU simply not being powerful enough.
Aside from all of that, 1% lows are often significantly affected even in games properly at 99% GPU usage; games with similar averages but much better 1% lows will look smoother and feel better.
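For reference, the 1% lows mentioned here come from frame-time captures. A rough sketch of the calculation (the input list and the "average of the slowest 1% of frames" definition are assumptions; capture tools differ slightly in how they report this):

```python
# Rough sketch: average FPS and 1% low FPS from a list of frame times in milliseconds
# (e.g. exported by a frame-time logger; the input format here is an assumption).
def fps_stats(frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    # 1% low: average FPS over the slowest 1% of frames.
    slowest = sorted(frame_times_ms, reverse=True)[:max(1, len(frame_times_ms) // 100)]
    low_1pct_fps = 1000.0 / (sum(slowest) / len(slowest))
    return avg_fps, low_1pct_fps

# A few long frames barely move the average but tank the 1% low.
avg, low = fps_stats([6.9, 7.1, 7.0, 25.0, 7.2, 6.8, 7.0, 30.5, 7.1, 7.0])
print(f"avg: {avg:.0f} fps, 1% low: {low:.0f} fps")
```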
Moscato359@reddit
If you have a 7600, no new CPU in the world will have a large performance uplift for you, unless you have a 4090.
Kionera@reddit
It depends on the types of games you play. If you're mainly playing AAA titles, sure. However, in certain genres like simulation games, MMOs, shooters, etc., you'll still see a noticeable improvement even on midrange GPUs.
Mrcod1997@reddit
It's really best to look at GPU load to see if you are CPU limited. If GPU load is below 90-95%, then you are probably being limited by the CPU at the settings and resolution you use. It's not always a simple answer. Generally though, I would say a 7600 is enough CPU for most people in most games.
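If you'd rather log GPU load than eyeball an overlay, here's a small sketch assuming an NVIDIA card and the pynvml (nvidia-ml-py) package:

```python
# Sketch: sample GPU utilization via NVML (assumes an NVIDIA GPU and the pynvml package).
# A GPU sitting well below ~90-95% while a game runs usually points at a CPU limit.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

for _ in range(10):
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)
    print(f"GPU: {util.gpu:3d}%  memory controller: {util.memory:3d}%")
    time.sleep(1)

pynvml.nvmlShutdown()
```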
looopious@reddit
Usually just fears of bottlenecking. It really doesn’t affect much unless you have an ancient cpu or you are always lagging because of 100% utilisation.
NeighborhoodOdd9584@reddit
No one knows the answer, depends on the game and settings and your GPU. Way too many variables. But the most noticeable thing will be the lows by far.
lazava1390@reddit
Yeah those 1 percent lows can literally make or break game stability and immersion. I refuse to play a game that stutters to a crawling fps only to jump back up to smooth only to go back down again. I think that used to be the biggest reason I went Intel back in the day because the FX CPUs were just god awful with their 1 percent lows.
ccfoo242@reddit
Until relatively recently, many game engines ran their main loop in a single thread. So, only one core of your cpu was doing the majority of the non-gpu work.
Unity added a feature called DOTS that, among other things, lets the developer run their code in multiple threads. The main loop is still one thread, but it can schedule 'jobs' at the beginning of the frame and then do something with the results at the end. This allows for features like many thousands of animated NPCs without slowing down the game.
I know less about Unreal Engine, but reading their release notes I see that they've been moving more and more work off the main loop thread.
One of the things that's done is to use code running on a worker thread to send parameters to the GPU (such as after one of Unity's jobs runs), so a faster CPU on a system that has a faster connection to the GPU will speed up everything.
I probably didn't explain it perfectly, but I hope that helps.
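Very roughly, the pattern looks like the sketch below. This is not Unity's or DOTS's actual API, just the general "schedule jobs at frame start, collect results at frame end" idea expressed with a Python thread pool:

```python
# Conceptual sketch of the "schedule jobs at frame start, collect at frame end" pattern.
# Not Unity/DOTS code; just the general idea using a Python thread pool.
from concurrent.futures import ThreadPoolExecutor

def update_npc(npc_id):
    # Placeholder per-NPC work (pathfinding, animation, AI...).
    return npc_id * 2

pool = ThreadPoolExecutor()
npcs = list(range(10_000))

def run_frame():
    # 1. The main loop schedules jobs at the start of the frame.
    futures = [pool.submit(update_npc, npc) for npc in npcs]
    # 2. The main thread does its own work (input, game logic) while jobs run.
    # 3. Results are gathered at the end of the frame and handed to the renderer.
    return [f.result() for f in futures]

frame_data = run_frame()
print(len(frame_data), "NPCs updated this frame")
```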
meteorprime@reddit
My 240 Hz display was struggling to stay locked with my 3080 Ti and 9600K because DLSS can increase CPU load.
Now it stays at 240.
Vivid_Promise9611@reddit
As the other guys have said, there are a lot of good reasons to upgrade a CPU for gaming. However, in your case, there is no reason to upgrade, UNLESS you were noticing stutters (lows).
A 7600 will push anything up to a 7900 XTX / 4080 Super to its max at 1440p, and it does so with pretty good 1% lows.
Striking-Fan-4552@reddit
More specifically, stuttering or uneven movement speed WITHOUT a drop in fps.
dafulsada@reddit
No game uses every core or every CPU resource. The 7800X3D has much, much more L3 cache, so many more frames.
Tehu-Tehu@reddit
depending on the game, you can get bottlenecked by your weak CPU if your GPU has headroom
AAMust@reddit
Bottlenecks
Ok_Seaworthiness6534@reddit
Most games are capped by a single thread, so don't look at all-core utilization, check per core. But to make it easier: right now you've got a 7600, which is a pretty good CPU. In a few years you might see your games' fps suddenly dropping a lot in highly populated scenarios like cities in MMORPGs or open-world/multiplayer games, or even stuttering/low GPU usage. That's when a new CPU comes in :)
Care_BearStare@reddit
The CPU utilization can be misleading since games do not use all CPU cores. Depending on the game, your 50% utilization may be maxing out the one or two cores the game actually uses.
I've made two upgrades on my current rig. The first was moving from a 5700 XT to a 6900 XT while using my 3600 CPU. I ran a solid OC on the CPU for years. While I did see a good bump in average FPS with just the 6900 XT, I always had stutters from 1% low spikes with both GPUs while using the 3600. These stutters are not to be confused with texture/shader stuttering in Unreal Engine games; UE is notorious for load-in stutters, and that's an engine issue in many games. I had stutters in all games regardless of engine. Not game breaking, but noticeable. I then upgraded my CPU to a 5800X3D. That chip was a huge bump in performance for me. Avg FPS increased a decent amount, and my 1% lows are nearly nonexistent. I felt the CPU upgrade much more than the GPU while gaming.
You don't need the best CPU on the market, but if you're building new, I would go with one of the AM5 X3D chips. I'm currently waiting for NVIDIA to release the 5000 series GPUs. I'm running 4K now, so I think I may go with a 5080 or 5090; actual price and performance will make that decision. I do not plan to upgrade to AM5. I might add another 32GB of RAM for 64GB total. I believe my 5800X3D will keep up with the new GPU. I might look at what comes in the next gen of CPUs, if I feel the need.
Naturalhighz@reddit
You got some good responses, but let me put it like this. My main game was WoW. I had a 1600X and a 1060. Upgraded to a 3070 and saw basically no difference in my fps. Upgraded to a 3700X and doubled my fps, then upgraded that to a 5800X3D and doubled it again.
logangrowgan2020@reddit
There are going to be a lot of cute answers here, but at the end of the day the GPU is the big driver of game performance.
krauserhunt@reddit
Try playing Kingdom Come: Deliverance and your CPU will be at 90%.
Grat_Master@reddit
Go check your GPU's benchmarks at the resolution/settings you intend to play.
Go check 1080p CPU benchmarks of the CPU you already have.
If the GPU's average fps is higher than the CPU's average, a CPU upgrade could help. If the GPU average is under the CPU average, a GPU upgrade could help.
Do this if you don't meet the average fps of your monitor's refresh rate.
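As a sketch of that rule of thumb (the fps numbers are placeholders you'd read off GPU reviews at your resolution and 1080p CPU reviews):

```python
# Sketch of the rule of thumb above; the fps values are placeholders from reviews,
# not measurements.
def upgrade_hint(gpu_avg_fps, cpu_avg_fps, monitor_refresh_hz):
    if min(gpu_avg_fps, cpu_avg_fps) >= monitor_refresh_hz:
        return "You already meet your refresh rate; no upgrade needed."
    if gpu_avg_fps > cpu_avg_fps:
        return "GPU can outrun the CPU: a CPU upgrade could help."
    return "CPU can outrun the GPU: a GPU upgrade could help."

print(upgrade_hint(gpu_avg_fps=160, cpu_avg_fps=120, monitor_refresh_hz=144))
```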
Atcera95@reddit
Play something like Dragon's Dogma 2 and find out exactly why.
MrMusAddict@reddit
If you play any simulation games, like Stellaris or Civilization, then you could see massive late-game gains in them. For example, Stellaris is programmed to run at an ideal pace of ~90 sec per simulated year. However, that's the lower bound. There's technically no limit to taking longer, and once you get to mid/late-game, you might be looking at more like 180+ seconds per year.
Gamers Nexus has shown that in Stellaris, when you remove the 90 sec/year simulation-time cap, the 9800X3D can simulate an end-game save file at 25 sec/year. In other words, it's much, much less likely for you to experience a slowdown later in the game.
There are also some rarer instances in normal games where a faster CPU means more FPS. Digital Foundry shows that in Cyberpunk at Ultra RT / 1080p / DLSS Performance mode, the 9800X3D has about 20% more FPS.
Otherwise, the GPU is more likely to be your main bottleneck.
AbrocomaRegular3529@reddit
If you are aiming for 300+ fps.
If the games you play depend on cache, for example Cities: Skylines, or games like Squad, etc.
Overall better 1% lows.
For most people they aren't that necessary. But then again, CPUs last 5-6 years before you need something more powerful, so why not pay $50 more and get a much better one?
MrKiltro@reddit
The answer is a big ol' it depends. Mainly, it depends on your GPU and the games you're playing.
If your GPU is relatively weak compared to your 7600 (i.e. you're GPU bottlenecked) AND/OR the game you're playing isn't CPU bound, the improvement will be low, or even negligible. Maybe 5%ish (7600 vs 7800x3d).
If your GPU is relatively strong compared to your 7600 (i.e. you're CPU bottlenecked) AND/OR the game you're playing is heavily CPU bound, the improvement will be much more significant. Maybe 20-30%ish (7600 vs 7800x3d).
Long story short, look at benchmarks. There's tons on YouTube. Try to find one that's similar to your setup and plays games similar to what you play.
For example: https://youtu.be/yeRZ8DnwG5c?si=u6KR6AfvZ9XRBMGS shows the 7800x3d gives you 10-20% more FPS over the 7600 if you've got a beefy GPU.
chy23190@reddit
Go check benchmarks on YT, you will see what the benefits are.
Indystbn11@reddit
That's a very incorrect way of thinking. Utilization percentage doesn't equate to fps.
ltecruz@reddit
You're probably not seeing single-core usage either. It will also depend on your GPU.
Admiral_peck@reddit
You need to see per-core utilization; most games don't use more than a few cores, so you'll never see more than about 70% total usage if you're looking at the percentage for the whole CPU.
That said I'd be checking what the utilization is on your GPU before considering upgrading.
Impossible_Okra@reddit
To post on Reddit and get Internet points. Also benchmarking the shit out of things for .1 % increases
9okm@reddit
If it's at 30 to 50 then you're likely GPU bound and a CPU upgrade would do nothing.
Need more info.
dabocx@reddit
Utilization isn't always the best marker. If the game is hitting only 1 or 2 cores while the others wait, it might show that you are only using 20-30% even if the CPU is actually holding you back.
But yes, something like a 7800X3D would show big gains in 1% lows and in CPU-heavy titles like BG3 Act 3 and strategy games.
jcalvert289@reddit
Sorry to be that guy, but there is a LOT of information about this online, either through benchmarkers/reviewers like gamers nexus, or the many people before you that have asked this question.
Do some research, and if still confused, it could be worth bringing back some more specific questions to the sub
aussiesam4@reddit
The CPU is more relevant at lower resolutions; higher resolutions will depend more on the GPU. Either way, you should check a bottleneck calculator when choosing components to see if there will be any issues. Sometimes it makes a big difference. Other times your CPU might already be more than enough.
Crafty-Photograph-18@reddit
Do you have a GPU bottleneck?
R1zzMazt3r7000@reddit
Some games and higher FPS require more CPU power, that’s it.
lyons4231@reddit
Depends on what your GPU is and what resolution you play at.
WarringPigeon9000@reddit
If you play with your gpu at 100%, you won't see a benefit. If you play on a low resolution with high fps where the cpu is the limiting factor instead of the gpu, a better cpu will give you more fps. I'm not sure exactly how much more tho.