Does i9-10900k hold up for 4k
Posted by Tr15t0n@reddit | buildapc | 83 comments
Curious if it's worthwhile to drop like 1K CAD on a mobo/9800X3D/RAM to pair with my 5080, or if it'll even really be noticeable vs my current 10900K. Frames in most games are pretty mint even on high/max at 4K (90 FPS+), but I do see significant FPS drops in CPU-heavy titles like Space Marine 2 etc. Assuming it's not too late for AM5, as AM6 is still a year (or more) out.
I also use my PC to run geology software among other work applications. Lots of tasks running at once, streaming data, etc.
Any insight would be helpful.
Cheers!
LOLXDEnjoyer@reddit
could you tell me which motherboard and ram you run your 10900K on?
Tr15t0n@reddit (OP)
Asus Z590-A mobo, and 3200 MHz RAM I think (32GB)
Jaded_Working_8551@reddit
CPU doesn't mean shit in 4K. A Ryzen 5600 will get within 5% of the performance of a 9800X3D in native 4K.
_Rah@reddit
Depends on the game. A 9900K in Satisfactory (1K+ hours save file) at 1440p gave me sub-20 FPS, all 8 cores pegged almost to the max. Going to 4K isn't going to suddenly make the game playable. Upgrading to a 7950X3D got me to 85+ FPS with the same GPU and settings.
It's certainly not true for most games, but if you were to give me this advice and I trusted it, I would be pissed with the results.
gakule@reddit
I do wonder if upgrading to 4k would have actually helped you.
The reason the person is saying that the CPU doesn't really mean much at 4K is that the GPU has such a heavy workload it can't outrun the CPU.
At 1080p especially, and sometimes at 1440p, the GPU gets through its work faster than the CPU can issue new instructions and has to wait for the CPU to catch up, which is why performance counter-intuitively (from a common-knowledge perspective) tanks.
4K is roughly 4x the rendering work of 1080p and 2.25x the rendering work of 1440p.
The fix is either to up your resolution or get a better CPU.
You didn't list your GPU, but I'm assuming it's a pretty high-end one; do keep in mind that the person you were replying to specifically mentioned 4K and not 1440p.
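If you want to sanity-check those ratios, it's just raw pixel counts (a rough sketch; it obviously ignores everything a real renderer does beyond pushing pixels):

    # Rough pixel-count math behind the "~4x / ~2.25x" figures above.
    resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
    pixels = {name: w * h for name, (w, h) in resolutions.items()}

    for name, count in pixels.items():
        print(f"{name}: {count:,} px, {count / pixels['1080p']:.2f}x 1080p, "
              f"{count / pixels['1440p']:.2f}x 1440p")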
_Rah@reddit
I had a RTX 3080 back then. I am running an RTX 5090 now. Still chose to be on 1440p with my 480Hz OLED, instead of 4k 240Hz. I play a lot of Counter Strike and decided to go with the higher frame rate than the resolution.
And when people say that increasing the GPU load will reduce the reliance on the CPU, that's not technically correct.
If the CPU can only output 15 FPS, for example, the GPU could be 10% used or 100% used and that won't change the fact. The reason people say to increase settings is that at that point you might as well, since your GPU is idle. Satisfactory is a UE5 automation game. There are thousands of machines producing things, with thousands of conveyor belts transporting things across the massive map, which is very CPU heavy, substantially more than an average game. Especially once your save game is over 1,000 hours old, where multiple people have been working for hundreds of hours building things.
I could have easily maxed out my GPU in those days by turning on Lumen and cranking other settings to max. I didn't, because there was no point. Sub-20 FPS was not playable no matter how pretty the game looked. The physics in the game were starting to break by that point, and I couldn't even shoot myself out of a cannon to move around the map; due to lag it just shot me out of the map, killing me.
Another example was Hogwarts Legacy. I was at about 100-ish FPS in the Hufflepuff common room. It stayed there whether I lowered the resolution or settings, or turned DLSS on/off. Increasing GPU utilization masks the CPU bottleneck by making the GPU an even bigger bottleneck; it does not solve the CPU bottleneck. So if your aim is to get more frames, going from 1440p to 4K will not get you more frames. If the CPU cannot get more than 20, that limit will not change. Certainly not enough to turn an unplayable experience into a playable one.
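To put toy numbers on it (purely illustrative, not benchmarks): the FPS you actually get is roughly whichever of the CPU or GPU is slower, so raising the resolution only drags the GPU number down; it never lifts the CPU's ceiling:

    # Toy model: effective FPS is capped by the slower of CPU and GPU.
    # All numbers are made up just to illustrate the point.
    cpu_fps = 20  # what the CPU can simulate per second (roughly resolution-independent)
    gpu_fps = {"1080p": 200, "1440p": 120, "4K": 55}  # hypothetical GPU throughput

    for res, g in gpu_fps.items():
        effective = min(cpu_fps, g)
        bottleneck = "CPU" if cpu_fps <= g else "GPU"
        print(f"{res}: ~{effective} FPS (bottleneck: {bottleneck})")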
gakule@reddit
I don't believe anyone said increasing the GPU load reduces the reliance on the CPU, but maybe I misread something you saw.
I do understand some games are more CPU bound, but just explaining situations in which the GPU may be too strong for a given resolution and CPU combo. Not debating your experience necessarily.
_Rah@reddit
It seems to be a common thing for people to say. Daniel Owen just put out a video within the last hour. I haven't watched it, but it seems like he is ranting about this exact scenario. https://www.youtube.com/watch?v=EBYHylB1ELs
I thought you were suggesting the same, which is why I responded the way I did. If you aren't suggesting this, then I'm not sure what you were trying to say.
gakule@reddit
Yeah so what I said is basically this:
The CPU needs to consume 50 apples to perform its calculations and hand them to the GPU, which then performs its own calculations and draws a picture. It takes the GPU 15 apples to do the drawing at 1080p, 35 at 1440p, and 60 at 4K.
At 15 or 35 apples the GPU finishes and comes back to the CPU for more instructions but has to wait; the CPU is bottlenecking the GPU. At 60 apples, the CPU has instructions ready and waiting for the GPU.
The FPS stuttering is what happens with bottlenecking at lower resolutions.
Not a perfect analogy, but it's roughly the real-world situation.
The CPU isn't doing less work, which is how you've incorrectly interpreted this chain of comments; the GPU is just doing more, and therefore not creating a bits-and-bytes logistical logjam.
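Running the apple numbers through a quick sketch (same made-up units, just to show who ends up waiting at each resolution):

    # The "apples" above treated as units of work per frame.
    cpu_work = 50                                    # CPU cost per frame, same at every resolution
    gpu_work = {"1080p": 15, "1440p": 35, "4K": 60}  # GPU cost per frame

    for res, g in gpu_work.items():
        frame_cost = max(cpu_work, g)                # the slower side sets the pace
        waiter = "GPU waits on CPU" if g < cpu_work else "CPU waits on GPU"
        print(f"{res}: frame costs {frame_cost} apples -> {waiter}")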
_Rah@reddit
Yeah, but even in your example, if the CPU can only generate 20 frames, your GPU using more resources is not going to make it generate more than 20 frames. Higher GPU usage can only slow the process down further, not bypass the limitation of the CPU.
gakule@reddit
I agree with you, but most games aren't CPU bound in that way. Getting a bigger GPU just makes it take fewer apples to complete a task. A CPU by itself is likely not limiting you to 20 frames, certainly not anything modern.
_Rah@reddit
I literally said that most games don't behave this way. But I have come across multiple games over the years that do.
And yes, a CPU is totally limiting me to 20 frames on something modern. That is my whole point: such games do exist, and people like me put hundreds or thousands of hours into them.
Here are the screenshots. It's comparing my i9-9900K to the 7950X3D. The game alone is using 60% of my 16 cores. Most people only have 8 cores total. Not to mention that the X3D is the saving grace here.
https://imgur.com/a/3ofnHz0
RighteousnessWrong@reddit
"And when people say that increasing the GPU load will reduce the reliance on the CPU, that's not technically correct."
You're missing the point. A CPU bottleneck will be more evident at 1080p than it will at 4K. So much so that this is typically how CPUs are tested in gaming (at 1080p), because at 4K they will all perform within 1-2% of each other once the limiting factor becomes the GPU.
For example, I own a Ryzen 7 7700. A 9800X3D will perform 5-25% faster at 1080p in pretty much all games, but it will perform almost identically at 4K (I say almost because there are some small differences in 1% lows). There's loads of tech review data to back this up if you feel like looking it up.
_Rah@reddit
Daniel Owen just put out a video within the last hour. I haven't watched it, but it seems like he is ranting about this exact scenario. https://www.youtube.com/watch?v=EBYHylB1ELs
_Rah@reddit
I literally gave an example of a game I have nearly 1,000 hours in that was down to sub-20 FPS. The 1% lows were in single digits. This would be true at any resolution; resolution did not matter. Some strategy games are the same: frames won't change, but the time the AI takes to think during its turn will significantly increase/decrease.
I have seen the data. But once again, my point was that increasing resolution will not improve the CPU bottleneck. It will just introduce a GPU bottleneck by decreasing the FPS even more. So a higher resolution does not help things; it just shifts the bottleneck and makes the performance even lower.
water_frozen@reddit
you were downvoted for this? wth
_Rah@reddit
No idea. People have this very fixed belief that all games care about is the GPU, and CPU performance is just a side thing. But I have had plenty of experiences that prove otherwise, from CS:GO, Satisfactory, and Vermintide 1 to even Hogwarts Legacy. Especially older games that aren't multithreaded well, like The Witcher 3 or Crysis, are just hard capped by the CPU.
You will have people say that no game uses more than 8 cores. Yet Satisfactory alone uses 61% of my 16-core CPU. Got screenshots to prove all this in case anyone is interested.
KillEvilThings@reddit
No one's running native 4k on the newest games lol. And in cases where you can, the CPU absolutely makes a difference.
SolomonG@reddit
I mean, that kinda depends on the game lol.
Say that in the middle of a boss fight in WoW.
Tr15t0n@reddit (OP)
This was my thought also, but if it's going to be years before the next gen drops, I'm not sure I want to limp it out another few years. Performance seems decent as-is, though. Need to do more 1% lows testing.
greggm2000@reddit
Next-gen (Zen 6) could be as soon as a year from now, but is more likely to be a year and a half (at CES in Jan 2027).
Jaded_Working_8551@reddit
Well, when next gen drops, it will be better than the current gen. Upgrading won't make a difference right now, so just wait till next gen when everything is better.
Tr15t0n@reddit (OP)
Prob the move ya. Cheers.
02mage@reddit
keep waiting and you'll never upgrade
SenseIndependent7994@reddit
Why stop there just wait for the next gen again
samusmaster64@reddit
It entirely depends on the game. There are plenty of games where it does matter a great deal, and an older 9th or 10th gen Intel 6-core CPU is going to severely hold back a 5080, even at 4K.
water_frozen@reddit
i'd love to see your sources for this
Scar1203@reddit
If you just look at averages sure, but CPU bottlenecks are where your 1% lows feel the worst. You can have a high average framerate and still be stuttering.
ImYourDade@reddit
Is that worth the 2.5x cost for the average user? Especially when a lot of games won't even see a tangible difference in average or 1% lows at 1440p+
ocka31@reddit
Average user doesn't play at 4K.
kapybarah@reddit
There are people who play at 4k with an average perception of performance. Those are the ones who won't notice a big difference in the 1% lows improvement
ImYourDade@reddit
Well the post is talking about 4k, so the average 4k user is implied and it's not very important there. And I even mention 1440p+ because imo at 2k it's already less impactful. What average user is the almost absolute top end CPU made for btw? I don't know of any product where the top end is made for the average user lmao
Scar1203@reddit
Average user? Probably not, but there's a lot of middle ground between a 5600 and a 9800X3D. The best price to performance for a gaming build right now is probably doing an AM5 build with a 7600, 7700, 9600, 9700, or 7800X3D depending on budget.
Unless it's absolutely necessary due to budgetary constraints I think doing a new AM4 build right now is a mistake, and I don't think a 9800X3D is a necessity unless you're going for a top-end build or play purely e-sports titles or competitive shooters.
matte808@reddit
Not necessarily true
InevitableKey3811@reddit
Bro those are going for $500 you’re getting scammed for four grand
Xlxlredditor@reddit
That is the resolution not the price
InevitableKey3811@reddit
$4k for $500 product does not sound like resolution to me. That’s just predatory pricing.
Xlxlredditor@reddit
He wants to play at 4k not pay 4k
InevitableKey3811@reddit
Yeah I wouldn’t want to pay $4k to play either
Xlxlredditor@reddit
Are you dense
InevitableKey3811@reddit
No, I’m Dennis.
supercakefish@reddit
I have 4K monitor. I typically use DLSS Quality. Upgrading from i9-9900K to 7600X3D almost doubled my average FPS (with an RTX 5070 Ti).
Tr15t0n@reddit (OP)
This seems wild; wouldn’t think you’d get these sort of gains. That’s awesome though.
supercakefish@reddit
It certainly exceeded my expectations. I underestimated just how CPU limited my PC had become in many recent games. Remnant 2 and Shadow of the Tomb Raider were two other examples where my testing showed big boosts in performance.
Jaded_Working_8551@reddit
That's because of upscaling. When you use upscaling, the game renders at a lower resolution, where the CPU matters, and then upscales to 4K. In native 4K it doesn't make a difference.
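For reference, the commonly cited per-axis DLSS scale factors work out to roughly this (a sketch; exact internal resolutions vary slightly by implementation):

    # Approximate internal render resolution when upscaling to 4K (3840x2160).
    # Scale factors are the commonly cited per-axis values for the DLSS modes.
    target_w, target_h = 3840, 2160
    modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

    for mode, scale in modes.items():
        w, h = round(target_w * scale), round(target_h * scale)
        print(f"DLSS {mode}: renders ~{w}x{h}, then upscales to {target_w}x{target_h}")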
LabAccomplished5401@reddit
Could you see an improvement? Yup.
Will it be monumental? Probably not.
Is it worth 1k CAD? Helllll no
kapybarah@reddit
Stick to the 10900k. A 5080 at native 4k will remain the bottleneck. Easily. If not, just run your games at a higher resolution and downscale to 4k (crazy to say that) and enjoy even crisper visuals. You'll still get 60+ fps all the time and fg can help alleviate the cpu bottleneck a bit
Tr15t0n@reddit (OP)
So many completely different takes on this matter I love it. Hence why I’m still pondering the decision. Clearly “do nothing” is the easiest decision 👌🏻. And I only see the CPU struggle occasionally. Overall it’s a non-issue. Dips here and there.
kapybarah@reddit
Plus, if you have the itch to upgrade, get an OLED display if you don't have one yet. I'll take 60 FPS on OLED over 120 on IPS any day of the week, for anything non-esports of course.
1LastHit2Die4@reddit
Unless a good deal for an AM5 CPU + MB comes out, I am sticking with a similar config to yours: I've got an i9-10850K paired with an RTX 5090 running at 4K 240Hz.
The gains are not that big. I checked the reviews and comparisons; it doesn't justify the cost versus the performance gain.
I am not waiting on AM6, as the price will probably be too high and motherboard stability will be shaky with the new CPU lineup.
Tr15t0n@reddit (OP)
I think at the end of the day, this is where I'm leaning also. Maybe we get some Black Friday bundles that tip the scale or something.
Jaded_Working_8551@reddit
Everyone who is saying they saw crazy differences in performance when changing CPUs is using upscaling. With upscaling, the game is rendered at a lower resolution, where the CPU matters. In native 4K the CPU doesn't make a noticeable difference, maybe in 1% lows if it's an X3D.
Naerven@reddit
CPUs don't care about what resolution you are using. That's what GPUs are for.
Cradenz@reddit
Just super not true lol.
1080p is more CPU-intensive than 1440p, and 1440p is more CPU-intensive than 4K.
greggm2000@reddit
It's more complicated than that. Yes, you'll get more FPS at 1080p vs. 1440p, and more at 1440p than 4K, but if you compare the CPU load when capped at, say, 120 FPS, 4K will have the highest because it's moving the most data around, what with larger textures and all the rest.
AMLRoss@reddit
It most definitely does make a difference. I went from a 5950X to a 9800X3D and gained 50% more frames in many games, at 4k. People saying it makes no difference at 4k are full of shit. I have first hand experience. X3D is no joke. (on the same GPU, 3090)
Tr15t0n@reddit (OP)
Oh damn. This was the flip-side answer I was looking for. I went from 3090 to 5080 which was a pretty beefy upgrade, you are making me ponder the team red swap 😎 happy gaming!
greggm2000@reddit
And it’s important to add that even at 4k, if you use DLSS upscaling, you’ll see most of the benefit that people playing at 1440p native or 1080p native will get.
Scar1203@reddit
It'll vary game to game. It's definitely not too late for AM5, I upgraded from a 13700k to a 9800X3D and you got way more use out of your 10900k than I did my 13700k.
Depending on how CPU-intensive your geology software is, you might look at the 265K as well. As a pure gaming CPU it's being avoided for the most part, but it performs about the same as the 9900X in CPU-intensive workloads and is close to the 9700X in gaming. I might get burned at the stake for saying that, but it is a decent value for a mixed-use PC since the price drop.
Tr15t0n@reddit (OP)
Great info, thanks 👍 Will look into these
greggm2000@reddit
There's no upgrade path with Intel, though; with AMD (AM5) you'll be able to upgrade to Zen 6, and maybe Zen 7 after that. And the rumors out there (which may be wrong) suggest a really substantial uplift in performance.
Scanoe@reddit
The 9700X is a great lower-cost CPU.
I upgraded to one after running a 7700X for about a year. The 65-watt 9700X beat my 105-watt 7700X in every way. Put the 9700X in 105-watt mode and voilà, an excellent lower-cost multi-core CPU to boot, and easier to cool than the 7700X on top of it all.
Yes, my 9800X3D is better in every way, but the 9700X is definitely a 'budget' king.
VaultBoy636@reddit
What most people forget is that an overclocked 265K will slot in between a 7800X3D and a 9800X3D in gaming, although you need fast RAM for it as well. Not everyone is going to overclock, but it's definitely a possibility, and you don't need to live in a freezer to OC the 265K, unlike some previous chips.
DesmoKerist@reddit
If you are playing at 4K, stick with your CPU. I have a 3090 ROG Strix paired with an i5-12600K and play games on a good LG 4K monitor. It runs smoothly; the processor doesn't matter much at this resolution.
Most-Giraffe-8647@reddit
Weird question; CPU and resolution aren't really related to each other.
Queasy-Experience251@reddit
It will do fine
GuyNamedStevo@reddit
The 10900K is comparable to a 5000-series chip from AMD (non-3D cache) in terms of performance per core. So a 5600X is (almost) as fast as a 10600K (the 10600K sucking over 200 watts on load, of course).
d3facult_@reddit
I don’t know what 10600K you have, but it certainly isn’t a 10600K if it can pull 200 watts
GuyNamedStevo@reddit
Well, its all-core boost TDP is 182 watts, so I guess you are right.
dakkottadavviss@reddit
I would say go for it if you're doing a 5080-class build. I moved from a 10900K up to a 9900X3D on a 3080 GPU and already saw differences. Shortly after, I got a 5070 Ti, and it's very clear how good the CPU is.
Basically it depends on the game. Baldur's Gate is my most played, and it's a huge difference. A big open world with lots of NPCs will be very CPU heavy. Very high FPS titles will need a top-end CPU.
Antenoralol@reddit
4K is mostly GPU bound.
bhm240@reddit
At native 4K, not much of a difference, but you are not going to be playing at native 4K most of the time in any modern game.
KFC_Junior@reddit
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/20.html
https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/21.html
Tr15t0n@reddit (OP)
Touché. Even though she’s deep down the list, still pretty comparable. Good visual.
RealBerfs1@reddit
Disable hyperthreading and overclock the shit out of it, it will give you better CPU performance than half of the CPUs available today.
Cradenz@reddit
Huh? 10900Ks usually top out at 5.1-5.3 GHz only, and 5.3 GHz is for golden chips. Diamond samples are able to get 5.4 GHz.
But this is a pretty old architecture.
RealBerfs1@reddit
If you keep hyperthreading on, yes. HT off, you can get 100-300 MHz higher depending on the cooling used.
Spooplevel-Rattled@reddit
I run 5.3 GHz with HT on my 10900K daily, with 4400 CL16 memory. It's very snappy still in 2025.
That said, its strength is now outmatched in games by newer CPUs, despite the fact that my setup has the same Time Spy CPU score as a 9800X3D.
kr1tz__@reddit
For a 5080, yes, it does make a difference.
DZCreeper@reddit
You answered your own question. Most games will be GPU limited with that combination, but the occasional CPU heavy title will benefit.
9800X3D is largely overkill. An R5 7600X is much better value for gaming, and leaves you the option for doing an X3D upgrade in 2-3 years time.
Tr15t0n@reddit (OP)
Thanks for some parts to scope. I figured if I were to have to swap to team red, I might as well go for it and "future proof" for a few years out. Leaning towards just holding out for next gen, as the 5080 is carrying the system pretty well as-is, 100-130 FPS in the BF6 beta for example.
Cradenz@reddit
Really depends on the game.
A 9800X3D will be a very nice upgrade regardless of whether a game uses the extra cache or not. However...
If you are still OK with how your CPU performs now, I would wait until the 10800X3D comes out.
Tr15t0n@reddit (OP)
This is what a smart man would do.