RTX 5070 Ti – Ryzen 7 7800X3D or 9800X3D, is it worth the extra cost?
Posted by Advanced-Paint9257@reddit | buildapc | View on Reddit | 88 comments
Hi, I’m building a new PC with an RTX 5070 Ti and I can’t decide whether to go with the Ryzen 7 7800X3D or the 9800X3D. I’ll mostly be gaming at 1440p but I also want to use it for VR with Google VR or Meta Quest 3. The price difference is around 300–400 PLN. Does anyone have experience with these CPUs and can tell me if it’s worth paying extra for the 9800X3D and which one would be better in the long run?
SirFunktastic@reddit
A 7800X3D is enough if you're looking to save but I got a 9800X3D with my 5070 Ti because I want my rig to last a good 8-10 years without my CPU being the bottleneck.
Tyraid@reddit
Just curious: if you had bought a top-end CPU ten years ago, would that still be true today?
MrFartyBottom@reddit
All depends on the games you play, but no 10-year-old CPU holds up for the latest AAA games. Will a 9800X3D hold up in 10 years? Unlikely, but it all depends on what games people will be playing in 10 years and what kind of CPU advancements happen over that time. There was a 6-year stretch where no major CPU improvements happened.
Careful_Sea9444@reddit
The Ryzen 7 3700X came out around 7 years ago and I've heard it still works quite well at 1080p. It's not quite 10 years, but advancement hasn't been as quick as we think.
Tyraid@reddit
Yeah, I didn't think so. I was sort of hoping someone could name a specific CPU model that could still hang today.
kermityfrog2@reddit
I was trying out the Space Marine 2 demo on my ancient 2600K and it was still working at a 40 fps minimum at 3440x1440. I think people underestimate how much the curve has flattened over the years. I'm going to upgrade now because Windows updates are starting to break my motherboard and other old drivers.
LouBerryManCakes@reddit
Lol, I upgraded from an i7 4790 to a 9600X-based build like 6 months ago. I didn't think hardly anything older than that was still running. Well done!
3VRMS@reddit
Was talking about my 4690k a while back and someone mentioned still using their i960 for things. :P
kermityfrog2@reddit
i7 4790 is running my Plex server/HTPC!
rofon345@reddit
I upgraded just this last weekend from an i7 8700k/1080. Of course I wasn't playing AAA's on max settings, but it held up until now without giving me any trouble
xmkgenzo@reddit
Same here. My old rig (i5 6600K/1070) was running like a champ until the forced Win 11 upgrade; that's an almost 10-year-old CPU. The only thing I did was upgrade the RAM to 32 GB.
I built my new rig (9800X3D/5080) a few weeks back and the plan is to keep it for several years. Might upgrade the RAM at some point if needed; at the current rate of price increases it might cost as much as my whole PC in just a couple of years LOL
_PacificRimjob_@reddit
my spouse's rig is still rocking my old 6700k/1080 Ti. They also basically just play their Steam Deck now which is roughly what the rig is capable of on the occasional impulse to play something like HOMM3 or Rise of Nations. I wouldn't play anything competitive on it but don't let reddit's standards fool you, games are plenty playable on older hardware as long as you set your expectations correctly.
karmapopsicle@reddit
I think the big difference is whether that “8-10 year” span includes a GPU upgrade. That time period is roughly what it takes to go from top-spec to minimum spec in AAAs.
What some people are really asking about is whether buying the higher end CPU means they can drop in a new GPU of a similar tier in 6-7 years down the line and keep using it for 3-4 more years as-is. At that point though even the entry level chips from newer contemporary platforms will likely be substantially more powerful and better matched for the new hardware.
boz271@reddit
I had an i7 4790K, which came out in 2014, paired with a 2080 Super. At mid/low 1440p settings with DLSS I got 60ish fps in Black Myth: Wukong, so it's not terrible, but it definitely struggles. It especially struggles in CS2, somewhat in Fortnite, and definitely in Rust, where it would dip to 40 fps and often stay there; otherwise it was 60 fps max in a server.
ime1em@reddit
I think the 5800X3D has the potential due to the extra cache. Would need to wait and see.
GoatShapedDestroyer@reddit
I ran an 8700K until this year and it handled up to 1440p pretty admirably in a lot of games. Obviously it varied game by game, but it was sufficient.
Lazuf@reddit
What are you talking about? The 6950X holds up, so does the 7700K, and even slightly older chips like the 4790K/5775C/5960X or the 9590 from AMD would play AAA titles at 4K60 lol
SexBobomb@reddit
A lot of those Intel chips won't take Win 11, which, as far as crappy self-inflicted limitations go...
blackkkrob@reddit
I'll be playing Old School RuneScape, should be fine. Probably still won't have 99 Runecrafting though.
HPowner0@reddit
Another important consideration nobody has mentioned yet: Moore's Law is starting to decay in practice due to physical constraints, i.e., CPU performance will soon stop doubling every 2 years, as the law dictates. https://penntoday.upenn.edu/news/penn-engineering-moores-law-really-dead
Because of this, the difference between CPUs 10 years ago and today is going to be much larger than the difference between CPUs today and in 10 years, unless a significant breakthrough in chip architecture occurs, I guess.
3VRMS@reddit
"CPU transistors will soon stop doubling every 2 years."
Gordon Moore
Meanwhile Intel, who aligned their tick-tock business model on their co-founder Gordon Moore's prediction, was stuck on their 14nm++++++++++++++++++ for years 😂
the-sexterminator@reddit
Moore’s Law is used a bit misleadingly here. It was about transistor density and cost, not a guaranteed doubling of CPU performance every two years. While transistor scaling has clearly slowed, performance gains haven’t stopped altogether; they’ve shifted toward architecture, parallelism, chiplets, and specialization. It’s plausible the last 10 years saw bigger raw gains than the next 10, but that’s not a direct or inevitable consequence of Moore’s Law “decaying”. It depends heavily on workload and design choices, not just physics.
A good parallel is Dennard scaling. When it broke in the mid-2000s, CPU clock speeds largely stopped increasing, but overall CPU performance continued to rise. For example, an AMD FX-4170 (2012) has a ~4.2 GHz base clock, while a Ryzen 7600X is ~4.7 GHz. GHz barely changed, yet real-world performance is vastly higher due to IPC gains, more cores, and better architectures instead of higher clocks.
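A back-of-the-envelope way to see the shape of that argument: treat throughput as clock × IPC × cores. The IPC values below are made-up round numbers purely for illustration, not measurements of the actual chips:

```python
def throughput(ghz, ipc, cores):
    """Crude aggregate throughput proxy: billions of instructions per second."""
    return ghz * ipc * cores

# FX-4170-style: high clock, low IPC, 4 cores (illustrative IPC, not measured)
old = throughput(ghz=4.2, ipc=1.0, cores=4)

# 7600X-style: barely higher clock, far higher IPC, 6 cores (illustrative IPC)
new = throughput(ghz=4.7, ipc=3.0, cores=6)

print(f"{new / old:.1f}x")  # ~5x the aggregate throughput at nearly the same clock
```

The clock term contributes almost nothing to the ratio; nearly all of the gain comes from the IPC and core-count terms, which is the point about post-Dennard scaling.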
HPowner0@reddit
Can you explain what you’re attempting to argue here?
Firstly, it seems to me that you’re repeating what I said but specifying that it’s actually the transistor count that doubles (with performance growing exponentially, which no longer holds true, as it’s now down to linear growth). That's obvious given that it’s the first sentence of the only article I linked, and I intentionally did not get into the distinction for brevity’s sake.
Secondly, your only argument against my claim is that other factors such as chip architecture could make a difference to fill in the gap left by a slowing Moore’s Law. I very clearly already said this.
Finally, yes, Moore’s Law is “decaying”. There is a clear reason for this rooted in physics, despite what you have said. Transistors are now approaching single-atom dimensions (around 0.2-0.34 nm for silicon/carbon), physical limits where quantum effects (electron leakage) and manufacturing challenges (positioning individual atoms) make further shrinking difficult. Please at least bother to do a quick Google search before commenting again; there are hundreds of articles and peer-reviewed papers on this very topic.
the-sexterminator@reddit
redditard I literally work in a semiconductor lab. some of u business major retards are insufferable. stop watching LTT and repeating random shit from them
cheetosarelife@reddit
This sounds wrongly worded - do you mean the difference is going to be smaller unless a major breakthrough happens?
3VRMS@reddit
I have been using a 4690k for 10-11 years, recently upgraded to 9800X3D because I need more RAM for my use case.
For almost everything I do, felt no difference CPU performance wise. If you swapped the system and matched the RAM quantity, I wouldn't be able to tell.
There's still people rocking 2500k and gaming just fine today for what they want. Meanwhile, there's likely going to be a lot of people buying the 9850X3D when it comes out, and honestly, heck yeah, if you can afford it and it's what you want, go for it.
These things are highly subjective, especially once you start dragging it out from year-to-year, to decade-to-decade. Personal preference and use case both today and how it changes in 10 years matter a lot on whether a processor is good for one person or not.
I also tend to use my phone for 7-10 years before I bother "upgrading" to a phone that's a few generations behind the latest.
The majority of people tend to use old, outdated hardware for a very long time and it's perfectly fine, sometimes still overkill for their use case.
This isn't a bash towards people who upgrade every year. It's simply pointing out that how long something can last is highly dependent on the person, especially if you're trying to predict our interaction with tech a decade later. The person upgrading every year for the past 10 years certainly didn't see their system as enough, but they might get tired and stick to one for 15 years.
boodopboochi@reddit
Nope. But that's because 10 years ago Ryzen did not exist and 4-core/8-thread Intel chips were king. In my opinion, innovation isn't a product of time, but rather of competition. New generational leaps require a competitive landscape where companies try to outperform each other. When there's no competition, there's no incentive for companies to innovate.
So if you expect there to be low competition for 10 years, then the top chips of today will become obsolete slower. If you expect high competition, then obsolescence happens faster.
I ran an 8700K/1080 Ti build from 2017 until Nov 2025 and the jump to a 9800X3D/5070 Ti is staggering.
ToxicInhalation@reddit
Betting against Moore's Law
CalmAndSense@reddit
My 6700K/1070 still plays WoW and the occasional modern game at low settings.
blackknight16@reddit
10 years is possible, but it depends on when in the refresh cycle you do your build.
Looking back, 10 years ago was the rough period of Intel pushing 4C/8T at the top end for multiple generations, with AMD hardly in the CPU competition. The legendary i5-2500K (4C/4T, 2011) was viable for a long time after its introduction. However, an i7-7700K (4C/8T, 2017) would become dated far more quickly as 6- and 8-core CPUs became prevalent and games started utilizing more cores in the 2020s.
Remains to be seen whether a powerful 8C/16T CPU will remain viable for a long time or become obsolete as future CPUs move to higher core counts.
SexBobomb@reddit
Speaking from having a 2700X in my second system and knowing a dozen people still on them: they still work fine. Not superlative, but they can game without causing problems.
arsci@reddit
The i5-2500K was indeed legendary, as you mention. Installed it and immediately overclocked it by an additional 1.2 GHz, rock stable. Held strong through the Shintel hell of releasing the same 4-core CPUs year after year, until AMD arrived to save us.
SexBobomb@reddit
The main reason they don't is Win 11 and the industry finally jumping to more than 4 cores for most people. Realistically though, you can still game on a Zen+ chip like the 2700X, which is going to be 8 years old in April.
e_xTc@reddit
I've been rocking a Core 2 Quad Q9550 from 2009 until 2019. Worked great through two GPUs: GTX 295, GTX 770.
Best part though was going from a mechanical HDD to an SSD.
Lazuf@reddit
I mean, a 10 year old CPU WILL run AAA games if it was top of the line at the time. Intel had the 5960X on X99, the i7 7700K on Z270, and the 6950X on X299.
All three of these would hold up today if the target was 4K60.
But even older machines: the FX-9590 from 2013, the i7 4790K from 2014, the i7 4960X from 2014, the i7 5775C from 2015.
All of these top-end CPUs will still run AAA titles at 4K60 today.
joe1134206@reddit
Implying we'll get half the improvements we got from 2015-2025 as we will in 2025-2035...
JustJohnItalia@reddit
The Ryzen 2600 I have in a machine does just fine. It runs basically everything at decent settings with a stable 60+ fps, paired with a 6700 XT and 2x8 GB of DDR4 CL15 3000 MHz RAM.
It couldn't run everything with a GTX, but then again it did fine on Cyberpunk, for example.
redrubberpenguin@reddit
Definitely not, but it definitely feels like we're headed for stagnation in desktop CPU technology. At this point I wouldn't be surprised if the 9800X3D actually would still be viable in 7, 8, 9 years.
_gabber_@reddit
I bet it won't, but it will certainly last a good five. The 5800X3D now is still very strong and enough even for the most demanding games.
ImYourDade@reddit
Brother I hate to tell you this, but even lower end cpus from generations before the 5800x3d are enough for games today. Obviously the performance is better with newer stuff, but plenty of older chips get more than enough performance with recent titles
Trollatopoulous@reddit
I had a 6800K until recently and it was more than fine for 60 fps (AAA; >120 in esports). Moreover, advances in CPUs (and software support) have stalled, so a 9800X3D will more than be able to last 10 years. Just think about the fact that the next console's CPU will still be nowhere near it, and even the PS5 will keep being the primary platform for another 7 years.
basement-thug@reddit
No
IllustriousPace8805@reddit
The 9800X3D is a single-digit gain over the 7800X3D; when the latter is struggling, so too will the former. Irrelevant.
10 years is 100 when it comes to tech. You think you'd be happy with a 4-core 6700K right now?
I mean, it's possible. I went from a 3770 to a 5800X3D, but I was primarily playing WoW. If I had been trying to play Cyberpunk on a 3770, different story.
Errorr404@reddit
8-10 years might be a bit too ambitious. 3D stacked transistor design is close (2-4 years), along with photonic links also being developed, which can expand multi-chip design.
SIDER250@reddit
Hardware Unboxed showed around ~10% performance increase over the 7800X3D, and that's at 1080p. That 10% extra performance won't last you extra years honestly, but it might give you headroom to play a game at 60 fps vs 50, if that helps. If it were a real generational uplift, I'd agree. But the 9800X3D is barely any faster than the 7800X3D, unless we talk about very specific games that take advantage of the 3D cache. If you take 1440p into consideration, the gap becomes even narrower. By the time the 7800X3D is obsolete, the 9800X3D will be too.
YeehawBogus@reddit
Fully agree with this. If you have money to spend, go for the 9800X3D, but the idea that it's gonna last a lot longer than the 7800X3D, so it's good for future-proofing, is just wrong. They're both very comparable, and by the time the 7800X3D starts bottlenecking people's PCs, so will the 9800X3D. They're both amazing gaming CPUs, so you can't go wrong with either one; just choose whichever works best with your budget.
JirachiWishmaker@reddit
The 9800X3D running 10°C cooler is a bit of a nice value-add, for what it's worth.
tuskernini@reddit
I don't understand this. Correct me if you're planning to play competitive CS:GO at 540p, but for 2025/26 AAA games your CPU will not be the bottleneck unless you game at 1080p or below (and even then the 7800X3D would rip). For games in the next 5-6 years the 5070 Ti will be the bottleneck, so you'll probably want a new GPU, at which point even the 9800X3D will become the bottleneck going forward.
Sure, I'm exaggerating, but it's to expose the flaws so they're easier to see. Spend on the monitor and GPU, then buy the CPU that's right for the rest.
beirch@reddit
The 7800X3D is 8% slower on average. They'll both be obsolete at the same time.
Lurking_From_Shadows@reddit
If I had to guess, we are reaching silicon limits pretty soon and with the AI push we will soon find an alternative solution to computing. In 10 years cloud everything might take precedent over physical hardware for all we know. Buy what you want or can afford and enjoy it. Who cares what comes out tomorrow? Gotta enjoy the now!
AD1SAN0@reddit
Yeah… not gonna happen. In 10 years it will be what the 1800X is now.
basement-thug@reddit
That system is going to be waaay outdated and irrelevant long before 8 to 10 years.
MightyYuna@reddit
Do you play CPU intensive games? Do you use a high refresh rate monitor? Do you play competitive games? If you answer any of these questions with yes, then go with the 9800X3D. Otherwise go with the cheaper option.
People only look at average FPS, which is not a good metric, when comparing these two cpus.
Also the 9800X3D runs cooler and can be overclocked.
-blueberry-@reddit
this
CS2 was the reason I bought the 9800X3D. The 1% lows especially are the big difference between the 7800 and the 9800: 1% lows of 180-190 fps for the 7800 vs 260-280 for the 9800.
NisshinJampKo@reddit
When people say competitive games, do they just mean multiplayer versus games? I play Deadlock and Arena Breakout (MOBA and extraction shooter); would the 9800X3D benefit those over the 7800X3D?
snmnky9490@reddit
More realistically, it means high FPS fast-paced games
Makerudjl@reddit
Yep, usually the CPU difference is most visible in these multiplayer games, while in graphically heavy single-player AAA titles the CPU difference is a single-digit percentage.
JeremyJoeJJ@reddit
Games you play at 300+ FPS with no stutter at all like valorant, cs2. Anything graphically intensive your gpu will matter more. At 1080p and high frame rate or cpu intensive RTS games you want a great cpu.
Straight-Health87@reddit
The 7800X3D will do. With that GPU, you’ll play 1440p/4k. No difference between 7 and 9 at those resolutions. A few fps here and there, at most.
rembakas@reddit
get 9800x3d and 9070xt. Why 5070ti..
Ants_r_us@reddit
7800x3d now or wait for zen 6 to come out next year (it looks like it's going to be a big improvement).
Chalanger1994@reddit
Zen 6 starting MSRP will be $700 at the very least, imo.
Thruthful@reddit
How much extra is the 98 vs 78? 100 more or so? No question for me tbh
kermityfrog2@reddit
25-30% more money for 5-10% more performance.
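To put that ratio in numbers, here's a quick cost-vs-performance sketch. The prices are hypothetical placeholders (roughly matching the 300-400 PLN gap from the OP), and the ~8% uplift is the average figure quoted elsewhere in this thread:

```python
# Hypothetical prices in PLN; only the ratio matters, not the absolute values.
price_7800x3d = 1500.0
price_9800x3d = 1850.0   # ~350 PLN more, per the OP's stated gap

fps_7800x3d = 100.0      # normalize the 7800X3D to 100
fps_9800x3d = 108.0      # ~8% faster on average, per benchmarks cited in-thread

extra_cost_pct = (price_9800x3d / price_7800x3d - 1) * 100
extra_perf_pct = (fps_9800x3d / fps_7800x3d - 1) * 100

print(f"{extra_cost_pct:.1f}% more money for {extra_perf_pct:.1f}% more performance")
# prints "23.3% more money for 8.0% more performance"
```

Swap in real local prices and the benchmark average for the games you actually play to get your own ratio.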
Chalanger1994@reddit
14-20 with OC
Denselens@reddit
Could be the last PC you put together that won't set you back "used-car money", so get the 9800X3D and make it last longer.
TDYDave2@reddit
It comes down to your personal expectations.
Are you a "meets expectations" kind of guy, or are you an "exceeds expectations" kind of guy?
7f0b@reddit
Do you plan to do anything else with the PC? Productivity, work, etc? If so, consider that a 9900X might be a good choice too, and maybe cheaper than either of those. And with a 5070 Ti at 1440p, there won't be nearly as big of a difference as the benchmarks using an RTX 5090 at 1080p would make you think.
CurryLikesGaming@reddit
If you're a temperature-conscious gamer and worry a lot about the CPU at 99% load, I suggest going for the 9800X3D. Due to its design, the 7800X3D has an insulating layer inside the silicon stack, making it run hotter at 99% load; mine reached 88.5°C during tests and in very CPU-bound games, and even an AIO couldn't cool it. On the 9800X3D the problem is solved; it runs way cooler on a dual-tower air cooler because there is no insulator, which opens up more overclocking possibility. For stock performance both are nearly identical, with a 3-5 fps difference at most. To me that didn't justify a 60% price increase, so I went with the 7800X3D and put more money into RAM (which turned out well, because fuck the AI bubble).
BrayIsReal@reddit
I just got a 9800X3D for my 9070 XT. If money isn't too much of an issue, I'd just spring for it. You're then getting the best gaming CPU on the market, and it's future-proof, meaning it'll handle stuff for a long time without getting outdated. In the end it's worth it.
IANVS@reddit
Only if you use the CPU for productivity too because 9800X3D is noticeably better in that than 7800X3D. If not, then go with the 7800X3D...
Ripe-Avocado-12@reddit
about 3% more performance on average. Is it worth it is subjective to you, but probably not. https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/19.html
snmnky9490@reddit
Those are graphically demanding games at 1440p. The games OP plays make a huge difference. If they mostly play AAA games at 4K with RT at 60FPS, then it will make no difference at all. If they play fast-paced competitive shooters at minimum graphical settings for 240+Hz monitor, then it can be a pretty solid improvement
Yrrebbor@reddit
I'm building a new setup with a 9070 and went with the 9800 as it was only $50 more on Black Friday. Spend the money elsewhere unless you get a deal.
Emergency-Basis-1858@reddit
Recently switched from a 7700K to a 9800X3D. The i7 couldn't run BF6 stably on minimal settings at 1440p.
AMLRoss@reddit
Right now the CPU is the one part worth upgrading, since everything else is so damn expensive (RAM, GPU). I would just get the 9800X3D, especially since the next CPU (10800X3D) is just around the corner.
Nychthemeronn@reddit
Of course it isn’t, but you also can’t look at the math that way. The 9800X3D is 30% more expensive for a 10%(obviously depends on how you benchmark it but on average 10%) performance increase. There are lots of great historical posts about this topic.
Alkatraz9127@reddit
For 1440p you are perfectly fine even with a 7700, so go with the 7800X3D; the difference is like 1% at that resolution. It depends on the game, yes, but at that resolution you are mostly GPU-bottlenecked.
sgt_bug@reddit
If the price difference is > $150, then you need to think about it, else I would suggest 9800X3D
AbrocomaRegular3529@reddit
300-400PLN is 100 euro, so it would be worth it IMO. 9800x3d is still better, faster, and runs cooler.
In other EU countries the difference is usually around 200€, so 7800x3d is still most sold AMD CPU, but for your case I would go with 9800x3d.
Cleenred@reddit
Unless you want to play CS2, or unless the difference is less than 50 USD, I'd get the 7800X3D.
Alternative_Dig_8049@reddit
I'm using a 7800x3d with a 5080 on a 1440p, 240Hz OLED monitor and I'm limited by the GPU. Draw your own conclusions.
JeremyJoeJJ@reddit
Nah, I have the 9800X3D with a 5070 Ti and it's overkill.
ItsSevii@reddit
9800x3d will be a top tier gaming cpu for the next 5 years minimum. That was good enough for me
munkiemagik@reddit
I run a 7800X3D for PCVR, and I did my best to understand the CPU's impact in my use case using tools like fpsVR to look at CPU frametime during gameplay; there was zero gain to be had upgrading my CPU to a 9800X3D.
The 7800X3D is more than capable of the framerates you can expect from your GPU in VR. As for 1% lows and smoothness: I did experience microstutters, but that had nothing to do with the CPU; switching to USB-C ethernet on the Quest 3 completely eliminated them. So again, upgrading to a 9800X3D wouldn't have netted me any observable benefit.
scottiedagolfmachine@reddit
No, from a price-vs-performance standpoint.
The 7800X3D is cheaper and only loses a couple of frames.
Blue-150@reddit
I've seen this asked a few times today; the consensus seems to be no.