RX 9070 XT vs. 7900 XTX – Same price right now, but which one will deliver better FPS in the long run? (July 2025)
Posted by SourceFull4556@reddit | buildapc | View on Reddit | 182 comments
Hey everyone,
I'm currently trying to decide between two AMD GPUs – the newer RX 9070 XT and the older RX 7900 XTX. As of now (July 2025), both are priced roughly the same, which makes the choice surprisingly difficult.
What really confuses me is the inconsistent information across the web:
Some benchmark websites claim the 7900 XTX is 10–12% faster, others show just a 5% lead – and when I look at real-world gaming benchmarks on YouTube, the 9070 XT is almost always neck and neck, and sometimes even ahead, especially when FSR or ray tracing are involved.
Specs-wise, the 9070 XT seems to have a lot going for it:
- More modern architecture
- Better power efficiency
- Stronger performance in RT and FSR scenarios
- And it recently received another small performance boost from driver updates
But in the end, FPS is all I care about – nothing else.
Is the 7900 XTX still the better choice thanks to raw raster power – or will the newer 9070 XT start to shine more and more in future titles due to its architecture, better scaling with modern features like FSR 4, frame generation, and potential driver optimization?
Would love to hear your thoughts from a performance-focused perspective!
RettichDesTodes@reddit
FSR4 alone makes the 9070xt the winner
SourceFull4556@reddit (OP)
So even at the same price, you would go with the 9070 XT? Because software and architecture are more important than just having more VRAM, right?
GARGEAN@reddit
I would go for the 9070XT even if it were $100 pricier than the 7900XTX.
Extra VRAM is irrelevant if 9070XT is not running out of it. And it won't in any realistic scenario.
RT exists, and it becomes only more prevalent, not less prevalent.
FSR4 exists, and it's usable all the way from AA mode, where it gives a plainly better image than anything the 7900XTX can achieve, up to heavier upscaling scenarios where, with some hit to image quality, the 9070XT stays playable in situations where the 7900XTX is not.
THEspartan8440@reddit
thats what they said about 8GB of vram too... then im getting stuck at 83% gpu usage while a game is trying to use 10GB of vram. now i got games trying to use 16GB of vram if i set everything to max at 2k. yeah, i dont think 16GB is going to last much longer.
GARGEAN@reddit
Well, it's simple - you are not correct on that one. The difference between 8 and 16 is massive in practice, effectively more than 2x. Even IF future consoles allocate more than 16GB for VRAM (and that's a big IF), they are still years away, and the next cross-gen period will be substantially longer than the current one. 16GB now occupies a substantially smaller part of the GPU market than 8GB did in 2020.
And last, but not least - one can easily have problems fitting a modern game into an 8GB pool. One needs to really try to actually HAVE PROBLEMS fitting a game into 16GB. So while 8GB can struggle in general now, any 16GB problems will be extremely trivial to fix for the foreseeable future.
Long story short - no, the 8GB situation in 2020 is ABSOLUTELY not equal to the 16GB situation now. Even 12 would be a stretch.
Jotoz33_TTV@reddit
The problem is the technology curve is exponential. It took forever to get from a 1GB card to an 8GB card, while 16GB followed 8GB much more quickly, and as time passes it will continue to make exponential jumps. Just like a floppy disk used to be measured in KB, and now KB is irrelevant; new games coming out will start to need a TB to download in the next few years, whereas a decade ago it was wild to see a game need over 50GB. All tech growth is exponential. 16GB ain't gonna last long for anything but 1080p.
HelpMeInThisMoment@reddit
next gen console specs are more or less out and they won't have 24 or 32gb vram lmao. let's be realistic here
Comfortable-Finger-8@reddit
"Well, it's simple - you are not correct on that one"
You can't just tell people what's good enough for them. I play games where it eats up 16+gb of vram and when I played on a gpu with only 16gb I constantly had problems because I didn't have enough vram
CheapNegotiation69@reddit
I have an RX590 8GB and I play games at 4k. I'm currently playing Titan Quest 2 and am only using 5.6gb VRAM at 100% utilization. I've never maxed out vram, even on this old shitty card.
rW0HgFyxoJhYka@reddit
4K is the only area where you can run out of VRAM.
But most people aren't playing on 4K, especially not on a 9070XT
Aggressive-Gate1121@reddit
I have a 55" tv as my monitor. 4k 60hz and 1080p 60hz. I only get 30hz at 1440p. So I would much rather play at 4k medium settings than 1080p maxed out. Many people play on tvs. My card could do 1440p at max settings, but I won't play at 30hz. So for me VRAM matters. I was looking at 9070xt and 5070ti. I picked up the 5070ti for $899 before the price explosion. Better RT performance. Can't find a listing under $1199 now. Very happy with my choice. Many games 4k 60 at max settings.
Weary_Document_9132@reddit
Borderlands 4 sits at 17.6GB of VRAM usage at 1440p... you must only play like 3 games if you think 4k is the only place 16GB of VRAM isn't enough. I can name a half dozen in my library, without even thinking, that couldn't run on 16GB above 1080p.
Natural_Energy_9577@reddit
Because borderlands 4 is the most shittily optimized game of 2025
Suitable-Run-3699@reddit
Let me correct you sir, it is Escape from Tarkov ;)
Weary_Document_9132@reddit
And? Add it to the fucking list of the hundreds of other terribly optimized games out right now, my point still stands valid, it needs more than 16gb above 1080p and there are many, MANY others
Natural_Energy_9577@reddit
True
skullboy3300@reddit
if you're focusing on gaming, yes
but for stuff like rendering/3d modelling, it can chew up a lot of vram if you're working on a complex model with millions of polygons, even rendering it might be a problem if you have a low vram card lol
ZealousidealPrize980@reddit
What if you are running heavily modded Skyrim or cyberpunk? Id assume VRAM is needed for that and 16gb might not be enough
Iron_Idiot@reddit
Heavily modded Skyrim eats all VRAM. It uses up my 20GB just outside of Whiterun. Solution? Don't install ENB.
DanimalMapleSyrup@reddit
I’ve got a heavily modded Skyrim (several hundred mods including high-res packs) and it uses less than 10 gigs of VRAM with ENB on. You can massively lower the strain it puts on cards and improve your overall performance by shifting resource load to system RAM. I have 20 gigs of my 96 gigs of DDR5 set aside for it and my 4090 doesn’t even register a load from my modded Skyrim. You need to adjust your configurations something fierce.
ZealousidealPrize980@reddit
What about heavily modded cyberpunk?
Iron_Idiot@reddit
Not really an issue, I've got a few hundred mods and I'm still over 100 FPS at 1440p Ultra. The Creation Engine ain't good, man. It sucked in Skyrim, sucks in Fallout, and Starfield is dogshit.
ZealousidealPrize980@reddit
Damn I guess I'm just stressing about VRAM for no reason then 😅, thanks for the response!
Iron_Idiot@reddit
12gb is about the minimum for 1440p. 4k is 16gb minimum. The 5080 doesn't have the horsepower to run flight sim in 4k, it runs out of VRAM.
ZealousidealPrize980@reddit
I won't be playing flight sim in 4k I'll mainly just play games with a decent amount of mods
fiittzzyy@reddit
That's a good point. If you get into a situation where you're needing to use FSR performance to get a decent playable framerate then the 9070 XT will still look very good in that situation whereas the 7900 XTX using FSR 3 on performance would be pretty shit.
psi-storm@reddit
AI is also becoming more important. RDNA 4 has full fp8 support and will get rocm drivers while rdna3 probably won't.
KoftaBozo2235@reddit
9070xt already got rocm support around a month ago, ollama runs fine on windows and linux, just not wsl haha
sparky8251@reddit
Will? Already running local AI via ollama on my Linux box on my 9070XT. Flawless functionality as long as you use the new enough ollama.
SourceFull4556@reddit (OP)
Good points. Something also tells me that the 9070 XT is the better option, especially since I don’t play AAA games on 4K that often—where the extra VRAM would really make a difference.
RettichDesTodes@reddit
At everything below 4K i'd get the 9070xt, 16GB is plenty for that. At 4K the 24GB might come in handy.
alc4pwned@reddit
I think at 4k FSR4 is far more useful than the extra vram still.
Dex4Sure@reddit
just get Nvidia if you really want to use upscaling. DLSS4 is still better than FSR4, period.
weetawr@reddit
Not everyone wants Nvidia lmao. I'm selling my 4080 and getting the 9070xt for a Bazzite ITX PC
rW0HgFyxoJhYka@reddit
Just say you can't afford a NVIDIA GPU lol.
Everyone here would be buying NVIDIA if they had more money.
onlinethrowaway8@reddit
Lol what? Dude literally said he's selling his Nvidia card to get an AMD GPU. And it's not like he has an affordable GPU either; it's one of the higher-end, more powerful GPUs. And I'm about to build my second rig that's going to be all AMD, for non-monetary reasons. Wtf is your response?
Comfortable-Finger-8@reddit
He probably spent his whole paycheck on an nvidia gpu and feels the need to brag/justify his purchase
Alucard_7x@reddit
Nah, he spent one of daddy's whole paychecks on an Nvidia GPU and feels the need to brag/justify daddy's purchase because daddy can't afford rent/utilities/food.
Comfortable-Finger-8@reddit
That's a weird take. That isn't really a better insult or anything
Alucard_7x@reddit
I didn't say it was? Just more likely.
Alucard_7x@reddit
Just say you're a moron lol.
Most definitely not everyone here would be buying Nvidia if they had more money. Get off Jensen's dick imho, I'm sure others want a turn.
areamike@reddit
Wow, you're about as sharp as a bowling ball if that is what you "think".
MonkeyShack81@reddit
Not true. I am also literally looking to move away from NVIDIA. That guy literally said he is moving from NVIDIA to AMD. I am too. It has absolutely nothing to do with price.
IBleedGreen75@reddit
I’ve had both. Tbh, I’m at the point where I’d rather have price-per-performance, and at this point Nvidia drivers are crashing. I like both, but if Radeon's got a card that meets my gaming needs and saves me hundreds, I’m grabbing it.
ot_milo@reddit
terrible rage bait
garf2002@reddit
Where I am an AMD card is 60% the cost of a comparable Nvidia card
Pwnbuggy@reddit
Barely!
VoraciousGorak@reddit
I'll note that at 4K my 9070 XT does fine even in VRAM-heavy games like Cyberpunk. It's not a noticeably different experience with regard to VRAM than my 3090 is.
Alucard_7x@reddit
Yeah I finally got around to playing Cyberpunk this last Fall, and I was hitting ~80fps on 4K ultra no-RT and no-upscaling on my 9070XT. FSR4 brought it up to a smooth 120fps and I couldn't tell the difference, more than enough for me.
MrPapis@reddit
Cp2077 is really not VRAM heavy though. Indiana Jones, Warhammer space marines 2, COD those are examples that actually can easily eat 12gb.
Stunning-Club6112@reddit
Truth be told, Indiana Jones made my 4070 OC card feel like it was on its deathbed, and every other UE5 game that's come out since has done the same. 12GB of VRAM is not enough for modern titles like those, I'm finding out.
SmallyBigs2000@reddit
I know this is an old comment, but I'd like to point out that when it comes to UE5, it isn't the fault of the GPU. I've worked with UE5 long enough to know how shit of an engine it is, which is why some companies are beginning to move away from it. Hell, turning off the Lumen feature alone instantly doubles performance. And don't even get me started on Nanite. All I'll say is this: you see how games need higher-end GPUs while simultaneously looking slightly worse than they used to? That's because instead of going through the optimization steps, devs are now taking high-poly meshes, throwing them into the engine, and flicking on the Nanite feature, which doesn't actually reduce poly count (which is what reduces draw calls in UE). The high-poly mesh still gets loaded into your memory, which will absolutely devour your VRAM and your RAM.
digitalsmear@reddit
And funny enough, my 3080ti does just fine at 4k with 12gb.
RettichDesTodes@reddit
Yeah sure, if it's enough it's enough
digitalsmear@reddit
12gb vram on my 3080ti has been enough for perfectly playable (80+) framerates at 4k in all the games I play. Granted, I haven't stressed it with the latest AAA's, but even something like Callisto Protocol runs just fine.
F9-0021@reddit
FSR4 Performance mode will offset the vram difference, at least in the usable lifespan of the cards.
BMWupgradeCH@reddit
I play only 4k native on 9070xt and never run out of memory. Get 115fps in Warzone 4k extreme native preset. (All maxed out)
nilarips@reddit
Your question was which will deliver more FPS in the long run, that will always be the one with FSR4, everything else doesn’t matter in this instance unless you refine your question
dorting@reddit
Yes, and keep in mind FSR lowers VRAM use in the first place, because you are rendering at a lower resolution.
The 9070 XT is the better card; you buy the XTX if you need VRAM outside of gaming.
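(Editor's aside: a back-of-envelope sketch of the point above, that a lower internal render resolution means smaller render targets. This assumes a single hypothetical 4-byte RGBA target; real engines allocate many more buffers (G-buffer, depth, history), so treat these numbers as illustrative lower bounds, not measurements.)

```python
# Rough illustration: pixel count drives render-target memory, so
# rendering internally at 1440p and upscaling to 4K touches far less
# VRAM per buffer than rendering natively at 4K.
def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one render target in MiB (assumed RGBA8, 4 bytes/pixel)."""
    return width * height * bytes_per_pixel / (1024 ** 2)

native_4k = framebuffer_mb(3840, 2160)       # one target at native 4K
internal_1440p = framebuffer_mb(2560, 1440)  # one target at 1440p internal res

print(f"4K native: {native_4k:.1f} MiB per target")       # 31.6 MiB
print(f"1440p internal: {internal_1440p:.1f} MiB per target")  # 14.1 MiB
```

Textures dominate real VRAM budgets, so the savings in practice are smaller than this ratio suggests, but the direction of the effect is the same.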
MarxistMan13@reddit
The VRAM is mostly irrelevant. The key differences are FSR4 and RT performance (9070XT), and raster performance (7900XTX). The 9070XT also draws slightly less power.
hags2k@reddit
The better upscaling makes a difference. I've been using RDNA 2 and 3 and though I love the cards, DLSS is the one big thing I really miss having access to. All the reviews say FSR4 is finally competitive with DLSS.
Weary_Document_9132@reddit
Upscaling is largely irrelevant if the card is strong enough to run it natively
hakanavgin@reddit
A little late to the discussion but it is absolutely relevant as even RTX5080 or 5090 is not enough for native 4k, and probably none of the cards that will be released in the upcoming years will be. 4K performance is not a static goal, 1080Ti should've been enough and we would stop producing new cards if that was the case.
Weary_Document_9132@reddit
Yeah, considering only about 4% of PC gamers play at 4k, it is 100% irrelevant to 96% of all gamers. Statistically, the number of people it is relevant to is within the margin of error. And there are MANY cards that can brute-force 1080p and 1440p native without any need for upscaling at all. My point stands valid.
Dominicshortbow@reddit
It is definitely not irrelevant. Look at heavily unoptimized games like Borderlands 4; you basically need upscaling to get playable frames. And even if you have playable frames, like 60-90fps, you may still want to enable upscaling for high-refresh 120-160fps, which I'd definitely rather have.
Zestyclose-Bowl1965@reddit
Update: AMD will add FSR4 support. The 7900XTX will be the winner here. The 9070XT should stay in its actual category now. Still great though.
Insurgent97@reddit
This didnt age well
tuttut97@reddit
8 Days later and still no 9080 :P
HoneyEducational5344@reddit
And we might get 9080xt at that time ;)
steave44@reddit
Not to mention the 9070XT being more likely to support any newer versions of FSR over the 7900XTX.
cosine83@reddit
Kinda seems like RDNA4 may be for FSR what Turing ended up being for DLSS/RT. Kinda murky now as an early adopter, but there's a high likelihood of future development and improvement making the hardware's value last longer than expected.
mgp901@reddit
7900 XTX is still AMD's best card to date. I got an RX 9070 XT and have all of its features turned off. For me FSR is just glorified motion blur. What's worse is that when I pan my camera then stop, there's this moment where it switches from slightly blurry to crystal clear image, it's weird seeing that jump in quality. There's also the static HUD on fast moving background, it blends with the bg in an obviously AI look, looking at it feels like I'm dreaming: I know what I'm looking at is the HUD but I can't seem to make out the details like I'm looking at it with my peripheral vision.
That being said I'm still happy with my card as its raw performance is still incredible at 1440p. Don't get caught up with all those fancy features to make your decision, cuz it's a hit or miss depending on the application. The bulk value of the card is in its raw performance not the features. 7900 XTX also has more vram if you can utilize that.
alc4pwned@reddit
FSR4 is effectively more performance though. Are you talking about racing games? Because yeah, that's the one genre where upscaling still isn't great imo. In most games upscaling (DLSS anyway) looks as good as native imo.
trplurker@reddit
It's zero more performance, it's just frame interpolation, which is motion blur. Because of where the interpolation happens, it causes the frame-ready counter to increment and register as a "frame" for most benchmarks. It's a smoothing technique when run at full resolution, which could be useful sometimes. Thus the joke "fake frames for fake MSRP". Steam's overlay can be made to display the real FPS though.
alc4pwned@reddit
You're thinking of frame gen, not upscaling.
trplurker@reddit
Upscaling is even less "more performance" than frame gen. 1080p upscaled to 2160p is nowhere close to what native 2160p is.
alc4pwned@reddit
Nah, that's very wrong. First of all, the base resolution that gets upscaled depends on what DLSS/FSR setting you use. If using 'quality' DLSS, it's 1440p being upscaled to 2160p. Even the performance mode of DLSS4 looks pretty great though - there are many videos out there showing that it produces sharper textures than native in some cases.
That you think DLSS looks 'nowhere close' to native tells me that you've never actually used it. You should check out some of those videos, in reality it looks good to the point that you should probably be using it in most games.
garf2002@reddit
Are you seriously trying to imply there are scenarios where upscaled looks better than native?
What you're doing is confusing sharpness with image quality.
Upscaling cannot physically be better than native; it can at most be exactly the same. If you want native to look different and upscaling does that, then fair enough, but that's like the old "I think 30fps looks better than 60fps" argument.
Also, they are right that FSR isn't extra performance, it's just better upscaling; it allows you to play at a lower res without it looking as bad, it does not magically make the GPU better.
The whole reason we call them resolutions is because they increase the detail that can be resolved. If you're happy with less exactness, then yes, FSR/DLSS is great.
alc4pwned@reddit
With AI upscaling, that's not true. Can generative AI take a lower quality texture and turn it in to a higher quality texture? Yes it can.
...which can be treated as extra performance if the up scaling is doing a good enough job.
Sensitive-Offer-5921@reddit
BRAWNDO (tm) Upscaling: its got what plants crave
garf2002@reddit
You missed my point entirely; generative AI interpolating 1440p to 4k cannot represent 4k better than native.
What you are saying would be like taking two colours and mixing them, then getting AI to predict what the two colours mixed would look like, and arguing that the AI prediction is "better" than the real mixture.
And yet again, you are ignoring the word "upscaling". All FSR is doing is making upscaling better; it's still the same from a performance perspective as playing at the lower res (though better looking).
So you aren't saying FSR increases performance, you're saying FSR makes playing at a lower res than your monitor look better.
Big difference.
proper-warm@reddit
So if I’m using a 1440p monitor what should I put dlss on for it to upscale to 1440p?
alc4pwned@reddit
It's always up-scaling to 1440p, the different dlss settings affect the resolution that it is upscaling from. DLSS performance upscales 720p to 1440p, DLSS quality upscales 960p to 1440p.
https://www.club386.com/what-is-nvidia-dlss/
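(Editor's aside: the preset-to-resolution mapping described above can be sketched like this. The per-axis scale factors are the commonly published DLSS values; exact numbers can vary by game and SDK version, so treat them as assumptions.)

```python
# Internal render resolution for a given output resolution and upscaler
# preset, using commonly published per-axis DLSS scale factors.
SCALE = {
    "quality": 2 / 3,           # e.g. 1440p output renders at 960p
    "balanced": 0.58,
    "performance": 0.5,         # e.g. 1440p output renders at 720p
    "ultra_performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, preset: str) -> tuple:
    """Resolution the game actually renders at before upscaling."""
    s = SCALE[preset]
    return round(out_w * s), round(out_h * s)

print(render_resolution(2560, 1440, "quality"))      # (1707, 960)
print(render_resolution(2560, 1440, "performance"))  # (1280, 720)
```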
proper-warm@reddit
Ok thank you for the info
trplurker@reddit
Keep drinking the koolaid...
alc4pwned@reddit
The guy who didn't know the difference between frame gen and upscaling is telling me to keep drinking the koolaid. Lol.
Vivid_Promise9611@reddit
In your opinion that very well may be true, but there are many people that don’t like fsr or dlss
So for those of us that do notice the decrease in resolution: even though you're getting more frames, can you really call that more performance? If I switch to 1080p and in the process get more frames, of course, is that effectively more performance?
And you can’t use fsr 4 in very many games. I truly believe the 7900xtx wins the performance battle, but it does not win the efficiency battle
alc4pwned@reddit
It's not a decrease in resolution. There are still some visual anomalies yes, but in most genres I think they're very difficult to notice personally. There are some pretty good videos out there that do detailed comparisons of DLSS4 and native and you really have to zoom in and pixel peep to notice a lot of this stuff.
garf2002@reddit
It is a decrease in resolution, don't lie to make your point; it's just a very fancily upscaled decrease in resolution.
From a game-engine perspective it's no different to playing at a lower res and having your monitor terribly upscale it, except for UI elements.
Vivid_Promise9611@reddit
You know what youre right dog I didn’t even think about all that. Wish you were here 176 days ago
alc4pwned@reddit
You get what upscaling is right? It renders at a lower resolution and then upscales it to a higher resolution. The resolution you see after upscaling is not lower. It has been upscaled to a higher resolution.
garf2002@reddit
The screen resolution is the same in both examples, that's what upscaling is, but the screen resolution is always your monitor resolution.
FSR and your monitor's shitty upscaling from 2006 mapping 720p to 1080p are the same game-engine resolution.
So either all game resolutions are the same resolution on the same monitor (a silly pedantic argument), or upscaling is fundamentally not the same resolution as native.
alc4pwned@reddit
Ok, but modern upscaling is not just taking a 1440p image and sizing it up to 4k obviously. It is producing a 4k image with detail that is about the same as a native 4k image. The resolution for all intents and purposes is the same, the only question is whether there are any artifacts introduced in the process.
If you haven't personally experienced something like DLSS4, there are plenty of good videos comparing DLSS footage with native which show how good of a job it does. Tbh, I don't understand how people are even still having conversations like this about upscaling.
Vivid_Promise9611@reddit
Still images it looks great. You’d really have to be a perfectionist to notice a difference there. When that image starts moving things change! At least for me
Maybe I meant blurriness rather than a decrease in resolution
Professional-Ad3762@reddit
U must be blind, as good as native 😅
alc4pwned@reddit
DLSS4 on performance mode? Yes.
Zeti_Zero@reddit
Some people just don't like blurriness. I find all upscaling blurry (I haven't tried DLSS 4 yet) and just not worth it. Many times TAA is already too much for me and I need to fight with in-game options to try to negate the effect.
Gambler_720@reddit
Yup Forza Horizon 5 is quite literally the only game where I choose not to use DLSS
trplurker@reddit
So straight up, the 7900 XTX smokes the 9070 XT in raw performance, and it should, as it's an entire tier above it on the product stack. If you care about raw FPS, then 7900 XTX, period. Its 960 GB/s vs the 9070 XT's 640 GB/s memory bandwidth alone confers an insurmountable advantage.
If RT is important to you then it's a different story: the 7000 series RT support sucks, so the 7900 XTX has to rely on pure brute power while the 9070 XT has accelerators that let it work faster. The breakdown here for RT is 9070 XT > 7900 XTX > 9070.
FSR4/DLSS MFG is just frame interpolation, aka motion blur. It doesn't give you any more performance just because it has an "AMD" sticker on it. With proper support it can use optical illusions to make you believe you're experiencing higher frame rates and better resolution than you would otherwise notice. Native resolution and fully rendered frames will **always** look the best; AI stuff might be "close enough", so it could be useful if someone can't render at full resolution.
XESS/FSR/DLSS at "Balanced" is approximately 50% resolution; 2160p balanced is rendering at roughly 1080p, then using AI to upscale. "Ultra Quality" by comparison is at 100% resolution and just doing AA, which is fine by me.
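(Editor's aside: the raw-bandwidth gap cited above works out as follows. Board-spec numbers only; effective bandwidth also depends on cache behavior such as Infinity Cache hit rates, which this simple ratio ignores.)

```python
# Raw memory-bandwidth comparison from the board specs quoted above.
xtx_gb_s = 960  # 7900 XTX memory bandwidth, GB/s
xt_gb_s = 640   # 9070 XT memory bandwidth, GB/s

advantage = xtx_gb_s / xt_gb_s - 1
print(f"7900 XTX raw bandwidth advantage: {advantage:.0%}")  # 50%
```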
yiidonger@reddit
But the 9070xt performs roughly the same or better across games overall; not sure what you're talking about, we only care about FPS.
833psz@reddit
That’s not true. The 7900 XTX pulls way ahead at 4K native. The only way the 9070 XT can even be benchmarked against the 7900 XTX at 4K is if FSR4 is enabled on the 9070 XT which leads to an unplayable situation for a lot of us, in a lot of games.
I do own both cards and enjoy them both.
Dex4Sure@reddit
What's 'way ahead'? To me it seems like the 9070XT is generally the same in raster, maybe loses by a few FPS. I know the 7900XTX has more OC headroom though.
deleted_by_reddit@reddit
[removed]
AutoModerator@reddit
Hi there! Thanks for the comment.
We ask that posts and comments be in English so they can be understood by as many people as possible. Translations on Reddit are client-side, and not all apps or browsers support auto-translate. Currently many users (and moderators) aren’t able to read your comment.
Could you please submit a new comment in English?
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.
xenocea@reddit
FSR 4 alone is worth choosing the 9700XT for, not to mention it has much better RT performance over the 7900 XTX.
OmarVIPG@reddit
Wow bros from the future, in your time you have the 9700xt? Can I have one of those for 2 dollars and a pack of gum?
xenocea@reddit
You can have 4 of them in crossfire, along with Ruby, and a candy.
OmarVIPG@reddit
you have got a deal (spits on my hand) now shake my hand to secure it
Wonderful-Lack3846@reddit
The extra VRAM of the 7900 XTX will not give it more performance, because games generally don't require more than 16GB of VRAM. As long as you never need more than 16GB, the larger capacity of the 7900 XTX won't matter.
The 7900 XTX is valuable for its raw power and high VRAM capacity. The VRAM capacity can be nice for VR and some productivity tasks.
If you are only gaming, I don't think you should get 7900 XTX.
Go for 9070 XT; more support, FSR4 and better power efficiency.
Foreign_Preference24@reddit
Um, if you're at 4k native, 16gb is gobbled up in seconds. I think for people who are ready to turn down the ray tracing, and don't care about the blurry mess that FSR is, 7900 XTX would be the better choice.
Brave_Suggestion5597@reddit
Does it change if you're looking for a card for sim racing? I have 3x 2k-resolution screens, and good FPS with good-enough graphics is needed.
Anonymous_Foxx@reddit
Best description right here. I play heavily into VRchat and like to use AI locally on linux. Also doing blender and unity stuff. so the 7900xtx with the added Vram is my better choice.
SourceFull4556@reddit (OP)
True. And if the next generation of GPUs offers a significant performance boost, and if I actually need that extra power, upgrading to a newer GPU won’t be a problem. Thank you!
deleted_by_reddit@reddit
[removed]
realdeal1993@reddit
The 7900 XTX is a much, much faster card than a 9070 XT. If you don't care about RT or FSR3, since the 7900 XTX maxes everything out native anyway, then go for the 7900 XTX.
f1rstx@reddit
it can't run any modern game at 4K/60Fps native
833psz@reddit
nonsense
f1rstx@reddit
lmao
putridi@reddit
Nor can the 4090 and 5090. Most games out at the moment still need DLSS and frame gen to get good performance. That's not the card's fault; it's the optimisation of the game that's the issue.
f1rstx@reddit
The 5090 can, but that's not the point - you still want to use FSR/DLAA. Even 4K FSR4/DLSS4 Performance has better image quality than native TAA, not to mention Quality upscaling. And the 7900XTX only has FSR3, which is unusable garbage.
putridi@reddit
Did you really just suggest upscaling is better image quality than native ? Are you mental ?
yiidonger@reddit
it's like saying AI porn is better than traditional porn
SnakeRoberts301@reddit
Air porn is pretty good bro! Lol
realdeal1993@reddit
He is not wrong though. A lot of games use TAA, which is native. A lot of games look better with DLSS 4 on than without because of the blurriness of native TAA. For example, Horizon looks MUCH better with DLSS than native.
trplurker@reddit
On those games removing TAA is trivial; it's easily replaced with FXAA, SMAA, or, if you're willing to spend performance, MSAA.
f1rstx@reddit
Or RDR2, god it’s awful at native and it’s like next gen graphically with DLSS4. CP2077 is awful native too. Basically any TAA game looks bad :)
trplurker@reddit
Yes, there are people who drank the Kool-Aid and believe this.
f1rstx@reddit
You’re living in the past
realdeal1993@reddit
I haven't had one game that I couldn't run at 4k60 native. And I've played a lot. Which games cannot run 4k60?
f1rstx@reddit
any new one ;)
realdeal1993@reddit
Can you name some? I'm really curious. I have played most games with the exception of AC. Benchmarks with an OC'd XTX are higher than a stock 5080. Time Spy 36k.
d3rw4hr3g4yx1@reddit
I own an ASRock 7900XTX Taichi White OC and it shreds every game I throw at it. I'm no fan of ray tracing and FSR upscaling, so I mainly play at native 1440p, and I've had no game that didn't run, aside from a few AMD-specific driver errors. After comparing both, it's obviously the more powerful card in terms of pure FPS. If you want to place your trust in RT and FSR then maybe the 9070xt is something to consider, but I would never "downgrade" to it from my 7900XTX.
ur1tek@reddit
Good luck on affordable 20gig plus gpus. 16 gig ain’t it.
Ambitious_Bed4310@reddit
Performance wise not a lot between them but 24GB Vram makes XTX the sensible choice
Ambitious_Bed4310@reddit
In Sniper Elite: Resistance I'm using 20GB of VRAM at 1440p.
Secretnamenooneknows@reddit
For those trying to decide today: the 9070xt will not run out of VRAM unless using ray tracing at 4k. Native 4k with no RT will be fine. So you might think "well then I'll get a 7900xtx for more VRAM if I want to use RT", but the 9070xt is so much better at ray tracing, while the 7900xtx is plain bad at it in my opinion and it's worth turning off for more performance. In the end I'm saying RT is not worth it in either scenario at 4k, and that's the only time VRAM would be a concern. So in pretty much any situation you're better off with the 9070xt, unless you get the 7900xtx for super cheap.
Weary_Document_9132@reddit
This is just not true. There are a bunch of games that sit at or above 16 GB of VRAM with no ray tracing and below 4k. Borderlands 4, for example, at 1440p, no RT max settings, bounces between 17.9 and 18.6 GB of VRAM usage consistently. modded Minecraft, Skyrim, ANY simulator, can ALL use beyond 16 GB of VRAM with no trouble. Clearly, you aren't knowledgeable enough to make such definitive statements, so I'd suggest you refrain from doing so as to not look like as big of an idiot in the future.
DiscussionMiddle1238@reddit
FSR4 is a significant upgrade from FSR3, and the older generation RT cores aren't even worth bothering with on the 7900xtx, I always turn ray tracing off on mine. I can run most games at 1440p max settings, and get at least 120FPS with no upscaling, but from what I see of FSR4, I'd have no problem running it.
Suspicious-Leg-8495@reddit
5090
Top-Zucchini-9421@reddit
The 7900 XTX runs fsr4 now?
Kitchen-You-6240@reddit
The code has been leaked showing that backwards compatibility is coming to the 7000 series. You can mod it in, but it's not officially released yet. Bunch of YT guides on it though. I haven't done it because I don't really need FSR, but it will future-proof the XTX even more.
Suitable-Skill-2229@reddit
Depends on what you're using the card for and what your rig is. The card having FSR4 makes it future-proof, but it's only 16GB, whereas the 7900 XTX is 24GB with FSR3. So this really depends on what you're going with. Also, what is your monitor? Don't say it's some big TV screen at 60Hz, because then you're just wasting money. I have a 7900 XTX with a 260Hz monitor and 128GB of RAM. I have no problems with frame rate, and my monitor is an MSI 40" wide. But my 7900 is not future-proof because it's on FSR3.
Wolfenstein49@reddit
I am going to be the outlier here and say the 7900 XTX. It's getting AI support, yeah, not as new as the 9070's, but the 7900 has the power to back up the AI fluff. Some games I turn off all AI and can still get a stable 60FPS at 4K max settings. I have never used the 9070 XT, but I can imagine either choice is going to be more economical than its Nvidia counterpart.
Hot-Ride-9747@reddit
It also depends on how long you want to keep your GPU and what resolution you play at and plan to play at. I think 24GB of VRAM is going to come in handy much sooner than we think.
The next-gen PS6 is rumored to have either 24 or 32GB of unified RAM. If it's 32GB, here is what Gemini says (Daniel Owen talked about it too in one of his videos):
Gemini:
If the PS6 had 32GB of RAM, a portion would be reserved for the system and OS, likely leaving around 24-26GB available for games, which would then function as VRAM. This is a significant increase from the PS5's 12.5GB usable, and it means games will be developed with higher memory requirements in mind.
Romeos_Crying@reddit
I have an ASRock Taichi 7900 XTX and love it. Yeah, I wish I had modern features like FSR4 and stronger ray tracing, but in all honesty, I play all my games at max settings with a 5950X and am well above 120 FPS in most games. The extra VRAM lets you use it for streaming and 4K recording, if you're into that, without having to get an extra capture card. If you are worried about AI, just go with Nvidia lol. They are leagues ahead.
Pitiful-Signal-6344@reddit
If you want raw power, get the XTX; for ray tracing, the 9070 XT is the better performance increase. The fact that AMD was comparing it to a 4070 Ti Super (awesome card btw) meant they weren't aiming to be on par with a 5080, but the 9070 XT exceeded the mark. However, their end goal was to catch up to Nvidia. I think by 2027 the new AMD will come out swinging hard at Nvidia's 10k series, whatever the name is.
Worth_Ad_2901@reddit
What's important to you? Personally I would rather play at 4K 60FPS than at 120 FPS using upscaling, because the image quality looks better. Also, with the extra VRAM, the 7900 XTX performs better than the 4080 at VR. Unless FSR is important to you, which it shouldn't be unless you're using a weak GPU, the 7900 XTX is technically the better card.
Sef4k@reddit
I play BF6 on High settings and get 160-180 frames with an RX 7900 XTX. With FSR I get above 300 and a blurry image; with upscaling, close to 400 FPS. I disabled all that upscaling and FSR because I can't see enemies clearly in the distance, and the screen is much sharper without all that. So don't count those fake frames if your FPS is good without them.
Charming_Package6206@reddit
Would be interesting to see whether opinions here have changed now that it is possible to use FSR4 on the 7000 series cards. I have used it on my 7900 XTX and it does look hugely better than 3.1, and my card ran significantly cooler (10-15C cooler! in cyberpunk with everything else cranked up to ultra).
Sharp-Dealer-5099@reddit
But AMD hasn't officially confirmed yet that FSR4 will be available on the 7000 series, which means you have to do some tinkering to be able to use it, and there's no guarantee it will run optimally...
Nearby_Control@reddit
The 9070 is considered midrange and is 900... Interesting, sounds like a pass... The 7900 was considered top tier... Same price... Tech should get cheaper.
Honest-Technology@reddit
I think I'll go with the 7900 XTX. I play Flight Simulator and that extra VRAM will help. I don't care about FPS; at least 30 in 1440p will do.
Hudz04@reddit
I just bought a used Hellhound 7900 XTX for $820. I was sad because I then learned that it doesn't support FSR4, and the 9070 XT is on par with it in FPS but has stronger ray tracing with less power consumption :( . I plan to sell it again, and I wonder if someone is going to buy it. I only do gaming and graphic design.
unlucky980@reddit
Well technically in the long term for raw power it will be better. :).
APRV4Kush@reddit
I have both. Prefer the 7900 XTX. I use tons of mods in racing games and have easily exceeded 16GB of VRAM many times. FSR doesn't add any value, as playing at anything other than native is stupid on a high-end GPU. Suuuureee, let's just gimp the quality down to compensate lmao! Won't even get started on the latency increase. RT is just as worthless: a smidge of extra effects at the cost of 75 percent of your FPS.
MrSimonShirley@reddit
Hiya. Did you make a decision? I've seen a few people asking the question but not many saying which option they went for.
I'm looking at the same options upgrading from a 3070. Probably going to get a 9070xt of some variation.
One thing I've not seen in this thread but in others is power usage, if that concerns you: the 7900 XTX is around 355W and the 9070 XT is around 300W.
Such_Potato7736@reddit
Had the same dilemma, went with 7900xtx due to higher performance in VR.
Embarrassed-Cap7135@reddit
Same, 7900 xtx 120% resolution scale at 4k looks stunning. No FSR gimmick.
Tankdawg0057@reddit
Same
Shadowpaw-21@reddit
9070xt is more future proof with fsr4. Driver updates have been improving the 9070 performance in some games so as time goes on the gap may become larger between them.
Gullible_Cricket8496@reddit
only reason to buy a 7900 XTX is if you have some specific use for the extra VRAM.
matte808@reddit
every rdna3 product is trash at this point, and was borderline when it came out, already obsolete
putridi@reddit
Yeah, agreed, TAA is bad, especially in motion. But there are other options that don't give a blurry end result, especially with foliage and water reflections.
Joker28CR@reddit
FSR4. That's it.
putridi@reddit
I just bought the 7900 XTX a couple of months ago. I mostly play older games that don't really support FSR and ray tracing, so having the raw performance is better for me tbh, although the 7900 XTX is overkill for games that came out 10+ years ago. I also use my PC for game development, so having the 24GB of VRAM is nice. It really comes down to use case. It depends on your games. I don't care for ray tracing, but you might. All I'm saying is, no matter what comes out in the future, both cards will run it, so I wouldn't worry too much. Get the most recent and call it a day. My 7900 XTX runs Morrowind fantastically in 1440p ultrawide 😅
A_Tasty_Timbit@reddit
I'm stuck with this same dilemma, however, I do media production for a living also. I use one system to game and edit on so I need a great balance. Do I need that extra bit of raw power from the 7900 xtx or will the 9070 xt be more than enough?
I use Adobe Creative Cloud and heavily work in Premiere Pro, after effects, light room and photoshop.
I have an AMD RYZEN 9 7950X cpu
f1rstx@reddit
if you're "using PC for a living" - get 5090. It's pretty cheap for professional usecase.
No_Yogurtcloset9994@reddit
As someone who prefers the AMD ecosystem more, I would say just get a 5070ti if you do media production for a living. We are talking about similar price regarding all 3 GPUs. Personally, I wouldn't even consider the 7900xtx if priced similar. If you require more than 16gb vram, then I would wait for the super refresh. Be content with what you have currently and wait at least 6 months, then you'll know more about the super refresh and possibly a 9070xtx or 9080. Just be patient.
chatman77@reddit
I face the same question right now, as I just ordered a 4K monitor (upgrading from WQHD). I just want to add the question of whether I should maybe switch to an RTX 5070 instead due to its general performance advantage. Also, I'm considering getting a Quest 3, so VR performance will be a decision factor.
alvarkresh@reddit
9070XT. Without question.
acewing905@reddit
With games starting to rely on hardware RT, I'd say the 9070 XT is the easy choice if both are around the same price
bigkenw@reddit
I have been using the 9070XT OC from Gigabyte for the last few months. It has been super solid. I have played some stuff at 4k or 2k on my OLED in HDR, and when using my PC monitor, I run 1080p (until I find a good 2k monitor upgrade).
I have zero complaints, and most games I get 150fps...give or take given the game.
Don't forget some games are CPU intensive as well; I would pair it with a decent CPU.
Premish828@reddit
I lust over the xfx 7900 xtx
fuzzynyanko@reddit
I would say the RX 9070 XT for ray tracing reasons. Game studios might start using RT more because it might be cheaper than doing it optimally, and/or if Nvidia throws them a wad of cash to implement it. Game studios seem to be strapped for time especially
That being said, don't replace the 7900 XTX just based on my prediction.
Jbarney3699@reddit
The recent driver update tilts things in the direction of 9070XT overall. Very close raster while the 9070XT has better feature set and RT performance.
f1rstx@reddit
Drivers did almost nothing, that HUB video is misinformation. Something is wrong with their initial testing
Dorky_Gaming_Teach@reddit
If you want to push higher rasterization in 4K, then get the 7900 XTX. If you care about a little more ray tracing and FSR4, then get a 9070 XT. However, FSR4 support is still in its baby phase. If you are gaming in 1440p, the 9070 XT is likely the better choice.
deadfishlog@reddit
FSR4 and superior ray tracing make the 9070xt the better choice
definitlyitsbutter@reddit
The 9070xt will be better in the long run. Just because fsr4.
Wonderful-Lack3846@reddit
Same price right now? That's because the 9070 XT is overpriced; normally they should not cost the same.
As you said:
7900 XTX has a bit more raw power
But 9070 XT is the modern card which will receive more support. So generally the recommendation is to pick 9070 XT
SourceFull4556@reddit (OP)
There's a special offer where I can get a pre-built gaming PC with a 7900 XTX for about the same price as one with a 9070 XT. The rest of the hardware, like the processor, is identical. It's a limited-time deal from one of the biggest retailers in my region. Normally, if I bought the GPUs separately, the 7900 XTX would cost around 20% more, and in that case, I wouldn't consider it worth it.
At the moment, I'm leaning about 70% towards the 9070 XT, but there's still that 30% pulling me toward the 7900 XTX. I've heard a lot of arguments saying that in 4K AAA games, the 7900 XTX really shines thanks to its extra VRAM — but when I compare live benchmarks on YouTube, I can't really confirm that.