RX 7900 XTX vs 4070 Ti super
Posted by RelativelyOriginal@reddit | buildapc | 76 comments
This is the information I’ve received from forum browsing.
The 7900 XTX performs similarly to the 4080 at a lower price. While it lacks the software of NVIDIA cards, it makes up for that with brute force strength. Plus, the 24gb of vram secures the longevity of the card.
The 4070 ti super provides “value” without sacrificing too much performance, hence why I chose it over any 4060/4080 cards. Though I’m not sure how much longevity I’d be sacrificing with 16gb of vram.
It seems to me that AMD has fixed the chief complaint with the 40 series lineup, which is a lack of vram. However, according to userbenchmark, both cards I listed perform very similarly. It’s clear UBM has a strong bias against AMD, so I’m taking their results with a grain of salt. Other forums I’ve seen have been completely torn, so any clarification would be greatly appreciated.
Edit: Thank you to everybody who replied, there were a lot of interesting factors brought to my attention. There was a lot of praise for the XTX here, but some issues were raised too, like the driver and compatibility problems a few people had. I wanted to avoid it, but I suppose I'll fork over the extra cash for the 4080 Super, as NVIDIA cards tend to be issue-free. This also comes with the added benefit of DLSS and superior ray tracing performance, which I was sad to forfeit with the XTX. I'm sure I would be happy with either option, but I'm also more accustomed to NVIDIA cards.
For anybody who comes across this thread in the future: ignore UserBenchmark for AMD comparisons, and be aware that the 4070 Ti Super, RX 7900 XTX, and 4080 Super all come with trade-offs tied directly to price, but all are viable options depending on your budget. Hopefully this is helpful.
Mikizeta@reddit
Performance wise, the 7900xtx is superior to the 4070ti super.
The fact that UBM tells you they're tied is actually a good indication that AMD has the upper hand, as their bias is so strong they lie at every possible occasion in favour of Intel/Nvidia. They even tweaked their scoring algorithms to make AMD products perform worse.
As you said yourself, the 7900xtx gives 4080 performance at a lower price. If the 7900xtx and the 4070ti super are tied in price, I'd go AMD.
RelativelyOriginal@reddit (OP)
I suppose I’ll go for the 7900 XTX. Sure I’m sacrificing RT performance and DLSS, but 24gb of vram makes me salivate.
CatalyticDragon@reddit
Cyberpunk 2077 is one of the worst showings for AMD when it comes to ray tracing. Likely because it's an NVIDIA showcase.
That said, I have a 7900xtx and I'm running at 4k with all RT options on at RT ultra, and I'm getting a stable 60fps. Admittedly that's using FSR set to "auto", so the source resolution is of course lower than 4k, but I don't see any issue with image quality. The game looks incredible.
CDPR has been dragging their feet on updating FSR from 2.1 to v2.2 or even 3. Things will improve again once FSR is updated.
So that's worst case. On average the RT perf is only about 10% lower than a 4080.
inflabby@reddit
why are you only getting 60 fps LOL
CatalyticDragon@reddit
60hz TV
goldrimmedbanana@reddit
Isn't RT and DLSS more for older cards? If you're getting a beast GPU you don't really need RT and DLSS when the GPU can pump out mad frames and sexual graphics at stock, innit? Plus FSR2 has been greatly improved to be on par with DLSS so it's not that big of a difference anymore???
inflabby@reddit
You need RT and DLSS for 4K max settings in games like Wukong, Cyberpunk, Alan Wake, Horizon etc. to hit 60fps; even a 4080 Super isn't enough without them. Even if you're gaming at 1440p, you need DLSS to hit 100+ fps with RT in these games too. So the consensus is that RT plus DLSS gets you maxed-out 1440p settings, and 4K only on high. That's about it.
goldrimmedbanana@reddit
Thank you brother for the clarification.
weinbea@reddit
VRAM is overhyped, but DLSS isn't. It's way, way better than FSR. Plus Frame Gen is great for single player games.
junksong@reddit
Ya, dlss is definitely overhyped, so is ray tracing for that matter.
inflabby@reddit
It's not overhyped when I can get an additional 60 fps from DLSS, frame gen and RT while maintaining the same quality.
LePouletMignon@reddit
Which is what every Nvidia fanboy will tell himself. VRAM bottleneck is real, and Nvidia is the main culprit.
Mikchi@reddit
You know how many games I've played where my 3080 Ti ran out of VRAM at 1440p?
Zero.
CasKiller2@reddit
lol
Mikizeta@reddit
Makes sense. Also, I believe the RT performance of the 7900xtx is not inferior to the 4070ti's, and FSR has made strides recently.
In other words, you won't feel like you're missing out. Probably the opposite 😁
clampzyness@reddit
RT is a bit overrated imho. Lots of games with decent lighting/reflections etc. look good without it, and people generally don't care whether it's "real time". That's just the sad truth Nvidia doesn't want us to accept, so they can push their tech and justify their premium price.
karmapopsicle@reddit
RT shadows, reflections, and ambient occlusion are all far superior to baked in techniques now that we have sufficient power to run those effects with enough rays.
RT is the biggest leap forward in rendering fidelity since the switch to physically based rendering in the early/mid 2010s. The light and shadows are what ground the objects to the world being presented, and the differences can be quite staggering.
I'm absolutely in the category of "eye-candy enthusiasts", particularly for big immersive single-player titles. CP2077 for example looks very good maxed out in full rasterization mode, but everything still "looks like a video game". The fully pathtraced lighting in RT Overdrive mode is like flipping a switch on an entire generational leap in fidelity.
junneh@reddit
But the XTX performs like a 4070ti in most RT stuff, sans pathtracing, and that still cripples a 4090 without upscaling, so eh....
In light RT workloads, which we're seeing more of now, the XTX has no problems and performs like any Nvidia card. In hardcore Nvidia RT (Cyberpunk), yeah, it fails.
So it really comes down to price, and what games you play (my only game with RT is WoW, and there it's so light you don't even notice). In the EU the 4080S goes for 1150-1200, the XTX for 910, both decent AIBs. Then the choice is easy (XTX).
karmapopsicle@reddit
I mean that sounds a lot like the kind of thing you'd say when your experience with upscaling has been primarily FSR. DLSS 2 has gotten to the point where I'm quite confident saying Quality mode delivers "as good as or better" than native rendering.
More to the point though, even at native res a 4090 runs CP2077 Ultra + RT Overdrive at 60FPS in 1080p and 40fps in 1440p native res without upscaling, frame generation, or even ray reconstruction (which quite honestly is a must-have for it). The game was optimized to play smoothly at 30fps on the consoles, and it's legitimately an enjoyably fluid experience at 35-40FPS. And that's about what I get playing at 4K/DLSS Performance/RT Overdrive on my 3090.
Consider it from a different perspective:
A 4070 Ti Super is around 850 Euro, right? Performs a little better in "regular" RT loads than the XTX, and can quite easily deliver an excellent experience in fully pathtraced games where the XTX simply falls apart.
The question is this... what non-RT games are you playing where you're really going to notice the additional performance of the XTX? And that's without factoring in the effectively undetectable upscaling from DLSS Quality which basically levels that performance field. You're really going to spend 60 euro more on a card that falls apart in the bleeding edge technologies that are realistically how you're justifying spending so much on a GPU in the first place?
junneh@reddit
A different perspective for sure. And not a wrong one.
Personally I don't play any games with ray tracing and doubt I will in the (near) future unless it becomes the standard (some gens out for sure).
I still play WoW casually, but the RT in there is so light you don't even notice it (both visually and in performance). Nor do I play typical AAA games. Most of my stuff is CPU-limited first (large multiplayer, RTS, etc.) on a 7700X. This is all at 1440p/165.
I want some extra horsepower from the GPU for my 4k60 simrig (flightsim, racesim), but it seems the 7800XT is already managing it with maybe some slight settings tuning needed (fine for me).
Currently debating what to get. I bought an XTX at 900 (which in euroland is a nice price) but it's just too much for what I need. Getting a 4070ti for about the same price (30-50 euro less) doesn't seem smart for me, personally.
Debating whether I should just get a 7800xt/7900gre and call it a day.
The 7900XT would have been perfect, but pricing for nice models is back up to 800 now in euroland... which doesn't make sense vs the 900 XTX or the 850 4070TI super.
The 4070 Super I'm wary of at 4k with 12 gig and the sims.
With the 4080 Super I don't see the extra value in paying 200 more over the XTX, in my use case.
My power price is somewhat high so I do value efficiency, but I feel it's somewhat overblown: it's around a 50 watt difference per tier on normal (non-crazy AIB) power settings, and that's running maxed out, which it won't do all the time, so pretty negligible.
Budget is not much of an issue, but it's useless to pay 300-400 more for horsepower you are not really gonna use. It's better spent when the next series is out :)
karmapopsicle@reddit
Ah it seems like you have already been spending quite a bit of time weighing your options here. If I was exactly in your position, needing a card to handle both 1440p/165Hz also with strong 4K/60 performance at the price points you mention, here's what I would most likely end up deciding on and some of the reasoning for it.
Asus RTX 4070 Ti Super TUF, €889.00 on the German PCPartPicker listing. TechPowerUp has a full review of this exact card.
My personal opinion here is perhaps slightly controversial around /r/buildapc, but I strongly believe the XTX is actually overpriced. If it were around €750-800 it would be a much more competitive product, but as it stands right now I think the card is just too far behind in a variety of features to justify that price. In essence, while it delivers around 17% (1440p) to 21% (4K) better native resolution pure rasterization performance, I don't find that enough to make up for the feature deficits.
DLSS2 (upscaling) is the most important one from a purely gaming perspective. It is simply in a different league when compared against FSR2. Particularly at 1440p and 4K, DLSS Quality preset delivers as-good-as-or-better image quality than native resolution - much of this results from the upscaler handling antialiasing far better than existing solutions. What you will tend to find comparing native to DLSS Quality side-by-side, when actually moving around in a game, is that you'll either be unable to tell the difference, or you may pick out slightly better image stability on fine details in the distance with the DLSS image. Enabling that in any game that supports it essentially wipes out the raw performance advantage of the XTX. While you could use FSR to gain back the difference, even at the Quality preset you are always introducing some artifacts to the image simply by virtue of how it functions. For me the ghosting/shimmering/detail instability introduced by FSR is highly noticeable and can break immersion.
The part I find funny is that if you dig back into the whitepapers and presentations from the engineers who worked on DLSS, you will find explanations as to all of the various problems that show up with various traditional scaling techniques and exactly why a deep-learning solution was created to solve them. Those are all exactly the same issues that are pretty much universal to FSR.
Some of the other features I use occasionally include NVENC (which is still noticeably better than AMD's encoder), RTX Broadcast for some excellent microphone noise removal and quite good virtual green screen, and I dabble with AI image generation from time to time.
On the topic of power consumption, the XTX is running into the same issue we saw with the 6000-series here. The only card that actually seems to deliver that ~355W power consumption is AMD's in-house design. Even cards on the cheaper end like the XFX RX 7900 XTX Merc 310 are delivering real-world average gaming power consumption in excess of 400W. AMD has certainly done a solid job fixing the excessive idle and multi-monitor power draw figures since launch, as those cards used to draw upwards of 90W in those situations! However the point remains that the TUF 4070 Ti S here delivers a 292W average gaming power consumption compared to 406W for that low priced XFX XTX, and even 20-30W more than that for some other AIBs.
Assuming you're paying roughly €0.40/kWh for power, and gaming for say 3 hours a day on average, that's an extra 125kWh per year, or in other words an extra €50. Over a 4 year period that's enough that you would have spent the same total if you had just gone and bought a 4080 Super instead.
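If you want to sanity check that claim, the back-of-the-envelope math is simple enough to script (the wattage, hours, and price-per-kWh figures are just the assumptions from above, not measurements):

```python
# Rough yearly cost of the gaming power gap described above (assumed figures).
gap_watts = 406 - 292                  # XFX XTX vs TUF 4070 Ti Super average gaming draw
hours_per_year = 3 * 365               # ~3 hours of gaming per day
extra_kwh = gap_watts * hours_per_year / 1000
extra_eur_per_year = extra_kwh * 0.40  # at ~0.40 EUR/kWh

print(f"{extra_kwh:.0f} kWh/year, ~{extra_eur_per_year:.0f} EUR/year, "
      f"~{extra_eur_per_year * 4:.0f} EUR over 4 years")
# -> 125 kWh/year, ~50 EUR/year, ~200 EUR over 4 years
```

Obviously your own gaming hours and electricity rate move that number around a lot, so plug in your own figures.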
junneh@reddit
Well, that's on XFX then.
The XTX I still have and am on the fence about is 360W with the PL slider at stock (Sapphire Pulse).
The XTX my friend has (Hellhound) is 355W with the PL slider at stock.
The 4070ti super I can actually get for 850 (KFA OC) lists a 320W TDP in the spec (no idea if that's with the slider maxed out or at stock).
So I don't think the difference is all that big. And the cards won't be maxed in my games, so actual power use will be lower; 4k60 in my sims is peanuts for an XTX and it's doing ~180 watts in my own tests.
Not sure if Nvidia scales much better in lower power situations, but it's not a very interesting metric to decide a GPU on :P
I didn't look into upscalers much, but I'll believe you on DLSS, reviews say the same - but again, in my use case I'll hardly use it.
And yeah, I'm indeed between the Ti Super and the 7900xtx. Or just cheaping out on a 7800xt and seeing what next gen brings.
karmapopsicle@reddit
I mean if the 7800 XT is capable of delivering acceptable performance for you now I'd say it's definitely worth saving the what... 300-400 euro? Bank that away and put it towards a next gen card.
junneh@reddit
Yep, I'm leaning that way. Will probably end up with a GRE if they dip below 550.
inflabby@reddit
tbh you don't really need an Nvidia card if you're not gonna use RT. You probably don't need a 1440p card either. 1080p.
Leisure_suit_guy@reddit
Unfortunately for AMD this game with path tracing is a system seller.
miata85@reddit
Cyberpunk being an Nvidia-sponsored title has directly affected its ray tracing (RT) implementation. Nvidia sent its engineers to assist with, or even completely code, the game's RT. As a result, Nvidia optimized the game in a way that significantly hampers AMD's performance with RT.
Ray tracing involves bouncing lights to create reflection effects in the game environment. Since it's too costly to do this for the entire world, the space is divided into smaller sections using a Bounding Volume Hierarchy (BVH). You can think of a BVH as a cube that is subdivided into 8 smaller cubes, each of which is further subdivided into 8 cubes, and so on. This process continues until the space is divided into small enough cubes that lighting reflections can be calculated efficiently.
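If it helps to picture that, here's a toy sketch of the recursive "split each cube into 8 smaller cubes" idea (purely illustrative; real BVH builders in drivers and hardware are far more sophisticated than this):

```python
# Toy illustration of the recursive space subdivision described above.
# Shows how the number of boxes a ray might have to test grows per level.
def subdivide(origin, size, depth, max_depth=3):
    """Return this cube plus all of its recursively generated child cubes."""
    nodes = [(origin, size)]
    if depth == max_depth:
        return nodes
    x, y, z = origin
    half = size / 2
    for dx in (0, half):
        for dy in (0, half):
            for dz in (0, half):
                nodes += subdivide((x + dx, y + dy, z + dz), half, depth + 1, max_depth)
    return nodes

# 3 levels deep: 1 + 8 + 64 + 512 = 585 cubes, and every additional light
# bounce means walking a structure like this again for the new rays.
print(len(subdivide((0.0, 0.0, 0.0), 100.0, 0)))  # 585
```

Where that structure lives (dedicated hardware units vs. the driver and system memory) is exactly why the traversal cost hits AMD harder as the bounce count goes up.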
AMD's current BVH implementation (in RDNA 2 and RDNA 3 architectures) relies on system RAM via the driver side, which is a slower solution. This is one of the main reasons why AMD's RT is generally slower than Nvidia's, though it is usually manageable. Normally, games use 1 or 2 light bounces, as more bounces provide diminishing returns in visual quality.
However, for Cyberpunk, Nvidia configured the game to use 4 bounces. While this offers little improvement in visual quality, it significantly affects AMD's performance due to their less efficient system memory BVH. Consequently, an AMD XTX card performs comparably to an Nvidia 3090-3090 Ti in typical RT games like Alan Wake 2 or Metro Exodus, where it is one generation behind but still usable. However, in Cyberpunk, its performance drops to the level of a 3070 or 2080 Ti, which is unacceptable for a card of its price and supposed performance. A similar situation occurred with the game Control.
In contrast, for Unreal Engine 5 games and other RT titles, AMD's performance is typically around 20% behind Nvidia, not the 50% gap seen in Cyberpunk.
JoeCollins19-99@reddit
I am currently putting together a system and thinking about getting a mobo and RAM that will support 7400-7600MHz RAM speeds. Given that the XTX uses system RAM, would the faster RAM make any noticeable difference in ray tracing performance?
Leisure_suit_guy@reddit
It was an interesting read. However, I wouldn't say that Cyberpunk 2077's RT implementation has "diminishing returns in visual quality".
C77 with path tracing is night and day compared to no RT or even normal RT.
From my point of view it's not C77 that overdid it, it's the other games you mentioned that underutilize RT.
So, saying that AMD makes GPUs that are not so far behind in RT, but only in games where RT doesn't make a meaningful visual difference, is the same as saying "buy AMD only if you don't care about RT" (which I may, with my next GPU. I've already finished C77 a couple of times and I'm sick of Nvidia's VRAM shenanigans).
Systemlord_FlaUsh@reddit
It makes dynamic lighting and effects much more realistic. Just play Darktide with RT on. It's absolutely sick; unfortunately, so is the performance cost.
Systemlord_FlaUsh@reddit
You can run anything maxed out in 4K without RT but RT can improve some games.
karmapopsicle@reddit
In lighter RT implementations it's fine, but AMD falls apart in much heavier RT/pathtracing. In Alan Wake II pathtracing it's only just above a regular 4070. In Cyberpunk it's trading blows with a 3060/3070.
It still suffers from all of the same easily reproducible image stability and artifact issues. Until AMD figures out their own deep learning upscaler to accelerate with the newer AI hardware in the 7000 series, the tech is going to remain a couple of generations behind.
seenasaiyan@reddit
The AMD CEO confirmed hardware-accelerated AI upscaling is coming to FSR in 2024. All RDNA3 GPUs have AI cores. Keep up, Ngreedia guy.
karmapopsicle@reddit
That's a funny way to stretch the CTO vaguely hinting about AI and upscaling in an interview. Lisa Su certainly hasn't made any announcements on the topic, and nobody at AMD has "confirmed" or provided any concrete info about this.
It's of course inevitable that they will eventually have a ML-enhanced FSR. The question is whether this is something the rudimentary AI enabling hardware present in RDNA 3 can even do, and if so, when it will become available and how long will it take to catch up to the progress Nvidia has made?
I'm only an "Nvidia guy" right now because AMD hasn't made a compelling enough product for my needs since the R9 290 I ran for 5 years.
seenasaiyan@reddit
No, you're an Nvidia shill because you're calling the AI cores in RDNA3 GPUs "rudimentary" with absolutely zero evidence or basis. The fact that FSR3 even comes close to DLSS without using ML upscaling actually bodes really well for the results when AMD does implement hardware accelerated AI upscaling. I'm actually objective; my last GPU was a GTX 1080 and I kept it for years.
You’re just salty because you probably overpaid for an Nvidia card with inferior rasterization performance and less VRAM than the cheaper AMD equivalent. And now AMD is going to achieve upscaling parity before your card is even old enough to need to make use of it. Cope.
karmapopsicle@reddit
Might have been worth your while to bother reading the linked article...
Now, let's be a little more technically accurate here. The AI accelerator AMD has implemented in RDNA 3 is called WMMA (Wave Matrix Multiply Accumulate) and is based around a similar idea to Nvidia's Tensor cores. I don't think HotHardware's take was really justified in calling them "rudimentary", but more charitably they are a little over a generation behind Nvidia in performance there. The 7900 XTX (123 TFLOPS) delivers FP16 performance similar to a 3080 (119 TFLOPS). For comparison a 4090 delivers 330 TFLOPS.
FSR3 is frame generation. FSR2 is the upscaler. Same naming scheme as DLSS. Both versions of FSR suffer from all of the artifacts that were demonstrated explicitly as examples of what the deep learning solution (leveraging the Tensor cores for upscaling, and Ada's optical flow accelerator for frame generation) remedied. AMD is essentially using the same game engine hooks DLSS uses to deliver an experience that mostly offers a better result than simple bicubic resolution scaling/TAA/CAS/etc, but it's simply not in the same league as the far more refined results that DLSS delivers.
FSR always reduces image quality. The DLSS algorithms are so good at improving image quality that they can be used at native res (DLAA) or even super-sampled resolutions to further improve over native (DLDSR). These days I will almost always enable DLSS Quality by default if available because it has improved to the point of being at least as good as native while often providing subtle improvements to aliasing and stability in distant objects - yes, even compared to native.
I've even played around with FSR3 frame generation on my 3090, being that I can't use DLSS3 FG. While I was happily surprised at first at what seemed to be a free and near-perfect doubling of my framerate in CP2077 RT Overdrive, that illusion came crashing down as soon as I hopped in a vehicle and the whole thing completely fell apart. Made it immediately apparent why Nvidia locked their solution to the Ada cards with the new hardware.
Yeah, they've got all the bones there already, they "just" need to implement it. So where is it? That's the kind of feature that they would be shouting from the rooftops if it was coming to the 7000 series. Even if it was a year away, that's the kind of thing that would reinvigorate sales of the line and boost buyer confidence that their cost savings now will be rewarded with "aging like fine wine" in the future.
There's no doubt in my mind they have a team of developers working on it. The fact they haven't said anything more than the super vague statements quoted in the article above to me is worrying though. Sure there's a chance it's down to them investing huge amounts of training time into their model to really change things up and put out a highly polished and straight up "as good or better" option when it's ready, but it could also be that they're running into problems getting it working to an acceptable standard on existing RDNA3 hardware, and it's being pushed to launch alongside RDNA4 as an exclusive feature to those cards.
I was team red since my ATi Rage 128. A decade ago I was literally here trying to preach about how nobody in their right mind would buy a GTX 970 when the R9 290 was so much cheaper for basically the same performance.
I would love nothing more than for AMD to achieve parity with DLSS and heat up competition in the GPU space once again. Literally everybody benefits from that. I'm not really sure why you seem to think I give a shit about mega tech corp A or B though. I'm a geek - I read the fucking whitepapers and drool over new rendering technologies. AMD and Nvidia are both out to enrich their shareholders, and the executives of both companies don't give a shit about you or me. Right now Nvidia has the technologies lead across the board. I think AMD has been fumbling the Radeon brand for years, and so far RDNA has been a year-over-year story of trying to play catch up instead of getting the proper investment and market strategy it needed to actually carve out some of that overwhelming marketshare held by Nvidia.
They did it to Intel with Zen. And if the rumours for RDNA4/5 are true, they might just have a glimmer of hope of doing the same to Nvidia.
Impressive-Cry-2237@reddit
Bruh, sorry to necro this, but this thread was a good read! Any more info now that we’re almost a year forward?
karmapopsicle@reddit
Indeed. FSR 4 will be "fully AI-based". They're exiting the 'enthusiast' GPU segment with RDNA 4 to focus on the midrange/performance segment. Hopefully they're priced competitively enough for the performance to get a better foothold than they've had in a long time.
Impressive-Cry-2237@reddit
Thanks for all that info! Sounds promising. AMD are already holding crazy strong in the average GPU market, so this whole plan seems to make sense with the current trends and where AMD are at. Hopefully Nvidia's top-end prices don't go too crazy though if there's not much left to compete with them. Keen to see where it all goes!!
amadeuszbx@reddit
Really, I'm not an AMD or Nvidia fanboy, and I just want to reiterate how shit, worthless and comically biased against AMD UserBenchmark is. It is literally worthless as a source of reliable information; their ridiculous vendetta is well known online and completely mental. Some people refuse to believe it is real and think the guy behind the site is just one big troll. I personally think he is mentally ill.
goldrimmedbanana@reddit
I was comparing some Nvidias and AMDs and there is now a massive blurb against the AMD cards... like a literal hit piece... I was caught off guard and started looking up what AMD had done... it was surreal. Here is an example:
"AMD’s domination of social media platforms has historically resulted in millions of users purchasing sub standard products, those users will be very hard, if not impossible for AMD to win back. If this trend continues, semiconductors may become a secondary business line for AMD, who appear more focused on developing “Advanced Marketing” relationships with select youtubers and media outlets."
"Despite steady price cuts, an increasing number of seasoned gamers simply have no interest in buying AMD products. They know from bitter experience that headline average fps are worthless when they are accompanied with stutters, random crashes, excessive noise and a limited feature set. Most gamers, who are better off playing at 1080p, will do well to wait for Nvidia’s upcoming 4060/4070 series cards (est. early 2023). Even brand fans that wish to be in AMD’s “2%” club, will find better deals after the launch hype settles. Shoppers should avoid AMD’s reference design as many users are reporting thermal issues."
Straight from the conclusion section on the 7900 card I looked up. What the heck is this lol.
S01arflar3@reddit
Lisa Su 100% fucked his wife
Avalanche-777@reddit
The thing is, each to their own of course, but I have gone from my 7800xt to a 4070ti super. Sure, the 4070ti super is a premium Asus TUF and the 7800xt was a Sapphire Pulse, which I believe is not counted as a premium card.
But here's the thing with DLSS: most games that have upscaling tend to have DLSS with FG, most games that have FSR tend to be stuck on 2.2 or something, and some that have FSR3 don't have FG. For me, if I am getting 120fps I am happy, and AMD can't do that on rasterization performance alone anymore. Games are becoming more and more dependent on upscaling, and like it or not, DLSS right now is far better than FSR 3.1... maybe FSR 4.0 might be better, I don't know.
TincanTurtle@reddit
Also, if you're willing to provide the XTX with a better thermal solution, then you can OC it and get close to 4090 raw performance.
deep_learn_blender@reddit
You're very unlikely to notice any vram benefit, even in the future, even at 4k ultra.
It's unlikely to impact you in the future because, by the time it matters, the card will be too slow to take advantage of it -- that is, rastering performance is likely to be the bottleneck by that time.
Vram above 16gb is just for workstations, primarily ai and 3d work.
No-Chicken-2704@reddit
This. Though I could see an advantage in the 7900XTX for supersampling... rendering at 8K and downscaling to 4K.
rippingviper@reddit
AMD's new frame generation tech is amazing. My wife's 6800xt gets double its performance when it's turned on. Makes games silky at 1440p, so I'd assume it would be even better on the 7900xtx.
Amak88@reddit
One thing I notice with my 6600XT and frame gen is that if you are maxing out the card, frame gen can heavily tax performance.
I've been playing Mafia 3 @ 1440p; with frame gen ON the Steam overlay shows 50-65fps (120ish fps in the AMD overlay), with frame gen OFF it's 120fps native.
Not only did it halve my fps just to double it back via AI, it also came with 'frame gen lag' of 22-24ms.
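That lag figure roughly checks out if you assume interpolation-style frame gen has to hold back one real frame before it can blend between two of them (a hedged back-of-the-envelope with assumed numbers, not measurements):

```python
# Rough sanity check on the 'frame gen lag' figure above (assumed numbers).
# Interpolation has to buffer one real frame, so added latency is roughly
# one base frame time plus some generation/presentation overhead.
base_fps = 60                      # assumed base framerate once frame gen eats GPU headroom
base_frame_time_ms = 1000 / base_fps
overhead_ms = 5                    # assumed interpolation + presentation overhead
print(f"~{base_frame_time_ms + overhead_ms:.0f} ms added latency")  # ~22 ms
```

Which lands right around the 22-24ms observed, and it's also why the penalty feels worse the harder the card is already being pushed.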
It is very handy since it supports every DX10/11 game, so in games like GTA 5 where the physics mess up above 120fps you can just enable frame gen for the sweet 240fps. But when you're really pushing the card, it can be more of a hindrance than a help.
Numerous_Gas362@reddit
Having 24GB of VRAM is literally meaningless, by the time games actually scale to use that amount of VRAM, the 7900 XTX won't have enough raw power to remain relevant. These absurdly unusable VRAM amounts of the higher end GPUs serve no purpose other than baiting people into buying these GPUs. It's a marketing tool and nothing more.
CasKiller2@reddit
Not this again, the same moot point over and over.
karmapopsicle@reddit
Just to be clear here: 24GB of VRAM is basically just marketing nonsense. The only semi-realistic scenario where you'd even be close to needing >16GB would perhaps be a few already heavy games combined with a lot of very heavy 4K-8K texture modding.
None of the more general actual real-world use cases that require that much VRAM are really any good on AMD cards - AI training/generation, heavy rendering, CAD, etc. In that case time is money and you'd already be out looking for the best price you could find on a 4090.
By the time 24GB ever becomes even a stretch target for developers that card is going to be too outdated to be useful anyway.
Systemlord_FlaUsh@reddit
16 GB may be enough, but I don't see a reason to pay a premium to get less in the case of the 4080. Now it may be more appealing, but when I bought the XTX it was a 350 € difference. That money bought me 3 TB of SSDs and a new PSU (which I needed to run the card).
karmapopsicle@reddit
In that case the most important reasoning is simply the price. Even if both cards had 16GB or 24GB, if your local pricing at whatever time you were buying put that much of a price gap between them it's a very different question.
Right now just taking German prices as a baseline for European pricing, there's only a €50 difference between the 7900 XTX (~€950) and the 4080 Super (~€1000), and at that point a few percent better rasterization performance and +8GB of VRAM don't make up for the feature deficit.
With today's prices the XTX would be competitive at €750-800, undercutting the 4070 Ti Super's price (€830) with enough extra raster performance to compensate for the feature deficit.
Systemlord_FlaUsh@reddit
If it was really just 50 €, then the 4080 may be the better choice. But not for 350. And it's not like the XTX can't do RT at all; it's about on a 3090/Ti level, which is impressive by AMD standards. RT without DLSS/FSR is hardly feasible, so I don't pay much attention when people say FSR is shitty. FSR3 has improved a lot and will likely improve further; I hope AMD will finally utilize those ML cores the RDNA3 architecture has.
karmapopsicle@reddit
Agreed 100%.
In most common current implementations with RT effects like reflections, shadows, and ambient occlusion, yes, certainly performant enough to be reasonable.
The major one that gives me the most pause though is how it completely falls apart with RTGI (aka path-tracing, fully ray traced lighting, etc). Particularly noticeable in CP2077's RT Overdrive. It's basically unplayable on any AMD card regardless of how much upscaling you throw at it.
Not a major factor for most people, sure, but having experienced how breathtaking it is I'm already eagerly awaiting future titles implementing it. For context, I'm still running a 3090, and I happily played through the entire game at ~35-40FPS at 4K with DLSS Performance. While I could easily have run it maxed out raster only with a clearer image at 80+ FPS, it was one of those situations where once I knew what I was missing I just couldn't go back.
FSR still suffers from all of the exact artifacts that were the genesis of utilizing ML for DLSS. The ML image reconstruction is the secret sauce, and it's really disheartening to see a year and a half post launch AMD hasn't even mentioned working on integrating ML into FSR, let alone a timeline.
I tend to give FSR a try with any new game I'm playing just to get an idea of how much progress they've made, and every time I inevitably have to switch back to DLSS because in particular I find the shimmering/ghosting artifacts extremely distracting.
The FSR frame gen was really impressive at first when I loaded it into CP2077, but only until I stopped just casually walking around and looking at stuff. Completely fell apart the first time I started moving quickly or jumped into 3rd person view in a vehicle. Doesn't handle RT shadows correctly, and isn't aware of HUD elements.
Systemlord_FlaUsh@reddit
You can run RT on the XTX. Use FSR3, or crack the LukeFZ mod if it's not available. You may not max anything out, but low-medium RT and global illumination is better than none. Compared to a 4070/Ti the performance malus might not even be that big. Only the 4090 makes a considerable difference; I would want one, but the price is not justifiable.
Shadow_Halls@reddit
With dlss and frame gen it will still beat out the xtx in many titles.
Native resolution is a bit of a thing of the past
CasKiller2@reddit
Native resolution is the real thing - and still the best one in terms of quality. DLSS is just fake resolution all the way around, made to compensate for a weak GPU.
inflabby@reddit
The problem is the real thing is not powerful enough to run the newer games at max settings. Still can't hit 4k max settings at 100+ fps.
Audiman09@reddit
I don't mean to hijack this thread, because this is good information, but could you provide more info on how/why UBM favors Intel/Nvidia so much and is biased negatively towards AMD products? I've heard this a bunch but I've never understood why on earth a "benchmark testing platform" would want to be biased in that way unless they were receiving some form of "kickbacks" from one brand. I just want more knowledge to understand and better manage my expectations of UBM.
Please know I'm just genuinely just curious about the subject, and don't intend to sound like I'm asking for "EVIDENCE OR IT DIDN'T HAPPEN" lol.
CasKiller2@reddit
Money. Simple as that.
BZJGTO@reddit
Performance wise, the 7900 XT is superior to the 4070 TI Super. The XTX is a tier above.
karmapopsicle@reddit
All depends on what your performance goals are. If we're just talking straight raw rasterization performance, sure, the XT has a small lead over the 4070 Ti Super. If you're buying a high end GPU so you can enable all the eye candy in modern games, that flips around to have the 4070 Ti Super coming in noticeably faster than the XTX.
If you "don't care about RT" and just want solid frames with raster only... why are you spending so much on your GPU?
Kevosrockin@reddit
Best answer I've seen. People who don't care about ray tracing: why are you buying a 1k GPU lol??
aVarangian@reddit
For 4k.
Kevosrockin@reddit
When the 4080 Super is the same price and has DLSS and basically the same rasterization performance, I'm taking that every time.
aVarangian@reddit
Prices vary. If you find the 4080/Super for the same price, and you also don't need something traditionally sized like the reference XTX, and you're not concerned about the tiny 16GB (for 4k), then sure, obviously go for the 4080.
Kevosrockin@reddit
Imagine thinking your 24gb of vram is gonna help years down the road when the card performs the same. It won’t have power so the 24 gb won’t matter.
aVarangian@reddit
Idk why you're being such a fanboying ass out of nowhere. VRAM is like RAM: having more than needed won't do anything for you, but being short on it is gonna be a painful experience. And for people who keep their stuff a long time it's wise to have a buffer instead of e-waste.
https://www.youtube.com/watch?v=Gd1pzPgLlIY
Kevosrockin@reddit
You are straight fan boying for amd when the nvidia card has way better features for the same money.
CasKiller2@reddit
Keep with your fake resolution with AI. I want 4k native, the real thing.
Diredevil1@reddit
RX 7900 XTX without a question; even if it was up against the 4080, I would still go with the XTX.
You could probably go with the 7900xt and it would still be better value than the 4070 ti super.
The only reason AMD cards are not popular is because Nvidia was on top for the last decade and all the fanboys are super loud. But when you start looking deeper into things and check some trustworthy reviewers like Gamers Nexus, you'll find that AMD provides, AS OF THIS MOMENT, much better value minus RT (and how much does anyone really care about that?).
inflabby@reddit
It matters when you play games that need RT/DLSS/frame gen just to hit a decent 60+ fps, especially the new AAA games.
AvroArrow69@reddit
A little bit of advice... NEVER lend any credence to what userbenchmark shows. They are notoriously anti-AMD and among actual tech experts (like me), they are unaffectionately known as "loserbenchmark".
I have an RX 7900 XTX and it's an absolute unit. When I install a new game, I simply set it to 4K ultra settings.
While the card can do RT about as well as an RTX 3090, I've tried RT on it and I wasn't the least bit impressed. I just run games at 4K ultra and enjoy the performance.
Aggressive-Ad-7222@reddit
I own both. The XTX brute forces its way to victory, though in my own experience I needed a 1000W PSU to get it stable. That being said, the software experience has improved significantly month on month, and I personally think Adrenalin is an awesome one-stop-shop tool. Also, now that FSR3 is popping up in games, it really hums. That said, I was able to squeeze the 4070 Ti Super into a mini-ITX case with a dual-fan model and only a 750W PSU; it's essentially my portable powerhouse. Both are great cards. The edge is definitely with the XTX: it does 4k without a hitch where the 4070 struggles. The 4080 Super is the way to go there, and it's more competitively priced than the previous stock 4080.
No_Ambassador_4522@reddit
We got an ASRock Radeon RX 7900 XTX Phantom Gaming OC for a friend for 830 Euros, only 30 Euros more than the cheapest 4070 Ti Super. In scenarios where the XTX is in the 20-70 euro more expensive range, I would go with the XTX. I do appreciate the VRAM comments; there is certainly value there, not today but 5 years from now. It will probably have better 2nd hand value as well. I personally have a 4090, but ray tracing, although it looks beautiful, is not something you notice that much when you are immersed in a game. The hotspot issue with the XTX is an issue on paper but shouldn't necessarily be a problem, as the core temperature where the sensitive components sit stays cool. It also depends on case airflow, which can be optimized with clever planning. Overall, if I wanted the best performance with a limited budget and some future proofing, I would go with the XTX if it is within a less-than-100-euro premium. I don't think it makes sense above that, as the performance gain is there but game specific.