Daniel Owen - RX 9070 XT vs RTX 5070 Ti - The Ultimate (Re-Review)!
Posted by Antonis_32@reddit | hardware | View on Reddit | 202 comments
imjustaminer@reddit
I bought a 9070 XT for 849 at Newegg, but just got a 5070 Ti for 849 at B&H. Ended up returning the 9070 XT because the wattage was always around 30-70 watts higher. Outside of that, I really liked it.
PS: there are still some for 849 at B&H.
Antonis_32@reddit (OP)
TLDR:
RTX 5070 Ti is 4.5% faster (including path tracing results)
extrapower99@reddit
Yeah, with one little detail: IT'S NOT 111 games, it's 8 games, with 100+ benchmarks.
crshbndct@reddit
9070xts are 75% the price of a 5070ti here, so it was a pretty easy choice
24bitNoColor@reddit
Germany: 9070 XT is 700 Euro, 5070 Ti is 800 Euro, that is just 12.5% cheaper. For 4.5% less raw power, a lot fewer games that support AMD's Reflex equivalent (everybody insists on low latency being super important, so this should be super important...), a lot worse support for good upscaling (DLSS and FSR 4), no Ray Reconstruction (meaning, among other things, that RT reflections remain at render resolution instead of getting upscaled), all the other little Nvidia perks...
I don't see it.
tyrannictoe@reddit
You came from 1060 and you only considered it alright?
Damn it might take a 5090 to satisfy you
thelastasslord@reddit
I upgraded my brother from his Radeon 580 to my old rtx 3080 and he hardly noticed, except to say of age of empires 4 that "oh yeah the units look a bit different".
chapstickbomber@reddit
Give him a 4k monitor
thelastasslord@reddit
He already had one.
chapstickbomber@reddit
Why does your apparently blind brother need either of those things
Vushivushi@reddit
Maybe they have a 1080p monitor.
crshbndct@reddit
I doubt it. Going from a HD5770 to a 1060 was a big dramatic uplift. Aside from games that won’t run without RT, it’s pixel peeping to see the differences.
My previous upgrade was “most of your library won’t run at all” now it’s “everything runs, but the new card just makes it look better”
I am getting the same enjoyment from the games that I was before; the difference is that I'm now running in 4K, whereas before I was at 1080/720p. But the gameplay and level of fun is the same.
It was definitely time to upgrade, so I did, but I am not getting any more enjoyment than I was before.
It is nice to be able to game on Linux though. The nvidia drivers were such garbage there.
I should also note that I don’t mind games running at 30 as long as they are consistent. The improvement to 120fps is absolutely huge.
Schmigolo@reddit
As someone who went from 1070 to 5070, I'm gonna be honest after so many years of putting settings lower I realized it doesn't make much of a difference.
Now I get to put settings on high with a good performance uptick, but my experience hasn't changed much except for serious outliers like CP77 or MHW that I wouldn't play on my 1070.
kuddlesworth9419@reddit
It's just over 4X the uplift I think? But more than twice the price as the 1060 was back in the day.
tyrannictoe@reddit
your math is definitely wrong somewhere
1060 can't even run some of the newer titles
kuddlesworth9419@reddit
Touché. I can't run the new Doom either :(
StumptownRetro@reddit
When they are in stock. Yes. They are that price. But I find the Ti more readily available and probably will benefit from more sales in the future to bring the price down.
Qweasdy@reddit
Looking on amazon right now here in the UK the cheapest 9070xt's are £674.99, the cheapest 5070 ti is £796.49. Both available for 1 day delivery, and in both cases there are a bunch of alternative brands within a few % points on price.
So as of the time of this comment a 9070xt is ~85% of a 5070 ti on amazon. Other retailers are usually a bit cheaper and prices vary from day to day but this has been a relatively consistent price gap in my experience.
supercakefish@reddit
Right now Overclockers and Scan are the cheapest for these GPUs. Overclockers has the cheapest 5070 Ti for £720 while Scan has the cheapest 9070 XT for £660.
Immediate_Dependent9@reddit
If you're a UK shopper you can pick up an RTX 5070 Ti for as low as £719, and at that price it makes the 9070 XT a tougher sell.
kuddlesworth9419@reddit
I think it depends on what your budget is. If you can't justify spending more than £700 just on the GPU, then the 5070 Ti is a no-go.
StumptownRetro@reddit
In the US the situation is quite the opposite where both are out of stock constantly and the 9070 XT is routinely scalped for double its MSRP if not more.
SEI_JAKU@reddit
Sure wish people would stop saying this, it isn't true at all.
Right now on Newegg, even all the current open box sales are consistently cheaper for the AMD card. All of the retail prices are consistently lower. Right now on Micro Center, same thing. Likely the same thing for any store at this point.
ron41593@reddit
Shhhh dont tell him that! I have one in my cart waiting for payday 😋
popop143@reddit
Ehhh, even in my country that is notorious for not having readily available stock during the 4000-series and 7000-series GPU launch days, the 9070 XT and 5070 TI both have a lot of stock (seems like not a lot of buyers). For comparison, they're currently at $950 for 5070 TI and $830 for 9070 XT.
StumptownRetro@reddit
Canada?
Normal_Bird3689@reddit
Both are in stock around me and the 9070xt is still 20% cheaper.
crshbndct@reddit
Yes when they aren’t in stock the prices are pretty close, but I just waited and refused to get scalped.
fnsv@reddit
Same situation in my country down to the T - 9070 XT was just a steal for that ratio.
SJGucky@reddit
In Germany it is 799€ for the 5070 Ti (cheapest) vs 729€ for the 9070 XT.
For different models add 50€ on top.
Lalaz4lyf@reddit
I still have a 1060 in my desktop. I'm still waiting on anything to make me excited enough to purchase a card over the open box gaming laptops I've been buying for years now.
SunfireGaren@reddit
The 9070 XT is close to the 5070 Ti in ray tracing, but loses massively in path-traced titles (though there are not many out right now).
Aggravating-Dot132@reddit
With PT it is also worth noting that all of them are optimised specifically for the green card. So it's not like AMD is doing badly; it's just that PT hardware is very specific, and if you optimise for one team, the other team gets nothing.
That said, with Optiscaler and upscaling, PT is playable on both.
Zarmazarma@reddit
It's more that Nvidia cards have the features necessary to do PT efficiently, and AMD cards do not. No amount of "optimization" is going to make CP2077 run at the same frame rate with the same visual presentation as on a 5070 Ti, even separate from upscaling.
Aggravating-Dot132@reddit
By optimisation I mean the design in the first place. For example, Nvidia RT cores are designed for 4 bounces (in theory), and AMD's for 3.
The only good way to fix the issue is to decrease the number of bounces. Check mods for Cyberpunk: once the bounce count is decreased a bit, PT flies on AMD as well (I don't remember by how much it was decreased).
Thus yes, Nvidia does PT better, but that's what you get when the special effect is designed around it. Remember HairWorks? AMD had identical quality that worked much better for all NPCs and with better performance (Deus Ex MD). Later on, TressFX was adopted into engines directly.
_I_AM_A_STRANGE_LOOP@reddit
There is no hard threshold here. AMD cards are bad at highly recursive PT/RT because they lack a BVH traversal engine, unlike Nvidia cards. AMD cards need to carry out this work on generic compute, while NV is hardware accelerated. The deeper the BVH/average recursion, the bigger the IHV difference will be - but there is no hard cutoff at 3/4 bounces.
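The "traversal on generic compute" point above can be pictured with a toy loop. This is an illustrative Python sketch only, not vendor code: the node layout and the 1-D "scene" are invented for clarity (real traversal works on 3-D AABBs), and the ray direction is assumed nonzero. The key idea is that without a fixed-function traversal engine, every node visit in this loop is ordinary shader/ALU work, so deeper trees and more recursion consume proportionally more of the compute that could otherwise feed shading.

```python
def slab_hit(lo, hi, origin, inv_dir):
    """Ray vs axis-aligned interval test (1-D slab method)."""
    t0 = (lo - origin) * inv_dir
    t1 = (hi - origin) * inv_dir
    tmin, tmax = min(t0, t1), max(t0, t1)
    return tmax >= max(tmin, 0.0)

def traverse(nodes, origin, direction):
    """Stack-based BVH traversal; each node visit is generic ALU work."""
    inv_dir = 1.0 / direction          # direction assumed nonzero
    stack, hits, visits = [0], [], 0   # start at the root (index 0)
    while stack:
        node = nodes[stack.pop()]
        visits += 1
        if not slab_hit(node["lo"], node["hi"], origin, inv_dir):
            continue                   # ray misses this node's bounds
        if "prim" in node:
            hits.append(node["prim"])  # leaf: record the primitive
        else:
            stack.extend(node["kids"]) # inner node: descend into children
    return hits, visits

# Toy 1-D tree: root covers [0, 8], two inner nodes, four leaves.
nodes = [
    {"lo": 0, "hi": 8, "kids": [1, 2]},
    {"lo": 0, "hi": 4, "kids": [3, 4]},
    {"lo": 4, "hi": 8, "kids": [5, 6]},
    {"lo": 0, "hi": 2, "prim": "A"},
    {"lo": 2, "hi": 4, "prim": "B"},
    {"lo": 4, "hi": 6, "prim": "C"},
    {"lo": 6, "hi": 8, "prim": "D"},
]
hits, visits = traverse(nodes, origin=-1.0, direction=1.0)
```

A hardware traversal engine effectively hides this entire loop behind a single ray query, which is why the vendor gap widens as average traversal depth and recursion grow.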
Aggravating-Dot132@reddit
It was an example. And you just proved my point. PT is designed around what Nvidia has. So like it or not, it will work better on their cards.
It's not that AMD should not do it the same way; it's just that PT is like that.
That said, it's only in a few games, and it's not that great in terms of performance/gains. RT-only stuff will be added in more games though, and AMD will be doing fine there going forward.
SherbertExisting3509@reddit
What about Intel's RT implementation?
Intel implements RT by using dedicated fixed-function hardware for traversing the BVH, and dedicated registers are used to store which nodes have already been traversed.
This allows the RTAs to skip nodes that have already been traversed and thereby restart traversal from said nodes. Intel calls this "short stack traversal".
Battlemage can process 2 triangle nodes and 18 box tests per cycle, but this wasn't a limiting factor with Alchemist (1 triangle + 12 box tests). According to clamchowder, it was likely implemented to do more parallel work.
Battlemage's RT implementation is more advanced than RDNA4 and is likely closer to Ada Lovelace.
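The "short stack" idea above can be sketched as a toy. This is illustrative Python only, not Intel's actual design: the tree is invented, intersection tests are skipped to focus on the stack mechanics, and for brevity the dropped children are remembered in a plain list, where real hardware would instead restart from the root and use its "already traversed" registers to skip finished subtrees.

```python
# Toy sketch: BVH traversal with a tiny fixed-size stack. When a push would
# overflow the short stack, the child is set aside and traversal later
# "restarts" from it, instead of keeping an unbounded stack in memory.

def traverse_short_stack(nodes, cap=2):
    hits, restarts = [], 0
    dropped = []            # stand-in for restart bookkeeping (see above)
    stack = [0]             # start at the root
    while stack or dropped:
        if not stack:       # short stack ran dry: restart from a dropped node
            restarts += 1
            stack = [dropped.pop()]
        node = nodes[stack.pop()]
        if "prim" in node:
            hits.append(node["prim"])      # leaf: shade this primitive
        else:
            for kid in node["kids"]:
                if len(stack) < cap:
                    stack.append(kid)      # fits in the short stack
                else:
                    dropped.append(kid)    # remember for a later restart
    return hits, restarts

# Tiny 3-level binary BVH: one root, two inner nodes, four leaves.
nodes = [
    {"kids": [1, 2]},
    {"kids": [3, 4]},
    {"kids": [5, 6]},
    {"prim": "A"}, {"prim": "B"}, {"prim": "C"}, {"prim": "D"},
]
hits, restarts = traverse_short_stack(nodes)
```

Every leaf still gets visited exactly once; the price of the small stack is the occasional restart, which the dedicated "already traversed" registers are there to make cheap.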
Aggravating-Dot132@reddit
I guess, when you skip first steps and go straight to the higher league, you get better results.
Intel do need to work on their GPUs more. For the sake of all of us =\
bctoy@reddit
It's unfortunate that you were downvoted, because in practice it doesn't mean much, and your initial comment about optimization for the green card is far closer to reality than the theoretical numbers others came up with.
The path tracing updates to Portal and Cyberpunk have quite poor numbers on AMD and also on intel. Arc770 goes from being ~50% faster than 2060 to 2060 being 25% faster when you change from RT Ultra to Overdrive. This despite the intel cards' RT hardware which is said to be much better than AMD if not at nvidia's level.
https://www.techpowerup.com/review/cyberpunk-2077-phantom-liberty-benchmark-test-performance-analysis/6.html
The later path tracing updates to classic games of Serious Sam and Doom had the 6900XT close to 3070 performance. Earlier this year, I benched 6800XT vs 4090 in the old PT updated games and heavy RT games like updated Witcher3 and Cyberpunk, and 4090 was close to 3.5x of 6800XT. 7900XTX should be half of 4090's performance then in PT like in RT heavy games.
https://www.pcgameshardware.de/Serious-Sam-The-First-Encounter-Spiel-32399/Specials/SeSam-Ray-Traced-Benchmark-Test-1396778/2/#a1
_I_AM_A_STRANGE_LOOP@reddit
You have more domain knowledge here than I do - but I’ve been generally quite impressed by intel’s forward thinking design!! They came out of the gate with such a modern run of dGPUs in some really important ways, imo. But I can only speak on the big picture myself. Thanks for sharing, interesting stuff!
SomniumOv@reddit
Yes that is the clear advantage of being able to design a GPU around the current state of the art and not on building upon an existing design like their competitors have to do.
But the double edged sword is that is also the reason why Backwards Compatibility has been their weak point.
If they can improve on performance per mm (to claw back margin), they could be a greater competitor to Nvidia than ATI/AMD has been for a long time.
VastTension6022@reddit
Path tracing was designed decades before Nvidia implemented it in hardware. Why is it so hard for you to accept that nvidia simply has a more advanced solution? AMD's RT was sub-turing level until this generation; it would be shocking if they weren't still behind.
_I_AM_A_STRANGE_LOOP@reddit
You have it backwards - nvidia cards are designed around accelerating what makes PT costly; PT is not designed "around what Nvidia has". Ada-onwards cards have these design elements to accelerate common and costly aspects of PT like recursion and incoherence. These pain points are downstream of issues we've been aware of and preparing for since before the current millennium, where PT is, and has been, the ground-truth method of assessing 3D rendering. If nvidia didn't want AMD cards to run these sponsored titles, they could just bake a hardware limiter in as they do with DLSS(FG).
There is no current PT workload you can design that won't bog down AMD cards on deep recursion, and they cannot handle incoherence efficiently. These are what cause the massive PT performance drops while lighter RT still works fine.
based_and_upvoted@reddit
Do you have a source on AMD cards lacking a BVH traversal engine? Specifically RDNA 4.
From what I have read AMD made a lot of improvement with BVH traversal specifically, for example adding dedicated instructions to handle ray intersections and stack management, they also made RDNA 4 able to handle wider trees as an effort to avoid deep child nodes, and since RDNA 3 they have hardware accelerated traversal.
Though I haven't found anything specific of "AMD lacks XYZ hardware acceleration techniques that NVIDIA supports" and I'm not knowledgeable enough to compare both companies approaches to ray tracing
_I_AM_A_STRANGE_LOOP@reddit
aaaand here's a big TL;DR: https://i.imgur.com/y4hIFVc.png
https://www.amd.com/content/dam/amd/en/documents/radeon-tech-docs/instruction-set-architectures/rdna4-instruction-set-architecture.pdf 'PDF page' 140, 'document page' 130
_I_AM_A_STRANGE_LOOP@reddit
Sure although it's, to a degree, a process of elimination: note that such a full hardware traversal engine comparable to NV is not mentioned in these sources at all. The improvements to traversal come down to essentially much more iterative and conservative changes to the RT stack than a full hardware unit would bring to the table. Note that the compute engine diagram does not indicate the addition of a new hardware block. The C&C takeaway re: RDNA2-4 RT is of a "conservative raytracing strategy where a shader program controls the raytracing process from ray generation to result shading". In the bigger picture, the newer traversal instructions etc. are a way of applying that compute more effectively - but it's still mostly leveraging compute resources that could be fed by other jobs, rather than lighting up a bit of fixed function silicon!
https://chipsandcheese.com/p/rdna-4s-raytracing-improvements
https://hothardware.com/Image/Resize/?width=1170&height=1170&imageFile=/contentimages/Article/3508/content/big_rdna-4-compute-engine.jpg
https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60d1b658-234b-4a4f-8a1b-6bb7106bcc92_652x343.webp
Compare: https://www.hwcooling.net/wp-content/uploads/2025/02/RDNA-4-Architecture-Press-Deck_008.png and https://images.nvidia.com/aem-dam/Solutions/geforce/blackwell/nvidia-rtx-blackwell-gpu-architecture.pdf, page 11. Dedicated hardware blocks are generally fully delineated as such when they are analogous, especially with a competitor in play. I hope this is helpful! Just dive deep on RDNA4 and see how things work (without a full traversal engine). Oh, and let me know if I got anything wrong here please!!!! Happy trails~! Oh, and a last probably-good-to-mention note: BVH traversal is MUCH more involved than BVH stack management, so definitely don't conflate those two hardware features!! The former has full hardware flow control to autonomously traverse a tree, while the latter manages the traversal of such a tree via compute shader logic. Hope that makes sense!
SherbertExisting3509@reddit
AMD traverses the BVH on the shader engine and stores the results in Local Data Share
This has been the case since RDNA2
These improvements you cite are great and show signs that AMD is taking RT more seriously, but the BVH8 is still ultimately being traversed on the shader engine.
Fixed function hardware for BVH traversal would likely be implemented in UDNA, which seems like a big uarch redesign.
OverlyOptimisticNerd@reddit
The crux of your post is that, "AMD hardware is inferior in this one respect, and if you design your game to not use Nvidia's additional specialized hardware, then AMD's card does fine."
The fact that you tried to spin that as a win for AMD is absolutely bonkers.
Aggravating-Dot132@reddit
I didn't try to spin it as a win for AMD, lol. What level of mental gymnastics did you use here?
Or were you referring to TressFX? That one is a win, yes.
SherbertExisting3509@reddit
Intel's Battlemage architecture completely disproves your point.
RDNA4 is worse at RT than Blackwell because AMD didn't take RT and AI upscaling seriously until they started developing RDNA4.
Despite this late start, AMD has stepped up with FSR4, which is more advanced than XeSS and Redstone.
But RDNA4 is only meant to iterate on RDNA3. Designing a fixed-function BVH traversal pipeline is likely being saved for UDNA1, which is a major uarch redesign.
lolatwargaming@reddit
Bruh, just stop. You look desperate
nmkd@reddit
Not really.
It's just that AMD lacks the tech (both HW & SW) for fast/efficient PT because they are lagging behind when it comes to R&D. Which is understandable considering their presumably massively different budgets.
Strazdas1@reddit
its more that PT is accelerated on Nvidia cards and AMD has to do it the old fashioned way. The "optimization" is just inherently how PT works in current PT methods.
Aggravating-Dot132@reddit
We won't see anything improved for 2 gens at minimum. And, considering that Nvidia is on record profits right now (gaming), players will lean towards $800 consoles instead.
Tbh, Nvidia is the reason for everything that is wrong right now with the tech.
Strazdas1@reddit
I sure hope it will be improved soon, right now Nvidia is the only one even trying to improve things. Hopefully with new consoles not being so anemic the rest of the market starts catching up.
Aggravating-Dot132@reddit
Not true about improving (they are not the only one), but the main one. However, they are also the greed incarnate, so...
Strazdas1@reddit
They are a company out to make a profit. So is AMD. So is Intel. So is TSMC. So is SMIC. Stop thinking companies are your friends.
lolatwargaming@reddit
lmao people being so generous with AMD, while that same argument can be made for a 5060, as in you just need to configure it right
ResponsibleJudge3172@reddit
You can't be missing features like SER and say that devs are not accommodating you when you have less performance
_I_AM_A_STRANGE_LOOP@reddit
Yeah I didn't mention it in my other comment re: bounces to keep things simpler, but in highly incoherent workloads (like PT) SER is a massive accelerant. the 9070xt honestly does PT a bit better than I would've guessed given the lack of bvh traversal/SER! But it's not accurate to say that AMD is in any way gimped here beyond what the hardware genuinely lacks. I'd also say a lack of a dedicated hardware denoise path is also a pretty big difference, but that's getting even further into the weeds!
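The incoherence point is easy to picture with a toy scheduling model. The sketch below is not SER itself (SER is a hardware/driver mechanism on Ada and later); it only illustrates the underlying idea, with invented names and data: if rays are regrouped by what they hit before shading, each SIMD-wide batch executes one material's code path instead of diverging across several.

```python
from collections import defaultdict

def divergence_cost(rays, warp=4):
    """Count, per batch of `warp` rays, how many distinct material code
    paths the batch must execute - a crude proxy for SIMD divergence."""
    return sum(
        len({mat for _, mat in rays[i:i + warp]})
        for i in range(0, len(rays), warp)
    )

def reorder_by_hit(rays):
    """SER-like step: bucket rays by the material they hit, so refilled
    batches each run (mostly) a single code path."""
    buckets = defaultdict(list)
    for ray in rays:
        buckets[ray[1]].append(ray)
    return [ray for mat in sorted(buckets) for ray in buckets[mat]]

# 8 rays alternating between two hit materials: worst case for 4-wide batches.
rays = [(i, "glass" if i % 2 else "wall") for i in range(8)]
```

Unsorted, every 4-wide batch here pays for both code paths (cost 4 across the 8 rays); reordered, each batch runs a single path (cost 2). Real PT has far more shading paths per pixel, which is why hardware reordering matters so much there.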
Jeep-Eep@reddit
Yeah, but PT isn't really gonna be relevant for much of games before maybe second or 3rd gen UDNA, so not that much of a stickler, and we'll see how much the gap closes when Redstone lands.
Vb_33@reddit
It's more relevant than ultra settings ever were. At least PT makes your games reach for Pixar-level lighting, which is the most important part of modern graphics in terms of visual returns. Ultra generally looks indistinguishable from high settings while running worse.
Gambler_720@reddit
Except that's not the full picture by any means. FSR 4 support remains pitiful and will continue to be a limitation for the 9070 XT. He literally talks about it in the beginning of the video.
The upscaling advantage is still huge and will only get nullified from next generation onward.
RTukka@reddit
It's not really something that can be marketed, and it's not a perfect solution, but Optiscaler makes FSR 4 a much bigger deal for people who are aware of it and are willing to perform a few extra steps.
So I'd say the 50-series still has DLSS as a selling point over the Radeon 90- series, but if I were buying for myself, I'd pretty much rate it as a tiebreaker, and not consider it a huge advantage.
Strazdas1@reddit
Its not a perfect solution because its buggy as fuck. Im happy it exists as an option, but its certainly not at the level of "working" that would have manufacturers promote it officially.
Framed-Photo@reddit
For something that's supposedly buggy as fuck, it sure as hell works really well in a lot of the games I try it in.
Strazdas1@reddit
Then you dont try it in a lot of games. Thats fine. Im glad it works for you. It does not work fine for everyone.
Framed-Photo@reddit
It does not work fine for everyone because they don't know how to use it.
Optiscaler doesn't have universal compatibility, but do you know how many people I've seen try to complain about optiscaler in specific titles and the issue was that they didn't try every setting, or they didn't install it in the right directory? Hell I would have agreed with you a few months ago when I first tried it out, shit didn't work immediately in a bunch of the games I tried. Turns out it was user error.
The compatibility wiki alone has like 50+ entries, and that's not a comprehensive list. What games did you try that it didn't work in?
Strazdas1@reddit
The fact that you need a compatibility wiki is in itself evidence that it does not work fine.
Framed-Photo@reddit
It's third party, open source software that hooks into games to take over the upscaling pipeline.
The fact that it has a compatibility wiki at all is a great thing, it makes it far easier to know how to get specific games working.
Working fine doesn't mean it's flawless, as I've said I've had issues too. But once I learned how to use it properly it worked very well.
If that's not something you want to put up with, then sure I agree that Nvidia would make more sense to go with.
Strazdas1@reddit
That's fine. The discussion here was why AMD isn't promoting it officially as part of the drivers. And the answer is that it does not work well enough for such mass application to the average user.
You said it yourself: you made user errors until you learned how to use it properly. Surely you see how that would make most users just complain about it without any problem solving?
Framed-Photo@reddit
Stop moving the goal posts.
You replied to someone saying it was buggy as fuck, I disagreed. Marketability was not part of your initial comment or any subsequent ones, and it wasn't part of my disagreement.
Do you want to provide any evidence of optiscaler being this horrible mess you make it out to be or are we done talking, because this is getting old. I've referred you to the compatibility wiki, even if those were the only games it worked with the tool would be pretty good.
Any counters?
Strazdas1@reddit
The goal posts always remained the same. It was too buggy for AMD to promote it officially.
Framed-Photo@reddit
Please read back through our discussion and identify where, at any point, my disagreement with you was on marketing.
Jeep-Eep@reddit
Yeah, and only a tiebreaker once the overall blackwell software stack is housebroken, which has not yet been achieved.
AwesomeFrisbee@reddit
The support will grow now that it makes a big difference and the quality is good enough that it warrants the work to put in. It also will be much more interesting for newer games, since it's just an easy fix to turn a game that barely runs into something that runs normally on a decent number of chips. The 9070 sells really well, and this will actually have more influence than it did in the past 3 years, since AMD kinda got obliterated by Nvidia.
As long as AMD keeps the pricing advantage, its still a better card to get imo. And devs will see that, no doubt. It also helps that more of their own staff will be buying these cards, making it easier to implement too, since normally post-launch you wouldn't just buy new cards to test on.
constantlymat@reddit
I remember when people predicted FSR 3.1 support would improve for FSR 2.0 games.
Which never happened.
ColdAngle1151@reddit
Re-sale value is a lot better on Nvidia cards where I am.
That is something to take into consideration. AMD GPUs depreciate noticeably faster here.
emanueladilio@reddit
Nowadays they don't.
ColdAngle1151@reddit
Where I live there is a big difference. Maybe its not the same where you are, but here it is.
Vb_33@reddit
Nvidia's army of features overwhelms AMD. AMD now has a DLSS SR equivalent but they're missing the rest of the pack (not to mention Nvidia has a better performing SR solution with DLSS CNN). Even their just recently announced FSR Redstone initiative doesn't catch them up.
The 9070XT is a great competitor as long as you ignore features, productivity, local AI and VR.
batter159@reddit
Which VR feature does nvidia has that amd doesn't?
Vb_33@reddit
It's not a feature, it's driver quality in this case. AMD VR drivers have always been worse than Nvidia's, but for years now it feels like AMD has been flat-out neglecting them. It's a similar story to productivity app performance (although the 90-series has improved a good amount here).
bctoy@reddit
Adaptive Sync on nvidia is a mess with 50xx series and is often worse than AMD.
I've had both nvidia/AMD cards over the past few years and nvidia always had a worse experience on same Freesync/GSync compatible displays. Even intel igpu can do better with the display output routed through it.
lolatwargaming@reddit
You forgot the part about the lack of FSR4 support and the sub-par IQ across the board, be it xess or FSR4 in the few games with support vs DLSS4
Also, that 9070 XT was drawing like 120w more than the 5070 Ti, almost 50% more power
W_ender@reddit
Why the hell people like to spread lies, both cards have almost identical watt consumption
bestanonever@reddit
Thanks!
It would seem they are pretty evenly matched, according to the benchmarks from the videos (I skimmed through it, mostly straight to the charts). Of course, Nvidia still has DLSS exclusivity and while FSR4 is very nice, it's still not widely available (well, it's also two months old and only present on a single gen GPU lineup, so give it time).
Given that in my region, the price difference can be substantial, that'd really favor the AMD card. We'll see how things progress and if Nvidia releases Super variants later on.
StrawHatFen@reddit
I’m still not fully sold on ray tracing. Yes it looks good but it’s way too taxing.
Wildhamsi@reddit
In my country the MSRP of the 5070 Ti is a whopping $1280 while the MSRP of the 9070 XT is $900! There is a 380 DOLLAR DIFFERENCE FOR 10% PERFORMANCE?!
Zephyrot@reddit
One thing I would add is that the 5070 Ti can be much more easily undervolted/overclocked, so anyone with minimal effort can net another ~10% performance. More with better cooling and some fine-tuning.
conquer69@reddit
The overclocking of the 9070 xt seemed pretty easy too. Both cards get an extra 10% but the 5070 ti seems to have better efficiency.
Klappmesser@reddit
The 5070ti saved me from having to buy a new PSU. I run it with a 5800x3d on 600w without problems. For 9070xt I would've needed to spend another 100+.
lukeetc3@reddit
I ran my 9070xt with a 5800x3D on my 650w just fine for like a month before upgrading just in case
EVRoadie@reddit
It'll be interesting to see how the 9070xt performs once they get their version of ray reconstruction implemented in FSR4. I think that's the only thing holding AMD back in PT.
sOrSuky@reddit
Does Ray Reconstruction improve path tracing performance ?
EVRoadie@reddit
https://www.techpowerup.com/review/nvidia-dlss-35-ray-reconstruction/5.html
Better visuals, less noisy, ~4-6 fps. Really curious to see how AMD stacks up with their tech and if it's a bigger jump in performance or about the same.
sOrSuky@reddit
I appreciate the link, friend.
TDYDave2@reddit
One unspoken advantage for the 9070XT is that AMD is apparently better supported under SteamOS, should anyone want to abandon Windows for a future build.
EJ19876@reddit
If Linux gaming were to ever matter and there were financial benefits to be had by developing proper drivers for it, Nvidia would have more than just the interns working on the Geforce drivers for Linux.
Jeep-Eep@reddit
Yeah, well, Gabe is doing his level best to create that scenario, and frankly, Valve is roughly as load-bearing to PC gaming as MS these days. I dare say he's in a position to at least try to create that change. He did it once with Steam.
aminorityofone@reddit
Steam was horrid at launch and for a few years too. SteamOS has been surprisingly really good at launch and continues to get better. I am excited to see what they do.
Jeep-Eep@reddit
Yeah, if they get easy to use out of box productivity options onto Steam OS, they might get a surprisingly large chunk to migrate off MS fast, especially if it becomes the price hack for new PC builders.
996forever@reddit
It doesn’t. Desktop Linux only exists on reddit and other forums.
taicy5623@reddit
You know what else only exists on reddit and other forums?
Endless bitching about Windows 10 & 11 and Microsoft abusing their customers and violating their privacy. Doesn't mean we all don't know its happening.
The difference is that Linux getting better and becoming good is the only thing that will ever make microsoft think twice about shitting up Windows any more than they already have.
People said the same thing, over and over again, about Blender, and its improving in leaps and bounds, along with private companies fucking over independent creators, has led to open source software taking hold, slowly but surely.
996forever@reddit
There is nothing that ever will. The commercial, government, education, and any other enterprise sector alone will make sure of that (outside of HPC and datacentre that never were on windows to begin with). If any customer is leaving windows it will be towards mac and not linux, unless you begin to argue android is linux, but that's not the desktop we're talking about.
There is no such thing as a mainstream desktop linux user.
taicy5623@reddit
I'm well aware that Microsoft's primary customer is enterprise. The only reason why Windows 11 creating Ewaste and forcing upgrades isn't causing mass corporate backlash is because your average C-Suite is on the same drugs as MS's.
But the idea that Apple is going to be attracting enterprise, at those prices, is the delusions of somebody huffing gasoline. Meanwhile, open source software overtaking closed standards is something that happens all the time.
You can say its hard, or that your average user isn't going to be flashing Ubuntu on their machine, and you wouldn't be wrong. But portraying this copacetic smugness while bending over for Microsoft is ridiculous.
996forever@reddit
And yes, the “endless bitching about windows 10 and 11” absolutely IS an internet echo chamber thing. I don’t know why you brought it up as if it were a gotcha
996forever@reddit
Apple absolutely has a segment of enterprise and education sales. Even back when I was in elementary school we had a computer lab full of iMacs. And that was Core 2 era macs that were far worse in terms of price to spec compared to the ARM macs. Modern MacBooks compete with similarly priced Dell Precisions, HP Zbooks, and Thinkpads (just not the bottom of the barrel E series or L series that you might see commonly handed out). I don’t know why you’re acting like “apple prices” make it prohibitive when they have the hard data to back up their sales. Their being the leading NPU-equipped PC vendor despite AMD talking about how strong the demand is for Ryzen AI 300 series should be very telling (lunar lake might be too new). What you and I might feel about Microsoft or Linux or Apple or anything else is irrelevant with the real life general public trends that are backed up by data.
996forever@reddit
[This](https://www.reddit.com/r/hardware/comments/pczuje/update_windows_11_officially_wont_support_amd_zen/hamxq7a/) is my favourite comment about "desktop linux"
aminorityofone@reddit
eh... Linux at that time was also really bad for even a somewhat techie person (and I certainly was not one). Trying to get wireless drivers working in Linux back then was a nightmare. That was when I cut my teeth on Linux, and I switched back to Vista because no amount of googling could tell me why the wireless drivers on my Dell laptop wouldn't work. If an average person can't use an extremely basic function without going into the command line on a mass-produced laptop from one of the world's largest OEMs, then the OS isn't ready.
taicy5623@reddit
Drivers and open standards were nowhere near what they are now back when Vista came and flopped.
Vb_33@reddit
Hey Linux market share grew to 2% in the latest steam hardware survey.
2%!!!
BaconatedGrapefruit@reddit
Complete aside - I always have a laugh when people seriously quote the steam hardware surveys as evidence of Linux hardware growth.
Valve has their entire hand on the scale in favour of Linux and the best they can manage is 2%.
(Note for the Linux evangelists. I am not saying Linux is bad or SteamOS isn’t seeing some impressive adoption in the gaming space. I am saying that the steam hardware survey is in no way a representative sample of OS Market share. )
996forever@reddit
I wonder how much of Linux's "growth" in the past two years on steam is steam deck.
BaconatedGrapefruit@reddit
Honestly? Probably most of it.
996forever@reddit
On a side note, it's pretty impressive the Steam install base of Intel processors among Windows users is down to 60%. This figure includes not only DIY builds but also gaming laptops and prebuilts, which are the majority of the market. I imagine for new systems the share of Intel-based machines might be just over 50%, which would have been unthinkable a decade ago. Zero hope of Nvidia being challenged like that.
LooperNor@reddit
AMD is way better supported for anyone using any kind of Linux distribution. Granted, it's not a concern for most people, but for those who do use Linux it should make the choice between these two graphics cards much easier.
Jeep-Eep@reddit
I mean, Windows is getting a worse deal with every quarter, folks should be paying more heed to Linux perf.
LooperNor@reddit
Definitely agree.
If I didn't play Microsoft Flight Simulator a lot (with add-ons that are extremely difficult/impossible to get working on Linux), I would jump ship real quick.
Jeep-Eep@reddit
And that is why my build is optimized to dual boot.
taicy5623@reddit
Yup, DX12 perf has me dual-booting my Nvidia-powered rig. Setting up an alias so I can `sudo grub-reboot $windowsgrubentry && sudo reboot` makes it about as convenient as I can possibly get it.
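A minimal sketch of what such an alias can look like, assuming GRUB is the bootloader, `GRUB_DEFAULT=saved` is set in `/etc/default/grub` (required for `grub-reboot` to take effect), and the Windows menu entry is literally named "Windows Boot Manager" (the name varies; check your own `grub.cfg`):

```shell
# ~/.bashrc sketch: reboot into Windows once, then fall back to Linux on the
# following boot. grub-reboot only changes the default for the NEXT boot, and
# only works when /etc/default/grub contains GRUB_DEFAULT=saved (then run
# update-grub). The entry name below is an assumption; list yours with:
#   grep "^menuentry" /boot/grub/grub.cfg
alias towindows='sudo grub-reboot "Windows Boot Manager" && sudo reboot'
```

Because `grub-reboot` is one-shot, rebooting out of Windows lands you back in Linux without touching BIOS boot order.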
SEI_JAKU@reddit
It's frustrating, MFS itself works great from everything I've seen, but it's the add-ons that get you. Why is it always some specific political thing, time and time again?
Krendrian@reddit
AMD's gaming performance on Linux is at parity with their Windows performance.
For Nvidia it's roughly 15-20% lower than their Windows performance.
https://i.imgur.com/vCTwaXu.png
https://i.imgur.com/Q3Nj0eA.png
DistantRavioli@reddit
I believe this is mainly because DX12 titles are 20-30% slower. Non-DX12 titles should be at parity, I believe, but I can't double check right now. Maybe someday Nvidia will fix this, but don't expect it to come soon (please let this statement age like milk).
Krendrian@reddit
So it's only with dx12? Kinda weird considering the api calls are translated to vulkan anyway.
taicy5623@reddit
It's something to do with the codepath the Nvidia driver takes talking to vkd3d-proton.
Incredibly frustrating. Nvidia does have issues tracking it, but no idea if they're doing anything about it.
Basically, if you hate Microsoft, you gotta learn to hate Nvidia too.
kuddlesworth9419@reddit
Only problem I have with AMD on Linux is the lack of a GUI for controlling stuff like colour gamut. And HDR on Linux isn't where it needs to be yet. KDE Plasma is getting there, but we really need better support for wide colour gamut.
DistantRavioli@reddit
Unless you wanna connect to a 4k120hz TV over HDMI, then we can thank the HDMI forum for artificially gimping AMD in this regard.
Jeep-Eep@reddit
And tbh, that means more to me than PT or DLSS. I'm currently running Win11, and even deloused it is annoyance-rich, and given how unserious the MS leadership has gotten, being free of their software ecosystem has become... pressing.
lolatwargaming@reddit
As someone with several recent xx90 Nvidia cards, literally the only reason I'm looking at a 9070XT is to build a steam "deck"
Wander715@reddit
No reason to get a 9070 XT in most markets right now until pricing improves tbh. 5070 Ti is either same price or like $50 more in which case it's a no brainer.
-Manosko-@reddit
I lucked out and got a 9070 XT Red Devil at a freak discount in Denmark/EU. Paid the same price as what the most basic 16GB 5060 Ti costs here (560 USD with taxes/VAT and inflated by the tumble the Dollar has taken since January).
I doubt I’d see such sales on Nvidia here.
I’ve used Nvidia my whole life since my first Riva TNT way back when, with only consoles, Macs and my Steam Deck running anything else. Going to be fun trying Radeon and building a Bazzite machine for the kids and my gf to use my Steam library in the living room.
Will probably stick to Nvidia in my main desktop for the time being, though, unless this 9070 XT blows my mind.
eeeponthemove@reddit
Oh my!
Congrats on the great price! From where did you purchase it?
Vb_33@reddit
Damn $560 for a 5060ti is brutal. I've seen SKUs going for $600 but who buys that.
JustusJJonas@reddit
In Europe it's more like +100€ currently -.-
EscapeParticular8743@reddit
In Germany, it's 739€ for the cheapest 9070xt and 799€ for the cheapest 5070ti. No brainer to go for the 5070ti here
eeeponthemove@reddit
In Sweden:
cheapest 9070 XT is 769,06€
Cheapest 5070 Ti is 915,57€
Difference of 146,51€
xNailBunny@reddit
One caveat: the coolers on all the 799€ models are trash and anything decent comes with a 110€+ premium
DrNopeMD@reddit
I mean I have one of the hotter running 5070 Ti models (the MSI Ventus OC 3X) and even under load it has only ever hit mid 70C max.
So it might run a bit hotter but it's still well within safe temps and isn't thermal throttling.
xNailBunny@reddit
It's about the noise, not temps
EscapeParticular8743@reddit
That's true for the MSI Shadow, but not for the Gainward afaik
xNailBunny@reddit
The Gainward Phoenix doesn't look any bigger than the Ventus/Shadow. The Phantom with its 3-slot cooler may be decent (I've not seen any reviews), but it's 875€
Vb_33@reddit
There needs to be a cooler review roundup.
AwesomeFrisbee@reddit
Yeah, this is a big thing some are willing to ignore. There's not much difference in the AMD models performance-wise, but there is a massive difference in the Nvidia models.
Jeep-Eep@reddit
Yeah, 60ish euros for hot, noisy and the risk of technical adventure does not a deal make for me.
shugthedug3@reddit
Trash in what way?
tehKost@reddit
$800 9070xt vs $1000 5070ti
KARMAAACS@reddit
In Australia you can buy a 5070 Ti for $1359 AUD. The 9070 XT at its cheapest is $1249 AUD. That's like a $65 USD difference, and to be honest, for DLSS and better RT the price difference is worth it.
Zestyclose_Plum_8096@reddit
this is a complete lie
more like 1300-1350(9070xt) vs 1600-1650(5070ti)
https://www.staticice.com.au/cgi-bin/search.cgi?q=5070ti&spos=3
https://www.staticice.com.au/cgi-bin/search.cgi?q=9070xt&spos=3
Vb_33@reddit
Woah what is that website? Seems straight out of 1998.
KARMAAACS@reddit
It is not.
5070 Ti: https://www.centrecom.com.au/palit-gamingpro-v1-geforce-rtx-5070-ti-graphics-card
Use promo code.
9070 XT: https://www.scorptec.com.au/product/graphics-cards/amd/117063-rx-97tswf3b9
Knew someone would think I'm lying.
MiloIsTheBest@reddit
Ok so, 2 points:
Alright I'll accept that you can use this promo code that expires... tomorrow... to buy this one card at this one retailer. But I don't think that constitutes claiming that 5070Tis can be had for that price if you won't be able to make that claim on Wednesday. BY THE WAY: Yes that's actually a decent price and completely shits on the 9070 XT at that price point.
That actually is really good news even if it's only technically temporarily true. Stock levels of 50 series cards (except the 5090) are obviously outpacing their sales at the moment for any retailer to be offering that kind of a discount (even on an obviously unpopular Palit model). Most of the models are tracking back towards MSRP so hopefully the broader market not being a bunch of whales forces prices to go to an even more reasonable level.
KARMAAACS@reddit
This regularly goes on special at this retailer multiple times a month. See this OzBargain post started on the 23rd of May for the same price.
Also been $1375 AUD on eBay with eBay Plus. So you can find these pretty regularly on sale for around that price.
I've shown above that other retailers have similar prices at times. If you buy smart you absolutely can get that price on a regular basis nowadays.
Yep.
Yep, I dunno what will happen now that 50 series supply has apparently been pulled back for June and July, but considering these are sitting on shelves, it's probably not going to change pricing down under here much.
Absolutely should considering similar pricing across multiple sites and sellers and multiple deals popping up on a regular basis.
MiloIsTheBest@reddit
Where?
I've literally never seen a 5070 Ti close to that on any of the major sites I obsessively look at. I've seen them at $1499AUD which is $10 (TEN WHOLE DOLLARS) below the AU RRP, StaticIce shows a listing at somewhere called SkyComp for $1473. These prices are RARE though and they're all usually between $1510 and $2000.
Everything below that is a 5070.
KARMAAACS@reddit
Here
5070 Ti: https://www.centrecom.com.au/palit-gamingpro-v1-geforce-rtx-5070-ti-graphics-card
Use promo code for $150 AUD off the 5070 Ti.
9070 XT: https://www.scorptec.com.au/product/graphics-cards/amd/117063-rx-97tswf3b9
Knew someone would think I'm lying. But I'm not.
Normal_Bird3689@reddit
Your post is valid for 15 hours and 36 minutes from now; then you are wrong.
Not sure I would gloat about it?
KARMAAACS@reddit
Who's gloating?
sharkyzarous@reddit
i think he just got excited.
KARMAAACS@reddit
See this comment
Alive_Worth_2032@reddit
Lol, it's not even that much with 20%~ VAT on top of price here in the EU.
popop143@reddit
$830 9070 XT vs $950 5070 TI in the Philippines. There were $790 models last week for the 9070 XT but those were snagged fast.
RedTuesdayMusic@reddit
There are always reasons. I bought the 9070XT Reaper even though it was €25 more than the cheapest 5070Ti, for the simple reason that it's the only true dual-slot card no taller than the PCIe bracket from either manufacturer this generation.
And I'll probably buy another, though I'm waiting to see what Arc does with a potential B770.
crshbndct@reddit
My 9070xt was 1500nzd, a 5070ti was 2000nzd
BeerGogglesFTW@reddit
Probably closer to 100$ from what I've seen in several weeks.
Recently on r/buildapcsales, most of what I've seen is:
9070 XT for $720-730.
5070 TI for $825-830.
It's been months since I've seen a $750 5070 TI, and usually gone within a minute.
However, that's just buildapcsales. If I was trying to buy one, I would subscribe to a discord channel or app to better track the prices. That may be different than what gets to reddit.
I also don't live near a Micro Center which again, could be different than what I see.
AC1colossus@reddit
Big fan of Daniel but not a big fan of lumping path tracing workloads in with the others. Apples-to-apples comparisons are better.
Method__Man@reddit
You have to just watch the video. Everyone is obsessed with just skimming videos and bar charts.
Sit down, get a beverage, and watch it all.
popop143@reddit
Same as for GN videos: last time Steve said something like only 20% of viewers really watch the video and listen to the explanations. Most viewers just skip to the charts.
ResponsibleJudge3172@reddit
Because how many care about the mechanics of schlieren (no idea how to spell it) imaging?
42LSx@reddit
Yeah, because that's what many people are usually interested in. Since nowadays this stuff is extremely rarely available in text form, and YT is the go-to page for too many people, people are forced to skim through a vid to get the juicy excel tables.
Method__Man@reddit
And that's the problem: there's no context when it comes to our charts. There's no pausing to show things like textures, there's no actual gameplay footage.
Our charts are not meant to be used as a sole source of information ... in fact the opposite
A bar chart is meant to be used as an auxiliary data graph that should be explained in detail over a long discussion. This is how it works in academic papers and in academic discussion. You don't just slap up a bar chart and send it out to the world. That's not how it works.
VibeHistorian@reddit
Adding an explainer over a 25 game comparison chart is perfectly valid - you can point out outliers, things that didn't work, unusual 1% lows, games where one or another card tends to do better, etc.
The issue is when half of a video is just slowly going through individual comparisons, with one or two obvious visual data points stretched out into 5 sentences, adding (nearly) no new information.
AC1colossus@reddit
I agree, but the TLDRs of the video have been discussing averages with mixed workloads. When the nuance gets buried, it's not good communication. Do those folks have to watch the video?
xole@reddit
Yeah, I'd rather be able to rank raster vs ray tracing vs path tracing myself. I still don't own a single game that supports path tracing, and only 1 game with ray tracing support, and I don't play it. So I wouldn't give ray tracing or path tracing as much weight as some other people would, though as time goes on, it becomes more equal.
liquidCarbon@reddit
What would be the best choice between a $630 9070xt and a $740 5070ti?
Knjaz136@reddit
Looking at those Oblivion results specifically, it seems 5070 Ti consumes significantly less VRAM in same scenarios.
Might be something to consider if you are playing heavily modded games. Like 10.1 vs 7.6gb difference.
LittleJ0704@reddit
I bought a rx 9070xt for 720 usd. Undervolt -60 and Power limit -16. Now it consumes 270 watts max and I lose about 2 fps.
Vapor chamber cooling and seven heat pipes. 54 degrees on GPU and 19-20 degrees more on hotspot. Memory temperature 76-80 degrees.
And it's a gigabyte rx 9070xt gaming oc and no putty leaks.
I have no problem with FSR4; of course there is still some tweaking to do, but it's at that level.
Ray tracing and path tracing? The latter is weaker, but really it's something that doesn't matter much. Best example is The Last of Us 1-2. No ray or path tracing and still a beautiful game! In 4K with ultra graphics settings, 80-90 fps (without FSR4).
850-1000 usd for the 5070ti. And even reaches 1200 somewhere.
It was clear which one to buy.
MiddleFoundation2865@reddit
The year is 2025, and you buy a graphics card that costs more than a month's pay in many countries.
You need to reduce resolution to play games.
Dentingtea@reddit
Or, hear me out, people have disposable income and can spend their money however the fuck they want?
MiddleFoundation2865@reddit
You need to reduce resolution.
Reddit 2025, place of the dyslexic.
Dentingtea@reddit
You say that but you bought a 9070xt. What a fucking hypocrite
996forever@reddit
They can, and they can also gobble up real estate in foreign countries purely as investments. It doesn't mean people can't speak out against it.
Dentingtea@reddit
I'm not saying you can't speak out against rising gpu prices, but saying "just lower your resolution" is a stupid fucking take. Am I supposed to wait 5 more years with my 3070 before gpu prices potentially go down? I'm already playing at low settings with some of the newer games.
Own_Nefariousness@reddit
The 9070 XT is a great card with a horrible price. MSRP means nothing; in my country's market it's the same old AMD strategy of trailing just slightly behind Nvidia, which, when you take the features offered into account, simply isn't worth it. Gaming features aside, if you ever need CUDA for anything, the thing is outright dead. As always, AMD snatching defeat from the jaws of victory. I can't believe we've reached a point where our only hope for a sane market is Intel.
RedTuesdayMusic@reddit
Why does the 5070Ti stutter more in Oblivion? I thought it had higher memory bandwidth? Or is this a case of a pre-overclocked Radeon model with that Steel Legend?
AreYouAWiiizard@reddit
Steel Legend is one of the cheapest 9070XTs with the base boost clock. Meanwhile the Asus Prime is a slightly more expensive model with higher boosts so no it's not that.
gatorbater5@reddit
my guess is driver overhead on the cpu. that game is super hard on the cpu.
Normal_Bird3689@reddit
Likely the game; people are reporting issues on the XT also, and the current fix is to disable Resizable BAR.
ReeceT20@reddit
9070xt in the UK is £650, the 5070TI is £720, only 10% price difference, unfortunately it's a no brainer to buy Nvidia here
lifestealsuck@reddit
I saw in some of the games AMD using more VRAM than Nvidia, and in some games they're both the same (mhwild, doom, kc2).
Wonder why.
GloriousCause@reddit
One difference is that AMD uses Resize BAR in every game by default whereas Nvidia whitelists individual games. I'd guess that would likely cause differences. Also, there are just going to be differences in how the two brands handle memory management and compression, etc
buttplugs4life4me@reddit
SAM was specifically marketed to not just be turned on in every game
krilltucky@reddit
which is a problem since you have to go into your bios every time to turn it on or off. who's gonna do that?
lolatwargaming@reddit
In the video it leads to issues where AMD is like 70% behind nvidia tho, like it shits the bed.
lifestealsuck@reddit
Oh yeah, I turned Resizable BAR and HAGS off on my 3070 8GB; they're such VRAM hogs.
With them off I get minimal stuttering and fps drops in some games (e.g. FF16 hideout fps increased by 50% at 1440p High DLSS Q)
Keulapaska@reddit
FF16 doesn't have ReBAR on for Nvidia by default.
Calling it a VRAM hog is a bit weird, as manually enabling ReBAR for Horizon Forbidden West did fix some VRAM-related issues on 8GB cards for people at launch iirc, before ReBAR was enabled automatically by Nvidia for that game.
HAGS on an 8GB card, no idea how that impacts things though.
lifestealsuck@reddit
I just gave it as a one-game example where turning both off reduced fps drops and stuttering on my PC. There are many more, obviously.
I thought it helped because with ReBAR on my VRAM usage was stuck at 7300MB, while with it off it can get to 7500MB, and with HAGS off it can go as high as 7900MB. My fps goes from 40fps + microstuttering in the hideout to 55fps + less stuttering.
Keulapaska@reddit
Oh that's funny, as I think the problem with Horizon FW was the complete opposite: without ReBAR it was stuck at ~7.5GB for people, and with ReBAR on it went up to near max.
Well, I guess that's part of the reason why Nvidia doesn't just blanket-enable it everywhere, as it behaves differently in different engines, like how Horizon Zero Dawn has performance degradation with ReBAR (idk if the remaster has it), but only on Intel CPUs.
Aggravating-Dot132@reddit
For Spider-Man 2 it's a memory leak on AMD's side. It has 14GB vs 15.5GB, yet it spills into RAM for some reason, which means bad memory usage (optimisation). Probably one of the edge cases of a console port with no optimisation (since PS has unified memory).
On average AMD has ~5-10% more usage than Nvidia. So you really have to find these edge cases where it does matter.
conquer69@reddit
Wish he didn't add graphs at the end where performance is unplayable and neither card will run the game like that.
Mech0z@reddit
Hope my 9070 XT Red Devil order is fulfilled, 560$ VAT included in Europe :o
Much cheaper than the launch-day 750$ I paid for an RX 9070 (Not XT)