Daniel Owen - RX 9070 XT vs RTX 5070 Ti- The Ultimate Comparison
Posted by Antonis_32@reddit | hardware | View on Reddit | 169 comments
SomewhatOptimal1@reddit
TLDR: 5070Ti is roughly on par in raster, 20% faster in RT and 50% faster in PT.
In my opinion 9070XT is much better buy over 5070, not so much over 5070Ti (even if 5070Ti is 100-150€ more). Not only due to performance, but also due to much better software stack.
I also reviewed HUB's 9070XT review video, and while the 5070Ti gets 60fps in PT at 1440p DLSS Q in most games, the 9070XT can be up to 3x slower and usually gets an unplayable 17-30 fps in Wukong, Indiana Jones and AW2!
Legal_Lettuce6233@reddit
DLSS being in so many games may no longer matter. Some software I forgot the name of now allows FSR4 to be used in every place DLSS is implemented. Already showing good results in a few games, but some do have issues.
Oottzz@reddit
The tool you are talking about is called OptiScaler.
1aranzant@reddit
https://github.com/optiscaler/OptiScaler
Darksky121@reddit
The 5070Ti being 50% better in PT is a bit pointless when the framerate is low on both cards.
In CP2077 at native 1440P, RT Overdrive mode;
9070XT = 21fps
5070 = 32fps
These cards are not really suitable for PT in most cases.
SomewhatOptimal1@reddit
Sure, at native none are playable, but with DLSS Quality:
5070Ti gets 64 fps DLSS Quality 1440p PT
9070XT gets 44 fps FSR3 Quality 1440p PT
The 5070Ti is perfectly playable and 50% faster in PT in CP2077.
Same with multiple other titles like AW2, Indiana Jones and Wukong, where the 5070Ti is getting to roughly 60fps. The 9070XT is up to 3x slower and on avg 2x slower.
Darksky121@reddit
You cannot accept that aside from a couple of Nvidia sponsored games, the 9070XT is the gpu to get at an msrp of $600. Why pay $150 more for something that's around the same on average?
At FSR4 performance, the 2 or 3 PT games are also playable on the 9070XT. Who really cares if you get 60fps or 90fps when we all know most people would probably use frame gen if they really are going to use PT.
CataclysmZA@reddit
You're not thinking about this logically.
Yes, the 9070 XT gets close to a card that costs so much more, but the extra $150 you'd spend to get a 5070 Ti is generally rewarded with a vastly better experience in some key areas. Those are areas the 9070 XT won't be able to challenge for a long time, perhaps ever.
Don't think of the 9070 XT as a 5070 Ti competitor because it is not - the benchmark results clearly show that. It was aimed at the 5070, and AMD was going to price it at $650 to take advantage of a $100 premium for generally better performance and more VRAM.
If you have the money for a 5070 Ti, you would have to be mentally ill to get the Radeon for less. It is the inferior card.
SporksInjected@reddit
Are you talking about msrp or the real world?
CataclysmZA@reddit
Can be either. In his conclusion in the video, Daniel shows that the value proposition remains about the same even looking at current street pricing.
Even though the RX 9070 XT is better value, the 5070 Ti remains the better GPU.
SomewhatOptimal1@reddit
If it was one or two games you could call them an exception. When it's 4 or 5 games, it's a rule.
5070Ti is getting roughly 60fps in PT in all games, that’s very playable.
Meanwhile the 9070XT is getting well under 30 fps in multiple PT games with FSR already turned on at 1440p, which is unplayable, and FG only works well if you are getting 50-60fps in the first place. It's not the crutch you want it to be.
Your comments make no sense, I think you need to take another look at 9070XT PathTracing results from Hardware Unboxed review video I linked before.
kikimaru024@reddit
5070 Ti is €200-300 dearer, though.
SomewhatOptimal1@reddit
Obviously depending on the region.
Homerlncognito@reddit
We also don't know how the prices are going to develop once the cards are more or less widely available.
SomewhatOptimal1@reddit
You are comparing the 5070, not the 5070Ti; with DLSS Quality the 5070Ti gets to roughly 60 fps avg
Darksky121@reddit
It was a typo dude. The debate is about the Daniel Owen video posted which is clearly 9070XT vs 5070Ti and that is where the 50% figure is from.
SomewhatOptimal1@reddit
You're right, I deleted my response and gave a new one.
Korr4K@reddit
I would also add DLDSR with their DL scaling to the list; no idea why people who don't have a 4K monitor aren't using that feature constantly, or at least why it's not talked about as much as it should be. Guess it's because it's hidden in the Nvidia control panel.
AMD has their VSR but it's still not DL/AI based, so it's very limited, meaning you have to own a card much more powerful than your native resolution demands to use it.
DLDSR is probably the main reason why I still got a 5000 series
upvotesthenrages@reddit
I mean, it's also highly likely that it's due to most people not having insanely monstrous GPUs that can render games at 4K, but then saved money by only getting a 1080p/1440p monitor.
I guess it could work well for older games, but it's a pretty niche feature for a reason.
Korr4K@reddit
Dldsr isn't 4k, 2.25x is between 1440p and 4k. Add DLSS and I think most people could try it
upvotesthenrages@reddit
What is the point of adding DLSS and DLDSR together?
The point of DLSS is that you can't run stuff at native, so you run it at a lower internal resolution and upscale it.
It's the diametrical opposite of DLDSR, right?
Yellow_Bee@reddit
Because it's better than DLAA...
Keulapaska@reddit
It does however cost more performance as well(and maybe slightly more vram?) at the same render res so DLAA vs DLDSR 2.25+dlss quality, though usually worth the extra performance cost.
TrptJim@reddit
DLDSR handles downscaling, while DLSS handles upscaling.
Combining the two, specifically using DLSS Quality and DLDSR 2.25x gets you the best of both worlds. You get a scaled native resolution input and output. It looks fantastic in games like RDR2.
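The resolution math behind combining the two can be sketched like this (a rough sketch: the 1/1.5-per-axis render scale for DLSS Quality is the commonly cited factor, and the function name here is made up):

```python
# Sketch of the DLDSR 2.25x + DLSS Quality resolution chain (assumed factors:
# DLDSR 2.25x multiplies pixel count, i.e. 1.5x per axis; DLSS Quality renders
# at 1/1.5 of the output resolution per axis).
def dldsr_dlss_chain(native_w, native_h, dsr_area=2.25, dlss_axis=1 / 1.5):
    dsr_axis = dsr_area ** 0.5                      # 2.25x area -> 1.5x per axis
    out_w, out_h = native_w * dsr_axis, native_h * dsr_axis
    render_w, render_h = out_w * dlss_axis, out_h * dlss_axis
    return (round(render_w), round(render_h)), (round(out_w), round(out_h))

# On a 1440p monitor the internal render lands back at native 2560x1440,
# while the DLDSR output grid is 3840x2160 before being scaled to the display.
render, output = dldsr_dlss_chain(2560, 1440)
```

So the GPU renders at native cost but the image passes through a 4K-sized intermediate grid, which is where the quality gain comes from.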
TrptJim@reddit
DLDSR is exactly what it says it does, rendering the image at a higher resolution (2.25x native) and using AI to scale down to native.
Korr4K@reddit
Yes, but Nvidia claims that DLDSR 2.25x has the same visual quality as DSR 4x at a much lower computational cost. So while you can do the same with AMD's VSR, DLDSR is much more efficient
panix199@reddit
do these games have no FSR-support at all? Not even through modding?
SomewhatOptimal1@reddit
That’s with FSR already.
panix199@reddit
i am a bit surprised/confused by the benchmark... especially for Indiana Jones. So why does a 4070 Ti have only 6-12 fps while a 4070 Ti Super gets 39-47 fps?
Ill-Investment7707@reddit
The price diff, MSRP or not, makes it a no-brainer. 9070XT all the way
Strazdas1@reddit
9070 xt is more expensive than 5070ti in real life. what now?
rxc13@reddit
I would like to live in your real life where 5070 ti's are under 1000. That would be nice.
Strazdas1@reddit
Come to eastern europe :)
Matei_SAURON@reddit
which country? in RO most of them are at least 1200 euros
Strazdas1@reddit
im in the baltics and theres plenty in the 900-1000 range to choose from.
TopdeckIsSkill@reddit
Where I live 9070xt can be bought for 850€, meanwhile 5070ti is at least 1100€
Strazdas1@reddit
Thats unfortunate for you, but thats not how it is where i live.
Neustrashimyy@reddit
I am upgrading from RTX 2000 series. If I could choose either at msrp, I would pay $150 more for DLSS4 transformer and better RT/PT.
LeMAD@reddit
I'd argue that waiting for the next generation or buying a used card is a no brainer.
Omputin@reddit
Did you watch the video?
on1zukka@reddit
4070ti super was cheaper than 9070xt here, prices outside USA are wild
West_Bandicoot_7532@reddit
for me it's the other way around, the 4070 Ti Super is more expensive than both the 9070XT and 5070Ti :D
JaykopX@reddit
It's all about the price with the RX 9070 (XT) and 5070 (Ti). With the current pricing (Germany), the best deal is the 9070 XT:
5070: 639€
9070: 679€
9070 XT: 777€
5070 Ti: 949€
tartare4562@reddit
I need a PCVR comparison with flight simulators before pulling the trigger, too bad the few that are made take weeks to come up after release:-(
MonoShadow@reddit
VR isn't the focus. Plus a lot of new titles come natively to Quest.
I found this thing tho.
https://hothardware.com/reviews/amd-radeon-rx-9070-xt-and-9070-review-and-benchmarks?page=3
tartare4562@reddit
I know VR is a niche, but that's what I like to do. Native quest stuff is not simulation.
That benchmark was incredibly helpful tho, thank you very much!!
Alternative_Spite_11@reddit
Does Flight Simulator not allow single pass stereo for VR, so the performance is just slightly below equivalent resolution on a normal monitor? That's where, instead of completely rendering a full image for each eye, the engine just renders one slightly larger image and shows about 2/3 of it to each eye. On games that support it, it greatly improves performance. It makes a huge difference in iRacing for sure. I know DCS also had it available at one time, but I don't know if it made it through the various graphics updates.
tartare4562@reddit
I don't understand how that would work. By how you describe it you would lose all 3D stereoscopic effect.
Adventurous_Part_481@reddit
Like saying not to review God of war because you can play it on playstation.
Noble00_@reddit
Just searched this up: DCS VR. They show game settings and the performance overlay
RaptorRobb6ix@reddit
People keep saying that 16GB VRAM is on the low side for these mid/high-end GPUs, so why is the 9070XT using 2 to 3GB more VRAM than the 5070Ti in half of the games or situations here?
Can he pick some games that come close to 16GB usage and then compare both cards?
Johnny_Oro@reddit
Nvidia's driver behavior, like how it translates color data into colors and such, makes it more VRAM efficient than Radeon. I don't know the details, but two very different architectures just can't be directly compared. The draw calls are simply different. The VRAM latencies and bandwidth aren't even the same. Comparing 4070 to 7700 XT, it seems to me 4070 always uses less VRAM, whether it's a deliberate design decision or not.
But that doesn't mean Nvidia's VRAM is adequate. That behavior differs from software to software, and some of them will certainly use more VRAM than the others. In some applications, Nvidia's advantage is less than 0.5GB.
That's why Nvidia is working on their upcoming texture compression technology, which is said to compress over 90% of texture data and decompress it on the fly. Very efficient, but it could also mean their older cards go obsolete faster as game companies choose to compress as many textures as possible out of greed.
grizzly6191@reddit
My 3080 would stutter in games even though it reported it was using less than its entire 10GB of available memory.
BookPlacementProblem@reddit
One guess is that most? some? AAA engines will use more VRAM if it's available to avoid having to reload things. And/or load larger textures.
Strazdas1@reddit
Unless there is hard limit on memory pool size the game will allocate as much as it can to itself because it expects to be the exclusive user of VRAM.
BookPlacementProblem@reddit
That is what I expect, but also I haven't worked with most game engines because most game engines are in-house; hence the technical qualifier.
Sedreen@reddit
Found a 5070 Ti and it's my first build. Could still return it. A 9070 XT wasn't available, nor was anything else. If a 9070 XT becomes available within my return window, would it be better to switch?
It was 880.
Going from a 2060 laptop.
Sedreen@reddit
Just found a 5070 ti available at microcenter for 880. I think going from a 2060 laptop to that in a desktop is a better deal since it was available at or close to its MSRP
ParusiMizuhashi@reddit
I acknowledge the 9070Xt being the better deal but I still went with the 5070ti because I could actually find that one in stock long enough to buy it
DirteeCanuck@reddit
These reviews aren't acknowledging the CUDA cores or backward compatibility.
Those things add value. For people using the card for more than gaming, the AMD option is as useless as a potato.
sammerguy76@reddit
Yeah but anyone that actually needs CUDA for their job can easily afford to buy Nvidia and it becomes a tax write off.
Strazdas1@reddit
What about people that need CUDA cores for their hobby?
sammerguy76@reddit
Like what? LLMs or machine learning? Video editing? You'll just have to make do with what you can afford. Since it's a hobby your tasks may take more time, but you can still get a much older card with CUDA cores.
Strazdas1@reddit
I use image generation to create tokens and backgrounds for TTRPG i run. The budget to comission is zero because i do this for free. But with CUDA cores i can do it with AI.
sammerguy76@reddit
You could do that with a 1070 or 1080 no problem since there is no time constraints. I ran SD on my 1070ti and it was fine.
Strazdas1@reddit
It would still be running on CUDA on a 1070 and 1080. I actually have a 1070 i could try it on but not sure if i want to. Im doing 700x700 because it needs to be divisible by 70 pixels.
sammerguy76@reddit
Yeah I am aware that the 10 series has CUDA, that's why I brought them up. I guess the point I was making is that it's probably not worth it to buy a new overpriced Nvidia GPU if you're just using it for something like that. Honestly, unless you are making hundreds upon hundreds I would just use one of the free or low cost generators online. They are far faster than running anything locally. Unless you are playing a pornographic TTRPG 😂🤣😂
Strazdas1@reddit
Online generators are limited once you stray outside the common tropes. In my game a lot of people have to wear gasmasks. Traditional image generators really hate it.
DirteeCanuck@reddit
It's nice to have it even just for light tasks or tinkering.
I just think it's something being completely overlooked in these comparisons as it does have some value.
Renard4@reddit
And what for, exactly? 99% of GPU buyers will never use CUDA at all. This is the very definition of worthless to most.
conquer69@reddit
The gpu stock is low precisely because of demand for non gaming tasks.
There are plenty of reasons to dislike Nvidia; there is no need to make things up.
Renard4@reddit
Don't be silly, the topic here is consumer GPUs.
Hamza9575@reddit
What backward compatibility? The 5000 series dropped backwards compatibility, the whole PhysX thing. If you want backwards compatibility get a 4090, the strongest GPU with PhysX support.
DirteeCanuck@reddit
I play a lot of emulators and also run Linux and Batocera for gaming.
PhysX I don't have much care for, but it sucks they dropped it. Maybe it can be added back somehow in the future.
I impulse bought a 5070 TI for $150 over MSRP ($1400CDN) and returned it.
Was giving me BSODs and had some issues that I probably could have sorted out. The reality is I don't want a 300W card. So the 4090 is also out of the question.
Today I grabbed a 5070 ASUS PRIME for MSRP, and at 250W it's even a little more than I would like (ideally 200-225W), but it should be a good fit for my needs.
SomewhatOptimal1@reddit
Something doesn’t add up, isn’t AMD much better on Linux for gaming…
Why u lying 🤥
Strazdas1@reddit
No? AMD has the better driver for AAA gaming on Linux, but if you are doing specific stuff you can often find the AMD driver simply not working.
BarKnight@reddit
Given the current prices, it's not really a better deal.
Omputin@reddit
I mean 5070ti is currently way closer to msrp than 9070 xt
OftenSarcastic@reddit
Lol when I looked at local prices yesterday they looked like this:
Today the 5070 Ti is still 1000 USD and the 9070 XT is 835 for the Nitro+ model. Given the incoming supply I'd rather wait for the 700 USD "MSRP" models to restock next week than pay 1000 USD for a 5070 Ti or 835 USD for the Nitro+.
Aerroon@reddit
I looked at a local computer store here:
ErektalTrauma@reddit
At those prices you'd be insane to buy a 9070XT over a 5070 Ti.
Strazdas1@reddit
welcome to europe, where AMD is always the worse deal.
Woodworkingbeginner@reddit
Funny, I would have thought that with the USA putting tariffs on Chinese goods, they would see a price increase compared to Europe. Turns out European retailers always manage to outdo themselves
Strazdas1@reddit
Both Nvidia and AMD are american.
Woodworkingbeginner@reddit
Ok, but one of the cards is assembled in America, and the tariffs are applied at the point of import, not manufacture
dab1@reddit
Here Nvidia is rarely the better deal. It probably depends on which European country you are in, and what retailers you are looking at. PCPartPicker usually doesn't list the better prices that I can find when I browse some of the stores in my region directly.
The prices and availability of graphics cards in general have become increasingly worse in the last few months. The new releases aren't correcting that trend, but I've seen the 9070XT around 820-870€, in stock or listed at those prices, while the cheapest 5070Ti listed is 1200€+.
The 5070Ti might be better overall (way better at RT/PT and lower power consumption), but at a 400€ premium it's not worth it in my opinion.
Strazdas1@reddit
there have been 5070Tis in stock for 999 post-tax, sitting for over a week now.
Pillokun@reddit
yep, when Polaris 10 (the RX 480) launched it was 3000sek (300usd today), when the 1060 6GB was basically the same price; among the older Maxwell GPUs the 970 was 2400sek and the 980 was 2800, and among the older AMD GPUs the 290 was 1400, the 290x was 1800 and the 390x was 2400sek..
somehow AMD GPUs are super expensive at launch, even as expensive as or more than the competitive Nvidia ones.
Recent_Rabbit1421@reddit
You'd be insane to buy any of these cards for that price…
ezkailez@reddit
If that were me I'd be insane if I'm even buying a gpu lol. I'm used to waiting for the best value for money or just downgrade.
Last year i replaced my dead 1660ti with a new (refurbished? It's a random chinese brand) $90 rx580 because i realized i don't really game that much.
And now i just upgraded to a 2k monitor, and a used $150 rx 6700xt looks mighty interesting
kikimaru024@reddit
I pre-ordered a Sapphire Pure 9070 XT on Amazon for €780 on Thursday.
Aerroon@reddit
That's a great price!
kikimaru024@reddit
Aye, the downside is it's not shipping for another 6 weeks; but that's fine - all I want from AIBs is a firm MSRP and a way for them to honour it!
AsheBnarginDalmasca@reddit
Yeah we're just back to 4080 super prices.
ishsreddit@reddit
the missing ROPs, resulting in -10% perf on an incorrectly reported number of 50 series GPUs, is also a downside. Idk why people aren't mentioning this more. I would bet people would be all over AMD for it....
1-800-KETAMINE@reddit
People have been all over this. A joke about it is frequently the top or at least among the top comments on any thread about the RTX 50 series.
ishsreddit@reddit
who else comments about ROPs here?
1-800-KETAMINE@reddit
Oh boy. Please just put "rops" into the search bar for this subreddit.
ishsreddit@reddit
I meant in threads with the 9070 vs rtx 50 series like this one. When comparing them apples to apples would you not agree it should be a consideration at the moment? Similar to when RDNA3 had vapor chamber issues?
Most people don't mention the absent ROPs when comparing the 2. That's what I was referring to. Sorry for the misunderstanding/lack of context.
deoneta@reddit
Truth is it's an issue that happens to a very small percentage of users and Nvidia is replacing every card that has the issue. It's just not as big of a problem as some would have you believe. If it were happening a lot we'd hear more about it.
You've got to take the negativity you see on this subreddit with a grain of salt, cause a lot of it is just karma farming. This is a case of people taking a legit issue that only affects a small subset of users and blowing it out of proportion.
1-800-KETAMINE@reddit
The subject is sort of a dead horse at this point. I guess you just missed it being mentioned constantly, nonstop, for the last couple weeks since it was discovered. Stick around on the sub for another day or two and I'm sure you'll see more comments about it.
Even on /r/nvidia it's mentioned all the time, this is one of the top posts for today: https://www.reddit.com/r/nvidia/comments/1j6hbu8/use_gpuz_to_check_rops_not_cpuz/
ErektalTrauma@reddit
All 0.5% of cards where you get a free replacement or refund, wow, such a huge issue.
conquer69@reddit
No one has verified that number. There is nothing stopping these companies from making shit up to brush issues under the rug. Like intel with all their fucked up cpus they didn't replace.
1-800-KETAMINE@reddit
to be fair it is ridiculous that it's "affected customers can reach out for an RMA" instead of "we are reaching out to those affected"
ishsreddit@reddit
The irony of these replies I'm getting.... Exactly my point: people ignore Nvidia missing ROPs. I'm not even mentioning cables still melting, inconsistent performance, etc. It's not just a price issue. There is a reason why reviewers call this the worst Nvidia launch ever.
ishsreddit@reddit
We are still well within the stage of addressing the issue. It's unclear whether or not users have been getting a quick and swift RMA process for their insanely inflated RTX 50 series GPUs.
Comments like yours are exactly the problem I pointed out lol.
Toojara@reddit
Similar situation here. 9070 730€, 5070 860€, 5070 Ti 1140€. The most expensive stocked 9070XTs were sold out at ~950€.
Pillokun@reddit
in Sweden you can find the 5070Ti (Gaming X Trio, Ventus, Prime) for 13,000sek, while the 9070XT is not in stock and the 9070 is almost 10,000sek, basically 1k usd.
OftenSarcastic@reddit
Here's the Nitro+ one for people who can't wait for re-stock: https://www.computersalg.se/i/24108877/sapphire-vga-16gb-rx9070xt-nitro-gaming-oc-2xhdmi-2xdp-nitro-amd-radeon-rx-9070-xt-gaming-oc-16gb
Disguised-Alien-AI@reddit
9070XT is way more available though. Just need a couple weeks and it'll stop selling out. If you live by a Microcenter you can get them at MSRP 100%.
Strazdas1@reddit
It's not. I can easily get any of the Nvidia cards except the 5070 because they are all in stock. Above MSRP, but in stock. The 9070XT is a lottery whether you find any in stock or not, and when you do it costs as much as a 5070 Ti.
aminorityofone@reddit
press x to doubt
ParusiMizuhashi@reddit
I have nothing to gain by lying. MSI put up all their 50 series cards on their webstore today
aminorityofone@reddit
X to doubt, as MSI upped the MSRP.... making it a bad deal. You got screwed, in short.
ParusiMizuhashi@reddit
Ill be fine
aminorityofone@reddit
a gpu isn't about being fine or not fine. It is a luxury item. You purchased a card above its MSRP. You got screwed, and you're okay with being screwed over, which is okay for you.
ParusiMizuhashi@reddit
Alright dude what is the point you're trying to make? My first comment acknowledges that the 5070 ti isnt the best deal. Are you just trolling to be a dickhead or what?
aminorityofone@reddit
You admit that it isnt the best deal and are mad that you got screwed. IDK you straight up say it isnt the best deal and are happy with being screwed. You tell me?
ParusiMizuhashi@reddit
Are you delusional or illiterate? I never implied I was either of those things
Frylock304@reddit
Same, plus the 5070ti is generally within 5% of the 4080 so good enough
Signal_Ad126@reddit
The monitor I need it for is exclusive gsync tho...
Plank_With_A_Nail_In@reddit
How old is your monitor?
Signal_Ad126@reddit
An early ROG 1440p
Plank_With_A_Nail_In@reddit
A new cheap IPS 1440p monitor will likely cost less than the difference between these cards and will be better than your current monitor...tech moves on.
Signal_Ad126@reddit
It's not my current monitor, it's for the garage. My current monitor is an Alienware ultra wide OLED. I'll probably just get a second hand 40xx for this build.
Slyons89@reddit
Even if it has a hardware Gsync module, if it has displayport 1.2 input, it should also be capable of adaptive sync (freesync).
Logical-Database4510@reddit
This really depends on how old the monitor in question is.
My old Asus monitor with module doesn't support free sync without a firmware update. How do you update the firmware? Ship it out to Asus /if/ they still even offer the service. They didn't in my case, which is the entire reason I ended up getting a 4070ti vs a 7900xt at the time as any NV card would have meant I had to buy a new monitor :/
As monitors tend to be one of the least often replaced components, there's likely a decent amount of early adopter gsync people who are stuck in this limbo. I didn't upgrade my monitor for another year after I bought the card, so it just was what it was at the time 🤷♂️
Signal_Ad126@reddit
Yeah plus I live in Australia, It's one of the early ones prior to freesync being announced by AMD. Some commenters are suggesting there was always a choice... It's still a good ROG 1440p monitor, I'll possibly consider a 40 series even to keep using it.
Slyons89@reddit
Yeah, I'm pretty sure that was Nvidia's main strategy with hardware Gsync modules to begin with: to lock in buyers so they keep having to buy Nvidia GPUs. As someone who switched back and forth between AMD and Nvidia a lot, I was happy when my Acer screen still worked with adaptive sync despite the hardware Gsync module. But it's a relatively recent display, from ~2020 I think. One of the Acer Predator ultrawide screens.
Logical-Database4510@reddit
Yeah mine was a 2017 model Asus 1440p display. No option but buy a new one or go NV 🤷♂️
While I've been mostly happy with my 4070ti, it definitely sucked being locked in like that.
The difficult thing to grapple with is the question of whether or not I'd make the same decision knowing the future...? Likely yes, unfortunately. Adaptive sync is that important to me... I bought the OG modkit and the monitor needed and put it together back in the day when the very first Gsync module launched as a mod for a specific ASUS 1080p screen. Reason I have a ROG Ally today over a Steam Deck as well tbh....
That's what makes these anticomsumer tactics so aggravating....they work, sadly 🤷♂️
I will say that adaptive Sync was a rare instance of my biting the bullet because I cared about the tech that damned much, so I don't see myself locking myself in like that again in the future. But, I guess you never know, eh?
ItsMeSlinky@reddit
If you have a 4070 Ti, you really don’t need to upgrade this generation
Logical-Database4510@reddit
Yeah I don't plan it for a while lol
Slyons89@reddit
Not to worry, Nvidia is discontinuing the hardware Gsync module anyway, so a future monitor should be an easy choice. In fact, all the highest rated new gaming monitors and OLED screens are purely adaptive sync / freesync / gsync compatible, no hardware Gsync modules to be found.
_zenith@reddit
Is it a PG279Q? The ROG 27” one? Quite a common and popular IPS monitor. It’s what I have.
ThankGodImBipolar@reddit
LOL, that is ridiculous
COMPUTER1313@reddit
ASUS: “Would be a shame if we held it hostage”
Signal_Ad126@reddit
Ah right, thanks for the helpful comment
sautdepage@reddit
You locked yourself in to proprietary tech, enjoy.
Signal_Ad126@reddit
This monitor existed before freesync ever did.
COMPUTER1313@reddit
Anyone want to buy some Rambus’s RDRAM sticks?
PoL0@reddit
your gsync monitor can be freesync compliant too.
CommenterAnon@reddit
I planned to buy the RTX 5070 for 800 USD. The RX 9070 XT was 85 USD more. The RTX 5070Ti is almost $200 above the 9070 XT in my country
In my situation RX 9070 XT is GREAT VALUE!
Gigabyte Gaming OC Model
anor_wondo@reddit
Wish AMD had released a high end card this time, now that they finally have competitive ML based upscaling. Too much of a cut down from the 5080 for VR, sadly
Pillokun@reddit
The thing is, it is easier to get hold of a 5070Ti (3000sek more, say about 300 usd/euro) than the 9070XT, as they actually become available from time to time, the 9070XT not so much; and the fact that the 9070 non-XT is in stock but for almost 10,000sek is appalling.
ftt28@reddit
nice aggregate video, but it seems quite skewed at the end of the video to theorize 9070XT stock stabilizing at $800, given the magnitude of difference in the stock they've been able to produce compared to Nvidia.
My bet is the 9070xt (msrp models) will settle much closer to original MSRP once restocks begin while the 5070ti will continue to be $900+ without any model ever being available for MSRP.
Current 9070 XT prices on eBay are scalpers trying to scalp in the gap between debut and restocks.
Charrat@reddit
To be fair, he did say it was too close to the 9070 XT launch to know where the prices would ultimately settle. The takeaway was that you should account for the price difference between the 5070 Ti and 9070 XT at time of purchase when making your decision; at the same price, the 5070 Ti is clearly the better choice. If the 9070 XT can be bought for less, it can become a better value depending on your needs - i.e. how much do you value RT, PT, DLSS compatibility, etc.
GloriousCause@reddit
He looks at a variety of prices, I wouldn't say any of it was skewed. The 1000/800 was based on most common sold prices on eBay for each card right now, and he mentioned the 9070xt hadn't been out as long yet to have time to settle.
ftt28@reddit
yes, but to me it's a bit skewed to reference the 5070Ti's mode price on eBay ($1k) but then use $900 and $730 as examples of possible real-world prices, conveniently maintaining the 25% price difference.
It feels misguided to mix the 9070XT's current price surge (days after a high-volume release) with the 5070Ti's possible response in pricing (weeks after a low-volume release, with no confidence in showing any true volume potential, let alone of MSRP models specifically).
Noble00_@reddit
Punched some numbers, and if it means anything to anyone: in the all-RT test, removing CP2077 OD (PT), the geomean goes down to 16.35%.
Bill_Murrie@reddit
...why..?
Noble00_@reddit
Why not? It's a data point to share. 23% geomean with it and 15.5% without it. It's the only data with >40% differential, that's a large difference compared to the other games.
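For anyone wanting to reproduce this kind of number: a geomean over per-game uplifts is just the geometric mean of the performance ratios. A minimal sketch with made-up uplift figures (not the video's actual data) shows how a single PT-style outlier pulls the average up:

```python
import math

def geomean_uplift(uplifts_pct):
    # Convert % uplifts to ratios, take the geometric mean, convert back to %.
    ratios = [1 + u / 100 for u in uplifts_pct]
    return (math.prod(ratios) ** (1 / len(ratios)) - 1) * 100

# Hypothetical per-game RT uplifts (%), with one large outlier at the end.
uplifts = [15, 18, 12, 20, 14, 75]
with_outlier = geomean_uplift(uplifts)          # noticeably higher average
without_outlier = geomean_uplift(uplifts[:-1])  # closer to the typical game
```

The geometric mean is the standard choice for aggregating ratios, but as the single-outlier case shows, it is still sensitive to one game with a very large differential.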
Strazdas1@reddit
Because it skews the data and shows bad results, that's why not.
conquer69@reddit
Why would you remove PT? All these path traced games are very much playable on the 5070 ti and more titles are coming. I think it's a reasonable expectation when people are paying $750+ for a gpu in 2025.
RunForYourTools@reddit
PT is in only some games, and those are heavily sponsored by Nvidia. It's not needed and very taxing! It exists just for the green team to brag about better numbers relative to the red team, nothing more!
CassadagaValley@reddit
We're way too early in PT to really be thinking about non-high-end cards running it. Maybe with the next gen set of cards we can seriously look at a 6070Ti or 10070XT running a game with PT, but it's currently at the point where if you aren't grabbing the 5080 or 5090 it shouldn't really be a factor.
StickiStickman@reddit
It's already been perfectly playable on a 4070 Ti, especially so on a 5070 Ti
niglor@reddit
It might not be "needed", but the effect you get when enabling PT in Cyberpunk is undeniable. CP2077 with PT being playable on a 5070 Ti is certainly an opinion though (I have a 5070Ti and play Cyberpunk).
EnigmaSpore@reddit
Or. It’s because people paying that much want to see what they’re getting in performance. I wanna see all the numbers. If im paying $750+, i need to see it. It’s not some conspiracy. We’re gpu nerds who just wanna see the receipts
conquer69@reddit
Are you saying people shouldn't play with PT enabled until AMD can sponsor their own PT in games?
Path tracing looks objectively good regardless of who sponsored it.
Noble00_@reddit
The 9070 XT has the grunt in RT for CP2077 Overdrive, don't get me wrong (that in itself should be impressive), but it's outlying data as it's really only optimized for RTX cards. Removing PT leaves regular RT (which still has a huge performance hit; I still kept Indy Jones), meaning the usual API calls that all vendors can tackle
conquer69@reddit
There is no evidence of this. But even if it was true, I don't see how that is a concern for the user. The only thing that matters is performance and image quality.
Why AMD is underperforming isn't really my problem.
Noble00_@reddit
But there is?
https://www.youtube.com/watch?v=-IK3gVeFP4s
https://chipsandcheese.com/p/shader-execution-reordering-nvidia-tackles-divergence
Not to mention, Ray-Reconstruction. Of course it isn't really your problem. This is a non-discussion. If you want path tracing you get an RTX card... You fully misunderstood a rather plain statement
onurraydar@reddit
Honestly the 9070XT seems like the better deal, but since I don't live near a Microcenter it's basically just as hard to get as a 5070Ti. If I am going to be waiting months for an MSRP model and doing hot-stock tracking, I'm just gonna try for a 5070Ti, as the 9070XT seems to be going up in price once AMD stops doing the rebates and the MSRP models all sell out. The 5070Ti also seems to be the better card: 8% better on average and 23% better in pure RT. If the 9070XT supply normalizes and the 599 models are still around I would probably go for it, but I don't really think that's gonna happen.
Disguised-Alien-AI@reddit
In the Cyberpunk RT example, the 9070XT runs as low as 2.5GHz. So, it's running quite a bit below spec. It seems like new drivers will fix some of these types of oddities and likely increase performance even more.
TheNiebuhr@reddit
Lol no, it's power throttling, simple as that. There's nothing to fix.
Disguised-Alien-AI@reddit
I tested it on my 9070 and I go from 3.1-3.2 Ghz, no RT, down to 2.75-2.85Ghz with RT on. So I think you are correct. However, his GPU seems to really take a hit. (I'm using the non-XT variant)
Sh1rvallah@reddit
Was it thermal throttling? I wonder if the RT load is causing some issues there.
Disguised-Alien-AI@reddit
Don't think so, because they all run well below throttling temps. Possibly an unseen high hotspot though....
Sh1rvallah@reddit
Yeah I was wondering about hotspot because isn't there some new component on there to handle RT now? I wonder if they have a sensor on that.