Really can't decide between the 3 horsemen of 1440p GPUs
Posted by FaceTheBlunt@reddit | buildapc | 117 comments
I want to make the jump to 1440p with a nice OLED monitor.
I mainly play games like cyberpunk, rdr2, gta, elden ring, the usual. Throw in some shooters and sports games at 1080p 240hz
I can get a 5070 for $550
a 9070xt for $689
a 5070ti for $975 (bruh)
all pre-tax
I'm thinking the ti is easily out of the question, outrageous pricing. But I really can't decide whether 9070xt is worth $140 over the 5070 (and the loss of the more popular nvidia features)
I want to also use my old gpu (3060) as a secondary to try to get Lossless Scaling frame generation set up on it, just to tinker and fuck around.
Any advice? Goes with a 7800x3d, 32gb ram, 850w psu
Fro5tbyte@reddit
Hot take: I’d say the 9070xt/5070ti are overkill for 1440p. I have a 9070 (non xt) and play games comfortably at 4k 120+ with a little FSR. It looks fantastic and I could definitely drop back to 1440p for a much higher frame rate
gakule@reddit
I don't think it's overkill really, but I did struggle with the choice between 9070 and XT... I ended up getting the XT and undervolting it for 5120x1440 and it has been rock solid.
Leo9991@reddit
I wouldn't say they're overkill at all. There are still some games that definitely push things, and new games coming out will continue pushing things.
iron_coffin@reddit
Resolution isn't that big of a difference anymore; just run 4k performance or 1440p quality for similar fps. I feel like it's more about the level of path tracing plus having the vram for mfg.
Federal-Implement716@reddit
Idk if you've decided yet, but I went from a 3060 Ti to a 5070 and it's great: enough vram, good fps, and I got it for 520€. A 9070xt for 690€ isn't that bad, but might not be worth it; the 5070ti is just too expensive. The 9070 I wouldn't really go for, nvidia is still nvidia, and it's only a $70 difference, so go with the xt if any. Also, if you sell the 3060 like I did mine, it means you can "upgrade" your gpu for like $300, and then do it again for a 6070 or 7070 in a year or two, both of which will probably be better than the 5070ti.
The main problem is that at the $1000 mark you're stepping into possible 3090ti or 4090 territory, which are both better
Prestigious-Crazy-84@reddit
Just out of curiosity (I'm not American): if that 5070ti is $975 before taxes, how much would you end up paying? Because that price tag is already big, holy shit
FaceTheBlunt@reddit (OP)
7% FL sales tax
kawaii_Summoner@reddit
9070xt = 5070ti > 5070
iron_coffin@reddit
Raster: 5070ti > 9070xt >>> 5070
Ray tracing: 5070ti >> 9070xt = 5070
kawaii_Summoner@reddit
Raster: 5070ti = 9070xt > 5070*
iron_coffin@reddit
https://www.techspot.com/photos/article/3048-nvidia-5070-ti-vs-amd-9070-xt-with-dlss-fsr/#difference2-png
It is close, but the 5070ti had some commanding wins. Plus, AMD cards are dropping off over time, so future games will be even more biased.
kawaii_Summoner@reddit
"Nvidia GeForce RTX 5070 Ti vs AMD Radeon 9070 XT with DLSS and FSR Enabled"
That isn't rasterization.
Jack2102@reddit
It is. Rasterisation means no ray tracing.
kawaii_Summoner@reddit
No. It means no frame gen
Jack2102@reddit
What are you talking about? Frame gen is a completely separate topic to rasterisation
kawaii_Summoner@reddit
Would you benchmark rasterization with frame gen ON or OFF?
Jack2102@reddit
You can benchmark it with frame gen both on or off, it's a completely separate thing. That's like asking if you can benchmark rasterisation with high or low textures. Rasterisation means that it is not using ray tracing.
kawaii_Summoner@reddit
When you turn on DLSS and FSR4, are they equal?
Jack2102@reddit
Maybe? Maybe not? Sometimes? I'm not sure the point you're trying to make with that, upscaling techniques also aren't relevant to whether something is rasterised or not
kawaii_Summoner@reddit
If you're looking for rasterization benchmarks to compare data, do you think you should enable settings that are "maybe? Maybe not? Sometimes?" the same?
Or do you think you should use the exact same settings to compare the two cards?
Jack2102@reddit
I think you should use the same settings, yes, but that wasn't what I disagreed with you on.
kawaii_Summoner@reddit
If you think you should use the same settings to measure rasterization, then why would you enable frame gen?
Jack2102@reddit
To compare how frame gen works on an nvidia gpu vs a comparable amd gpu?
The same way you'd compare dlss4 balanced on nvidia vs fsr4 balanced on amd
It's not an issue if the same features are used for both GPUs and clearly stated
kawaii_Summoner@reddit
"To compare how frame gen-"
No.
How would you compare rasterization? Not frame gen.
Jack2102@reddit
That's the disagreement. It is still a rasterised image regardless of whether frame gen/upscaling is used. The only time it becomes non-rasterised is when ray tracing is used.
kawaii_Summoner@reddit
Do you think the frames added by frame gen are rendered the same as native frames?
Jack2102@reddit
No, but that's not what rasterisation means. I'm not sure how much clearer I can be
kawaii_Summoner@reddit
So when you enable frame gen, it isn't all rasterized frames...
Jack2102@reddit
Those frames aren't using ray tracing (if it's not enabled) so they will be rasterised
Ray tracing and rasterisation are the inverse of each other, rasterisation can mean anything but ray tracing, and ray tracing can mean anything but rasterisation
kawaii_Summoner@reddit
Wrong. Rasterization is a rendering process that is entirely different than AI interpolation of frames.
Jack2102@reddit
Source? I can give sources for my explanation
Nvidia
kawaii_Summoner@reddit
Rasterization, ray tracing, and frame generation are three distinct, different processes used in computer graphics.
Jack2102@reddit
That doesn't mean frame gen can't be used with rasterisation
kawaii_Summoner@reddit
If you're measuring rasterization, why would you introduce AI generated frames into the measurement? Now you're measuring rasterization WITH upscaling software. Not pure rasterization.
Jack2102@reddit
I agree with you regarding benchmarking techniques but, again, thats not what I'm disagreeing with you on
kawaii_Summoner@reddit
Rasterization is a different rendering process than frame generation... brother... you're so close to learning something.
Define frame gen.
Jack2102@reddit
All frame generation does is insert AI generated frames in between the rasterised frames. That does not mean that the output is not rasterised; it doesn't change the technique that is used to generate the real frames
For an image to not be rasterised it has to be ray traced
kawaii_Summoner@reddit
Rasterization is a process; a rasterized image is the end result of that process. You're not measuring the quality of a rasterized image... you're benchmarking the rasterization process.
Jack2102@reddit
I'm not disagreeing with you on benchmarking processes.
Do I have to repeat what my one and only claim is again?
kawaii_Summoner@reddit
Do you think rasterized image and rasterization are the same thing?
Jack2102@reddit
No, but again, a technicality on wording that doesn't change my overall point. Yes, I once again agree with you that a rasterised image is the output from the process of rasterisation.
A rasterised image is the output you get when ray tracing is not used.
Worded better
kawaii_Summoner@reddit
Do you think that you can measure rasterization when you have frame generation enabled?
Jack2102@reddit
Holy shit are we back to this again?
Yes, if frame generation is enabled it is still a measurement of rasterisation performance, but with frame gen on top.
Once again, for those in the back
The only time it becomes non-rasterised is when ray tracing is enabled
kawaii_Summoner@reddit
Rasterised =/= rasterization
How would you measure rasterization with frame generation enabled?
Jack2102@reddit
At that point you're not measuring purely rasterisation, but rasterisation + fg, I will concede that
You seem quite hung up on the technicality of the wording regarding rasterisation & rasterised
kawaii_Summoner@reddit
You just don't seem to understand that rasterization is a process and a rasterized image is a result of that process.
Frame generation is a DIFFERENT process that adds AI generated frames in-between frames that are natively rendered.
And to measure your GPUs ability to use the rasterization process, you shouldn't add an entirely different process on top of it.
And Ray tracing is an entirely different process that is, just like frame generation, used ALONG WITH rasterization... and with Ray tracing enabled your GPU will use a hybrid approach of rasterization and Ray tracing, because the majority of geometry is still rasterized...
Jack2102@reddit
Ironic considering the only claim I'm making is factually correct lol
kawaii_Summoner@reddit
It's ironic that you don't know what frame gen is, and are incapable of learning
Jack2102@reddit
I'm not sure how much clearer I can be
Rasterisation does not mean that frame generation isn't being used. It means that the image is not ray traced.
kawaii_Summoner@reddit
'Rasterization' and 'rasterized image' are not interchangeable terms
Jack2102@reddit
Getting into a technicality on wording doesn't change the point I'm making
iron_coffin@reddit
Read the actual image: it's TAA. My mistake was that they added some RT games in, but 2 of the largest outliers are raster, so it doesn't matter. They average out to the same, but the biggest noticeable 15% swings are in the 5070ti's favor. If you only want to look at averages, 5% is mostly equal, but it's so variable that it really depends on game selection.
kawaii_Summoner@reddit
https://www.techspot.com/articles-info/3048/bench/average.png
22 game avg is +-4 fps. So, once again, 5070ti = 9070xt in rasterization.
iron_coffin@reddit
How old is that? Nvidia took finewine this gen
Ryan32501@reddit
In pure raster price/performance, 9070xt is the better deal over the 5070ti. Yes 5070ti is faster but not worth the 40-50% price increase imho. Remember there are no bad cards from this generation of AMD/NVIDIA. Only bad pricing
TabScarlet@reddit
Still NVIDIA software and resale value down the road. AMD GPUs don't hold value.
GroundbreakingAd799@reddit
How much value do you think they really lose? The used market right now only depends on the performance they get you. You're literally seeing cards selling $50 below their msrp right now because of the market on both sides, basically, unless it's a random guy on Marketplace
kawaii_Summoner@reddit
Brother it's the article YOU shared and you're asking ME how old it is?
iron_coffin@reddit
I didn't know it was from that article. Anyways we're talking in circles. I concede the averages are close even if the 5070 ti is generally ahead in multiple sets of averages. But the fact that it has more commanding wins and pulls ahead slightly at 4k implies it's stronger. Looking at the whole picture shows averages aren't the whole story.
I think the fact that the 9070xt depends on cache so much will hurt it, just like the 5800x3d is starting to suffer in newer games.
Yes, you're literally correct, the benchmarks as of now show a basic tie at 1440p (even if 5% isn't nothing).
kawaii_Summoner@reddit
Well hopefully next time you'll say:
Raster: 5070ti = 9070xt > 5070
Ray tracing: 5070ti > 5070 > 9070xt
(5% at less than 100 fps is less than 5 fps, btw)
iron_coffin@reddit
It's a geomean, but some games are close to 20% like I said.
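For anyone unfamiliar with the "geomean" being referenced: review outlets usually aggregate per-game results with a geometric mean so no single high-fps title dominates. A minimal sketch with made-up numbers (not real benchmark data):

```python
from math import prod

# Hypothetical per-game average FPS for two cards across the same titles
# (made-up numbers, purely to illustrate the aggregation).
fps_a = [92, 110, 78, 145, 60]
fps_b = [88, 118, 75, 138, 63]

def geomean(xs):
    """Geometric mean: the n-th root of the product of n values."""
    return prod(xs) ** (1 / len(xs))

# Taking the geomean of per-game ratios weights every title equally;
# an arithmetic mean of raw FPS would let the 145-fps game dominate.
ratio = geomean([a / b for a, b in zip(fps_a, fps_b)])
print(f"Card A vs Card B: {ratio:.1%} (geomean of per-game ratios)")
```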
Large-Teach9165@reddit
Seeing the chart, it's still not a win. The fact it goes both ways for the 9070 XT and 5070 Ti depending on the game is not a win, and the final result being +2 FPS for the 5070 Ti because one random game like Arc Raiders favoured the Nvidia card more than the average is not a win.
A ">" situation would be something like 5080 > 5070 Ti. But when the results swing like a pendulum from game to game, with a final difference of 2 FPS because of one Nvidia-favoured game, it is truly a "=" situation.
iron_coffin@reddit
Yeah, it is close, but the 5070ti wins are bigger, so I'm thinking more games will use the raw power in the future. I guess the real takeaway is the raster doesn't justify the price. It's only mfg that could justify it.
GroundbreakingAd799@reddit
MFG doesn't justify the silicon printed. Talk to me when it boosts the latency by less than 10 ms
Large-Teach9165@reddit
Hmmm, idk. Capcom and a lot of console titles, except for Guerrilla and Insomniac games, are powerful names that definitely favour AMD the most. It all depends on what you play, and that makes them equal, because in another contest, like the 9070 XT vs the non-XT, there's no point to be made; same with the 5070 Ti and 5080.
iron_coffin@reddit
It pulls ahead at 4k, also, implying it's a stronger card that is limited by other factors. The 3060 12gb can beat the 3060ti 8gb with some settings, but the 3060ti is still faster.
Moving the goalpost:
Yeah it does depend, but the 5070 ti is slightly faster. And effectively faster because you can reduce the resolution to match the equivalent fsr.
Then with things like lumen, the line between raster and rt is blurred and will favor the 5070 ti going forward even if you don't turn on explicit ray tracing.
apmspammer@reddit
If you looked at a PC that was playing an average game on the 5070 Ti and another one that was playing it on the 9070 XT, without the FPS counter on, you wouldn't be able to tell the difference.
Gen-Turgidson@reddit
Just finished a playthrough of Pragmata using DLSS 3x, very impressed with the latency on some recent games.
Firm_Serve_5480@reddit
Consider a second-hand 4080/Super in warranty. I've got myself one for 750€ with a 1.5 year warranty and I'm having a blast with it
althaz@reddit
9070XT just seems like the best option here by an absolute mile?
Anthrobotics@reddit
5070ti is a capable 4K card as well using upscaling.
ConsistencyWelder@reddit
So is the 9070XT though.
BaronB@reddit
The 9070 XT is a little under 25% faster than the 5070, basically just a little bit less performance gain than the 25% price increase.
Considering the 5070 Ti is a little under 30% faster than the 5070, but is more than 75% more expensive, the 9070 XT is the obvious winner here.
You can also look at the 9070 non-XT, which is about 10% faster than the 5070 and generally around the same price or even less.
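To sanity-check that value math, here is a minimal sketch using the thread's pre-tax prices and the rough relative-performance figures quoted above (both are assumptions for illustration, not authoritative benchmark data):

```python
# Relative performance vs the 5070 (the rough figures from the comment
# above) and the pre-tax prices from the original post. Both are
# assumptions for illustration, not authoritative benchmark numbers.
cards = {
    "5070":    {"perf": 1.00, "price": 550},
    "9070 XT": {"perf": 1.25, "price": 689},
    "5070 Ti": {"perf": 1.30, "price": 975},
}

baseline = cards["5070"]["perf"] / cards["5070"]["price"]
for name, c in cards.items():
    # Performance per dollar, normalized so the 5070 = 1.00
    ppd = (c["perf"] / c["price"]) / baseline
    print(f"{name:8s} perf/$ vs 5070: {ppd:.2f}")
# Prints ~1.00 for both the 5070 and 9070 XT, and ~0.73 for the
# 5070 Ti, which is the "obvious winner" arithmetic in a nutshell.
```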
The main thing you're losing with the Nvidia GPUs is access to DLSS 4 upscaling and frame generation. These are legitimately better than FSR 4 and supported by more games. However, FSR 4 is extremely good, and unless you're looking at them side by side most people would have a hard time telling them apart. And in many situations even side by side they're difficult to tell apart. It's just that DLSS 4 upscaling is slightly better in a few places, especially when it comes to higher amounts of upscaling. Basically you can run DLSS 4 in Balanced (~58% resolution) and get similar or slightly better image quality than DLSS 3.5 or FSR 4 in Quality (~66% resolution).
DLSS 4 is also a lot more expensive to run than DLSS 3.5 or FSR 4, so even though you get better image quality from a lower internal resolution, you get less framerate improvement than DLSS 3.5 or especially FSR 4. FSR 4 in Quality gets a bigger framerate increase than DLSS 4 in Balanced, closer to DLSS 4 Performance, at which point FSR 4 starts to look better than DLSS 4 in some situations. The resulting image from DLSS will have a little less ghosting and look a little sharper, but other upscaling artifacts will be more noticeable. So it's not as clear cut of a win as it seems.
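To make those percentages concrete: they are per-axis render-scale factors, so pixel counts shrink faster than the numbers suggest. A quick sketch at 1440p, assuming the commonly cited factors (exact values can vary by version):

```python
# Per-axis render-scale factors commonly cited for the upscaler modes
# (assumed here for illustration; exact values vary by version).
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def internal_res(out_w, out_h, scale):
    """Resolution the GPU actually renders before upscaling."""
    return round(out_w * scale), round(out_h * scale)

for name, s in modes.items():
    w, h = internal_res(2560, 1440, s)
    share = (w * h) / (2560 * 1440)
    print(f"{name:11s}: {w}x{h} internal ({share:.0%} of output pixels)")
# Quality renders ~44% of the output pixels, Balanced ~34%,
# Performance 25% -- which is where the fps headroom comes from.
```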
DLSS frame generation is also better than AMD's AFMF frame generation. AMD's implementation has some frame pacing problems, and worse artifacts. But honestly, I don't find DLSS frame generation usable either, so I consider this a moot point. Lossless Scaling has worse artifacts and more latency than either DLSS or AFMF, but gives you a lot more control and the dynamic frame generation implementation is awesome. But, again, I don't really find that to be usable either as the artifacts and added latency just don't make sense unless you're using a monitor with well over a 240hz refresh rate. I personally think you need a base framerate of at least 120 to make frame gen even useful at all as the latency and artifacts are way too noticeable to me at below that.
The only real advantage the Nvidia GPUs have is CUDA support. Some professional applications require it, and those that don't generally perform much, much better on Nvidia than AMD.
...
The short version is get the 9070 XT or 9070, unless you require an Nvidia GPU.
FaceTheBlunt@reddit (OP)
Bro thanks for the explanation
It's very interesting what you say about frame gen, do you think basically it's not needed at all to reach 144fps in ultra/high settings? Given that you use fsr/dlss?
I use it on my 3060 currently in cyberpunk and I genuinely couldn't tell any difference in latency. Odd artifact here or there though.
Really leaning towards the AMD side due to pricing. Hadn't considered the non-XT model though for some reason. If it can do 144 in the newest games at 1440p high/ultra it's a no brainer, gonna look up some benchmark videos.
Again thank you brother
BaronB@reddit
Different people are more and less sensitive to latency. People who tell you they can feel the difference of 1ms are full of shit. But 10ms differences, that can be significant to some, and completely invisible to others. Similarly some people can play games with nearly half a second of latency (not uncommon with cloud streaming services) and not notice anything wrong, but most people will find that to feel really sluggish and heavy.
The thing to understand about all of these frame generation technologies is they're frame interpolation techniques. Meaning you are always a full frame behind at the minimum, and more like a frame and a half due to it needing to generate the interpolated frames. At 60fps that's 25ms, and that can be a lot for some people, or not noticeable for others.
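That "frame and a half" figure is simple arithmetic. A minimal sketch, assuming the 1.5-frame delay described above:

```python
# With 2x frame interpolation the newest real frame is held back until
# the generated in-between frame has been shown, so input-to-photon
# latency grows by roughly 1.5 real frame times (the figure above).
def added_latency_ms(base_fps, frames_of_delay=1.5):
    frame_time_ms = 1000 / base_fps      # duration of one real frame
    return frames_of_delay * frame_time_ms

for fps in (30, 60, 120):
    print(f"{fps:3d} fps base -> ~{added_latency_ms(fps):.1f} ms added")
# 60 fps works out to the 25 ms in the comment above; doubling the
# base framerate halves the penalty, which is why a high base
# framerate matters so much for frame gen.
```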
GroundbreakingAd799@reddit
Nobody can consistently tell less than like 5 ms, but 20-60 ms, which happens with framegen at low fps, is the equivalent of playing on a server in another country vs locally.
Like it's to the point you could use Nvidia Shadowplay and it would be the same to you (not that I'm suggesting that; I don't support it unless you use it for free)
BaronB@reddit
Latency with game streaming can be much, much higher than 60ms, depending on how far away that other country is. Honestly 20~60ms is what I would expect to see when playing on a server that's less than 50 miles from you.
There are online calculators that estimate the latency between two places "over the internet", but these are more what a dedicated fiber optic line between two locations could achieve. The actual internet is far less direct and there are a lot more stops in between.
All of the major game streaming services have boasted about the massive latency improvements they've made over the years, and like to talk about the huge tech advances that helped achieve that. And yes, better tech has helped reduce it... by about 10ms at most. The rest of the "improvements" have come from building a shit load more data centers closer to users. If you're close to one, you'll have a good experience. If you're not, it's not significantly better than it was a decade ago.
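For a rough feel of the physics floor behind that: signals in optical fiber travel at roughly two-thirds the speed of light, so distance alone sets a minimum round trip. A sketch (illustrative only; real routes add detours, hops, and queueing):

```python
# Lower bound on round-trip latency from propagation alone.
LIGHT_KM_PER_MS = 299_792 / 1000   # speed of light in vacuum, km per ms
FIBER_FACTOR = 2 / 3               # approx. propagation speed in fiber

def min_rtt_ms(distance_km):
    one_way_ms = distance_km / (LIGHT_KM_PER_MS * FIBER_FACTOR)
    return 2 * one_way_ms

# ~50 miles, regional, cross-country, intercontinental
for km in (80, 500, 2000, 8000):
    print(f"{km:5d} km -> at least {min_rtt_ms(km):4.1f} ms RTT")
# 80 km is under 1 ms of physics, so the 20-60 ms people actually see
# nearby is mostly encode/decode and routing overhead, not distance.
```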
GroundbreakingAd799@reddit
My point kinda stands. Right now I see framegen as a ridiculous gimmick; x2 can be usable from what I've seen.
But that's the future of how things are gonna go. DLSS was shit when it came out and I hated it with a passion; right now it's just a thing that allows you to play 4k or 1440p on a low end card, or make your older card play newer games.
I hope framegen x2 gets to the point where it's like 10-20 ms to activate it at 60 fps at most, and that they don't suddenly forget raster and optimization
BaronB@reddit
Frame gen will be a roughly 1.5 frame latency hit, and always will be, because there’s no other way to do it.
PowerApp101@reddit
Does AMD's anti-lag feature help here? I notice it is enabled in some games I play, but honestly I don't know how it works. I have disabled it to check but don't notice any difference!
BaronB@reddit
It can, yes. AMD Anti-Lag and Nvidia Reflex have a lot of similarities. The most significant thing they both do is turn off double or triple buffering, something that GPUs started doing over a decade ago to help smooth out framerates, but that also increases latency by 1-2 frames. So that's a nice big win at the cost of a more variable framerate (something that the ubiquity of variable refresh has helped make far less of an issue).
The next thing is, weirdly, capping the user's framerate to just a little below your monitor's refresh rate. This is better than running a bit above your monitor's refresh rate even with VSync off.
After that there are lots of other low level optimizations like grabbing the mouse and keyboard input at the last moment, and exposing some more direct paths to parts of the graphics driver that avoid some DirectX / Vulkan API overhead. But really these improvements account for maybe another ms or two in benefit vs the main two. And those are things people have been doing manually for ages to reduce latency. Nvidia (and then AMD) just offered a way to do it automatically, and in a way that doesn't require you to manually calculate the optimal frame rate limit for your specific monitor, which is not as simple as "refresh rate -3" like many people claim.
But the short version is if your game is not hitting the max framerate of your monitor, and double / triple buffering is already off (I think it already defaults to off on AMD unlike Nvidia) then the differences are going to be very, very small.
...
I should note that this is one of the reasons why, in Nvidia's marketing showing the latency differences between using frame gen or not, there's often basically no difference or even slightly lower latency when using frame gen. It's because they're comparing a worst-case setup vs frame gen with Reflex on (as it's forced on). But that's not an honest comparison. An honest comparison is frame gen vs just Reflex alone, because disabling triple buffering removes about as much latency as frame gen adds.
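On the "refresh rate -3" point: a flat 3 fps of headroom means very different things at 60 Hz vs 240 Hz, which is part of why the simple rule falls short. A minimal sketch contrasting it with a proportional cap (the 3% margin is an illustrative heuristic, not Reflex's or Anti-Lag's actual formula):

```python
# Flat rule: always leave 3 fps of headroom below the refresh rate.
def flat_cap(hz):
    return hz - 3

# Proportional rule: leave a fixed percentage of headroom instead,
# so the margin scales with the refresh rate. The 3% value is an
# illustrative assumption, not any vendor's documented formula.
def proportional_cap(hz, margin=0.03):
    return hz * (1 - margin)

for hz in (60, 144, 240, 360):
    print(f"{hz:3d} Hz: flat {flat_cap(hz):3d} fps, "
          f"proportional ~{proportional_cap(hz):.0f} fps")
```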
Geohfunk@reddit
There are two things with similar names: AMD Antilag and Antilag2. Antilag does almost nothing and can be ignored. Antilag2 is AMD's answer to Nvidia Reflex.
To give a simple example, let's say a game has 30 ms latency. Reflex (or Antilag2) might reduce that to just 10ms latency. FrameGen might increase that up to 40ms. FrameGen always turns Reflex on if it wasn't already.
Antilag2 is not supported in all games, but since more games support Reflex you can use Optiscaler to enable Antilag2 in single player games.
kemicalkontact@reddit
You won't be able to reach 144fps ultra/high settings without DLSS and frame gen.
GroundbreakingAd799@reddit
For AAA games that aren't Doom or something like that, sure, and especially with DLSS. For the rest of games that's not true. You don't need to play ultra either, so the GPU is actually capable either way, just not consistent
kemicalkontact@reddit
Well I know Cyberpunk 2077 for sure won't be able to do it
PowerApp101@reddit
FWIW I have the 9070XT and play games on 1440p high/ultra and usually hit 100+fps without having to use any framegen at all. However the very latest games are pushing the cpu/gpu harder and sometimes I get 60-100fps. Which is still fine with me.
MultiMarcus@reddit
Fsr 4.1 is great, but it’s not all that much faster. DLSS 4 certainly isn’t much more expensive than fsr 4. It’s more expensive than the original DLSS 3.5 or fsr 3.1 but both technologies have become more expensive.
DLSS 4.5 is admittedly more expensive, but that is also quite a bit better quality in some ways. Generally speaking, I think FSR 4.1 is a very good all-purpose model but it doesn’t have the clarity that the transformer model DLSS achieves in motion.
You are also confusing a few technologies here. AFMF is not the AMD frame generation. The equivalent of DLSS frame generation is AMD's MLFG, and before recently it was their analytical frame generation branded as a part of FSR 3.1. AFMF is driver-level frame generation, not implemented in games; it would be equivalent to Smooth Motion, and it's working on roughly the same level as what Lossless Scaling does. But they are very different technologies in implementation, with the game-implemented techniques really being much better. That doesn't really change the fact that they have big frame pacing issues on the AMD side, though; just some terminology clean-up.
I would say the big advantage of Nvidia gpus is really path tracing. It’s also just software stack in general. Ray reconstruction out classes ray regeneration handily. The upscaling is generally still something where Nvidia is better. Frame generation, they are better at. And even latency reduction using reflex vs anti-lag they are better at.
You are paying a lot more money for a more forward looking architecture if you buy Nvidia right now, and I don't necessarily think I would recommend the 5070 TI over the 9070XT because they are priced so very differently, but there are quite a few downsides that you have to evaluate for how much they matter to you
BaronB@reddit
Yep, you're right, DLSS 4.5 is the really slow one. DLSS 4 is basically on par with FSR 4 in terms of cost. Hardware Unboxed recent comparison is a great one for anyone else following along. (I suspect u/MultiMarcus has already watched it.)
https://youtu.be/T3MjSxysft0?si=7mF7qBCq3J9KUhFv
Good catch on AFMF vs MLFG too. I obviously cannot keep track of AMD's acronyms for its features.
This is certainly one of AMD's biggest strengths though... coming up with names for features and acronyms that no one can remember. Really among the best in the business at that. 😉
Ray reconstruction is definitely a big advantage Nvidia has for ray tracing, when it works. But there are like 4 games that use it in a way that its benefit is actually even visible. That's still a lot more than FSR Redstone ray reconstruction, but it's still one of those features that unless the only thing you care about is being able to play Cyberpunk 2077 with RT Overdrive path tracing enabled, it's hard to call it out as a really important feature. Weirdly in a few games turning it on when using DLSS 4.5 looks worse than leaving it off because ray reconstruction and DLSS 4.5 don't always play nice.
And while I certainly can't argue that Nvidia's architecture isn't more forward looking than AMD's, I'd also say I generally don't put a lot of stock in considering what a GPU might be able to do in the future. The mid and low end 20 and 30 series GPUs were supposed to bring ray tracing to the masses... and the reality is while they had the forward looking hardware, they didn't have the performance to really accomplish that.
MultiMarcus@reddit
AMD and their naming schemes are so, so horrible. The Ryzen AI Max+ 395 is absurd, for example; I still probably screwed up the order.
Ray reconstruction is, I think, in about 20 titles now, which is certainly not a whole lot, and it's maybe only 15 of those where I think it's worth it and maybe 12 where it is transformative enough to use on something like a 5070 ti. But the reality is Nvidia is pushing path tracing and we are seeing more games adopt it, and many games are going for RTGI, so we will probably see wider adoption of ray reconstruction as we head into a more competent RT generation, and AMD has an alternative. I'm just kind of wondering if there will be a new ray regeneration version for the 9000 series or if they'll release something with the 10000 series and drop the 9000 series.
And while I agree that we shouldn't generally buy products based on what they could do in the future, the reality is I feel like we all have to play a bit of seer when it comes to graphics hardware. Like the people who bet that upscaling was basically a passing fad and bought the 6000 series from AMD. I'm sure they regret that, even though they will now finally be getting a good upscaling solution, having had to play games for upwards of six years without using upscaling or relying on a poor upscaling technique.
I think the 20 and 30 series cards really do highlight that forward looking architecture. Compare the RDNA 1 cards to Turing. RDNA 1 has basically been fully abandoned: it doesn't support the DX12 Ultimate spec and doesn't have upscaling. Meanwhile, yes, maybe the 20 series did not realise the dream of playing most games with heavy RT. I still think it's at least able to play games, and as we can see with Indiana Jones and Doom: The Dark Ages, two of the rare games that require RT acceleration, they still play quite well on the 20 series cards, including as low as the 2060. And I would say the 30 series also delivers a bump in RT performance and generally a better visual experience. It's maybe not able to do full path tracing, but medium-heft RT implementations seem quite viable.
If I put on my prediction cap, I really do think that we will start seeing more titles use path tracing with the next generation consoles, and the 5070 ti and 9070 xt seem quite similar in performance to what is expected from Xbox helix and the PS6, though seemingly a bit weaker than the helix console and a bit stronger than the PS6. Based on the rumours of RDNA 5/UDNA, it does seem like the 5070 ti is more architecturally prepared for that than the 9070 xt, but I guess it all depends on how good these companies are at keeping their hardware updated.
Still even with all of that taken into account I think most people should buy the AMD card because it’s much cheaper and you could just spend those like $300 to buy a new GPU a year earlier maybe in 2032 or something. I just think it’s worth at least keeping in mind.
Its_Pamela_Isley@reddit
If you are after efficiency, the 9070 is the clear winner. Most balanced card and not far behind the 5070ti/9070XT.
Skywers@reddit
If you only play video games, take 9070 XT. If you have the budget take the 5070 Ti. Otherwise 5070.
Ben_Nevi@reddit
I bought a 5070ti for the same price, but after tax. And I'm very happy with it. I play comfortably native 2k. DLSS quality if I want to use Ray Tracing. Frame gen if Path Tracing.
I bought it in November tho. If I were you now I'd go with 9070xt. 9070 will run out of juice too soon.
Lekanswanson@reddit
I'm also trying to decide, but since I'm planning to go Nvidia (I've been nvidia since day 1 of building PCs), my choice is between a 4080 super, a 4070 ti super if the price is right, and a 5070 ti.
I'm upgrading from a Gigabyte 3080 Gaming OC 12gb version, so whatever card I get needs to have at least 16gb
ChronoZaga@reddit
DLSS or no? YOU have to decide how much value DLSS brings to the table. We can’t answer that for you.
FaceTheBlunt@reddit (OP)
I honestly have no idea what the implications of not using dlss would be. I understand its purpose and know of AMD's equivalent. But I'm nowhere near educated enough as to why dlss is better, or what to even look for in quality difference with dlss vs fsr
The thing that stands out the most to my ignorant eyes is fps, and above 144 it stops being a huge difference
iron_coffin@reddit
Find a video comparing fsr 4.1 to dlss 4.5. Frame gen is a lot better on nvidia though, and you'll probably need it to hit 144 in modern games
TortieMVH@reddit
Works in a lot more games too.
wubbadubdub_zzz@reddit
Main difference is number of games supported.
Ok_Fan_1637@reddit
Just looking at the pricing already gives you the answer. The 5070 Ti is superior in almost everything: ray tracing, frame gen x4, rasterization, overall features. If you want a visually impressive AAA gaming experience without spending too much, then the 5070 is also fine if you can accept 12GB VRAM.
The 9070 XT’s main advantage is price. But it has worse drivers, more random issues than Nvidia, and weaker ray tracing and frame generation technologies. Sure, it has 16GB VRAM and strong raw performance, but in 2026 when almost every game launches poorly optimized, raw performance alone doesn’t help much. On top of that, Nvidia GPUs are also much better for work and AI workloads thanks to CUDA support.
jaymorningside@reddit
It's ultimately a matter of practicality, but my 16gb 5060 TI runs everything I throw at it 1440P at a great frame rate. OBVI a bigger bus width would be better and more expensive cards will run things better, but if you can find a 5060 TI around MSRP it's a solid choice for 1440P. If I were OP however I'd bite the bullet and get a 5070 TI.
itsmekusu@reddit
IMO 5070 is the best price to performance between all of them.
FairwayGhost@reddit
Where are you finding the 9070xt at that price?? Is this USD?
PvtHudson@reddit
Where you seeing a 5070 for $550?
FaceTheBlunt@reddit (OP)
Microcenter
GapOk8380@reddit
Check your local Walmarts, believe it or not, for 5070ti's. One of my local ones gets them in randomly for MSRP @ $750.
WaterWeedDuneHair69@reddit
I’d probably get a 5070, then when prices drop in the future sell it and get a 5070 ti
bakuonizzzz@reddit
For that price difference just go for the 9070xt, unless you're heavily in the nvidia feature camp. I just wanted plug and play, but I definitely would have gone with the 9070xt if it had been more available and FSR 4 more supported at the time when I bought my 5070 ti.
Agent_Nate_009@reddit
The 9070 is close to the same performance as the XT model. I guess you have to decide if $70 is worth maybe 10% performance improvement and 60 extra watts of power draw.
I have an RX 9070 and I really like it, got mine for $574
RAF2018336@reddit
The 9070 is the perfect 1440p card for 3-5 years easy. But at that price might as well get the 9070xt
BillionaireBear@reddit
9070xt is just the best value here. 5070 will (quickly) leave you wanting more. If you can happily afford it, 5070ti is best choice. I have a 9070 in my 2nd pc, it’s a great 2nd PC Card, but I would not be satisfied for my main rig
Reddit_is_srsbsns@reddit
You made it dramatic bruh.
AU_Cav@reddit
If you could drop the $700 for the XT, you wouldn’t be asking here, I don’t think. It’s a better card and you can find that anywhere.
After my entire build, the budget didn't have room, so I got the 5070. It's a great card and I don't regret it.
But if the $700 is in your budget you should get the XT.
Traditional-Box1301@reddit
Going through the same dilemma. Feels like I chose the worst time to start upgrading. Ordered a 5070 for about $620 shipped and then the 9070xt for about $745. I def don’t want to spend over 1k for the 5070ti or 5080. I think I’ll return the 5070. Seems like the price for what you get with the 9070xt is really hard to beat right now.
SavedMartha@reddit
9070XT
exilekiller@reddit
Try to find a used 5070ti. I picked a perfectly good one up a few weeks ago for 850.
seanxfitbjj@reddit
9070xt is your best bang for buck. If you can afford a 5070ti then that’s the best future bet.
Diarrheuh@reddit
Cop a 9070XT