5070's actual performance in 2026
Posted by captainmeowy@reddit | buildapc | View on Reddit | 92 comments
At launch, most people said it was basically just a 4070 Super reskin. But after some driver updates a few months back, I've been seeing comments across reddit claiming it's now closer to a 4070 Ti.
The opinions are mixed and all over the place, so I just wanted to check here: has anyone actually seen it perform on par with a 4070 Ti across today's games?
I'm looking to buy one and want to know if it's worth getting 50 series vs 40 series.
Appropriate_Taro3424@reddit
I’ve been using an RTX 5070 for a couple of weeks. Knowing what I know now, I would have gone for the 5070 Ti.
Why?
Because the 5070 sits right on the edge of playable 4K Path Tracing in Cyberpunk 2077.
Raster at 4K is effortless. RT is also very playable with DLSS Balanced and still looks great with current presets.
Path Tracing is where things get tricky. Once you try it, it’s hard to go back:
– the visual upgrade is obvious
– the card can almost handle it at a comfortable level (DLSS Performance is barely playable even with FGx2, Ultra Performance runs better but looks noticeably worse and is a step too far for me)
And that “almost” is the problem.
Roughly +10% performance and +2 GB VRAM would make a big difference. Especially VRAM – PT is extremely memory-hungry and that’s often the real bottleneck, not raw compute.
After a lot of tweaking (mods, overrides, caps, .ini edits), it is possible to reach a stable and enjoyable setup.
But realistically, I’d rather have paid the extra ~300 EUR for a 5070 Ti and just play the game instead of spending hours tuning settings.
TheYucs@reddit
It's funny you mention that, because I just upgraded from a 5070Ti that I OCed a lot to a 5090Dv2 for the exact same reasons. I found, even with a +10-15% OC, the 70Ti to be an edge 4K GPU with Pathtracing since you're required to run it with DLSS B or P with FG to get above 90 FPS. It just introduces a lot of artifacts that I became more and more sensitive to as I continued using DLSS and FG.
I say this to basically tell you, the 5070Ti is an amazing card, but it isn't perfect if you want enthusiast level graphics. It definitely functions much better at 4K than a 5070 would, though.
Appropriate_Taro3424@reddit
Thanks! It probably also depends on personal preference (e.g. one's level of tolerance for artifacts). I am pretty happy with my current stable 80fps (FGx2), DLSS Performance, and simplified PT (1 ray, 2 bounces). But once I had a 5070 Ti running vanilla PT at 90 FPS, I'd probably think "Damn, I want DLSS Quality".
Locke357@reddit
Upgraded from a 3060ti to a 5070, paired with a 5700X3D
While I don't have the direct comparison, I will say there is one good reason to get a 50-series over a 40-series: framegen
Being able to run framegen above 2x on AAA singleplayer games is a huge win. The input lag is barely noticeable for singleplayer (some people say it's good enough for multiplayer, but I haven't tried that); for example, I can turn settings to high with ray tracing on in Cyberpunk 2077 and then max out my monitor's refresh rate at 180Hz
Plazmatic@reddit
Keep in mind framegen:
– does not improve latency
– may increase latency (though depending on the game and how they integrate with Reflex, it may not increase enough to matter)
– costs performance in its own right, which isn't a big deal when you aren't GPU limited, but 4x framegen is not 4x the framerate; depending on the scenario it can be 2.5x, though 3x is more common
At minimum, the time between when you input an action and when you see it is your base frame time. Reflex or whatever other latency-reduction technology Nvidia talks about is all about trimming additional latency, not going below what is possible. This means that if you're struggling to render something at 30fps, it's going to feel like crap when you 4x it (or more likely, 3x or 2.5x it if you're already struggling); your actions are still going to be felt at 30fps or worse latency.
Ideally your base frame rate is 60, or even better something like 90+ fps (so GPU limitations don't screw with frame pacing). Many games limit input handling, or even the physics rate, to 60hz separate from display rate anyway, though it can get more complicated than that. But if you're struggling to hold 60, the frame gen overhead may drop the base frame rate you're multiplying from, depending on how your system handles frame sync, so having good wiggle room stops that from being nearly as big an issue.
Additionally, if your monitor cannot display the framerate you're running at, you're actually losing out by using frame gen beyond the refresh rate of the monitor. Think about that: you have to be able to comfortably handle the game at 60fps already, and even if you could do 4x, you'd be at 240hz. Most people are not gaming on 240hz displays; 120 to 165? Sure. 240? Not nearly as many. And having your frame rate not be a multiple of 60 can cause issues in some games (SOMA, some emulators, old games, some RPG Maker games), so you might just be lowering your frame rate anyway. A base-12 multiplier also helps with VRR, which doesn't have to change as much to hit integer multiples: 12 is divisible by 2, 3, 4, and 6, way more factors than 10 or 5 has, and 120hz has twice as many factors as 165hz, for example.
So: "4x" is really more like ~3x, or you're compromising on your visual fidelity (in which case it's no longer "free frames"), or you're using a 5080/5090.
Remember the input lag cannot go below the base frame time, plus all the regular input lag you would have had without frame gen. If your base frame rate is high enough, you're not going to notice much of anything.
Something like 3x on a 60fps base is the sweet spot.
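To make the arithmetic above concrete, here's a quick sketch. The ~15% frame-gen overhead is an assumed illustrative number (not a measured value for any specific GPU), but the structure matches the argument: the FG cost hits the base rate first, output fps is the multiplied result, while the latency floor stays tied to the base frame time. The divisor count shows the 120hz-vs-165hz factor claim.

```python
# Back-of-envelope frame generation math. ASSUMPTION: a flat 15% FG
# overhead on the base frame rate; real overhead varies by game and GPU.

def effective_fps(base_fps: float, multiplier: int, overhead: float = 0.15) -> float:
    """Output FPS after frame gen: base rate pays the FG cost, then multiplies."""
    return base_fps * (1 - overhead) * multiplier

def input_latency_ms(base_fps: float, overhead: float = 0.15) -> float:
    """Latency floor is set by the *base* frame time, not the output rate."""
    return 1000 / (base_fps * (1 - overhead))

def divisors(n: int) -> list[int]:
    """All integer factors of n, for the VRR/refresh-multiple argument."""
    return [d for d in range(1, n + 1) if n % d == 0]

# 60 fps base with 4x FG: ~204 fps out, but input still feels like ~51 fps.
print(round(effective_fps(60, 4)))      # 204
print(round(input_latency_ms(60), 1))   # 19.6 (ms)

# 120 has twice as many factors as 165, so more clean integer multiples.
print(len(divisors(120)), len(divisors(165)))  # 16 8
```

This is why a 30fps base "4x'd" to ~102fps still feels like 30fps: the output rate rises but the input-to-photon floor does not.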
-Businessman@reddit
Every game I've tried so far with MFG on shows GPU latency of 0.3ms. It's actually like black magic because the game plays like smooth butter. BF6, CP2077, Expedition 33 to name a few.
Locke357@reddit
Yes I'm aware how framegen works, and IMHO the 5070 is probably the most "affordable" Nvidia GPU able to push high enough base frames at 1440p to use 2x-6x framegen effectively
I typically aim for around 60fps without framegen, then activate framegen 4x, since as we both know there is a performance hit when enabling framegen.
It's difficult because perceptions of input lag and artifacting are highly subjective. So I always use caveats like "as far as I can tell" or "isn't noticeable to me" or such things. I personally don't notice any difference in input lag as long as my base fps was at least 60-ish prior to enabling framegen, and I don't personally notice any artifacting.
Plazmatic@reddit
Fair.
HEBushido@reddit
I tried framegen with my 5080 on Cyberpunk and it didn't look good. There was a solid amount of artifacting.
MrWally@reddit
Was it DLSS 4.5?
HEBushido@reddit
Yes of course
Locke357@reddit
Hm, haven't noticed any myself. But I know some people pick up on those things more. Even with DLSS4.5 I have a hard time telling the difference between performance mode and native in 1440p with some games
HEBushido@reddit
That's wild to me. The difference in quality vs native is very noticeable for me.
Plazmatic@reddit
I do not know what people are smoking with Cyberpunk 2077 and max settings (path tracing etc.). I saw constant DLSS smearing, strange shadow artifacts, RT boiling artifacts, etc. I had pictures of these things, and people still didn't believe they existed or just quietly downvoted. There was a scene where >!Takemura and you are at the parade getting ready and he!< tries to hand you some food or something, and it looked incomprehensible because of DLSS/frame gen artifacts. I legit was getting a headache trying to understand what I was looking at, like he handed me a brown dream nugget. I remember watching other videos and realizing that people who weren't doing this saw what the artist actually intended. Often other people would show footage and I could see the same artifacts in their videos. These are not "minor subtle" things either; they are distracting, immersion-breaking things. People just have this insane will to not believe things that go against the hype of a thing they enjoy (which, to be clear, I very much liked Cyberpunk 2077). Either that or they have such low standards they shouldn't be commenting on visual quality at all, or at the very least not arguing with people who do have standards.
Baumpaladin@reddit
It's hard to tell sometimes if something is a genuine opinion or a fake. Kinda ironic as we are talking about something as artificial as frame gen.
My two options to guess from are either a frame gen shill or your average customer that has no standards. I'll go with Hanlon's razor and assume that these people just genuinely have no standards when it comes to quality. It almost feels like bait when someone announces "I have no standards and think this looks alright" for something that objectively looks somewhere between ok and shit.
RAF2018336@reddit
Some people also just don’t know. It doesn’t mean they don’t have standards; they’re just not into this hobby enough to know the difference, and there’s nothing wrong with that either. This frame gen and upscaling stuff is starting to turn into the same thing the audiophile world turned into: if you’re not listening on vinyl through gold-plated RCAs then you have no business being in the hobby, and it’s annoying af. Sometimes just let people enjoy their games and systems however they want.
And for the record, I hate frame gen myself but I’m adult enough to realize people have different tastes
Baumpaladin@reddit
I find it a difficult topic because the line between "ruining it for others" and "just not knowing better" is rather thin. Gatekeeping isn't by nature a bad thing, but when the people that employ it have unrealistic standards, it's just as bad as enforcing no standards. It leaves you wondering if those people even appreciate the hobby anymore.
I'm with you, that everybody is free to enjoy the art the way they like, but if you share your opinion publicly online, you'll always have to consider the discussion it may spark.
I don't mind frame gen as long as the game runs at an acceptable level for the price I paid. But when I'm told to use it on a 900€ card to get acceptable performance, I see frame gen and the studio as the problem.
RAF2018336@reddit
Good points
v-jazz@reddit
Sure. DLSS 4.5 has a wide range of setting variables and variances by game. On my 5080 with quality and path tracing and framegen it looks good at 4k. Does it look native, no.
STRYED0R@reddit
Tried a lot of options, but DLDSR 4K + MFG + DLSS vs 1440p native or with DLSS is simply worlds apart.
Sure, there's some input lag, but it doesn't bother gameplay at all.
People are sleeping on the software tech.
tO_ott@reddit
I noticed some smearing like you’d see with DLSS, but it wasn’t constant.
I did notice the latency enough to turn it off, but I use a controller a lot so it’s probably compounded.
beirch@reddit
Typically you'd notice input lag less with a controller vs mouse & keyboard.
Temporary-Ad8539@reddit
Same, and I'm playing Cyberpunk at 4x with my 5070 Ti on full settings and never saw an artifact. I don't know what the fk people are talking about. Probably they have a 9070 XT, that's why lol.
Mr_Joanito@reddit
I used 4x and never noticed in Cyberpunk.
Ektojinx@reddit
I must admit, I was playing Hogwarts Legacy, everything maxed at 4K, and turned on frame gen so I could have over 100 fps consistently, and something just didn't seem right compared to without it.
That being said if I had turned it on first I doubt I would have noticed.
Nexus_3_@reddit
Ngl I think 2x is probably the sweet spot for frame gen. I think anything above just comes with caveats for the few people that do notice closer details.
Locke357@reddit
Yeah for sure some people do complain about artifacting. Personally I don't notice any.
Inner-Ear@reddit
Same. I run Cyberpunk at 1440p but lock it to 60hz with path tracing, Ray Reconstruction, and DLSS preset K on Quality with 2x MFG, and it runs so well
zero_x4ever@reddit
As a COD, BF6, and Marvel Rivals MKB player, there's a huge truth behind this, and some nuance. Framegen adds FPS but at the same time adds input lag, so 3x and 4x have higher frames but also higher input lag. I hate running even 2x on COD, but on BF6 and Marvel Rivals I can run 2x frame gen without affecting my aim and flicks. I even run 4x on Space Marine 2 because I run the game on ultra graphics.
What I notice is this: if your CPU can generate frames at, say, 180 fps (at lowest settings or at 1080p), and your GPU settings at ultra get you to 150 fps, you ought to try to boost it more with frame gen and see if the input lag is too detrimental. On top of that, the biggest thing that affects it is your frame times: if they're unstable, or they dip, or your 1% low fps gets really bad with spikes, it's not worth turning on frame gen.
The added fake frames are not worth it if the game engine just produces low base frames. They're worth it if the game engine is running faster than the GPU render calls.
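The rule of thumb above can be sketched as a simple check. The specific thresholds here (CPU must outpace the GPU, 1% lows within 70% of the average) are made-up illustrative cutoffs, not anything official from Nvidia or the games mentioned:

```python
# Hedged rule of thumb: frame gen is worth trying when the CPU has
# headroom over the GPU render rate AND frame pacing is stable.
# ASSUMPTION: the 0.7 ratio for "stable 1% lows" is an illustrative cutoff.

def framegen_worth_it(cpu_fps: float, gpu_fps: float, low_1pct_fps: float) -> bool:
    has_headroom = cpu_fps > gpu_fps               # engine outpaces render calls
    stable_pacing = low_1pct_fps >= 0.7 * gpu_fps  # no bad dips or spikes
    return has_headroom and stable_pacing

print(framegen_worth_it(180, 150, 120))  # True: headroom + stable pacing
print(framegen_worth_it(140, 150, 120))  # False: CPU can't outpace the GPU
print(framegen_worth_it(180, 150, 60))   # False: spiky 1% lows
```

The first case mirrors the 180fps-CPU / 150fps-GPU example from the comment; the other two show the failure modes it warns about.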
Locke357@reddit
All good points! Hmm maybe I will try 2x framegen on BF6, I'm sitting at 120-140fps with dlss performance preset L, don't really want to turn down the quality settings
MountainDoit@reddit
I used to play CSGO(4K+ hours) so I’m pretty nitpicky about frame timing and input lag in FPS games. 2xFG in BF6 has been a very smooth experience, haven’t noticed it affecting me. BF6 is also a game where raw aim isn’t as important as positioning and tactics so I find it to be basically a non issue. Individual experience varies ofc.
captainmeowy@reddit (OP)
it also helps to even lower the input lag by turning on nvidia reflex on + boost
lukasgoti@reddit
I'm playing Arc Raiders with FG x2 with literally zero input lag
captainmeowy@reddit (OP)
I agree. I’ve got a 4060 laptop and can run TLOU Part II at 1600p on high settings with frame generation; pretty impressive tech. That said, is the 5070’s VRAM actually enough to fully take advantage of x4 frame generation?
Locke357@reddit
Playing at 1440p, I have yet to encounter a game I personally play where I run out of VRAM. I noticed an initial VRAM hit when turning on framegen (though not substantial), but it doesn't seem to use more VRAM the higher the framegen multiplier. Plus it goes up to 6x now, which is neat.
captainmeowy@reddit (OP)
I see that's good to know. Appreciate the insight you've given so far
rdy_csci@reddit
Similar setup as you. I went from a 2070 to a 5070 with the 5700X3D. I don't use frame gen, but I get high enough FPS and beautiful picture quality on the games I still play; CyberPunk, BG3, KCD2 and Expedition 33.
GABE_EDD@reddit
See, the neat thing about quality benchmarks is that there's no such thing as opinion. A quality benchmark would be something like 3DMark Steel Nomad DX12.
RTX 5070 scores 5309 on average
RTX 4070 Super scores 4634 on average
RTX 4070 Ti scores 5032 on average.
So, it beats both of those cards you mentioned by a decent margin.
https://www.3dmark.com/search#advanced?test=sw%20DX&cpuId=&gpuId=1700&gpuCount=0&gpuType=ALL&deviceType=ALL&storageModel=ALL&modelId=&showRamDisks=false&memoryChannels=0&country=&scoreType=overallScore&hofMode=false&showInvalidResults=false&freeParams=&minGpuCoreClock=&maxGpuCoreClock=&minGpuMemClock=&maxGpuMemClock=&minCpuClock=&maxCpuClock=
SethMatrix@reddit
Or…. Hear me out….
You look at an actual quality benchmark that covers many different game engines. Not one single synthetic benchmark. https://www.techpowerup.com/review/nvidia-geforce-rtx-5070-founders-edition/35.html
Borigh@reddit
Right, it’s basically right between a 4070 Ti and a 4070 Ti S, but with the VRAM of the former
PCBuilderCat@reddit
Which I'd say is right in line with what you would expect from a generational uplift, personally; the next-gen card should at the very least equal the performance of the card one tier up from last gen
nitekroller@reddit
Well no, it used to be that next-gen cards equalled or outperformed the last gen's higher-ranked counterparts: 1070 vs 980, 3070 vs 2080. I'm sure there are launches that jumped two tiers above the previous gen, and I'm not talking about refreshes or boosted cards. We are slowly seeing less and less generational uplift with new releases.
Decends2@reddit
Don't forget 970 vs 780 ti
PCBuilderCat@reddit
Well, that just makes sense though; consoles have gone the same way. Until we get some kind of major hardware breakthrough, if one ever comes, we're feeling pretty close to the limit of what a consumer-class card can really do on a hardware level without it becoming a gigantic 1000W Goliath
AlunSagara@reddit
IMO the closest 3DMark suite to actual 1440p gaming performance is the Time Spy graphics score. On Steel Nomad 5070 Ti is 24-25% faster than 4070 Ti Super.
On Time Spy, the 5070 Ti is only about 14-15% faster on average than the 4070 Ti Super, which is close to the actual gaming results and why the former was labelled the ‘4070 Ti Super Duper’ by reviewers last year.
dorting@reddit
Meh the only way to value a GPU is gaming bench.
Kustu05@reddit
Steel Nomad likes Blackwell architecture a lot and doesn't quite reflect the real gaming performance differences of 4000 and 5000 series cards. 5070 is about on par with the 4070 TI in most games.
Shhh-it-Bruh@reddit
Yeah, some people put a bit too much on Steel Nomad scores. It's ok for a quick look at whether your card might be working properly, but it's by no means the end-all be-all. And a few hundred points' difference between two different series of cards doesn't mean much; once you start testing games between them, you might find the card that scores a bit higher on SN gets beaten slightly more often.
captainmeowy@reddit (OP)
It's weird though because based on this article, 4070 ti beats it on the same test 3DMark Steel Nomad DX12.
https://hothardware.com/reviews/nvidia-geforce-rtx-5070-fe-review-and-benchmarks?page=3
GABE_EDD@reddit
Well, I'd trust the actual source of the data way before "Hot Hardware"
Every_Fig_1728@reddit
Especially since hardware throttles if it gets too hot
itsforathing@reddit
Hot hardware in your area, ready to get benchmarked
DivideByZero666@reddit
That's from 13 months ago and you said yourself it got better with drivers.
Crazy_Dawid@reddit
Yup! And with some undervolt and OC I got stable 5845 on steel nomad dx12 with 5070.
LiterallynobodyY@reddit
I play it on dual 1440p and 4k monitor setup. Tbh dlss on 4k is so good that I play only 4k now. The difference is so big.
Agentofsociety@reddit
It is a great card! I've found upscaling to be a great way to make everything look better and use the card to its full potential on older games, and for newer games the framegen is great. Currently playing Uncharted 4 upscaled to 4K on a 1440p monitor and it runs brilliantly at 100+ fps.
I pair it with a 3600x and I've yet to play anything below 75 fps on 1440p.
Mr_Joanito@reddit
I love my 5070, the fact that I can press a button and double, triple, quadruple the fps is insane.
M88bie@reddit
I’m in the camp that has no issues using DLSS and frame gen, so for that reason the 5070 was the no-brainer for me (paired with a 5700X3D). Had the card for 6 months and loved it; I was even playing most games at 4K with Nvidia’s AI help, with the occasional drop to 1440p. I did replace it in Jan, but only because I got a good deal on a 5070 Ti, the Project Zero version; otherwise I would have kept it.
Amazing-Matter1985@reddit
At no point was the 5070 merely a 4070 Super. Disappointing uplift, sure, but wherever you got that from is wrong.
Impossible-Move-2096@reddit
The 5070 landing closer to the 4070 Ti after driver updates makes sense; Nvidia always squeezes extra juice out later.
hiroller18@reddit
So you’re telling me you can still get a 40 series? And if so, are you getting it at a better price than a 5070 that’s widely available?
captainmeowy@reddit (OP)
Yep, I can get a used 40 series, which is 30% cheaper where I'm from
prank_mark@reddit
If you only plan on gaming, AMD is the much better choice in terms of value right now. You can get the 9070 XT, which has 16GB of VRAM and performs on par with the 5070 Ti, for roughly the same price as a 5070.
captainmeowy@reddit (OP)
I seriously considered it but the 9070 XT's power draw was a dealbreaker for me. Electricity is pretty expensive where I live so efficiency is something I really have to factor in
dorting@reddit
The 9070 (not XT) is faster than the 5070 with more VRAM, and it's close to the XT while using 220W https://www.techpowerup.com/review/pragmata-performance-benchmark/6.html
tmanky@reddit
Upgraded to a WC CU7 265k + 5070 last Labor Day. Both far exceed expectations after some tweaking, good updates and better ram (bought weeks before crisis started). SW: Outlaws ran 1440p max-ish settings at 144+ fps for the 10 hours I got into it. Apex Legends runs smoothly at 144+ fps on 1440p high and handles the chaos so well. Pretty much the same with Elden Ring, too. Thought I'd have issues with older games but runs Ages Of Empires 3 and SW: Republic Commando perfectly.
GrimSlayer@reddit
I have both my living room PC on a 4K OLED tv with a 5070 and my office pc has a 4070 super on a 1440p monitor. The 5070 beats the 4070 super from the tests I’ve done on user benchmark. I’m still very happy with my 4070 super at 1440p and my 5070 has been an absolute champ at 4k.
captainmeowy@reddit (OP)
This is exactly what I'm looking for. Thanks for comparing these two; it looks like the performance gap widens due to driver updates.
Another W for 5070
GrimSlayer@reddit
I’ve been really happy with the 5070 in my living room PC, both have 9800X3Ds and 32GB 6000MHZ so basically the same pc comparing the two. I can hit 120FPS
Nosferatu_V@reddit
Would you be willing to run 3D Mark tests as well? Or Passmark. Nothing wrong with User Benchmark, just that along with those other 2 it would provide a more complete picture.
GrimSlayer@reddit
This was an idiot moment on my part. Meant to say 3d mark.
AdstaOCE@reddit
The 9070 is the real 70 class option this generation.
MyRedditUsername-25@reddit
I'm well over 100fps in nearly everything I throw at it. 5070, 1440p.
bubbarowden@reddit
There’s no doubt it performs really well, the biggest issue is longevity because of VRAM. But hey, could last you a good few years w a really good experience!
No-Current-6083@reddit
People talk about longevity because of VRAM like it's just going to stop working... Simply reduce settings from max to super high/high and continue having great performance.
giveitrightmeow@reddit
right? theres also that new neural texture compression stuff on the way.
Ill_Difference_4039@reddit
or use upscaling lol, the vram fearmongering is so funny to me
No-Current-6083@reddit
Literally, that's why I got nvidia's gpu, for great tech.
Locke357@reddit
Plus, with the enthusiast PC market in shambles due to the AI boom, devs are going to have to optimize games better if they want customers, for a few years at least
Mlluell@reddit
Nah, they will tell us to use the cloud
chaosthebomb@reddit
Look at techpowerups relative performance chart https://www.techpowerup.com/gpu-specs/geforce-rtx-5070.c4218
The 4070S and the 4070 Ti are very close to begin with. The 5070 is also close to them. However, it's newer, has better synthetic compute because of the improved core design, and theoretically should be supported for longer. There is zero reason to consider those 40 series cards unless you're finding them for cheap.
9okm@reddit
"across today's games" is absurdly broad.
It's like asking how a car performs "across the roads of the world".
alex46152@reddit
Not really, a 5090 will perform well everywhere…
captainmeowy@reddit (OP)
should've been more specific about that. what I meant was any graphically demanding games released from 2020 onwards
9okm@reddit
Then IMO it comes down to features. If you play primarily big AAA games that can benefit from the latest advances in DLSS and framegen, I'd lean towards 50 series.
dart51984@reddit
If you’re looking for 5070’s and you find a 5070ti at or near MSRP, I’d say go for that. It’s a phenomenal card.
The-Old-American@reddit
I don't play many games, but it's a disappointment running Diablo 4. No matter what I do I get screen tearing and stuttering. And this was before the new xpac. My RX 6700XT ran it fine. Other games are good, though I don't see much of an improvement over the AMD.
reiichiroh@reddit
I had to get it because it was the price of initial 5070ti MSRP and GPU prices finally got jacked up overnight after months of RAM threats.
It went from $700 to $1000 CAD. 5070ti went from $1090 CAD to $1500.
I got it because it was the only one left in my budget.
Livetheuniverse@reddit
Always look at actual gaming benchmarks. On gaming benchmarks the 4070 ti and 5070 are neck and neck, usually within a few % of each other. Very unlikely to notice any real difference in the two in gaming. That said, the 5070 would be the better buy if similarly priced due to it being newer hardware and 4x frame gen vs 2x(if you're interested in that).
Correx96@reddit
Similar to the 4070 Ti, in the ±5% range more or less; there are video comparisons on YouTube.
On the practical aspect, it can run games at 1440p ultra details no problem. In raster, AAA at 60-90 fps. With DLSS and/or FG x2 100+ fps easily.
Older games can be run at 100+ fps without any problem in raster and even ray tracing active.
Speaking of older games, right now I'm playing TW3 on 1440p, ultra details, ray tracing between high and ultra, using DLSS. I consistently get 150+ fps, only drops to 80-90 fps in huge crowded areas.
I've used FG on Ghost of Tsushima and it looked fine to me.
captainmeowy@reddit (OP)
Good to hear. Most big tech reviewers on YouTube sht on this card so much, mainly pointing out it was only about 2–4 FPS ahead of the 4070 Super.
Thanks to people sharing their experiences with this card, it really does look like performance has improved over time. I think it deserves a fresh look in 2026
Biggeordiegeek@reddit
It’s a fine card and like most cards it will get better with driver improvements
I was sorely tempted to upgrade my 3070 to one, but I managed to get a great deal on a 9070XT and preferred more VRAM