Real difference between RTX 3080 and RTX 5070
Posted by jackofallcards@reddit | buildapc | 41 comments
Recently (about two months ago) I completed an upgrade and grabbed a 3080, specifically an MSI Gaming Z Trio 10GB if that matters, for $380 (about $414 total with tax), as it seemed like the best bang for your buck at the time. I have a friend who often grabs in-stock cards to resell to other friends for what he paid for them, and he managed to get a 5070 FE for ~$590 after tax. Basically I'm on the fence about whether the performance upgrade is worth the hassle of trying to get as close to $414 back as possible and paying the difference for the 5070.
Most posts related to this that I have found are either pre-5070 release (and most of the comments are just that: "it's not out yet, how do we know?") or compare to the 5070 Ti, which is (from my understanding) a much larger jump in both performance and price.
Basically, is the performance-to-effort-to-price ratio worth it? Or do I ride the 3080 out a couple of years and see the state of the GPU market then?
chrisz2012@reddit
The 3080 is still crazy good. I have an RX 7800 XT that I recently upgraded to from an RX 6600. I don't really get people upgrading from a 3080 to a 5080 or even a 5070 Ti. The 3080 is still a strong card even in 2025. The only thing holding it back is the 10GB of VRAM potentially, but even then most games only use 8-10GB of VRAM anyway at 1440p and below.
I'm going to run my RX 7800 XT for the next 3 years and see where things end up. I'm pretty sure your 3080 will be fine for the next 2-3 years.
Personally I think 3080 / RX 7800 XT performance is more than enough for gamers. You don't really need a crazy high-end GPU if you're just playing at 1440p or 1080p.
My brother was gaming on a Radeon 6850 from 2010 with 1GB of VRAM, and he played Fortnite, Minecraft, and Brawlhalla on it. He now has a GTX 1080 and an older i7; as a teenager, it's more than enough for him. I really don't see him needing a newer GPU since all he is doing is 1080p gaming.
ColinStyles@reddit
Late reply here, but the people on 3080s for the most part bought them because they were on 4K, and still are. That's why the upgrade: a 3080 struggles with 4K on the latest titles for sure.
LuukeTheKing@reddit
Maybe on very initial release, and even then the diehard 4K'ers just went 3090, didn't they?
Although I suppose they were in very short supply
I've owned my 3080 since late '23 and it's touched 4K gaming maybe three times, all after I bought a 4K TV this year that sits near it, so I hooked up an HDMI cable to see how well 4K ran (and was pleasantly surprised).
My i9 9900K @ 5.0GHz and 3080 at 1080p take everything I throw at them pretty damn well. Cyberpunk with full RTX struggles a bit (expected), and over the last year-ish Fortnite (I rarely play it, but still) seems to run like absolute dogsh**; I am entirely unable to enable UE5's Nanite/Lumen without it sh**ting the bed, and even regular High/Ultra has its fair share of drops for god knows why.
But aside from those two, I don't think I ever notice any times it can't either fill my 120Hz in an FPS game, or get definitely playable frame rates while maxing out ray tracing in games that don't need fast reactions (Ark SE excluded, POS optimization).
ColinStyles@reddit
Wait, you bought a 3080 for 1080p...?
I'm just kind of mindblown by that level of waste/overkill. Considering it could handle most 4K titles at the time at 60-80 fps at high-ultra settings and some RT (assuming DLSS on quality or balanced; rough internal-resolution numbers below), it just seems so unbelievably overkill to use one for 1080p and only 120Hz.
But anyway, my point is these days it really does struggle with 4K and is ripe for an upgrade if you're in the market. I get it though; I'm not in a position to upgrade right now, and will have to wait for the 60 series.
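For reference, here's roughly what the DLSS "quality" and "balanced" modes mentioned above render at internally for a 4K output. This is a minimal sketch using the commonly cited per-axis scale factors (Quality ~66.7%, Balanced ~58%, Performance 50%); individual games can override these, so treat the exact numbers as illustrative.

```python
# Rough internal render resolutions for a 4K (3840x2160) output at the
# commonly cited DLSS per-axis scale factors. Illustrative only; exact
# values can vary per title.
OUTPUT = (3840, 2160)
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

for mode, scale in MODES.items():
    w, h = (round(d * scale) for d in OUTPUT)
    share = (w * h) / (OUTPUT[0] * OUTPUT[1])
    print(f"{mode:>12}: {w}x{h} internal (~{share:.0%} of the 4K pixel count)")
```

So "4K with DLSS quality" is really rendering around 2560x1440 internally, which is a big part of why a 3080 could manage it.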
LuukeTheKing@reddit
Uhh, yeah? And I've definitely managed to pin it at 100% at 1080p too many a time; can't remember on what aside from Cyberpunk and Hitman III with RTX at the moment.
It also isn't getting me much more than 120fps in quite a lot of titles. It'll sit happily at 120 fairly often (it does in R6 Siege), but I definitely wouldn't be pushing a 200Hz monitor in most games.
And to be fair, my GPU upgrade/ownership history is: GT 710, GTX 980 Ti, RTX 3080, with matching CPUs i5 2400, i5 2400, i9 9900K.
I finally got a full-time job after I left college and treated myself to a decent PC with my first paycheck. The 3080 was also chosen because the pricing wasn't bad, and I knew I wouldn't NEED an upgrade for a long time.
I'm currently trying to work out what to upgrade next. I'm assuming my CPU, but I'm not sure, especially since I like Intel but the 13th/14th-gen issues worry me. And obviously I probably should just bite the bullet and go 1440p at least, but I quite like my current 32" curved Samsung that I got cheap.
Due_Advisor_1612@reddit
Bro, are you still gaming at 1080p?
Tell me you've upgraded your monitor and are actually using your high-end graphics card!
LuukeTheKing@reddit
Funny you ask that today: one of my friends owes me back for a Switch 2 pre-order because they couldn't make it into the store, and long story short she still owes me for it (no fault of anyone, long story).
And just before I got your notification I had sent a message asking if she had enough on a specific rewards card to buy that monitor as a way to pay me back, because if the money never touches my account I feel less bad about spending that much on a monitor haha.
So, no, I haven't yet, but it's in the plans, as these comments finally won me over.
I think I'm getting the Alienware AW3425DWM because I've seen good reviews. It is VA, and I've heard loads of bad things about ghosting on VA, but also very good things about this specific VA panel, so I'm not sure. I've also learnt since looking at them that my current one is not only 1080p but actually has fairly bad ghosting, which explains why I can never read moving text - I thought that was just me lol.
So: 1080p 32" 16:9 with ghosting, to WQHD 1440p 34" 21:9 (hopefully) without.
Should be a major improvement, and hopefully it'll actually make use of the GPU for once.
0ptik2600@reddit
Why not get a good IPS 1440p monitor? The picture quality and colors are just better overall vs VA.
I believe VA panels nowadays are for people looking to save money or who want the absolute highest refresh rates for competitive gaming.
LuukeTheKing@reddit
It's actually a done deal now: I got the Alienware I mentioned in the last comment, a new-gen VA, 34" curved ultrawide panel at 180Hz, and I have absolutely no regrets. The reason I didn't go for IPS (which was actually what I was looking for for nearly the entire time) was that they simply don't exist (or I can't find them) in this form factor, and the main worry with VA is ghosting, which plenty of reviews said is not an issue on this one.
Also, VA is the one that's known for better colour vs IPS, I thought? My understanding was it generally has better colours but suffers from ghosting, so they're usually lower refresh?
Either way I am very happy with my choice. WQHD with HDR is incredible, so so much better; the colours are so nice, and even the blacks are pretty damn good imo. And going from 120Hz to 180Hz, now without any noticeable ghosting (I haven't actually done a UFO test, I keep meaning to), is such a nice thing to have - I didn't think I cared until I used it, and even just moving around the desktop feels smoother.
And "looking to save money... Absolute highest refresh rates" IPS is the one for highest refresh rates, VA is generally known to have ghosting at higher speeds, and VA isn't really cheape, in this case it was the only option for this type of form factor, IPS ones don't exist, and it's so nice.
Due_Advisor_1612@reddit
Hell yeah!!!
That’s awesome. You are going to love it.
My 3080 is great at 1440p. I'm glad you are making the switch.
LuukeTheKing@reddit
Haha, thanks very much. Yeah, it should be good, I'm looking forward to it. I have a fairly decent 32" ultrawide at work and I love it (although the little baby i5-8500T compact ThinkCentre on integrated graphics it's attached to does not haha).
I have a 32" curved 16:9 currently at home, so I was gonna do the same as I like the form factor, but I couldn't really find one with the right specs until I found a Samsung Odyssey G5 (? 6? Can't recall), but not curved. Then I saw that Alienware, saw a Reddit thread of reviews for it saying it's great with no ghosting even on a VA, and I was sold, especially since it meant I get an ultrawide.
My current monitor (pretty sure it's actually a TV - it's meant to have a remote, but it is 120Hz) is also on a racing-sim sort of stand from when I bought it, which is currently just sat against the wall, so the big baseplate is under the desk with a big metal pole going behind it. I'm so looking forward to getting rid of that; I'm hoping to wall-mount the new one flush at the correct height. That and the keyboard are the two things left ruining my setup. (The keyboard is a £20 "gaming RGB" one - it's literally red, green and blue LEDs in different patches 😂 - that I got gifted 5 years ago. I know, I know.)
Hopefully Reddit will bully me into keyboard self-consciousness one day too haha
ColinStyles@reddit
I mean, I can understand wanting to futureproof and such, but it also just seems like a 1440p or 4K monitor would be very much in the cards shortly after, as there's very little other than heavy RT or PT that would cause it to dip below 120 fps. For the record, R6 Siege is capped at 120fps unless you modify an ini file; that's what you're seeing.
Personally, I would suggest upgrading to a 1440p monitor before anything else (maybe 4K if you're more sensitive to resolution than frame rate), and then upgrade the CPU (though this may need a mobo swap, and at that point go AMD; Intel is awful these days).
I'm not judging, it's just very much a niche use case and a bit strange, and you can definitely see it's overkill if you're still easily hitting your monitor limit coming up on 5 years later.
Pomegranate_Sorry@reddit
I got my 3080 in 2020 with an i9 10900K, playing at 3840x1080, and it has struggled since day one. I am now gaming on a 1440p super ultrawide, which is just under the same number of pixels as 4K, and the 3080 ain't enough. Super ultrawide 32:9 at 1080p is much better than 16:9 1440p, and although it's only about half the pixels of 4K it still struggles to hit 120fps+ with max settings and ray tracing, and that's on 2020-2022 AAA titles (rough pixel math below). It's an amazing GPU, but someone buying a 3080 when it was current should be able to use it for years by lowering settings. Both NVIDIA and AMD should've added frame generation for all of the cards from that generation. I shouldn't be considering a 5070 when I have the highest-model EVGA 3080. Also, the 3080 Ti didn't exist until almost a year after the 3080, at a 60% higher price, and the scalpers, crypto miners, and chip shortage would've had me waiting years if I hadn't gotten mine at retail when I did.
Upgrading a flagship GPU every 5 years isn't overkill, and 1080p might seem like a waste to some, but I'd choose 32:9 at a lower resolution over 16:9 every time. I recommend trying it; the FOV is way more immersive than pixel density.
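To ground the pixel-count comparisons in this comment (and a couple of others in the thread), here's the raw arithmetic as a quick sketch - just standard resolutions for these monitor classes, no benchmark claims:

```python
# Pixel counts for the resolutions being compared in this thread,
# expressed as a share of 4K (3840x2160).
resolutions = {
    "1080p 16:9": (1920, 1080),
    "1440p 16:9": (2560, 1440),
    "UWQHD 21:9": (3440, 1440),
    "Super ultrawide 32:9 1080p": (3840, 1080),
    "Super ultrawide 32:9 1440p": (5120, 1440),
    "4K 16:9": (3840, 2160),
}

four_k = 3840 * 2160
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:<28} {px:>10,} px  ({px / four_k:.0%} of 4K)")
```

So 32:9 1080p (3840x1080) is exactly half the pixels of 4K, while 32:9 1440p (5120x1440) is about 89% of 4K, which is why it pushes a 3080 nearly as hard as a 4K panel would.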
ChampionshipUnique71@reddit
Oh man yeah, definitely get yourself a 1440p monitor. You're going to be blown away.
jamesfoo2@reddit
Especially get a 2K one if you plan to keep the i9 9900K for a while; it puts more strain on the GPU than on the older CPU. The CPU is still OK but getting a little tired imo. I used 1080p for 3-4 years of gaming, then got the 3080 and a 2K monitor, and wow, what a difference in clarity and visuals - though this includes the general monitor upgrade of course, since the 1080p monitor was older.
shawtydearest@reddit
Yeah, I use 1080p and VR. I got a 3080 for it. Honestly you'd be surprised at how terribly optimized games are nowadays. The overkill is necessary.
positivedepressed@reddit
The 7800 XT and 3080 will last even longer if they get MFR/FSR4 support respectively. Also, for the 3080, an extra 6GB of VRAM would help it in the long run.
KING_of_Trainers69@reddit
I went from a 3080 to a 5080 and found that somewhat underwhelming. I wouldn't bother going from a 3080 to a 5070.
Kusanagi2k@reddit
"Somewhat underwhelming"!? What? I mean ignoring the price of it, this statement has something wrong, I also have a 3080 and recently bought a 5080, and I don't know what you were exactly expecting, but I don't think have 130+ frames in most games when the 3080 could barely reach 80 with massive customizations (like DLSS Swapper and Engine mods in games) is "somewhat underwhelming"; you're probably doing something wrong
KING_of_Trainers69@reddit
The 3080 was over twice as fast as the 1080. A 60% uplift from the 3080 after the same gap in time is pretty unexciting.
0ptik2600@reddit
This seems to be the consensus people are reaching regarding that upgrade. I came across this post because I too was contemplating upgrading from my 3080 to a 5070.
Max-Headroom-@reddit
You must not use DLSS, frame gen, RTX HDR, or have a high-refresh-rate monitor. No point in buying Nvidia if you skip all of their offerings.
KING_of_Trainers69@reddit
I have a 165Hz OLED. The 3080 supports DLSS just fine, frame gen is mediocre, and RTX HDR is so-so with a substantial performance impact on both cards; it's not remotely as good as a native HDR implementation.
Yeah it's faster, but waiting 4 years to pay 40% more for 60% more performance is nowhere near as impressive as the uplift from the 1080 to the 3080.
Pomegranate_Sorry@reddit
If you sold your 3080 it wouldn't be that much more, would it? After my 3080, though, I feel like the xx90 is the better way to go if you want it to last longer.
KING_of_Trainers69@reddit
Why are so many people picking this one-month-old comment on a deleted Reddit post to reply to? Not annoyed, just curious.
You're right, I do need to sell my 3080 - the cooler on it was utterly atrocious, which is the real upgrade here, so I've got to work out how to swing that when selling it.
I've no doubt that the 5090 would be a more major upgrade but I don't really feel like spending 2 grand on a GPU upgrade.
ColinStyles@reddit
4K though? Or 1440p?
KING_of_Trainers69@reddit
UWQHD
ColinStyles@reddit
Yeah, makes sense. Substantially fewer pixels (about 40% less than 4K), and similar for the texture resolutions.
SilkyAlchemist@reddit
Well said. You're giving people buyer's remorse though, so chill 😂
Electronic_Pin_5730@reddit
Thanks for this. I hope that 2-3 years from now there is really a GPU we can say is worth upgrading to from the 3080.
jamesfoo2@reddit
Even if in X years a better GPU isn't available (6xxx etc.), the current ones will be cheaper, which makes an upgrade more worthwhile. The 3080 is still strong, and while the 5080 is stronger, £1200 is a ton of money for a mediocre upgrade. Depending on your games/CPU/etc. I doubt you'd be blown away; the 3080, while oldish, is no slouch.
Aesthetic024@reddit
My main issue with the 3080 is that that series does not run 4K 120Hz without serious problems. I ordered a 5070 so I could have a lower-wattage card, more performance, and more space in my case. I thought about a 5070 Ti, but I hate how much power my 3080 Ti FTW uses and decided cutting it roughly in half is worth it to drop my AC bill by a hundred bucks. Honestly thinking about downgrading my CPU to a low-wattage one as well.
Red_Tooth_@reddit
I undervolted my CPU and use a 5070, and together they use 200W on average to game: 160W for the GPU and 40W for the CPU (i7 13700K).
gyufa21@reddit
Can you tell me your MHz and power values? I'm planning to undervolt my 5070 as well.
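For anyone dialing in an undervolt: the curve itself is usually set in a tool like MSI Afterburner, but if you want to log the clock/power values being asked about while you test stability, here's a minimal read-only sketch. It assumes the `pynvml` bindings (`pip install nvidia-ml-py`) and a single NVIDIA GPU at index 0; it changes nothing on the card.

```python
# Read-only monitoring sketch: samples the GPU's current graphics clock,
# power draw and temperature while a game or benchmark runs, so you can
# compare before/after an undervolt applied elsewhere (e.g. Afterburner).
import time
import pynvml

pynvml.nvmlInit()
try:
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    for _ in range(30):  # ~30 one-second samples
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # NVML reports mW
        temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{clock_mhz} MHz  {power_w:.0f} W  {temp_c} C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```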
Bogus1989@reddit
Man, that's a wild amount of power it was costing you monthly, jeez. Have you ever tried undervolting it?
I dunno about the Tis, but the regular 3080s had similar issues, and undervolting kept mine from running so hot, stopped it from throttling, and made it so much smoother.
My buddy has an EVGA 3080 Ti, and undervolting helped his as well.
Midknightsecs@reddit
Get a laptop or mini. It solved most of the problems you're facing for me. You can also use a mini/laptop as a daily driver and just game on the desktop. I did that for a while too, and it dropped my bill. For now? Create profiles for low perf/low kWh and one for max so you can switch between them. It will at least drop your temps a bit.
Icy-Willow-5833@reddit
That's not true. I run Warzone at 4K on a 165Hz monitor and have no issues staying over 200 fps.
lafsrt09@reddit
The 3080 comes in either 10 GB or 12 GB
SilkyAlchemist@reddit
Hell yeah brother, me too, except I'm on a 4K Samsung Odyssey monitor.
SilkyAlchemist@reddit
I play everything at 4k on my 10gb Colorful iGame 3080 (don’t ask - my dad was in China at the time and sent me pictures of cards at a local computer shop).
I frequently stream and/or record and I have no issues doing so even today. Some games I obviously can't max out, but this card absolutely still shreds.
With that being said, I've not played on, nor do I own, any newer-generation cards, but from looking at benchmarks you're not getting that much more performance.
I guess the answer lies within how little ~$400 means to you.
tybuzz@reddit
The 5070 is roughly 20% faster on average in raw rasterization performance (native FPS with no upscaling or frame generation). It will also have significantly better DLSS and ray tracing performance. If that's worth it to you, go for it, especially if you're gaming at 1440p; if you're at 1080p, the 3080 is still fine. (Rough cost-per-performance math below the link.)
https://www.techpowerup.com/review/nvidia-geforce-rtx-5070-founders-edition/35.html
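Putting the thread's numbers together, here's a back-of-the-envelope sketch of what the swap actually costs. The resale figure is an assumption (OP recovering roughly the ~$414 they paid), the $590 is the 5070 FE price quoted in the post, and the ~20% raster uplift is the TechPowerUp average linked above.

```python
# Rough value calculation for swapping the 3080 for the 5070, using
# numbers quoted in this thread. Resale value is an assumption.
cost_5070 = 590          # 5070 FE, after tax (from the post)
resale_3080 = 414        # assumes the 3080 sells for about what OP paid
raster_uplift = 0.20     # average native-res gain per the linked review

net_cost = cost_5070 - resale_3080
print(f"Net cost of the swap: ${net_cost}")
print(f"Cost per +10% raster: ${net_cost / (raster_uplift * 10):.0f}")
```

Under those assumptions the swap nets out around $176, or roughly $88 per 10% of extra raster performance, before counting the better DLSS/RT support.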