TomsHardware - Saying goodbye to Nvidia's retired GeForce GTX 1080 Ti - we benchmark 2017's hottest graphics card against some modern GPUs as it rides into the sunset
Posted by Antonis_32@reddit | hardware | 165 comments
jenny_905@reddit
1080Ti, the infinite hardware content generator. Youtubers have been drawing on it since 2017 and continue to do so.
Great card with great longevity, not even that bad 8 years later.
996forever@reddit
It only shows us how much progress has stalled after that era. An 8800 Ultra had no chance of running demanding 2015 games at all.
Strazdas1@reddit
Yep, game developers have stopped adopting innovative technology. In 2015 a new game wouldn't be caught dead using the same shaders it used for the 8800 Ultra. Now we still have baked global illumination 7 years after a better solution was released. We got a single game implementing new texture shaders. We had only a few games even try VRS. All thanks to the fact that a lot of developers are aiming at complete e-waste hardware like the Series S.
996forever@reddit
And when you point out those outdated low end hardware hold back progress and innovation, you get downvoted to hell.
Overall I think it's the rise in cost of living (not just the cost of hardware directly) coupled with people's unrealistic expectations of how their 5-year-old console that was already mid at launch should perform that got us to this state.
Fortzon@reddit
It's also about diminishing returns in graphical fidelity.
And the fact that a lot of gamers are starting to realize that they don't want to spend money hand over fist on a new GPU and still suffer performance drops if it means their fully ray-traced 2025 game only looks marginally better than the baked lighting in a 2018 game.
Strazdas1@reddit
If you think it looks "marginally better", you have never seen one.
Of course Sony can't advertise with graphical fidelity; they are once again going to release a console whose capabilities at launch are already outdated.
Better physics would need better hardware. Just a reminder that game physics were mostly killed due to low memory on PS3/Xbox360.
MrMPFR@reddit
Which is why they'll lean heavily into neural rendering to make the PT lighting so good that only a blind man wouldn't be blown away. They can easily advertise this.
Not a single aspect of the pipeline won't be neurally rendered. Runtime-trained MLPs guided by a trimmed-down PT lighting input and outputting approximated offline-quality rendering.
And who said anything about RDNA 5 being terrible in PT and backwards-looking? PT + work graphs + ML etc... All HW-accelerated properly this time, it seems.
I would imagine physics is a prime candidate for GPU work graphs.
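To make the "runtime-trained MLP" idea above concrete, here's a minimal sketch of the concept, assuming PyTorch; the feature layout, network size, and per-frame training loop are purely illustrative and not any shipping engine's implementation:

```python
# Minimal sketch only: PyTorch assumed; feature layout, network size and the
# per-frame training loop are illustrative, not any shipping engine's design.
import torch
import torch.nn as nn

class RadianceMLP(nn.Module):
    """Tiny MLP mapping per-pixel G-buffer features plus a cheap 1-spp
    path-traced radiance estimate to an approximation of converged radiance."""
    def __init__(self, in_features: int = 12, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),  # RGB radiance out
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = RadianceMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(features: torch.Tensor, reference: torch.Tensor) -> float:
    """One 'runtime training' step per frame against a sparse set of
    higher-quality path-traced reference pixels."""
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(features), reference)
    loss.backward()
    opt.step()
    return loss.item()

# Stand-in data: 4096 sampled pixels, 12 features each.
feats = torch.rand(4096, 12)
ref = torch.rand(4096, 3)
print(train_step(feats, ref))
```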
Strazdas1@reddit
We don't know how good the chips for the PS6 will be. The last two generations certainly aren't inspiring confidence that they won't just cut corners and settle for low-end again.
996forever@reddit
It's not marginal at all. Fully realised path tracing is a complete game changer. You only think it's marginal because most current examples use only a little ray tracing, and that's because the available hardware is shit.
All of this only further highlights how much stagnation there has been. 2015 hardware ran then-new techniques better than an 8800 Ultra could ever run 2007 stuff.
Xadro3@reddit
Maybe that's not stagnation, but a bet on the wrong horse then? If we can't run any game at all, at basically any resolution, with every possible helper, with fully realised ray tracing, isn't it just a shit technology that needs time in the oven? Or the hardware will be there in 10 years. Meanwhile, maybe they should try to find another selling point while ray tracing is still whatever to 99% of people.
Strazdas1@reddit
We CAN run it. Developers just choose not to implement it because the 8% market share GPU brand does not support it.
MrMPFR@reddit
Neither can NVIDIA really, except at the high end. Hoping RDNA 5 finally addresses this. We need a GPU architecture that actually works with PT across the entire stack.
But yeah Pre-RDNA4 AMD RT was criminally bad.
Strazdas1@reddit
Well, for full path tracing I'll agree, most GPUs can't do it. RTGI and other basic RT though (reflections, light bouncing), a 4060 can do just fine under normal expectations of performance, and that's the average consumer-level card.
Xadro3@reddit
We can? Last I checked it still tanks frames so badly that nobody bothers with it.
996forever@reddit
It is not the wrong horse, because it's long been the holy grail of graphics. Creators love it. The speed-up in graphics creation is immense when rasterisation means manually baking in lighting forever. It is not a fad no matter how much AMD fans love to pretend it is. The only thing is having to cater to the extremely poor hardware in the consoles.
MrMPFR@reddit
The consoles have no problem with RT-based DDGI. Anything else can just work without any baking, whether raster or RT.
Metro Exodus Enhanced Edition, Doom TDA, IJ&GDC all work fine on consoles. Blame the PS4 and the lack of engine-side progress. And obviously UE5's criminally bad HW Lumen pre-5.6.
Advanced-@reddit
AMD has nothing to do with it.
I have a 5070 Ti, I just don't give two fucks if lighting or reflections are accurate.
I care how the game looks when I am actually playing. And old style rendering showed off the artistic intent just fine 90% of the time.
Like, I never once had issues about reflections being inaccurate in puddles or windows or metal objects or whatever. It's not something I ever noticed, because when I'm sprinting past the puddle and looking at wherever I am going, accuracy is of no value.
Even when I am looking at stuff, so long as it looks "accurate enough" I don't care that my character isn't reflecting or some color isn't glowing or something is missing.
This legit never crossed my mind up until all this RTX comparison BS. RTX could have never been a thing and I don't think I would have ever noticed or cared.
Same thing goes for accurate shadows, or most RTX-improved things. Accuracy doesn't actually matter for video games. Same way that plot holes don't actually matter for 99% of movies, or the fact that superheroes cannot actually exist, or how every bullet seems to miss your favorite main character in every shootout scene.
Realism is not required to make a good-looking game. RTX can matter in *some* games, but the vast majority are not better games for it. Just slightly more realistic, big fucking whoop.
DefaultAll@reddit
This reminds me of sound engineering where we produce high quality audio masters that Spotify streams at low bitrates and then are listened to over Bluetooth. Most people are fine with it.
Advanced-@reddit
Spotify has lossless now, so not sure this works anymore lol.
Most people also don't have the equipment or the attention to dedicate to actively listening to a song in that manner.
Also, the difference here is people still have the option to buy $20 headphones and stream 320 kbps MP3s, if they want that.
RTX is starting to be forced in, and you now need far more expensive cards just to run at the same 60-120 fps you did prior to ray tracing.
If it was an option we could turn off, I would continue ignoring it, just as the majority of gamers did. I'd have no issue, let it be an option.
It's being forced. It has very little impact on the gameplay experience to me. I don't care for it given the amount of tax it puts on my GPU. It's a 10% improvement in actual gameplay experience for 50% of my fps.
I'm glad you think that's worth it. I disagree.
DefaultAll@reddit
It’s good to see lossless audio, given the bandwidth has been available for years. But with audio you have the option to go for higher quality rather than it being mandatory. From what others have said it seems to be less work for the devs, which must be driving it.
shawarmagician@reddit
Do you lower settings to high, or can shadows go even lower without it being noticeable? I understand why hardware sites use ultra for consistency, though Tom's has had some medium-settings charts recently.
Advanced-@reddit
I do and always have lowered settings (unless the game can already run at ultra at 120+ fps)
I have never used presets for as long as I have been PC gaming. I adjust every setting individually and find the best balance of quality --> FPS. Shadows have always been the first thing to get lowered.
It worked back in 2005 and the same is true in 2025.
Currently playing Borderlands 4 on a combination of Low/Medium/High/Ultra settings. Looks 90% as good as Ultra when actually playing the game and not comparing still images.
And speaking of RTX, this game did not need this forced ray tracing bullshit. The performance hit is insane, and for what? A "realistic"-looking comic book game? There is nothing realistic about Borderlands, who is this for? 😂
God damn do I hate RTX with a passion in 2025. Maybe I will like it in 10 years when the hardware to run it is affordable and games can actually run at 90-120 native fps again. But not today.
shawarmagician@reddit
The servers will be offline in 10 years even for PvE, and you'll need a 700GB unoptimized UE6 install, two power supplies, and two separate wall outlets.
MrMPFR@reddit
Then you can infuse that with MLP-based neural rendering and it becomes even more transformative.
PS5 -> PS6 is gonna be massive; just a shame that we'll continue to get shit implementations as long as 9th/10th-gen cross-gen continues.
Die4Ever@reddit
Compared to the jumps we made from SNES->N64, N64->Gamecube, and PS2->PS3? Absolutely marginal, and I love ray tracing.
anival024@reddit
It only makes games run like shit. They look marginally more accurate or dynamic, not necessarily better. And they do so at the cost of extreme performance compromises, so you have to run at non-native resolutions and upscale, or worse, use frame generation crap.
And people will still flock to older games instead of the new AAA slop because older games are more fun.
deep_chungus@reddit
Maybe it's not, but cost vs. benefit is a rough sell. Unless people have hardware that can do it, companies are not going to waste dev time on it, which creates an additional reason beyond price not to bother buying it.
996forever@reddit
The cost is so high vs the benefits PRECISELY because of stagnation…..
Strazdas1@reddit
I think cost of living is entirely overblown. Wages increased faster than inflation. Consoles being mid or even dead on arrival (Series S) is certainly having a big impact here.
Vb_33@reddit
It's because Moore's law is dead. Maxwell is surprisingly competitive even in 2025. Had we kept the big gains of the 2000s going that wouldn't be the case.
Strazdas1@reddit
Maxwell wouldn't even be able to launch things if software developers implemented new features at the same speed they did in 2015.
Vb_33@reddit
I'd argue even back then the pace had drastically slowed down. The DX11 era was not one of rapid evolution; DX9 onwards was a big slowdown, and a big part of that was multiplatform development (consoles). Either way, we still have games being made to run on GCN1 (PS4), and the Switch 2 itself is a 2025 console with 2020 hardware (A78 and Ampere) that will last till 2033 and has the raster level of a PS4. Doom TDA is a path-traced game that was also engineered to run on handhelds (check out the DF interview with id).
I'd still argue the big issue is the slowing pace of HW advancement and software merely adapting to it.
MrMPFR@reddit
Yeah, we need a fundamental µarch paradigm shift or new process node tech. The post-N3 era is nightmarish for entry-level and midrange gaming.
Strazdas1@reddit
DX9 was so entrenched we had new DX9 games coming out when DX12 launched. This would have been unheard of before the DX9 era.
Funny thing, I've been cleaning out the closet recently and found some old tech magazines. The reviews are funny and in some ways a very modern take on being afraid of tech advances. One of the reviews claims DX9 is too resource-intensive and will never catch on.
That does not explain why most software is not exploiting HW advancements even 7 years after they were introduced (mesh shaders, for example).
BloodyLlama@reddit
Hell 2015 was when I retired my quite dated 9800 GTX+.
ajd6c8@reddit
I'm still using this card! The benefit of astigmatism is that 1440p looks just as good as 4K. Plus r/patientgaming = ultra settings and good enough FPS on a card that pre-dates a lot of our kids.
Beefmytaco@reddit
Thing is, it's STILL not done with high-performance gaming even now. You can use it as a second card in a Lossless Scaling build, and coupled with a decent GPU you'll get some really good fps.
azenpunk@reddit
Can you explain this in some more detail for us slow people?
anival024@reddit
It's a terrible idea, don't even think about it.
There's some crappy program sold on Steam named "Lossless Scaling". All it does is upscale and interpolate frames. It adds a lot of delay and looks like crap. It's basically the worst way possible to do upscaling or frame generation. Using a dedicated card for it is even dumber - the added power draw and heat from a second card is an insane waste.
Beefmytaco@reddit
Dual-GPU Lossless Scaling is a bit complicated, as it took me a bit to figure out at first, but basically your main GPU (say a 5070 Ti) is going to be your render card. Put it in the top slot, then put the second GPU, say the 1080 Ti, into the second slot and plug your monitor into the 1080 Ti.
You then set Lossless Scaling, an app you can buy on Steam, to render on the top card and display on the second one.
Bam, you get frame generation at far less cost, since one whole GPU is dedicated to frame generation and one to rendering the game. It gives you the least amount of latency with frame generation while giving you a ton of fps; works really well.
The lossless scaling subreddit has even more details.
iAmmar9@reddit
How much more beneficial is it vs running a single 5070 ti? Like what's the FPS increase.
Also, would this work with a 9070 XT and a 1080 Ti to display G-Sync? My monitor doesn't support FreeSync.
unapologetic-tur@reddit
It's a meme. Lossless scaling has issues of its own, namely that it gets no information from the in-game engine, so it tends to be inferior to the newest batch of DLSS and FSR. It also can't differentiate between game and UI.
It is true that upscaling takes away from GPU resources, but nowhere near as much to ever bother with a 2 GPU build.
And while I'm not exactly sure and I'm just spitballing here, a dual GPU set-up where your display GPU is not your render one could fuck with gsync/freesync.
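For intuition on why an engine-blind interpolator treats HUD pixels the same as game pixels, here's a deliberately naive sketch, assuming NumPy; a plain blend of two presented frames, which is far cruder than what Lossless Scaling actually does:

```python
# Deliberately naive sketch (NumPy assumed): blending two presented frames to
# synthesize an intermediate one. Real interpolators use motion estimation,
# but the point stands: with only final frames to work from, HUD pixels get
# blended like everything else, since the tool has no engine-side information.
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Return an intermediate frame between two presented frames."""
    blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blended.astype(np.uint8)

# 1080p RGB stand-in frames.
a = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
b = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
mid = interpolate(a, b)
```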
x3nics@reddit
Could an iGPU do this?
Beefmytaco@reddit
I think someone tried it IIRC, in a story I read. I think he had one of the newest iGPUs from AMD or something like that, the only thing strong enough to push it.
It was barely enough though, kinda helped.
Seanspeed@reddit
A 250W GPU just for some often-lackluster frame gen doesn't seem all that appealing.
reg_pfj@reddit
A 1080 Ti could also hardware-accelerate the PhysX features the 50 series dropped. I remember one guy bought a 3050 to accelerate the features that crippled his 5090:
https://www.reddit.com/r/hardware/comments/1iv2x5h/i_bought_a_3050_to_pair_with_my_5090_to_uncripple/
kikimaru024@reddit
RTX 5050 consumes half the power & is faster.
sitefall@reddit
It can't do PhysX though. 1080ti can do PhysX and lossless scaling as a secondary GPU.
pythonic_dude@reddit
Then you should hunt for a 3050. 1080 isn't getting driver support moving forward.
Beefmytaco@reddit
More of a thing to do if you already have it, not to go to eBay and buy one just to do it.
kikimaru024@reddit
1080 Ti's are selling for ~$130-150 while a new 5050 is only $250.
ialwaysforgetmename@reddit
I'm so sad mine died and I had to replace it. This would've been fun.
HulksInvinciblePants@reddit
Well there goes my plan to relax tonight.
Racer_Space@reddit
I wish mine could have made it. Busted capacitor set my 1080ti on fire around the 3000 series launch.
noiserr@reddit
The worst possible timing too due to COVID shortages.
Racer_Space@reddit
Yep. I was lucky enough to get a 3080 from a random dude on discord for MSRP. It was pretty sketchy but he came through.
azenpunk@reddit
Oh that's so sad
opaz@reddit
Mine is still running great today!
Irrepressible_Monkey@reddit
Mine made a buzzing sound yesterday. I took it out, put it back... no buzzing sound so far.
Strazdas1@reddit
And yet it aged worse than a 2080, thanks to ray tracing/DLSS.
azenpunk@reddit
I don't have it installed in my main machine anymore, but my 1080 Ti can still do Halo Infinite at 4K 80fps on ultra, and I think that's pretty impressive compared to my 7900 XT, which can do the same at 120fps.
OnkelCannabia@reddit
Didn't downvote you, but sounds a bit unrealistic. I get the same at 1440p.
GraXXoR@reddit
Just sold my Zotac 1080Ti last year for $240.
It could still manage Cyberpunk 2077 at 1080p on medium settings with a few tweaks.
Absolute gigachad of a card.
ShadowRomeo@reddit
What a legendary GPU. I remember back when I built my PC for the first time I had this as my dream GPU; I was only able to get up to a GTX 1070 before I transitioned to the RTX GPUs.
It's kind of surreal to see it being slower than even the RTX 3060 nowadays, likely due to games that require the DX12 Ultimate feature set and have ray tracing turned on by default, but in old-fashioned rasterization-focused games, this thing AFAIR is even faster than the RTX 3060 and goes head to head with the likes of the RTX 2070 Super.
Firefox72@reddit
It might not even be any RT. Pascal was incredible but all things age at some point.
MrMPFR@reddit
The funny thing is that Pascal was already kinda outdated at launch. Look at Turing and the ludicrous gen-on-gen gains in DX12 and compute-heavy games, and how far the cards pull ahead in newer games vs. launch. Basically catching up to GCN in async compute over a decade later.
Tired of this limbo phase we're in rn; as u/Strazdas1 said, this is not normal at all. I really hope the next-gen consoles and RDNA 5 can turn the tide and make gaming in the 2030s take a solid step forward.
9th gen had one foot in the past and one in the present, holding back progress, but at least for next gen it looks like AMD, for once, is finally taking an approach with both feet in the future. So it seems.
Whatever happens, the 2030s better not be a repeat of the 2020s. Fingers crossed for a decade defined by ML, neural rendering, PT and work graphs, not endless cross-gen stagnation.
Strazdas1@reddit
I really enjoy reading how optimistic your comments are :)
Quealdlor@reddit
Most games don't have ray-tracing or path-tracing anyway.
Strazdas1@reddit
Sadly true. Can you imagine a decade ago most games not implementing a 7 year old GPU feature? They would be laughed out of the market as outdated trash.
Vb_33@reddit
People got butthurt about Alan Wake 2 needing mesh shader support. PC gamers have become tech-averse.
Strazdas1@reddit
I think it's wider than that. The discourse around AI is a great example. There is data showing a clear divide where the Western hemisphere is afraid of and hostile towards change while the Eastern hemisphere is optimistic and hopeful. The same is happening in gaming tech and every other tech. Take a look at how the US treats body cameras vs. how Asian countries do and you'll see the exact same pattern.
Also hopefully we will have better adoption of work graphs, which are supposedly easier to use than mesh shaders.
MrMPFR@reddit
...and it makes mesh shaders easier to program. Mesh nodes FTW!!!
ResponsibleJudge3172@reddit
AI was trashed by popular influencers like Steve long before it could gain traction in graphics. That's just how people are nowadays despite them complaining endlessly about "lack of progress"
no_no__yes_no_no_no@reddit
Turing was a big leap over Pascal in terms of feature set. Even now, no game has implemented all of what Turing brings.
If games implemented everything from the Turing feature set on release, even without considering DXR, the 2060 could easily match the 1080 Ti.
MrMPFR@reddit
Texture space shading
Sampler feedback streaming (via Tiled resources)
Mesh shaders
Variable Rate Shading
Ray tracing
ML cores
FP16 x 2 = FP32 x 1
etc...
Vb_33@reddit
Yea, the 2060 bodies the 1080 Ti in Indiana Jones, Assassin's Creed Shadows and Doom TDA.
AdmiralKurita@reddit
Actually, it is more surreal not to see recent hardware being much faster. I think that is evidence of the death of Moore's law. It is a major reason why I think "AI" is just hype.
azenpunk@reddit
Moore's law isn't dead in any way. That was just marketing propaganda from Nvidia to justify their price hikes.
Morningst4r@reddit
What does Nvidia control about Moore's Law? And if transistor costs are still dropping at that rate, why aren't AMD and Intel selling GPUs for a third of the price?
Seanspeed@reddit
They don't control Moore's Law, but they are absolutely lying about it not being dead for marketing purposes.
ResponsibleJudge3172@reddit
Nvidia has been consistent about Moore's law. They also say GPU-accelerated compute scales much higher, much more quickly, than CPU scaling in datacenters, which has nothing to do with Moore's law, especially when AMD and Nvidia rely on ever-expanding sizes of "chiplets"/"superchips" to achieve this.
azenpunk@reddit
You people thinking that companies can influence Moore's law boggles my brain.
Strazdas1@reddit
Moore's law has been dead for over a decade. Anyone claiming otherwise doesn't understand shit about Moore's law.
azenpunk@reddit
Ok, then explain why it's dead.
ResponsibleJudge3172@reddit
New nodes coming every 2 years give a miserable 20% density gain with a 30% price hike (e.g. 2nm vs 3nm from TSMC), rather than the 100% gains of Moore's law.
Seanspeed@reddit
Well for a start, we very much aren't getting double the transistor density every two years. Not even really close, honestly. All while SRAM scaling specifically has essentially stalled out.
But even past that, the actual *context* of Moore's Law was always supposed to be about the economics of it. It wasn't just that we'd get double the transistor density every two years, it's that we'd get double the transistors for the same manufacturing cost. This was the actual exciting part of Moore's Law and the race for shrinking microchips further and further. It was what paved the way for affordable personal computing, and why we could get really big and regular leaps in performance without it also meaning huge ballooning of costs.
This has all stopped quite a while ago. We do still get improvements in performance per dollar today, but it's slowed to a crawl. We are more and more being asked to pay more money for more performance with a new process generation.
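A rough back-of-the-envelope version of that economics point, using the 20% density / 30% wafer-price figures quoted earlier in the thread (illustrative numbers, not official TSMC pricing):

```python
# Illustrative arithmetic only, using the rough 20% density / 30% wafer-price
# figures quoted above (not official TSMC numbers).
old_density, old_wafer_price = 1.00, 1.00   # normalized baseline node
new_density, new_wafer_price = 1.20, 1.30   # "+20% density, +30% price"

cost_ratio = (new_wafer_price / new_density) / (old_wafer_price / old_density)
print(cost_ratio)   # ~1.08: each transistor gets ~8% MORE expensive

# Classic Moore's-law cadence: ~2x the transistors at roughly flat wafer cost,
# i.e. cost per transistor roughly halves each node.
print(1.00 / 2.00)  # 0.5
```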
azenpunk@reddit
Thank you for an informative response that wasn't condescending.
It has been a long time since I have read anything about it. I was unaware of the economic context of Moore's Law. That does change some things.
My perception was that it also included the reality that an exponentially increasing rate of computing power was unsustainable and that it would eventually peak and plateau briefly, until another technology took over and started the process over again of doubling computing power, until it reached its own peak, and so on. Am I mixing my theories?
Strazdas1@reddit
We are not getting double the transistor count every two years. That's it. That's all that Moore's law is.
Quealdlor@reddit
What is happening to all the great ideas about how to scale specs further? There has been a lot of research on such topics, for example reversible computing, silicon photonics, or new materials. It has been demonstrated that petahertz processors are possible, and petabytes of RAM that could fit in a smartwatch are also possible.
Seanspeed@reddit
Most all these supposed holy grail solutions have huge practical problems in the real world. Designing a single transistor to run at crazy high clock speeds in a lab is cool, but now make that into a full design, mass manufacturable, general purpose processor. Whole different ballgame.
jenny_905@reddit
It's not really though, new GPUs are the same if not better at less than half the power usage.
kikimaru024@reddit
5050 is faster.
Quealdlor@reddit
Normally the lowest tier would be an RTX 5030 for $69 and a 5040 for $99. We must have seriously f**ked something up along the way to be in this place with high prices and poor progress.
Strazdas1@reddit
Yeah, there's no way a 5030 for $69 would be competitive against iGPUs. It's iGPUs that killed the low end.
996forever@reddit
The fastest integrated graphics that exists on a socketed CPU (780M/Xe-LPG) is still slower than the GTX 1630.
Dangerous_Growth4025@reddit
As for mine, I'm not ready to part with it anytime soon.
john0201@reddit
All the more impressive at 250 watts.
Seanspeed@reddit
Modern graphics cards are coming with needlessly inflated TDPs, though.
There's no good reason for a 5080 to be 360W except to juice review benchmarks to the maximum, for instance. That's just ludicrous.
A 9070 XT can similarly be driven down to 250W without losing basically any performance.
john0201@reddit
This implies the 1080 wasn't doing the same thing; it was. The card isn't from the Paleolithic; they stopped selling it 6 years ago.
Seanspeed@reddit
All desktop GPUs are 'juiced' to some degree, but they usually struck a reasonable balance: extra performance without blowing out the power draw and cooling requirements.
john0201@reddit
You can drop the power on a 1080 (or just about any other CPU or GPU sold in at least the last 10 years) and get a roughly similar increase in efficiency.
Marketing departments have been aware of the CMOS power law since there have been chips to sell.
Seanspeed@reddit
No you can't. Today's GPUs are way more juiced out of the box than those from like 10 years ago.
randomkidlol@reddit
The stock 1080 was really in the middle of its efficiency/performance bell curve. Some of the aftermarket cards pushed it up quite a bit just to show how much headroom the silicon had. Setting the power limit to, say, 60% tanked performance significantly compared to modern flagships.
SomeoneBritish@reddit
It’s not efficient at all compared to modern cards, as expected.
john0201@reddit
Yeah I just meant current flagship cards are more than double that so its longevity is impressive.
Ok_Assignment_2127@reddit
Power consumption is not the same as efficiency. Today’s flagships will have lower consumption than the 1080Ti for the same tasks
john0201@reddit
Obviously that is the case with every new process node.
BlueGoliath@reddit
The last good Nvidia GPU generation. Nvidia will never make the same mistake twice.
Sictirmaxim@reddit
The later offerings have been great, especially the RTX 3000 series... just not at the prices they asked.
Crackborn@reddit
what was wrong with the 3000 pricing?
Seanspeed@reddit
Ampere was actually notable *because* it offered some pretty good value GPUs. The 3060 Ti for $400 and the 3080 for $700 were the big catches. The 3060 Ti was a cut-down upper-midrange part, while the 3080 was a cut-down high-end part. It's not the 30 series' or Nvidia's fault that cryptomining had a huge boom around this time, ruining the market.
The 40 series was actually a seriously fantastic leap in performance and efficiency, but Nvidia lost their minds with the pricing. The 40 series in many ways is extremely comparable to Maxwell->Pascal.
BlueGoliath@reddit
So they aren't great. Thanks for the clarification.
DutchieTalking@reddit
Still running my 1080ti. Do want to upgrade. Not yet sure what to upgrade to. Many cards don't feel worth it or are too expensive.
NeroClaudius199907@reddit
The 5070 Ti costs less than what you bought the 1080 Ti for, unless you bought it second hand like many people here.
DutchieTalking@reddit
Here the 5070ti is the same price as I bought the 1080ti for. But I've got less money to spend now.
jenny_905@reddit
The 5070 Super might be a tempting upgrade when it launches; it looks like it will have all the chops to be as long-lived as the 1080 Ti.
DutchieTalking@reddit
Looks interesting. Thanks.
I suspect it will be out of my budget range, though. Will prob be €1000+. But worth keeping an eye on.
jenny_905@reddit
I think they are expecting pricing similar to the current 5070, perhaps $600.
deanpmorrison@reddit
This is where I'm at. Tested out GeForce Now just to see what the big deal was and honestly this card is still cranking out enough horsepower to run just about anything that doesn't require RTX explicitly. I'll hang on for at least another GPU cycle
PineappleMaleficent6@reddit
Just upgraded from a GTX 1070 to a 5070 Ti... it served me well... great line of cards those were.
thelastsupper316@reddit
I still think the 2070 Super or 2080 Super was the better card tbh, because they supported newer features that this never got support for. But the 1080 Ti is still a legend among us mortals.
Cjprice9@reddit
Both of those cards came out several years after the 1080 Ti, and at the time of their release, offered no more performance per dollar than the 1080 Ti did.
The 2080 on release day had the same MSRP as the 1080 Ti and the same performance with less VRAM.
thelastsupper316@reddit
By several you mean 2. And yes, definitely less VRAM and that sucks imo; the 2080 Super was an awesome card and the VRAM issue is only really a problem now. The 1080 Ti is great, but the 2080 Super can still run EVERY game out today, not just most, and can actually use decent upscaling.
Cjprice9@reddit
Two years is a long time for graphics cards.... or it was, before things stagnated after the 1080 Ti's release. If you bought a 980 Ti, or a 780 Ti, the expectation was that you'd get 12 to 15 months of flagship performance, another 12 to 15 months of midrange performance, and then you'd need a new card if you wanted to stay ahead of the curve.
It's almost like Nvidia accidentally made the 1080 Ti too good, and too cheap, then regretted it. The next generation featured one card that soundly beat the 1080 Ti, and it was $1200.
NeroClaudius199907@reddit
They didn't regret it; the 1080 Ti gave them incredible branding, and Turing built on that with RTX/DLSS. Their revenue increased the following gen.
Gippy_@reddit
Most of the original RTX 20 and 40 series weren't appealing, and that's why they got Super refreshes. Today the Super cards are the ones that people discuss, plus the 2080Ti and 4090.
The RTX 30 series was good, and the refreshes (3080 12GB and 3090Ti) were so underwhelming that they were quickly forgotten.
StickiStickman@reddit
Yea, thanks to DLSS those two cards are probably gonna age even better.
Strazdas1@reddit
They already aged even better. Half of the titles tested won't even run on a 1080 Ti if you use 7-year-old features, as per the article.
HuntKey2603@reddit
more vram than most GPUs sold today
amazingspiderlesbian@reddit
Literally every GPU over $350 has more VRAM than a 1080 Ti tho. Even some $250 ones like the Arc GPUs.
(Besides the 8GB 5060 Ti)
HuntKey2603@reddit
Most GPUs sold aren't more than 350$?
ghostsilver@reddit
And the 1080 Ti was $700 MSRP, which is ~$900 today.
Ok-Schedule9238@reddit
Ok, and the 5090 costs about $2000 MSRP.
ResponsibleJudge3172@reddit
And 5090 is several times as fast and literally twice the silicon. It's not the same tier by any means
Ok-Schedule9238@reddit
Ok, but it was a flagship GPU for only $700 at the time, a price other generations can't get down to.
repocin@reddit
Titan Xp was the flagship of the 10-series. MSRP was $1200, which is ~$1600 today adjusted for inflation.
For comparison, RTX 5090 launched at an MSRP of $2000 earlier this year.
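A quick sketch of the inflation adjustment being quoted in these comments; the ~1.32 cumulative 2017-to-2025 US CPI factor is an assumption used for illustration, not an official figure:

```python
# Rough sketch of the inflation adjustments quoted above. The ~1.32 cumulative
# 2017->2025 US CPI factor is an assumption for illustration, not an official figure.
CPI_FACTOR_2017_TO_2025 = 1.32

def in_todays_dollars(msrp_2017: float) -> float:
    return msrp_2017 * CPI_FACTOR_2017_TO_2025

print(round(in_todays_dollars(700)))   # GTX 1080 Ti: ~$925 (the "~$900 today" above)
print(round(in_todays_dollars(1200)))  # Titan Xp: ~$1585 (the "~$1600" above)
```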
amazingspiderlesbian@reddit
I mean, tbh, look at the only 50 series GPUs sold today with 8GB, going by the Steam hardware survey.
The 5060 and 5060 Ti combined make up 2% of GPUs, and a good portion of those 5060 Tis are 16GB, so likely only 1.5% of RTX 50 series GPUs have 8GB, at least in the Steam survey. The 5050 adds some as well, but the number is too low to show up in the survey.
The 5070 and up, with 12GB+, make up 3.7% of RTX 50 series GPUs in circulation.
So most of the current-gen GPUs in use now cost more than $350 and have 12GB of VRAM or more.
arthurno1@reddit
Mine is not even a Ti, just the very first generation GTX 1080, but still alive and kicking. However, mine is probably not driven as hard as the one in a benchmarking build.
Plank_With_A_Nail_In@reddit
If you went off only Reddit and the gaming community, you'd think everyone had bought a 1080 Ti when in fact hardly anyone did. The 1060 and 1050 are the cards people actually owned; my daughter has my 1060.
Crusty_Magic@reddit
10 series was so great. My 1060 wasn't replaced until a few months ago.
PaulTheMerc@reddit
still running mine. Eyeing intel for my next upgrade.
exomachina@reddit
I'm still able to get over 60FPS at 1440p with some settings tweaks in most games. BF6 and Arc Raiders run amazing.
iAmmar9@reddit
Yea, BF6 runs anywhere between 115-140 fps on FSR Performance + low settings, paired with a 5700X3D.
Busty_Mortgage3459@reddit
Do you experience any input lag? I'm using a 1080 Ti with a 5800X3D and I'm getting 50-70fps depending on the map.
iAmmar9@reddit
Nah totally fine fortunately.
Vaxtez@reddit
I can see the GTX 1080 living on within the very low end regions for a while longer, as it is not a bad card for around £80-90 if all you want is older AAA games, esports titles or indie games, though with cards like the 2060 being in the £100 region, that may become the better choice due to DLSS, DX12 & Ray Tracing for modern AAAs
exomachina@reddit
The Ti is like 25% faster than a regular 1080, and the 11GB of VRAM helps it through most games without turning down textures. Usually just turning down shadows and lighting effects is enough to brute-force most modern games.
Seanspeed@reddit
At 1080p, maybe. The 1080 Ti is very weak for modern games, and the inability to take advantage of DLSS or anything similar makes brute-forcing today's high-demand games difficult.
Ninja_Weedle@reddit
Unfortunately everybody glazes it so much that it won't drop below $150.
OnkelCannabia@reddit
I played Horizon Forbidden West on high with that thing. I think the way you are phrasing it is selling it a little short even.
LeckerBockwurst@reddit
Overclocked Vega56 Gang unite
ElementII5@reddit
Yeah, they should have included the Vega 64/56 since it released in summer 2017 as well. Would have been nice to see another follow-up comparison.
elbobo19@reddit
I just moved on from my non-Ti 1080 this year, got a great run out of it
slickvibez@reddit
This will now live on as a Lossless Scaling GPU
Quealdlor@reddit
Because of the unprecedented stagnation, the 1080 Ti isn't that much worse than the new cards. There has been barely any progress, to be honest. And the graphs show it very clearly. Nvidia doesn't deserve its market valuation.
jenny_905@reddit
It's a lot worse.
The 1080 Ti was great, but this retconned idea that it was some sort of performance anomaly is nonsense; it is similar in performance to a 5050 today.
Gippy_@reddit
The card at the top of the chart is a 5060Ti. If the top of the chart were a 5070Ti/4080 Super, or heaven forbid the 4090 or 5090, you'd see the 1080Ti being completely humbled. Playable 4K60 for AAA games is progress, which the 1080Ti couldn't ever do.
Jayram2000@reddit
The GOAT for a reason, scooping up a used mining card for $350 was one of the best purchases I ever made. Still kickin out frames in my sister's PC
Kougar@reddit
Will forever miss my EVGA 1080 Ti Hydro Copper, and not just because it was the best value proposition, one that won't be seen in GPU markets again. It also marked the last EVGA card I'll ever own, and it will be the last GPU I own with a 10-year warranty, which ironically I never needed to use.
iAmmar9@reddit
Still using mine lol. Wanted to upgrade to a 5080 this gen, but it isn't looking too good vs the 5070 ti. So it's either 5070 ti within the next few months (waiting to see supers) or wait for 6080, which hopefully will be an actual upgrade vs 5080.
jasonbecker83@reddit
I went from a 3080 to a 5070ti and I couldn't be happier. Amazing performance at 4k, even better if you don't mind using DLSS or frame gen.
MC_chrome@reddit
Truly a legendary card, and much more deserving of the Ti designation than some of its newer counterparts.
DiggingNoMore@reddit
The GTX 1080 and 1080TI will never die.
Nicholas-Steel@reddit
Meanwhile plenty of indie games continue to release that play perfectly fine with a Geforce 1070Ti.