NVIDIA GeForce RTX 5090 reviews go live January 24, RTX 5080 on January 30
Posted by panchovix@reddit | hardware | View on Reddit | 409 comments
panchovix@reddit (OP)
The 5080 reviews going out the same day as the release sounds suspicious.
Pe-Te_FIN@reddit
I don't think 5080 vs 5090 is the problem, it's the 5080 against the 5070 Ti. Can't see the 5080 against, say, the 4090 being a problem for them: if you have a 4090 and you're upgrading, it will be to a 5090.
Odd-Onion-6776@reddit
considering how bad the 4080's value was at launch, i have to agree
sips_white_monster@reddit
I hope the 5080 matches the 4090 at least, otherwise it's very disappointing. On the bright side if the cards are underwhelming there won't be as much pressure on supply. If you can even call it a bright side.
crshbndct@reddit
The 5080 is faster than the 4090.
FuzzyApe@reddit
In what dimension?
crshbndct@reddit
5070=4090, so 5080 is faster. Simple.
FuzzyApe@reddit
Gief source
crshbndct@reddit
https://www.youtube.com/watch?v=9xPEfOmvkuc
FuzzyApe@reddit
Jensen: trust me bro!
Lmao
crshbndct@reddit
I'm not sure I understand?
FuzzyApe@reddit
If Jensen showed you a slide claiming the 5090 can fly to the moon, would you believe that too?
Strazdas1@reddit
If we put more blower designs on it, it might just achieve liftoff.
crshbndct@reddit
This is obviously a joke, but why would the person who owns the company lie about performance?
FuzzyApe@reddit
I wonder why Jensen would lie! He would never, would he? He had never lied before… right?
crshbndct@reddit
I'm not sure I understand, sorry.
horendus@reddit
It probably won't be faster, but we will find out soon.
Not_Yet_Italian_1990@reddit
I'm really curious about this as well. It's got 2/3rds the CUDA core count of the 4090 and a 256-bit bus. And it's on the same 4nm node. I guess the memory is faster... but it's got a lot of ground to make up over the 4080.
I'm actually surprised they didn't drop the 5080 first, which is what the rumors were suggesting.
MT-Switch@reddit
Apparently there was a BIOS revision sent out to the AIBs to fix a problem with the 5080, which necessitated delaying it.
Not_Yet_Italian_1990@reddit
Gotcha... well, that's going to be pretty unfortunate for Nvidia, I think. Reviews would've been better had the 5080 gone first. Now its performance is going to be compared to the 5090 rather than the 4080.
ketoaholic@reddit
All indications are that it won't match a 4090. Probably 30 percent (at best) uplift in pure raster over the 4080 while the 4090 is more like a 50 percent uplift.
Jaegs@reddit
Just enable the 3x frame generation and AI texture compression and it will be twice as good as a 4090! /s
panthereal@reddit
It should be equally suspicious for the 5080. Might be they're holding off on the reviews so people line up for the 5090 because it's good, only to reveal that the 5080 is 30% worse at half the cost.
bphase@reddit
Unlikely that the 5090 needs any help selling; it'll most likely be the most difficult one to get, at least as an FE.
panthereal@reddit
I'm sure it will sell out on day 1 just because it's day 1 but I can see a lot of people choosing to get a 4090/5080 instead if they determine it's a better value than the 5090.
They really have to prove the 5090 is $500+ better in a time when the 4090 is still really good. I personally think the 40 series crossed a threshold from "cards aren't good enough" to "things are fine now" as I was happy to upgrade from a 3080ti which was effectively a budget 3090.
Now, I would maybe go from 90FPS to 120FPS in max settings and I think my CPU might be more of a bottleneck outside of frame gen. And the 40 series is already getting enhanced frame gen on DLSS4, the only thing it will lack is multi-frame gen.
Strazdas1@reddit
i think you are underestimating how many 4090 (and 5090 later) are being bought for non-gaming purposes.
Large_Armadillo@reddit
Wrong. The ones who can afford it won't get the 5090. Somehow, scalpers will, because the cards won't be on shelves.
airfryerfuntime@reddit
Everything sounds suspicious to this subreddit...
teutorix_aleria@reddit
Lifting the embargo on release day means one thing: you don't want day-one buyers reading the reviews. There's no good reason to do it otherwise.
Nvidia is reliant on halo products to create the illusion of a value proposition for their lower priced parts.
Far_Success_1896@reddit
It's stated why they are delaying in the link. Nvidia was late providing the BIOS to AIBs, so the AIBs want more time with it so they can release their own software to reviewers. They probably won't get it out until the 24th.
Acrobatic_Age6937@reddit
damn they must be lucky to manage to get everything working just in time so that they can lift the review embargo on day 1... /s
Strazdas1@reddit
I think it's more that it's impossible to have a review embargo after the release date, because every reviewer would just say they bought the hardware and are therefore not under embargo.
Megakruemel@reddit
So like, why even buy a GPU at day 1?
If your PC broke down and your GPU melted inside it and you need a new PC now, why make an uninformed purchase instead of getting something cheaper from a previous generation, so you have something instead of nothing? You can get 30xx cards pretty cheap and they'll do what they need to do unless you play at 4K 120fps. And if your PC burned out because you were running old parts for a long time, that 30xx might already be an upgrade. PCs breaking down right now would be a minority of cases anyway.
If your PC is working and you already have a 4090, why upgrade? Those unoptimized unreal engine 5 games aren't coming out that same day.
If you want to upgrade from a 10xx to the newest model, so you can use that card for 6 years or longer till you upgrade again, why make an uninformed purchase? Long time purchases need to be informed.
Like, the only reason I can see for a day-1 purchase is if you just want the newest shiny thing every time it comes out. Why not wait for reviews, manufacturer comparisons, and I guess just the general vibe around the product, and actually buy the higher-quality product of the bunch?
Real 23rd December "why can't christmas be now" kind of behaviour.
bphase@reddit
Because they might sell out for weeks or months if you're not ready to get one at the minute of release.
teutorix_aleria@reddit
I literally do that but many people don't. I can be upset about things that don't directly impact me.
ryanvsrobots@reddit
AMD does it too. It's industry standard.
ThrowawayusGenerica@reddit
"Industry standard" doesn't really mean anything in a duopoly.
ryanvsrobots@reddit
It does but ok
Whirblewind@reddit
What does this whataboutism have to do with the topic you jumped into? You aren't replying to anything related to AMD.
ryanvsrobots@reddit
It's very obvious. It's not suspicious because that's how every launch is. Did I really need to explain that?
teutorix_aleria@reddit
Sure, and it's bullshit when they do it too. Keeping information out of the hands of buyers betrays a lack of confidence in the product itself.
Veastli@reddit
Yes. What's telling is that Nvidia is releasing the embargo on the 5090 a week before it goes on sale.
Clearly, Nvidia is confident in the performance of the 5090.
But the 5080... not so much.
signed7@reddit
I'm expecting a 5% raster perf upgrade from the 4080S
HystericalSail@reddit
I hope this is true and they wind up burning the scalpers alive.
Veastli@reddit
Not when a company has confidence in their product.
Nvidia is releasing the embargo on the 5090 a week before it goes on sale.
ryanvsrobots@reddit
It is though...
Veastli@reddit
Then why is Nvidia releasing the embargo a week before sales start on the 5090? But only on the first day of sales on the 5080?
Because Nvidia has confidence that the 5090 is a true upgrade, while they know that the 5080 is largely a side-grade.
ryanvsrobots@reddit
I don't know why the embargo lifts early, I also don't know why every other GPU release regardless of manufacturer in the past few generations has been launch day or the day before. Do you want to make anything up about those?
Veastli@reddit
It's clear why they're doing it.
And I get it. You don't want to hear that the card you're planning to buy will be a side-grade and not a real upgrade.
But that's the likely truth.
ryanvsrobots@reddit
Because they and everyone else always do it this way?
You don't know a single thing about me.
Veastli@reddit
Nvidia's not. Not with the 5090.
Nvidia is confident in the performance of the 5090.
Clearly, they're not as confident in the performance of the 5080.
ryanvsrobots@reddit
You keep saying that with zero evidence
Veastli@reddit
Historically, day-one-sales embargo releases mean a company is not confident in the product, irrespective of the product.
Whether it's a movie that allows no reviews prior to release, or a video card. When the embargo releases on the first day of sales, it usually means the product is, at best, nothing special. And at worst, sub-standard.
Why are you in such denial of this tremendous historical precedent?
Why so invested in the 5080 not being a side-grade?
ryanvsrobots@reddit
Source? Again, every GPU launch has been like that for several generations (too lazy to look further). Is every company not confident in every launch?
Veastli@reddit
Every company except Nvidia? Yes! Absolutely!
And even Nvidia when they release a side-grade, like the upcoming 5080.
Again, why are you ignoring this massive historical precedent?
And why are you so invested in the 5080 not being a side-grade?
ryanvsrobots@reddit
Why are you ignoring the massive historical precedent of every company doing this for every launch?
I literally don't care about the 5080. It might be hard for you to not project your biases onto others, but I don't have an agenda.
I do care about misinformation and feelings ruining this subreddit.
Veastli@reddit
You keep saying that. Lol It's not true. Not even close to true.
Nvidia isn't doing it with the 5090, proving your claim wrong.
It's not industry standard. It's not done for every launch. It's not even done for this launch. lol
Face reality, the 5080 is likely to be a nothing burger.
ryanvsrobots@reddit
Obviously the 5090 is the exception to the rule, you're making it seem the opposite.
I don't know what a nothing burger is in this context and I already told you I don't care about the 5080.
I also don't think embargo dates are a big deal because even if you buy day 1 and it sucks you can just return it.
Veastli@reddit
Wrong again.
Intel's recent B580 lifted the embargo prior to the sale date. Only one day prior, but that's all that really matters.
Reviews hit on December 12th, sale started December 13th.
III-V@reddit
It used to always be this way. There's nothing weird about it.
Strazdas1@reddit
It's not paranoid if it's true, though. A day-1 embargo is usually a sign that a company does not trust its product to perform.
Bored_Amalgamation@reddit
that's a suspicious thing to say....
i4mt3hwin@reddit
In terms of raster upgrade it makes sense: the 5080 isn't going to be that much faster than the $1,000 4080 Super it replaces.
The 5090 will look better in comparison to the 4090.
So they build hype with the 5090, and by that point most people are primed for the series before the news hits that the 5080 is more of a sidegrade than anything.
TrypelZ@reddit
Judging by that big gap between the 5080 and 5090, there will probably be a 5080 Ti next year. If not, they can at least say "the 6080 is 40% faster than a 5080 in raster" when those release.
LucAltaiR@reddit
A refresh of the 5080 is basically guaranteed I think.
Strazdas1@reddit
I think a refresh of all the cards will exist when they mass produce 3 GB GDDR chips and every card will get a 50% boost to VRAM without needing any architectural changes.
TrypelZ@reddit
I personally also think there will be a 5080Super ( with a performance jump this time around ) or a 5080 Ti with 24GB next year
Zednot123@reddit
They could potentially even tape out a whole new die for it. There is a giant hole in the lineup to fill, and I'm not sure they want to cut down GB202 that far.
TrypelZ@reddit
and it would sell out in minutes with used 5080's swarming the market haha
bryf50@reddit
Everyone thought a 4080 ti that was cut down from the 4090 was going to release too, but it never did.
Zednot123@reddit
The reason it never happened was probably the AI export restrictions. Check the SM count of the 4090D; that thing smells a lot like a "4080 Ti".
Essentially Nvidia no longer needed a down-binned SKU to get rid of defective dies below the 4090 level. Whatever volume is left even further down the pecking order simply isn't enough for a mainstream SKU. Instead those dies have been thrown at stuff like that AD102-based 4070 Ti from MSI.
TrypelZ@reddit
Problem is the 5080 is barely an improvement over the 4080, while the 5090 increases the gap to the 4090 by a lot, so there is more than enough room to justify a 5080 Ti later down the line while still maintaining the "buy more, save more" mentality of the 5090.
sips_white_monster@reddit
The 5080 has 30% more bandwidth than the 4080, 10% more cores, and slightly faster clocks. And they said Blackwell has been a major architectural rework. We'll find out soon enough whether that means anything.
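Those percentages roughly check out against the public spec sheets; a quick sketch (figures below are from public spec listings, so treat them as approximate — sheet math actually gives closer to 34% on bandwidth):

```python
# Rough spec-sheet comparison, RTX 5080 vs RTX 4080 (public listings;
# approximate, not measured performance).
cores_4080, cores_5080 = 9728, 10752
bw_4080, bw_5080 = 716.8, 960.0   # memory bandwidth in GB/s

core_gain = cores_5080 / cores_4080 - 1   # ~10.5% more CUDA cores
bw_gain = bw_5080 / bw_4080 - 1           # ~34% more bandwidth

print(f"cores: +{core_gain:.1%}, bandwidth: +{bw_gain:.1%}")
```

Core count and clocks track raster throughput far more closely than bandwidth does, which is why spec-watchers expect a modest uplift despite the big memory jump.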
signed7@reddit
So <10% better than a 4080 super except for memory bandwidth. Lol
noiserr@reddit
They have no business reason to do that though. They have no competition in that segment, and they would rather people upgrade to the 5090 instead.
I mean this is what they did last time, no reason why they would do anything different this time.
starkistuna@reddit
The 5080 is basically that 4080 Ti. They had a little headroom to make a 4090 Ti, but they held off so as not to cannibalize their stack, and there was no need to release it because AMD didn't refresh their top lineup. The only reason the 4090 was made that wild was because Nvidia actually thought AMD's chiplet design was going to rival it.
yimingwuzere@reddit
They don't need to when there's no competition.
3080 Ti exists only because of the 6900XT.
Meekois@reddit
The 5080 is unlikely to be a sidegrade, but this review schedule is suspicious as hell. Gonna have to stay up all night to make an informed purchasing decision by 8am.
rabouilethefirst@reddit
It's the definition of a side grade, unless you are buying for the tensor cores. You get the same VRAM, 10-20% raw performance, and a little bit of RT performance. MFG is a feature that will be copied by AMD and LL with similar results.
If you don't have a card, the 5080 is great. I don't think the 5070 is a good buy at all though. NVIDIA has made that card worse than the 4070 is, but everyone fell for the marketing. 12GB in 2025 is much worse than it was in 2022
Meekois@reddit
No. A sidegrade means a different set of performance characteristics. You described an upgrade.
rabouilethefirst@reddit
Sidegrade has a margin of error. If I have a $500 processor, and the competitor releases one with very similar specs, within 10-15% for $500, I would still loosely call that a sidegrade, because I ultimately lose time and money to get that incremental upgrade into my PC.
Raikaru@reddit
By definition something that is straight up better at the same price is an upgrade
rabouilethefirst@reddit
For 4080s users, they will still spend at least $200-$300 to make the jump to 5080, so with that factored in, it’s not much of an upgrade.
fak3g0d@reddit
being concerned for people buying an 80 series every generation because they have too much money and too little sense is kinda weird
DonStimpo@reddit
Upgrading an 80 series every generation is always a waste of money. For people with a 3080 (or lower) a 5080 will be an awesome upgrade
Raikaru@reddit
Sure it’s not a huge upgrade but the 4080s will also be exactly a year old when the 5080 comes out. I’m sure 4080s users will be fine
starkistuna@reddit
What do you mean? Jensen says a 5070=4090 performance!
rabouilethefirst@reddit
Oh shi, you right.
mrandish@reddit
Yeah, I think I'll be sitting tight with the overclocked 4070 Ti Super I got for $750. It looks like NVidia has chosen to, once again, make the xx90-class card wildly over-powered (and priced) compared to the price/performance of all the other cards.
starkistuna@reddit
It's not that crazy a jump. 3080 to 4090 was something crazy like a 60 percent uplift. 4090 to 5090 looks to be around 35ish percent in raw power, minus the frame gen gimmicks.
wilkonk@reddit
Yep, said it before: the 5070 Ti and 5090 are the only ones that are even remotely interesting relative to the old cards IMO, and the 9070 XT could make the 5070 Ti way less interesting depending on how it performs.
i4mt3hwin@reddit
Yeah, when I say sidegrade I mean within 10-15%. Something where it's like "eh, I have a 4080/S, I'm not upgrading".
I'm not even convinced you'll see 20% over a Super. A regular 4080 maybe... and I know most will compare it to the regular 4080, but the 4080 Super has a $1000 MSRP and so does this, and I think that's the more pro-consumer comparison.
That being said, I have a 3080 and will likely upgrade to a 5080; it's a big enough jump for me at WQHD without dropping obscene cash.
YsGrandi@reddit
Can't you cancel the order if the reviews are bad ?
Meekois@reddit
Yes, but returning product takes time and energy. It's in the best interest of the buyer to get it right the first time. It's in the best interest of Nvidia to build hype and obfuscate the truth.
YsGrandi@reddit
I'm not talking about returns, I meant canceling the order after the reviews. Let's say you order it on the 29th (the day we thought the reviews would be out): wouldn't you be able to cancel it the next day, before it ships?
I'm not from the US or a big European country, so I don't know how it is for you. For me, I'll have to import it from the USA using Amazon, wait a few days to a week for it to ship, then about 10 to 14 days for it to be delivered.
Meekois@reddit
Ordering online means competing with reseller bots. The people who are actually getting cards are probably standing in line.
Quatro_Leches@reddit
Absolutely hate returning stuff. It's such a pain and hassle.
imaginary_num6er@reddit
In the US, you can still decline to pick up your order from Best Buy if you don't want it. Not that any of these launch cards historically ship out overnight either.
Far_Success_1896@reddit
It's stated in the link why they're delaying the embargo date for the 5080: Nvidia was late giving out the BIOS to AIBs.
TripolarKnight@reddit
I mean, a 4090 with DLSS4 would essentially be 5080 Ti.
CassadagaValley@reddit
Did they release any info on the Ray Tracing updates from the 4080 to the 5080? That's really the only area I think will see the large jumps in upgrading the next few generations.
dparks1234@reddit
24GB 5080 Ti is 100% coming once the 3GB GDDR7 chips become available
MaverickPT@reddit
Meh. Probably just due to the massive difference between it and the 5090. NVIDIA might be trying to reduce the bad press it will get, when compared head-to-head with the 5090.
rabouilethefirst@reddit
The bad press will come from comparing the 5080 to 4080 super
drnick5@reddit
If the 5080 isn't very close to a 4090 in performance (say, better than a 4080 super, but at or below a 4090), then I'd say its a failure.
SolaceInScrutiny@reddit
Vs 4080, 5080 will end up only 15% faster in raster and around 30% faster in RT.
PT10@reddit
How much faster than a 4090 is a 5090 in raster?
yngmsss@reddit
Tracking back to the 980 Ti, NVIDIA has usually delivered 25–30% raw rasterization performance increases generation over generation. The main exception was the jump from the 2080 to the 3080, which brought a massive ~50% boost thanks to major architectural improvements with Ampere. Such big jumps are rare and not the norm. A 15% raw rasterization increase for the 5080 would be unusually low based on NVIDIA's history, though it might suggest we're hitting diminishing returns in raw hardware. These aren't the days of the 2080 to 3080 leap, but NVIDIA has typically stayed within the 25–30% range for raster performance in their flagship cards.
starkistuna@reddit
Skip this gen; Nvidia is only giving a true upgrade to GPUs over $1,200. Can't wait for Intel to get their shit together on the high end, since AMD is bowing out of the high end.
Traditional_Yak7654@reddit
AMD will have a high end competitor before Intel does given how strapped for cash Intel is.
starkistuna@reddit
Their rate of improvement is impressive though; they went from crappy GTX 960-like performance to almost 3070 performance in what seems like the span of 36 months. They have good engineers in their ranks.
jasonwc@reddit
Based on NVIDIA's claimed performance uplift in Cyberpunk 2077 Overdrive mode with 4x FG and Alan Wake 2 Full RT with 4x FG, Digital Foundry's reporting that you see a 70% increase in FPS moving from 2x to 4x FG, and what we know of the performance of the 4080(S) and 4090 in these games, the 4090 will pretty easily beat the 5080 when using 2x FG in these path-traced titles, and the 5090 should beat the 5080 by a 55-60%+ margin when both are compared with 4x FG. NVIDIA's first-party benchmarks show the 5090 achieving 2.33-2.41x scaling versus the 4090 (4x versus 2x FG), whereas the 5080 only shows 2-2.04x scaling versus the 4080 at the same settings.
As an example, we already know that the 4090 is around 31% faster in AW2 at 4K DLSS Performance + FG: Daniel Owen's benchmark shows the 4090 at around 105 FPS versus 80 for the 4080 Super. NVIDIA shows the 5090 with 4x FG achieving 2.41x scaling, which works out to around 253 FPS. NVIDIA also had a DLSS 4 presentation at CES showing AW2 at 4K DLSS Performance mode with Ray Reconstruction using the new Transformer model + 4x FG, with a framerate monitor showing high-200s to low-300s FPS in an indoor scene, so a 253 FPS average including more difficult outdoor content is reasonable. In contrast, the 5080 only claims 2.04x scaling, so 163 FPS. 253/163 = 55% higher performance for the 5090. However, when you back out the gains from 4x FG, you're down to around 94 FPS at 2x FG versus 105 on the 4090, so the 4090 still retains a 12% advantage.
AW2 and CP2077's path-tracing modes are some of the most demanding games on PC, so this doesn't necessarily represent performance scaling for pure raster titles or even lighter RT games. Still, it's arguably in path-tracing games like this where raw performance is needed the most, since you don't want to use FG from a low base, or have to use excessive upscaling. So, it's relevant that these extremely demanding titles are likely to still perform better on a 4090 than 5080 when using 2x FG or no FG. The new Transformer model does appear to provide huge improvements to temporal stability and detail, particularly as to ray reconstruction, but those benefits will also apply to the 4090.
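The arithmetic in the comment above can be sketched directly. All inputs are NVIDIA's first-party scaling claims and the Daniel Owen AW2 numbers quoted there, not independent measurements, so the outputs are only as good as those claims:

```python
# Back-of-envelope check of the 5090-vs-5080 scaling argument for
# Alan Wake 2 at 4K DLSS Performance. Inputs are claims, not measurements.

fps_4090_2x = 105    # 4090 with 2x FG (Daniel Owen)
fps_4080s_2x = 80    # 4080 Super with 2x FG (Daniel Owen)

scaling_5090 = 2.41  # NVIDIA claim: 5090 @ 4x FG vs 4090 @ 2x FG
scaling_5080 = 2.04  # NVIDIA claim: 5080 @ 4x FG vs 4080 @ 2x FG

fps_5090_4x = fps_4090_2x * scaling_5090    # ~253 FPS
fps_5080_4x = fps_4080s_2x * scaling_5080   # ~163 FPS

gap = fps_5090_4x / fps_5080_4x - 1         # ~55% 5090 advantage at 4x FG

# Digital Foundry: 4x FG adds ~70% FPS over 2x FG, so back that out
fps_5080_2x = fps_5080_4x / 1.7             # ~96 FPS, still below the 4090's 105
```

The back-out step lands in the mid-90s FPS, matching the comment's ~94 figure within rounding, which is the whole basis for the "4090 still beats the 5080 at 2x FG" conclusion.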
kwirky88@reddit
The history of the XX90 is strange, to say the least. When the 3090 launched, Covid hadn't hit full swing yet, so most people were lining up for the 3080. Then Covid hit, all these new folks came to PC gaming, and the GPU shortage started. Stores were bundling cards with motherboards and other hardware, shipments were slim, so people were buying 3090 cards just to get a gaming PC together. There was a cash injection for consumers because many started working from home, so all that commute expense was funnelled into new hobbies. Gaming was popular because everyone was stuck at home.
So with the vast majority of 3090 owners being gamers who didn’t actually need the 24gb of the 3090, the 4090 was released. By this time, shipments may have been a little slim for the 4080 but it wasn’t as nuts as peak COVID. 3090 owners weren’t upgrading to the 4090 because the world started opening back up again and their PCs were becoming neglected in the basement.
And now a 5090 is launching, with 32gb of vram. It’s a quantity of vram which has basically zero relevance to gaming. It’s such an obscure amount that 99% of gamedev projects won’t bother targeting this 1% of hardware owners. These are now back to being niche products, like the Tesla cards of the 2010s.
DiogenesLaertys@reddit
A 5080 is 1000 bucks and a 4090 was 1600. They haven't offered significant improvement at a given price tier in generations unless there was a die shrink.
This is no die shrink and the 5080 costs significantly less. You are a fool to expect it to be better than a 4090.
op_mang@reddit
You forgot about the jump from the GTX 700 series to the GTX 900 series. The 970 was $70 cheaper than the previous 770 while being within a few percent of the 780 Ti. The 980 was $100 cheaper than the 780 while beating the 780 Ti. All on the same node (TSMC 28nm). So people expecting the 5080 to be at least a little better than the 4090 are not foolish.
Elketh@reddit
The example you're citing happened over a decade ago. The post you replied to suggested that Nvidia haven't offered such a deal without the help of a die shrink "in generations", so I'm not sure bringing up a card released in September 2014 is quite the stinging rebuttal you think it is. Nor do I think it's in any way realistic to compare the Nvidia of 2014 to the Nvidia of today. Gaming GPUs were a far more important part of Nvidia's business back then, and their competition was much closer. AMD could match Nvidia's performance across the stack back then, even if they were lagging in terms of power efficiency. Features were also a much closer match in the pre-ray tracing/upscaling era. There was a lot more pressure and incentive for Nvidia to compete hard on price/performance back then.
Bringing up Maxwell as if it's in any way indicative of what Nvidia might do here in 2025 just seems somewhat desperate. I think you're only setting yourself up for disappointment. But that's entirely your business, of course.
dabocx@reddit
That was 10 years ago and multiple generations ago.
op_mang@reddit
You missed the point. The point is Nvidia could have made the 5080 better than the 4090 but they chose not to because there's no competition. Are you saying they can't make big improvements just through architecture changes like they did 10 years ago? Because they can, they're just being greedy.
beleidigtewurst@reddit
Of course it isn't; it is barely buffed vs the 4080.
So expect intense spinning by the hypers of "8K gaming with the 3090" and "In our super early preview the 3080 is twice as fast as the 2080", cough, the PF.
defaultfresh@reddit
It won’t be close to the 4090 in raw performance
Z3r0sama2017@reddit
Doesn't matter. If Nvidia has managed to avoid fucking up their dual-chiplet design, there will never have been such a huge difference in performance between the halo card and the normal high end. Not even with the Titans.
bphase@reddit
Of course it matters, beating the 4090 at $1000 would be a huge improvement in perf/$. The $1600 4090 would be "obsolete" for everything but its VRAM capacity.
It doesn't matter that the 5090 is massively faster and bigger as it is double the price. Those who really want it and can afford it, will get it pretty much regardless of its price. But for many it's just not worth it even if it is massively ahead of the 5080.
Z3r0sama2017@reddit
It won't, because gamers won't care, and they are the loudest bunch of whiners. Look at how Nvidia got called out for gimping every card not called the 4090 last gen because of the unheard-of performance difference between halo and high end.
Now imagine how much they will cry if there is an even bigger performance difference. It's not like a single chip can match a dual chip with how parallelized graphics are.
Your argument is logical and reasonable in regards to price to performance. Gamers are not reasonable, though.
Adept-Preference725@reddit
Why are you being such a corporate bootlicker?
laselma@reddit
Without the soap opera mode it will be on par with the 4080 super.
Zaptruder@reddit
The frame gen rhetoric is getting actually "my brain is trying to escape my ears, plz help" level.
Smooth motion is one of the aims of video game graphics. If you want movie-like visuals in your games, you're free to increase resolution, increase ray tracing, add chromatic aberration, add motion blur, until you find the correct mix to give you that 'movie magic' feel.
rabouilethefirst@reddit
Sure, but so is responsiveness. By using frame gen, one of the most important aspects of higher framerates is thrown out of the window. It has its uses, but a game natively rendered at 240fps can have less than half the latency of a frame-gen game showing 240fps.
If you can easily discern 240Hz from 120Hz, then frame gen will be super noticeable. Reflex is a feature that can be enabled without frame gen.
StickiStickman@reddit
So you're saying 99.9% of people will happily use it?
rabouilethefirst@reddit
The discourse around the current iteration of framegen is that about 50% can’t stand it and say they turn it off every time, and the other 50% seem to think it is useful.
I think it’s fine to market it, but NVIDIA compared the framegen frames to native frames, which is bullshit to the nth degree.
StickiStickman@reddit
Yea no, it's not remotely 50/50. If you honestly think that you're insane.
rabouilethefirst@reddit
Framegen is completely useless on the most “popular” 4000 series card as shown in benchmarks. The 4060 can’t even get a good performance jump with it. In the past 2.5 years, I have played exactly 2 games where Framegen is useful on my 4090, the rest didn’t support it or weren’t a net benefit.
Indiana Jones and Cyberpunk are the only games I had it on, and those were heavily influenced by NVIDIA marketing.
ryanvsrobots@reddit
I'd bet much of that demo hasn't tried the old frame gen, and 100% hasn't tried the new one.
rabouilethefirst@reddit
Sure, but we only have the current gen in our hands. I’m not going to pay 2k to find out, because I already have a 4090.
And the numbers I was giving were for people using the old frame gen. There are tons of 40-series users who hate it and say they never turn it on. I've honestly had better experiences with modded FSR3 and LL. Cyberpunk is the only game where NVIDIA's frame gen actually did anything for me.
I also find it hilarious that NVIDIA is basically just selling us cyberpunk frames at this point. Game has been in their marketing for like 4 generations
ryanvsrobots@reddit
No, the numbers you were giving were made up in your head
ryanvsrobots@reddit
No shit. If you can get high FPS without frame gen obviously do that. It's for when you can't.
The lack of logic around this technology is so blatant it feels nefarious.
rabouilethefirst@reddit
It's not me using a lack of logic, it's NVIDIA dishonestly pretending that they are equivalent to real frames. I know when to use it, and I've used 3rd-party solutions that work just as well as NVIDIA's.
It’s not a “no shit” if NVIDIA makes no mention of the downsides when saying a “5070 gives 4090 performance”.
It factually does not. The input latency is not the same, and IQ goes down with framegen.
ryanvsrobots@reddit
You have now moved the goalposts to marketing instead of the merits of framegen. That's not what your previous comment was about.
Zaptruder@reddit
Given that we're approaching the end of raster improvements, your basic options are to pay a shit ton more for modest improvements, or pay the same for minimal improvements. There's not much of a point to be made, other than 'theoretically, if you had this much raster performance, it'd be better than half of that performance!'
Which is just spectacularly unhelpful as a message to propagate.
Additionally, there is a potential pathway forward for further latency reduction with AI-generated frames: extrapolation. Obviously the visual artifacts will increase in that scenario, but latency also decreases. I'm not sure who'll be advocating for that other than latency min-maxers.
But if latency min-maxing is all you're about, the method to do so is already available now - turn down all the graphics settings, have the most powerful GPU and the fastest refresh rate monitor (that's an OLED).
Of course, the only people that go that far have literal money on the line when dealing with latency (i.e. esports pros)... everyone else prefers a reasonable balance between frame rate, latency and visual quality.
rabouilethefirst@reddit
If we’re approaching the end of raster, that means we aren’t going to get more transistors for RT or Tensor cores either, which means NVIDIA is just a software company.
Funnily enough, MFG can be implemented on CUDA cores with great results, so if you think CUDA cores are not useful, that is also nonsense.
NVIDIA created a convoluted solution so that only their cards would be able to run the software. In reality, their framegen solution isn’t much better than what some amateur devs have released as a $5 Steam app.
They even threw the entire “optical flow accelerator” out of the window this gen, and basically admitted they can do the whole thing with a standard neural network model.
NVIDIA must realize their only path forward is keeping software features locked to their hardware (aka the Apple approach).
letsgoiowa@reddit
I really like smoothness. I like it so much that I interpolate a lot of video purely because it makes my brain happy.
I do not like noticing input lag. The point of a game is to play it, to interact with it. Things that get in the way of that suck a lot. This is why I love Reflex and framerate caps but absolutely hate things that make latency much worse.
It's fine to not like latency increases. I want further decreased latency desperately.
Zaptruder@reddit
The new reflex 2 basically lets you have 2x FG with 2ms latency cost (in Cyberpunk). I think the vast majority of people will simply not notice the increase latency, but will notice the frame doubling.
To some degree - the increase is simply so small as to be imperceptible... but the online rhetoric so far simply refuses to look into the specifics, and instead divides the conversation into 'increases latency' versus 'doesn't increase latency'.
letsgoiowa@reddit
That's where they get you though: compare it to Reflex On, no FG. Huge difference right there: over 15ms!
ryanvsrobots@reddit
It's also fine to not mind extra latency in exchange for smoothness. It's ok for people to like things you don't.
letsgoiowa@reddit
And I didn't say otherwise.
CANT_BEAT_PINWHEEL@reddit
It makes the motion smoother at the expense of motion clarity, which is also one of the aims of video game graphics. If that’s more important to you than smoothness then you should use black frame insertion and save money and waste less power
RogueIsCrap@reddit
I don't understand what you mean. Doesn't higher FPS lead to higher motion clarity?
Just for fun, I tried lossless scaling 4X on some games that were software locked to 60 fps. The improvement in motion clarity was substantial. Triggering the frame-gen on and off showed that there's a big difference in clarity between 60 and 240 hz.
ryanvsrobots@reddit
I can tell you haven't used it. Motion clarity is not an issue. Latency is, but there's no point in using it if you have a high FPS, low latency situation.
Suggesting black frame insertion at 30-60 FPS is crazy.
CANT_BEAT_PINWHEEL@reddit
I didn’t suggest black frame insertion at 30-60 fps. Nvidia explicitly stated that dlss3 frame generation is for high fps scenarios to reach the max refresh rate of even higher refresh monitors. Are you confusing frame generation with gsync?
Zaptruder@reddit
The people continuing to pound the same lines from last gen basically ignore the simple fact that it's a matter of degrees.
i.e. yes, there's drawbacks and there's positives. If the positives are sufficiently large, and the drawbacks sufficiently small, then on balance, it'll be perceived as a positive.
In this case - going off my first hand experience of frame gen and DLSS on the 4090 - the drawbacks are indeed small enough and the positives large enough that I'm using it all the time where possible.
Of course the degree to which one experiences the pros/cons is somewhat subjective (i.e. what matters more) - but at the same time, it's clear that in this gen, the drawbacks have objectively decreased and the positives increased (less motion artifacts, greater visual clarity, improved overall latency).
I'd wager the people actually sensitive to the cons are far far fewer than the people that repeat the cons in comment sections and in forums like this.
Also, how does one 'insert black frames' easily? Is this an option you can check somewhere? Seems like the ideal thing to try with a higher refresh rate monitor and GPU...
rabouilethefirst@reddit
The difference is that the drawbacks of DLSS upscaling are fairly minimal. You get higher FPS, lower latency, and marginal decrease in image quality.
With framegen, my experience has been, higher fps, higher latency, and moderate decrease in image quality.
This makes it not as useful as DLSS upscaling is.
CANT_BEAT_PINWHEEL@reddit
ULMB is a black frame insertion tech and some gaming monitors also have custom built-in versions (ex: DyAc). If your monitor has G-Sync or is high refresh it probably also has a black frame insertion option you can test out. It’s fun to test out in boomer shooters imo
Zaptruder@reddit
Seems like quite a niche thing. I have a 240Hz monitor, but it's not an option in the OSD.
TrypelZ@reddit
it won't be on par with the 4080S but it will also not outperform a 4090, i guess it will be right in the middle of both cards (around 10-15% faster then the 4080) which is a minor performance increase for a new generation of GPUs tbh.
Zednot123@reddit
Going to be hard in pure raster ye. But there is room for RT gen-over-gen improvements and perhaps DLSS is more efficient (talking about the upscaling). So it might still eek out a win in some scenarios that don't involve FG.
Hendeith@reddit
Nvidia on their own slides showed that 5080 is supposed to be 25-30% faster than 4080 in RT (no DLSS or anything). Even lower difference against 4080S.
Zednot123@reddit
We are talking about the 4090
Hendeith@reddit
Ok so let me spell it out, didn't know I'd have to connect the dots for you: if the 5080 is supposed to be 25-30% faster than the 4080 in RT, then it will be something like 20-25% faster than the 4080S, and there's no way it will be faster than the 4090.
Zednot123@reddit
FC6 is a very poor data point to evaluate RT performance. It is extremely light RT, there's a reason why AMD performs well in that test with RT on.
You are assuming everything scales the same. Even if we just ignore potential RT core improvements or efficiency gains in upscaling, Blackwell has gained far more bandwidth than compute resources.
That is not out of the question in some instances. The 4090 is severely bandwidth limited in some instances when it comes to RT. The 5080 may very well match it in some of those cases if there are architectural efficiency gains.
Both cards are within spitting distance when it comes to raw bandwidth and cache amount. Doesn't take much efficiency gains for the 5080 to have more effective bandwidth in bound scenarios.
Hendeith@reddit
It's a wonderful example exactly because of that. You are able to compare performance gains without having to factor in that the 4090 might perform even better in harder RT scenarios simply because it has more cores. TLDR: this is one of the best case scenarios for 5080 v 4080 v 4090.
No, I'm not assuming it. You are just throwing in DLSS into RT discussion.
It's less limited than 4080, so again in other games difference 4090 v 5080 most likely will be bigger - not smaller.
Again, no. You are just saying that to misrepresent data we have.
DILF_FEET_PICS@reddit
Eke*
TrypelZ@reddit
That might be in some specific cases, that's true
DILF_FEET_PICS@reddit
Than*
raydialseeker@reddit
Or the opposite. The 5080 might be unexpectedly good value relative to the 5090.
kelin1@reddit
Or it’s trying to drive people toward a significantly more expensive product just because it has reviews. Could be either or.
kikimaru024@reddit
If you are considering $1000-1200 on a top-end GPU, you're not going to just arbitrarily double that for a 5090.
Like, get real for a minute.
Megakruemel@reddit
I believe that having that kind of money doesn't mean you are smart with money.
sips_white_monster@reddit
Not gonna be many people in Europe lining up for a 5090 that's for sure. Base price is a whopping 2450 Euro (2400 USD), but most AIB models will be closer to 3000 USD equivalent. The combination of having a high VAT and the Euro losing value over the last three years has been brutal.
zetiano@reddit
Dunno, I doubt it won't sell out on launch day either way.
retropieproblems@reddit
“Half the price for half the performance of the 5090”
bubblesort33@reddit
I have a feeling people are going to be shocked at the lack of rasterization gains. There must be a reason they don't have a single game that is pure raster on the graphs they showed. On top of that, the fact that a lot of the cards they are releasing are coming in complete, or close to complete, die configurations, and not cut down like the 4080 was, makes me suspicious that they needed to enable a lot of silicon to see many gains.
Curious how angry they will be with reviewers, when a lot of outlets still focus heavily on raster performance. HUB got briefly blacklisted last time. And at what point do you stop testing RT and raster separately?
OwnRound@reddit
Might be what they are banking on.
People who have the money may see the glowing 5090 reviews but feel the 5080 isn't a known quantity, which may push them towards getting a 5090 when a 5080 could probably do what they need anyway.
As always the case, the wise move is to wait and see how it pans out. But I suppose the kind of person trying to buy a GPU on release day isn't too wise.
Biomed@reddit
It’s like that every release.
Jofzar_@reddit
Literally been like that since the 30 series
mrandish@reddit
I've learned to wait until the "Launch Reviews Compared" meta-post here in r/hardware from the amazing u/Voodoo2-SLi (recent example). The statistically averaged per-app scores compared as a percentage and per-dollar with prior generations is simply invaluable.
While some reviewers are more rigorous and reliable than others, this stuff is so complex to accurately measure that a composite average is now the most reliable approximation of the true performance and value most users will see. And all the new synthetic pixel and frame generation features make benchmark results even more variable and dependent on context.
III-V@reddit
Uh, this is how it always used to be, and nobody thought it was weird then.
ryanvsrobots@reddit
That's industry standard.
GTRagnarok@reddit
A week to examine the 5090 is good. Whether or not I upgrade from the 4090 will depend on the power scaling. The 4090 runs great even at a much lower power limit. I hope that's also the case with the 5090 both because I have a 850W PSU and because 500W from the GPU alone is just too much heat for my room. Curious to see how it performs at 350W.
ContextDisastrous795@reddit
Just curious - why upgrade at all from the 4090? Why not wait for the 60xx series because there’s seemingly nothing the 4090 can’t do right now.
nmkd@reddit
1) Path Tracing
2) LLMs
Stahlreck@reddit
Saturating a 4090 is not as hard as people make it out to be if you play at 4K.
Not to mention right now is probably a better time to sell a used 4090 than when the 60 series launches.
If anything, I would guess 4090 owners are the most likely to upgrade because max performance is max performance. You either fall into that category or into the other one where you bought a 4090 to keep it for the next 10 years because it has so much oompf.
Strazdas1@reddit
There's also a category where that card isn't just for gaming. You could be doing work and gaming on the same machine, for example.
rpungello@reddit
Yeah, if you have a 4090 bought at MSRP you might be able to break even on it if you sell now. Heck, depending on the model, you might even be able to turn a profit. I'm seeing 4090FE models on eBay for well above $1600.
Quite frankly, you could probably have bought every flagship starting with the 2080 Ti and not actually lost anything as each card keeps being at least as valuable at retirement as it was at launch.
So -$1200 for the 2080 Ti
+$1200 (probably even more) for selling it during COVID
-$1500 for a 3090FE
+$1500 for selling a month or two before the 4090
-$1600 for a 4090FE
+$1600 for selling it now
All nets out to $0, so if you then buy a 5090, you'll be out a total of $2k for 6-7 years of having a flagship GPU, and you'll very likely be able to recoup that when the 60-series is coming up.
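The ledger above nets out exactly as the comment claims; here's a quick sketch using the commenter's round numbers (the resale figures are the comment's own estimates, not verified market data):

```python
# Flagship "flip" ledger from the comment above. Resale values are
# the commenter's assumptions, not verified prices.
transactions = [
    ("buy 2080 Ti", -1200),
    ("sell 2080 Ti during COVID", +1200),
    ("buy 3090 FE", -1500),
    ("sell 3090 FE before the 4090", +1500),
    ("buy 4090 FE", -1600),
    ("sell 4090 FE now", +1600),
]

net = sum(amount for _, amount in transactions)
print(net)  # 0: every card recouped its purchase price

# Buying a 5090 at its $2000 MSRP leaves you out $2000 total
# for ~6-7 years of flagship ownership.
total_with_5090 = net - 2000
print(total_with_5090)  # -2000
```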
SoTOP@reddit
You would not sell a 2080 Ti for 1200 or a 3090 for 1500. 2080 Tis were half that a month before the Ampere release and only increased with the crypto boom, but by that point you would also pay much more for a 3090 if you could even get one. There were significant sell-offs from mining operations before the Ada release, so even with the AI boom, getting above 1K for a used 3090 would take finding someone out of the loop.
General idea still is right, if you can spend a month without GPU when new gen launches and know you will get one for MSRP or close to it, doing this is optimal.
rpungello@reddit
I definitely saw 2080 Tis going for above MSRP during the peak COVID GPU shortage, and you could also absolutely get a 3090 during that time if you were patient and had good in stock alerts set up.
The FE cards in particular seemed to hold their value extremely well.
More-Ad-4503@reddit
meanwhile i paid almost 1k for a 3080 during covid times (when the gov cared about it) and it's depreciated to about $330 now
ContextDisastrous795@reddit
That’s a sweet deal honestly. Where do you guys sell your GPUs?
Semyonov@reddit
I usually do /r/pchardwareswap
TheDinosaurWeNeed@reddit
I don’t even get 60fps on Indiana jones with RT only on high.
Until I can run max settings and get 120 fps+, there will be desire for a better card.
And this is just at 3440x1440.
ContextDisastrous795@reddit
But doesn’t that put you in a spot where you’re always chasing a better card? Because there will always be a game that the current flagship doesn’t cut it for right?
TheDinosaurWeNeed@reddit
Yeah I just upgrade when the new ones come out and give old one to a friend.
ContextDisastrous795@reddit
Lucky friends I must say :D
metahipster1984@reddit
VR...
StarbeamII@reddit
DP2.1 would let your monitor run higher resolutions/frame rates without compression I suppose
gdnws@reddit
This is my biggest question as well along with performance at particular power targets. My 4090 with a 300w limit and a mild undervolt/overclock and a mild memory overclock has always been within a couple percent either way of stock performance. Looking at the available specs at the moment, it has a number of things that suggest that power draw is going to increase; it has an additional 16B transistors to power while being on effectively the same node and while the memory is significantly more efficient per bit, it is moving so many more bits when at full speed that the memory portion is likely to draw more power overall. The big thing we don't know right now is how the architecture will play with that as it is entirely possible that architectural improvements will pull power draw down even overall despite those other things working against it.
NeroClaudius199907@reddit
Amd surely going to surprise us with rna4 reveal this week right? Nvidia booked next week
nmkd@reddit
I knew AMD was behind those microchips in the vaccines!!
Whole_Ingenuity_9902@reddit
damn, there is a ribonucleic acid 4? my cells are still running on rna 1
frantakiller@reddit
For each mrna vaccine you get, you increase by 1
Gyroshark@reddit
We have hardware, firmware, and software. Is this the year we finally get fleshware?
Strazdas1@reddit
fleshware is just biological hardware.
Xlxlredditor@reddit
Year of the Linux blood type
Bored_Amalgamation@reddit
You get updated every 7 years.
jaaval@reddit
it’s AMD’s new molecular rendering, they do computation in nucleotide goo.
TheLinerax@reddit
The way new info is being shared about RDNA4 in /r/AMD, the AIBs will do AMD marketing team's job.
szczszqweqwe@reddit
That's probably better for AMD.
someshooter@reddit
AMD will likely steer clear of the 5090, then launch reviews day before the 5080 - RemindMe! two weeks
mulletarian@reddit
any decade now
BigIronEnjoyer69@reddit
It's kinda looking like this, right now:
5090 - $2000 and a significant performance bump. Compact, so it fits in SFF systems. Extremely appealing to AI nerds for inference due to the 32GB RAM. If you have the money, this is the one you're getting.
5080 - Ripoff in the face of the 5070 Ti. Probably doesn't look that appealing compared to the 40 series, but it's gonna be available first and doesn't feel as bad a compromise as the 5070 if you want an FE cooler.
5070Ti - Good value but belated availability and no FE cards. This is clearly intended to be the "bang for buck" option.
5070 - Good value, solid mainstream offering. Priced reasonably, but only because it's gonna be competing against previous-gen cards as well. The bar will look just bad enough to make you want a 5070 Ti though. The flowthrough cooler might make it very appealing.
Outside of the 5070, The focus on postprocessing effects this gen instead of solid raster improvements makes the 50 series kinda unappealing.
Fake frames, while nice for smoothness, are still a software thing that will only work in games that update to the Nvidia SDK. It's also the kind of thing that seems way more appealing on a 5070 than a 5090. The current frame gen roster only covers like 70 titles.
fnjjj@reddit
the 5070 FE does not get the flowthrough cooler - its design is going to be similar to the 4070 FE (confirmed by Nvidia in the 5090 cooler explanation videos)
RawbGun@reddit
Does the 5080 get the flowthrough 2-slot cooler or is that only for the 5090 ?
BigIronEnjoyer69@reddit
That's somewhat unfortunate. Not that the 5070 *needs* it, but it would have been nice to have a card with an over-spec default cooler you could just reliably know is gonna be a good model to get.
letsgoiowa@reddit
I can't believe we're calling an $800 GPU good value. That's an insane price to spend on a GPU
sips_white_monster@reddit
That would be a bargain for 4080-tier levels of performance, in Europe at least. Cheapest 4080's here are 1200 USD equivalent. 4090's are 2000+. The 5090 MSRP here is 2400 USD, with AIB models projected to be around 2700-3000 USD after conversion. US prices are indeed a bargain lol.
probablywontrespond2@reddit
Is this just another example of why taxes shouldn't be included in the price because it confuses consumers such as yourself, or is the 5090 MSRP where you live actually $2400 (+20% VAT) = $2880 equivalent at checkout?
Strazdas1@reddit
Taxes should always be included in the price; the US not including them is what's confusing people.
Based on quick math: RTX 5090: 1999 USD = 1917.48 EUR; with 21% VAT: 2320.15 EUR
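The quick math above checks out; a one-line sketch, where the EUR figure is just the commenter's own conversion of $1999, not an official exchange rate:

```python
# Re-deriving the EU price quoted above: USD MSRP converted to EUR
# (the commenter's figure), then 21% VAT applied on top.
eur_pre_vat = 1917.48   # commenter's conversion of the $1999 MSRP
vat_rate = 0.21

eur_with_vat = round(eur_pre_vat * (1 + vat_rate), 2)
print(eur_with_vat)  # 2320.15
```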
StickiStickman@reddit
Actually talking about the price you end up paying sure is totally confusing. Yea, that really is terrible. lol
sips_white_monster@reddit
Yes, the 5090 MSRP where I live is exactly 2369 Euro (which is basically 2400 USD). High-end models like the ASUS Astral will go for ~$3000, I have no doubt about that whatsoever. Like I said, if you want European prices just convert USD prices to EUR at a 1:1 ratio since the currencies are about the same now, then add ~20% in value added tax. That will give you the MSRP. For AIB prices, add another 20-40% depending on how high-end the model is.
Welcome to Europe!
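The rule of thumb above (take the USD MSRP 1:1 as EUR, add ~20% VAT, then 20-40% for AIB models) can be sketched as a small estimator; all the percentages are the commenter's estimates, not official figures:

```python
# Rough EU street-price estimator following the commenter's rule of
# thumb. Inputs are USD MSRP taken 1:1 as EUR; vat and aib_markup
# are the comment's estimates, not official rates.
def eu_price_estimate(usd_msrp: float, vat: float = 0.20,
                      aib_markup: float = 0.0) -> float:
    return round(usd_msrp * (1 + vat) * (1 + aib_markup), 2)

print(eu_price_estimate(1999))                   # 2398.8 (base MSRP)
print(eu_price_estimate(1999, aib_markup=0.25))  # 2998.5 (mid-range AIB)
```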
ThrowawayusGenerica@reddit
What SFF build is going to dissipate 575W of heat lmao
Strazdas1@reddit
I mean, it depends on how much air you can flow through it. As long as there is a constant intake of cool air it can be done in small spaces. Won't be quiet though. Also, just let the temperature delta be higher. It's fine if the GPU is at 90C. It's not going to melt.
BigIronEnjoyer69@reddit
Dunno how the situation is different than a large case tbh. A single-column intake -> GPU -> exhaust in a case like the NCASE isn't gonna be that much different than a regular tower.
The ones that have risers and put a solid sheet behind the GPU are the ones that are gonna struggle.
For example, something like the NZXT h1 would be out of the question.
tilted0ne@reddit
Can someone explain to me on what basis people are thinking the 5080 is going to be a rip off?
sips_white_monster@reddit
Probably because it has only marginal core clock speed increases and 10% more cores vs the 4080, though it does have 30% more bandwidth. However, people may be severely underestimating the impact of the Blackwell architecture.
The one thing keeping my hopes alive is kopite7kimi's quote to Videocardz about the 5080 being 10% faster than a 4090. I disregard all leaks from grifters and Youtubers, but kopite's track record is flawless. I am a believer. Hope I won't regret this post two weeks from now.
Jajuca@reddit
The problem with Kopite is he gets access to cards early in production that get scrapped, so the 5080 being 10% faster than the 4090 could have been a 5080 Ti sample that got scrapped or cut down to the current 5080.
So Kopite is technically never wrong, but things change before release, making it look like he was wrong.
AlasknAssasn619@reddit
4080S - 80 SM, 10240 CUDA, $999, 1/31/24
5080 - 84 SM, 10752 CUDA, $999, 1/30/25
Shit upgrade.
Haintrain@reddit
The same logic and reasoning that made people say the 5080 was going to cost $1600
Stahlreck@reddit
Probably remembering the 4080 and the fact that Nvidia is in a better position than even back then.
Also the absolutely giant gap between it and the 5090. We'll see though.
bryf50@reddit
That the second tier card is now far removed from the top tier one. The 5080 is half a 5090. In the past a second tier GPU was only ~15% off the top tier.
panthereal@reddit
Y'all need memories longer than 2 years
the 3080 was $700
the 3090 was $1500
bryf50@reddit
No. Not talking about price... Look at the spec difference.
panthereal@reddit
Okay very true of the spec difference, but it's possible we're at a point of diminishing returns on the 4nm node while the dlss4 upgrades shrink the performance gap.
I would expect the 5080 to become a better value card overall due to that. If they managed to make it worse price/performance while having half the specs at half the cost that's truly impressive.
kikimaru024@reddit
BigIronEnjoyer69@reddit
Yeah. Sharp words perhaps, but on paper the frames per dollar compared to the 5070 Ti ain't good enough to justify the $250 gap between the two.
The 5080 has a nicer cooler and is available sooner; if you're buying and you can snag one from Nvidia, that's better than hoping partner models end up being true to the $750 announced price.
Maggot_ff@reddit
It's just the way the market is moving. Raster is starting to take a backseat. RT is becoming more and more important. You can get really solid raster performance even with the 10/20/30 series still, depending on resolution. RT is where they are hurting.
And while frame generation (I'm assuming that is what you meant) isn't something I'd use outside of single player games that already run at 100+ frames, Nvidia's suite of software features just makes them better cards for most people. DLSS has impressed me immensely, at least at 4K, not so much at 1440p.
imaginary_num6er@reddit
Like the 4070Ti, it will be a ripoff too since the 5070Ti will have no FE version and MSRP cards will be non-existent
zakats@reddit
Nvidia RTX, graphics cards for suckers.
Maggot_ff@reddit
You buy GPUs based on bang for your buck, I buy GPUs for absolute performance. Both ways are valid.
Why would I buy AMD when it doesn't matter to me whether I save a few hundred USD, and I'd get worse performance and fewer features?
We just have different approaches, doesn't make me a sucker, just financially stable enough to enjoy my hobbies the way I want instead of the ways I can.
zakats@reddit
Fair enough, but allow me to clarify my tone: I edited my comment to include 'whales' and I'd put you in that category- though not condescendingly and I admittedly wasn't very clear about that as it reads more like a troll.
I've been in the computer market since the Clinton admin and stuff changes; this is the worst enthusiast market I've ever experienced and it affects all of us.
Maggot_ff@reddit
Calling someone a "whale" is a negative, and you know it as well as I do. I'm not a whale. I buy and sell in the same market as you, and if I want the best, I have no option. You think I wouldn't love top-end GPUs at 700 USD or my currency's equivalent? Of course I would. But if I spend 2000 USD on a GPU every 4-5 years, it means very little to me in the grand scheme of things. I fully understand that people don't want to or can't spend that kind of money on a GPU, but then again, it's a luxury. No one "needs" a GPU like that unless it's for work. 99% of people would be more than fine with a 5070, which is priced fairly at MSRP.
People act like GPUs are the one thing that has gotten out of control. Try having cars as a hobby, then you'll see mad prices that actually affect enthusiasts, especially compared to the early 2000's as you mention for some reason.
No one is fooling me, or anyone, into buying these cards.
zakats@reddit
As someone who has worked in IT and auto mechanics (a hobbyist in both), I take issue with your comments as I can't think of an area where enthusiast components of any similar volume and demand have inflated their prices so much. I can still buy an Edelbrock 750, enthusiast ECM gear, non-exotic sized z-rated tires, and I/H/E gear for reasonable prices.
You can definitely make the argument that the x90 has changed from a dual GPU card to take the place of the Titan which has always had professional tier pricing, but I'm hard pressed to accept that a 4-5 year cadence
Maggot_ff@reddit
You're cherry picking. I could pick and choose examples from auto parts and actual cars as well, and show you how the prices both new and used have skyrocketed (compared to what you get) where I live.
And I have a 5-year warranty on any GPU, but that doesn't matter. Most GPUs don't fail within 5 years.
Derogatory? No one has said it's derogatory, but it's clearly negatively laden. Look up where it stems from.
You trying to change the definition of that? Nah, I'm not accepting that.
Like I said, you and I are clearly different, you care about bang for buck, I care about absolute performance. Either way is fine, but I don't go around calling you a penniless fool for it.
zakats@reddit
I seem to be wasting my time here, this feels like a conversation with an investor. I think your arguments are more than just self-serving, they're damaging to competition overall which is bad for all of us in the market.
Maggot_ff@reddit
And I seem to talk to someone that doesn't quite grasp the free market. It's not about whether you like the price hikes or not.
My main point is your horrible attempt at labelling people that do things differently than you, then blaming them for the prices that make you miss out on what you want.
Let's agree to disagree. There's no reasoning with you, and you just jump to conclusions about my person. Horrible way to argue. I root for more competition in any market, that doesn't change my point.
zakats@reddit
Well that's one of the more hardcore stan comments I've heard in a while.
Maggot_ff@reddit
At this point you're not even coherent. How anything I said could be "stanning" any company in any way is beyond me.
zakats@reddit
I don't think you've used the word 'coherent' correctly, but feel ways about it, I s'pose.
Maggot_ff@reddit
English as my second language, and I still use it better than you.
Yes, coherent is the word. You're babbling.
Leave it. You're done now.
zakats@reddit
I've clearly stumbled upon the backfire effect. I hope your narcissism keeps you good company.
But, yes, I am done now.
TheInfectedGoat@reddit
You are insufferable. Good god, do you like to argue.
Maggot_ff@reddit
Again with the personal attacks. You're great at proving my point. Learn to argue case, not person.
Now, sod off.
JensensJohnson@reddit
it's for people who don't want to settle for a radeon, lol
zakats@reddit
In what way are customers 'settling' for AMD and Intel options? Please enlighten me to your perspective if you feel I'm missing out on important info.
Strazdas1@reddit
In every way except linux support. If you are doing things on windows, Nvidia card will do it better every time.
Sobeman@reddit
Nvidia has positioned the 5090 as the only actual upgrade. The 5080 is a bad value for 4090 and 4080S owners, so they are incentivized to go for the 5090. Funneling everyone to 1 SKU will mean they will be 100% sold out for the next year and the only way to get one is from scalpers.
maximus91@reddit
Who upgrades from 4080s to 5080? Let your cards get some wear and tear lol
Eskipony@reddit
Every GPU launch I swear there are people who come from last year's card series complaining about the value of upgrading to the latest just at launch.
Like how many people actually upgrade every year?
mylord420@reddit
My question is how many of these people are high income/net worth vs how many are ho-hum middle class simply wasting their money? Stats show that 50% of people who buy luxury brand items like 3K+ Louis Vuitton and Hermes purses make 50K per year or less. I'm assuming people blowing money on new highest-end GPUs every year are the male equivalent of people living paycheck to paycheck but still buying designer, yet they'd probably make fun of the women doing that while defending their own choices via "ITS MY HOBBY THOUGH". Just like other dudes justifying their terrible car buying decisions by saying "I'm a car guy though".
Strazdas1@reddit
If you make 50k a year, buying a 1k GPU every two years (every gen) would be 1% of your income. Thats hardly significant expenditure.
mylord420@reddit
50k Gross so whats left after taxes, rent/mortgage, food, gas, car maintenance, etc etc etc etc etc. Most people making that kind of money have barely if any disposable income left over after essentials to begin with. Can't just think about it from gross. What if they only had 2k per year left over? Better to spend 50% of it on a GPU or put some paltry money into your 401k / IRA so you might not have to keep working until you drop dead?
Strazdas1@reddit
If you have 4% of your income left after essential products you are bankrupting yourself. You clearly have to significantly decrease your essential expenditures or you will end up in debt the first time something goes wrong. It's normal to have 15-30% of your income go to savings.
mylord420@reddit
How many people making 50K do you think can put 15-30% to savings? You know the stat that 60% of Americans don't have $1000 available for an emergency? Yeah saving that amount is ideal, but its a luxury most don't have.
Typical-Mastodon7023@reddit
A significant majority of people are financially illiterate. Part of it is that money management isn't taught nearly enough (if at all) in school, but part of it is succumbing to instant gratification/"treat yo self" mindset/keeping up with the joneses, which often goes overboard. People in developed countries, especially the US, are simply entitled and expect to be served and have all the best food and modern conveniences/comfort, and I say that as someone who lives in the US and had high expectations too, but visiting a third world country (80% live on less than $20 a day, 25% live on less than $3.65 a day) changed my perspective massively and it's a shame that more people aren't contented with the basics/small things.
Even those in the lower-mid class in developed nations have opportunities to save more, but they're not going to because they'll make up some reason about how they'll die before making it to even 50 years old. Social security may not even be around in 15 years and unfortunately a lot of people bank on that instead of dialing back their expectations, stop caring about what other people think (oh no, my coworkers are going to make fun of my plain clothing and used toyota vehicle -- the horror!). I could go on but you get the idea.
BighatNucase@reddit
Insane people (i.e. people on this subreddit).
Strazdas1@reddit
or people for whom this is not a relevant cost. Median annual income in the US is $46,985, so paying $1,000 for a card every two years works out to about 1% of their income per year. And half of the population makes more than that. For some people, 1% of their income for their primary hobby isn't a big expense :P
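Sanity-checking that share with the quoted figures (a throwaway sketch using the numbers from the comment, nothing official):

```python
# Back-of-the-envelope: a $1000 GPU bought every two years,
# as a share of the quoted US median annual income.
def upgrade_share(card_price: float, years_per_upgrade: int, annual_income: float) -> float:
    """Annualized cost of the upgrade habit as a fraction of gross income."""
    return (card_price / years_per_upgrade) / annual_income

share = upgrade_share(1000, 2, 46_985)
print(f"{share:.1%}")  # about 1.1% of gross income per year
```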
BighatNucase@reddit
Your calculus feels very silly. For one thing, take-home pay won't be 47k, so really it's going to be much more than 1%. Apart from that, it's 1k which could have been spent on something else (e.g. a shitload of games). I think people exaggerate how bad prices are, but buying every gen is a bit silly unless you are so wealthy and without other hobbies, or you're doing something really peculiar like buying a 70-class card each gen instead of a 90-class card every 3 gens.
Strazdas1@reddit
But that was my point, some people (the ones that upgrade every gen) are so wealthy that it does not matter for them.
Typical-Mastodon7023@reddit
I mean, in general, anyone upgrading every 1-2 cycles is usually someone with a compulsion to max everything out despite the seriously diminishing returns in graphical quality. To each their own, but even having the disposable income doesn't mean you have to spend it; how about putting it in the S&P 500 or a total world stock fund? Shrug.
MiloIsTheBest@reddit
I'm once again finding myself lacking any kind of excitement about product launches. Going on 5 years of this now, since the 30-series shortages started.
I can't see myself justifying well over $4000 (Australian) on a 5090.
It's going to be over $2000 for the 5080. Nearly $2000 for the 5070Ti. Over $1100 for the 5070. Weaker dollar or not, other economic conditions aside, this is fucking expensive no matter which way you slice it.
I doubt AMD are going to make the 9070 compelling given prior form and Intel may just not come to the party at all at that level.
After buying an 8GB card in 2022 I'm now very VRAM conscious and everything below the 5090 is 16GB or less.
I have no doubt 16GB will not be any kind of problem today. But in a year? 2 years? I'm not convinced of its longevity. For $2k I need to know I'm not gonna get another 3070Ti repeat.
I think that the PC economy is so broken right now I'm not sure what my future in the hobby is. Used to be able to think about buying stuff to tinker, now it's all so expensive I feel my money is much better spent elsewhere.
Notsosobercpa@reddit
I'd suspect 16GB will be fine until the next console generation comes out and then fall off hard. It's enough that you should always be able to run console-quality textures, which is the minimum you can be sure the devs put work into making look decent.
Strazdas1@reddit
Depends on what the next generation will have, and whether it has a Series S-type model that forces devs to make sure the game works on weaker hardware too.
MiloIsTheBest@reddit
Yeah that's the thing though, that's literally every card short of the 5090.
Imagine having spent all that money on a high end 80-series and having to fuck around with textures. Maybe not on day 1 but maybe not that far in the future either.
That's what I've been doing for the last nearly 3 years with the stupid 3070Ti. It's not enjoyable. It's a bummer. Especially when you can tell the card is powerful enough to render a better looking scene had it only a bit more space onboard.
Notsosobercpa@reddit
I mean having to turn down settings on 70-80 series cards is pretty normal, people just got used to textures being the exception because last console generation was so underpowered for so long.
MiloIsTheBest@reddit
See that's the response that always frustrates me. It's not just about "having to turn settings down" I mean yeah there's always gotta be times when you do that for various effects or lighting, based on the actual power of the chip.
But I can't describe to you the annoyance of seeing a really good looking scene running great until the VRAM fills up and then it's CHONK CHONK CHONK and your only option is to figure out between textures and other VRAM heavy options what will not just get under the limit now, but will stay under the limit during the whole game.
And you know the chip itself can render everything there just fine but now you're stuck looking at a reduced quality image because the chip is being wasted by its lack of storage.
Notsosobercpa@reddit
Hence why I mentioned having more vram than the consoles use as the baseline for a good experience. 16gb will be fine this whole generation at good visuals and consistent performance (if not necessarily maxed out) because the devs have to build around the 10-12gb the ps5 can use and you can just use the nearest equivalent on pc. 8gb cards will struggle because they can fall below the console floor at times.
MiloIsTheBest@reddit
I don't think that's the case. I think consoles will have their downsampled assets and I think PCs will (and should for this kind of coin) take advantage of higher quality assets.
Maybe 16GB will be enough. I doubt it though.
New-Connection-9088@reddit
I'm in the same place but my 2080 is really not cutting the mustard anymore. The 5080 is a huge upgrade. Maybe it will struggle with VRAM issues in 5-6 years, but then it's upgrade time for me.
MiloIsTheBest@reddit
>Maybe it will struggle with VRAM issues in 5-6 years
5-6 years is an ideal scenario. 5-6 years is what you WANT it to last. 5-6 years gets you to 2031. 5-6 years isn't a potential problem.
What you don't want is a scenario like the 8GB 30-series cards (and in some cases the 10GB 3080) where they were hitting issues before the next series is even announced.
New-Connection-9088@reddit
I think we’ll be well into the next console generation before we start seeing any VRAM issues. Even then, do we expect the next consoles to have 24GB+ of RAM? I don’t.
MiloIsTheBest@reddit
Personally I would hope that games come a bit further before 2030, and that quality advancement will also be accompanied by an uptick in asset quantity.
I hope we're just not in a situation where someone has a 5080 in 3 years' time, that COULD otherwise process a game just fine, having to lower things like crowd density, texture detail, ray tracing quality, etc to stop the game from chunking out all because it just doesn't have the RAM capacity, compared to someone with a potential 5080Ti 24GB not having the same problems.
New-Connection-9088@reddit
Yes that is the worry. For me this sits in the normal technology dilemma: “If I buy today won’t they release something better tomorrow?” Always yes. If we get lucky the tomorrow thing won’t be too good, but if it is, such is life.
Ok-Difficult@reddit
I know this subreddit is heavily focused on American prices and whatnot, but the appreciation of the US dollar versus a lot of currencies is an interesting wrinkle in the pricing discussion.
A lot of Western, developed countries will be experiencing a ~10% price increase purely off of currency exchange rates when compared to last generation.
surg3on@reddit
Don't worry. The US tariffs will fuck this all up anyway.
New-Connection-9088@reddit
I think it's going to hurt sales in other countries. Especially Europe. I suspect it's why they priced this generation more aggressively than last.
sips_white_monster@reddit
5090 base price in Europe is 2400 USD. Just saying. Models like the ASUS Astral will be 3000 USD. Nobody is going to swarm stores for these cards. Pretty much all cards apart from the 4090 are still readily available in Europe right now due to the high prices. Euro has lost a lot of value vs the USD over the last few years, so you really feel the sting of that high VAT now.
Daffan@reddit
2k for the 5070 Ti, really? I thought it was like 1500-1600 AUD?
MiloIsTheBest@reddit
Well, I used the term "nearly $2k" to mean approaching it. $1500 is the starting price NVIDIA has listed; I'd be amazed if many models stay near that price.
But $1500 makes my point equally well. This is bullshit money to have to spend on any part that may be a significant compromise.
$1500 should be set-and-forget money. That's why I'm a bit over it.
uppercuticus@reddit
Gotta tack on a little bit for the "woe is me" math and some more for theoretical taxes/fees/tariffs/feels. The numbers people have been throwing around for pricing before and after the reveal have been hilarious
wilkonk@reddit
for the nvidia prices, 16GB is stupid. It's not at all stupid if the 9070 cards are priced as rumoured.
mylord420@reddit
If you have a 4080 super or 4090 why the need to upgrade so quickly anyways? Whats up with the normalization of upgrading every cycle as if its anywhere close to necessary?
Beawrtt@reddit
Framing it just for 4090 and 4080S owners is very narrow minded lol. Most people aren't even on 4000 series yet
metahipster1984@reddit
Why is that narrow minded? He was purposefully making a statement about a specific subset of people
Radulno@reddit
You do know there are people that don't have a 4080S or 4090 right?
metahipster1984@reddit
Where did he imply anything else? He literally said "for this specific subset of people" lol
dparks1234@reddit
I’d say multi-framegen is also less useful on the lower-end since RTX 4000 can already do 2x framegen. Going from 60FPS to 240FPS is cool, but someone buying a 5070 probably isn’t going to have an enthusiast 240hz+ display. An Ada card would already let them framegen from 60FPS to 120FPS.
wilkonk@reddit
Yes, that post the other day about multi frame gen being mostly useless right now was spot on I think, you need a high base frame rate to enable it without horrendous latency, and that means you need a really fast monitor to get any value out of more than 2x.
Thatshot_hilton@reddit
There are lots and lots of people who skipped the 4000 gen and will upgrade to the 5000 series. I don't think Nvidia will have any issues selling the 5080; it's probably the sweet spot for people looking to buy a card in the $1K range. It seems like AMD is skipping that range completely and going more toward 5070 buyers.
wilkonk@reddit
the 5070ti just won't be far enough behind in performance for the 5080 to make sense, unless the prices for the 5070ti partner models are stupid
NeroClaudius199907@reddit
4090: 1.18%, 4080 Super: 0.97%. People are overestimating how rich people are.
Orolol@reddit
The 4090 has uses beyond gaming.
NeroClaudius199907@reddit
The 4080 has uses beyond gaming as well.
Orolol@reddit
Never seen any 4080S available to rent on compute platforms, but maybe, yeah.
NeroClaudius199907@reddit
3d rendering, video editing, motion graphics, data analysis, cad and so on.
feyenord@reddit
Yeah, looking at my 3090 Ti, the 5080 feels a bit like a sidegrade; it even has less memory bandwidth and less VRAM. I'll just wait for the 5090 pricing to come down a bit.
tilted0ne@reddit
They're both bad value as upgrades if you don't care for MFG... I'm expecting at most ~30% in RT and ~20% in rasterization.
CANT_BEAT_PINWHEEL@reddit
The 5090 should be like 30% better in rasterization if CUDA core counts scale linearly. It's pretty nuts how much better the 4090 and 5090 are than the rest of their generations. It looks like the 5080 might not even be as good as the 4090, which would be an almost unprecedented gen-on-gen flop.
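For reference, here's what naive linear core-count scaling predicts, using shader core counts as published on spec sheets (real-game scaling is usually well below this, as the 4090-vs-4080 gap shows):

```python
# Naive linear-scaling estimate: relative raster performance if FPS
# tracked CUDA core count one-to-one (it rarely does in practice).
CORES = {  # shader core counts as published on spec sheets
    "RTX 4080": 9728,
    "RTX 4090": 16384,
    "RTX 5080": 10752,
    "RTX 5090": 21760,
}

def naive_uplift(new: str, old: str) -> float:
    """Core-count ratio minus 1: a crude upper bound on raster gains."""
    return CORES[new] / CORES[old] - 1

print(f"5090 vs 4090: {naive_uplift('RTX 5090', 'RTX 4090'):+.0%}")  # +33%
print(f"5080 vs 4090: {naive_uplift('RTX 5080', 'RTX 4090'):+.0%}")  # -34%
```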
Gengur@reddit
Even if the 5080 ends up being the bad deal of the bunch, I can still see it selling out, unlike the launch 4080, because it's not $1,200.
Acrobatic_Age6937@reddit
sure it will sell, my guess is mostly because the 5070ti will be out of stock instantly. And the next obvious buy is the 5080.
Beawrtt@reddit
I know everyone loves to talk about deals and value, but if you're spending 1k on a gpu you're probably more concerned with the performance than the value. Sure you can save a bit of money going down to a 5070Ti from a 5080, but at the end of the day people want their games to run better. That's why the 5080 will be successful
RawbGun@reddit
The 5070 Ti doesn't have a FE edition so even if it's better value than the 5080, once you account for the AIB tax then you probably end up paying the same between a custom 5070 Ti and a 5080 FE at MSRP
SupportDangerous8207@reddit
Yeah I can’t help but think that the large 5080 5090 gap will drive sales especially if amd competes at 5070ti level
The 5080 will be the best card you don’t need to mortgage your house for
The 5090 won't sell for 2000; if it did, people wouldn't be buying 3-grand 4090s.
DiggingNoMore@reddit
And this is why I'm getting it. People go on and on about price-to-performance, but that just ends up with a worse card.
Will the 5070 Ti be better bang for the buck than the 5080? Surely. But then you just end up with a worse card. I buy the best card I can afford, which is the 5080.
chaosthebomb@reddit
Plus you have a lot of people who have been holding off on upgrading since the fall when 50 was rumored to drop. Lots of people on 30 series looking for an upgrade which was less of an issue when 40 dropped. Good or bad the 50 series will sell much faster at launch until that demand dies down.
DeliciousIncident@reddit
Kinda surprised people are upgrading from 30 to 50; you'd think a 3080 / 3080 Ti is still pretty good, heck, maybe even the 3070 and 3070 Ti are. My Steam library has so many quality pre-2022 games (over 500 maybe) I still have to play, I will be sticking with the 30 series for a long time as there is not much point in upgrading.
RawbGun@reddit
I'm upgrading from a 3080 to a 5080; I want the new stuff (mostly frame-gen) and I skipped the 4000 series. The 5070 Ti might be a bit better value than the 5080, but since there is no FE edition for that one I'm not really interested in it
I am still on 1440p but for demanding games like Cyberpunk I can't really play with RT in more demanding areas (let alone PT) like Dogtown
-Purrfection-@reddit
It's probably that monitors have become a lot better and more competitive since 2020. 4k is now pretty cheap, and OLED monitors are coming down in price and pushing down LCDs. Monitor upgrades drive GPU upgrades.
spotless1997@reddit
Yeah I’m upgrading from a 10-year old PC with an R9 380 and the 5090 is too expensive for me. I’m getting back into PC gaming since I switched to consoles and it seems like the 5080 will be somewhere between a 4080S and 4090.
For $1000, I'm pretty content with that level of performance.
Illadelphian@reddit
I'm going to buy one despite the somewhat doomer mentality here. I want something better than a 5070 Ti but I'm not willing to drop 2k on a 5090, nor do I actually need that if I'm being honest. I want to be able to play in 4k more effectively than I can now with my 3080. I am absolutely not willing to jump to AMD for my GPU, although I did just buy a 9800X3D.
If I had a 4080 I wouldn't upgrade I'm sure but I don't know why anyone with a 4080 would upgrade to a 5080 unless they just don't consider money to be a relevant factor. In which case they should just buy the 5090 anyway.
Even if the 5080 ends up being at or slightly below the 4090 performance, I can't buy a 4090 for 1000 bucks plus it will get the newest fg tech. I'll be patient and buy the 5080 fe just like I did for the 3080 fe and I know I will be quite happy with it.
Gengur@reddit
This is my plan too. I can't justify $2k, but $1k is fine for me.
I already have my new AM5 build built with a 9800x3d + my old GTX 1060. Now I'm just waiting to get a 5080 to pair with it and not worry about upgrading for 5-8 years.
Cars-and-Coffee@reddit
Yeah, this might be the first year I don’t upgrade to the next -80. Even at $1200, the 4080 was a large upgrade from the 3080. I’d like to see another 40%-50% increase like we’re used to and at this point, it doesn’t seem likely.
ax2ronn@reddit
No no, the 5080 is a bad deal. Absolutely no one should buy it (so I can get one on launch day.)
rpungello@reddit
All 50 series cards are a horrendous deal, so I can confirm nobody should buy any of them ;)
ZoomerAdmin@reddit
Guessing this means the 5090 will be great, but the 5080 will be poor value for what it is.
Minute_Power4858@reddit
that was super obvious to most of us
the 5080 is a 5070 in every way but its name
and it seems people are going to love it anyway for some reason (i hope not), even when most of the models are wayyy over MSRP
MasterHWilson@reddit
How so? the X03 chip is definitely an 80 class chip.
Minute_Power4858@reddit
i kinda referenced this video (it was based on rumors, but when the specs released it turned out the video was correct)
and if i have to guess based on the specs and prices (they are way over MSRP),
reviews of the 5080 will not be amazing.
ww.youtube.com/watch?v=cvL-Mplhog8&t=8s
just point of thought
the 5070 ti gets only like 15% less performance and the same amount of VRAM.
It will be pretty hard to find a game/task where the 5080 is "amazing" and the 5070 ti is bad.
And prices for the 5080 right now are pure garbage (at least for me; after including taxes the 5080 will cost 2x NVIDIA's MSRP on average)
MasterHWilson@reddit
My issue with that methodology of comparing the % cut down from the top-spec chip is that it ignores that the top-spec chip is getting bigger. 744 mm² for the 5090 is enormous, larger than some of the Titan chips ever were. If the xx90 is getting bigger, then for the rest of the lineup to stay the same, the % would have to go down.
MasterHWilson@reddit
How so?
nvidiot@reddit
If the expected raw performance of 5080 turns out to be true, it's basically 4080 Ti with same 16 GB VRAM (just GDDR7).
That might be why it's priced the same as 4080S in MSRP, because the card just doesn't offer anything compelling over 4080S if it's priced higher.
MrCleanRed@reddit
I thought the expected performance was 1.1X of the 4090?
signed7@reddit
Given its specs and all leaks / analyses of demo footage we've seen so far 1.1X 4080 Super is way more likely
MrCleanRed@reddit
Again, I am not saying it would not be; it very well could be. My question is, where are you getting those leaks? Most reputable leakers said 1.1x of the 4090, and the demo NVIDIA showed makes it slightly faster than the 4090 as well
nvidiot@reddit
If we ignore the multi frame generation tech and focus on raw performance, people are calculating that the 5080's uplift is roughly ~18% over the 4080S (there was a big thread about it over at the nVidia subreddit).
That's not enough to catch up to 4090, and will make it slot right between 4080S and 4090.
MrCleanRed@reddit
Most prominent leakers are saying 1.1x or at least matches 4090. So I thought that was the expectation. Anyway let's see.
swsko@reddit
It did not show that in raster performance, it was with DLSS, FG etc. turned on
sips_white_monster@reddit
I think he's just talking about kopite7kimi saying the 5080 will be 10% faster than a 4090. However this was not directly said by kopite7kimi on his X account, rather it came from Videocardz who claim to have gotten it from kopite7kimi through a personal message.
So assuming that Videocardz isn't lying here (which I doubt, the guy who runs that site frequently asks kopite7kimi questions on his X account, I've seen it myself), this is quite a good source since kopite7kimi has been the no.1 source of NVIDIA leaks for at least three generations now. He was the first to leak the fact that the 30-series FE models would use a very uniquely shaped PCB (six months before the first photo leak), he was the first to leak how the 5090 would use close to 600W of power yet retain a 2-slot cooling design (people didn't believe him, yet we now know that it was accurate), and he even leaked the mysterious 3080 20GB card which everyone called 100% fake (only for it to show up on the used market in Russia of all places, despite never being publicly released).
He's keeping my hope alive lol, that and Jensen's quote about Blackwell being the biggest architectural rework in decades. I will be very pissed if it's just 15% over the 4080.
MrCleanRed@reddit
Plague Tale: Requiem and Far Cry were comparable.
panthereal@reddit
Or it means the 5090 is overpriced and they don't want people knowing that until they already bought it.
You're falling for the tactics by assuming it means the 5090 is the only good option.
ibeerianhamhock@reddit
Yeah, nvidia has shifted from diminishing returns as you spend more, to "how much performance can you afford", where the more you spend the better the value (at least 5080 vs 5090, same as 4080 vs 4090).
I do think the 5090 is less of a value proposition vs the 5080 this time, since it costs literally twice as much instead of 25% more like the 40 series.
dwoooood@reddit
So what should I do as a 3080 owner looking for an upgrade? I don't mind the AI/DLSS stuff, but I don't want to spend money on a new card if it has trouble running games without all that jazz.
NeroClaudius199907@reddit
You're probably a 1440p gamer so 4080/5080 even 4090 will be good.
dwoooood@reddit
I just got a 4K monitor in hopes the 5080 would do the trick, then I watched the announcement and I’m not so sure anymore.
NeroClaudius199907@reddit
Buy a second-hand 4090 because it's less of a headache with VRAM. It will just work. Or even a 7900 XTX.
dwoooood@reddit
Good looks. Thank you.
EitherGiraffe@reddit
Coming from a 3080, anything below 4080 / 5070 Ti doesn't really make sense if you want a significant upgrade.
kodos_der_henker@reddit
Wait for benchmarks, by the 24th the AMD and Nvidia reviews will be out and we will know which ones are the best cost/performance and can run native 2k/4k games (be it this or previous gen)
surg3on@reddit
If you can snag a 4090 when they all go on sale second hand that will be your best value
dparks1234@reddit
The 4080 existed to upsell the 4090 since the price gap was only $400 and you got way more for your money with the higher end card.
Not sure what to make of the 5080 since the 5090 is $1000 more this time. Unless it’s significantly stronger than the 5070 Ti I don’t see any logical reason to buy one.
NewRedditIsVeryUgly@reddit
The 5090 has double the VRAM of the 5080. I bet they're counting on prosumers joining the AI hype to buy more 5090. If you're running any local custom LLM model, then you need at least 24GB for decent performance.
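A rough way to ballpark that VRAM claim: weights dominate, at roughly one byte per parameter per 8 bits of precision (the 20% overhead factor below for KV cache and activations is a guess, not a measured number):

```python
# Rough VRAM estimate for running an LLM locally.
def vram_gb(params_billion: float, bytes_per_param: float, overhead: float = 0.2) -> float:
    """Weights plus a guessed headroom fraction for KV cache/activations."""
    weights_gb = params_billion * bytes_per_param  # 1B params * 1 byte = 1 GB
    return weights_gb * (1 + overhead)

# A 13B model at 8-bit wants ~15.6 GB (tight on a 16 GB card);
# at 16-bit it wants ~31 GB, which is 5090/24GB+ territory.
print(round(vram_gb(13, 1), 1))  # 15.6
print(round(vram_gb(13, 2), 1))  # 31.2
```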
Beawrtt@reddit
What if you have 1k to spend and you're on a 2070 super?
Zoratsu@reddit
Can your PC support the 5080 with no changes?
Can you find a 5080 at MSRP?
Do you live in a place with no tax?
If all 3 are yes, I can see buying it but honestly the "find a 5080 at MSRP" would be the biggest problem.
NewRedditIsVeryUgly@reddit
What games are they using to market this generation? I see no major headliners, they're still using Cyberpunk 2077, 4 years after release.
If you need lots of VRAM for Machine Learning/AI then you're buying the 5090. As for the rest? I see very little appeal over the previous generation.
beleidigtewurst@reddit
And what about 5070 and 5070Ti, cough?
MrMPFR@reddit
Lol.
For those wondering it's February. Exact dates are yet to be disclosed.
beleidigtewurst@reddit
Ah. "February" is clear enough, thank you... :)))
Framed-Photo@reddit
Based on the specs I'd imagine the 5070ti is better value than the 5080, so I'm curious to see how the reviews pan out.
imaginary_num6er@reddit
5070Ti "better value" when there is no FE cards and like the 4070Ti, will have barely any MSRP cards
OfficialHavik@reddit
Good point. They advertise $750, but those AIB cards will probably be $800+ on average, hurting that value prop
letsgoiowa@reddit
$800 "value cards"
End me right now
BastianHS@reddit
Seriously. Got an EVGA FTW3 3080 at launch for like $860 after taxes. Inflation got us all fucked up.
GiorgioG@reddit
Tired of the "inflation" nonsense, companies have been and continue to raise prices just because they can.
BighatNucase@reddit
Inflation is a real thing whether you want to believe in it or not.
996forever@reddit
It’s real, but somehow for CPUs, it isn’t.
BighatNucase@reddit
It probably is for CPUs but it's less noticeable due to lower overall cost (both on the end user and in terms of building costs) as well as the fact that CPUs have a much quicker release schedule. Trying to compare two different products as if they'll both get inflation at the same rate is very silly.
996forever@reddit
What do you think inflation even means
BighatNucase@reddit
Inflation does not mean that everything rises at the same rate...
996forever@reddit
Right, it doesn’t
What's the point of bringing up inflation then if there's nothing to compare against? If it rose 50% year on year, you could still say it's "inflation" with no baseline to compare with, right? Any kind of price hike can be "justified" with that inflation word then
Randokneegrow@reddit
The beautiful thing is, you do not have to buy a GPU. Vote with your wallet.
Bored_Amalgamation@reddit
that's capitalism. Hopefully the h i d d e n h a n d of the market will smack the shit out of them.
In_It_2_Quinn_It@reddit
It's the only PC component increasing in price over the years so of course it's inflation /s.
ThrowAwayRaceCarDank@reddit
I paid $769 for my EVGA 3080, and I remember being slightly upset that I didn't get a card at the $699 MSRP lol. This was right before the crypto shortage, how little I knew at the time!
BastianHS@reddit
Right? I did use the 3080 to mine eth tho, so it ended up paying for itself and then some. Kinda sad that I can't do it again with a 50X0.
jerryfrz@reddit
inb4 GDDR7 gives rise to another wave of shitcoin mining
Hifihedgehog@reddit
Totally. At least then 80 tier cards were about 10% or so off from the flagship 90 ones. Now, you get half the core/compute units of the flagship. Granted, 30 series never was as good a value as 10 series was. Remember the $700 flagship GTX 1080 Ti? Pepperidge Farms remembers.
zxLFx2@reddit
FYI $860 in September 2020 is $1042 today.
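That conversion is just a CPI ratio; a sketch with approximate CPI-U index values (roughly 260 for September 2020 vs roughly 315 now, both assumptions, not official figures):

```python
# Inflation-adjust a price via the ratio of CPI index values.
def adjust(price: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical price by the change in the price index."""
    return price * cpi_now / cpi_then

# Approximate CPI-U: ~260.3 (Sept 2020) vs ~315.6 (early 2025)
print(round(adjust(860, 260.3, 315.6)))  # ~1043
```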
imaginary_num6er@reddit
Yeah but EVGA does not exist and the 3080 is obsolete with 8GB VRAM. You got what you paid for
BastianHS@reddit
The 3080 has 10GB. I'm not complaining, I'm saying it's BS that the 70 is going to cost the same as the big-cooler 80 did years ago.
Chrystoler@reddit
I swear half the people on this sub are absolutely fucking delusional. A 10GB 3080 runs fine. Yes, more VRAM would be better, but if you took what people said at face value you'd think it's a worthless card.
That being said, probably holding on to mine until the next Gen (1440p/165hz)
Joshiie12@reddit
I'll drag my 6700XT into 2030 before I pay over $350 for a mid tier 'value' card
yokuyuki@reddit
What's annoying is I have a 4070 ti super that has a return period ending on Jan 30 so I won't be able to use reviews to decide whether I should keep it.
Framed-Photo@reddit
Considering the MSRP of that card is higher than the price of the 5070ti.... I'd just return it lol.
I can't see a world where the 5070ti is doing worse than the 4070ti super, AND is so hard to get that you're out a computer for months.
yokuyuki@reddit
Except I only paid $650 for the 4070 ti super so well under MSRP
Vb_33@reddit
If you don't want to pay more then keep it. Will suck if the 5070 and 5070ti are actually faster than we expected.
Framed-Photo@reddit
You probably should start with "I got this for significantly under MSRP" lol.
If your budget is 650 then yeah your card will likely still be better than the regular 5070? Not 100% certain though.
Beawrtt@reddit
I would think most people (including myself) that are buying a 5080 are buying it for the performance, not the highest value/dollar
dern_the_hermit@reddit
To me the thing to be curious about is how off the value/dollar curve it winds up being. I honestly don't expect it to be some big deviation.
nanogenesis@reddit
Judging by the responses in here, it seems to be a good time for anyone to snag a used 30 series. I checked myself and a lot of them are below $350 used in my area (3080 Ti). I saw a variety of $700-800 listings for 4090s, but none in my town.
Back when the RTX 20 series launched, a lot of youtubers were promoting buying the old gen. I wonder if we will see a repeat of that.
EitherGiraffe@reddit
700-800 for a 4090 is 99% a scam, only go for pickup and let them show you it works.
InLoveWithInternet@reddit
Buy the 5090 little goblins... Buy the 5090... Look how it's beautiful little goblins... And cheap too... Buy it little goblins... Buy it... .BUY IT... BUYYYYYY IT!!!!!
Kashinoda@reddit
Aren't the 9070 XT reviews supposedly dropping on the 22nd? That's like when Sony released Horizon Zero Dawn just days before Zelda: Breath of the Wild (or when Horizon Forbidden West released 7 days before Elden Ring 😂).
the11devans@reddit
Reviews? I thought the 22nd would be the announcement since they haven't even done that yet
kodos_der_henker@reddit
Reviewers already confirmed that their NDA says 22nd and this will be the date their articles go live
AccomplishedLeek1329@reddit
I swear the Horizon series has the absolute worst release timings
imaginary_num6er@reddit
Chiphell dropped a rumor that AMD will be postponing the RDNA 4 launch again till after Lunar New Year
sips_white_monster@reddit
AMD's too busy trying to meet 9800X3D demand lol
SJEPA@reddit
Final warning to people wanting the 5080. Even NVIDIA knows it sucks balls 🤣
MasterBettyFTW@reddit
but will the 4090 or 5090 be $1k better?
SJEPA@reddit
Absolutely not, but at least they lead the pack. Whether that's worth $2k is another question 🤣
Consistent_Cat3451@reddit
The only problem I had with frame gen was the smearing and ghosting, I think dlss is pretty good but when I turn on frame gen... Yikes, I'm fine with 60fps since I ONLY play single player games on a controller and don't really notice anything above 120 (even the 60-120 is more of a nice to have than a must)
PrettyProtection8863@reddit
Is there a possibility of 5080 Super or Ti in the future?
sips_white_monster@reddit
NVIDIA pretty much always launches some kind of refresh or new model a year after the initial launch. As others mentioned, 3GB GDDR7 modules are a thing now, so a 24GB 5080 Super is definitely coming next year; I would be shocked if it wasn't. They won't even have to change the die for a higher-tier one. It's the easiest thing for NVIDIA to do.
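The module math behind that, sketched out (assuming the usual GDDR layout of one module per 32-bit slice of the memory bus):

```python
# Why 3 GB GDDR7 modules turn a 16 GB card into a 24 GB one:
# total VRAM = (bus width / 32-bit channels) * per-module density.
def vram_capacity(bus_width_bits: int, gb_per_module: int) -> int:
    modules = bus_width_bits // 32  # one module per 32-bit channel
    return modules * gb_per_module

print(vram_capacity(256, 2))  # 16 (current 5080, 2 GB modules)
print(vram_capacity(256, 3))  # 24 (same die, 3 GB modules)
```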
MemphisBass@reddit
Yes, a 24gb 5080 has already been rumored months ago.
BigIronEnjoyer69@reddit
Well, of course, but they're not gonna tell you **now**.
DownRUpLYB@reddit
Why are we pretending the 5070 isn't a 5060? Can someone please explain?
ZekeSulastin@reddit
I maintain that Nvidia should learn from AMD how to name GPUs (R9 390 > RX 480 > Vega 56 > RX 5700 > RX 9070) because it would be hilarious, especially if they went with a Vega 56/64 type system.
And also because “it’s actually an x” has been tiring for years, so better to just restart fresh.
theholylancer@reddit
it sounds like they are very confident about the 5090, and are trying to Friday-afternoon-news-dump the 5080
likely not anywhere near as good a jump; the CUDA core count increase is just too little for what you pay, and the new gen and GDDR7 aren't going to help enough
and I still think there is an element of the 5090 not scaling anywhere near 2x as well. like right now, a 4090, which has 60% more CUDA cores, is only 20-30% faster than a 4080
if the 5090 doubles the cores but gets only the same 20-30% increase in actual games, that would be a crazy bad deal.
hell, even if it's 50% better, it would still just be maintaining the gap.
something is sus, and I am not sure what.
capybooya@reddit
The scaling is the interesting part. The 5090 has a very significant memory bandwidth increase. The rest of the specs are very 'meh' over the 4090. Outside of the bandwidth, you gotta hope for noticeably more efficient cores in RT performance, else it probably won't impress anyone that much. I will be very surprised if they solved all of the bottlenecks of the 4090, but we'll see soon enough...
theholylancer@reddit
yeah
the other option is, they know that this is the BEST card no matter what happens, and they want to hype it up as much as possible for people on the fence (somehow? the price gap is kind of too large...)
but either way, normally the gap isn't this big; only a couple of days, and having the 5080 be a launch-day thing just feels like something is off with it. that, and the price, esp when everyone and their mother was saying it'd be 1200 if not 1300 or 1400 because greed, but 999 sounds like a great deal.
this makes me want to say even more, whats the catch?
Sigil09@reddit
I'm hoping to snag a 5070 Ti GPU and a 5070 Ti laptop whenever those release