Nvidia’s RTX 5060 review debacle should be a wake-up call for gamers and reviewers
Posted by mockingbird-@reddit | hardware | View on Reddit | 103 comments
JonWood007@reddit
It's stupid too. It actually isn't a terrible product for the price, and it actually was a somewhat significant jump over the 4060 all things considered. Is it perfect? No, but it baffles me that they acted like this when this is the most progress the GPU market has seen for the $300 crowd in over 2 years.
only_r3ad_the_titl3@reddit
It is not. And AMD also just has 8 GB at the $300 price point, but somehow people don't get as upset about that.
And then they wonder why AMD doesn't gain market share. It is on the underdog to upset the market, which they should already have done with the 7600. Just doing Nvidia minus 10% in raster while having significantly worse RT and upscaling is simply not enough, but people don't want to understand that.
SEI_JAKU@reddit
Why are all of your posts the same gibberish?
JonWood007@reddit
The 7600 was a $250 card, actually. But hey, you wanna be overly cynical, go ahead and be overly cynical.
only_r3ad_the_titl3@reddit
It was $270, which is exactly $300 minus 10%, while it also had worse RT and upscaling performance. Not even really 10% better value in the end.
JonWood007@reddit
If you're a sub-$300 buyer, every dollar saved is a plus. I say this as someone who also bought AMD over Nvidia (6650 XT at $230 vs. 3060 at $340 at the time). Those fancy features don't matter when they come at the expense of price/performance. No one cares about ray tracing on a 60-class card in reality, and DLSS is better, but at 1080p, upscaling from lower resolutions kinda sucks either way and it isn't worth the price premium. Either way, idk why you're even bringing up AMD here. Then again I've had a lot of weird comments on this thread tonight and am super close to just turning off responses so I don't have to deal with everyone's overly cynical bs.
only_r3ad_the_titl3@reddit
DLSS 4, although not perfect, is usable at 1080p, unlike FSR 3. You're just discrediting whatever is a positive on Nvidia cards, like pro-AMD fans usually do.
JonWood007@reddit
FSR is usable. Also, if you have better price/performance you don't need DLSS as much. And I didn't ask for your weird side rant about AMD that wasn't even relevant to my original post. So... if anything you seem to just have a weird hate boner for them from my perspective.
goldsrcmasterrace@reddit
This guy spends all day every day on Reddit hating on AMD. Hope he is getting paid.
Zarmazarma@reddit
I think he's talking about the 8GB 9060 XT.
DrSlowbro@reddit
The RX 6700 XT and RTX 3060 are both equal in performance until the 8GB of VRAM screws you over, which in modern gaming will happen fast, and then they wildly exceed it in performance.
If the 5060 improved over the 4060, and it still sucks this hard, wow.
only_r3ad_the_titl3@reddit
They are not equal in performance lmao. Not even close.
DrSlowbro@reddit
Actually look at game performance tests instead of synthmarks for once.
Zarmazarma@reddit
Take your own advice? TPU put the 6700xt at 33% faster than the 3060 based on their game benchmarks.
DrSlowbro@reddit
Once RT is enabled they're about equal, although the 6700 XT might win out in some titles.
And yes, this is absolutely fair to do, because going forward, RT-only is how games will be. See: Doom.
Zarmazarma@reddit
These benchmarks include RT games. To compare them only based on RT games is dumb, especially when these cards are already going to be largely replaced in the next few years.
DrSlowbro@reddit
We're talking about comparing them to the latest price-equivalent card, the RTX 5060.
So no, it isn't "dumb". We're talking about them in the context of 2025 and beyond, you know, like the RTX 5060.
JonWood007@reddit
Uh, you just compared two 12 GB GPUs. And the 6700 XT is much faster than the 3060, closer to the 5060. It was a great deal while it was in stock, but it no longer is.
Also, I'm not saying the 8 GB is good. But it is usable at lower settings. It's just not very futureproof.
DrSlowbro@reddit
They're also really old and both were in the "midrange".
JonWood007@reddit
Quite frankly, I don't consider $350-400 to be "midrange", maybe upper midrange, but that's old "70" money. Either way, it is weird that that price range then regressed from 12 GB to 8. Still, it's better than the 8 GB cards that they replaced.
DrSlowbro@reddit
An 8GB card being purportedly better than an older 8GB card doesn't really matter. It'd be like cheering a new 2GB VRAM 5030 because "hey, at least it's better than the old GTX 670 with 2GB VRAM". Like... yeah, but it's still useless.
And unfortunately the low-end for GPUs now is either older, used ones that are lower-end (RX 6400, etc.) or integrated. Mid-range, unfortunately, means Nvidia's 60-series, sometimes 70-series.
JonWood007@reddit
Whatever, you guys in this thread are being overly cynical. I want more than 8 GB too, but that's all people are saying here. 8 GB bad. Yes yes, we get it. And I don't disagree, but it IS better than the cards it's replacing. Okay? Both can be true at the same time. Reality is nuanced. We don't need to be overly cynical like ALL THE TIME.
DrSlowbro@reddit
It isn't better though, because useless due to VRAM is still useless due to VRAM.
Reality isn't "nuanced" in that field. I can't stand when people insist on making everything some multipolar issue where there are no actual answers. Sorry, some things really are black and some things really are white. If you believe no issue on earth has objective answers, I'm not sure what to do with you. shrug
The 4060 wasn't bad due to specs. It was bad due to VRAM.
So like, what, two years later, we get the 5060. Which is bad... for the same reason, except worse because VRAM requirements have ballooned even more since 2023.
This is literally my example about a hypothetical 5010 with 2GB VRAM getting Nvidia drones creaming their pants because it's "so much better than the 1010 and 670!!!" even though... it is still literally useless.
scytheavatar@reddit
Why the fuck are gamers being blamed for Nvidia's incompetence and greed?
Cheeze_It@reddit
Because too many gamers don't pay attention to the fuckery Nvidia does and they still go and stupidly buy Nvidia even though they are getting less performance per dollar over time. Nvidia knows this and just keeps making it worse because they know gamers won't stop buying Nvidia.
It's like shopping at Walmart despite them being terrible as a business and terrible in how they operate within the economy.
only_r3ad_the_titl3@reddit
And buy what instead? AMD? They also have 8 GB at that price point.
chefchef97@reddit
Buy AMD, buy Intel, buy used, keep what you have
Literally anything else is preferable to rewarding their behaviour
only_r3ad_the_titl3@reddit
Have you forgotten how amd manipulated the reviews?
SEI_JAKU@reddit
You obviously did.
Mean-Professiontruth@reddit
Buying AMD is dumb though
SEI_JAKU@reddit
No, it isn't. Time to stop lying about this already.
ImBoredToo@reddit
Not with the 5000 series, when 12VHPWR can catch fire and the drivers are a fucking mess.
deoneta@reddit
Circlejerking comments like yours are why no one takes the Nvidia criticism seriously. You all repeat the same criticisms ad nauseam, but when you do some digging you find out barely anyone is affected by it. What percentage of 50 series cards have actually caught on fire?
reddit_equals_censor@reddit
it is impressive that amd allowed partners to put the nvidia 12 pin fire hazard onto graphics cards though.
asrock + sapphire both chose to put the fire hazard onto a card each.
so thanks to the incompetence of amd's marketing and higher-ups, the message went from "we are free from any fire hazard and use trusted, reliable power connectors" to "we mostly use safe connectors that won't catch fire"....
DrSlowbro@reddit
The caveat being that the RX 9000 series has proper voltage regulators. Nvidia literally ordered said regulators be removed from the 4000 series onward, and it's why their 12VHPWR connectors are so particularly awful.
Yes, it's an awful connector. No, it isn't the central issue with the 4080/4090/5080/5090.
reddit_equals_censor@reddit
what is this supposed to mean?
voltage regulators? so vrms? the vrm is the same on amd or nvidia cards. it is whatever power stages they feel like putting on the cards, and the 12 pin nvidia fire hazard, as PER SPEC REQUIRED, is a single 12 volt blob at the card. it is NOT split and is EXACTLY the same on amd cards as on nvidia cards.
what in the world made you think whatever you possibly meant there?
here is buildzoid saying as much:
https://www.youtube.com/watch?v=2HjnByG7AXY
again, it is EXACTLY, i repeat EXACTLY, the same implementation on nvidia cards and amd cards.
alright, we are playing a guessing game on what you mean now.
this sounds like you are phrasing things in a wrong way, but mean that the 12 pin nvidia fire hazard gets split and crudely balanced by using certain pins for subgroups of power stages.
this, as buildzoid points out, is NOT the case, and it would again be a violation of the insane 12v2x6 spec itself btw, as it REQUIRES a single 12 volt blob at the card.
so while a split, crude balancing at the card would probably be less melty, it would violate nvidia's insane fire hazard spec. i didn't write the fire hazard spec, i would never have let this fire hazard onto the market, and i would have recalled it when the first melting started.
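to make the "single blob" problem concrete, here is a toy back-of-the-envelope sketch (python, all numbers are my own assumptions, not measurements): parallel pins share current in inverse proportion to their contact resistance, and with one unsplit 12 volt blob on the card side nothing can notice or correct a single pin quietly running way past its roughly 9.5 amp rating.

```python
def per_pin_current(total_watts: float, volts: float, contact_mohms: list[float]) -> list[float]:
    """split the total current across parallel pins in proportion to each pin's
    conductance (1/resistance) - which is what physics does when nothing balances the load."""
    total_amps = total_watts / volts
    conductances = [1.0 / r for r in contact_mohms]
    total_g = sum(conductances)
    return [round(total_amps * g / total_g, 1) for g in conductances]

# hypothetical 575 w card, six 12 v pins, made-up contact resistances:
print(per_pin_current(575, 12, [5, 5, 5, 5, 5, 5]))  # all pins equal: ~8 a each, fine
print(per_pin_current(575, 12, [2, 8, 8, 8, 8, 8]))  # one low-resistance pin: ~21 a on it, melty
```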
YES it absolutely is. a melting fire hazard is the central issue with these cards. i have a hard time thinking of a worse issue on compute hardware.
again, don't believe me, hear buildzoid say that the sapphire 9070 xt nitro+ 12 pin nvidia fire hazard implementation is EXACTLY the same as on nvidia cards.
and why in the world did you get likes on your comment, when it is factually wrong?
do people not do the tiniest bit of research at all?
DrSlowbro@reddit
https://old.reddit.com/r/buildapc/comments/13swywp/why_didnt_the_3090_ti_melt_plastic_like_the_4090/md5qjtw/
https://old.reddit.com/r/pcmasterrace/comments/1jsft5p/another_4090_with_burned_plug/mlt8uyq/
https://www.youtube.com/watch?v=kb5YzMoVQyw
"Current regulator" or "load balancing" would've been the more appropriate term, not voltage regulator. I am not an electrician. Nor am I watching your video that has nothing to do with anything anyone is saying.
You're not doing the "tiniest bit of research" either because I found the above after only two seconds of Googling my statement, knowing what I had read.
Like you tried so hard to swoop in and defend m'Nvidia and you just... couldn't do anything...
reddit_equals_censor@reddit
are you a bot? can you not read?
claiming that me, a person who makes sure to write "nvidia 12 pin fire hazard"
on every mention to rightfully assign blame, is defending nvidia?
are you lost? do you not know how to read?
i linked you a video from the same creator that you linked me, except that the video i linked is less than half as long.
do you even know who buildzoid (actually hardcore overclocking) is? or did you just get it linked and are repeating what it says without understanding a word of what it means?
you provided 0 actual evidence for this claim.
and a video created by buildzoid, the VERY SAME CHANNEL that you linked to yourself, is proving you wrong.
you know, things you would know if you could read and watch videos that prove you wrong...
like holy smokes. know when you are completely wrong.
fact: nvidia created and is pushing this nvidia 12 pin fire hazard.
fact: amd allowed partners to implement this 12 pin nvidia fire hazard against any sanity that should have prevailed.
fact: sapphire's 9070 xt 12 pin nvidia fire hazard implementation is EXACTLY the same as on nvidia's 12 pin fire hazard cards, and thus it is expected to melt all the same.
not me claiming this, but buildzoid pointing this out after looking at the pcb and having read the 12 pin nvidia fire hazard spec, which as he says REQUIRES it to be a single 12 volt blob at the card.
again KNOW when you are wrong.
DrSlowbro@reddit
So many words for "I was wrong and I can't admit it; please, let me be right, and you wrong.".
reddit_equals_censor@reddit
now 2 clear, easy to prove lies are getting pushed by you.
seems like either a completely clueless person, who deliberately doesn't look at video evidence that proves them wrong,
or a terrible bot.
Economy-Regret1353@reddit
The Nitro on AMD uses it too, but it just gets swept under the rug.
DrSlowbro@reddit
Because it has voltage regulators to accommodate it. You know there's an Nvidia GPU that used 12VHPWR without issue? The 3090 Ti.
It had voltage regulators that Nvidia requested be removed from the 4000 series onward.
Blackarm777@reddit
The basis of that statement is what exactly?
surf_greatriver_v4@reddit
You are the problem
RealOxygen@reddit
Elaborate
SEI_JAKU@reddit
Because gamers are the ones who happily and readily allowed Nvidia to get to this point.
DonStimpo@reddit
Gamers keep giving Nvidia money.
Between Arc and AMD there is competition now in the budget space. But Nvidia still sells more
Azzcrakbandit@reddit
I don't think they should be either, but the majority of people keep voting with their wallets. I'd venture the problem lies mainly with people buying prebuilts without really comprehending the specs.
Withinmyrange@reddit
The mfg charts are so fucking funny haha
Logical-Database4510@reddit
What really sucks is that MFG is really cool tech for what it is and in its specific use cases (very high refresh monitors in SP games....playing AAA games at 480hz is a wild experience when CPU bound to 150fps), it's just Nvidia has completely poisoned the well on its discussion and usage by trying to ram it down everyone's throat as something it very much is not.
The worst thing about it all is that MFG does use quite a good chunk of VRAM....which, uh, yeah....oops 🤷♂️
reddit_equals_censor@reddit
on the point of poisoned wells.
nvidia's marketing made it a point to NOT mention ANYWHERE that it is interpolation fake frame generation.
to the point where people with dlss4 fake interpolation frame generation thought that it was extrapolation instead, because EVERY PIECE OF MARKETING by design avoided the visuals that show interpolation, or the word interpolation.
so how will nvidia sell actual REAL frame generation with reprojection?
a technology so different that it shouldn't be thought of in the same way at all, but the well is so poisoned by now that people fall over dead from the fumes meters away.
and with reflex 2, nvidia is already working on a probably quite basic form of reprojection that just produces a single frame per source frame and discards the source frame.
so no one, including the normies, will believe a word that nvidia says about real reprojection frame generation, regardless of how amazing it will be, because of the interpolation fake frame gen bullshit marketing lies.
Plank_With_A_Nail_In@reddit
There aren't real little people inside our computers; all of the frames are fake.
reddit_equals_censor@reddit
i suggest that you do the barest minimum of research before making dumb nonsense comments like this.
what makes a frame real vs fake?
that is what you could have asked, instead of writing your nonsense.
a real frame has player input.
a fake frame does not.
interpolation fake frame generation does NOT have any player input. it is purely visual smoothing.
as a result it is indeed a fake frame.
the method used to create the frame doesn't matter; what matters is that it has 0 player input at all.
in comparison, reprojection REAL frame generation has, at bare minimum, camera input that gets used to reproject the latest frame to create a new real frame.
maybe read an article about this, vs spewing nonsense?
https://blurbusters.com/frame-generation-essentials-interpolation-extrapolation-and-reprojection/
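and if you want the distinction spelled out in code, here is a tiny toy sketch (python, names invented by me, nothing like a real implementation): interpolation only blends two frames that were already rendered, while even the crudest reprojection shifts the latest frame by however far the camera has turned since it was rendered, i.e. it actually consumes fresh player input.

```python
import numpy as np

def interpolate(prev_frame: np.ndarray, next_frame: np.ndarray, t: float = 0.5) -> np.ndarray:
    # blend two already-rendered frames. no new player input is sampled, and
    # next_frame has to be held back until the blended frame is shown, which is
    # where the added latency and the "fake frame" label come from.
    return ((1.0 - t) * prev_frame + t * next_frame).astype(prev_frame.dtype)

def reproject_yaw(latest_frame: np.ndarray, yaw_delta_px: int) -> np.ndarray:
    # crudest possible reprojection: shift the latest rendered frame sideways by
    # how far the player's camera has turned *since* that frame was rendered.
    # real implementations do a depth-aware per-pixel warp on the gpu and fill in
    # the exposed edge; np.roll just wraps it around for the sake of the toy.
    return np.roll(latest_frame, shift=yaw_delta_px, axis=1)
```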
tukatu0@reddit
I don't agree with you on this. I would extend the definition from just lacking player input to "the system does not interact with the game itself". The implications of what kind of artifacts/experience you could get are different. Reprojection is still not a replacement for full render. Even if we are better off with it than without.
tukatu0@reddit
I severely doubt anyone who can understand this stuff would think extrapolation.
I think you overestimate how good async is. It is still going to be a pick-your-poison situation, whether it's Oculus' solution or some kind of extra hardware. I don't really want to type it up coherently, but I think you should get a Quest 3 to experience it.
Strobing has its upsides and downsides. https://blurbusters.com/the-stroboscopic-effect-of-finite-framerate-displays/ A good upside of strobing (at least with the CRT beam simulator) is that it gets your input lag down to the panel's native lag. Got a 480hz OLED? 2ms lag baby.
Okay, never mind: https://blurbusters.com/frame-generation-essentials-interpolation-extrapolation-and-reprojection/ chief Mark Rejhon thinks very highly of reprojection (async). Since it's so amazing, Nvidia will not have to say anything. People will immediately understand the massive boon.
reddit_equals_censor@reddit
oh, not only did people think that it was extrapolation due to nvidia's marketing, but they were so confident that they made reddit posts about it :D
https://www.reddit.com/r/pcgaming/comments/1i50sk6/why_does_almost_everone_think_dlss_3_4_uses/
it is a fascinating reddit post, because it shows the glory of nvidia's marketing team.
not only did people believe that it was extrapolation, but again, they felt confident enough to make reddit posts about it to convince others that it is extrapolation :D
and nvidia's marketing is so deliberately misleading/reality-hiding that you have people in the comments actually unsure about this, because yeah, nvidia designed it to not show any evidence of interpolation.
while amd is creating bad, ps3-racing-game-looking raytracing demos that look laughable (presentation a few days ago by amd),
nvidia marketing makes people question reality :D
just fascinating to look at, so i thought i'd share it.
i mean i don't have to.
the blurbusters article you pointed out talks about the comrade stinger desktop demo.
click on the link to the comrade stinger video and download it from the "download here" version, as this should be the best version of the demo.
tukatu0@reddit
Good god. I have seen the capabilities of casuals once again.
Originally I said Quest 3 since it's the only one actively used. Unfortunately I closed my browser and deleted the stuff I wrote. The cheaper way would be GeForce Now, but eh....
I wonder if the marketing really is that strong. PlayStation engineers were enamored with interpolation, demanding AMD work on a solution, when in reality AMD should have been working on this a long time ago. But since they lack the vision and leadership...
reddit_equals_censor@reddit
part 2:
and well, i mean, it works on desktop. it does turn an unplayable 30 fps experience into a fully responsive 60 or 120 fps experience, for example.
definitely test the demo yourself. very basic, but enough to show you that it is amazing to turn broken 30 fps into a fully playable experience.
nvidia themselves actually did some testing a while back on people's performance with reprojection vs having roughly the latency from game streaming, i believe. and having reprojection massively increased player performance at least.
and the best thing is that we are not at a dead end with reprojection. interpolation will always be shit in the ways that it is shit. it can't create real frames. it always adds a ton of latency. it doesn't have player input, etc..
reprojection however can get massively improved, even though a basic depth-aware implementation would already be amazing to have. as the article points out:
so in the future we should have advanced reprojection real frame generation that is depth aware, has ai fill-in for reprojection artifacts (edge pixel extension is enough for now though, even nothing is still fine), uses positional data for major moving objects, and locks to your monitor's max refresh rate.
a bright future, free from moving-object motion blur, with amazing performance and responsiveness. and like you said, the chief blur buster himself sees it as the future.
i'd love to see an advanced demo with AAA graphics coming out instead of just comrade stinger's demo + vr reprojection, of course.
it is crazy how many resources get thrown at interpolation fake frame gen compared to reprojection real frame gen. well, nvidia is just going to use it to reduce latency, but not increase frame rates... thus far, and that isn't even released yet (reflex 2).
chapstickbomber@reddit
Naive frame interpolation can get latency as low as half a frametime plus gen time, so like 5-6 ms at 120 fps base, and only like 9-10 ms at 60 fps. Like, we're talking half the latency introduced by increasing the flip queue by 1.
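Roughly where those numbers come from, as a quick sanity check (the ~1.5 ms gen time is just my assumption):

```python
def added_latency_ms(base_fps: float, gen_time_ms: float = 1.5) -> float:
    # naive interpolation holds the newest real frame back by about half a
    # frametime so the blended frame can be displayed in between, then pays
    # the time it takes to generate the blend itself.
    return 1000.0 / base_fps / 2.0 + gen_time_ms

print(added_latency_ms(120))  # ~5.7 ms
print(added_latency_ms(60))   # ~9.8 ms
```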
Ilktye@reddit
nah the best part is AMD is going to do exactly the same, but because nVidia did it first, AMD gets a free pass.
RampantAI@reddit
I feel bad for the engineers at Nvidia who know that marketing is distorting and misrepresenting their (still amazing) tech. For example Nvidia wants to compare DLSS + Framegen + Reflex vs native, and say that Reflex "makes up" for the latency as if that's a fair comparison. And there must be some top-down banning of the word "interpolation", despite that being the most accurate way to describe the tech.
Jordan_Jackson@reddit
Yeah, I have nothing against MFG. It’s legitimately a cool tech and I can see it only getting better with each iteration.
What I have a problem with is how Nvidia marketed their cards this generation and compared these cards' performance results to cards that either did not have the same level of MFG tech or had none whatsoever.
Just be honest about your benchmarks. Tell us that this is the performance to expect but it is using a certain feature set. It’s not like people still won’t buy their hardware. Instead, they make bogus claims and act all kinds of shady when there was zero need to do so.
Swaggerlilyjohnson@reddit
The fact that they didn't let them put the 4060 in there, just because it would get some framegen and then they wouldn't be able to make the 5060 look good.
Like, they literally demanded you don't compare it with the card it's replacing in your review. That's actually incredible.
mrandish@reddit
NVidia must have known pushing this far over the line was going to blow back majorly on them, which has me pondering why they decided to do it anyway. I don't think the answer is simply that "they don't care about the gaming market compared to AI and data center products." While that's certainly true, it's a little more complicated than just that. If they didn't care about sales of gaming cards, they wouldn't have gone to this level of effort to manipulate launch reviews. They would have just done reviews on auto-pilot mode and gotten on with more important things.
No, I think the fact they "don't care about gaming card revenue - especially low-end cards" is only a precursor here. That attitude definitely caused them to seriously shortchange the 5060 on capability AND it caused them to not manufacture very many 5060s in the first place. They never intended it to be a good value or a huge seller. They knew that most gamers who were paying attention would quickly determine that the lower end of the 5000 series would be a "skip it" generation for anyone who had a choice and cared about value.
I think the reason they suddenly started devoting significant effort toward manipulating the 5060 reviews is that they realized the reduced capability 5060 turned out to deliver EVEN LESS performance per dollar than the lackluster "meh" they were shooting for. And that matters because they realized they may not even be able to sell through the vastly reduced number of 5060s they've manufactured and have to take markdowns on sitting inventory. I suspect THAT is what created the sudden urgency to do whatever it takes to sell through the modest number of 5060s.
Icy-Communication823@reddit
You're giving corporate psychopaths WAY too much credit. These types of people will ALWAYS get to a point where they cut off their own nose to spite their face.
Their narcissism is so great, they will continue to blindly crash ahead - oblivious to the fact they're racing to their own doom.
hackenclaw@reddit
yep, look at Disney's top execs and how they keep ruining their franchises.
Someone at the top of Nvidia must have recently gotten promoted to lead this kind of marketing. lol
Champeen17@reddit
Exactly, and as the Nexus Hardware video alluded to this came from the top executives.
Icy-Communication823@reddit
Corporate culture doesn't just make psychopaths - it fucking attracts them!
aminorityofone@reddit
There is this weird disconnect on this subject. People will think AMD or Intel are above these shenanigans. It even exists in this sub, where people are more educated about tech than most other places.
Icy-Communication823@reddit
Every company with a corporate structure is IDENTICAL.
zakats@reddit
I wish more people understood this.
pdp10@reddit
For a number of years the discrete GPU market has been anything but normal, causing many to have skipped several generations. A few skips in a row, and someone could end up in a position where they feel they need to acquire something, even if they don't like their choices at the moment.
And then the situation with newly-released titles. I felt that different studios having their own in-house engines was a healthier ecosystem, but lack of performance wasn't one of my original concerns.
marcost2@reddit
What blow back? The reviewer outrage? This thread?
In a couple of months this subreddit will be filled with 5060 pictures, because in reality no one actually cares, they will buy green no matter what.
I mean we are 10 years into the joke "AMD please compete harder so my GeForce card is cheaper" and it still hasn't changed. We got fucked with Kepler, with the titans, with the 970, with the drivers, with the 2000 series and with GPP and still everyone here will go and buy Jensen's GPU like good little doggies.
Why not just make the most profit if it's gonna sell anyways? They have this entire thing down to a science
aminorityofone@reddit
What's the alternative? Intel, AMD? This isn't to say AMD or Intel are bad, but they are almost always sold out. Nvidia has stock. Same with OEMs: when you go to Walmart and want a gaming PC, there isn't a single AMD card. LTT did that secret shopper series and only one company (Starforge) suggested AMD; none of them suggested Intel.
AzorAhai1TK@reddit
"and still everyone here will go and buy Jensen's GPU like good little doggies."
I mean what do you expect people to do? I'm not going Intel or AMD and gimping myself just to stick it to Nvidia. They're greedy as hell like any other company but at the end of the day I'm getting the best GPU for myself regardless of the noise.
marcost2@reddit
Idk man, my AMD GPU plays games perfectly fine, just like the previous ones, my friends who do video editing say they edit video perfectly fine, and even the AI researchers that ordered a couple of MI210s say they do GPU things like every other GPU in existence.
Like, I'm not talking about the top-end class, Nvidia is alone there, but those sell in pitiful numbers. And in the trenches? Especially outside the US? You could get a 7800 XT for cheaper than a 4060 last gen in most countries I know.
But hey, buy green no matter what
scytheavatar@reddit
The answer is because Nvidia hates to lose and right now they kind of are in a losing streak when it comes to Blackwell. What we are seeing is Nvidia trying to stop the bleeding.
Strazdas1@reddit
How do you measure this "losing"?
Ilktye@reddit
Well Reddit dislikes them, so that's a major L /s
Mean-Professiontruth@reddit
What losing streak? Show me evidence of that other than Reddit posts and upvotes
only_r3ad_the_titl3@reddit
The hardware community is such a big circlejerk lol, led by some biased youtubers.
Economy-Regret1353@reddit
Some PC subs can't even handle the words "Steam Survey", wait till they see data centers and productivity workloads.
Strazdas1@reddit
I think the answer here is the same as it was with game reviews about a decade ago. They realized that the reviewers don't have anywhere near the clout people imagine them to have and decided to fuck them over, because the average customer will never watch a review anyway.
tukatu0@reddit
The actual answer may be related to tariffs.
I am not sure this disdain towards the consumer is that significant. It's very foundational to the elitist culture PC gaming has. Just look at this from one day ago with 6k votes: https://old.reddit.com/r/pcmasterrace/comments/1ks3bdr/this_sub_for_the_past_week/mticbjg/ Nvidia built this culture not just inside but outside, through the forums of the 2000s.
Jokes from over 10 years ago still apply. "I remember we used to have graphics like this. And then my dad got a job" - Battlefield Friends, "PC Elitist".
Back to the main topic. I think they realized the 50% tariffs or more are an actual threat. We already had 25% in 2021. Why wouldn't the tax man do what he said he was going to do? So when you suddenly have $500 5060s or more for an anti-consumer product, it could be truly catastrophic to the business and industry.
max1001@reddit
Still gonna be the best selling card tho. It's what is going to be inside the $800 Walmart prebuilt.
king_of_the_potato_p@reddit
Yep, exactly and prebuilds far outsell DIY home builds.
b_86@reddit
Yup, this is the main point. Intel is still "winning" in Steam hardware survey charts but that doesn't mean they're in a remotely good position.
Living_Morning94@reddit
Are there any reviews which compare the 5060 with Strix Halo?
Lendol@reddit
Anyway, the 5060 8-gig will be the most popular new card on Steam surveys next year. Consumers just don't know about this stuff and probably won't care if you tell 'em.
ipSyk@reddit
The wake-up call was the GTX 970.
Quatro_Leches@reddit
Watch the 5060 be the best selling card this gen
PT10@reddit
Not a hot take that potentially the cheapest and most widely produced card will sell the most
king_of_the_potato_p@reddit
The x60 card is the most common one inside prebuilds and prebuilds outsell DIY builds, so yeah it always will.
Ilktye@reddit
Yes, well, luckily AMD is totally not jumping on the same bandwagon with MFG... oh wait.
xole@reddit
You have to be shitting me. That's nuts.
ResponsibleJudge3172@reddit
It's faster on average. Matching the 4060 Ti.
FembiesReggs@reddit
Should be: yes.
Will be: no.
Beginning-Zord@reddit
Yeah, a lot of people only want that shiny "RTX" inside their glass panel lol.
Shakzor@reddit
most people don't even know what that even means
they just want a PC, ask someone "can it run GAME?", and when they ask a store employee they get recommended something that the store needs to get rid of
DrSlowbro@reddit
Nintendrones are just head over heels for anything Nintendo. It doesn't matter if it's good.
obthaway@reddit
wake up call for people to go buy more nvidia cards for sure xd
drnick5@reddit
None of this matters, unfortunately... Nvidia doesn't give a shit about gamers, it's very clear. Why sell gamers a $2k card when they can throw some more RAM on it and sell it for $10k?
Gamers have been getting fucked since 2020. And there is no end in sight.
HustlinInTheHall@reddit
If a product isn't going to be reviewable until launch date, you should be buying it to review. Relying on cherry-picked loaners that have been QA'ed to death before they ever get to you is not any more reliable than charts provided by Nvidia that you have to publish.
When you accept loaners you accept strings.