GEFORCE RTX 5060 Review - Nvidia Didn't Want You to See This
Posted by styxracer97@reddit | hardware | View on Reddit | 285 comments
https://youtu.be/2e1a2-VxxvQ?si=RG5vu9b9cysipnCC
Firefox72@reddit
Always cool when a new xx60 product doesn't even consistently beat the previous gen xx60ti let alone the xx70.
FinancialRip2008@reddit
it's a 3070 with frame gen. imagine if you got a 3070 at msrp 5 years ago, you coulda saved yourself $200 just by waiting until the 5060 dropped.
grizzly6191@reddit
I picked up an Asus 3070KO Jan 2021 (4 Yr Ago) for $569 which is equivalent to $697 today, a little less than what I paid for my 9070XT Taichi ($730).
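The inflation adjustment implied by the two figures above can be checked with a quick sketch (just the ratio of the quoted numbers, not official CPI data):

```python
# Implied cumulative inflation in the price comparison above.
paid_jan_2021 = 569     # Asus 3070 KO, as quoted
today_equiv = 697       # commenter's inflation-adjusted figure
implied_inflation = today_equiv / paid_jan_2021 - 1
print(f"{implied_inflation:.1%} cumulative since Jan 2021")  # ~22.5%
```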
kikimaru024@reddit
Because we were all buying cheap GPUs in 2020...
Calm-Zombie2678@reddit
I bought a cheap 1650 waiting for the 30 series to drop when I built my PC in mid 2020
I still have that and it's still worth more than I paid for it lol
There's no upgrade that's worth the price yet, I'm yet to throw a game at it that it can't run with reduced settings and 90% of my gaming time is spent on civ 5 so I can wait a bit longer. Fuck nvidia
Daneth@reddit
If you can get a 5090fe for MSRP it feels super worth it. Literally none of the other cards are a good deal.
Calm-Zombie2678@reddit
If I bought that it would be the most expensive thing I owned...
Beige_@reddit
Sold both of your kidneys already I see. If not, that could be the route to a GPU upgrade.
Calm-Zombie2678@reddit
I'm trying to get there selling blood and sperm
Not mixed together, obviously
sonofnom@reddit
I bought a 4080 at slightly more than MSRP about 2 years ago and thought I would regret it until I saw the 50 series launch
lemmiegetafugginuhhh@reddit
I have a 1650 Super OC 4GB, an EVGA 980 Ti 6GB and a Radeon Vega FE 16GB. The 980 Ti is the best, but the 1650 keeps up at like half the power consumption. The Radeon card can't even get driver updates, so I'm kinda disappointed with that one. Haven't had a chance to really push it.
PartisanSaysWhat@reddit
I bought my 2070S in April of 2020 for $450.
I was told then that I was crazy.
I still haven't upgraded lol
icemantx69@reddit
SAME! Mine was in March, and I bought one for my kid and one for me.
PartisanSaysWhat@reddit
Admittedly I am less into PC gaming than I thought I would be going into the end of the world pandemic times... but its still.. fine?
I'm on a 1440p 144hz monitor and I just turn some settings down to get to 60+ fps. i7-10700k still going strong. Maybe I'm just older now but I dont feel the need to upgrade like I used to.
FinancialRip2008@reddit
whole gpu market is a case study in stockholm syndrome
Strazdas1@reddit
inelastic demand is hardly a new concept.
SagittaryX@reddit
Well, 2020 yes. Most of the madness was 2021.
IgnorantGenius@reddit
And half the power usage. With mods you can basically add framegen to a 3070 with any game that supports framegen. So, the 3070 is still the better card other than power draw.
Vb_33@reddit
With mods you can add MFG x4 to a 3070. Take that Nvidia!!
ThinkinBig@reddit
Don't even need mods, just use Lossless Scaling, which works on any game
StickiStickman@reddit
... no? Not really?
__laughing__@reddit
You can run fsr3 on any gpu from the Nvidia RTX 2000 series and newer, and on the Radeon RX 5000 series and newer.
FinancialRip2008@reddit
is that the in-game version or the driver level version?
that's really cool if it's the in-game version, that implementation is very good.
the driver level version cuts out when you whip the camera around, and there's artifacts around the HUD. i think it's still worth it for games locked to 60fps (or to max out my monitor without dumping heat in the room), but i can see why people might hate it.
cowoftheuniverse@reddit
Afmf 1 does cut out in movement, but the newer afmf 2 does not. AMD only tho.
chapstickbomber@reddit
I use AFMF2.1 on basically everything non-FPS (and even some of those). 120fps cap smoothed to 240fps and the latency is just not an issue at all. It is literally half a frame time plus the framegen time, so maybe like 6-7ms but you can perceive objects in motion with better fidelity/precision so it feels better.
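The latency estimate above works out under its stated assumptions; a minimal sketch (the interpolation cost below is an assumed placeholder value, not a measurement):

```python
# Rough added latency from interpolation-based frame gen at a 120fps base cap.
# An interpolated frame is displayed roughly half a base frame "late",
# plus the time needed to compute it (assumed value below).
base_fps = 120
frame_time_ms = 1000 / base_fps        # ~8.33 ms per rendered frame
half_frame_ms = frame_time_ms / 2      # ~4.17 ms display offset
framegen_cost_ms = 2.5                 # assumed interpolation overhead
added_latency_ms = half_frame_ms + framegen_cost_ms
print(f"~{added_latency_ms:.1f} ms added")  # ~6.7 ms, in the quoted 6-7 ms range
```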
varateshh@reddit
I have a 3080. I have not seen FSR3 framegen implemented in such a manner that it does not feel awful. Awful input lag and I would hesitate to use it for anything except a total war game.
frostygrin@reddit
You need to use Reflex to minimize lag. Force it with SpecialK or RTSS if necessary.
FinancialRip2008@reddit
never tried it on a 3080, so i hesitate to speculate on why it's not the same. interesting that the experience sucks though; that's too bad.
MonoShadow@reddit
Nukem FSR mod (FSRisbetter or something) and Optiscaler allow you to swap DLSS FG for FSR3 FG in any DirectX game which supports DLSS FG. Vulkan is a different topic. It often works, it sometimes doesn't.
FSR3 FG is actually not too bad. It's not as good as DLSS3, but it's not as bad as DLSS3 SR vs FSR3 SR. nVidia users can also use Reflex and DLSS SR in those titles as well.
Vb_33@reddit
You can do FSR FG on 10 series cards.
__laughing__@reddit
I did upscaling on my 1650s but I didn't know cards that far back could do frame gen! that's pretty neat.
airfryerfuntime@reddit
https://notebooktalk.net/topic/2330-dlss-3frame-generation-double-framerates-mod-for-3xx-series-graphics-cards/
StickiStickman@reddit
Fair enough, I didn't know FSR 3 Frame Gen actually is usable on a 3000 series card.
Vb_33@reddit
Funnily enough, 3070s are valued at around $280-$350 USD and 3070 Tis at $340-$400. So it seems consumers are plenty happy buying a used 5-year-old power hog like the 3070 for the price of a new 5060 with warranty, DLSS 4, RR, FG and MFG.
Eagle0913@reddit
Since when was the 70 class ever considered a power hog? Huh?
Vb_33@reddit
Since it was compared to the 5060 (145W vs 220W) which offers similar performance. Power consumption is relative.
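As a quick check on the wattages quoted here (a minimal sketch using only the numbers from the comment):

```python
# Relative power difference between the quoted TDP figures.
p_5060 = 145   # watts, as quoted above
p_3070 = 220   # watts, as quoted above
increase = (p_3070 - p_5060) / p_5060
print(f"3070 draws {p_3070 - p_5060} W ({increase:.0%}) more than the 5060")  # 75 W, 52%
```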
Eagle0913@reddit
Why not just say something like that instead? I feel like we should always strive to not use sensationalist wording. 75 Watts difference is worth mentioning for sure, but "power hog"?
Pub1ius@reddit
I don't understand why power consumption is even a factor for most people. I pay about 12 cents per kwh for electricity. Let's say I run a 1000 watt gaming rig at max consumption for 4 hours every single night of the week. That's $3.36 per week, which may as well be 0 to me.
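The weekly-cost arithmetic above works out as claimed; a quick sketch:

```python
# Weekly electricity cost of a gaming rig, per the figures above.
power_w = 1000          # assumed full-load draw of the whole rig, watts
hours_per_day = 4
days_per_week = 7
price_per_kwh = 0.12    # USD
kwh_per_week = power_w / 1000 * hours_per_day * days_per_week  # 28 kWh
cost_per_week = kwh_per_week * price_per_kwh
print(f"${cost_per_week:.2f} per week")  # $3.36
```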
Eagle0913@reddit
I completely agree. Thats why I was so confused by their sensationalist wording
inyue@reddit
It's like 60% power consumption increase. It's not sensationalist.
Xpander6@reddit
5060 is more like 125W, not 145W. 3070 can get up to 240W.
Sh1rvallah@reddit
3070 can easily undervolt to around 175W and likely gain performance in the process. IDK if you'll be able to tune the 5060 that hard.
Xpander6@reddit
Both can be undervolted, so no sense in comparing one undervolted vs one at stock.
crshbndct@reddit
Power hog?
75w is more, but not substantially so. I just switched from a 110w to a 380w GPU and it’s not made a difference to my room heat or anything. I just turn the AC up if it gets warm anyway
Anatharias@reddit
Given that frame generation is software... it's a paywall. My 3090 performs damn well with frame generation mods
an_angry_Moose@reddit
Lmao
popop143@reddit
Never expected my 6700 XT to be competitive to an Nvidia -60 class card that's 2 generations newer.
Pub1ius@reddit
And you have 4GB more vram :D
an_angry_Moose@reddit
Member when every generation basically ensured that the step down card of the new gen would beat the old gen? There were a number of generations I remember the new xx70 basically matching the old top mainstream consumer card (xx80 Ti).
Had no idea how spoiled we were, especially for the prices we used to pay.
Vb_33@reddit
Yea that was when die shrinks meant something and the cost of transistors actually went down.
jnf005@reddit
Even when Nvidia was stuck on 28nm, they delivered a massive upgrade from Kepler to Maxwell, so they can do it without a die shrink. They are just in a much more favourable position now because of server, HPC and AI, and their massive lead in feature set lets them squeeze out so much more.
AttyFireWood@reddit
Nvidia also released Kepler twice, with the 600 series and the 700 series (aside from the 750/ti). Refreshes were totally a thing. I don't see the 50 series as much more than a refresh of the 40 series.
an_angry_Moose@reddit
Definitely the golden age of pc building.
Constant-Plant-9378@reddit
I'm just sitting here with my RTX 4060 ti 8GB and perfectly happy.
Xpander6@reddit
As long as you don't play any of the games in which 8 GB isn't enough.
Constant-Plant-9378@reddit
Haven't encountered one I care to play yet.
Strazdas1@reddit
He could play those games and still be happy by... lowering settings.
Strazdas1@reddit
But according to reddit your games dont run, your computer is on fire and your life has ended with that purchase.
Constant-Plant-9378@reddit
Just like a lot of people driving $80K pickups on ten year notes, while carrying $30K in credit card debt, far too many folks simply have no concept of 'good enough'.
Wardious@reddit
Yes, 5060 is not fast enough to upgrade from my 3060 ti
Mayjaplaya@reddit
Not even from my regular 3060.
DavidsSymphony@reddit
Bruh the 5060ti still can't beat my 3080, 54 months later. I remember buying a 1060 that matched the previous gen's 980. The GPU market is terrible. Barely any uplifts gen on gen and even leaping a gen.
jecowa@reddit
No wonder that nVidia didn't want people to review it.
yellow_eggplant@reddit
It's literally a 4060 Ti for $100 less. lol
styxracer97@reddit (OP)
Well, at least my 3070 still has some life. Lmao
THE_GR8_MIKE@reddit
My 3070 ran Doom Dark Ages on the highest settings with like 70% usage. Ran like butter. DLSS brought the usage down to 30%. I'm not even mad I couldn't get a 9070XT anymore lol
dbcanuck@reddit
in 2025 the 3070 is a 1080p card. the vram limitation at 8gb really doesn't factor until you try to push 1440p.
yellow_eggplant@reddit
The fact that it matches your 3070, a 2-gen-old, 5-year-old card... Jesus man, no uplift at all.
Maybe I got spoiled by stuff like the 1060, which was 10-20% faster than a 970 and like 50-100% faster than a 770 (depending on memory limitations) lol. Man, GPUs suck nowadays.
Vb_33@reddit
You got spoiled by Moore's law, hope you enjoyed it while it lasted; shame the laws of physics are real and all that.
dbcanuck@reddit
while this is true, deliberately gimping the card by holding to the same VRAM for 3 generations in a row is largely to blame.
i7-4790Que@reddit
Shame you're trying so hard to sanewash the sorry state of the dGPU market. Absolute fool's errand.
You made 22 posts in one thread, which is over 10% of total comments posted at time of mine (it takes <10 seconds to CTRL-F out who's clearly got nothing better to do when you see the same user so many times in almost every comment seemingly posting the same mindless flavored contrarianism)
Whatever it is you got going on, it's terminal.
Vb_33@reddit
I think you need to do some self-reflecting. Your comment is so aggressive and emotionally charged, but over what exactly? Video game chips being more expensive? A redditor with a top 1% commenter badge making lots of comments? You can just ignore my posts, nobody is forcing you to read everything I say. Lord knows I do the same for others.
If video game chips getting more expensive is so detrimental to your emotional state and well-being, then honestly you might be better off finding a different hobby to spend your time on. Last I checked, hobbies are supposed to provide fun and satisfaction, not frustration and misery.
yellow_eggplant@reddit
That's nice. If Moore's law is dead, I wish Nvidia (and AMD as well) would price their cards in accordance to the gains, and not attempt to charge twice the price for half the performance gains.
ShadowRomeo@reddit
The 3070 is still strong even by modern standards; it practically matches the PS5 Pro according to a bunch of Digital Foundry testing, but the 8GB of VRAM didn't let it age as gracefully as it should have.
But if you are playing at 1080p with optimized settings it is still a good GPU, and I would argue still worth keeping to last through this shitty current generation.
rustypete89@reddit
Bought a 6650 XT used for $220 today to free up my old 3070 for an ITX build I'm going to be hooking up to the living room TV. I'm confident, with DLSS, that the card will be able to do a decent job at the task (it's a 4k screen), but the VRAM is definitely a bit worrying. Not going to be doing any super recent games on it, Hogwarts Legacy or Dead Space remake would probably be the newest, but I'm still expecting I'll have to drop texture quality more than I'd normally want to. We'll see.
The 6650XT is replacing it in a PC I built for my gf using an old 1440p monitor. She doesn't game very much and they are close enough in perf that I doubt she'd even notice the difference.
The 3070 is a great card, excited to be putting it back to work after upgrading from it in February.
yellow_eggplant@reddit
The 3070 is a 5 year old, 2-generation old card. The 5060 should be compared to the 4070, which stomps it. This runs counter to previous x60 cards which used to beat previous gen's x70 cards and could get within touching distance of previous gen's x80 cards. (Or in the case of the 1060, outright beat it.)
GPUs suck nowadays. The fact that Nvidia offered 12GB of VRAM on the 3060 and hasn't offered it on this or the 4060 is laughable. Then again, the 4060 sucked and should have been a 4050 Ti or 4050.
Plank_With_A_Nail_In@reddit
The 2060 Super was almost as good as the 2070 back in the day.
yellow_eggplant@reddit
And the 20-series was considered a bad gen back then because it wasn't as huge an uplift as the 10-series was.
Now, I WISH we got that uplift for these cards lol
HayabusaKnight@reddit
Eh we had periods of duds like this from Nvidia and ATI/AMD at times further back too, like certain Geforce 4 cards and Radeon 9000 cards being rebadged previous gen GPUs then if you waited you got 'lucky' with the FX 5000s being your next option lol. RIP to everyone who got sold an FX 5200 in some ibuypower combo.
It's just that we've never had this wild, out-of-control price increase before; if a card or an entire line sucked, it would get discounted instead. Nvidia GPUs are now in their Intel i7 arc, and I hope the same thing happens to them.
Schmigolo@reddit
The problem is that the last 4 gens had 3 of these kinda duds across the board.
Capable-Silver-7436@reddit
2080ti gang feeling confident
shugthedug3@reddit
I love my Titan RTX, it's proving hard to find anything to upgrade to...
velociraptorfarmer@reddit
Guess I'm hanging onto the 3070 that I bought used for $330 4 years ago...
Vb_33@reddit
You bought a $330 3070 used in 2021?
velociraptorfarmer@reddit
Just double checked because I couldn't remember for sure, looks like it was November of 2022.
inyue@reddit
Exactly one month after ETH could no longer be mined xd
viperabyss@reddit
....so that's good, right?
You get the same performance for less.
tukatu0@reddit
Way less than anything before 2022. The 4060 should have been at 3080 levels, and the 5060 would be at least at 4070 Ti Super levels. If this gen came out even just 4 years ago, the 5070 Ti would have been the 5060 for $300.
Yeah. I hope you can understand why people are very disappointed. It's the new reality though. All you can do is suggest people get a console for gaming and not think about graphics anymore. Otherwise you pay up.
UsernameAvaylable@reddit
In an ideal world where nothing ever stops node scaling and we still have exponential efficiency and density growth.
Now wake up to the real world and realize that stuff from a decade ago will never happen again.
tukatu0@reddit
That's right. So like I said: get a console and don't think.
Vb_33@reddit
Consoles are going through the same problem; see the PS5 and PS5 Pro, and the Xbox Series consoles. This console gen is the worst we've had in terms of pricing, performance and cost reductions. Next gen is going to be even worse; rumors are the PS6's chip may use Zen 6 and UDNA while being produced on TSMC N2. If you think costs are out of control for the PS5 and PS5 Pro now, just wait till next gen.
On top of this, the console userbase isn't growing anymore because kids don't want to game on consoles; they game on phones and tablets, and sometimes they graduate to PC following their favorite Minecraft, Roblox and Fortnite YouTubers. Companies are then incentivized to grow their profits by making more money per user rather than by selling their games to new users. PC gaming, on the other hand, continues to grow and is larger in size. I don't think consoles are the endgame anymore; console manufacturers will only clamp down harder (more monetization and limitations) now that they know they can't grow the market anymore.
tukatu0@reddit
Prices aren't going to be as bad as you think. Realistically not much, if any, more raster than the PS5 Pro, with whatever path-traced capabilities Sony wants.
I don't agree with your second paragraph at all. They spend billions collectively chasing dragons like Fortnite. Making a billion once a decade is not good enough; they want 10 billion, just like GTA 5 and Fortnite.
Hmm... I don't want to work for free with solutions, so I'll leave at least this: kids play those games because they are free. That's it. Nothing more. They don't know to look for things they don't know exist. They game on tablets because their parents bought one off credit along with their phone plan for $15 a month, giving their 3-year-old a way to shut the f""" up long before they can even read https://youtu.be/SsIuGUPmNMI
It's not a good thing. Nor is the unethical, abusive relationship some corporations have with targeting children. Fortnite is a giant f"""ing billboard that abuses the attention mechanics of children. I'm amazed they haven't been sued yet. Hiding the word advertisement under the word collaboration sure does a lot.
I guess targeting children is just like how Roblox's business revolves around stealing the labour of content creators (mostly children). (Just ping me if you would like some evidence about this one.)
Something something, free will, just click off. You tell me if the first video looks like free will. Is it right or fair that the parents have no idea that not creating a third space for their children will lead them to those things? Thousands of hours of a kid's precious, limited life, when they only have like 5000 hours of being a kid, spent on something they might not realize they don't even care about because they have nothing better.
This comment is a bit long. I have way more thoughts, but they could belong in a courtroom for all I know. Or a dissertation. ... Wrote a bunch of stuff, but it's just free work for someone who might not care about the actual end user. I already f"""ed up once helping some sort of person with business in cinema. Deeply regret it.
Vb_33@reddit
It doesn't matter why kids are gaming on x rather than y; what matters is that that's the trend, and it's getting worse with gen alpha kids. Markets need new blood or else they shrink and collapse. If new kids aren't playing Double Fine-style point-and-click adventure games, then the market can't sustain devs making point-and-click adventure games, so we get fewer and fewer, then big companies stop making them, and then we barely get any new ones at all (which is what actually happened with the point-and-click adventure genre).
It's just economics, and it's going to affect everyone. Point is, the long-term trend for consoles isn't as rosy as it was during the PS2 or even PS4 era, so one should be careful building their digital libraries there.
tukatu0@reddit
Sure.
If you listened to what I said, it should be pretty damn easy to see how it could all go nowhere besides courtrooms eventually. On a separate note, good luck to Nintendo.
Aside from that, unless you think consoles will disappear in the next 10 years, it's insignificant to suggest being careful of where to build your libraries. That's what discs are for anyway. It's no better off than Steam, whose head people are in their 50s already.
only_r3ad_the_titl3@reddit
Lol delulu
HavocInferno@reddit
The 4060Ti already wasn't a good deal during its production life, and now this 5060 only matches the worse version of the 4060Ti.
And it's still a bad deal, because while it improves perf/$ over the 4060Ti, the improvement is less than you'd typically expect for this time frame, and the baseline it improves upon wasn't good to begin with.
It's technically "good" in a vacuum, but really just "somewhat less bad".
Strazdas1@reddit
There is no typical expectation for a timeframe. It all depends on whether technology improved or not.
HavocInferno@reddit
Sure there is an expectation. It's a trend set by previous generations. Are you just going to ignore the last 20 years of consumer hardware/GPUs when evaluating whether a new generation is as expected?
But even if somehow you don't do that...clearly technology has improved more than this card suggests. Nvidia keeps expanding the upper tiers, but those gains don't translate at these entry level SKUs.
This 5060 has trouble consistently reaching parity with 4060Ti and beating 3060Ti, let alone actually distancing itself from either. The x60 and x60Ti classes have been treading water within a few percent of each other for three generations now. Meanwhile compare x70 and up from 30 to 40 to 50 series.
Strazdas1@reddit
Well, if you have such an expectation, it is a false one. And there is no clear trend set by previous generations; there are many ups and downs in GPU history.
The 5000 series uses the same node as the 4000 series. Chip technology thus remains unchanged, and lackluster changes are expected.
There are no easily picked fruits like that anymore.
Vb_33@reddit
Moore's law is dead; your time frame expectations are outdated. You're never gonna get 10 and 30 series price-to-performance gains ever again, just like CPUs will never have the clock speed and IPC gains they used to have in the previous century. It's over and done. Move on.
Cute-Elderberry-7866@reddit
I guess, but technology used to have much larger advances. A lot of this feels like Nvidia not having enough competition, but some also feels like technology is slowing.
The good side is that products stay relevant longer. Well, except for artificially low RAM on new products; that's the catch. Nvidia promises neural compression, but we've yet to see anyone actually use it, let alone enough developers using it for it to ever matter. Could be 4+ years away from real use, but it does promise better textures for the memory usage.
viperabyss@reddit
Technology used to benefit greatly from die shrinks, which is what they have extracted the most performance from.
That time is already long gone.
I mean, how is this different from the first generation of DLSS? First it's a technology platform, and then developers start looking into it and adopting it.
Vb_33@reddit
Every time this happens we whine, and then somehow praise that gen of cards in the future. There are people in this thread praising the 2080 Ti, when that card was the first modern example of "muh gains"; the 20 series was lambasted for having features you couldn't use at launch, price increases, and meager gains compared to the previous 2 gens. The 50 series is in a similar predicament, except MFG and DLSS 4 worked from day 1, and we live in a post-Moore's-law world using the same node as the 40 series.
Strazdas1@reddit
thanks to Nvidia's early thinking in including the DLSS/RT features in the 2080 Ti, it has aged so much better than the 1080 Ti.
only_r3ad_the_titl3@reddit
You mean lack of competition for TSMC
autumn-morning-2085@reddit
Considering the tariffs, the pricing is "aggressive" in a way. Their cost to manufacture this has to be the same as or a bit higher than the 4060 Ti, but it's selling for $100? less.
Don't know if the current retail price includes tariffs or this is an introductory price they will hike soon.
CJdaELF@reddit
Which is also a 3060ti/3070 for $100-$200 less, 4 years later
BaconatedGrapefruit@reddit
I mean, yay for savings at least? I do miss the days when the 60 class cards weren’t trash value, though.
Vb_33@reddit
Funnily enough, this is the highest-value card (fps per dollar) in the 50 series and AMD 90 series lineups.
crshbndct@reddit
It can be the best value while still being trash.
only_r3ad_the_titl3@reddit
It is still the best value card
SovietMacguyver@reddit
You will take it and you will like it! What else will you do... buy AMD??
creamweather@reddit
The video presents a good case for the 9070. Probably the best card on the list. That was my main takeaway aside from 30-series users: don't bother with this crap.
Vb_33@reddit
Yes because xx60 class buyers who spend $299 on a GPU are going to buy a $550 card that goes for $700 instead.
creamweather@reddit
Sure, probably not. Not usually anyway. As a 60 series user I also have to look wistfully at the better options.
only_r3ad_the_titl3@reddit
That is a card more than twice as expensive
only_r3ad_the_titl3@reddit
Well it is on the same node, so that is not even that bad.
DuhPai@reddit
Which is in turn just a 3060 Ti. So it's a 3060 Ti 5 years later for $100 less.
PainterRude1394@reddit
That makes it sound decent compared to competition.
gokarrt@reddit
the way things are going, i guess we're "lucky" it isn't more expensive.
OverlyOptimisticNerd@reddit
The 1060 matched the 980.
The 2060 matched the 1080. And this was a massive price hike for the x60 series.
The 3060 couldn’t even match the 2070.
The 4060 barely edged out the 3060, and lost in some VRAM limited situations.
The 5060 might actually lose to the 3060 in similar situations.
The 1060 was the last great x60 product, IMO, so long as you got the 6GB version.
ET3D@reddit
Based on the current review, the 4060 is 20% faster than the 3060, and the 5060 is 22% faster than the 4060.
So sure, there's the VRAM issue, but I feel that you're deliberately misrepresenting the figures.
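Compounding the two per-generation uplifts quoted above gives the total 5060-over-3060 gain:

```python
# Compound the gen-on-gen uplifts from the review figures quoted above.
gain_4060_over_3060 = 0.20
gain_5060_over_4060 = 0.22
total = (1 + gain_4060_over_3060) * (1 + gain_5060_over_4060) - 1
print(f"5060 over 3060: {total:.1%}")  # 46.4%
```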
OverlyOptimisticNerd@reddit
1440p has been my preferred resolution and I’ve been fine with x60 products in that segment since the 1060.
To agree with my point and then level that accusation is extremely bad-faith.
Be better.
viperabyss@reddit
You know what else has changed? Games themselves. They have become much more demanding than before.
xx60 products have always targeted the 1080p crowd.
PorchettaM@reddit
All that means is GPUs failed to keep up with the rest of the industry. Games advanced, displays advanced, consoles advanced, GPUs in a given price bracket are still targeting the same resolution they did 10+ years ago.
viperabyss@reddit
I mean, given that the majority of PC gamers still play games at 1080p as of last month's Steam survey, is it really a surprise that Nvidia still targets this segment with a product that has always targeted this segment?
PorchettaM@reddit
Bit of a chicken and egg problem there. 1440p displays are dirt cheap nowadays, and 4k is getting there. On the TV side 4k has been the de facto standard at all price points for a while now. So it's safe to say the bottleneck is not on the display side.
Now, is the average PC gamer sticking with 1080p because they just don't care to upgrade, or because they get fleeced trying to play at higher resolutions?
viperabyss@reddit
I mean, people rarely do PC gaming on TVs, so not sure why it has any relevance here.
And I'm not sure how gamers are getting fleeced playing at higher resolutions. A 1440p monitor is on average $100 more than a 1080p monitor, just as the 4070 Super is $100 more than the 4060 Ti 16G. Are you saying that gamers think the $100 difference between 1440p and 1080p is acceptable, but $100 between a solid 1440p card and a solid 1080p card is too much?
cowoftheuniverse@reddit
Don't have any big data, but as an anecdote, in my friend group about half play PC on TVs. No idea if it is even 10% of all PC players, but it's not super unusual.
viperabyss@reddit
Given that less than 5% of PC gamers play at 4k (per Steam hardware survey), and that also includes people who play at 4k monitors and not TV, it's pretty easy to conclude that playing PC games on TV is quite unusual.
cowoftheuniverse@reddit
Well, Steam does not detect my resolution from the display, but from the Windows setting. You can check by going to Help -> System Information; it shows the Windows desktop resolution for both the primary display resolution and desktop resolution fields. "Primary display resolution" does sound like it refers to the pixel count of the display, but for me at least it doesn't detect it.
So while me and my friends have been on 4k-capable TVs for many years now, we have been using lower resolutions because the GPU power for 4k just hasn't been there until recently. 3 of us are 4k users now because the 7900 XTX and 9070 XT are somewhat affordable.
Strazdas1@reddit
They probably also play with controller and prefer third person games?
cowoftheuniverse@reddit
Not really. I do have a controller for soulslike games but it's mostly kb+mouse and I mostly play typical PC type games. Same for my friends.
Strazdas1@reddit
Interesting. Thanks for a reply.
PorchettaM@reddit
First issue I see with that comparison is you're taking the 4060 Ti 16GB as a reference for 1080p cards, even though it was literally the most expensive "1080p card" you could get last gen and widely panned as bad value. Most people are shopping cheaper, and ideally smarter.
Second but related issue is by only looking at the difference and not the actual prices, you're arguing people who can afford a ~$150 monitor purchase are also automatically people who can afford a ~$600 graphics card purchase, which quite obviously makes little sense in practice.
viperabyss@reddit
Okay, then let's use the regular 4060, where the 4070 is $200 more. Again, are you saying that those who are willing to drop $100 extra for a 1440p screen find $200 too expensive to stomach?
But we're comparing a 1080p card vs. a 1440p card, no? After all, they did spend $400 buying the GPU to power that $100 1080p monitor.
PorchettaM@reddit
The regular 4060 was $300, not 400. The 4070 and its later refresh were $600, so twice that.
$100 monitor + $300 card = $400. $200 monitor + $600 card = $800. You are now in a very different price bracket and most of that is on the graphics card. People who can only afford the 1080p setup might be able to stretch a little for a better monitor (especially since really arbitrary examples aside 1440p is cheaper than $200 today), but they probably won't be able to double their overall budget on a whim.
Strazdas1@reddit
No one is going to be using a 5060 to play at 4k and not use DLSS.
tukatu0@reddit
I mean, you did say it would be a regression. The 5060 is still like a 50% uplift. If a game is problematic it will just have stutters, a noticeable texture downgrade, or outright crash. I doubt you will ever see it with fewer fps, even at 4k.
only_r3ad_the_titl3@reddit
I do. The secret is to simply not use the highest available preset, but that seems to be lost knowledge
puffz0r@reddit
Imagine buying a new card and immediately saying "i won't play max settings"
Strazdas1@reddit
imagine buying a low end card and expecting to play max settings.
puffz0r@reddit
1080p is a resolution from 10 years ago dog
Strazdas1@reddit
It was first introduced in consumer models in 1998, but that does not in any way disprove what i said.
only_r3ad_the_titl3@reddit
The 5060 Ti 16GB also can't do max settings
puffz0r@reddit
Thanks for making my point for me again
HavocInferno@reddit
Haven't all these tech outlets spent the last few weeks showing, in detail and with many examples, how even at 1080p with somewhat reduced settings, 8GB doesn't cut it anymore in some recent and, at their respective launches, fairly popular titles?
You don't need native 1440p and max settings anymore to overflow 8GB.
Sure, it's less of a problem than at 1440p. But it's already a *small* problem, and it is certain that upcoming games will tend to eat even more VRAM, so the problem is bound to grow larger quite quickly. A 5060 has zero chance to age well, even at 1080p. Even at "just" $300, that's simply not good enough.
Vb_33@reddit
Those are cherry-picked games at ultra, sort of like how the 4070 can't do Indiana Jones at supreme settings (path tracing) at 1440p because it runs out of VRAM, so you have to turn down a setting or two. It happens, but not in every game or even most games.
HavocInferno@reddit
They are not. At least look at the articles or videos before you talk about them.
At least Computerbase, HUB and Daniel Owen included games at settings below Ultra and down to 1080p with Upscaling. And even then some games needed more than 8GB, otherwise exhibiting massive framedrops, intermittent stutters or automatically reduced image quality.
Sure, it doesn't happen in most games. But it happens in some, predominantly in newer big-name titles. And it is bound to become a more common problem because new games tend to have higher requirements.
So if a customer wants to play new high profile games, it is becoming more and more likely that they'll have to make significant visual sacrifices on 8GB cards purely because of this vram limitation.
Charming-Edge-2710@reddit
When VRAM is not an issue it seems to be about 25% faster, sometimes more. When VRAM is an issue, it's like hitting a wall: it's the exact same card with the exact same FPS. This is almost a great card. It's like buying a car that drives off the lot with an amazing 0-60, but the parts will fall off in about two years and it can never do 70.
Charming-Edge-2710@reddit
Took a closer look: if you must buy Nvidia, I'm seeing a 3070 Ti going for about the same price as a 5060 at Micro Center. The 3070 is a near match for the 5060, so the Ti should be a little better, but without the newer DLSS features.
Vb_33@reddit
The 3070 Ti is a worse overall card.
Charming-Edge-2710@reddit
Took a closer look: the 3070 Ti only runs a few points better than the 3070 but is priced about the same. At this price point I also found a used 2080 Ti at a similar price... it has a little more VRAM and runs about the same as a 5060, but now there's the concern that driver support might not continue for more than a couple of years.
ghostsilver@reddit
IMO you are really cherry-picking the results here.
While yes, there might be situations where the 3060 with its 12GB wins over both the 4060 and 5060, those are really, and I mean really, rare. Even then it's usually at 1440p, and the games that need more than 8GB at that resolution cannot be run at an acceptable framerate on the 3060 anyway. In the video from OP, they showed an example where Hogwarts Legacy struggles at 1440p with the 5060 Ti 8GB, but then the 16GB card could not even get to 60FPS; imagine the 3060 here.
Idk how it is in the US, but in Europe the 5060 Ti (and most of the 50 series in general) is actually available at MSRP. People mention they'd rather buy used cards, but people here are selling their used 4060 Tis for 350-380€ because "hurr durr I bought them during the shortage for 600€". And a brand new 5060 costs 320€.
HavocInferno@reddit
Yeah no: https://www.youtube.com/watch?v=C0_4aCiORzE
You're falling for this myth that these GPUs would run out of core performance before the VRAM becomes a problem.
Unfortunately, that myth is not true for the 5060/Ti, and has already started falling apart for 3060 and 4060 for a while. There are settings that need plenty of VRAM but barely hit the core and vice versa. On these 8GB cards, you're more and more forced to reduce visually significant settings which barely need any core performance.
Escoffie@reddit
10gb 3080s were a tough pill to swallow 2 gens ago yet somehow we still have people trying to justify 8gb SKUs in 2025, on /r/hardware of all places.
What a sad state of affairs.
GabrielP2r@reddit
It's always the same clowns too.
Plank_With_A_Nail_In@reddit
The 4060 Ti released on May 24, 2023; there was no GPU shortage then.
OverlyOptimisticNerd@reddit
Can you please quote the part where I discussed this and explain how it is materially different than what you are saying?
frostygrin@reddit
The 2060 is great, because of DLSS. Especially after DLSS 4.
HavocInferno@reddit
DLSS4 scales pretty badly on Turing though. Significant performance hit compared to DLSS3, which basically eats up any quality improvements because you need to lower the DLSS preset to get the same performance as with DLSS3.
Strazdas1@reddit
but at lower quality settings you still get better end image with DLSS4.
HavocInferno@reddit
Not sure I'd agree.
At same quality preset, DLSS4 IQ beats DLSS3 IQ hands down. At lower quality preset, imo DLSS4 IQ is roughly on par with DLSS3 IQ of the higher preset, but does not decisively beat it.
At least that's my experience with it in the last few games I've played. (on 4090 & 5070Ti, 4K output, DLSS Quality/Balanced)
Yearlaren@reddit
Just like how the 1050 Ti was the last great x50 product
Xece08@reddit
The 3060 matched the 2070 though?
OverlyOptimisticNerd@reddit
You’re correct. I must have been thinking about the 2070 Super.
aminorityofone@reddit
60 is the new 50.... i bet 50 will be the new 30.
Weddedtoreddit2@reddit
RTX 7080 will really be an RTX 7020
ShadowRomeo@reddit
The 5060 in this review actually matched the 3070 which makes it just as good of a jump as the 3060 but is ruined by the 8GB Vram.
Soulphie@reddit
The most important thing about this card and other 8GB cards from Nvidia specifically was only mentioned once. Nvidia cards got an update back when Hogwarts Legacy came out and the 3070 Ti could not run it for shit on the Epic preset. Nvidia 8GB cards don't load in the selected texture quality if they run out of VRAM. These cards don't have the actual performance shown in the graphs, because they load in the worst possible effects and textures when out of VRAM and therefore have a lighter load to render while you get a worse image.
Vb_33@reddit
These guys are old school; they don't test image quality. If they did, they'd have been singing the praises of DLSS since DLSS2 five years ago. Instead they shit on image quality improvements like DLSS and jerk off to inferior TAA, or worse, no TAA, so the image is full of the aliasing, noise and artifacts that temporal solutions are built to fix.
What these YouTubers excel at is bar graphs with average frame rates and 1% lows. They don't even cover stuff like shader comp stutters, traversal stutters or camera stutters, which are invisible in their graphs when they exist. They aren't about image quality or smooth performance; they're about average frame rates and 1% lows, which, while good, don't show the whole picture. If you want image quality and performance analysis, your only good option is DF.
HavocInferno@reddit
Tf you talking about?
They regularly do videos dedicated to image quality analysis. They've done videos on DLSS specifically, praising its quality overall.
You really just make up shitty drama and run with it, don't you?
Vb_33@reddit
Yes they love AI upscalers and certainly don't favor it over native. Mhm.
HavocInferno@reddit
I can tell you've not watched any of their more recent DLSS coverage.
But that seems to be the whole pattern with you in this topic. You have no clue and just make it up as you go. You're just angry at them for no rational reason.
dedoha@reddit
I would expect HUB to notice this and mention it repeatedly if that were the case. That being said, texture quality has a marginal impact on performance.
only_r3ad_the_titl3@reddit
Bro, Steve can't even calculate % correctly
Soulphie@reddit
they did mention it in the video, and also back in the day around a month after launch of hogwarts legacy they had a video about this.
ioaia@reddit
Whatever, my RTX 2060 12gig runs everything I play just fine. More money saved.
salmonmilks@reddit
2060 has 12gb ?
ioaia@reddit
Yea it has two versions. 6gb and 12gb
BeerGogglesFTW@reddit
Shame. I've been wanting to upgrade my girlfriend's RTX 3060 Ti, which is nearly 5 years old. But even after all this time, the upgrade path is: spend more on a higher tier. That is a sad state considering 5 years should be nearly an eternity in gaming tech time.
I hope AMD cooks, but I imagine the actual prices of their products won't be appealing.
Vb_33@reddit
You're better off waiting for celestial.
Exist50@reddit
That's terrible advice. You're talking 2028 at best, if they don't drop out of gaming entirely.
Vb_33@reddit
Time will tell.
MiloIsTheBest@reddit
Lol I really want a decent intel GPU but so far the cycle hasn't been promising.
Just gotta hang on until Alchemist drops
Ok the A770 was a bit lacking for when it finally released, and it's pretty buggy, maybe I'll hang out for Battlemage
Ok Battlemage looks pretty solid... but they've only released B500s, I'd have liked a B700 at least.
Ok they've bailed on doing the B700s... I guess I'm waiting for Celestial?
???
Come on Intel, do the thing!
Zerasad@reddit
Steve has been hinting that we should wait for AMD's 9060, but I'm not holding my breath.
BeerGogglesFTW@reddit
Performance could be amazing, but it's all going to come down to what the actual price will be in stores.
Normal_Bird3689@reddit
I feel ya, I have been wanting to get my son a gaming PC but I can't justify the prices of the low-end stuff.
It's honestly just easier for me to give him my 3080 and I'll go get a 9070 XT
RandoReddit16@reddit
I still rock a 1060ti.... this card is finally the perfect upgrade for me, for less than I paid for a 660ti (whatever year that was...)
Erikthered00@reddit
Same boat here. 4 year old 3060ti looking for an upgrade at a reasonable price, and the only answer seems to be “pay more to get more frames”. There’s no replacement in the stack for even close to the equivalent price point
Nichi-con@reddit
To be fair I can find the 5060 for 310 euro in my country.
It's not the best but it's a good upgrade from a 1060 3gb
Soulphie@reddit
Brother please buy something from the used market, most people will let you test it on pickup if you ask beforehand. A 200€ 6700xt is a good card
Strazdas1@reddit
The 6700 XT is a DOA card. It does not have the modern features you'd expect in GPUs from the last 8 years.
salmonmilks@reddit
assuming that's referring to the AMD 6000 series atp. They have good value, most notably second hand.
Strazdas1@reddit
They have terrible value and had terrible value since launch because they lack support for many features and software people use. It is especially terrible value when there is a better alternative from AMD now.
inyue@reddit
I think it's just insane to recommend a card that doesn't support DLSS or FSR4.
HavocInferno@reddit
The 5060 is basically this generation's equivalent to that 1060 3GB. Cut down too much and will age poorly.
It may be "just" 310€, but you'll have to significantly sacrifice visual quality much sooner than with more VRAM.
It's not a good upgrade, it's repeating the same mistake. Unfortunate if limited budget forces you into that mistake.
Culbrelai@reddit
TIL that there was a 1060 3gb. I had more vram in 2012 on a 670 4gb. Holy sheeeeit
aminorityofone@reddit
It wasnt even born and it aged poorly.
Nichi-con@reddit
No yeah I'm pretty aware of that.
But it is a HUGE upgrade from my card. At least I can play some modern AAA.
I hope AMD brings some competition here.
Dhaeron@reddit
You should probably just look for a used 40xx.
Vb_33@reddit
Nah they're more expensive.
Nichi-con@reddit
The used market sucks, with people selling their cards basically at MSRP
Havanatha_banana@reddit
To be fair, the 780m in amd can be better than the 1060 3gb, so anything is an upgrade to you lol.
aminorityofone@reddit
To be fair, watch the whole video. Even at MSRP it is not a product that can be recommended. that 8gb vram makes it a terrible product.
John_28@reddit
Get the B580. It should age much better than the 5060, and also does perform better / equivalent at 2k.
Nichi-con@reddit
Battlemage supplies and prices suck in Europe.
Also, cpu overhead might be a problem for me.
boringestnickname@reddit
High praise.
Jesus.
SovietMacguyver@reddit
The 1060 3GB was shitty from the very beginning, so it's not like that's a good argument. A toaster would be a good upgrade.
Nichi-con@reddit
Yeah but that's what I have.
moschles@reddit
After looking at actual gameplay benches, this 5060 is not so bad after all. You get what you paid for.
salmonmilks@reddit
Genuinely, a 20-30% uplift over the 4060 is good on paper.
But the 4060 raised the bar so little that it doesn't feel like a lot
Yearlaren@reddit
Makes me wonder if the 5050 will be a better value considering it also has 8 GB of VRAM
jforce321@reddit
I mean, in the context of everything going on in the world right now, people would have eaten this card up if it just had more VRAM. Otherwise it's not mind-blowing, but it's a crapload better than the 3060-to-4060 jump was in terms of raw performance (VRAM downgrade notwithstanding).
iambfx@reddit
Where's the Wild Life benchmark? It's a joke review
SignalButterscotch73@reddit
Steve giving Nvidia the stiff middle finger by managing to get this review out so quickly.
Firefox72@reddit
First he climbed a chair and they didn't listen. Then he went to the roof of his office and they still didn't listen. Now he's on the 20th floor of a skyscraper and they still aren't listening...
chapstickbomber@reddit
lol. Made a damn album to scale the meme; I am intrigued to see Steve's future solution for being even more disappointed
Strazdas1@reddit
Film in a helicopter.
MiloIsTheBest@reddit
Standing in front of the dampener ball in Taipei 101...
thunk_stuff@reddit
The B roll at 2:50 is 😂😂😂
Sevastous-of-Caria@reddit
Bro was motivated. I would be too
Sevastous-of-Caria@reddit
Tim's shots of holding the GPU in a hotel bath are just pure aura
g1aiz@reddit
Tim doing stupid things at trade shows is the best part of all these videos. It's like Linus dropping things but actually funny.
averyexpensivetv@reddit
Despite all the grumbling, it is a completely viable entry-level card. After seeing the Witcher 3 ten-year-anniversary trailer I looked at benchmarks from 2015, and it's kinda funny how 1080p 60FPS became the standard for entry-level cards. Honestly it happened so slowly I didn't even realize it.
HavocInferno@reddit
4 years ago, we had the RTX 3050 with 8GB. Even on that card, you could manage to nearly fill the Vram in some games while still running ~60fps.
Two gens later and one tier higher, you still only get 8GB. The 5060 and Ti 8GB are bound to age incredibly poorly.
It could be an okay entry level if the price were entry level as well, but it's not.
Vb_33@reddit
Yes 23% faster than 4060 which was 20% faster than 3060 is fine. It's one of the best gainers for the 50 series. Wish the 5080 was 23% faster than the 4080.
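As a rough sanity check, those quoted gen-over-gen uplifts compound like this (the percentages are the ones from this thread, not fresh benchmark data):

```python
# Compound the quoted gen-over-gen uplifts: 4060 = 1.20x the 3060,
# 5060 = 1.23x the 4060 (numbers from the comments, not measured data).
uplift_4060 = 1.20
uplift_5060 = 1.23

total = uplift_4060 * uplift_5060  # 5060 relative to 3060
print(f"{total - 1:.0%}")  # ~48% faster than a 3060 over two generations
```

Which lines up with the "still like a 50% uplift" figure quoted earlier in the thread.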
noiserr@reddit
The performance of the card is not bad at all when you're not running out of VRAM. The issue is if the card is hitting the VRAM buffer limit at launch, it will be useless in say 2 years time.
thoughtcriminaaaal@reddit
I wouldn't go as far as to say e-waste. Primary reason why I think Nvidia gets away with still putting 8 gigs on their cards is just that most people play esports games, where it isn't an issue. Unless GTA VI requires 8GB VRAM for minimum settings I think this will just continue on until next gen consoles when it becomes too much of a problem to bear for the average consumer.
ViamoIam@reddit
5060: I make used GPUs look good.
B580: If only I existed!
RX 9060: Please AMD don't mess me up somehow :sweat: 8GB
Havanatha_banana@reddit
Outside the US, the B580M ain't that good of a deal either. It's about 50 bucks more than the US MSRP in Australia and Asia.
That's kinda why I'm hoping for the B770M: the higher up the product stack, the less of a problem the 50-buck markup becomes.
Still, better they do well in US than not.
ViamoIam@reddit
I'm outside the US in Canada.
The B580 (B580m I assume is a typo) pops up near MSRP here. CAD MSRP is 350 (1.4 × USD). The lowest price is 360, so $10 more in our fun High Security Colourful Monopoly money!
I found it on the Canadian version of PCPartPicker, but my local store also currently has them in stock. Maybe check PCPartPicker for your country if you haven't already. I've used PCPartPicker for over 12 years; it refers you to retailers in your country and allows sorting by price, spec, and features for PC parts and stuff.
Havanatha_banana@reddit
Here's our price tracker
https://www.staticice.com.au/cgi-bin/search.cgi?q=Intel+arc&spos=2
Cheapest B580 is 449 AUD, which is about 290 USD. Prices had been this way even before our dollar dropped.
The cheapest it has ever been was 435 AUD with a game: https://www.ozbargain.com.au/node/893191
ViamoIam@reddit
That price tracker is a bit disappointing. No Canada. Seriously though, I don't see an easy way to show price history like the example I shared. An "Arc" search was too general, so searching for "B580" limits results to only that chipset. It covers four other English-speaking places, but no Canada on the other side of the world. Yeah, I can read ;) the about page mentions its goal is Australia.
To compare: here at least 3 retailers had it near MSRP (Best Buy Canada, Memory Express, and Canada Computers). PCPartPicker does allow tracking each part and graphs prices up to 2 years back, if you're curious and want accurate data on Canada for some reason.
Australia's PCPartPicker options are only a few. Price tracking of models shows 260 and 280, but it's out of stock. The price seems sus, as it's less than MSRP when converted to AUD.
In Canada we have been fortunate to have had decent staff and Ministers of Finance. Freeland contributed to forcing the former Prime Minister to resign, partly over financial management, budgets, and plans to counter these issues.
Vb_33@reddit
The B580 is like $350 online in the US. Sometimes you find one for $290 or $285, but I've pretty much never seen one for $250.
Vb_33@reddit
No, it's the opposite. Used GPUs (which are typically the best deals around) make the 5060 look good. Buying a used 3070 for $280-350, a used 3070 Ti for $350-400, or a used 4060 Ti 8GB for $350-400 is a worse deal than buying a 5060 for $299 with a warranty, FG, DLSS 4 and MFG.
ViamoIam@reddit
For Used..
All of those are Nvidia cards that are occasionally overpriced here too. When they're overpriced I look at AMD RDNA2 or RDNA3. The lower-end stuff doesn't depreciate as much, but I've seen some good deals on the 6700 XT, RX 6800, 6800 XT, or even the 7700 XT, 7800 XT or 6900 XT sometimes. I like these as they have more VRAM, which spares games from texture or frame-pacing issues. Best paired with 32GB of system RAM for new AAA games and multitasking, since some need more VRAM or memory, especially just after release.
Upscaling is a bit behind on older AMD, but at 1080p I often just ran native, and 1440p and 4K upscaling results are less far behind. XeSS was hard to tell apart from DLSS for Nvidia users. OptiScaler can mod in upscaling if a game only supports DLSS.
ghostsilver@reddit
In europe the 5060 is available widely at 320€, and people are selling their used 4060 Ti for 350€ just because it used to be 500-600€ during the shortage.
The 50 Series is actually quite attractive in europe.
ViamoIam@reddit
That is good
The 5060 starts at 440 CAD in Canada. 1.4 × the USD MSRP works out to about 420 CAD, so they are basically $20 CAD over. The actual rate is currently 1.39, but it is generally around 1.4.
Link to Current from multiple vendors sorted by price
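A small sketch of the conversion math above (the 1.4 rate and the 440 CAD street price are the figures quoted in the comment, not authoritative prices):

```python
usd_msrp = 299      # US MSRP for the 5060
fx_rate = 1.4       # approximate USD -> CAD rate used in the comment
street_price = 440  # cheapest CAD listing cited

expected = usd_msrp * fx_rate          # ~418.6 CAD converted MSRP
print(round(street_price - expected))  # ~21 CAD over the converted MSRP
```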
tukatu0@reddit
There was never a shortage during the 4060 Ti's time in 2023. What country or region are you referring to?
ghostsilver@reddit
500-600 is a little bit hyperbolic there, more like 450-550€, but the average used price for a 4060 Ti in Germany really is 350€, which makes no sense when a brand-new 5060 is available everywhere for 320€.
ViamoIam@reddit
I'd be surprised to see that for the 4060 Ti 8GB, as buying it new seemed foolish to me. Games and work suffered: Horizon Forbidden West suffers on Very High, complex model renders fail, and AI models are much more limited with 8GB and give less useful results.
The 16GB 4060/5060 Ti models are more desirable both for games and for work. Second-hand ones here go for much more than the 8GB variants, sometimes a lot more than the original price difference.
tukatu0@reddit
Well, just need to give it time. I'm still not even sure if it's actually out yet. It will take at least a week before listings realize that people are not buying their stuff and lower their prices. That's if people are even aware. Give it a month and/or a few games which don't run very well, and it will drop.
Vb_33@reddit
5060 is the only card available at MSRP in the US ($299). Way better deal than the 4060, used 3070 and 4060ti are.
Havanatha_banana@reddit
Wow... Gpu are expensive over there.
ghostsilver@reddit
Prices have been "bad" since the 20 series, which was 7 years ago. It's become the norm by now, I guess. It also does not help that AMD realized they can charge more for GPUs too and still sell them like hotcakes.
Havanatha_banana@reddit
I agree, but it's been trending worse.
I don't really care anymore, I just wish there were better options. I got a few Tesla P100s for my cloud PC project a while back, because I ain't sinking the kind of money modern entry GPUs are asking of me.
I miss the 1030. Not because I ever actually bought that card, but because it acted as an anchor for the market.
Plomatius@reddit
B580 is like the only reasonable option. Sadly not much of an upgrade from my current GPU. Hoping the 16GB 9060 will be my next GPU, but they're dragging their feet.
ViamoIam@reddit
I'd suggest the 9060 XT 16GB. I wasn't aware of a regular 9060 with 16GB, but that would be better than 8GB if it's an option.
knighofire@reddit
How? The 5060 is like 25% faster and has miles better upscaling, etc. And the difference is even larger on a low end CPU.
The 9060 will probably similarly run circles around it.
Don't get me wrong, the 5060 is a disappointing card with not enough VRAM.
But the B580 is even worse.
ViamoIam@reddit
5060 vs B580 fact checking:
*Actual prices: 360 CAD vs 440 CAD means the 5060 is 22% more expensive.* Canada prices checked, as US prices are crazy: https://ca.pcpartpicker.com/products/video-card/#c=585,594&sort=price
3% faster at 1440p: https://youtu.be/2e1a2-VxxvQ?t=692
24% faster at 1080p: https://youtu.be/2e1a2-VxxvQ?t=679, as tested.
DLSS isn't miles better. The XeSS dp4a version looks okay at 1080p Quality and 1440p Balanced, and XeSS upscaling performs better on Xe hardware. Use OptiScaler on GitHub to tweak or swap upscalers in old single-player games that only support DLSS2 or FSR2 or newer. Multiplayer games don't need it; they perform better.
Budget GPUs like the Nvidia 5060 series are not powerful enough for path tracing (the kind of ray tracing where you can actually see the difference). Simpler RT doesn't seem to make much of a difference visually, except for ruining fluidity and response times. Games requiring RT still run on recent AMD or Intel. Even friends with a 3070 8GB have issues with ray tracing since they don't have the VRAM; the 3070 and below, and the 2080 and below, had the same issue. The 5060 will still hit 8GB VRAM issues with RT, plus the frame gen feature uses even more VRAM.
True, Intel has higher CPU overhead than Nvidia. The drivers have improved a bit since that was reported, but the overhead is still higher.
9060 speculation is just that. We will see.
The 5060 is worse when you compare at MSRP or real lowest prices, since those are 20% more for the 5060. Slightly better performance at 1080p, basically the same 1440p performance, and RT performance where either I can't really notice the difference on screen, or it's unusable but looks pretty. How is that a win? This is at a higher price in my region too. And 12GB is already needed for higher settings in AAA games.
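The price-gap figure quoted above (360 CAD vs 440 CAD) is easy to check; a minimal sketch using those quoted prices:

```python
# Relative price premium of the 5060 over the B580, CAD prices as cited above.
price_b580 = 360  # lowest B580 price quoted
price_5060 = 440  # lowest 5060 price quoted

premium = price_5060 / price_b580 - 1
print(f"{premium:.0%}")  # ~22% more expensive
```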
Plomatius@reddit
8GB isn't even a consideration at this point. B580 has 12GB, which is alright, but completely unacceptable on the 5070.
knighofire@reddit
It doesn't matter if it's just slower. Even with the extra VRAM, the B580 is significantly slower at 1080p and still a bit slower on average at 1440p. The purpose of VRAM is performance at the end of the day, so if the card is still slower it doesn't matter.
8 GB is ass, but the B580 is even more ass.
Plomatius@reddit
It would matter when you can turn up the texture quality on one card and not the other because it has less VRAM. The impact of running out is very noticeable as well.
The 8GB 5060 is also 50 USD more expensive so it should perform better.
Cheap-Plane2796@reddit
I knew the x60 cards were slow pieces of shit but a 5060 having the same performance as an ancient 3060ti is MIND BOGGLING.
What is the point of this card? Who wants this? Who wants this and didnt just get a 3060ti 5 years ago?
Who wants an 8GB card that's only good for 1080p these days at such a high price?
Who is this piece of crap for?
only_r3ad_the_titl3@reddit
It is like half the price of the 3060ti
Spjs@reddit
The 3060 Ti was $399 in 2020. The cheapest 5060 is $329. That's a 17% discount, not half.
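That discount math checks out; a quick sketch with the quoted prices:

```python
# Discount of the cheapest 5060 vs the 3060 Ti launch MSRP (prices from the comment).
msrp_3060_ti = 399  # USD, 2020 launch MSRP
price_5060 = 329    # USD, cheapest 5060 quoted

discount = (msrp_3060_ti - price_5060) / msrp_3060_ti
print(f"{discount:.1%}")  # ~17.5%, nowhere near half price
```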
only_r3ad_the_titl3@reddit
Was that the MSRP or the retail price?
chamcha__slayer@reddit
RTX 4060 was so shitty that it's making RTX 5060 look good in comparison.
Constant-Plant-9378@reddit
I've been running mine for a year and a half, playing VR Alyx, Doom Eternal, Cyberpunk, etc. and have been perfectly happy with it.
Sopel97@reddit
don't worry, people on r/hardware are hardcore american gamers who only play in native 4k at 240 fps and can't accept that 4060 was some of the best value for money in many markets
Withinmyrange@reddit
Just strawmanning for no reason
Username1991912@reddit
No matter how shitty a product is, there will always be people saying "i've been perfectly fine with it".
only_r3ad_the_titl3@reddit
What? The 4060 was the 2nd best card from the 4000 series. Best value uplift over its previous gen counterpart.
nomoregame@reddit
my $250 2nd-hand 3070 bought after the crypto crash is still BOSS
Rossco1337@reddit
Awesome. I got a used 8GB 3070 for $350 3 years ago, and Nvidia is now selling it brand new for $300. It's so hard to keep up with technology nowadays, so it's very charitable of Nvidia to keep their older products alive and relevant.
Maybe in another 3 years we'll be able to get a 3070 Ti for that price - I'm not holding my breath though.
abbzug@reddit
Nvidia's tactics are making a lot more sense now. Cause damn, this shit sucks. Why not just keep making the 4060 and avoid all this drama.
Luxuriosa_Vayne@reddit
Guess my brother is keeping his 3060 ti
ThatBusch@reddit
In other words: Intel has a clear winner here and people should get a B580
Soulphie@reddit
as long as you need a 7500F as a minimum to not have your GPU performance crash in games because of the driver overhead, the B580 is not recommendable
ThatBusch@reddit
Well, as Steve mentioned, the 5060 only has a PCIe 5.0 x8 interface, so if you don't have at least PCIe 4.0, you are just about as fucked as with a B580 and old hardware.
Snow_2040@reddit
PCIe 4.0 is not an expensive requirement whatsoever; AMD's B-series motherboards have supported PCIe 4.0 as far back as Zen 2. Meanwhile, the B580 loses serious performance even with a 7600X in comparison to a 9800X3D in some games.
ThatBusch@reddit
True, true
Soulspawn@reddit
Damn, those numbers seem bad. The core count alone is a 20% increase over the 4060, but it's only 22% faster.
This is a massive disappointment unless the price is right, which I doubt.
Soulphie@reddit
yeah, that GDDR7 doesn't seem to do these cards any good at all. Whatever they spent extra on it instead of GDDR6 seems to have been a waste of money
Noble00_@reddit
I've been somewhat keeping an eye on CS2 benchmark numbers for RDNA4 and they are pretty underwhelming. Though, it seems HUB data has been updated and now the 9070 non-XT is better than the 5070 and just below the 5070 Ti. Don't know if there is more to this, but for the 9060 series sake, this is good news.
spacejam221@reddit
In 2023 I bought a 3060ti for AUD$529. The 5060 is marginally better and AUD$599.
The card is a dead on arrival joke.
Extension-School-641@reddit
The average FPS and 1% lows would be much, much lower if he also included The Last of Us Part II, Indiana Jones, Horizon FW, and Spider-Man 2, like he tested in the RTX 5060 Ti 8GB review.
Those games don't play nice with 8GB of VRAM, not even at 1080p.
Darksider123@reddit
E-waste. Not enough VRAM nor performance to justify that price
Lt_Bogomil@reddit
It blows my mind how it can barely outperform a 3070... a 3070... I sold my used 3070 a year and a half ago, and a just-launched GPU performs similarly while costing more...
fatso486@reddit
How is this post 2 hours old even though the video has only been out for 1 hr?
I posted the same link here btw when the video had just been released on YouTube.
styxracer97@reddit (OP)
Timing is weird. I saw the video on YouTube about 30sec after it released.
rain3h@reddit
Nvidia suck, no doubting that.
It does get me though: if these people were really as outraged as they make out, they could boycott, but they won't, because "Nvidia good" as well as "Nvidia bad" drives clicks, impressions and ad revenue.
The same people taking extreme pleasure in telling you that they take nothing from Nvidia actually profit from that ad revenue.
Report the news that needs reporting, news that consumers need to know, sure.
Nvidia are crap and we need to know that, but unless actual action is taken via boycott, this constant circle jerk of victimisation by these influencers should be called out for what it is: for the clicks.
someguy50@reddit
I don't like clickbait. How does this card look? What is its value compared to other options?
Big-Performance-3247@reddit
Watch the video