My experience with the RTX 5060 and its 8GB of VRAM
Posted by Internal-Arm6041@reddit | buildapc | 291 comments
I want to share my personal experience with the RTX 5060 and its 8GB of VRAM, because I think there is too much misinformation online about this card. I tested Doom: The Dark Ages on Ultra with DLSS 4 Quality and MFG x4 and it ran at over 200 fps, using around 7GB of VRAM. Then I tested it with only DLSS 4 Quality, no MFG, and it stayed between 60 and 80 fps on Ultra, using just 6GB of VRAM. And we’re talking about a demanding AAA title released in 2025.
That’s why I think people are exaggerating the whole 8GB VRAM issue. For 1080p, the 5060 is more than enough if you use technologies like DLSS, which even reduces VRAM usage. Yes, MFG increases the workload, but that’s up to each player to decide whether to prioritize higher fps or stick with DLSS alone.
If you don’t want to use these technologies, then yes, 8GB may fall short in some modern AAA games. But if you do take advantage of them, 8GB is still perfectly viable for 1080p gaming today on High or Ultra settings. The worst case scenario in very demanding or poorly optimized titles is dropping down to High, never to Medium as many people claim without testing it.
Evening_Demand@reddit
HAVING to use DLSS to make it “playable” is the most meaningless of justifications. Y’all need to watch the video GN did on this, where using DLSS at an already low fps to get it playable was an exponentially worse experience. On top of that, frame gen x4 means 15fps gets you 60, but with 50-60ms of added input lag, plus how bad it looks and all the missing information the AI has to invent, because there are 3 fake frames for every 1 real one. I feel bad for people who think this is a good gaming experience.
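(For context, a rough back-of-the-envelope on the latency claim above; this assumes MFG interpolates between two rendered frames and so must buffer one full base frame, so treat the numbers as illustrative rather than measured.)

```python
# Frame pacing vs. latency with 4x frame generation at a 15 fps base.
base_fps = 15
base_frame_time_ms = 1000 / base_fps        # ~66.7 ms per real frame

displayed_fps = base_fps * 4                # 3 generated frames per real one
print(f"displayed: {displayed_fps} fps")    # 60 on the fps counter

# Interpolation needs frame N+1 before it can present anything between
# N and N+1, so input-to-photon latency grows by roughly one base frame.
print(f"added latency: ~{base_frame_time_ms:.0f} ms on top of the render pipeline")
```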
dweller_12@reddit
The GTX 960 2GB worked very well when it launched too. After only 3-4 years it was already incapable of running 1080p AAA games without slashing graphical settings to lowest. The 4GB model outlived it by multiple years and is still capable of some modern games at lowest settings.
The GTX 1060 6GB, only one generation later, is still relevant today. The 3GB one, on the other hand, is not capable of running the newest games. VRAM ultimately makes a massive difference in the usable lifespan of a GPU, not in how it performs at launch.
If you plan on upgrading the card in a few years, then 8GB GDDR7 will likely be just fine for that time period. It just won't age well long term.
MyzMyz1995@reddit
Nvidia's neural texture compression (RTX NTC) will remove 70-80% of the VRAM requirement for games using it, though. 8GB RTX 50-series cards are going to be fine.
1978415@reddit
Should I buy the 12GB? Is it good? I have a tight budget.
TheAngrytechguy@reddit
I don’t keep a GPU that long, so OP has a point. Why pay $2000 on a card if I’m getting newer-gen GPUs every other year and the experience is pretty damn decent?
SoftMaterial_Shower@reddit
E-waste? Finite resources? Why do people think it's a good idea to encourage companies to turn Earth into a WALL-E-style garbage planet?
TheAngrytechguy@reddit
Not e-waste at all; it will be sold on to another person to use. If your concern is a dystopian wasteland, then stop using ALL your tech right away and make the difference.
SoftMaterial_Shower@reddit
Nobody is going to use a crappy outdated GPU for anything other than scrap, and that starts with proper disposal in e-waste bins, etc. Otherwise it just gets dumped in a random poverty-stricken area of the world.
You're using edge cases (Proper disposal + scrapping) to defend your irresponsibility and then claiming that my responsible use of technology is also wrong. Try harder.
TheAngrytechguy@reddit
You are WRONG. A lot of older GPUs get scooped up for home labs and encoding purposes. Even if those GPUs end up in a 3rd world country somewhere, they still get used there. Where do you think a lot of the old laptops and tech ends up?
SoftMaterial_Shower@reddit
> A lot of older GPU’s get scooped up for home labs and encoding purposes
What GPUs? If you're talking about professional-grade or high-end models like the XX80s / XX90s / Titans, that's correct.
But the context here is entry-level garbage like the 50s and 60s, which use basic components that are designed to degrade rapidly after the warranty is over. Nobody is going to use them after you throw them away.
Some GPUs are literally e-waste at launch now.
> somewhere they still get used there
That's right, those GPUs have loads of uses. Like leaching metals into people's drinking water, soil and the environment. Giving people who try to dismantle them to recover the precious metals all sorts of chronic illnesses in the long run.
Do you think people who struggle to afford electricity care about a GPU? Lol. These things are dumped in the poor areas of poor countries. Watch a documentary about landfill scavengers in the developing world to get an accurate view of their lives and see if they have any use for an old entry level GPU.
> Where do you think a lot of the old laptops and tech ends up?
Ah, the good ole false equivalence fallacy. That's right, BUSINESS-class laptops are often resold (eBay, etc.) when corporations do an upgrade. Consumer-grade laptops are not designed to last beyond their warranty periods; something will fail and the whole thing becomes a paperweight.
bpwells444@reddit
Not sure I'm following the xx50 and xx60 series being designed to degrade after the warranty expires. Feel like that's not at all accurate.
SoftMaterial_Shower@reddit
They're not "designed" to degrade after the warranty period. They components used are designed to survive the warranty period, but no guarantees after that.
Why would any company use more expensive components that could last >5 years or tolerate high stresses if there is no requirement to do so and no expectations of the customer to retain products for such a long time.
SenorPeterz@reddit
I would actually guess that older 50- and 60-class cards survive longer than their contemporary titan/80ti siblings, as they are generally speaking not running as hot.
SoftMaterial_Shower@reddit
I see where you're coming from, 50 / 60 class cards have a lower TDP, therefore they won't run as hot.
But that logic breaks down because the actual parameter that determines whether a GPU runs hot is:
Heat Generated > Heat Dissipated
Simple example: if you heat up water with 100W but the water also radiates/evaporates 100W back into the environment, the temperature of the water will hold steady at that equilibrium. Same thing if it's 1,000 or 10,000 watts.
So let's look at the real world. If you are selling a product that has a TDP of 145W as in the case of the RTX 5060, you will want a cooler that can dissipate more heat than 145W, higher to account for inefficiencies.
The same would apply to an RTX 5070, 5080, 5090 and so on, you would design a product with a cooler that can dissipate heat generated. So no, we should not see 80 / 90 / Titan class GPUs run any hotter than 50 / 60 class cards. Unless those 50 / 60 class have beefier cooling solutions which adds costs. I'm sure they're out there somewhere, but that's an edge case and most aren't buying those.
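(A minimal steady-state sketch of the "generated vs. dissipated" point above; the cooler coefficient is a made-up illustrative number, not a real 5060 spec.)

```python
# Steady state: temperature stops rising when dissipation equals draw.
ambient_c = 25.0      # room temperature, degrees C
power_w = 145.0       # RTX 5060 board power, per the comment above
k_cooler = 2.0        # W shed per degree C above ambient (assumed)

# power_w = k_cooler * (T - ambient)  =>  T = ambient + power_w / k_cooler
steady_temp_c = ambient_c + power_w / k_cooler
print(f"steady-state temp: {steady_temp_c:.1f} C")   # 97.5 C here

# Doubling k_cooler (a beefier cooler) halves the rise above ambient,
# which is why TDP alone doesn't decide how hot a card runs.
```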
Now let's hypothesise about the user side. Who is buying an RTX 5050 / 5060? Entry-level consumers or people who want compact systems. They are more likely to:
- Have smaller cases, because they are cheaper.
- Have a lower budget for other components such as fans and cases.
- Place their computer towers in spots with suboptimal ventilation.
- Be less likely to clean or maintain their computers, i.e. open them up and keep them dust-free.
Now I'm sure there are people who do not fit these four points and will take good care of their GPUs, Computer towers, etc. Those are edge cases, not the majority because really, most people are going to keep their entry-level computer for 4-5 years.
All of these factors ensure that 50 / 60 cards are likely to run hotter because end-users are more likely to keep them in suboptimal conditions and use them as such.
Thank you for attending my TED Talk.
TheAngrytechguy@reddit
And why are you using copilot to come up with responses for you? Lol
SoftMaterial_Shower@reddit
If you think I'm using AI to write my responses, that just confirms to me that AI is going to replace you, not me.
I feel much better now. Thanks.
TheAngrytechguy@reddit
Do you need a hug ?
Vengeful111@reddit
I'm more of a 70-series-every-4-years guy.
Then there's the 90-series-every-7-years people.
And you are a 60-series-every-2-years enjoyer.
Don't see anything wrong with it.
Internal-Arm6041@reddit (OP)
Yes, and I totally agree with you, this card is meant to last a few years and it’s worth it if you’re planning to upgrade your components in about 3 years. But I think it’s pretty good for a $300 card that can last you those 3 years. Of course, if you plan on not upgrading your PC for 4 to 6 years, then this card might not be the answer.
EiffelPower76@reddit
8GB VRAM graphics cards are now a plague.
Why? Because most gamers have 8GB of VRAM, and this will persist because some gamers continue to buy these cards.
So most video game publishers continue to limit the VRAM consumption of their games to 8GB, resulting in poor graphics.
And gamers continue to say, "See? My 8GB GPU still works well."
All this must stop; people want nice graphics with 16GB of VRAM.
Stock_Layer5161@reddit
Textures are one of the most scalable settings in games.
I guess your argument assumes that if a game's max settings use less than 8GB, it's limited in visual design.
Might have been true in the PS3 era, but nowadays any headroom you have is the dev's.
Just look at the PS4 visuals you get in Monster Hunter Wilds, and that's with RT, using up 15GB at 1440p. Is 16GB even future-proof at that point?
FlarblesGarbles@reddit
The used market price difference between the 3080Ti and 3090 says it all.
KillEvilThings@reddit
Or buy a 350 dollar card that lasts 5-6 years.
Your math doesn't add up, this shit is literally designed to keep poor people poor.
mig_f1@reddit
There isn't a $350 card with 16GB atm, at least not according to PCPP.
JJay9454@reddit
The 9060 XT is just higher, at $359.99.
mig_f1@reddit
According to PCPP, the cheapest 9060 XT 16GB is $380 in the US right now.
JJay9454@reddit
Huh, the new computer store near me that opened after Covid has it for $359.99, gimme a minute
mig_f1@reddit
How does that negate the average price gap between the 2 cards?
If some stores sell cheaper offline or have temporary deals, do they do it exclusively for the 9060 XT and not for other GPUs, including the 5060?
Are we talking about exceptions and hunting down random offline deals here, or the general market norm, with the items being available and accessible to all consumers no matter where they are located?
JJay9454@reddit
Jesus christ bro, I'm just tryna send you deals. Chill out
FlarblesGarbles@reddit
The 9060 XT 16GB says hello.
mig_f1@reddit
Feel free to share with the rest of us the PCPP links to a 9060 XT 16GB priced at $350.
Elitefuture@reddit
$350 if you wait for the 9060 xt 16gb, but it's readily available for $370-$380.
Mexcol@reddit
How long until it drops to 350 in your opinion?
mig_f1@reddit
According to PCPP it starts from $380 in the US. The difference from $300 gets you a 32GB RAM kit, an SSD, or two-thirds of a motherboard.
S1rTerra@reddit
Or even cheaper if you go to the used market. The 6700 XT is in the $250 range, though it only has 12 gigs. It'll still last you a while though.
PsyOmega@reddit
The boots theory of economics is often lost on the poor.
dweller_12@reddit
Yeah, three years is a good timeframe. At $300 MSRP, the RTX 5060 is a much better pick than any card like the 3050/3060/5050/4060/9060 XT 8GB.
But for $80 extra, the 9060 XT 16GB is too close in price not to consider. It will resell for more in 3 years when you upgrade, so even if you stay on that cycle you might net more towards the next upgrade.
Reddit_Lord_2137@reddit
I am playing Helldivers 2 at 1440p, 30-75fps (lowest of the lowest settings), on a GTX 1650 Super OC (4GB VRAM) and an R5 2600X. Stable 60fps on my second 1080p monitor with very few fluctuations.
Yes, I can’t believe it’s possible either.
KajMak64Bit@reddit
Bro, I play it with a GTX 1050 2GB, but 1080p downscaled or... upscaled? It's like set on the performance level I think... and all lowest.
I have performance issues and stutters purely because the VRAM is only 2GB.
FPS is like... all over the place, but can be decent... it can be in the 50s sometimes but stutters a lot.
One example of an issue is somewhat recent: when I am driving a mech and look ever so slightly down below the horizon, the FPS tanks and the game literally slows down time, in slow motion.
And it's just barely below the horizon... if the angle goes even 0.1 degrees negative below the horizon the FPS drops, and when I look 0.1 degrees positive above the horizon the FPS is normal.
It makes no sense, because the same things are visible either way.
untraiined@reddit
8GB is already deprecated for anything higher than 1080p
Educational-Web829@reddit
You can make 1440p work; it'll just need to be like low or medium. I've seen some 4060 and 7600 1440p benchmarks and it's actually not half bad, surprisingly. But yeah, it's a huge problem for higher settings, and at least the 4060 can use DLSS.
Hopesfallout@reddit
I've had zero problems running modern AAA titles at 1440p high/ultra so far. The only caveat is that I haven't played notoriously demanding games like Black Myth Wukong.
Vengeful111@reddit
100% agreed, idk why people act like everyone needs to play the most demanding games released on ultra for a gpu not to be "dead on arrival".
Nvidia being stingy is still shit, both can be true at once.
AggressiveLocation2@reddit
I completely agree.
bugeater88@reddit
Yeah, my laptop has a 4070 mobile and a 2560x1600 display. It's not ideal but it's workable.
dreamsOf_freedom@reddit
I have a 3070ti and I am able to run most games on high (maybe with a couple tweaks) with 60-100+ fps with DLSS.
Bradley_5546@reddit
1080p with +10 average K/D is just fine with me lol
Terbarek@reddit
I bought a GTX 960 2GB and it was one of my biggest mistakes.
hyrulia@reddit
I still have a 960 4GB and it runs Wuthering Waves 60fps (low).
vladandrei1996@reddit
Isn't that mainly a mobile game ported to PC?
insaneahrix@reddit
WuWa is still graphically demanding. The latest Septimont map especially is on par with AA graphics.
hyrulia@reddit
No, that's Punishing Gray Raven.
Tookool_77@reddit
Both of them are mobile games bruh
hyrulia@reddit
Wuwa is multiplatform from the start (not mainly a mobile game first then ported to PC).
SLICKUID@reddit
It’s still made for mobile. It isn’t a hard game to run. Not really anything that would matter. Try using your 4GB VRAM card on an actual newer game actually made for console/PC.
AZzalor@reddit
Every game like that is a mobile game first, because mobile sets the limits.
1corn@reddit
I played through the entire story of Honkai Star Rail up to Penacony on my old 2 GB GTX 770. It was completely playable, even looked quite nice.
SpoiledCabbage@reddit
I just got a 1060 6GB and I'm playing new games all the time on High settings at 60fps. Tony Hawk 3+4 gets 60fps at 1080p. I haven't had a gaming PC since 2011, when I switched back to console. I've had a PS5 since launch, so any game that can't run on the laptop I can just buy on PS5. And there are very few games that actually demand that.
wrosecrans@reddit
I am also rocking a 1060. I'm planning on upgrading to a whizzy new 16 GB card soon, but that's because of actual work in DaVinci Resolve that I need to render faster. All the video games I play seem to work fine.
The way some folks will eagerly blow a grand on a video card for their gaming hobby, just so they never have to run the graphics settings below the absolute highest level, has always kinda blown my mind. Very few games actually require a super high-end GPU for a good experience, because that's a super niche market and they wouldn't sell many copies.
SpoiledCabbage@reddit
Yeah, I got this laptop for $200. It plays any PS4 game I was playing on my PS5 way better. I also play tons of older games anyway, since I never really buy new releases anymore, so I don't need all that, and if it's something I really wanna play I can just get it on my PS5 instead. I don't even have a 120Hz display and I got a cheap 4K TV, so I'm fine with 1080p and 60fps gaming.
katzengoldgott@reddit
Am on an RTX 3060 with 12 GB and I cannot complain 🫡
EndlessZone123@reddit
There was a 3GB version of the originally 2GB GTX 1050 as well, which lasted quite a bit longer; I remember it being used in gaming cafes. Lower performance was better than terrible performance and being unable to run some games in the near future.
throwaway85256e@reddit
Who expects their card to last 3-4 years while keeping the same graphical settings and FPS? That's just not going to happen with any card. Even the high-end cards will usually have to go from "ultra" to "very high" or something. It's an unreasonable expectation to have.
FlarblesGarbles@reddit
No one said the same. But it's simply a fact that a card with more VRAM but the same processing power will stay relevant for longer.
JJay9454@reddit
raises hand
My GTX 970 and i5-4690k ran everything from late 2015-2019 with high settings at 60fps. 2020 and later games I have to go down to medium to maintain 60. Games around 2023 and later now I'm at low/Off to maintain 60.
Infinifactory@reddit
Well... someone who pays over f'ng $400 should expect to at least not have to peg down to medium/low, regardless of how poorly optimized new Unreal slop games are; there should be standards. Remember that even adjusted for inflation this price point should've gotten you more. We are regressing, not progressing.
3x3x3x3@reddit
I’m pretty sure your GPU lasting more than 4 years was the consumer standard prior to 2020-ish. The RX 480/580 8GB was the standard entry-level to mid-range card of choice for years, 2016-2020.
typographie@reddit
The argument has never been that it isn't adequate for 1080p, especially if you drop settings. The argument is that you shouldn't be limited to that degree for the price.
This is a GPU that can do more than that, for a price that should get you into 1440p, artificially limited by a VRAM buffer we had on cards a decade ago.
MistSecurity@reddit
I don’t get why the 5080 doesn’t have 24GB of VRAM. Super stupid. Just holding it back so they can make the 5080 Super that much more enticing.
rubik33@reddit
that 24GB 5080 exists. It's called 5090 mobile.
MistSecurity@reddit
As in the silicon is the same as the 5080? Definitely not the same performance-wise from what I've seen.
rubik33@reddit
Yeah, it is the same chip, just with 3GB GDDR7 modules instead of 2GB ones. The power budget difference comes down to the cooling capacity a laptop can provide.
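(A quick sketch of the module math behind this, assuming the published 256-bit bus and the 32-bit interface of each GDDR7 chip.)

```python
# VRAM capacity = module count x module density; module count is fixed
# by the bus width, since each GDDR7 chip has a 32-bit interface.
bus_bits, bits_per_module = 256, 32
modules = bus_bits // bits_per_module        # 8 memory chips

print(f"RTX 5080:        {modules} x 2 GB = {modules * 2} GB")  # 16 GB
print(f"RTX 5090 mobile: {modules} x 3 GB = {modules * 3} GB")  # 24 GB
```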
MistSecurity@reddit
Ya, that's what I was getting at.
Didn't know that they used the same chip, that's cool. Are there any laptops that have a higher power budget? The ones I was looking at seemed to be in the 150-160W range.
rubik33@reddit
There are a few listed at the end of this article: https://www.notebookcheck.net/Nvidia-GeForce-RTX-5090-Laptop-Benchmarks-and-Specs.934947.0.html
They are mostly 18-inch desktop-replacement chonkers though. 160W seems to be the most the 16-inch form factor can reasonably cool.
MistSecurity@reddit
Thank you.
I'll have to remember this site for next time I'm laptop shopping or if anyone I know is. Pretty reliable in your experience?
rubik33@reddit
They are reputable and do pretty thorough testing if you want to look through technical data. They have a YouTube channel as well for shorter summaries.
Ok_Example_4819@reddit
Don't forget the resale value will tank by the time you want to replace it, since nobody will want an 8GB card. Higher-VRAM cards will hold more value.
pacoLL3@reddit
It's literally the second cheapest GPU on the market....
Paweron@reddit
That's just... not true at all.
Nvidia: 5050, 4060, 3060, 3050
Intel: b570, b580, a770, a750
AMD: 7600, 6700xt, 6650xt, 6600, 6500xt, 6400
All available at the one online shop I checked for less than the 5060
Ok-Parfait-9856@reddit
I think he meant this generation; otherwise you’re right, of course. I don’t think any of those cards beat the 5060 in raster though. The 5060 has the raster performance of a 3060 Ti or 3070, from memory. The 6700 XT and B580 are very close and have 12GB, so those are decent. I’d take the 6700 XT since it has 12GB of VRAM.
I think the 9060 XT 16GB is the best value card this gen by a mile. Same performance as the 5060 Ti 16GB, FSR4 is impressive, and it can go for $350 if you look around, in the US at least.
4tizzim0s@reddit
Did you mean to say second cheapest nvidia gpu?
SuperPork1@reddit
And yet it's still $300
juan_bito@reddit
I've heard many people saying that at 1080p it won't be good enough for high settings, when this is a blatant lie.
cvanguard@reddit
It absolutely isn’t enough in some games already, OP testing a lightweight, well optimized game literally tells us nothing about more demanding games.
Games like MH Wilds at ultra settings won’t load textures properly on the 5060ti 8 GB but work perfectly on the 16GB model, Oblivion remastered spills over into system RAM which causes lower frame rates and terrible stutters, same with the DLC area of Cyberpunk 2077 on RT ultra. Needing to use DLSS or lower quality settings to avoid VRAM limits when the card is powerful enough to handle native ultra settings is ridiculous.
South_Ingenuity672@reddit
I agree 100%. The Doom TDA devs clearly worked hard to optimize to fit within the 8GB buffer, but many games don't. Also, I thought Doom would have really benefited from a high-res texture pack, which would definitely not work on an 8GB card.
cvanguard@reddit
Yep. Games aren’t being developed with 8GB VRAM in mind anymore, especially since the PS5/Series X are the current console generation and they have more RAM available. Even with 4GB reserved for the OS, that’s still 12GB available for games, and devs are going to target those specs, since consoles are far easier to optimize for than all the possible PC configurations and the console market is far bigger.
It’s not surprising at all that Cyberpunk’s base game runs fine on 8GB cards and the DLC area doesn’t, when the base game was developed during the PS4/Xbox One era with 8GB unified memory and the DLC was developed after the PS5 and Series X released with 16GB unified memory.
Mandingy24@reddit
It's like the inverse of the 4060ti 16gb
corgiperson@reddit
I really don’t think you’re making the argument you think you are. Almost maxing out the VRAM buffer on a recent title is not a good thing. Yes it’s at Ultra quality but games will then release where you’ll hit that max at high, then medium, then low, and you’ll hit those a lot faster than if NVIDIA just decided to put another 8 GB chip on for 30 bucks.
bipoca@reddit
OP, I think people make a lot of worst-case-scenario predictions about this card. Maybe 8GB won't be enough for the bleeding edge a few years from now, but there's also going to be a point where developers have to consider what audience they want to reach with their games.
BF6 removing ray tracing in favor of better performance is a great example.
My personal theory is that requirements for the games being developed will keep increasing like they have been for another year before going stagnant, at least until other factors outside of PCs/gaming change.
Worst case for me, I end up being completely wrong and have to upgrade my GPU in 2-3 years. But I got a great deal on a pre-built that came with a 5060 ($46 over parts price on PCPartPicker), and given that I had an old gaming laptop before this, I'm pretty content.
Louiienation@reddit
Why would I want to use MFG on an 8GB card? Try testing other games where VRAM allocation exceeds 8GB, because they are out there. And yes, I can always turn down my settings, but I am not buying an Nvidia card to use minimum settings. I might as well buy a console and have a better experience that way for the same $500.
benjosto@reddit
Love how you tested with one game and generalized for everybody and every game.
Absolutely useless
Googoobeff@reddit
Rich idiot things.
Nighters@reddit
Dude tested one well-optimized game and is like: "8GB is great; we don't need more for that HIGH price." LOL
Googoobeff@reddit
He's totally clueless but rich.
CeriPie@reddit
You should probably test more than one game before coming to a conclusion?
Also, the single game you decided to base your entire conclusion on is particularly well optimized. Just kind of a strange pick altogether.
Googoobeff@reddit
He probably has no idea what he is talking about. Typical.
Googoobeff@reddit
Doom is not a good benchmark. Also, why are you using DLSS? That's for those of us with weaker cards. Doom is so heavily optimized it can run on almost anything.
SoftMaterial_Shower@reddit
Sorry but if you need software "workarounds" to deal with insufficient VRAM that's just a trash product to begin with.
Visible_Broccoli_987@reddit
Exactly, spending over 300 dollars on a GPU to play at a native render resolution BELOW 1080p and saying that’s fine is crazy
SoftMaterial_Shower@reddit
Sadly when it comes to the GPU space, it's functionally a monopoly.
Aecnoril@reddit
Yeah, it works well right now. Your tests show that even a pretty well optimized game like Doom (with DLSS) already takes up nearly your entire buffer. So wait, what happens when the (proverbial or real) next Doom releases?
bikingfury@reddit
The thing about most games is they have clever texture streaming which means your textures will simply look like scrambled eggs until they are loaded. You will notice that if you look for it.
Less VRAM is not just about fps, games look worse.
Illustrious_Cat6495@reddit
Sorry, but this sounds like copium
godisgonenow@reddit
Your first paragraph is basically "My son passed the test because he cheated, why are people saying he sucks?"
The 2nd paragraph is outright wrong: MFG doesn't increase the workload, it decreases it. That's the whole point of DLSS and MFG.
3rd, you do understand that DLSS and MFG performance relies on your GPU being capable of outputting a reasonable amount of raw FPS to begin with, right? It's working fine now. Next year? Medium. And the 2nd year? Low.
Using DLSS and MFG to justify its performance is big copium. I could also just buy a basic GPU like a 1030, subscribe to GeForce Now, and say "See, my GPU is working just fine, it can decode the stream!"
MongooseProXC@reddit
It's still probably a really decent card that I'd be lucky to have.
Critical_Mouse_8903@reddit
The 4060 8GB is fine; the 5060 Ti 8GB is not. It should only have had the 16GB variant, or they should have made just 1 version with 12 or something. Nobody would have said anything if that's how the cards released. The 8GB 5060 Ti is what started the whole thing.
TurkeySloth121@reddit
Why you’re being moronically obtuse:
zMassy_@reddit
Tbh I have a 5060 and I play Cyberpunk on ultra just fine, with DLSS ofc.
GrassyDaytime@reddit
Yeah, I have a 4070 Super and even 12GB is too low for EVERY setting to be maxed out. One example is the Resident Evil remakes. Some settings, if turned all the way up, will put the VRAM counter in the red because it needs more than 12GB.
lleyton05@reddit
Ok, while I know what you’re saying here, saying “try Cyberpunk”, which is the notoriously difficult-to-run game, is also a horrid point. 5090s don’t even run that game well. Also, “moronically obtuse” is really overly dramatic for what we’re talking about.
liaminwales@reddit
Doom is a bad example; you're talking about a game made by id Software. It's the most optimised game engine around, used by only 1 modern game. I ran Doom 2016 on my RX 580 at 4K! It was the only modern game at the time that ran well at 4K on my RX 580; their game engines are more magic than anything.
Then you say you're using frame gen at 4X, i.e. the game was running at ~50FPS. It's kind of hard to know if this is a troll post or not.
SpiderDK1@reddit
Yep, for 1080p it is totally OK. But for 1440p or 4K... I have 4K and a 5080, and sometimes 16GB is not enough for a full-throttle experience...
wrsage@reddit
Some games use more than 12GB at 1080p. I have an 8-gig card and it couldn't handle 2 games that released last year, even at minimum graphics.
juan_bito@reddit
There is no new game at 1080p that comes close to what you're saying. Either name the game or don't post nonsense.
wrsage@reddit
Try enshrouded. You're welcome.
juan_bito@reddit
Just did. It takes 6GB on ultra, still not even close to running out of VRAM lol.
wrsage@reddit
Nah, I've got a 5700 with a 4060 8GB and it ran out of memory. It was extremely unstable, stuttering, and the GPU kept running at 100% with 8GB usage.
juan_bito@reddit
Not the same experience I'm having.
pacoLL3@reddit
Stop lying!
PropertyFirst3804@reddit
He’s not lying lol. To name two: MH Wilds and the new Mafia game. 8GB is not enough anymore.
TheYoungLung@reddit
Which games?
cstark@reddit
https://tpucdn.com/review/hogwarts-legacy-benchmark-test-performance-analysis/images/vram.png
Idk if this has changed since Feb 2023 but 9GB at 1080p Ultra. 14GB if you add ray tracing 😅
I’d like to see more examples from people that claim this though (cause I haven’t got to dig too deep into this).
https://tpucdn.com/review/assassin-s-creed-shadows-performance-benchmark/images/vram.png
https://tpucdn.com/review/kingdom-come-deliverance-ii-performance-benchmark/images/vram.png
https://tpucdn.com/review/dragon-age-the-veilguard-fps-performance-benchmark/images/vram.png
TheYoungLung@reddit
Brother, all this does is tell me that if you don’t care about RT and use DLSS, 12GB is plenty sufficient for 4K.
cstark@reddit
I’m not even sure how well a 5060 would do with Ultra + RT anyway.
TheYoungLung@reddit
I don’t disagree that a 5060 wouldn’t do well, my point is that the hysteria on Reddit around 12GB of VRAM is overblown
cstark@reddit
Yes, people should post more information about their claims.
Mandingy24@reddit
But only ~1GB more for 1440p ultra, and another ~1GB for 4K ultra.
These benchmarks mostly indicate that 12GB of VRAM is still completely viable even at 4K. Most of these are even 4K ultra and still not hitting 12 until you turn on RT. But even as a fan of RT running a 4070, most games have awful implementations and it's hardly worth it most of the time anyway.
So yeah, I find it extremely unlikely that 1080p games are hitting over 12 like the other guy claimed, at least not without some modded fuckery going on.
Quiet_Try5111@reddit
I have a dual 1440p and 4K setup. I managed to hit the 16GB limit on my 5080 at 1440p, but that's just one game. 16GB is still mostly perfect for 1440p, and for 4K to some extent.
Liquidbudsmoke13@reddit
I currently have a 5060 Ti 16GB and on price to performance it's unbeatable: $399 for the 5060 Ti 8GB or $449 for the 16GB. I mean, it's one of the best cards out there for the price. I've played Bodycam on max settings and it's the smoothest gameplay, same with any other title; it doesn't struggle at all and I can stream as well!
aliwalyd31@reddit
How much did Nvidia pay you to post this?
Technical-Swimming74@reddit
Bro played 1 game that works well with 8GB and thinks he made a discovery. It's not as bad a card as people say, but the VRAM does limit you in MANY games.
Neurogenesis416@reddit
You used one of the most optimized games out there, from one of the most savvy developers, with levels that (and I'm sorry, Doom fans) aren't that outstanding in scope, at a resolution that's frankly rather ancient. And you're nearly maxing out the VRAM if you don't use an upscaler... at 1080p no less...
My man, come on, this isn't the argument you think it is ...
andrew_2k@reddit
You're nearly maxing out today.
It won't last long then, will it?
corgiperson@reddit
That’s the thing these people aren’t understanding. The card can barely run ultra today. So it’ll barely run high a year from now, then medium, then low, and then you have a useless piece of sand that needs to be replaced sooner than if you’d just bought a previous-gen or used card.
HodeShaman@reddit
IMO it's honestly shocking to me that people expect PC hardware to last 5+ years before they notice it struggling.
Almost nothing else does.
corgiperson@reddit
The point is the card would have significantly more longevity if they just slapped another $30 VRAM chip on it.
andrew_2k@reddit
Leave the corpos alone!
corgiperson@reddit
For real. God forbid NVIDIA gives the consumer a crumb for once and puts another VRAM chip on that they presumably get for a huge bulk discount. Instead people almost praise it in some way like NVIDIA are true geniuses handicapping the card just for the budget market.
lcirufe@reddit
On this episode of ewaste enabler…
4514919@reddit
You guys are so entitled lmao
lcirufe@reddit
Gentle reminder that Nvidia is the 2nd most valuable company in the world and has been neutering generational uplifts on xx50 and xx60 series cards since the 4060.
4514919@reddit
That has nothing to do with acting as if not being able to run Ultra settings makes a GPU e-waste.
lcirufe@reddit
You seem to be fixated on the ultra settings thing. My response was to the guy who thinks we’re asking too much for a GPU to acceptably run games after 5 years
4514919@reddit
Are you for real? This entire post is literally just complaints about 8GB GPUs not being able to run max settings.
andrew_2k@reddit
No, it's about paying a premium for an artificially handicapped product. Max settings are just one part of it that proves the point.
Reasoning like yours is exactly why we don't have 12GB baseline GPUs. Back in the day these entry-level GPUs were pretty incredible. Can't believe how quickly people forgot about the 1060 6GB.
4514919@reddit
Pretending that a GPU with flaws, but perfectly serviceable, is not e-waste is the reason why we don't have 12GB baseline GPUs? This is a pretty stupid take even for this subreddit.
andrew_2k@reddit
The point is simple: 8GB consumer GPUs suck. The market has moved nowhere. The corps are gearing production this way for no reason.
That's not just my take, but the take of people who know much more about this than you and me together. People need to stop arguing for it and start arguing against it; guess what's going to happen otherwise.
You're paying a premium for a handicap.
You seem to not understand the correlation between the price and what you're getting for it.
Are you getting a serviceable GPU? Sure. -- Are you paying the correct price for it not lasting, forcing you to lower settings, limiting you to a resolution that is falling off? No, hell no. (Of course there are deals to find, but that applies to more powerful GPUs too.)
The argument isn't mainly about the GPU exploding if you turn on a game. It's about people continuing to settle for the most baseline mediocrity from multi-billion dollar corporations, and then defending them for producing a product that is overpriced and under-performing.
Remember your "yeah, but it's serviceable" argument when they release cards that are even more expensive with the same amount of VRAM in a generation or two and suddenly you can't "serviceably" run games on them. Enjoy paying 400 dollars for mediocrity and then defending it because it's at least "serviceable".
We used to get entry-level cards that had VRAM for days and ran almost anything. For Christ's sake.
4514919@reddit
Jesus Christ, saying that it's not e-waste doesn't mean that what they are doing is fine, that the product is perfect and the price is correct. It's not black or white.
The 1060, which you seem to love so much, released with an inflation-adjusted MSRP of $400, had less VRAM than the competition (which was also cheaper), and 2 generations later couldn't hold 60 fps in most modern AAA games. And guess what? The manufacturing costs of a 4060 are higher than a 1060's and it's being sold for less.
JJay9454@reddit
But that's an outlook for 3-4 years, right? People here talk like 8GB is gonna be unusable in 3 months.
grimreefer213@reddit
They should be freaking out if they intend to play certain games, like MH Wilds. 8gb in that game is completely dead and people need to realize it's highly game dependent
JJay9454@reddit
But that's not the card's fault. That's not a discussion about the card being able to hold up; that's a discussion about Monster Hunter Wilds specifically.
Let's take an old example: I got my GTX 970 instead of an RX equivalent because the Nvidia card performed far better in the games I wanted to play (mostly Borderlands 2 at the time). That's not because of the RX card; it's because the game was poorly developed at the time. Nowadays, those cards give almost the same performance after the years and years of patches.
lcirufe@reddit
“It’s not Nvidia’s fault” is the weirdest case of corporate bootlicking I’ve read today. The company is worth more than Apple rn. There’s no excuse.
JJay9454@reddit
I didn't excuse them or their actions. Where's the bootlicking?
PropertyFirst3804@reddit
lol stop the nonsense.
grimreefer213@reddit
Sure it is. It's already not 'holding up' in real-world cases now. If it were, then everyone would be running out and buying one.
PropertyFirst3804@reddit
lol the new Mafia release is unplayable today at 1080p, all low settings, with 8GB of VRAM.
JJay9454@reddit
https://www.reddit.com/r/buildapc/comments/1n5vq04/comment/nbwtc1f/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button
PropertyFirst3804@reddit
JJay9454, you can quit trying to slide into my DMs lol. Anything you have to say, say it here or don't say it at all...
JJay9454@reddit
This is a new one to me, haha. What?
I asked you if you're okay, because your reactions here seem extreme. I messaged you asking if you were alright.
PropertyFirst3804@reddit
Copy/paste what you want to say bro
AbrocomaRegular3529@reddit
In this case they could pay a little bit more, get the 16GB variant, and use it for 8 years.
The problem is that 8GB cards are overpriced from NVIDIA. I built a 4060 8GB PC for my sister because all she does is play League of Legends, literally no other game, so we bought a pre-built PC on sale. 8GB GPUs of course have their place, and are widely purchased in 3rd world countries where people can only afford the cheapest model.
The problem is they are so expensive that NVIDIA is effectively forcing people to buy the 16GB variants, which are also overpriced.
TristanTheta@reddit
Well, it totally depends on use cases and what you're buying the card for. 8GB will neuter the capability of the card, even if it has the raw horsepower to run future games on max settings. You say 3-4 years, but we already have games out right now that the 5060 Ti 8GB can't handle at max settings because of VRAM issues. Sadly, it's only going to get worse. So depending on what you're going for, the 8GB cards will quite quickly stop being able to run many AAA games at max settings 1080p.
JJay9454@reddit
Agreed. I'd say it's because we're at 6 months; at 1 year we'll see the last drop of Max/High for it, according to the timeline you and I have laid out.
Normal-Emotion9152@reddit
The 16GB version can use path tracing in ultra performance mode. The graphics are great despite the massive upscaling.
AnonymousSadGuy2@reddit
No one said it's bad for full HD; people are usually talking about 1440p resolutions or more. There it is not enough.
viperabyss@reddit
Of course it’s exaggerated. Benchmark after benchmark shows 8GB of VRAM to be perfectly fine for most games at 1080p, even at high settings. It’s just people getting riled up by “influencers” who were just looking for clicks.
vice123@reddit
The debate on 8GB of VRAM is pointless. There is no "futureproofing" in technology.
Find the cheapest GPU that fits your needs.
Local-Ask-7695@reddit
Another futile attempt at justifying a bad purchase. The 16GB one will outlast this one by 3 years.
rolim91@reddit
Famous last words. 24 is the way to go.
Local-Ask-7695@reddit
For a 5060, when/how/in which case will you need 24?
Quiet_Try5111@reddit
A guy named Daniel Owen covered something related to this when he was comparing whether the 6700 XT 12GB or the 3060 Ti 8GB would age better.
tl;dr is that they are both a tie. Intensive games run better natively on the 6700 XT than the 3060 Ti. However, DLSS on the 3060 Ti means gamers can take a slight performance loss and still get the same visuals and fps as native.
Best of both worlds is having high VRAM and good upscaling. I can see why the 9060 XT 16GB is better value as an entry-level GPU compared to the 5060.
Infinifactory@reddit
Stop calling it entry level, what the hell is this newspeak? It's over $350 in most countries. Entry level used to mean GTX xx30-xx50 class. These cards beat some of the best-selling performance-segment cards from the previous generation.
4514919@reddit
Entry level just means the lowest level of the hierarchy. A $100,000 Ferrari can be entry level if it's the cheapest offering they have.
TristanTheta@reddit
You really want to call esports cards that can't handle 5-year-old games at 1080p entry level? OK lmao. Sadly, $350 is the new entry level.
e270889o@reddit
Indiana Jones 1440p. Try that
lleyton05@reddit
“Your famously well optimized game is bad data, play this notoriously vram hungry game instead to prove I’m right”
e270889o@reddit
No. It’s an argument against the OP saying that 8GB is still perfectly viable for 1080p when it’s not: some newly released games crash or tank to single-digit fps.
t90090@reddit
8gb is a cot dang shame!
TheOutrageousTaric@reddit
I'm not paying €300+ for a new card that's capable of running AAA games at 1440p high but is gimped by VRAM. If I wanted the 1080p 8GB experience, I'd buy a used card for half the money.
dorting@reddit
This. The card is bottlenecked by its own memory; at this point just buy an even cheaper GPU.
Jermaphobe456@reddit
I recommend that anyone who can possibly do so save up the extra for a 5060 Ti 16GB. It's the best performance-per-dollar card for its tier on the market currently.
TheOutrageousTaric@reddit
The 5060 Ti 16GB is really bad value when the 9060 XT 16GB exists and costs much less.
PropertyFirst3804@reddit
Isn’t the 9060 xt 16gb 5% less performance but like 20% less money?
AbrocomaRegular3529@reddit
Yes it is; the 9060 XT is the best value GPU. If someone is in the market trying to build the best system with the least amount of budget, then the 9060 XT is a no-brainer. It is even better value than the Arc B580, which is $250.
PropertyFirst3804@reddit
Only reason I know is I was looking to purchase at that tier as a gift. To be honest, I did purchase the 5060 Ti 16GB over the 9060 XT 16GB. But it was clear to me the value in pure rasterization was definitely with the XT. I went with Nvidia for the better upscaling support and ray tracing. A big part was that I knew my friend wouldn't be able to handle modding in FSR4; he plays a lot of multiplayer, and I didn't want to worry about him being banned if he used OptiScaler.
AbrocomaRegular3529@reddit
The 9060 XT is the best value GPU per $. According to Hardware Unboxed, it is not only the best value card today, but one of the best in 2 decades (considering inflation).
__breadstick__@reddit
Whether people on the internet like the 5060 or not, it’ll find its way to the top of the steam hardware survey no problem.
Debesuotas@reddit
The 4060 with 8GB of VRAM is still a good card. No need to rush for the overpriced, hyped products.
Infinity_777@reddit
What's a good GPU for VR? I am currently using a Lenovo Legion with Mobile RTX 3070 8GB. I heard VRAM is way important for VR and apparently AMD GPU drivers are shit for VR
True_Address5741@reddit
What idiocy. Obviously you only used 7GB of VRAM because the amount you can use is capped at 8GB. The real test is to then use a 16GB card and see what the maximum consumption is; if it's more than 8GB, then 8GB is not enough.
If I have 12GB of RAM in my system and open Chrome, I can only ever use 12GB of RAM, not 16, so I'll never notice.
buildapc-ModTeam@reddit
Hi there! Thanks for commenting in r/buildapc.
We ask that posts and comments be in English so they can be understood by as many people as possible. Translations on Reddit are client-side, and not all apps or browsers support auto-translate. Currently many users (and moderators) aren’t able to read your comment.
Could you please resubmit your comment in English?
faluque_tr@reddit
Of course. Sure. You are the guy buying an 8GB VRAM Nvidia GPU in 2025, after all.
St3vion@reddit
I heavily dislike the VRAM wankery on reddit. It's definitely not as bad as some are saying and you don't need 16GB VRAM for 1080p nowadays.
But if you're turning on DLSS, it's not running at 1080p. You're talking about it running great at 720p in 2025. That's where the 8GB is an issue and I think performance is unacceptable. 60 series cards should do well at 1080p native - no upscaling. Upscaling is a crutch that makes sense at 4k and looks good to the point you can't tell. At 1080p upscaling being on is almost always obvious and distracting.
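(For reference, a quick sketch of the internal render resolutions behind that point, using the commonly cited DLSS per-axis scale factors; exact factors can vary by title.)

```python
# Internal render resolution for each DLSS preset at a 1080p target.
presets = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
target_w, target_h = 1920, 1080

for name, scale in presets.items():
    w, h = round(target_w * scale), round(target_h * scale)
    print(f"{name}: renders {w}x{h}, upscaled to {target_w}x{target_h}")
# Quality: renders 1280x720 -- "1080p with DLSS Quality" is a 720p
# internal render, which is the point made above.
```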
MaikyMoto@reddit
Everyone knows that the 5060 is a 1080p card. The problem we have is that some users state that 8GB is plenty for 1440p, and that simply is not true.
Kagura11@reddit
This is bait. Don't fall for it people.
animeman59@reddit
Test more than one game and then get back to us
NovelValue7311@reddit
8gb is enough. I agree.
It's not future proof though. That's why a 12gb 5060 would be so much better.
sleepytechnology@reddit
Counter-Strike 2, a competitive FPS, uses 6-7GB of VRAM at 1440p low-medium settings. That's more than Cyberpunk uses at 1440p high settings.
If you think VRAM isn't going to be an issue in the future with 8GB at 1080p, ignorance is bliss I guess.
postsshortcomments@reddit
I'm not directly sure about CS2's backend, but VRAM can address a lot of optimization issues in games with many cosmetic skins. CS2 at least has this somewhat under control thanks to 5v5 (primary cosmetic, secondary cosmetic, knife, glove cosmetic, agent model). With 10 players in a match and 5 skins per player, that's 50 unique skins in play during any match.
In CS2's case, it's a fairly minor concern that Valve can quite easily calculate a "worst case scenario" for; let's say 300MB/skin. While the real number is significantly less than this, if we estimate 300MB per skin and have 50 unique skins in play during a match, that adds up to 15GB real quickly. And of course, VRAM concerns more than just skins (map textures themselves are VRAM hogs). But at least Valve has this in a reasonable range with CS2. Given that Valve, and especially CS2, are known as optimization kings, a VRAM shortage on an 8GB card would perform fine, but you may see some "texture pop", especially when you pick up an item on the ground with a skin that hasn't loaded (texture streamed) its HQ model in yet.
Each of these skins is often based on several texture maps of up to 4K. In many cases, you'll see at least 5-6 images for things like normal maps, roughness maps, diffuse maps, metallic maps, etc. The way it works is that the GPU stores these actively in VRAM and, with texture streaming, if there is a shortage it "streams" the ones it prioritizes most.
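(A sanity check on that ~300MB/skin figure, assuming uncompressed 4K RGBA maps; real games use block compression, so the actual cost is lower and these numbers are a worst case.)

```python
# Worst-case VRAM cost of unique cosmetic skins in a match.
width = height = 4096                # "4K" texture maps
bytes_per_pixel = 4                  # RGBA8, uncompressed
maps_per_skin = 5                    # diffuse, normal, roughness, ...

per_map_mb = width * height * bytes_per_pixel / 2**20   # 64 MB per map
per_skin_mb = per_map_mb * maps_per_skin                # 320 MB per skin

unique_skins = 50                    # 10 players x 5 cosmetic slots
total_gb = per_skin_mb * unique_skins / 1024
print(f"~{per_skin_mb:.0f} MB per skin, ~{total_gb:.1f} GB worst case")  # ~15.6 GB
```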
Now let's talk about less-optimized titles with cosmetics. This is especially true in battle royale or MMO-type games that may not have gotten as much optimization love, especially in the early stages of development. In those cases, many 2025 ultra-quality cosmetics become a great performance concern. In some cases, cosmetics are afterthoughts, or pushed by a less tech-savvy company that bought the game after development. Or developers will build a crazy beautiful prototype knowing that novice investors might not understand the impact on future monetization models (i.e., "we used all of our budget, and while there are solutions, they'd probably be a step down from the next-gen-looking presentation that currently looks a step ahead of our competitors for a reason"; i.e., too good to be true, and the same tradeoffs other developers have made will have to be made to make it work).
I like to compare it to a budget. A novice developer may see 8GB of VRAM as their early development target and inch up to that boundary, as it will give them the best results if everything remains the same, only to then have a second team jump in and add a bunch of pretty cosmetics without realizing that the original developers were already edging the 8GB limit to conform to hardware standards. Now, all of a sudden, instead of a 12-hero limit across 60 players where the same model is shared by ~5 players, with even just 5 unique skins per hero you're hitting 5x more skins potentially loaded at once (all 60 players can use a unique skin). And that's not even including weapon skins, etc.
Again, solutions like texture streaming always keep it playable, but now the VRAM budget that's already borderline 8GB is being cut into and something else has to be sacrificed. Let's add other novice mistakes on top of that: if the VRAM cap and player models are prioritized over the environment, which some teams may prioritize as "you don't want the things people are specifically buying to look worse", and which is very likely for an early developer whose project lead initially misinformed them that "no, these are the only skins we'll ever have in the game", you start getting a project that can effectively become "unfixable" relative to its original on-release performance. So yes, neat back-end optimization can help address the issue, but it will probably never look or perform as well as the original pre-cosmetic prototypes.
Again, you'll probably never see this with a Valve title, as they are the kings of that model and understood the limitations fully. But if you start playing some early-access battle royale titles or other free titles with cosmetics, that VRAM is a massive game changer, especially as we see new releases conforming to what 3D modelers often refer to as 2K/4K textures. (In games that have them we'd call this "ultra quality", but ultra quality is less a specification than a word synonymous with "the maximum for this specific title", and ultra can be, and often is, just 1K textures.)
So imagine a perfectly optimized early-prototype of a title with 12 default skins across 12 classes/hero. Now imagine 60 players jumping out of a spawn ship, in 60 different cosmetic skins, with their uniquely skinned gear on their back etc., That's especially where VRAM shortages start really kicking you in the butt.. and if a game developer is pushing their beautiful prototype that looks generations ahead of existing, successful titles.. it's often not a good thing but instead a bad thing as they've probably underestimated how massive of a trade-off cosmetics will eventually cause - especially if they're a hero title pushing player-model qualities beyond what any other company has ever succeeded at.
Achillies2heel@reddit
Gaming at 1080p in 2025 is just 😑
k20vtec@reddit
⏳⏳⏳⏳ games are only getting more and more bloated
OwnImpression7486@reddit
Now try Microsoft flight simulator on Ultra 😂😂
GCoderDCoder@reddit
Unpopular opinion: the expectation that 1080p cards stay playable at anything but the lowest possible settings is holding gaming's visual improvement back. It's hard to increase visual fidelity with the same low standards.
I don't own a 1080p screen. My phone is QHD. My TVs and monitors are 4-5K. I'm not the target buyer for a 5060, but I remember when the nice 55-inch HD TVs were 720p. With the 50 series, 8K is becoming not absurd.
I don't think we should be telling people to aim for playable 1080p. Such a card should be described as "if you can't afford any better", not "this is ok".
Shad_Owski@reddit
Lil bro tested one game and came to this genius conclusion.
This is why reviews exist.
Available-Ad6751@reddit
Most folks callin 8GB outdated are either rich and keep swappin parts, or just followin the crowd while they still usin 8GB VRAM—or even less. Nvidia ain’t dumb droppin an 8GB card with no reason. The 5060 still got its own audience. For people like me, 8GB right now and for at least 3 more years is plenty. Just 1–2 gens apart can’t be called outdated. Especially for players who only hit multiplayer games, 8GB is more than enough.
Psychological-Part1@reddit
All depends on the games you play.
Plus, most of the negativity around it is rooted in its future-proofing, given how pathetically most triple-A games are released these days.
Ok-Race-1677@reddit
Now test it while having a YouTube video open and maybe discord at the same time. Doesn’t sound like it should make a difference but suddenly it does
aCuria@reddit
When you have less VRAM, games will try to use less memory. It will still run but won’t look as good.
_Rah@reddit
Congrats. You just tested one cherry picked game and ignored all the testing done by other people.
Some_Finger_6516@reddit
The thing is, the 5060 has to rely mostly on DLSS and upscaling at 1080p...
dllyncher@reddit
8GB VRAM is more than enough for esports titles.
BunnyGacha_@reddit
garbage price for a lackluster card
DumptruckIRL@reddit
There are a few 5060 Ti 8GB vs 16GB benchmarks out there. In some games the 8GB is handicapped and runs way slower than the 16GB. So only "some" games now, and in 4-6 years? 4-6 years is what I'm assuming most people keep a card for, i.e. 2 generations.
The 3060 already used 2GB modules with a 192-bit bus. The 4060 and 5060 both use 2GB modules too, but only come with 4 modules instead of 6. Why? Greed?
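(The module counts follow from the bus widths, since each GDDR6/GDDR7 chip has a 32-bit interface; a quick sketch of that arithmetic.)

```python
# Bus width fixes the module count; module count x density = capacity.
bits_per_module = 32
for card, bus_bits, gb_per_module in [("RTX 3060", 192, 2),
                                      ("RTX 4060/5060", 128, 2)]:
    modules = bus_bits // bits_per_module
    print(f"{card}: {modules} x {gb_per_module} GB = {modules * gb_per_module} GB")
# RTX 3060: 6 x 2 GB = 12 GB; RTX 4060/5060: 4 x 2 GB = 8 GB.
# Cutting the bus from 192 to 128 bits is what cut the capacity.
```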
kickedoutatone@reddit
I fucking hate how OK people are with the way AI is getting used in the gaming industry.
It's just being used as another excuse to bring out badly optimised games. In an industry that's already pretty lazy in optimising games in the first place.
Super sampling is atrocious. Even though DLSS is the best version of it currently, it still looks like garbage compared to traditional rendering.
And the 5060 making you dependent on super sampling is a terrible thing, because it tells me it's only going to get worse as more and more games are released with a reliance on AI upscaling, because the industry can't be bothered to render their games properly anymore.
ciwfml@reddit
We shouldn't need fake frames and upscaled low resolutions for decent performance at the midrange. It's posts like this that normalize shittiness, and why we end up accepting it in the end.
VHDT10@reddit
I love the 5060. I've had no problems at all. 8 gigs is a lot. I'm talking about at 1080p, of course.
Kittysmashlol@reddit
The real problem isn't that it can't play games NOW (that does happen, but not particularly frequently), but that, as you showed, the VRAM is just barely enough to keep it going. 8GB cards are still fine RIGHT NOW, but they are quickly running out of track, even when the chip itself is still more than powerful enough to push the frames. VRAM use has been going up in games; that trend is not going to stop. A 16 or 12GB version of the 5060 would have been far more relevant for budget gamers for far, far longer than the 8GB we have now could ever hope to be.
ogbIackout@reddit
How was that input lag though
AdMaleficent371@reddit
There's no good reason to buy an 8GB card nowadays; it would be a lot better if you saved for the 16GB. And you've only tested one game. What about other games, especially the latest released titles?
DigitalRonin73@reddit
I also bought a 5060 8GB. In my defense, though, there aren't a lot of low-profile options that fit in a 4.5L case. I absolutely would have loved to go 12 or 16GB. It's not my main rig and a lot of the time I stream from my main PC anyway. It's a Bazzite box for lazy couch gaming.
Sure-Wish3240@reddit
1080p runs OK with 8GB of VRAM. Even in Black Myth: Wukong at ultra.
Two side effects of the naysayers: 8GB cards are cheap. So is the 5070. The 5060 Ti 8GB and 5070 are the first competitively priced green cards in a long while.
Potential-Cat-7517@reddit
Bro, you are literally using upscaling on one of the best cards out there. VRAM is an issue if you want to run the game at native or higher resolution, which is what good graphics are about.
blob8543@reddit
Are you really making the case that 8GB is enough based on just one game?
There is plenty of evidence out there of how the 5060 Ti 8GB and 16GB versions compare. Obviously many games are fine with 8GB, but there are also many where they perform worse. And this is with 2025 games or older; the issues that come with 8GB will get worse with time.
TheGamerX20@reddit
Ah yes, let's test a single well-optimized game and make deductions based on it... You are running close to the limit RIGHT NOW; how do you expect things will be in 2 years' time? On a card that you spent $300+ on?
Not to mention some games won't even stutter at all, but you will have a visible reduction in texture quality, as the game only loads in the higher-quality assets for objects that are right in front of the camera while everything else suffers, for example in Halo Infinite.
RendyZen@reddit
I could not play Space Marine 2 the way I wanted at 1080p.
FinancialRip2008@reddit
'information i didn't want to hear was too widely known'
FlarblesGarbles@reddit
The actual issue is that Nvidia is putting just enough VRAM on its cards to get by now, and that will be the main cause of those cards having poor longevity.
It's not misinformation at all. It's simply a fact that a 5060 with 16GB of VRAM will have more longevity than a 5060 with 8GB.
Final-Owl-4321@reddit
I just bought the 16GB version from Micro Center for MSRP. Do you guys think that card has a bit longer of a shelf life than the 8GB? Upgraded everything else first, and that was the best I could afford.
wolfe_man@reddit
That's one game lol
micjosisa@reddit
In the year 2025 AD, AMD and Nvidia should not be pushing 8GB VRAM on mainstream GPUs. Collectively, we should refuse to purchase (boycott) these particular models.
artemnet@reddit
It's Doom, running on id Tech, by John Carmack. Try Indiana Jones or Alan Wake 2.
Infinifactory@reddit
It's no longer touched by Carmack, he left more than a decade ago.
artemnet@reddit
They're just operating on JC's genius :)
Kingdom_Priest@reddit
Uh, excuse me. Did you even say thanks to Lord God King Jensen for even giving you the privilege of 8GB of VRAM?!
Jackal-Noble@reddit
Thank you for posting this. It's amazing what you can accomplish when you actually know how to set graphics settings.
Withinmyrange@reddit
You misunderstand both what proper benchmarking is and how graphical and VRAM demands keep increasing.
LeadingAd2738@reddit
I think it's good for 1080p gaming, but ultimately the issue comes at 1440p, which realistically should be the new median for gaming.
untraiined@reddit
Doom is one of the easiest games to run. Please test with more; do not just test with one game that is a linear, small-map title designed to run at high fps. You might not even be able to load into Kingdom Come at 4K.
Infinifactory@reddit
Mate, I was playing Doom Eternal on a GTX 1060 with 3GB of VRAM and it ran OK-ish: stuttery, with dynamic resolution and all that... You experienced the same thing with the 5060, with upscaling and fake frames. Just because the counter says it isn't using the full 8GB doesn't mean it wouldn't use more, and benefit from more, if it had it.
It falls short in many ways, and if you're not on PCIe 5.0 with the 5060 you're screwed, because once VRAM runs out it starts loading assets into system RAM and the PCIe bandwidth becomes the bottleneck, despite the GPU itself being perfectly capable of more (rough numbers below).
It's an unbalanced, bad product; it shouldn't exist. It would have to be much, much cheaper to be worth considering. You can say all you want about poorly optimized games, but the case for the 5060 and 9060 8GB is closed: they have poor sales, and they deserve worse sales so the GPU makers learn their lesson.
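To put rough numbers on why spilling into system RAM hurts so much, here's a minimal C++ sketch using published spec-sheet figures (the data rates are assumptions from the usual spec sheets, not measurements; check your own card):

```cpp
#include <cstdio>

int main() {
    // RTX 5060 local VRAM: 128-bit GDDR7 at ~28 GT/s per pin.
    const double vram = 128.0 / 8.0 * 28.0;   // ~448 GB/s
    // The 5060's x8 link: ~2 GB/s per PCIe 4.0 lane, ~4 GB/s per PCIe 5.0 lane.
    const double pcie4_x8 = 8 * 2.0;          // ~16 GB/s
    const double pcie5_x8 = 8 * 4.0;          // ~32 GB/s

    printf("VRAM:        %.0f GB/s\n", vram);
    printf("PCIe 4.0 x8: %.0f GB/s (~%.0fx slower than VRAM)\n", pcie4_x8, vram / pcie4_x8);
    printf("PCIe 5.0 x8: %.0f GB/s (~%.0fx slower than VRAM)\n", pcie5_x8, vram / pcie5_x8);
    return 0;
}
```

Either way, any asset that spills out of VRAM gets fetched over a link more than an order of magnitude slower, which is why frame-time spikes show up as soon as the 8GB fills.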
PropertyFirst3804@reddit
One game's results are meaningless. There are enough examples of extensive testing proving your hypothesis wrong…
K3V_M4XT0R@reddit
VRAM amount and bus width don't always go hand in hand. My 3060 has 12GB of VRAM, but it has a 192-bit bus; your 8GB 5060 has a 128-bit bus, if I'm not mistaken. Then again, that's not written in stone, since the 5080 has 16GB of VRAM and a 256-bit bus. Your card moves less data per clock with that 128-bit bus width, so overall you can expect lower performance when maxing out settings.
Scottamemnon@reddit
192-bit GDDR6 has lower bandwidth than 128-bit GDDR7, and PCIe 4.0 has lower bandwidth than PCIe 5.0. The equivalent would be a 256-bit GDDR6 card from the 3000 or 4000 series. Each PCIe generation doubles throughput.
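The memory side of that is simple arithmetic: bandwidth = (bus width in bits / 8) × per-pin data rate. A quick sketch using the commonly published data rates (assumed, not measured):

```cpp
#include <cstdio>

// Bandwidth in GB/s from bus width (bits) and per-pin data rate (GT/s).
double bandwidth(int bus_bits, double gtps) { return bus_bits / 8.0 * gtps; }

int main() {
    printf("RTX 3060, 192-bit GDDR6 @ 15 GT/s: %.0f GB/s\n", bandwidth(192, 15.0));
    printf("RTX 5060, 128-bit GDDR7 @ 28 GT/s: %.0f GB/s\n", bandwidth(128, 28.0));
    return 0;
}
```

So the narrower bus still comes out ahead (~448 vs ~360 GB/s), because GDDR7's per-pin rate is nearly double GDDR6's.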
turboMXDX@reddit
Assuming whoever gets the 5060 has PCIe 5.0; otherwise those 8 lanes are rather shit.
K3V_M4XT0R@reddit
When I spoke about mine, I was comparing bus widths. Yeah, mine is slower, but a 4090 has a much wider 384-bit bus, which, coupled with 24GB of VRAM, obliterates the 5060 by miles.
ThatOneHelldiver@reddit
Sad, considering the 4060 Ti had a 16GB model.
kyguy19899@reddit
There's no point in buying 8GB cards at all when 16GB cards are literally cheaper. $405 for my XFX RX 9060 XT OC. You may not experience issues now, but you'll likely have to upgrade within two years. It makes no sense; you might as well spend less money and have it last longer. Literally killing two birds with one stone.
kokosgt@reddit
Unless you like to play with RT enabled.
kyguy19899@reddit
It's overkill and 100% not needed.
kokosgt@reddit
Says you.
AGhost118@reddit
Please don't make excuses for Nvidia; 8GB cards in 2025 are not good. These kinds of posts will only motivate Nvidia to make the RTX 60 series with 8GB too.
dfm503@reddit
While it's good enough for 1080p for now, it'll last at most until the next console generation drops, which is likely to render all of the 8GB cards obsolete at the same time. You can argue that the 10 series is finally obsolete due to its lack of ray-tracing support, but the 2070 is still usable, and I don't think the 5060 8GB is going to outlast it by much. The thing about upscaling and frame gen is that they do really well at making playable experiences better, but once the native frame rate sucks, the added input latency makes what would otherwise be an acceptable frame rate feel really crappy.
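The arithmetic behind that last point, as a minimal sketch (the latency model here is a crude assumption of roughly one native frame held back for interpolation, not a measured pipeline):

```cpp
#include <cstdio>

int main() {
    const double presented_fps = 60.0;              // what the fps counter shows
    const int    mfg = 4;                           // MFG x4: 1 real frame + 3 generated
    const double native_fps = presented_fps / mfg;  // 15 fps actually rendered
    const double native_ms  = 1000.0 / native_fps;  // ~66.7 ms per real frame

    // Frame gen holds back roughly one native frame to interpolate between,
    // so the added input delay is on the order of one native frame time.
    printf("native: %.0f fps (%.1f ms/frame), added latency on the order of %.0f ms\n",
           native_fps, native_ms, native_ms);
    return 0;
}
```

That's why 60fps via MFG from a 15fps base feels nothing like native 60: the game still only responds to input at the 15fps cadence, plus the held-back frame.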
jhenryscott@reddit
I have run every game I tried on a 3050 4GB laptop. Obviously at dogshit settings sometimes but that makes sense for a cheap ass gaming laptop.
nuk3dom@reddit
What about really hungry games like Indiana Jones or the latest Mafia?
tugrul_ddr@reddit
I'm developing a CUDA-accelerated terrain-streaming algorithm for open-world games to use a limited amount of VRAM, instead of dumping everything into VRAM. It will pass compressed data over PCIe, decompress it on the GPU, and stream only the required tiles of terrain, e.g. within visible distance only. Tiles then get cached in a 1-2GB VRAM area backed by CUDA compressible memory for even greater bandwidth. Then I'll try to market this open-source algorithm to as many game producers as possible. Then maybe just 1-2GB will be enough for dynamically loading only the required textures, terrain data, NPC data, etc., and my 12GB card will still be usable in 2030.
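A minimal CUDA sketch of what that pattern could look like (my reading of the comment, not the author's actual code; the tile size, the 4:1 ratio, and the placeholder "codec" are all made up for illustration):

```cpp
#include <cuda_runtime.h>
#include <cstdint>
#include <cstdio>
#include <cstring>

constexpr size_t TILE_BYTES  = 1 << 20;         // 1 MiB decompressed tile (made up)
constexpr size_t COMP_BYTES  = TILE_BYTES / 4;  // assume a 4:1 compression ratio
constexpr int    CACHE_SLOTS = 8;               // ~8 MiB resident cache (made up)

// Placeholder "codec": every compressed byte expands to 4 identical bytes.
// A real version would use an actual GPU decompressor (e.g. nvCOMP).
__global__ void decompressTile(const uint8_t* comp, uint8_t* tile, size_t compBytes) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < compBytes) {
        uint8_t v = comp[i];
        for (int k = 0; k < 4; ++k) tile[i * 4 + k] = v;
    }
}

int main() {
    uint8_t *h_comp, *d_comp, *d_cache;
    cudaMallocHost(&h_comp, COMP_BYTES);            // pinned host memory: fast async PCIe copies
    cudaMalloc(&d_comp, COMP_BYTES);                // staging buffer for compressed data
    cudaMalloc(&d_cache, TILE_BYTES * CACHE_SLOTS); // small resident tile cache in VRAM
    memset(h_comp, 7, COMP_BYTES);                  // stand-in for real compressed tile data

    cudaStream_t stream;
    cudaStreamCreate(&stream);

    // A visible tile came into range: upload compressed bytes, expand on the GPU.
    int slot = 0; // a real version would pick the slot with an LRU policy
    cudaMemcpyAsync(d_comp, h_comp, COMP_BYTES, cudaMemcpyHostToDevice, stream);
    decompressTile<<<(unsigned)((COMP_BYTES + 255) / 256), 256, 0, stream>>>(
        d_comp, d_cache + (size_t)slot * TILE_BYTES, COMP_BYTES);
    cudaStreamSynchronize(stream);

    printf("tile resident in cache slot %d\n", slot);
    cudaFreeHost(h_comp); cudaFree(d_comp); cudaFree(d_cache);
    cudaStreamDestroy(stream);
    return 0;
}
```

The key point is that only compressed bytes cross PCIe and only visible tiles stay resident, so the VRAM footprint is the cache size rather than the whole terrain.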
mike9184@reddit
Oh let me enable that on my most played game, Helldivers 2...whoops, it's not there.
Krauziak90@reddit
Now run Ark: Survival Evolved and watch your fps dip into the 30s because of an insufficient amount of VRAM.
KoalaBarry@reddit
I'll leave the testing to those who know what they're talking about.
Neat-Phrase-9814@reddit
I have the 5060 Ti and it's been great running FFXV and FFXVI on High settings on a 3440x1440 monitor.
FFXV holds a stable 60 fps, while FFXVI dips to 54 fps when things get too demanding.
I could simply play XVI on Medium graphics and that would resolve it, but it's still playable to me, so I don't mind.
Mexican_man777@reddit
I have a 4060 with 8GB of VRAM and I run 1440p with no problems.
D-no-UK@reddit
Depends what you're playing. I play indie games, fighters, and FPS titles like Warzone, so 8GB of VRAM is never an issue for me, even though I have a 9060 XT 16GB. Slow-paced, pretty games need VRAM; fast-paced ones don't, imo.
GABE_EDD@reddit
Just watch these.
https://m.youtube.com/watch?v=AdZoa6Gzl6s&pp=0gcJCf8Ao7VqN5tD
https://m.youtube.com/watch?v=MG9mFS7lMzU&pp=ygUZSGFyZHdhcmUgdW5ib3hlZCA5MDYwIDhHQg%3D%3D
In some cases the difference is negligible, in other cases it’s playable vs completely broken.
Alfie_Solomons88@reddit
I need to set a reminder to check back in two years to see how well things are going.
HereForC0mments@reddit
A sample size of ONE is never valid for anything, and that's what you have here, having only tested a single game. Also, you tested DOOM, an id Software game, and that studio is famous for optimizing the hell out of its games. They're the exception, not the rule.
Bonfires_Down@reddit
For me it is largely about being able to rule out VRAM as an issue if a game runs bad. There are already multiple other bottlenecks so if I have plenty of VRAM I won’t have to think about that part at least.
Own_Complaint_3521@reddit
I actually agree with you. I played with a 5060 Ti a little while ago, since I wanted to dive deeper into GPUs myself. The 5060 Ti 8GB was good enough to run games like AC Shadows at 4K medium, and even Cyberpunk at 4K, both with DLSS of course.
I was very amazed and impressed by these results. Not to mention, I bought the 5060 Ti after buying my 5070 Ti and compared it to my old faithful 3060 Ti.
I made a whole post about it on my other account. If anyone is interested, I’d love to post it here :).
b-maacc@reddit
I'd be hesitant to base your entire opinion of 8GB VRAM on a single title from a developer known for optimizing their games well.
iRSS7@reddit
Your card is compressing the textures and you're losing performance.
chapaholla@reddit
I think it'll be an issue with large open world games. I don't think it'll survive GTA 6 for example, when it comes out.
FORSAKENYOR@reddit
The thing is, it's a 1080p card, and laptop manufacturers are mostly shipping the 5060 mobile with 1440p or 1600p displays.
Naerven@reddit
Yet I was playing a 4-year-old game last month and was maxing out 8GB of VRAM. Each and every game will be different, and testing has shown that even at 1080p, 8GB isn't always enough.
mig_f1@reddit
The 5060 is the best all-around card you can buy new below $300, with tech, features, support, and compatibility that cover all kinds of workflows.
Pretty much what the average consumer wants from a GPU without it costing an arm and a leg. It never targeted enthusiast gamers, despite them continuing to judge it as such.
IYKYK808@reddit
Really depends on the use case. But since it seems like the "majority" try to push their systems to the limits, the lower-end cards will never be enough. My 3080 10GB still runs a lot of games fine on high on my 1440p monitor, and many more games on ultra the older they are. But I limit my monitor to 120Hz and frames to 60fps because I can't really tell the difference (and I stopped playing more than casual PvP a few years ago). The 3080 10GB is still running strong, but I upgraded to the 5070 Ti.
Something's telling me the 5060 could probably run 1440p at 60fps on medium-high. But if you're trying to push your system to the limits, then obviously it won't be enough.
No_Guarantee7841@reddit
Think I'll trust this dude's experience with DLSS Quality and RT more than unsubstantiated cope claims that DLSS Quality solves all VRAM issues. https://youtu.be/8GOX_hX0mvw?si=luZR8r36oQzPYIYB
ALMOSTDEAD37@reddit
How happy are you running your car on the redline all the time? 8GB of VRAM is on the redline; shifting as soon as possible is optimal. From now on, 12GB should be the minimum (low-end cards), 16 for midrange, and 24+ for flagship.
pacoLL3@reddit
This is a full on moronically bad comparison.
OhShizMyNiz@reddit
This is literally the most "I think this makes sense as an analogy" post ever 😭 Your car is meant to hit the limiter; it's not meant to surpass the limiter.
Meanwhile your GPU usually has several "redlines" it can meet, from low idle clocks to full-load boosting to 3050MHz on the core. There's no redline for your GPU, and your GPU would save itself before it damages itself.
Haunting_Philosophy3@reddit
Oh bro, with my Honda the redline is where the fun just begins.
Milk_Cream_Sweet_Pig@reddit
There's plenty of evidence out there proving otherwise, especially tests across a larger number of games. At least test it on multiple games, not just one.
There are a few ways games handle a lack of VRAM. Sometimes they start removing textures, sometimes they fall back to slower system RAM, which cripples performance, and sometimes it just outright crashes the game.
https://youtu.be/P2qs2lLdWHY
Daniel Owen made a good, in-depth video about it. You should also check your 1% lows: when you're running out of VRAM, your averages may still be high, but your 1% lows can be poor, which results in stutters.
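If you log frame times (e.g. with PresentMon or CapFrameX), the check is easy to do yourself. A small sketch, assuming the common definition of 1% low as the average fps over the worst 1% of frames (definitions vary between tools):

```cpp
#include <algorithm>
#include <cstdio>
#include <functional>
#include <numeric>
#include <vector>

int main() {
    // Example frame times in ms, as dumped by a frametime logger (made-up data).
    std::vector<double> ft = {16.6, 16.8, 16.5, 17.0, 16.7, 48.2, 16.6, 16.9, 52.7, 16.6};

    double total = std::accumulate(ft.begin(), ft.end(), 0.0);
    double avg_fps = 1000.0 * ft.size() / total;        // frames per total seconds

    std::sort(ft.begin(), ft.end(), std::greater<>());  // slowest frames first
    size_t n = std::max<size_t>(1, ft.size() / 100);    // worst 1% of frames
    double worst = std::accumulate(ft.begin(), ft.begin() + n, 0.0);
    double low1_fps = 1000.0 * n / worst;

    printf("avg: %.1f fps, 1%% low: %.1f fps\n", avg_fps, low1_fps);
    return 0;
}
```

With sample data like the above, the average sits in the low 40s while the 1% low lands around 19fps: exactly the "high average, stuttery feel" signature of VRAM pressure.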
jjslowd@reddit
You seem to have misunderstood the problem with 8GB cards. Your card is already so close to its max today. What about in 2-3 years, when the next generation of consoles pushes the minimum specs up? Plus, the 5060 and 9060 are powerful enough to make use of 16GB, so capping them at 8 is just kneecapping them.
Mad-Destroyer@reddit
It's not 2010 anymore, come on.
pacoLL3@reddit
Are you people genuinely dumb? 2010 cards had like 1GB of VRAM.
What is it with this subreddit that 11/10 stupidity gets regularly upvoted? I don't get it.
Ahoonternusthoont@reddit
Yeah, it's 2025, 8K resolution is the standard, idk why these poor peasants are stuck with 1080p and a filthy 8GB VRAM GPU. Should have gotten an RTX 5090.
Internal-Arm6041@reddit (OP)
Thanks to Steam we can see that today more than half of players game at 1080p, it’s the most used resolution right now. And on the other hand, the whole “I don’t get why people don’t just buy an RTX 5090 and problem solved” argument is like saying “why do people take a taxi if they could just buy a Ferrari.”
andrew_2k@reddit
We can also see most players are on a two-generation-old entry-level GPU. No wonder most of them are on 1080p. The argument isn't valid in that regard.
If you're happy spending a premium to be at the limit from day 1, be my guest. You're the reason companies get away with this shit.
We used to get 8GB with the RX 580. How is getting 8GB now good in any way, shape, or form? And for a higher price too?
Mad-Destroyer@reddit
Hilarious, mate, but you don't need to go all the way to 8K; just try a 1440p monitor.
Round_Ad_6369@reddit
1080p is still by far the most popular resolution on Steam.
juan_bito@reddit
I know people exaggerate so much to shit on 8GB when it's easy enough to run new games at 1080p; I've never had an issue. Now, if we're talking about 2K or higher, that's a different conversation.
RedBoxSquare@reddit
We say it is a bad product because it is not balanced, not because it won't work. The cores are quite capable, but the VRAM amount is holding them back. If it had 10GB or 12GB it would have been much better balanced, even if we had to pay slightly more for the additional VRAM. But Nvidia won't sell us that configuration, because they want to upsell.
It's like buying a computer with an i3-12100F and a 5070 Ti. The 12100F will hold you back. It would be so much better if you could upgrade to a 13400F, but the seller won't let you change anything. So most of the money you paid for the 5070 Ti is wasted, because you won't be able to use it to its full potential.
Fortunately, the B580 exists at a lower price and is much more balanced.
XSHIVAMX@reddit
You're right, but people are saying it's bad because most games aren't well optimized, and 8GB of VRAM is only enough for the current state of things. Considering the trend, it will fall short within a couple of years.
aragorn18@reddit
Thank you for sharing your experience.