Damn.. I was entirely wrong about Vram..
Posted by Impressive-Formal742@reddit | buildapc | 400 comments
I was using an RX 6800 on Indian Jones at 4k, medium ray tracing, high settings, using FSR. No issues, crashes, etc. (running above 60 to 80 fps). I found an open box RTX 4070 Super today for a good price and thought it might be a nice step up. Boy was I fucking wrong. 4k.. kind of fine with lower settings because of Vram, no biggie. Well, I go medium settings, DLSS balanced, ray tracing on the lowest setting, and it crashes every time with a Vram allocation error lmao. Wtf, without ray tracing it's fine, but damn I really proved myself wrong big time. Minimum should be 16gb, I'm on the bandwagon. I told multiple friends and even people on Reddit that it's horseshit.. but it's not at all. Granted, without ray tracing it's fine, but I still can't crank the settings at all without issues. My RX 6800, high settings, lowest ray tracing, not a damn issue. Rant over, I'm going to stick with team red, get an open box 6950XT reference for $400 tomorrow, and take this back.
Affectionate-Memory4@reddit
6950XT for $400 is a solid deal. Nice find!
And yeah, VRAM is going to become a problem for a lot of cards. 3080 is going to age much faster than it should due to the 10GB model being by far the most common. I wonder how the rumored 8GB 5050, 5060, 5060ti, 9060, and 9060XT models are going to fare.
OrganTrafficker900@reddit
They gimped the 3080 so hard. I have a 3080Ti and I need 16GB of VRAM minimum because I also use my GPU for work. I'm begging games to use more GPU, because every game I play I'm still getting 70 fps minimum with optimized settings, so I can't justify getting a 5090 for the 32GB of VRAM. Every day I wish ROCm was more widespread so I could move to AMD.
Affectionate-Memory4@reddit
The 12GB on the 3080ti is what drove me to a 7900XTX. I hate upgrading back to back like that, but I was still just barely in the return window so the swap was made. Haven't missed Cuda myself after setting up PyTorch DirectML and finding the OpenGL/CL/Vulkan alternatives for my other things, but that's a very specific set of things to go right.
As for 16GB Nvidia options, they've unfortunately been really stingy with it on anything that would be a real upgrade, as the 4060ti 16GB isn't all that fast. It's pretty much just the 4070tiS, 5070ti, 4080, and 5080 in that range, and none of them are great value themselves coming from a 3080ti.
mcphee187@reddit
The 4060 Ti 16GB is also stupidly expensive, here in the UK at least. £100+ more than a 4060 Ti and all you get is an extra 8GB VRAM.
It makes me worry for the 9060 XT if they do release two memory configurations...
OrganTrafficker900@reddit
Sadly I searched for alternatives for my use case and there simply aren't any; people just tell you to use your CPU instead, but that's like 100x slower than using CUDA. I wanted to get an RX 9070 for my second PC just to have an AMD GPU in my hands again, but sadly they are $1250 and the XT is unavailable.
Ok-Difficult@reddit
Undoubtedly 8 GB will be "fine" at 1080p for several years still, but I expect there will be an increasing number of games requiring medium or low textures to fit in 8 GB of VRAM.
Some people might be fine with that, or unable to afford something more expensive, but buying an 8 GB card in 2025 is opting for an inherently compromised experience.
prosetheus@reddit
Have a 6900 xt and the 6950 xt at 400 would be incredible price-to-performance. It is basically a 7900 gre and if you can undervolt and OC, it can hit 7900 xt performance levels.
JonWood007@reddit
The good news is that at least some of those should run games on low with no rt.
NoHandle6266@reddit
India Jones and the great vram usage
Redericpontx@reddit
Just wait till you see how much vram you need for 1080p high res texture pack in monster hunter wilds 😬
LawnJames@reddit
Does high res pack make any difference in 1080?
Redericpontx@reddit
Definitely looks better but not like a crazy amount better. You need 60gb more space and at least 20gb of vram to run it. It says 16gb vram minimum but it'll stutter on 16gb.
KingZarkon@reddit
20 GB? JFC so you can only run it on a 3090, 4090, or 5090?
Redericpontx@reddit
Or 7900xt or 7900xtx
VanitysFire@reddit
I'm running wilds on my 3080 ftw3 ultra 10gb on ultra settings, high res texture pack, max ray tracing, and getting a solid stable 48 fps. Not 60, but better than 30. 10gb is enough to get the job done.
physicsMathematics@reddit
Are you sure it's using those high res textures? Some games automatically downgrade the textures when they don't have enough vram
Redericpontx@reddit
By ftw3 do you mean fsr? Either way, what it says in the settings is a lie, it uses more.
physicsMathematics@reddit
FTW3 = For The Win 3. It's an EVGA sub-model for graphics cards.
VanitysFire@reddit
Why would I mean fsr when the card is called ftw3 ultra?
I honestly don't know how much memory the game says is being used but I know it's not maxed out. I could boot it up real quick and check and verify memory usage with hwinfo.
Tmak254@reddit
I found the same on the texture pack for space marine 2, specs say 16gb but was a stuttery mess on a 4080 super
Redericpontx@reddit
Yeah the mhw one is filled with bad reviews because of 4080(s) stuttering.
Leather_Let_2415@reddit
Tbf it says about 12gb usage so it seems more an engine issue ATM
Redericpontx@reddit
Nah, it just lies about how much vram is being used. If you use any external program to show vram usage, it shows the correct amount, which is a lot more.
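If you want to sanity-check a game's own number, here's a minimal Python sketch (assuming the nvidia-ml-py package and an NVIDIA card; AMD needs a different tool) that reads the actual allocation straight from the driver, the same place GPU-Z and HWiNFO get it:

```python
# Minimal sketch: query real VRAM allocation from the NVIDIA driver via NVML.
# Assumes: pip install nvidia-ml-py (imported as pynvml), NVIDIA GPU present.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)  # byte counts as the driver reports them
print(f"VRAM used: {info.used / 2**30:.1f} GiB of {info.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```

Note this is total allocation across every process on the GPU (same as the overlay tools), not what the game's settings menu estimates, which is exactly why the two numbers disagree.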
sdcar1985@reddit
With how badly the game was made, it wouldn't surprise me if it was a bug. Their new patch made the game unplayable for me because it'll crash after a few minutes now.
Galaxy_boy08@reddit
I don’t think I ever noticed that actually
The 9070xt kinda brute forces the really bad performance which is great but they aren’t getting brownie points from me despite how much I love it lol.
20 gb is pretty insane
TheCheshireCody@reddit
o-O
ItsLe7els@reddit
when i ran a hardware monitor while running the game, high res textures at 1440p, it was using 15.2GB of vram
Redericpontx@reddit
What gpu, max settings and ai or native?
ItsLe7els@reddit
i9 14900kf 32gb rtx3090. high res textures, max settings, ray tracing high, fsr quality, 100fps
Armendicus@reddit
Ray tracing eats the fuck outta vram. Here's hoping neural textures and ray reconstruction will save us plebs on 16gb of vram. At least I'm steppin up from 12gb of vram.
ItsLe7els@reddit
yeah, RE engine and unreal 5 are just really unoptimized currently and i think over time we’ll see improvements. they’re still figuring things out. that’s my hope at least lol
AdditionalMap5576@reddit
RE engine is crazy good for what it was made for, RE. the small and very detailed environments look great and run great, and even other smaller scale games like dmc also run great for how good they look, but there was obviously work to be done for an open world implementation
Redericpontx@reddit
Gotta enjoy that 24gb of vram
ItsLe7els@reddit
haha gotta use it somewhere. I completely forgot my main point though: the game says i'm using 8-10 but hw monitor says it's actually 15.2, so the game's vram slider is incorrect i believe.
Armendicus@reddit
Holy shit that game’s broken!!
Redericpontx@reddit
Yup, it lies about how much vram it uses, just like the benchmark lied about performance
ItsLe7els@reddit
i mean arguably the benchmark was correct because it only shows certain specific scenarios, and isn’t rendering the entire game, doesn’t have jerky camera movements. imo benchmarks have to be taken with a grain of salt because there’s so much more to a game than little cherry picked snippets. my in game performance is close enough to my bench score that I would say i probably got the same fps in bench and in game during the same cutscenes
Redericpontx@reddit
Nah it was capping, cause they used cutscenes to inflate the fps average. According to it, a 7900xtx at max settings, RT high, native, no frame gen should average 110fps, and it's more like 70-80fps
ItsLe7els@reddit
oh yeah. that’s cap forsure lmao i mean the recommended specs. are supposed to hit 1080p 60fps on medium with frame gen and upscaling…. pathetic
aVarangian@reddit
not gonna bother looking into it, but it could be an engine issue
Redericpontx@reddit
Most of it is an engine issue. RE Engine was made for small-area Resident Evil games and now they're trying to force it to be an open world engine despite the foundations of the engine not being built for one. It's like how all old MMOs can't do certain cool things because the engine is 15 years old and limits them.
LegendofLing@reddit
I knew getting a 7900xtx for 1080p would pay off one day
Redericpontx@reddit
LMAO I did the same thing mostly because I like a lot of poorly optimised games
LegendofLing@reddit
It helps a lot, went from a 3070, but eventually going to get a 1440p monitor
Redericpontx@reddit
I'm waiting for oled monitors to either be cheaper or not have burn in anymore. I'm not spending $1500(aud) on a monitor that is going to degrade over time.
Imaginary_Switch_747@reddit
burn in is pretty solved these days, but ye expensive
Redericpontx@reddit
Nah, I've been looking into it and it's still a thing, just nowhere near as bad. I've been watching some guy do monthly updates on his daily-driver OLED monitor, and burn-in started at the 6 month mark and got very noticeable at the 9 month mark. Idm spending the $1.5k if it never happens, cause I'll be able to use it for a decade, just like I've been using my 1080p 144hz monitor for approaching a decade.
Imaginary_Switch_747@reddit
Yee I watched the same guy lol. He raw dogged the shit out of it tho ahaha. But I hear you. I'm happy with my 1440p 144Hz ahaha. The OLED colour is beautiful though I must say. Got a 3 year old Samsung 4K QLED for downstairs for a bargain £250. Given me a taste of that beauty ahaha. I must say the QLED is definitely enough of a quality jump from IPS if you wanted to look at a QLED monitor.
LegendofLing@reddit
Same, if I'm gonna upgrade it's gonna be a OLED, just don't have OLED money rn
TRi_Crinale@reddit
Hahaha my buddy said the same thing when he bought his 3080Ti new... he still has the same 1080p monitors
ShadowsGuardian@reddit
Funny thing here too... got a 7900GRE and still haven't upgraded from 1080p yet...
MH Wilds being one of the reasons, being so unoptimized...
DatDudeManGuyBro@reddit
I'm running 1440p on mine and getting just under 60, around 53 or so
parkesto@reddit
Hahaha 7900xtx 1440p here. Picked it up in jan. Went from a 4770k / 1070ti to 9800x3d and 7900xtx. Wowza. Lol
emily0069@reddit
me with my 6900 XT
AHrubik@reddit
Valheim 4K textures... 18GB VRAM.
jolsiphur@reddit
I play Monster Hunter at 4k with a 7900XTX, no RT on and I have seen upwards of 20gb of VRAM used.
galaxydrug@reddit
I play at 4K with a 7900XT 20GB, ray tracing on full, and I'm definitely not using all my VRAM.
Though I do hover around 50 FPS (which is fine for me), but no stuttering either.
exosnake@reddit
I play mhwilds in 1440p with the high texture pack and my 5080 only uses 10gb vram with maxed out everything and dlaa.
Redericpontx@reddit
High or high res texture, max settings, rt, ai or native?
exosnake@reddit
Max settings, max RT, dlaa.
"So, to push the VRAM requirements to the limit, I ran the benchmark at 4K with DLAA, Frame Gen and Ray Tracing. This is the worst-case scenario when it comes to VRAM. And even at those settings, the VRAM usage was below 12GB on the NVIDIA RTX 5080."
https://www.dsogaming.com/news/nvidia-rtx-5080-can-handle-the-hd-texture-pack-for-monster-hunter-wilds/
Redericpontx@reddit
The game lies about how much vram it uses. The in-game settings say only 8gb for the max possible settings (native 1080p, high res pack, RT), but if you play with 16gb of vram or less it stutters
exosnake@reddit
Dude L Connect and MSI apps tell me it uses 10gb vram while I'm playing. Though my friend has a 4090 and it uses like 15+. We don't know why it does that. The only difference is he has a 13700 and i have a 9800x3d. Maybe the 3d cache helps? Maybe the 5080 uses a different kind of compression but we've compared our settings and they're really the same.
Redericpontx@reddit
I'd assume it might be that the 5080 uses gddr7 while the 4080(s) uses gddr6
exosnake@reddit
You might be onto something.
dsinsti@reddit
You mean my backup build with a 60€ 1060 with 6 Gb won't be able to run it?
General-Animator-333@reddit
Yo my 1060 wouldn't even run PUBG on the lowest settings. Could've been CPU but I just built a new rig so we'll never know.
Redericpontx@reddit
I'm sorry to have to inform you but unfortunately I don't think your GPU will make it😔
NoHandle6266@reddit
Time to get an a1000
rarehugs@reddit
snakes? he luv snakes
tony_montana091@reddit
No time for ray tracing ROV Doctor Jones!
sdcar1985@reddit
I read this in Short Round
Sick_Benz@reddit
Do not redeem the vram sar
lostllama2015@reddit
Another Bollywood classic.
Far-Bag7993@reddit
I feel you. My 3060ti aged very fast due to low VRAM and I am now looking for a 9070/XT
51dux@reddit
Indian Jones, is it a bollywood twist on the original?
Mechanical-Force@reddit
This is the reason my buddy and I bought a 4070TiS and not a 4070S. VRAM.
Edwardteech@reddit
We keep saying it. Yall don't listen until it smacks you in the nose.
RedDawn172@reddit
It's not a problem until it is, and then it is a very big problem lol.
flatgreyrust@reddit
That's the thing about ram in general that some people don't realize. It's "do I have enough?": Y/N. It's not like a cpu or gpu where every minor increase improves performance a bit.
mostrengo@reddit
I mean...I agree it's a problem and nvidia is the cause.
But is it a very big problem? It's mostly only a problem at 4k, you can still reduce textures, and it's less prominent if you turn off RT.
Jebble@reddit
It's also a minor problem in a very small amount of games and it'll become smaller once Neural Texture solutions become more mainstream.
FarSmoke1907@reddit
Hey no... don't destroy their number one reason to hate Nvidia. Gotta be angry with something and when it's not the prices it's vram which is so precious on cards that weren't made for 4K!
kawalerkw@reddit
The problem can arise even at 1440p, when the 5070 performs worse than the 4070 Ti in some games (and in Indiana Jones can get even 4x fewer frames).
GolemancerVekk@reddit
But if it's caused by badly optimized games... that puts game studios on a direct collision course with Nvidia. And if Nvidia doesn't give a shit and simply refuses to put more VRAM on consumer cards, eventually studios will take the hint.
How many people bought Indiana Jones? The estimates I've seen vary 120k-400k, and simultaneous Steam players peaked at 12k. Meanwhile there are games like Split Fiction selling millions of copies in two days and peaking at 250k players.
Maybe we shouldn't take single super-specific games as the bar for the entire industry, is all I'm saying.
kawalerkw@reddit
There's no way to count how many people played Indiana Jones, because it was on gamepass since day 1.
kawalerkw@reddit
It's such a big problem that the 5070, advertised as having similar performance to the 4090, can get 4x fewer frames than the 4070 Ti Super at 1440p in said game.
windowpuncher@reddit
You're right, but playing games at 4k with bad textures should be legally punishable. 1440p with better textures, even without upscaling, is preferable imo.
Does depend on the game, though.
bubblesort33@reddit
The textures at medium are absolutely fine in this game. I don't actually believe this guy turned his textures to medium. I'm guessing he changed stuff to medium at 4k but still left texture on the maximum possible to crash the GPU on purpose and to make this post.
No one else has complained about crashing in this game yet. I've seen people play this at the VRAM limit, and what happens is frame drops when out of VRAM. If you're crashing you likely have bigger issues.
SignalButterscotch73@reddit
He's right, it's usually not a big problem, but sometimes games don't sacrifice assets for fps when they run out of vram and just become unplayable. Then it is a big problem. Very game dependent but mostly settings dependent; we're not at the "unplayable even at min settings" stage with new games yet.
Bad textures should always be punishable. My biggest gripe with Cyberpunk is that the textures are fairly crap even at the highest setting. HD texture packs were a golden generation: even on low end cards, if you had the vram capacity, better textures could make the game look like you'd upgraded your gpu and were playing at max settings.
TrollCannon377@reddit
So your suggestion to solve Nvidia's vram issue is to disable the one feature that makes them a better option for most buyers...
mostrengo@reddit
No - I am putting the problem in perspective. If you are CPU limited, there are very few things you can do, very few changes you can effect. Whereas if you run out of VRAM, you have options. Hence it's not what I would call "a very big problem".
perfect_for_maiming@reddit
It's one of those failures of human reasoning. "I don't have personal experience with it therefore it isn't a real issue."
Good on the OP for coming clean and admitting he was wrong though. Most people just seem to double down and act like a child about it these days.
step1makeart@reddit
FTFY. There's hope for OP, maybe the kids aren't all bad after all :P
All_Work_All_Play@reddit
Hope would be changing his mind based on data.
Aureliamnissan@reddit
I mean, buying a 4070 and not being able to run it is data.
Plenty of people (myself included) tried to convince others of the vram issue using data.
DeeHawk@reddit
It’s not much better that he blindly changes opinion (albeit to the right one) because of ONE situation where he tested ONE game.
He really learned nothing.
step1makeart@reddit
Baby steps are baby steps. No one walks in a single day.
Naturalhighz@reddit
I'm very adamant that vram is being focused on way too much. I'll never say it can't be a massive issue but for most people it really is a non factor. Afaik 1080p is still the most used and most people play esport games that have basically no requirements. For people playing new AAA games it's 100% legit to obsess over though, but as a general rule, nah it's still a niche
Difference_Clear@reddit
Steam's hardware survey suggests that it's about a 50/50 split between 1080p and 1440p these days.
The percentage of people actually and regularly playing at 4k is still super low though.
deadlybydsgn@reddit
And then there are weirdos like me who use DLDSR to play 1440 downsampled onto a 1080 TV. I have little kids and my rig is in the living room, so that display is staying put. (plus, nobody really makes 1440p TVs)
Difference_Clear@reddit
That's something that I always found odd. We kind of went from 1080p to 4K for TVs with no real 1440p step, when 1440p is kind of ideal for smaller TVs! Unless it's just that no TV content is made in 1440p?
marlontel@reddit
No one pays 600€ for a 4070 super and only plays 1080p Fortnite.
People that make up the majority in the Steam Survey are on 60-class prebuilts and laptops. No wonder they are still on 1080p.
Synaps4@reddit
Loads of people buy huge e-peen machines and use them for fortnite/minecraft. It's so common it has its own meme.
shroudedwolf51@reddit
Just because a lot of people do, that doesn't necessarily make them common. A lot of people drive ridiculous cars they'll never take out of first gear, but that doesn't make owning a Ferrari the norm for most people.
Sure, I'll give a machine with a 5900X a 7900XTX, even though most of what I play can be handled by a 6800XT with room to spare. I know so because my other machine does all of that...plus VR with a 6800XT. But, the budget for when I put build lists together or build machines for friends is usually closer to a R5 7600 and 7700XT. Or, I guess, now it'll be 9070 when the prices chill out.
There are a lot of people that open their browser settings to configure things, turn off creepy tracking, and install add-ons...doesn't mean most people even know the Settings menu exists.
Synaps4@reddit
You seem to be saying "its not the norm"
the rest of us are saying "a lot"
These are not and never were incompatible.
No_Instructions133@reddit
That's one of the most redeeming qualities a person can have, humbling themselves and admitting they were wrong. After all we're all wrong now and then. I know I have been.
bubblesort33@reddit
This is the excuse people made for years when it comes to AMD drivers. I saw hundreds of posts every week of people complaining, and others making excuses that if it's not happening to them, it's not an AMD driver issue despite plenty of posts like it.
But it's entirely possible in both cases to be something else in the setup. Maybe trying to run the game with 16gb of RAM and 12gb of VRAM is a bad combo, because 12gb means maybe once in a while you'll swap to system RAM for 0.2 gb in a certain game area when you go over. So 32gb RAM systems are fine. But I haven't seen anyone else complain about this on 12gb GPUs. Something is odd with their setup.
Dr_Findro@reddit
Wouldn’t this logic apply similarly for thinking there is a problem due to niche gaming requirements not being met?
deadlybydsgn@reddit
To be fair, not everybody plays at 4K. If someone is buying a video card for 4K, they really will notice it. If they stick with 1080/1440, they may or may not.
But launching a card in 2025 with 12GB of VRAM is still dumb—even with the small reduction in use that DLSS4 provides.
I assume we'll see an 18GB 5070 Super in about a year with the new 3GB modules Samsung is putting out.
Vengeful111@reddit
Yea, I think it's wrong to assume 4k is any kind of standard.
It's 3.65% of users...
DeeHawk@reddit
The problem is he just convinced himself he was wrong from ONE singular experience.
That’s exactly the same problem.
It doesn’t really matter that it was right this time.
shroudedwolf51@reddit
I have been doing my best to make sure I acknowledge and even reward when people acknowledge being wrong.
As cathartic as it feels to watch people eat shit after you told them repeatedly that something is one way and they decided not to listen, mocking them and pointing and laughing will only create more people like the flat earthers. Where eventually they end up in a place where everything in existence is willed into existence by people talking about it.
The only reason why the 4060Ti 16GB is slower than the 6800XT despite their at the time price being almost the same is because I said that the 4060Ti is slower when they asked me for advice and bought the 4060Ti anyway...otherwise, the 4060Ti would have been faster. The Ultra 9 285K being slower than the i9-14900k and being worse value than the 9800X3D is only that way because I said so when they asked me for advice, otherwise it'd be faster. Their relative transitioned because I asked them to respect me enough to use my name...otherwise, trans people wouldn't exist. Police brutality only exists because BLM complained about it, it never existed before then. And so on and so forth.
And once people are in that mindset, there's no way of getting them back or reasoning with them anymore.
ime1em@reddit
does this apply going for 64 GB of ram instead of 32 GB for day to day and gaming use?
Kornstalx@reddit
I tried to paraphrase in my head last night what you just said so perfectly.
Got into an argument with some nimrod saying that VRR monitors are only for 240fps CS cryhards. Dude legitimately thought his 60hz fixed refresh was best for gaming on a mid/potato PC.
SweatyCondition2025@reddit
Greatest comment of all time, applies to nearly everything
FarSmoke1907@reddit
Listen to what bro? You have been saying for the past like 5 years that even 12gb isn't enough and yet after all those years I can count the games that are unplayable on the fingers of one hand. Indiana jones is one of them and all of them only have a problem when RT is on at 4K or path tracing at 1440p+. Who cares about either of those. With 4070 super you aren't targeting those anyway.
BrianBCG@reddit
Not having enough VRAM won't make most games 'unplayable' to most people. It just causes stutters and/or trashes the visuals.
I think that's where a lot of people get hung up on this argument. It would be more accurate to say 'if you want the best experience having more than 8/12GB VRAM is recommended'.
Ok-Difficult@reddit
I think Nvidia (and to a lesser extent AMD) giving barely adequate amounts of VRAM would be less of an issue if prices hadn't gone up so much while performance gains, especially at more affordable tiers, have all but dried up.
Way less people are going to care if the 5070 has 12 GB of VRAM if it's $500 and 25% faster than the 4070 Super.
zoemgs2@reddit
This is what I have been saying. Also the majority of systems in the steam survey only have 8gb VRAM. If these developers don't like money they can go ahead and increase vram requirements but 4k and path tracing are completely unnecessary for me personally.
dcjt57@reddit
😂 those people who have a gpu with 8gb aren't likely to spend $40-60 on a new game, idk why developers would try to tailor new games to them
kento10@reddit
Only game I know that needs more vram is the RE remake
deadlybydsgn@reddit
Yeah. Even when I was still running an 8GB 2080, the games that typically put my VRAM usage in the red were big PS5/console ports. If people play those, then maybe get more VRAM. Otherwise, it's pretty possible to not run into the limit.
But yes, 12GB in a new card in 2025 is kind of dumb. We'll see 18 & 24GB Supers next year with the new Samsung 3GB GDDR7 modules.
DA3SII1@reddit
i played that using the highest textures at 1440p using a 2060 super
cover-me-porkins@reddit
In everyone's defense, 30 series owners needed to get a 3090, which was a terrible value proposition at the time. It has stretched its legs over the rest of the 30 series now, but at the time it was a ~3-12% improvement for more than double the money.
40 series was better as the 4080 had 16, but it was also an insane markup over the 3080, which felt like the same old story as the 30 series.
involutes@reddit
Sorry to be pedantic but the expression is "moot" point. Not mute.
ProLevelFish@reddit
*moot point
Edwardteech@reddit
I got my 3090 for 900ish with a 3 year drop and dent warranty. On Amazon.
With the way this is going im quite happy with that.
Bitter-Sherbert1607@reddit
ehh it’s not as cut and dry as people make it out to be…
The 3090ti has 24GB of VRAM, but is within ~5% of the 4070S, which has 12GB.
The 4070s also destroys the 7800xt and the 7900GRE despite those cards having 16GB.
Speaking of the latest releases, the 5070 and 9070 are basically at parity despite the 9070 having 16GB.
For 4k, 16GB should really be the minimum, but you can definitely survive with 12GB for non-RT 1440p
Personal-Acadia@reddit
The above comment is absolute horseshit. You haven't watched a single benchmark and probably get your information from Userbitchmark.
Bitter-Sherbert1607@reddit
GPU Benchmarks Hierarchy 2025 - Graphics Card Rankings | Tom's Hardware
What a respectful and dignified way to express disagreement with someone....
Anyways, look at these graphs, the 12gb 4070s is superior to the 16gb 7900GRE, 7800xt, 6950xt, and 9070, despite having less VRAM. This applies to raster, RT, 1080p, 1440p, and 4K
laffer1@reddit
Most reviews when the 5070 lost to the 9070xt were because of stuttering due to vram. Even more so when comparing dlss to fsr.
TaigaChanuwu@reddit
Yeah, it's almost like VRAM size doesn't affect the gameplay unless you need more than your GPU has...
Bitter-Sherbert1607@reddit
Which doesn't occur for the large majority of games that 99% of people are playing and benchmarkers are utilizing in their suites
All_Work_All_Play@reddit
Sounds pretty cut and dried
jasons7394@reddit
Probably 95% of the PC gamers have under 12gb of VRAM.
Game devs aren't just going to eliminate 95% of their potential customers. Relax.
NinjaLion@reddit
"looks at mhwilds" idk man a lot of developers dont seem to care about the average hardware spec
Tamotefu@reddit
Wilds was rushed out so it could make the end of the Japanese fiscal year. We're probably looking at a lot of optimizations with the first big update that adds monsters.
Shap6@reddit
It’s perfectly playable on my 8gb GPU though?
Kamishini_No_Yari_@reddit
Shh! The parrots need to be right
jasons7394@reddit
Perfectly playable on a 4060 at max settings in 1080p, medium in 1440p.
MorCJul@reddit
According to the latest Steam Hardware Survey, 30% of users have 12 GB VRAM or more. Source
nolander@reddit
However, consoles do have 12gb of VRAM. And we are starting to see more games now requiring more: Doom, Indy, Wilds. It could be a sign the dam is breaking or it could just be a blip, but we'll see.
Visible_Ad_9459@reddit
how about a 7600xt 16gb? will it be able to handle it if two games are played simultaneously on two different monitors with the same gpu?
Ibuprofen-Headgear@reddit
I used to run games with less than necessary vram, but pretty much never had crashes, just had less than perfect textures and such. A few years ago though, like gtx 660, 1060, rx470 days
NinjaLion@reddit
Regardless, 8gb is dead now because one of my most played series just released a new game that barely functions at my resolution with 8gb (MH Wilds). Which is definitely in line with the complaints about VRAM that I have most commonly seen: it's either a problem right now or will be soon if you only have 8gb in a card at 4k.
12 is a lot more and probably fine for future proofing, but its still going to feel bad as a consumer to see Nvidia being so stingy with the vram, especially when we all know its being done to maximize the vram for their corporate customers. we've been second class consumers for a while now...
vkevlar@reddit
TBF, Wilds release notes say to use resolution scaling. Both of my sons' laptops (4060ti/8gb, 1600p) are well over 60fps with DLSS on. My take on it requiring DLSS for good framerates is that the game is horribly unoptimized, and the texture sizes are mostly a secondary issue.
BrianBCG@reddit
"That's only a problem in a small number of unoptimized games, who cares!"
dulun18@reddit
hardware unboxed talked about this almost two years ago..
16GB vs. 8GB VRAM: Radeon RX 6800 vs. GeForce RTX 3070, 2023 Revisit
https://www.youtube.com/watch?v=Rh7kFgHe21k
even in 2023 some of the pc games (eg. last of us remake).. were already using more than 8GB of VRAM at 1080p high..
1440p gaming -- more and more games are using around 11.5GB of VRAM and 17-19GB+ of RAM on 1440p high/ultra
ChawnkyCheez@reddit
It's only a matter of time before ray tracing is a requirement for every AAA game. Lower Vram cards will just become obsolete much quicker.
spraeeza@reddit
The whole 16gb vram thing is due to bad optimisation by devs, who force gamers to upgrade their hardware because the software is no good. Dlss or fsr is upscaling tech and by common sense should make your card work less.. but everyone wants a new toy. As they say, "the more you buy, the more you save"
AzysLla@reddit
Bunch of people will then come out and say there is no difference between 1440p and 4K.
diijae@reddit
Fine then, now I'm going the 7800xt route, was looking to upgrade to a 4070 super
ognihc@reddit
I think at 4k you really need more Vram; 12gb and below is for 1440p and 1080p stuff
GigarandomNoodle@reddit
This is an insane edge case. This is like one of a very select scenarios where the 4070s doesn’t absolutely shit on the rx 6800.
MagnanimosDesolation@reddit
It's an edge case now, though not insane, it's a very popular game. But games are going to continue trending in the direction of heavy RT requirements.
Kubocho@reddit
still an edge case, and with an easy solution: turn off RT and you can play the game at 4k with your 4070S like I am doing, no issues, 60+ fps on a single player game
JonWood007@reddit
I always say it, the big killers of longevity of cards comes down to vram, driver support, and support for apis. I'd generally prefer to buy a somewhat weaker card that's more futureproof in the above things than be hard limited by any of them.
jakedasnake2447@reddit
This is how I ended up building my first PC. By late 2007 everything coming out needed DirectX 9.0C support to run at all.
ThatOnePerson@reddit
Fun example of this is the 5700XT can do Indiana Jones at 1080p at mediumish 60FPS because the (Linux) drivers support Vulkan Ray Tracing in software.
While the GTX 1660 can do FF7 Rebirth because it has mesh shaders.
Both are missing the other features, so those games don't work on the other.
beck320@reddit
This is the main reason I keep wanting to upgrade my 5700xt. I am very happy with the performance in most games, especially ones from a few years ago, but newer games are kicking its butt because of the api
Witch_King_@reddit
9070XT would probably be the perfect upgrade if you can find one.
beck320@reddit
I gotta save up for it first lol maybe by the time I have the money it’ll be in stock at MSRP
Witch_King_@reddit
That's the spirit!
Jarpunter@reddit
You prefer slightly worse performance in >99% of games in order to avoid significantly worse performance in <1% of them?
AnEmpireofRubble@reddit
thread is full of disingenuous jerk offs. imagine in a PC sub huh?
JonWood007@reddit
I'd prefer to run the maximum amount of games acceptably without caring what they look like.
laffer1@reddit
This is what forced me to upgrade my r9 fury nitro crossfire setup. VRAM was causing major issues. I went to a used 1080ti and ran that for a while, then upgraded to a 6900xt.
I had a 4k monitor with the 1080ti and it was running at 85c constantly with a water block. It caused some of my tubing to melt. I downgraded to 3440x1440@144 to get a better experience.
The amd card wouldn’t go past 65c on the 4k display and now I’ve got more rads and lower res it’s 55c max but most games it’s under 45c.
The r9 fury nitro was the best gpu I’ve owned. With water blocks, two of them didn’t go past 45c. Most reliable drivers too.
abrahamlincoln20@reddit
That works if you're willing to suffer increasingly bad performance and user experience. My experience with GPUs over the years:
9800 pro: never an issue with vram, had to upgrade because performance was too low
gts 8800: never an issue with vram, had to upgrade because performance was too low
hd 6950: never an issue with vram, had to upgrade because performance was too low.
980 4gb: never an issue with vram, had to upgrade because performance was too low.
1070 8gb: never an issue with vram, had to upgrade because performance was too low.
3060ti 8gb: never an issue with vram, had to upgrade because performance was too low.
3080 10gb: never an issue with vram, had to upgrade because performance was too low.
4090: never an issue with vram yet, will have to upgrade eventually because performance is too low
(might be missing a few cards I don't remember but always needed to upgrade because of low performance, never because of low vram)
JonWood007@reddit
You also sound like an enthusiast who isn't happy just to run a game. You upgraded from a 980 to a 1070 because "performance was too low"? In 2016-2017? And then to a 3060 Ti? And then a 3080 10GB? And then a 4090?
That's a you problem. Many of those cards are still perfectly capable. You just seem to want "the best."
abrahamlincoln20@reddit
Yeah some of those upgrades were pretty minor, but the previous GPU always went to another person's computer that also saw a lot of use, so no waste there really.
It's just my preference for high FPS.
JonWood007@reddit
Yeah, but that's largely not what I'm talking about with GPU longevity. I'm talking about running a game at all, or getting acceptable performance at all. Let's go over MY GPU history, now that I'm awake and able to have more complex thoughts.
2008- HD 3650 AGP, my first GPU. The one GPU that was legit "not powerful enough" (given the rest of my system it's no wonder) and prompted an upgrade to a better computer in general.
2010- HD 5850. My first real GPU. Titles were playable until around 2015. Ultimately killed by 1 GB VRAM and lack of driver support, although it was getting long in the tooth by the time of games like Rainbow Six Siege and Witcher 3.
2012- GTX 580/760. I had a friend like you. Wanted to get rid of his 580 to get a 680. Gave me the 580. Otherwise I would've just kept using the 5850. Actually did have to go back to the 5850 a few times because the 580 kept dying after a year and I had guest RMAs, so I used them. Got bumped up to a 760. Used it until 2017, when it died out of warranty. It was getting long in the tooth by then anyway. Big issue was 2 GB VRAM. I know Wolfenstein 2 wanted 4, and I ran it at a super low resolution just to play it. It was still playable, but not intended to be played on a 2 GB card...
2017- GTX 1060. The GOAT. One of the most legendary cards ever made. I ran every title through 2022 on it, although by then I was running on low with FSR on to get decent frames. I upgraded because I knew 2023 was gonna have a system requirements bump and because GPUs finally got cheap again, with options in the sub-$300 market for people like me. I'd say that 2023 requirements bump was the true end of this card. It was the power to some extent, but it was also the 6 GB VRAM, plus lacking RT support and mesh shaders. You might ask what aged better, the 1060 or the RX 580, and the 1060 actually did, simply because AMD's driver support ending and the lack of DX12 Ultimate support killed the 580 in 2023 a little harder than the 1060. But yeah.
2022-present- RX 6650 XT. It's an acceptable card. 3 of your GPUs are better than this, which is why I'm like lulwut when you say you want more power. Admittedly it is forced to run stuff on low without RT, and sometimes with FSR on, to get good performance. And it only has 8 GB VRAM. But it runs games acceptably at 1080p, and given how the market is now, I ain't upgrading any time soon. I can't see the 5050/5060 offering significant improvement, if any, nor the 9050/9060. So I'm probably gonna be stuck on this until 2027 at this rate, simply because affordable upgrades don't exist.
And yeah. THATS what i talk about with GPUs aging. Most of the GPUs youve gotten rid of were still perfectly capable when you got rid of them. With the exception of the 5850, which I used as a backup card, where i got a free upgrade, I just use my GPUs for about 5 years until they no longer run new games acceptably by ANY reasonable standard.
abrahamlincoln20@reddit
I appreciate the story. I can see how low vram can be a real limiting factor on lower and mid end cards (and even higher end), if they're being used for a long time. But yeah, ever since getting a high refresh monitor (somewhere in 2013 or something) and later a 4K monitor, high performance cards have been pretty much a must.
JonWood007@reddit
Yeah I explicitly limit myself to 1080p/60 as to not get on a treadmill of needing to buy more expensive cards frequently.
RSNKailash@reddit
My 1080ti still crushes new games on ~med 4k, 16gb vram
BadSneakers83@reddit
They made a 1080ti with 16GB of vram? I thought it was 11…
madeformarch@reddit
Maybe, but not from stock. I've seen 2080tis modded with 22GB VRAM, so I think it's possible
flushfire@reddit
12k all-time peak isn't what I would call "very popular"
CrateDane@reddit
Have to account for people who played it on Game Pass instead.
flushfire@reddit
There are many games that can be compared i.e. game pass title, multi-platform release. Lies of P is one example, it has 19K in steam.
Starfield is what I would say is a very popular title since despite being also available on game pass, it has 330K in steam.
KoolAidMan00@reddit
It is "insane" in that it is an optional path tracing setting that isn't required to have a great experience.
The biggest difference path tracing makes is in the jungle intro, and that is only the first five minutes of the game. I wouldn't hesitate for a second to tell people to play Indy without PT enabled or use it at low setting.
BitRunner64@reddit
With the consoles having 16 GB of VRAM but rather weak GPU's in terms of compute, developers are going to turn to texture detail to improve visual quality. Which means users with 12 and 8 GB VRAM are going to have to turn down texture quality settings, which will result in blurry textures since the textures are optimized for high resolutions.
It's not just going to be an issue at 4K and 1440p either. Running at 1080p might buy you some time, but the frame buffer itself is going to take up less and less memory in proportion to assets.
jolsiphur@reddit
The consoles generally have 10gb of the system ram set aside for VRAM, though it's more variable. The PS5 is using basically an RX 6700 (non-XT), which is a 10gb GPU, but the PS5's 16gb of RAM is shared between system and video either way.
GolemancerVekk@reddit
It's more like very hyped rather than actually popular.
It's "popularity" was mainly established by astroturfing and articles like this which are completely empty of any meaningful numbers.
CrateDane@reddit
You need to account for the people who played it on Game Pass instead of Steam.
Jellyfish_McSaveloy@reddit
Why don't we go even further. I can use 20GB VRAM in Monster Hunter Wilds, that is an extremely popular game. No one should buy a GPU with less than 24GB VRAM clearly.
GigarandomNoodle@reddit
Dude…. I hate to break it to you but indiana jones is NOT that popular. 511 playing now on steam lmfao. 8k all time peak.
Kenjionigod@reddit
Indiana Jones is also included in Game Pass and sold through the Xbox app; concurrent Steam players aren't exactly the whole picture for how well the game is doing. Hi-Fi Rush had 3 million people who played it, but the Steam charts only show a peak of 6043.
MuscularBye@reddit
Minecraft has a peak of zero
Kenjionigod@reddit
Exactly, people put way too much weight in Steam charts.
TR1K71@reddit
That's because most people buy GTA on console instead of waiting for the PC release.
Kenjionigod@reddit
I mean, even if just 10% of the 210 million GTA V sales are on PC, we're still talking about 21 million players.
TheCowzgomooz@reddit
It's also a fairly linear single player only game, spread across multiple platforms, and has been out for a few months, it could have had millions playing on launch day(no idea if true or not) and this would still be about an average number of people to be playing today.
Impressive-Formal742@reddit (OP)
Exactly, I agree I'm not shilling one way or the other. Just my particular use case, especially with a story driven game I like to enable all the eye candy on my oled tv. It sucks because I do think dlss looks better, but I would have more peace of mind having more Vram.
VersaceUpholstery@reddit
FSR4 is looking pretty damn good as well. It’s a shame AMD went the Nvidia route and locked it behind its latest hardware
KH3player@reddit
Unless you can add AI cores to previous cards, it's not physically possible. They stated it's the best they can get out of non-AI upscaling. If that is true, then I'm glad they finally moved on. FSR3 looks bad. I have a 6950XT and do everything I can to stay at native res. Unless a game has TSR, then I'll try that.
osteologation@reddit
I must be oblivious because I never saw anything wrong with fsr on my 6600xt and I don't really feel dlss on my 4070 is superior.
PsyOmega@reddit
RDNA3 has AI processing.
RDNA2 has DP4A, which does alright with XeSS and would probably handle FSR4 (at a heavier penalty than FSR3, but, handle.)
Bal7ha2ar@reddit
rdna 3 has ai, yes, but no fp8 compute which fsr4 is based on. they could try and make a lesser version that runs on rdna 3s ai cores but it wont look as good or perform as well
guachi01@reddit
It's why I'm glad I have a 7900 XTX. It's the first time in 35 years I've ever bought anything close to the high end. At least it can manage without using FSR.
cinyar@reddit
Is it definitive? They talked about "FSR4 for RDNA4", which implies there will also be a more generic version of FSR4 for older hw.
winterkoalefant@reddit
AMD definitely has the incentive to enable it on older Radeon cards if possible, the same way Nvidia allows DLSS 4 on all RTX cards.
So it's most likely that older Radeon cards don't have the ML performance for such a high quality upscaler. FSR 4 uses FP8 operations which RDNA4 supports, and so do Nvidia and Intel cards, so perhaps AMD could make it work on those.
FuckMyLife2016@reddit
AMD's supposedly working to bring FSR4 to RDNA3. But don't expect miracles, especially from AMD.
xenocea@reddit
Sounds like the 9070 XT would be the perfect card for you. It's not too costly compared to Nvidia, FSR4 looks really good, and it has a good amount of VRAM.
AShamAndALie@reddit
Then do what I did, sold my 6800XT, got a used 3090.
GigarandomNoodle@reddit
Vram is important, but 90% of gamers will never utilize 16gb of vram lol
Imgema@reddit
So? When you buy a $700+ card you expect it to last for a couple of years at least. So VRAM needs to be enough to support games released in the near future. Expensive graphics cards like this are not supposed to be monthly consumables or something.
resetallthethings@reddit
had to go hyperbolic
"90% of gamers won't utilize 16 gb of vram in games they are playing right now"
would have been fine.
10 years ago that would have been true of 8 gb too, but stuff changes.
The_Anal_Advocate@reddit
That's a real bad take.
Edwardteech@reddit
May I introduce you to the modding community?
After-Stress9693@reddit
His point still stands. I’d say that less than 5% of the pc gaming community mod their games in any sort of form
Firm_Transportation3@reddit
Which seems crazy to me, because I love mods. It's such a great bonus option compared to console.
9okm@reddit
Even 5% feels generous to me, lol.
chris-tac0@reddit
256k will be enough.
Red-Eye-Soul@reddit
the 6800 is almost 1/3rd the price of a 4070 super in my country. The fact there is even a single game that performs better on the former should be illegal. A few years from now, the 6800 will at least still be able to run games at 1080p medium-low, while the 4070 super will fail to run some of them or suffer from insane stutters.
AnEmpireofRubble@reddit
i will not be sitting on a 4070S 4 years from now when it might start to matter.
abrahamlincoln20@reddit
I don't think there will be that many games in the near future (~2-3 years) that absolutely require more than 8GB of vram, seeing as the overwhelming majority of PC users have a card with at most 8GB, including the 5060, which will no doubt become the most popular GPU of this generation.
Starky3x@reddit
4070s is like 60% more expensive.
chris-tac0@reddit
Do you consider the MSRP of a 4070 an investment?
If so should you consider future titles which will come with higher VRAM requirements when you make that investment? Especially at 4k.
GigarandomNoodle@reddit
U r underplaying how massive the difference in rasterized performance is between these two cards lol
Synaps4@reddit
Performance comparison goes out the window when one card refuses to run the game.
GigarandomNoodle@reddit
It does run tho. Just not with every single setting cranked to max at 4k lmao
FarSmoke1907@reddit
If you buy 4070 for 4k that's on you tbh. If you don't then you won't have any issue if you are not using path tracing. It's that simple.
Successful_Line_6098@reddit
You're an insane edge case.
(no hate, just couldn't resist)
Synaps4@reddit
And this is why in 5 to 7 years there will still be a market for rx6800s and nobody will want a 4070
lt_catscratch@reddit
That's kinda how they make their money. Nvidia and intel want you to upgrade as soon as possible. Meanwhile amd still releases cpus for am4 platform.
madeformarch@reddit
It took me a long time to see it for NVIDIA but you're right, for sure. My gaming PC is 5700X3D and 4070S, up from 5600X / 3060ti. I could have upgraded the CPU and left it alone, but they got me on the GPU upgrade. I 100% did not need the GPU upgrade, but around Thanksgiving 4070 supers were in stock. I went from 980 to 3060ti during the crypto/covid shit era, and figured it would be summer before (Nvidia) GPUs stabilized again.
Intel has my Unraid server by the balls as well. Again, perfectly fine on a 10th gen CPU, but they managed to make the 12th gen appealing enough to want to upgrade. I'm sure it's worse for Intel gamers, I can't imagine building a PC 3 years ago and feeling the need to scrap the whole thing because you're 4 CPU generations back
SubstantialInside428@reddit
The RX 6800's opponent was the 10GB 3080... not so much an edge case in this matchup
danielmutter@reddit
Yo dude, 60-80 fps at 4K medium for Indiana Jones is INSANE for the 6800. Even the 7900XTX suffers with ray tracing. Even low setting ray tracing, dude. At 1080p you should get the FPS you're getting right now, but at 4K, how in the living i9 13900K? Used 4070's are sketchy off ebay. Where did you get it? And yeah, VRAM in budget nvidia cards is trash. Keep the 6800 or get the Intel Arc B580, that thing is INSANE for the performance, cheaper and just as powerful as the 4070. Competes with the 4080 overclocked.
Broly_@reddit
Ray Tracing to Path tracing is resource crazy btw
But I didn't see you mentioning DDU when you switched from red to green. ~~Also why you on your alt?~~
SolidProtagonist@reddit
This post is verging on misinformation. I'm surprised it was allowed to stay up. There are some compromises, but you certainly can use path tracing with a 4070 super in this game.
https://www.youtube.com/watch?v=PqkO-dDVKFY
Impressive-Formal742@reddit (OP)
No shit you can, but at 4k high settings you can't. I never said flat out you can't; I said at the settings I want, you can't.
SolidProtagonist@reddit
I read it, man. The fact that you were trying to run it at 4k high could have been clearer. Also, I don't think that 4070 super should crash if it runs out of vram. Should just kill performance. Not trying to hate here either
Impressive-Formal742@reddit (OP)
I agree, I wrote this at 2am half asleep lol. My ADHD is atrocious when I'm tired and I tend to ramble. And I promise you, it literally crashes with a Vram error lol. I dropped to 1440p same settings, high, dlss quality and path tracing.. 90 fps easy if not more. I'm going to mess around with it more tonight. I was just stating that I ran the same 4k high settings with FSR quality and same path tracing and got 60 to 80 fps on my 6800. I'll update this post more tonight with more testing. I didn't mean to stir a damn Nvidia Hornets nest lol. I also bought a 6950xt today, so I'm curious as to how far I can push it at 4k.
SolidProtagonist@reddit
Oh you're good man. And I believe you about the crashing, I'm just saying maybe it's a driver issue or something.
And for the record I agree with you about the vram on the 4070 cards. I actually have one and I'm considering getting a 9070 or 9070 XT because of the extra vram.
montrealjoker@reddit
I have a 4070 and 5800x and play in 4K on a 55” LG C3 with ray tracing and DLSS4 Performance (transformer model) with no problems. Obviously not psycho level ray tracing or path tracing but no issues or crashes. I am going to pass this GPU to my son shortly and upgrade with more vram but have not had a game I could not play yet as long as you tweak settings a bit.
SolidProtagonist@reddit
Yeah, I played through this game with a 4070 at 1440p with path traced sun shadows. I did turn the textures down to high. There are plenty of videos on youtube of people playing at similar settings at 1440p or 4k. My guess is OP had the textures maxed or his card is simply faulty.
montrealjoker@reddit
Maybe they had an issue with switching from an AMD card to an Nvidia card without the drivers being erased properly, who knows…
Smajlanek@reddit
All you need to do is to lower the settings and you're fine. You can't really expect to run everything maxed out with 4070.
Homolander@reddit
Facts! I don't know why so MANY people feel the absolute need to run everything maxed out.
WhoTheHeckKnowsWhy@reddit
same, I usually go to optimisation guides for that very reason. So many games have ultra settings that cost many times more performance than what they deliver visually. Cyberpunk 2077 is notorious for this.
Only exception is ironically Black Myth Wukong, it has several 'cinematic' settings that actually look noticeably better while only costing perhaps 5% over very high. But then that game is a b!tch to run without potato settings to begin with.
Chahay@reddit
You can’t run that game without ray tracing. The ray tracing toggle is for path tracing which is incredibly resource intensive.
Impressive-Formal742@reddit (OP)
Interesting! Thank you for the information! I did not know that, it is still crazy to me that I can enable the setting on my Rx 6800, but the 4070 super crashes lol. And I'm not talking like 30 fps on the 6800, but with FSR it's more than playable. I just picked up a cheap 6950xt reference card today, so that should be even better!
ounehsadge@reddit
That's why my 3080 10gb is already kinda done
WhoTheHeckKnowsWhy@reddit
it's clinging on for 1440p with some RTX for me. But yeah, if I could go back 2.5 years to the crypto crash when I picked it up, I would have sprung the extra $400aud for a 3090.
Even 12GB seems a risky buy for 1440p gaming that can last a few years onward these days.
abrahamlincoln20@reddit
That card is soon 5 years old, and its performance has been severely lackluster for some years already. 10gb isn't the thing that's holding it back, but raw power.
ounehsadge@reddit
Calm down. Severely lackluster for some years is wildly exaggerated.
pacoLL3@reddit
This place sounds worse than a cult.
A 4070 is not a card designed for 4k gaming with raytracing.... It is literally that simple.
A 4070TI would have zero issues in that scenario and perform on a completely different level to an 6800 in raytracing which has horrible performance on AMD in the game.
And Indiana Jones will not crash with every card on every setting when VRAM is full. Settings do exist for that reason. If you want to play the most demanding game with raytracing in 4k get a card that is designed to do that.
And I love how you guys base your purchasing decisions on extreme examples with raytracing settings instead of averages, which is what any sane person would use.
And then you pride yourself on "figuring out how it really is".
It's genuinely insane behavior.
No-Source2885@reddit
Literally LOL. If you use the 4070S for what it's designed for, 1440p, you would not have any issues.
KKnotoK@reddit
That's what I do and it ran great at max settings. Most games do honestly, and if one doesn't it's more the developers' fault for not optimizing it right, if anything
IKWhatImDoing@reddit
As Steve from GN pointed out in his most recent video, AMD looooves to help their circlejerk along. I honestly wonder how many posts like this are just AMD astroturfing.
CrazyElk123@reddit
Keyboard warriors all of them. For every amd user there are 10 nvidia users, but damn are they good at shilling. Its like the tables have turned completely from nvidia fanboys now being less obnoxious.
joe1134206@reddit
child left behind
AnEmpireofRubble@reddit
you maybe
al3ch316@reddit
Yep. My 4070ti runs this game fantastically even with path-tracing, and it's only got twelve gigs.
But I'm very aware it's not a card designed for that kind of thing plus 4k 🤣🤣
vkevlar@reddit
it always comes down to "I bought this, because I could afford it. Now it's good enough for everything, and I have to justify why."
It's not a new problem, I mean, example: I bought parts yesterday to build a new box, including a Ryzen 7 9800x3D. Today I'm seeing a review for the Ryzen 9 9950x3D, and my first impulse in watching the video is to dismiss the gains it has over the 9800, because I'm sad it's no longer king of the hill.
Reality is that every piece of hardware is purpose built and obsolete by the time you get it home; something built to a higher spec that's newer is more than likely to be better.
We just have trouble with the concept of "progressive" obsolescence. The hardware we have is always a set of compromises, usually based on money or the technology of the time (remember 1994's raytracing demos?).
birdman829@reddit
I've been playing the Great Circle on my 7900xt recently. 1440p very ultra settings with ray traced sun shadows and reflections ("high" ray tracing).
FSR quality and getting around 60fps which is perfectly fine for that game considering the gameplay and how fantastic it looks. Frame gen actually smooths it out nicely too. Sitting pegged at about 19.5GB of VRAM usage though.....and had been wondering how cards with better ray tracing performance but lower VRAM buffers were performing. Not well, I guess lol
RunalldayHI@reddit
Increase your page file size, done.
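Before resizing anything, it's worth checking whether the page file is actually under pressure. A minimal Python sketch (assuming the psutil package; on Windows, swap figures reflect the page file):

```python
# Minimal sketch: report page file and RAM pressure.
# Assumes: pip install psutil. On Windows, swap_memory() maps to the page file.
import psutil

swap = psutil.swap_memory()
vm = psutil.virtual_memory()
print(f"Page file: {swap.used / 2**30:.1f} GiB used of {swap.total / 2**30:.1f} GiB ({swap.percent}%)")
print(f"RAM:       {vm.used / 2**30:.1f} GiB used of {vm.total / 2**30:.1f} GiB ({vm.percent}%)")
```

If the page file is nowhere near full, it probably isn't your bottleneck.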
YB90@reddit
Indian Jones and the card that does not redeem
uxcantxseeme@reddit
7800xt FTW.
_NiceTry@reddit
I run Indy at high texture, medium ray tracing with frame gen on with no apparent issues. With 12gb vram you can go up to high texture resolution.
I7-13700kf, 4070 super and 32 gb ram.
bunkSauce@reddit
I mean. While true, IJ is basically cherry picking a worst case.
Gamebrogamingyt@reddit
I'd say 8gb for 1080p, 12 for 1440p, and 16 for 4k. Then again I'm not much of an expert, that's just an assumption.
withoutapaddle@reddit
For 12+ years, I've always just followed the rule that my GPU should have half as much RAM as my CPU, and it hasn't failed me.
My old "4GB" 970 was fine when I had 8GB of RAM.
Had an 8GB card for a long time when 16GB of RAM was recommended for high end gaming.
Now that 32GB of RAM is recommended for high end gaming, I have 16GB GPU.
Had an 11GB GPU in the middle for a couple years.
The biggest problem is that Nvidia seems to have plateaued, because they are assholes with a near monopoly. AMD is still increasing VRAM at a steady pace.
awr90@reddit
First of all why does everybody constantly try to play that shit game? It’s a kids game, and it’s horribly optimized on top of forcing RT. 10-12GB is good for 99.98% of games. I can play 4k max settings on Warzone with 10GB currently.
Azog_tre@reddit
My 1660 with 6GB of VRAM will never see either 4K or Indiana Jones.
Impressive-Formal742@reddit (OP)
Reference 6950xt for $400.
Impressive-Formal742@reddit (OP)
I always thought this card was pretty, ever since it was released.
whatisalegacy@reddit
LMAO “Indian Jones” is sending me to space why are there not a billion more comments about it
1__ajm@reddit
Raiders of the Lost Vram
879190747@reddit
Never had interest in the game, but never heard of people talking about it except for the whole vram thing. It's clearly an unoptimized piece of shit.
So people shouldn't go on apology tours for VRAM; they should say "Indiana Jones is an unoptimized piece of shit".
emptyzon@reddit
There's a reason why people make the effort to get the 4090/5090 cards.
FARAON_FACTORY@reddit
Went from a 3070 to a 3080 Ti because of Stalker 2, playing at 1080p. The 3070 was at its VRAM limit with everything on Epic. The best balance is core speed/capability correlated with VRAM size for the resolution you're playing at. I also went from 16GB of RAM to 64GB because the game sometimes uses 22... you can imagine I had the stutter of the century with 16GB...
CockroachCommon2077@reddit
Well no shit the 4070 Super shat the bed at 4K. It has 12GB of VRAM while the 6800 has 16GB. Obviously the 6800 will do better. I get it, Nvidia bad, but Jesus, this is just sheer amounts of copium.
ShadonicX7543@reddit
It's an edge case - if a game is that strict about vram usage I'm gonna blame the game a bit tbh. This is the only way the Nvidia card loses out really in this comparison.
Dark_ShadowMD@reddit
Good for you. I'm not changing my GPU for a game that probably won't even fit my needs.
This subject is as subjective as people liking and not liking the game. Besides, nVidia doesn't seem to share your views about VRAM so... let's hope your post makes them reconsider their "8 GB is the new 16 GB" motto...
Rhoken@reddit
The 4070 Super is not a 4K card, and indeed even DF suggests having at least 16GB for 4K if you want to crank up everything on that game.
But at 1440p it can run extremely well even with everything cranked up (except path tracing, of course).
I can say so because I have a 4070 Super and I have played that game; even with all settings cranked to max and without DLSS, the real VRAM usage was 10800 MB in the worst case.
With DLSS the VRAM usage was lower, while with FG enabled it was over 11GB but still under the limit.
Setting the Very Ultra texture pool instead of Supreme drastically reduced the VRAM consumption with zero difference in image quality.
RTGI was on high in every case; only path tracing was disabled, because without a 4090 it's impossible to use.
KoolAidMan00@reddit
I use a 4070 Super in my HTPC outputting to a 77" OLED and it looks fantastic. Final Fantasy 7 Rebirth and Monster Hunter Wilds look spectacular on it.
There are edge cases like Indy and Cyberpunk that will have problems if you crank up the path tracing settings, but by no means do edge cases like those games make the 4070S not suitable for 4K. As it stands the 4070 Super is significantly better than what the PS5 Pro can output, I've compared FF7 Rebirth, RE4 Remake, and MH Wilds, and there is no comparison.
If PT on anything other than low settings in games like Indy or Cyberpunk is a priority, then you should get something higher. For me the 4070S boiled down to size, power usage (my HTPC case is sub-10L), and price, and that card checked all of those boxes.
Best of all games look amazing, it really is a console on steroids. I still prefer using it for gamepad games despite having a 5080 in my desktop PC.
Imgema@reddit
Price wise the 4070 Super should have been a 4K card. It costs more than the whole PS5 console itself.
Its raw power is enough for 4K. The low VRAM amount is the main bottleneck. It's the same bottleneck the original GTX 960 2GB had, which prevented it from playing games like Resident Evil 7 at 1080p despite otherwise being very fast for the game at max settings.
Bottlenecking their cards with low VRAM is the simplest way for Nvidia to gimp them for planned obsolescence purposes.
abrahamlincoln20@reddit
Raw power enough for 4K at maybe 30-60fps or something, but what's the fun in that? High resolution slideshows can be watched without a dedicated gaming GPU.
Low-mid tier cards aren't made to last, anyway. This has always been the case.
Zaldekkerine@reddit
No, planned obsolescence would be increasing VRAM every generation. That way GPUs would turn into trash that couldn't play the latest games in at most two generations.
By not increasing VRAM between generations, they're actually extending the lifespans of older GPUs. If over 40% of the market plays at 1080p with 8GB or less VRAM because even the 5060 series still only has that much, most developers aren't going to make games requiring more than that. If they did, their game would instantly lose a massive number of potential buyers.
Rhoken@reddit
The 4070 Super is perfect as a 1440p high refresh rate GPU but not a main 4K one.
The 4070 Ti Super is a 4K card
SovietKnuckle@reddit
What I never understood is the amount of people who don't play at 4k. You can't even buy a modern TV anymore that isn't 4K - are the majority of PC gamers not even the slightest bit interested in playing on their TV on a couch?
A game like Indiana Jones is a great example of a game that doesn't require fast reflexes and is very cinematic in its presentation - a great game for playing on the couch like you're watching a movie. But all I ever hear on this sub is how we should dial down our expectations of ever comfortably playing at 4k.
KoolAidMan00@reddit
The issue is path tracing, not ray tracing.
Indy has ray tracing on by default, it won't even work on Pascal cards or earlier because of that. It is path tracing on medium or high settings that demands loads of VRAM.
acai92@reddit
Silly question, as I'm not 100% sure, but wasn't Indy one of those games where the texture setting was pretty much the only one that massively hogged VRAM, and you could basically keep all the other settings high and just drop that one as needed?
Dramradhel@reddit
I run it at 1080p if streaming or 1440p local with max everything on 12gb (3060) and it runs smooth with dlss. I’m only an hour in though.
_SirLoki_@reddit
I see the price comparisons but not GPU-class ones. The 4080 should sit beside a 6800 and a 4090 beside a 6900. So a 6800 should be better than a 4070, at least in theory.
squidgee_@reddit
4070 super with its 12gb vram is more suited for 1440p than 4k. I'd want 16gb minimum for 4k.
Try setting texture cache to low or use DLSS transformer model on performance mode. Most other settings are not as VRAM dependent so if those adjustments get you within your VRAM limit, you can probably crank everything else back up to max, except maybe ray tracing. Note that the texture setting doesn't affect quality the way it does in most games and there is absolutely no point in setting texture cache above medium/high in this game.
bubblesort33@reddit
Even textures at medium should be fine. I have a feeling he never dropped his textures below ultra, and just kept them at maximum with other settings at medium to crash the GPU on purpose so he could simp with this AMD post.
KirenSensei@reddit
It's not even suited for 1440p if you want ray tracing. I have a 4070, and both it and the 4070 Super fall incredibly short the moment you introduce any kind of ray tracing, despite the raw performance, because of the lack of VRAM.
Thorwoofie@reddit
Under development, the sequel: Indiana Jones: Raiders of the Lost ROP's
Mobslayer56@reddit
I run out of VRAM with my 24GB 4090. It's not the most common thing, but at 1440p maxed settings in modern titles it'll fill up quickly, whether it's just allocation or a memory leak, which I swear every other new game out right now has. Meaning after about 30 minutes to an hour the game will just chug and stutter until I restart it to clear the memory. On rare occasions my VRAM will fill up and my system will crash. Also, Nvidia drivers released after 566.36 have been complete garbage, booting to a black screen and forcing me to use safe mode to reinstall the drivers, after which everything's fine. They're also lowering my power usage, and in turn performance, to cover their own asses after they sold a bunch of explosive 5090s and 5080s.
fascfoo@reddit
Why would you "go on reddit" or tell multiple friends it's horseshit without evidence in the first place?
evangelism2@reddit
You are attempting to play it at 4K with RT on a 4070S? Sigh. I'm so tired of this shit.
al3ch316@reddit
Your actual problem is that you're trying to use a 4070S to play this title at 4k.
Twelve gigs is totally fine for 1440p, even in this title. I have a 4070ti (non Super) and it runs great with heavy path tracing and DLSS set to balanced. But it's not quite enough for modern AAA titles at 4k, where sixteen gigs is really the minimum if you want to avoid those issues.
If you're playing at 1440p, twelve gigs VRAM will last you just fine for at least three years.
KirenSensei@reddit
Lol, been saying this for years: if you want ray tracing you NEED VRAM. Doesn't matter how good a card is at ray tracing; if it doesn't have the VRAM to back it up, that card is USELESS for ray tracing.
TechniCraft@reddit
I have 4GB of VRAM 🥲
thunderc8@reddit
It's not only Indiana. I bought a 3080 10GB in 2020 and sold it in 2022 because of VRAM hiccups. Don't listen to redditors saying VRAM isn't important; it fucking is. My son's 6800 ran games smoother because of more VRAM, even though my 3080 had a stronger GPU chip. Lesson learned: more VRAM is always better.
EirHc@reddit
If you're playing at 4k or DQHD or even just a 1440p ultrawide, then I fully agree you should be aiming for a minimum 16gb.
I think that's a big reason for a lot of the Nvidia hate on the latest generation. 12gb of vram for a 5070 is like outdated before it even launches. Really pathetic how aggressively Nvidia uses VRAM to tier gate.
I made that exact mistake a previous generation. I play on DQHD and I upgraded from a 1080 to a 3070 Ti without looking at the specs. The 3070 Ti was a little nicer just because of DLSS... but without the gimmicks, it barely felt like an upgrade. I ended up selling the 3070 Ti and buying a 4070 Ti Super instead, and then my computer started to perform how I kind of expected it to. Keeping your VRAM headroom adequate is important when upgrading.
qeratsirbag@reddit
exactly why I decided to buy a used 3090
Minterpreter@reddit
Indian Jones and the Redeeming Rebels
Psychonautz6@reddit
I genuinely don't understand. I can post a screenshot, but my VRAM usage never went above 12GB in Cyberpunk at ultra 4K + ray tracing.
aVarangian@reddit
4k or "4k"?
Zaldekkerine@reddit
Cyberpunk and Indiana Jones are on opposite ends of the optimization spectrum.
stormarsenal@reddit
Now enable path tracing
MyzMyz1995@reddit
16GB of VRAM is the standard these days, especially with upscalers like DLSS and FSR. The PS5 and Xbox have effectively 16GB of VRAM when a game is running, and they're built to last years.
BaxxyNut@reddit
Rule of thumb: look at what consoles have to determine what you'll need. People should pay more attention to points like yours.
mostrengo@reddit
The consoles have 16GB shared between GPU and CPU. By your logic, 8GB of RAM and 10GB of VRAM would be plenty.
BaxxyNut@reddit
Yes, the point is it has enough for what it needs bud. You people are so annoying.
Unique-Client-4096@reddit
8GB is fine most of the time at 1080p unless you're using ray tracing and/or frame generation. In the handful of games that do go over 8GB rasterized, you can usually tune settings to lower VRAM using an optimized settings guide without losing too much of the visuals. There are even fewer scenarios where 10GB is not enough for 1080p.
Most cards marketed towards 1440p have at least 12 or 16GB of VRAM anyway, and in my experience it's very difficult to go over 16GB, as even Indiana Jones with both path tracing and frame generation at 1440p supreme settings barely tugs at that 16GB line, and that's without using upscaling to reduce VRAM.
Retr_0astic@reddit
They have 16 GB of unified memory.
In real-world usage, that's about 10-11 GB of VRAM for the GPU.
MyzMyz1995@reddit
For a PC, yes; not for a console. It's usually at minimum 14. The PS5 Pro has 16GB dedicated to the "GPU" and 2 for the OS, so the regular PS5 is using at least 14GB for the graphical side of things.
Retr_0astic@reddit
The PS5 Pro is using the extra 2GB for the OS, it is not allowed for games as per my understanding.
The 16GB is fully dedicated to games now, meaning the CPU and the GPU use it.
On the PS5 OG, the OS takes part of the 16 GB, probably bringing it down to 14 GB overall. Setting aside at least 3 GB for the CPU during games, that's around 11 GB for the GPU, and I'm being generous in my guesstimate. The system might allocate 4 GB to the CPU for all we know, too.
nimbulan@reddit
What are you doing running GPUs of that tier at 4K? They were never designed for that. You're only getting 60 fps in an absurdly well-optimized game even with settings turned down so I can't imagine how much image quality you have to sacrifice to get other games to run decently...
emily0069@reddit
I'm with ya on this one, I got a Red Devil 6900 XT recently for just 350, hope you enjoy your 6950 XT!
Armendicus@reddit
Where are you finding these cards at such good prices?!!?? Where do you live? Heaven?!!
jmurra21@reddit
Hey. So, I didn't read through all the comments to see if someone else posted this or not, but maybe it will be useful.
When I was getting VRAM errors on a couple of games (Rivals and Indy, with two similar, yet different, solutions), doing this fixed it.
(I've got a 4060 Ti.)
If you haven't already, install Nvidia Control Panel. In there, open the panel and navigate to "Manage 3D Settings." You can adjust these settings globally (I don't) or per program (let's do this).
Scroll down to the "Shader Cache" option.
You can set it to different values. For Indy, I set it to the highest possible setting under Unlimited. I didn't do Unlimited because I didn't want to have to check that folder and delete it from time to time, but if you're okay doing that, go for it.
Now, for Marvel Rivals, when I was getting the same issue, the solution was actually to look at the amount of VRAM on my graphics card and set the Shader Cache size to the value just under it... for example, if you have 12GB and the closest value under that is 10, you choose 10.
If you give it a shot, I hope it works out for you. It fixed it for me.
cdown13@reddit
So many GPUs could have been saved from a "need to upgrade" death if GPUs were like motherboards and you could add more RAM to them.
ibeerianhamhock@reddit
I mean, this is literally the textbook example of a AAA game that uses a lot of VRAM.
Still the fact that even one mainstream game eats up more than 12 GB at 4k means 16 kinda needs to be the new norm.
FrequentWay@reddit
If you can find a RX 9070 that would be a better deal.
izoru__@reddit
the only way to handle the vram.... reducing your resolution down to 1440p
xenocea@reddit
With all due respect, you should know by now that VRAM really does make an impact, and it can be the difference between a playable experience and one prone to crashes and stutters, especially at higher resolutions.
This has been proven plenty of times by the guys at Hardware Unboxed over the past few years. That said, a good lesson learned.
foggeenite@reddit
Friends don't let Friends buy Nvidia GPU's with low VRAM
DistinctStink@reddit
8GB was good 7 years ago; now I'm only considering 16GB or higher for any GPU upgrade.
DaudDota@reddit
Running 4K is the first issue here. I got a 4070 Ti Super to play at 1440p. You can play at 4K, but you have to compromise on everything.
MrTreb@reddit
Brother, I was playing Monster Hunter Wilds last night and my 7900 XTX was pushing 20GB of VRAM. I couldn't believe it.
DontKnowHowToEnglish@reddit
Yup, raytracing uses VRAM, frame generation uses VRAM
Famous_Dust7912@reddit
Ray tracing is just a waste of fps if you ask me.
NovelValue7311@reddit
Find an rx 9070xt or 9070. The best of both worlds...
Interesting_Ad5748@reddit
How does a gamer get so distracted that they concentrate more on hardware than the games? Is there a point in a game where you tell yourself you need a new card? Is there a lot of marketing in gaming? Where is the enjoyment in opening up a computer to upgrade parts? How did a box to play games on turn into something that requires a clear side panel so the LEDs can be seen? Another side of consumer culture, too, is that people buy a lot of stuff to prop up their self-identities -- which I think would be a whole lot less of a problem if people had the time to actually invest in themselves and their interests. Someone who thinks of themselves as an outdoorsy type of person, but who is working themselves to exhaustion and then collapsing in front of a television every night to unwind before bed, is a lot more likely to continually impulse buy a bunch of outdoorsy gear and gimmicky gadgets they'll never use -- just to feel better, to feel like the person they want to be -- than someone who is actually going out camping or kayaking or whatever all the time. And really, how much cooler would it be for people to be "producing" and "consuming" stuff like dance lessons, cooking classes, and professional campfire ghost storytelling? I think we all understand: spending all our time selling people junk so we can afford to pay rent and buy ourselves a little consolation junk is just the most unhappy and uninteresting way we as a society could be using our time.
TimmmyTurner@reddit
sell that away and get the 9070xt
VastInformationX@reddit
Oh, perhaps a mid tier GPU isn't meant for 4k ultra on latest games?
AdhocAnchovie@reddit
Shhhh, switching away from Nvidia is taboo in this sub :))))
EpicSombreroMan@reddit
What CPU are you running? I have a 4070 super with 9800x3d and am running the game with medium RT and quality upscaling (on ultrawide monitor) 1080p and haven't crashed once.
Impressive-Formal742@reddit (OP)
7600X, 32GB RAM, and I'm at 4K on my LG OLED. I don't want to have to sacrifice resolution because of crashing. It's just crazy they still won't go for 16GB except on the high end.
greggm2000@reddit
Not so crazy when Nvidia are limiting VRAM in part to coax buyers to spend more for a higher-end card. I don’t like it, but they do it bc they think they can get away with it.
mostrengo@reddit
Their market share and market cap clearly indicates that they can get away with it.
greggm2000@reddit
Indeed. There’s been enough problems with the 5000-series, and on top of that, AMD isn’t competing at the high-end this gen, and on top of that, the whole AI bit is still going strong, that I hope that if all those things change for 6000-series in a couple years, that perhaps next-gen will be a much better one. Possibly.. but probably not. Like you say, Nvidia can get away with it, most of their profit is elsewhere, they won’t change until they think they have to.
EnforcerGundam@reddit
The 4070 is not a 4K card lol; that's more suited to a 4080/4090.
Zaldekkerine@reddit
Nvidia even segments their GPUs for different resolutions. 5060 8GB 1080p, 5070 12GB 1440p, 5080 16GB 4k. If you step above that, most games will still play just fine, but prepare to lower settings sometimes for more demanding or less optimized games.
EpicSombreroMan@reddit
Your CPU might be a bit underpowered for the performance you're looking for, per the recommended specs sheet for the game. If you were to upgrade to the 7800x3d or 9800x3d you might have better luck. Because like I said the game hasn't crashed on me yet.
Have you also set your ram to the proper overclock speed in BIOS?
saysthings@reddit
You said you were running at 1080p ultra wide. Do you not understand that 4k is literally 3x the pixel count of your res?
It's the GPU causing the bottleneck, not the CPU.
TrollCannon377@reddit
Uses a 1440p/4K card at 1080p.
Doubts OP's performance while they're using a 4K monitor.
tan_phan_vt@reddit
Ultrawide as in 2560x1080?
EpicSombreroMan@reddit
Yeah. 75hz, its like 10 years old lol
tan_phan_vt@reddit
Ah ok, no wonder it's running fine lol.
4K is a different beast compared to your resolution tbh; it's so much more demanding.
BringerOfNuance@reddit
You used ray tracing with Nvidia and no ray tracing with AMD, what did you expect? Ray tracing bumps VRAM usage up crazy high. What's the point of all the VRAM on AMD cards if you can't use it for machine learning or ray tracing?
Miguel3403@reddit
My 3080 does fine in Indiana at 4K DLSS Balanced; I just had to turn the texture streaming setting down to medium with everything else maxed out, path tracing off. Even if it had 24GB, path tracing would just be too much for a 3080.
That setting doesn't affect texture quality, only how aggressively the GPU loads textures.
abrahamlincoln20@reddit
Luckily there are always options for the people who absolutely need to crank settings and resolution high and who are willing to play with low FPS.
For most people, high settings and resolution in all games isn't a must, for them 8GB is enough. 3060ti, 4060 8gb, 5060 8gb etc.
For enthusiasts who want high settings and high performance, 4080, 4090, 5080, 5090 etc.
For the minority that doesn't care about high fps but wants high settings and maybe high resolution, yet won't buy a high-end card: 3060 12GB, 4060 Ti 16GB, 5060 Ti 16GB, or any AMD card that has proportionally more VRAM compared to its Nvidia equivalent in performance. And this segment really is the minority, seeing how wildly unpopular AMD cards are.
GoatShapedDestroyer@reddit
I appreciate anyone willing to change their mind/opinion about things when confronted with new evidence, but I am curious why you thought this in regards to VRAM.
Repulsive_Coffee_675@reddit
Lesson learned. Don't talk about stuff you don't know shit about and haven't experienced yourself. Glad your eyes are open now :)
eaglefan316@reddit
My son plays Indiana Jones sometimes, too (and Cyberpunk). I made sure to get him a card with at least 16GB of VRAM for Christmas, so I got him a 4070 Ti Super. That card runs basically anything he wants at pretty high settings.
fightnight14@reddit
Who would buy a 12gb card and think every game would run smoothly in 4K? If so then what's the point of a higher end card if a $550 GPU is enough for 4K? Does it mean that a $300 RX 7600XT 16GB is a 4K card? Absolutely not.
Farren246@reddit
I can run it with only 10GB on my 3080, albeit at only 15 fps. It shouldn't be crashing on you.
yesfb@reddit
Stop playing shitty games
chrump4eva@reddit
I have a 4070 super :( but only play 1440p. Hopefully that'll be fine for the next 10 years
OkithaPROGZ@reddit
You aren't wrong. But 16GB still isn't the minimum.
Unless you want to play AAAA games all the time.
You need to ask yourself: are you playing the game because you bought an expensive GPU, or are you buying an expensive GPU to play a game?
If all you do is play Minecraft and GTA V, you don't need too much VRAM.
Basically your experience is an outlier rather than the norm.
LGWalkway@reddit
Yea I’m not sure what’s up with nvidia and giving cards low VRAM. I’ve got a 3070 and it should absolutely not have 8 gb’s while the 3060 had 12. At this point 16 should be the minimum.
Perplexe974@reddit
4K needs at least 16GB of VRAM from what I've been seeing lately with all the news about the 5070.
It's really funny when even Intel offers a 12GB card... ...aimed at 1080p-1440p for around 350 bucks.
AntiGrieferGames@reddit
That's optimization, dude. Devs can't optimize games anymore.
Pucketz@reddit
Did you reinstall shaders?
iskender299@reddit
You need 16GB VRAM for 4K and that's not a secret. Regardless if it's AMD or nvidia.
Nowadays 12GB is kinda bottom of the barrel in video gaming, and that's for 1440p.
Liambruhz@reddit
I went from a 3070 (8GB), which couldn't run Indiana Jones on anything but absolute low settings at 1440p. Any slider I put to medium capped the VRAM.
Then I got my 7900 XT (20GB) and now I can run everything maxed out at over 100fps consistently.
VRAM was a huge oversight for me when I built my first PC.
Imgema@reddit
Can't wait for the 8GB 5060 to become the best-selling card, because we get what we deserve.
Fredasa@reddit
Cyberpunk, a 2020 game, oversaturates a 3080, a really damn nice GPU, without a lick of modding. On my 3080 I literally allowed this to dictate how I play the game, mostly in that I avoided opening the main map unless I truly needed it.
Somebody found an Nvidia setting that really helped with that. I don't remember specifically what it was but it was some kind of minor optimization that's on by default and turning it off can lower performance by like 1%. In exchange, whenever VRAM would get oversaturated, the game wouldn't be permanently stuck in 6fps mode. After about 15-20 seconds it will have cleared up whatever it needed to and I was back in business. That setting saved my ass.
I don't know if VRAM oversaturation always permanently breaks games or if that's just a Cyberpunk thing.
cancergiver@reddit
I'm suffering every day with my 8GB 3070 Ti. Every day.
JeffGhost@reddit
Damn, I almost bought a 4070 Super. When I went to buy it, the store had bumped the price above what I had saved. Sadly the 9070 XT was way too expensive.
rasmusdf@reddit
Where do you find open box cards at decent prices?
serose04@reddit
Everyone was saying 16GB 4060 Ti wasn't a good choice. Time will prove them wrong.
thebeansoldier@reddit
Why not use dlss quality so it renders at 1080p instead?
Impressive-Formal742@reddit (OP)
I did, I tried performance etc. It still fucking crashes lol. Either this game is optimized for AMD or it just uses a metric ass-ton of VRAM, because at medium settings and DLSS Quality it's really good. But flip on the ray tracing and it shits the bed.
Bmacthecat@reddit
Low VRAM shouldn't crash the game, just make it run worse. Did you get all of the new drivers you need and delete the old ones?
icantchoosewisely@reddit
It depends on the game how it behaves on low VRAM: some games are a stuttery mess, some crash, and some run great but load textures with a delay or drop them.
grandmapilot@reddit
It depends on engine
icantchoosewisely@reddit
Hogwarts used to be a stuttery mess when it ran out of VRAM; after a patch it started to run fine but would load textures with a delay or drop them.
Could you change the behavior of a game engine that much with a patch? (really curious because I don't know if you can or not)
grandmapilot@reddit
Yes, if the engine supports it from the start but the devs simply decided not to use it at first, for the sake of graphical fidelity.
Impressive-Formal742@reddit (OP)
I did a whole new fresh install since I had free time to do so. So I know it isn't that.
erantuotio@reddit
What about turning the texture cache to a minimum? I was running out of VRAM on my 4080 in Indiana Jones, and then someone pointed out it's a texture cache graphics option, not a texture quality option.
thebeansoldier@reddit
Have you tried deleting your shader caches? All your shader caches were built for an AMD GPU, so you have to delete all of them after installing the 4070.
Claymore_Hunter@reddit
Vram is insanely important, especially when you start modding things here and there. Back in the Skyrim days it was already evident we needed way more VRAM, 16GB should indeed be the standard, or better yet, it should be expandable.
THEYoungDuh@reddit
4k is your problem
kociol21@reddit
This game is something special. Path tracing is absolutely obliterating my 9070XT.
Like, I play at 1440p and if I set everything to absolutely max settings with no upscaling, I easily get over 100 FPS in most areas.
If I turn on even lowest path tracing, it drops to sub 30 fps. We're talking over 70% performance hit.
On path tracing medium, even FSR ultra performance with frame generation won't help. At this point we probably need another 3 generations to be able to play current games with path tracing on midrange GPUs.
Although - and I realize this is probably stupid, but I genuinely don't know how this works - I was always under impression that ray tracing would decrease VRAM usage, not increase it.
How I understood it is... well, compare it to audio, because that's more up my alley: you can use a sampled instrument, which is basically a fancy way to play a ton of WAV files and requires a lot of RAM, or use a synthesizer, which doesn't need RAM because it doesn't load any premade audio files but requires much more compute power.
So I thought ray tracing was like that: instead of making tons of lighting-dependent textures, light maps for everything, etc., which then have to be loaded into VRAM, we use real-time compute to simulate the behavior of the light, which makes all those hand-made light maps and textures unnecessary.
Is that not how it works?
No_Guarantee7841@reddit
4070 super is not targeted at 4k tbh. Just 1440p. If you were aiming at 4k you should have gone for the ti super.
Narrow_Chicken_69420@reddit
I didn't know Jones was Indian :D
anyway, there is a video from hardware unboxed where the rtx 5070 runs out of memory in that title. It gives like 12 fps or something lol.
LosMechanicos@reddit
To be fair, with my 3070 I never had any crashes at 4K, and with the 6900 XT the uplift was only barely noticeable, really.
alfisaly@reddit
Indian jones
MrMunday@reddit
What you can try is normal textures with DLSS upscaling, vs high res textures with no DLSS, and see if you can tell the difference
I use a 3080 10GB, so I never have enough VRAM. Hence regular textures + DLSS is such a good pair, because it looks high res but is easy on the VRAM. Not that it's the same, but if the difference is small enough for you, it could be a good trade-off.
CobblyPot@reddit
The Digital Foundry video on this game did a great breakdown. Basically, the texture pool setting has to be set to match your VRAM but everything else is super optimized. For 12gb, you should be able to run texture pool at high (IIRC on my 4070S I was able to run RT medium and still get decent performance).
Did have an annoying issue where every time I booted up the game after playing it would overflow VRAM and give me slideshow performance, but I was able to fix it by lowering the texture pool to medium, loading the game, then I could crank it back up to high and it would be fine.
theRealtechnofuzz@reddit
This is currently the boat I found myself in one year after buying a 3080 (10GB). It's a crime Nvidia put less than 12GB on this card; it should have been 16GB... I played the Halo Infinite campaign, not even that intense of a game, but at 1440p on ultra you need 11GB of VRAM. I don't have stuttering or crashing issues; the game plays fine. I do however get terrible texture pop-in, which is very distracting when you run into a tree that looks hideous and then it pops into full texture. Is the game unplayable? No. Is Nvidia incompetent and greedy? Yes.
corvus917@reddit
I always found it kind of baffling that Nvidia has been so stingy about VRAM since Ampere. I get that they are incentivized to avoid adding generous amounts of VRAM as a way to encourage more frequent upgrades, but ray tracing and DLSS frame gen NEED more VRAM, which means their x070 cards are inherently and unnecessarily handicapped from the moment they're released.
It’s just so unnecessarily stupid and greedy to handicap the killer features of your own cards like that.
Demi180@reddit
Yeah their specs sheet was clear about needing cards with lots of vram for RT as opposed to the actual beefiest cards.
6retro6@reddit
Nvidia and their shortage of VRAM on anything short of a $2K+ card sucks.
6retro6@reddit
Take a look at the budget-friendly 9070 XT with 16GB or the 7900 XTX with 24GB.
moksa21@reddit
Yeah one of the biggest tech YouTube channels made an entire video on the subject on the exact games and settings you’re talking about. Not sure what was driving your expectations.
catsandcars@reddit
6950xt sux at rt tho
honeybadger1984@reddit
8 to 12 gigs is a well known issue if you crank it up at 4K. Especially with full RT and frame generation.
Per Steam surveys and console users, most people will never fully crank things up to where they need 16 to 32 gigs. But the idea is that anyone on the bleeding edge will be running into a VRAM bottleneck sooner than normal gamers.
Zerog2312@reddit
I've got the 6950xt and it's a really good card. No complaints at all. There was no way I could have gotten 16gb of vram for the same price from Nvidia.
BaxxyNut@reddit
There are certain thresholds for certain resolutions. 12GB is a 1440p card; 16GB+ is 4K. You really only need a certain amount; after that, anything extra is just a cherry on top that the system will use if available. It doesn't mean it actually needs all that. Like RAM: if I have 16GB, normal processes will eat like 6-8GB. With 32GB it consistently sits at 15GB. It doesn't need it, but will use it if available.
EiffelPower76@reddit
Pretty stupid to replace a 16GB graphics card with a 12GB one; it was nonsense from the start.
Impressive-Formal742@reddit (OP)
Yeah, I'm pretty dumb like that.. damn, you know how to hurt a man's feelings. lol, jk with you. I was just curious is all; a great deal popped up and I figured I'd try it out. I mean, would you say the same if I replaced a 16GB 7600 XT with a 4070 Super? Probably not.. but who am I to say anything.
ogioto@reddit
Ray tracing in general utilizes a shitload of VRAM. If a game maxes or nearly maxes your memory before RT is even enabled, better not try using it...