NVIDIA GeForce RTX 3060 12 GB Returns in June, RTX 5050 9 GB Edition on Pause
Posted by StarbeamII@reddit | hardware | View on Reddit | 157 comments
MythicalJester@reddit
There is something incredibly wrong within this industry, right now...
CoconutMochi@reddit
The one thing I'm glad of is that the vast majority of game studios seem to have stopped pushing the bar on system requirements. I think people are going to be able to stretch their purchases a lot further than they would've been able to maybe 8 years ago.
Strazdas1@reddit
You're glad the industry stagnated in the worst possible way?
CoconutMochi@reddit
Where'd I say that?
Strazdas1@reddit
Here
CoconutMochi@reddit
how does "you are glad that the industry stagnated in worst possible way?" = "The one thing I'm glad of"? Can you read?
Strazdas1@reddit
You said:
"The one thing I'm glad of is that the vast majority of game studios seem to have stopped pushing the bar on system requirements."
This is you being glad the industry is stagnating in the worst possible way.
It's like you don't even know what you wrote.
CoconutMochi@reddit
No, it's you taking stuff I wrote out of context to ask a bait question. Nobody is happy about the current state of gaming hardware, but at least devs aren't stupid enough to target $1000+ GPUs for system requirements. I think anyone should be able to glean that much, but you're obviously being obtuse just to be hostile to people.
EdiT342@reddit
Getting a 4080 at launch for £100 under MSRP was the best build decision I've ever made lol. It's still in the top 5 GPUs in benchmarks, and the only real upgrade would be either a 4090 or 5090. And it looks like we won't see any 60-series till next year, maybe.
I am kicking myself for not getting a 5090 at MSRP during Black Friday. I had the thought of buying it, keeping it sealed for a few months, then selling it on eBay for a bit of profit lol. Knowing myself I would have probably ended up putting it in my system tho
beatool@reddit
I got a 4080 FE at MSRP at launch ($1200) and sold it recently for $1000. 3.5 years of a kick-ass GPU for $200 ain't bad.
I don't know why someone would want to spend $1000 on a 4080 when you can get a 5070 Ti 16GB brand new for the same money, but someone did.
fmmmlee@reddit
I got my 4090 for about 1800 USD almost 2 years ago, and looking now the cheapest refurbished one on Newegg is like 2900 USD, it's insane...
building a PC a year or two ago was like the last chopper out of 'nam, my <$250 4TB SSD is now going for $750, my $99 32GB RAM is now $500. I think the only thing that actually depreciated was my 13700K and maybe the fans? Lmao
EdiT342@reddit
I was just checking mine, I paid £243 for a 4TB MX500 in 2023 and £200 for 64GB of RAM about a year ago.
That kit is now out of stock but Crucial lists it at £1100. Madness
SireEvalish@reddit
The overwhelming majority of games, especially those that are pushing graphics, have to run on a base PS5/XSX/XSS. They're targeting that hardware. This happens every generation.
NeroClaudius199907@reddit
Stop using logic; the majority of devs should target 5090-tier GPUs.
Strazdas1@reddit
The majority of devs should target GPUs not yet released, because the dev cycle is long enough for multiple GPU generations to come out in the meantime. If your "extreme" settings aren't meant for future GPUs, you did it wrong.
NeroClaudius199907@reddit
That's not a target. Devs' main target for this console gen is to get games running well on everything from the Series S to the PS5.
For PCs they can release ultra-ultra settings if they want; it's not going to increase sales.
Strazdas1@reddit
Yes, currently devs are targeting the lowest common option, causing severe stagnation in the industry.
NeroClaudius199907@reddit
The industry is already facing job issues; no one is going to create a new Crysis and not sell anything.
Strazdas1@reddit
By "job issues" you mean it still has more employees than before COVID?
NeroClaudius199907@reddit
And they don't want to lose any more; that's why we haven't seen a new Crysis. Everyone understands the risk.
zghr@reddit
If a dev is aiming for shiny graphics, advanced settings should have a simple "future proof" toggle that maxes out LOD, draw distance, any light-bounce based effects and similar. For future generations.
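A minimal sketch of what such a toggle could look like; all the setting names and values here are made up for illustration, not taken from any real engine:

```python
# Hypothetical "future proof" preset: one switch that pushes every scalable
# setting past today's presets, aimed at hardware that doesn't exist yet.
FUTURE_PROOF_OVERRIDES = {
    "lod_bias": 0,                  # always use the highest-detail models
    "draw_distance_m": 20000,       # effectively no distance culling
    "gi_light_bounces": 8,          # crank light-bounce based effects
    "shadow_map_resolution": 8192,
}

def resolve_settings(base: dict, future_proof: bool) -> dict:
    """Apply the future-proof overrides on top of the user's base settings."""
    return {**base, **FUTURE_PROOF_OVERRIDES} if future_proof else dict(base)
```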
Prasiatko@reddit
Is that what Crysis tried? It then got panned by people complaining they couldn't run it at max settings on the latest 8800 Ultra, and that the game was thus unoptimised trash.
Strazdas1@reddit
No. Crysis's issue is that they expected single-core CPUs to continue advancing, and they put all their AI in Lua scripts that were single-threaded. This is why the worst FPS drops in Crysis happen when you raise the alarm and all the AI activates, and why it's still an issue on modern CPUs.
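A back-of-the-envelope sketch of that bottleneck (the per-frame costs are invented, not profiled from Crysis): if every agent's AI script runs on one core, frame time grows linearly with the number of active agents, no matter how many cores the CPU has.

```python
# Invented per-frame costs, just to show the shape of the problem.
SCRIPT_COST_MS = 0.8   # assumed AI script cost per active agent per frame
OTHER_WORK_MS = 8.0    # assumed render/physics work per frame

for active_agents in (5, 20, 40):  # e.g. before vs. after raising the alarm
    frame_ms = OTHER_WORK_MS + active_agents * SCRIPT_COST_MS  # one core only
    print(f"{active_agents:2d} agents -> {frame_ms:5.1f} ms (~{1000/frame_ms:.0f} fps)")
```

On those made-up numbers, going from 5 to 40 active agents drops you from ~83 fps to 25 fps, and a faster GPU doesn't help at all.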
zghr@reddit
Yeah, GPU manufacturers who often work with AAA devs also might not like it. "What do you mean our 5090 only gets 25 fps?? That's bad publicity!"
Antrikshy@reddit
I think it's partly because games truly look good enough and making them look better isn't top priority for sales anymore.
Seanspeed@reddit
Games have always looked 'good enough' by the standards of the day. I remember seeing Final Fantasy 7 and being absolutely awestruck at how amazing it looked back in the 90's. We only ever look back and see graphics as dated because developers continually push and set new bars and standards. People's imaginations tend to be pretty terrible, so it really takes developers pushing things before people understand how much better things can still get.
I personally find it tremendously disappointing that we're plateauing in terms of graphics and ambition. The way things are going, there might genuinely never be another truly large leap in graphics again. And today's graphics are, whether people realize it or not, still extremely 'gamey' looking and not remotely close to as good as things could get. But that would require hardware to continue to get big boosts in performance per dollar over time, and that's just not happening. So while hardware may get more powerful, what companies can offer at an affordable price point isn't getting that much better. Some of this is financial reality, but some of it is also just companies getting greedier and greedier.
Just sucks.
Antrikshy@reddit
Several games that came out ~10-15 years ago look so good, I feel they will never age.
Games don't have to look like real life. They can look better, through artistic flair, and good looking art styles don't need realism.
Seanspeed@reddit
Of course not all games need to look like real life.
But it's a massive misconception that graphics advancements are only about achieving real-life visuals. Improvements in graphics tech generally increase the overall freedom for developers/artists to do what they want to do. Indie games also benefit hugely from graphics tech advancements, for instance.
Better graphics are always something gamers like, whether you want to admit it or not. Imagine if graphics and tech had stalled in the PS1 era. Sure, there were still plenty of great games then, but we'd never have a game like Red Dead Redemption come out. Skyrim couldn't exist. The Last of Us couldn't exist. Elden Ring couldn't exist.
You only think it sounds ok to stall now cuz you have limited imagination for what games could be in the future.
Antrikshy@reddit
I really think there is a difference. My claims were made based on a look back, not the future.
In the PS1 era, could we look at 10-15 year old games and be satisfied with the graphics? I can do that now with Arkham Knight, Tomb Raider (2013), Uncharted 4, Doom (2016), Far Cry 3.
Horizon Zero Dawn came out 9 years ago and it's my go-to example of a game that looks so good in its own way, it will look just as impressive another 10 years from now.
Strazdas1@reddit
This shows how horrible the stagnation has become, and also how extremely low your standards are.
Seanspeed@reddit
This is a specific result of this new generation NOT pushing the hardware and ambitions anymore.
That's the point here. Every game nowadays has to be 60fps, for instance. This already essentially cuts in half the potential of the new hardware.
Horizon Zero Dawn also has loads of very limited visual aspects. You're really falling foul of what I'm talking about: not understanding how much things could be better cuz you can't imagine it / haven't experienced it yet.
People felt the same way about a game like Wave Race 64 at the time.
Antrikshy@reddit
I have seen newer games than those. They are clearly nicer and more realistic looking on average. Despite having seen the advancements with my own eyes, I am more than satisfied with mid-2010s graphics, at least from the higher budget games.
kekmanofthekeks@reddit
And yet, technical advancement will continue despite your boomerisms. The takes on this sub, man.
Strazdas1@reddit
Yes they do.
Not only do good-looking art styles need high-quality assets; the majority of games do not benefit from heavy stylisation.
cadaada@reddit
I play games for gameplay, not graphics. If we actually reached a plateau I would be more than happy; we could finally be free from the importance devs give to graphics and advance on the gameplay, something AAAs have kept mostly the same for more than a decade.
Seanspeed@reddit
Everybody says they care more about gameplay than graphics, but every time graphics take a leap forward, everybody also loves it. Games like Uncharted have little going for them on a gameplay-basis, but their presentation values are through the roof and so people love them.
Either way, improvements in graphics improve the medium overall. It's not all just superficial by any means. Immersion, presentation quality with characters, animation, draw distances, great particle effects, exploratory design, etc - all these things can be improved by advancements in graphics tech.
NeroClaudius199907@reddit
The Uncharted games pushed beyond what the consoles were capable of?
proto-x-lol@reddit
CoconutMochi said:
"game studios seem to have stopped pushing the bar on system requirements"
Not even. Game devs are genuinely scared to push anything graphically intensive beyond 8GB of VRAM. Doing so means the devs get put on PIPs and become the next targets for layoffs.
Ask former Treyarch, DICE, Respawn and other devs at big game studios. That's sort of the reason why some games haven't gotten much better graphics-wise after 2022.
NeroClaudius199907@reddit
Stopped pushing system requirements? Almost every other triple-A game now needs upscaling, plus there's the whole move to UE5. There are just diminishing returns now because they can't push graphics further than what consoles allow.
I wish more devs targeted The Callisto Protocol's facial technology and graphics. I'd be satisfied.
Chrystoler@reddit
My 10GB 3080 keeps on trucking along, still rocks for 1440p
psiphre@reddit
This is the card I’m still using. I have yet to stress it even gaming plus three extra monitors
Chrystoler@reddit
Yeah, I'm not worried in the least for now. If anything, I want to upgrade my monitor to an OLED but my current monitor is still working fine so I don't think I'm doing that anytime soon.
ThrowawayusGenerica@reddit
And by "right now", you mean "for the past decade or so"?
LukaC99@reddit
The thing wrong is the supply crunch. The upstream suppliers, TSMC, Micron, SK Hynix, did not build enough capacity to serve demand. We're all paying for that: since not every customer can be served, demand must be destroyed.
EnglishBrekkie_1604@reddit
Not their fault Sam Altman went to each memory supplier, bought 40% of their future DRAM production, and didn’t mention he’d done the same thing with the others.
jenny_905@reddit
You could argue that it makes sense to keep older stuff in production if it's still viable and the 30 series very much is.
Of course it should be a lot cheaper than it will be.
WildVelociraptor@reddit
If they'd kept it in production, it would be great. However, we seem to be resurrecting older tech stacks that were discontinued, which implies that they can't meet existing demand with their current products.
jenny_905@reddit
The 3060 was in production. I'm not convinced it ever really stopped, though many tech sites claim it did. If it did stop, it hasn't been for long.
Like, for the entirety of the 40 series and a big chunk of the 50 series it was in stock and available globally. People speculated that it was either still in production or they had such a giant stockpile. It was still a strong seller too.
poorlycooked@reddit
keeping older cards in production as a lower end product is consumer-friendly; already happens with CPUs
actual prices may not necessarily be palatable in this case though
Plank_With_A_Nail_In@reddit
People are buying what they're selling, so they must be doing something right. What's wrong is people crying about every little thing. It's Nvidia's card; they get to do what they want with it, no need to have a go at them for it. Don't like it, don't buy it.
NeroClaudius199907@reddit
The Return of the King
Sh1rvallah@reddit
This isn't about 1080 ti though.
NeroClaudius199907@reddit
The 1080 Ti is DOA in 2026 unless they add RT and port DLSS.
Sh1rvallah@reddit
Just saying it's the king, respect the king. The 3060 is not the king.
Strazdas1@reddit
It's not the king and it was never the king.
HayabusaKnight@reddit
It is but a crown prince to the true king, the 8800 GTX
excaliburxvii@reddit
God that generation was such an insane leap forward. Even comparing the best of what came before it, it was going from Half-Life 2 to Crysis. I miss that progression.
WildVelociraptor@reddit
You seem to be confusing a meme with relevant info
Sh1rvallah@reddit
I'm allowed to dispute the use of the meme on an unworthy product
_Flight_of_icarus_@reddit
It's crazy to think a nearly decade old GPU would still be viable today if it did have RT/DLSS support.
I doubt we'll ever see a GPU that good again.
-WingsForLife-@reddit
2080ti.
Strazdas1@reddit
The 1080 Ti aged horribly compared to the 2060.
ExplodingFistz@reddit
King of the entry level. The 3060 12 GB is a great card. Decent performance for 1080p and has adequate VRAM. Could probably run it at 1440p with some DLSS too. It's aged incredibly well.
Sh1rvallah@reddit
The 3060 isn't entry-level, and let's be real, it was a disaster trying to get one for the first year+ it was out. It was also an underwhelming card until 8GB cards hit VRAM ceilings. The 3060 Ti was $70 more for a 30% performance uplift.
Seanspeed@reddit
How have people forgotten about this? :/
For just $70 more, the 3060Ti was way more powerful.
Antec-Chieftec@reddit
Except by the time the 3060 Ti was truly 70 dollars more, the lack of VRAM on 8GB cards was fully exposed.
THXFLS@reddit
It was barely faster than the 2060 Super while the 3080 and 3070 left the 2080S and 2070S in the dust. 3060 Ti was much faster for (at MSRP at least) not much more money. 12GB helped it age well, and I’m very jealous of those 2GB it has over my 3080, but let’s not go crazy here.
Malygos_Spellweaver@reddit
If my 4060 mobile can run 1440p+ with DLSS, surely the 3060 can.
pythonic_dude@reddit
The 40 series runs DLSS 4/4.5 much better though.
Malygos_Spellweaver@reddit
You're absolutely correct. Still, even if DLSS has to be lowered, the image quality should be reasonable; of course it will depend on how much these are priced at.
ShadowRomeo@reddit
Except the 1080 Ti would literally be worse than even a 3060 in modern games that use the DX12 Ultimate API, in DirectX ray-tracing-only titles, and anywhere you rely on a proper hardware AI upscaler to get good image quality. As much as I respected and dreamt of that GPU nearly a decade ago, it's time to retire it, as it fully deserves.
DiggingNoMore@reddit
I see literally zero difference between my RTX 5080 and my GTX 1080. Games look exactly the same.
Loading times are faster, but that's presumably due to my 9800x3d instead of my i7 6700k.
BavarianBarbarian_@reddit
Do you play in 720p?
DiggingNoMore@reddit
1920x1200 60Hz monitor.
BavarianBarbarian_@reddit
What games do you play? In anything recent, you should see a substantial difference in frame times between the two GPUs. Did you maybe set the monitor to a lower framerate?
DiggingNoMore@reddit
Rocket League, Dead By Daylight, stuff like that. I do have Chained Together and Slay the Spire 2, which are both newer, but I have to be seriously pressed to spend more than $10 on a game. I'll make another exception for GTA 6 when it releases. The Monster Hunter Wilds benchmark tool said I'd be able to play it on Ultra at 137 frames per second.
But since my monitor is 60Hz, it cannot display more than 60 frames per second. My GTX 1080 had no problem pushing Rocket League at 60fps at 1920x1200, so both it and my RTX 5080 max out my monitor's display.
zerinho6@reddit
If your display output standards are from the GTX 1080 era and you mainly play games only as heavy as games from that era, then you indeed won't feel the difference even with a 5090. I have an RX 9060 XT, which means I can run anything in existence at perfect performance if I'm targeting 1080p, but I have a 1080p 300Hz monitor, which made my upgrade from the RX 6600 massive in image fluidity and frame time. I too play a lot of simple/live-service games like you, but the times I play something like Requiem or BF6, I thank myself for getting a newer card.
BavarianBarbarian_@reddit
I can't imagine the 1080 pushed steady 60 fps on Dead by Daylight or MHW.
Also, I'd recommend investing in a 1440p 144Hz IPS monitor. They've come down in price a lot, and the upgraded visuals are worth it.
DiggingNoMore@reddit
MHW, surely it could not. But DBD ran just fine (until they upgraded to Unreal Engine 5 and it stopped working on Windows 7).
One day, I'll get a nicer monitor.
But I built this $2,500 computer in February of last year, intending to get a new monitor at Black Friday. However, my wife and I ended up going on an Alaskan cruise last September, so I was more strapped for cash than I expected. Then, in October, my dream car finally became available for sale within my budget (the cash that I had been setting aside for this purchase for well over a decade), so I bought it.
Between our various cars in December through February, I ended up replacing three flat tires, fixing a door, and fixing a coolant problem. Then, in February, our backyard tree fell on our house.
I'm supposed to take my family to Disneyland for our vacation this year.
I just haven't been in a position that I can justify replacing a monitor that works.
Omniwar@reddit
When it rains it pours, I guess.
If you really want, trade your 5080 to someone for a 5070ti and $200 cash (someone will take that deal - verify the card works first though) and go get a new monitor. Those 1440p fast IPS screens are regularly under $200, even as low as $150. I'm assuming you're on something like a Dell U2412M.
DiggingNoMore@reddit
Heh, if I really wanted to, I could sell my 96GB DDR5-6000 CL30 and downgrade to 32GB like a pleb.
Sh1rvallah@reddit
And? What does that have to do with the 1080 ti being the king GPU?
I'm not asking for it to be remade.
jnf005@reddit
The 3060 has been top of the Steam hardware chart for years now. The 1080 Ti was an awesome card and probably one of the best ever released, I knew it, I had one, but the 3060 has been the king of the common people.
Sh1rvallah@reddit
That's completely irrelevant. Lower cost cards get higher representation, that doesn't make them better.
zerinho6@reddit
Yes it does. More people will care about a good, cheap 60-class card, the most used all around the world and representing half the market, than about the strongest card, which costs more than they make in an entire month and which they don't even care about.
Seanspeed@reddit
Crazy that people have forgotten the 3060 was, 12GB aside, somewhat disappointing. The 3060Ti and 3080 were the highlights of the Ampere lineup.
_Flight_of_icarus_@reddit
Even if its performance was just OK, I think it's the great VRAM panic of 2023/24 that did a lot to inflate the reputation of the 3060, with all those hardware reviews and videos coming out that showed new games stuttering on 8GB models.
Not a bad card, just the 3060 Ti and 3080 were the standouts as you say.
Antec-Chieftec@reddit
I remember people pointing out after the release of Doom Eternal how the 3060 would outperform the 3060 Ti at Ultra Nightmare 4K.
Seanspeed@reddit
Yea fair enough.
antaran@reddit
The 3060 was a pretty shitty card back then, heavily criticized for its comparatively sub-par performance.
Vaxtez@reddit
I'll be curious to see the price. If it's at old MSRP, this 3060 re-release is probably near DOA. However, if for whatever reason the 3060 is a £200 GPU, then it might actually be a solid purchase for people wanting new parts.
jenny_905@reddit
It was £250 for years so... yeah, I suspect they'll re-introduce it (it hasn't really been gone, just out of production briefly) around that price.
FatalCakeIncident@reddit
Seems plausible. For comparison, the cheapest 5060 8GB is currently £270, so £20 less for a slightly slower GPU with more VRAM seems a reasonable offering.
Still seems mad that 5-year-old parts are still so valuable though. I'm not knocking it - I still run a 3080 and 3090 in my workstations - but the fact that these old cards are actually appreciating in value seems so wild to old me, who cut my teeth in an era in which a 5-year-old part was practically worthless.
jenny_905@reddit
Yeah it is silly. It's 2020 hardware, it should be £150 or less.
Of course, for a lot of games it basically matches the 5050; it lacks support for the latest features though, which will probably see it fall behind in newer titles. That it can be worth £250 (or more, I'm sure some will try) is... a sign of the times.
It has been such a strong seller though, all the way through the 40 series life because they never really replaced it. It's a great choice for people who just want something that works and is unlikely to throw any surprises at them due to its adequate amount of VRAM.
hardware2win@reddit
That's not how you decide prices.
jenny_905@reddit
It is how we used to do it.
Strazdas1@reddit
No it isnt.
imKaku@reddit
I wouldn't call a 40-50% performance difference "slightly slower".
FlyingBishop@reddit
What are you measuring performance in? Dollars per FLOPS or FLOPS per watt? The latter is probably more interesting, though if we're talking about budget GPUs, the former might be what actually motivates people.
Vb_33@reddit
Overall gaming performance in fps.
FlyingBishop@reddit
That's essentially driven by dollars per FLOPS, but FLOPS per watt still matters.
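For a rough feel for how the two metrics can diverge, here's a worked comparison with illustrative numbers; the two "cards" and their TFLOPS/price/power figures are made up, not measured specs of any real product:

```python
# Hypothetical cards with assumed (FP32 TFLOPS, price USD, board power W).
cards = {
    "budget card":   (13.0,  300, 115),
    "flagship card": (105.0, 2000, 575),
}

for name, (tflops, price_usd, watts) in cards.items():
    per_dollar = tflops / price_usd * 1000   # TFLOPS per $1000 spent
    per_watt = tflops * 1000 / watts         # GFLOPS per watt drawn
    print(f"{name}: {per_dollar:.0f} TFLOPS/$1000, {per_watt:.0f} GFLOPS/W")
```

On those made-up numbers the flagship comes out ahead on both, but by far more per watt than per dollar, which is roughly the dynamic being argued about here.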
n0stalghia@reddit
Good for you. The market is not you, though. Which is corroborated by gigantic power-hungry GPUs selling really well for generations on end: 3090, 4090, 5090.
kwirky88@reddit
If we’re talking about the market as a whole and not the enthusiasts of this sub then most desktop pc owners own brand name desktops with anemic power supplies. Power does matter.
FlyingBishop@reddit
Actually the market is not you. Power-guzzling gaming PCs are a tiny slice of the market; most people buy integrated graphics and laptops, even most people who play 3D games don't buy those kind of graphics cards.
antaran@reddit
Nobody goes into a shop planning to buy a dedicated graphics card and then asks "how much efficiency per watt do I get", lmao. They ask "how many frames will this card get in Cyberpunk 2077".
n0stalghia@reddit
The 3060 12GB is not a handheld or laptop GPU, so I don't see how that's valid. It's specifically for desktop PC users.
People who buy integrated graphics and laptops won't buy either the 3060 or the 5060.
FlyingBishop@reddit
desktop GPUs compete directly with all of those things
imKaku@reddit
Sorry, but no.
FlyingBishop@reddit
On the contrary, which is why everyone buys laptops and desktop gaming rigs are very niche.
Xpander6@reddit
Which GPU do you mean by "the former"?
FlyingBishop@reddit
No, I mean FLOPS per watt is more interesting to me than dollars per FLOPS.
kwirky88@reddit
The RAV4 hasn't changed for 20 years. GPUs could end up the same.
Vb_33@reddit
The 5060 is about 45% faster than the 3060. No way in hell is that "slightly slower".
Vaxtez@reddit
The prices of used AMD RDNA 2 GPUs do feel baffling. There's little reason to nab an RX 6700 XT for £250 when used 3070 Tis are the same amount.
resetallthethings@reddit
I nabbed a used 6700xt in 2023 for $200
kinda crazy that's still a steal 3 years later
IguassuIronman@reddit
I've honestly been fairly pleasantly surprised by used GPU prices. Sure the 3000 series is fairly old but they're holding up pretty well at 1440P
jenny_905@reddit
3080 for £300 or less is still my pick, that thing holds up very well.
BavarianBarbarian_@reddit
I can attest to the staying power of the 3080 12GB, it deals well with everything I've thrown at it. Probably helps that I mainly play older games, but last year I treated myself to Hellblade 2 and Space Marine 2, both of which worked nicely on that card.
Vaxtez@reddit
I find that Ampere GPUs minus the 3050 & 3060 12GB are pretty good value used. Especially the 3070 & 3080s. Turing is also pretty good value too.
The only RDNA 2 card that I consider decent value used is the RX 6600, which can be had for £130-140 at times.
HayabusaKnight@reddit
Remember the constant drumbeat for 2+ years that 8GB is dead and a waste of money; that's why all of the cards you see at good value are 8GB VRAM, while the cards with 12GB+ maintain high resale value.
TrippleDamage@reddit
12GB VRAM will 1000% not sell for £200.
It's gonna be £250 at best.
jenny_905@reddit
They'll presumably price it competitively with B580 so I'd expect £250 as well.
kinkycarbon@reddit
Nvidia isn't going to discount the hardware to match past prices on the used market. If they can sell at MSRP or higher, they will try.
TheAppropriateBoop@reddit
Not ideal, but if it keeps entry-level GPUs affordable, it’s not the worst outcome.
bubblesort33@reddit
What does "returns" mean? Nvidia claims they've been selling it this whole time, and even though it's terrible price in most places, it might be true.
flgtmtft@reddit
That's fucking insane. Re-releasing a 5-year-old GPU.
DukeofVermont@reddit
I'd buy it for $100, I have a 2060 super. Unfortunately it will not be $100.
Beautiful_Ad_4813@reddit
excellent, I can't wait for the trusty 3060 to come back out.
3 machines will get 'upgrades'
ImSoDoneWithUbisoft@reddit
If it costs more than $150 then good luck.
hackenclaw@reddit
I think Nvidia can cancel the 5050; it's a separate chip from the 5060/5060 Ti.
The only place they need the 5050 is laptops, but the laptop 5050 also uses the more expensive GDDR7, which is in short supply now. And the laptop 5060 on average only costs $75 more than the laptop 5050. So there is really no reason to have the 5050 occupy high-demand 4nm and GDDR7 capacity.
If they can sell the 3060 below the 5060, it can fill the 5050's role, and the freed-up 4nm/GDDR7 capacity can go to higher-end GPUs. The 5050 is a high-volume product, so canning it would free up a lot of manufacturing resources.
Vb_33@reddit
The 5050 uses GDDR6 not 7.
JorgitoEstrella@reddit
The desktop one; I think the laptop one, weirdly enough, uses GDDR7.
jenny_905@reddit
Yeah the laptop version does.
It's the better version of that chip for sure but also slightly awkward as part of the laptop product stack given they've got 5050-5070 Mobile all with very similar specs.
skylinestar1986@reddit
A desktop 5050 can sell very well if it doesn't need a dedicated power connector.
Seanspeed@reddit
This is a very overrated advantage, especially for DIY market.
hackenclaw@reddit
The market for that is very niche compared to the majority of consumers, who don't mind a power connector. Supplying the majority during a shortage is actually the priority.
Seastorm14@reddit
Eh, the RTX 3050 6GB got a fanbase for that exact reason.
Businesses or universities throwing out computers with i5-11400s, 16GB DDR4, and 512GB NVMe drives for $100-$200, and slapping in a 3050 6GB powered directly from the PCIe slot, made budget gaming a thing again.
Yeah, it's 1080p and maybe medium-low settings, but it's technically as cheap as entry consoles (even cheaper now that both are $650+).
But I think a low-power, single-slot 5050 that runs off the PCIe slot would do really well for the budget/entry side of things.
AdeptFelix@reddit
Crazy that we've now reached the point that we are quite literally going backwards in computer hardware releases. Fuck this AI bubble and everyone enabling it.
Darksider123@reddit
One of the biggest players in this space is Nvidia
Darrelc@reddit
Gotta let folk have their FPS interpolation.
Mark my words, soon Nvidia won't be selling you a 4K card anymore, it'll be a "4K experience"
Seanspeed@reddit
Frame gen is a pretty good use of AI.
This is not what is driving the problems at the moment. Nvidia's resource dedication to graphics AI stuff is probably pretty minimal overall compared to everything else they're doing.
Appropriate_Name4520@reddit
why was it ever discontinued in the first place
resetallthethings@reddit
because it's 2 generations old....
Appropriate_Name4520@reddit
Yeah, but it could have worked as a cheap GPU even if the market were more normal. People still bought those mini PS1 consoles in the early 2000s...
EnigmaSpore@reddit
watch this shit be $350 and not $200 like it should be in 2026.
StickiStickman@reddit
Why would they price it higher than a 5060 which is way faster?
FrikandelebroodjeNL2@reddit
Just let the 3060 fucking die already
Verite_Rendition@reddit
Hey /u/StarbeamII
Can we please get this re-flaired as a rumor. This is not confirmed news.
To quote TFA: "According to well-known hardware leaker MEGAsizeGPU on X"
Dreamerlax@reddit
Could be useful for local LLMs if cheap enough
GalvenMin@reddit
AI fucks the whole ecosystem --> Nvidia resurrects tech that's half a decade old --> yay, let's use it for AI
PaulTheMerc@reddit
I mean, makes sense. Not good enough for Corporations, but good enough to run AI at home.
gabeandjanet@reddit
The 3060 was a budget 1080p card before UE5 came out.
Now it's a 30-50 fps card in UE5 at 900p upscaled to 1080p.
Don't buy this GPU unless you just play esports games.
A second-hand 3060 Ti is a far better card and could be had cheap last time I checked, because it's only 8GB.
Dangerman1337@reddit
Wonder if this 3060 will have a non-clamshell design with 2GB modules vs. the OG clamshell design.
hackenclaw@reddit
They'll probably go with 128-bit 8GB VRAM but with faster 18/20Gbps GDDR6 to make up the lost bandwidth. RAM is very expensive; it's not like they can sell these 3060s for more.
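The bandwidth math behind that: peak GDDR6 bandwidth is bus width × per-pin data rate ÷ 8, and the original 3060 12GB shipped with 15Gbps chips on a 192-bit bus. A quick comparison of the original spec against the speculated cut-down configurations:

```python
def bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) * per-pin Gbps / 8."""
    return bus_width_bits * pin_rate_gbps / 8

configs = [
    ("3060 12GB: 192-bit @ 15Gbps", 192, 15.0),   # original card's spec
    ("speculated: 128-bit @ 18Gbps", 128, 18.0),
    ("speculated: 128-bit @ 20Gbps", 128, 20.0),
]
for label, bus, rate in configs:
    print(f"{label}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
```

So even 20Gbps chips on a 128-bit bus (320 GB/s) wouldn't fully recover the original 360 GB/s.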
jenny_905@reddit
Never saw a 3060 with a clamshell design. Not saying one didn't exist, but they always seemed to use 6x2GB modules, usually on the same PCB (with space for 8 modules per side) that was also used for the 3070.
blackbalt89@reddit
Checks calendar, what year is it?
PhantomWolf83@reddit
Gonna grab one to pair with my 5060 Ti 16GB for local LLMs if the price is right. 5060 Tis are either sold out or going for real bad prices in my country (I'm lucky I got mine early before they were gone). It's going to be slower at inference without GDDR7 but 12GB is still better than 8 for my use case.
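As a rough rule of thumb for the local-LLM use case (the overhead figure and effective bits-per-weight are assumptions typical of 4-bit quantization, not exact numbers for any particular runtime):

```python
def weights_fit(params_b: float, bits_per_weight: float, vram_gb: float,
                overhead_gb: float = 1.5) -> bool:
    """overhead_gb loosely covers KV cache, activations, and driver context."""
    weights_gb = params_b * bits_per_weight / 8  # model weights footprint
    return weights_gb + overhead_gb <= vram_gb

for params_b in (8, 14, 24):
    print(f"{params_b}B @ ~4.5 bits: "
          f"12GB={weights_fit(params_b, 4.5, 12)}, "
          f" 8GB={weights_fit(params_b, 4.5, 8)}")
```

On those assumptions, the extra 4GB is roughly the difference between fitting a ~14B-class quantized model entirely in VRAM and spilling it to system RAM.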