AMD Radeon RX 9060 XT features 32 RDNA4 CUs, 8/16GB GDDR6 memory and PCIe 5.0x16 - VideoCardz.com
Posted by Cute_293849@reddit | hardware | View on Reddit | 47 comments
1mVeryH4ppy@reddit
AMD could've used a 12GB configuration, which would be a spit in Nvidia's face. But once again they chose to follow in Nvidia's footsteps. Corporate is not your friend. Let's see if Intel will offer something interesting.
TurtlePaul@reddit
There isn’t really a big supply of 3 GB modules out there, so they really couldn’t make a 12 GB card from a 128-bit memory interface.
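Rough napkin math, for anyone wondering why the options are 8/16GB and not 12GB: each GDDR6 module occupies its own 32-bit channel, and clamshell mode mounts a second module per channel on the back of the PCB. A quick sketch (the 2GB modules are the standard GDDR6 parts; the 3GB module is hypothetical, since it doesn't exist in GDDR6):

```python
# VRAM capacity from bus width and per-module density.
# Each GDDR6 module sits on its own 32-bit channel; clamshell
# puts a second module on each channel (back side of the PCB).
def vram_gb(bus_bits: int, module_gb: int, clamshell: bool = False) -> int:
    channels = bus_bits // 32
    return channels * module_gb * (2 if clamshell else 1)

print(vram_gb(128, 2))                  # 8  -> base 9060 XT config
print(vram_gb(128, 2, clamshell=True))  # 16 -> the 16GB variant
print(vram_gb(128, 3))                  # 12 -> needs 3GB modules GDDR6 doesn't have
print(vram_gb(192, 2))                  # 12 -> or a wider 192-bit bus
```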
the11devans@reddit
That's a problem of their own creation. 6700 XT, 7700 XT, both 192-bit. Did they just forget how to do it?
GenericUser1983@reddit
AMD decided well over a year ago that this gen was going to be basically a placeholder while they got the real next big graphics architecture (UDNA) ready. So they went the cheapo route & only designed 2 chips: a low-end one that will be going into the 9060 XT, and a medium-high-end one (for the 9070 XT) that is basically a simple doubling of the low-end chip. This is also why AMD went with cheap and readily available GDDR6 instead of the GDDR7 Nvidia is using.
Strazdas1@reddit
But they already had two placeholder gens.
changen@reddit
RDNA1 was 100% a placeholder, as it didn't even have the "doubling" for the larger chip.
RDNA2 was VERY good and competitive with the 3000 series as it did have the double sized chip (6900xt 80CU is a doubled 6700xt 40CU).
RDNA3 was supposed to be good, but their chiplet experiment basically failed.
RDNA4 is the placeholder for UDNA, with a doubled medium sized chip (32/64CU). Big RDNA4 with the 80CU could have been competitive with the 5080, but it was not worth the engineering cost.
Strazdas1@reddit
RDNA 2 was a placeholder, dead on arrival, that no one actually wanted.
RDNA 3 was a chiplet experiment failing, so they decided to sell none of them, hence the 10% market share drop.
RDNA 4, they decided not to compete at all, because they weren't able to. But they sold you a nice story.
And people still believe the next gen will save it.
changen@reddit
RDNA2 was not a placeholder at all lol. It was competing with the 3090 in raster in most games, as RT was non-existent back then.
I would say it was the only recent gen where AMD was on equal standing with Nvidia (the last one before that being the 7970 vs the 680 lol).
Strazdas1@reddit
No, the RDNA2 RT was nonexistent. Nvidia users were enjoying RT without issues.
changen@reddit
I had a 3080 lol. I was not enjoying RT at all.
It was a gimmick until games forced it (Wukong, Indiana Jones); then it became a requirement and not a gimmick.
RT was the equivalent of PhysX or HairWorks or w/e other tech that was pointless; sure, it's nice to have the option to turn it on, but it's completely optional and extraneous.
Strazdas1@reddit
But PhysX was awesome (and has evolved into engine-integrated physics that a great many games use today). HairWorks, and especially the TressFX and PhysX solutions, were awesome and made the characters look so much better. It was not pointless. Neither was RT.
Ninja_Weedle@reddit
If this is their placeholder gen then I have high expectations for UDNA
Burns504@reddit
No, they want to sell less for more.
ThrowAwayRaceCarDank@reddit
Couldn't they just use a 192-bit memory bus, like the RTX 3060 did? That came with 12 GB of VRAM.
Strazdas1@reddit
No. Not on a chip this small. Every extra memory controller takes space away from compute area.
noiserr@reddit
They could, but the chip is too small for that; it's only 153mm². You need lots of PHY edge area for these wide memory buses. The B580 is 272mm².
They could have used a 96-bit bus though. This would also give you 6GB and 12GB (when mounted on both sides of the PCB in clamshell configuration).
But then you would also only have a 96-bit bus, so a slower GPU.
Tuna-Fish2@reddit
Yes, and it would probably have been a better card.
But that decision had to be taken ~2 years ago, and they didn't. Now they have what they have.
uzzi38@reddit
There literally aren't any 3GB GDDR6 modules, forget having enough supply of them.
Kryohi@reddit
Intel was the first to release in this performance bracket... They won't have more for quite some time, not until their next gen.
Also, a 12GB config on this 9060 XT would have been a very bad tradeoff: at 3.13GHz this thing will already be bandwidth-starved in some games as it is, and with a cut-down 96-bit bus it would get much worse.
I also think a 192-bit bus on a 153mm² die likely wouldn't even be possible (I'd be happy to hear from someone with more insight on this).
ryanvsrobots@reddit
Quite simply, it's planned obsolescence.
DerpSenpai@reddit
Not really, AMD doesn't have a choice in this. Making a 12GB-compatible SKU would mean heavily increasing costs for all SKUs, either by going for pricier memory or by making the card memory-starved.
They could make a 12GB 9060 with a 96-bit bus, though.
Kryohi@reddit
The 8GB models in a way yes, although I bet we'll see more of the 16GB ones.
ryanvsrobots@reddit
In a way?
https://www.reddit.com/r/hardware/comments/1kg25i6/hub_rtx_5060_ti_8gb_even_slower_than_the_arc_b580/mr1ikwe/
Kryohi@reddit
And?
I'm not going to write a comprehensive answer to obvious low quality bait.
ryanvsrobots@reddit
You wrote more than you did in your original comment brother
GenZia@reddit
The 9060 XT will have GDDR6, just like its larger brethren.
To deliver 12GB on a 96-bit wide bus with GDDR6, AMD would have to clamshell 2GB modules (24Gb/3GB modules would only get a 96-bit card to 9GB, and they don't exist in GDDR6 anyway). Pairing that 96-bit bus with GDDR7 instead would give slightly higher bandwidth than the 160-bit RX 6700 non-XT @ 320 GB/s.
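For what it's worth, bandwidth is just bus width times per-pin data rate, so the comparison is easy to sanity-check. A quick sketch (the 28 Gbps GDDR7 and 20 Gbps GDDR6 figures are assumptions for illustration, not confirmed specs):

```python
# Memory bandwidth in GB/s: (bus width in bits / 8) * per-pin rate in Gbps.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(160, 16))  # 320.0 -> RX 6700 non-XT, 160-bit GDDR6 @ 16 Gbps
print(bandwidth_gb_s(96, 28))   # 336.0 -> hypothetical 96-bit GDDR7 @ 28 Gbps
print(bandwidth_gb_s(128, 20))  # 320.0 -> 9060 XT-style 128-bit GDDR6 @ 20 Gbps
```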
ThermL@reddit
The 9070 GRE is 192-bit with 12GB (albeit on a cut-down Navi 48 die, not a ~151mm² one). AMD literally already makes the correct product.
Dangerman1337@reddit
If there were 3GB GDDR6 modules, they would've been perfect for this.
mishka5169@reddit
Please, have the 8GB be a Chinese exclusive. And stock up on the 9070s; that should carry you over to year end, AMD.
Anything else is silly at best and a blatant error at worst.
cabbeer@reddit
dude, that's so mean, why should they be stuck with 8gb
Strazdas1@reddit
I think he's more hoping it's a limited edition run than shafting the Chinese.
mishka5169@reddit
I'd prefer no 8GB cards, but to be clear, the idea is more so "give the cards to only one market" = there's enough of it; "that one market can tailor it to a specific use" = OEM and cyber café.
OEMs, prebuilts and cybercafés are a particular use case where they can still make money off of cheap cards, and they can point their customers to a fair use for these cards (MOBAs, FPS and other smaller and/or older popular games, like MMOs).
With these conditions, there are many Asian markets where that's a huge portion of gamers (Korea, China).
AMD (and Nvidia) have historically released products for a single market or purpose, namely in China. So I picked China for its cybercafé gamer population.
PS. But yes, part of it is that they b*tch and moan much less about those types of deals than the rest of the world, for "whatever" reason, and thus the 8GB won't put a bad rap on the 16GB model, if the latter is priced right.
Leo1_ac@reddit
The most important thing is waiting to see if AMD will pull the same underhanded BS they did with their 9070/XT launch, wherein they launched a minimal number of cards at MSRP, sold at Microcenter (they subsidized the AIBs to sell at MSRP), and then when they stopped subsidizing the cards, prices jumped +$150 to $300+.
ButterscotchFew9143@reddit
Here's hoping that NVIDIA will react by lowering prices on their 5060 series and in turn AMD does the same, but NVIDIA seems so unconcerned with anything that is not datacenters that I guess this will never happen.
GenZia@reddit
That's a nice break from the x8 nonsense, though I can't say I'm too happy about the 32 CUs.
Given the specs, the 9060 XT is essentially a 9070 XT chopped in half, which basically puts it in the RX 7700 XT's ballpark in terms of rasterization.
Still, it largely depends on the MSRP.
It would be interesting if they sold the base 8GB variant at sub-$250. That would finally give us a true successor to the RX 580.
mrblaze1357@reddit
Eh, not so much; the B580/B570 is around that performance/price threshold, but with more VRAM on both cards.
DerpSenpai@reddit
The B580 is a 4060 competitor, not a 5060 Ti... competitor
GenZia@reddit
The CPU overhead is still a problem with Arc, and that's ignoring its optimization issues.
Not a big problem if you have at least an R5 7600 or better, but most users looking for cards in the $250 range are still stuck with an R5 5600, if not a 3600.
Personally, I'd much rather get a used RX6700.
OutrageousAccess7@reddit
B580/B570 >> are these still relevant? In price or performance, they aren't. Good luck finding these products at their glorified MSRPs of $250/$220.
ZGMF-X09A_Justice@reddit
so this is a 7700xt in raster, but with way better RT?
ParthProLegend@reddit
And FSR 4. Overall a much better card, I hope.
MasterLee1988@reddit
Yep. And if they can get the price right then it'll be fine!
ParthProLegend@reddit
Yes. Pricing is the most important part apart from availability
_comicallycluttered@reddit
I'm wondering how much the core count impacts FSR 4.
I know we're talking about rasterization here, but if it's able to utilize FSR 4 to its full potential (or at least close to it), then the 16 GB model might be a decent investment for mid-budget(-ish) builds, depending on the price.
As someone who's currently stuck trying to decide between a 7700 XT and a 9070 (because there are literally no other 7000-series cards available where I live except extremely expensive 7900 XTX models), it could be a decent middle ground for me, but who knows. Could also be a terrible option. Guess we'll have to wait and see how it performs in comparison.
floof_attack@reddit
Hopefully the price and supply are good. Right now in my area the 16GB 5060 Tis are in stock at ~$480.
If AMD can't at least do Nvidia-minus-$30, it's going to continue to be a disappointing year for GPUs.
ThatBusch@reddit
Pleasantly surprised by it being x16, although 8GB still sucks