constantlymat@reddit
As someone who plays basically everything in 1440p quality mode, AMD would be a real alternative if FSR4 adoption wasn't advancing at such a glacial pace.
I am confident that they'll get there in a year or so, but for right now I'd always shell out the fifty additional euros for the 5060 Ti 16GB.
Cinematicfly@reddit
The RX 9060 XT costs around €400 in my country, and I just bought my 5060 Ti 16 GB for €439… I think the price is pretty okay tbh
elevenatx@reddit
Except the price difference is a minimum of €70. And if AMD restocks and sells at MSRP, it's a €114 difference.
Muaddib_Portugues@reddit
That's a big IF, considering AMD cards have consistently been above MSRP and often not even available depending on where you live.
And honestly, the NVIDIA feature set is worth the €70 difference if you care about longevity and new games.
YOU_KNOW_WHO315@reddit
Price is everything.
Where I live there is a whopping $150 difference between an OC edition of the 9060 XT 16GB and the base model of the RTX 5060 Ti 16GB.
Muaddib_Portugues@reddit
I honestly don't think the 5060 Ti 16GB deserves as much criticism as it gets. Sure, it doesn't have much raw power, but the 16GB of VRAM complements the extra features it has and more than makes up for the lack of raw performance.
I can comfortably run it on a 1440p monitor with max settings, RT, DLSS and frame gen. Sure, I can't do that without frame gen, but it exists, and it's pretty good tech for the average player, who is the one getting such a card.
YOU_KNOW_WHO315@reddit
$150 to $160 more for 10 more fps is not the smart choice, as you could achieve the same base performance by buying an OC edition of the 9060 XT. But yes, when it comes to upscaling, DLSS > FSR. I agree that the image on FSR really looks blurry and DLSS does a better job at upscaling, but I hate frame gen on both sides because of the latency and fake frames.
an_angry_Moose@reddit
I'm looking at both of them right now and it's a $100 Canadian difference ($519 vs $619 for the lowest cost on the shelf). That's a 19% difference.
constantlymat@reddit
Every market is different.
Nvidia, especially Asus, is doing some very aggressive promotions here in Germany right now.
An Asus Prime RTX 5070 was €508 on Amazon just now, and a 5060 Ti 16GB just €415.
stemota@reddit
OptiScaler exists, fortunately, but yeah, not an excuse.
W_ender@reddit
It's fair, but it's not as slow as people like to describe it: we get 10+ games every driver update, and when the SDK gets released it'll accelerate.
Earthborn92@reddit
Yeah, the problem isn't a glacial pace, but that they had to start from zero FSR4 games, when almost any DLSS 2 game supports overriding to the transformer model.
Fritzkier@reddit
Yeah, basically 60+ games already, only 2 months since the announcement, is anything but slow.
BoringForumGuy@reddit
LOL, NVIDIA stooped as low as buying reviews of the 5060 Ti 16GB vs the 9060 XT 16GB, such as this: https://www.pcmag.com/comparisons/amd-radeon-rx-9060-xt-vs-nvidia-geforce-rtx-5060-ti-16gb-which-is-better
The article is so PAID that even the tests are handpicked in favor of the 5060, and the price comparison goes by MSRP: only the 5060 Ti's Amazon price is mentioned, with no mention of the 9060 XT's price there.
Titi-Martinez@reddit
Mhhh, so in summary: buy the RTX 5060 Ti.
lacovid@reddit
For content creation, real-time 3D work, video editing, or working with AI models, I assume the 5060 Ti will be a lot better, or not really?
Antonis_32@reddit (OP)
TL;DR:
The GPUs perform similarly. If both cost the same, he would choose the 5060 Ti for the better feature set.
Gearsper29@reddit
€369 + 8% = €399 for hypothetical equal performance.
So basically the classic AMD strategy: Nvidia price minus 10%.
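In other words, as a minimal sketch (using the EU prices quoted in this thread; the 8% gap is this comment's assumption, not a measured figure):

```python
# Sketch of the "€369 + 8% = €399" arithmetic: pad the 9060 XT's price
# by its assumed ~8% performance deficit to compare at equal performance.
# Prices are the EU street prices quoted in this thread, not MSRPs.
price_9060xt = 369.0   # RX 9060 XT 16GB
price_5060ti = 447.0   # RTX 5060 Ti 16GB
perf_deficit = 0.08    # assumed: 9060 XT ~8% slower

normalized = price_9060xt * (1 + perf_deficit)   # ~€399 at equal performance
discount = 1 - normalized / price_5060ti         # ~10-11% cheaper per frame

print(f"Performance-normalized price: €{normalized:.0f}")
print(f"Effective discount vs 5060 Ti: {discount:.0%}")
```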
conquer69@reddit
The cards are getting faster in RT and have a better upscaler now though, so the -10% strategy might be enough this time around.
The problem I see is the price. The 9070 XT at MSRP is offering better price/performance, which would make these cards overpriced.
Particular_Respect_7@reddit
What matters is the actual price in your region. Deciding which card represents best value for money based on MSRP is nonsensical.
Here in the UK, the cheapest RTX 5060 Ti 16GB I can find is £390 and the cheapest RX 9060 XT 16GB I can find is £315. Meanwhile, the cheapest RX 9070 XT I can find is £650. So it's more than twice the price of its little brother and 67% more expensive than the Nvidia card, while being around 50% faster. So there's definitely a market for the other two cards, unless you have a spare £300 under the mattress.
conquer69@reddit
The 9070 xt is 72% faster at 1440p and 82% at 4K over the 9060 xt. https://www.pcgameshardware.de/Radeon-RX-9060-XT-16GB-Grafikkarte-281275/Tests/Release-Benchmark-Preis-Specs-1473547/3/
I use MSRP for the comparison because the 9060 XT does have an acceptable price ratio to the 9070 XT if it were $700, but not at $600. And the 9070 XT is overpriced at $700.
This tells me AMD didn't really plan to sell the 9070 XT at $600, and this 9060 XT basically comes pre-scalped from the factory. I think it should be $330 at most to maintain parity with a 9070 XT that should be $600.
Particular_Respect_7@reddit
I got my 50% faster figure from the relative performance at 1440p tables on Tech Power Up and Hardware Unboxed.
Also, comparing the relative worth of GPUs based on either MSRP or what you think they should cost is still nonsensical. What matters is the actual dollars (or pounds or euros) per frame based on the price YOU pay. The MSRP figures are marketing fluff designed to make Nvidia and AMD look benevolent.
Additionally, these raster comparisons ignore upscaling and frame gen. Like them or not, they're here to stay, and they allow budget cards to achieve resolution/FPS combinations otherwise impossible.
top-moon@reddit
Percentages aren't reversible, you're reading the table wrong.
Particular_Respect_7@reddit
Who said anything about my reversing percentages? I said 50% faster. If I had arrived at that figure through reversing percentages, that would imply that the 9070 is twice as fast. But I didn't. And it isn't.
I'll let you get back to your exam revision.
top-moon@reddit
Hardware Unboxed says 119 vs 70 fps at 1440p (170%), and TechPowerUp at 4K lists the relative performance as 185%.
I assumed you reversed these to 59% and 54% and then rounded down for effect, since that mostly fits. Regardless, you have the wrong percentages.
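For anyone following along, a quick sketch of why those percentages don't reverse, using the 119 vs 70 fps figures quoted above:

```python
# Why "X% faster" doesn't reverse into "X% slower":
# the two percentages use different baselines.
fast_fps, slow_fps = 119.0, 70.0   # HUB's 1440p averages quoted above

faster = (fast_fps / slow_fps - 1) * 100   # baseline = slower card
slower = (1 - slow_fps / fast_fps) * 100   # baseline = faster card

print(f"9070 XT is {faster:.0f}% faster than the 9060 XT")   # ~70%
print(f"9060 XT is {slower:.0f}% slower than the 9070 XT")   # ~41%
```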
Jeep-Eep@reddit
Yeah, 7700XT raster is perfectly decent with that RT and ML (which adds to RT) uplift.
Hairy-Dare6686@reddit
They probably meant the 9070 XT if it were sold at MSRP; the 7700 XT isn't even relevant, being only slightly faster in raster than a 9060 XT while being slower in RT and lacking the new upscaler, all while currently being more expensive.
Jeep-Eep@reddit
Which is what I was trying to say, in a sleep-deprived way?
A few driver iterations' worth of uplift behind a 7700 XT in raster but much better RT and ML is not a bad deal for a -60 tier SKU.
hackenclaw@reddit
They will never gain market share with this pricing.
The 16GB 9060 XT should have been priced the same as the 5060 non-Ti.
Jeep-Eep@reddit
I'm not sure that sort of maneuver is well advised until GPU MCM matures, same with halos. Node economics make it difficult until that is an option.
Vb_33@reddit
That would have been a steal. $330 would have been good tho.
kikimaru024@reddit
>GPU is €80/18% cheaper
Here's how AMD is actually screwing you!
FOH
Your mind is so rotten, you can turn a positive into a negative.
Gearsper29@reddit
18% cheaper for 8% less performance = 10% cheaper for the same performance.
So you pay 10% less to get the same performance and fewer features. Basically the status quo.
The only positive I can find is that the feature disparity is smaller this gen. So AMD is not bad value this time, but it is not a bargain either.
kikimaru024@reddit
Is this that American "New Math" I've heard about?
hardware-ModTeam@reddit
Thank you for your submission! Unfortunately, your submission has been removed for the following reason:
Please don't make low effort comments, memes, or jokes here. Be respectful of others: Remember, there's a human being behind the other keyboard. If you have nothing of value to add to a discussion then don't add anything at all.
RHINO_Mk_II@reddit
100% cheaper for 50% less performance = a deal I'd take any day
(percentages don't work like that)
hardware-ModTeam@reddit
Thank you for your submission! Unfortunately, your submission has been removed for the following reason:
Please don't make low effort comments, memes, or jokes here. Be respectful of others: Remember, there's a human being behind the other keyboard. If you have nothing of value to add to a discussion then don't add anything at all.
jasswolf@reddit
There's an argument that image quality would be equivalent for 1440p with DLSS 4 Transformer Balanced setting vs FSR 4 Quality setting, which shifts the value proposition more substantially.
Bring ray tracing image quality into the mix, and then path tracing performance, and the numbers stretch further in favour of NVIDIA.
At current pricing in Europe, NVIDIA is the easy buy, but both probably fall away in value compared to the RTX 5070 (and possibly the RX 9070).
Jobastion@reddit
I've used both DLSS 4 (RIP 3090) and FSR 4 (on a 9070 XT), and they're close enough that you should be comparing the same relative scaling setting. DLSS 4 is better, but not that much better. Now FSR 3, sure: for those titles, if you don't want to faff around with OptiScaler, it'd be valid to compare Balanced to Quality.
CrzyJek@reddit
"Path tracing" and "60 class card" shouldn't exist in the same sentence.
Vb_33@reddit
The 5060 Ti path traces fine; it's only 10% slower than a 4070. The biggest limiter for the 5060 in path tracing is running out of VRAM, which the 5060 Ti 16GB fixes. If you're curious about the path tracing performance of the 5060 Ti, 4070 and more, check out DF's reviews; the majority of them test path tracing.
GloriousCause@reddit
I think DF showed the 5060 Ti hitting like 120fps with path tracing using MFG 4x mode in Cyberpunk. So only like 30fps of rendered frames (it showed well over 70ms of latency), and in my experience there can be a lot of artifacting coming from that low of a frame rate.
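As a rough sketch of that arithmetic (the 120 fps and 70 ms figures are DF's numbers as quoted above):

```python
# Displayed fps vs actually-rendered fps under MFG x4:
# each rendered frame is followed by three generated ones.
displayed_fps = 120
mfg_factor = 4

rendered_fps = displayed_fps / mfg_factor   # 30 fps actually rendered
frametime_ms = 1000 / rendered_fps          # ~33 ms per real frame

print(f"{rendered_fps:.0f} rendered fps, {frametime_ms:.0f} ms per real frame")
# Input latency accumulates on top of that ~33 ms cadence,
# which is consistent with the 70+ ms readings mentioned above.
```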
Jeep-Eep@reddit
Yeah, we're not hitting steady native path tracing at reasonable FPS in this tier until the early 2030s.
jasswolf@reddit
I doubt anyone will be chasing native graphics at that point, especially with path tracing; it'll just be too computationally expensive to deliver at appropriate frame rates, and it's become very apparent that latency is an issue with scaling monolithic designs, let alone MCM.
What does need to be ditched is frame interpolation, but it will likely be replaced with frame extrapolation, which is much less of an issue if frame updates are down to 1ms at that point (1000 FPS).
Jeep-Eep@reddit
You need a baseline level of native competency to upscale well.
jasswolf@reddit
No kidding, and that number recently shifted down to a 44% resolution scale at 4K. It will shift again through neural compression bringing a more detailed set of assets, inference bringing path-traced-equivalent subsurface scattering and every bounce after the second, and future FP4 DLSS models bringing the bar down again.
Do you think that the RTX 60 series and RTX 70 series won't hit that mark pretty much across the board? Not a huge ask for frame extrapolation to jump 2x each gen either.
Take away the demand for native graphics and your target is hit before 2030, and you won't mind the result.
Jeep-Eep@reddit
Yeah, no. My crappy motor skills are bad enough; I don't need latency making anything where response time matters even harder.
jasswolf@reddit
Frame extrapolation doesn't add latency. It's Reflex 2 but with additional frames.
snipe_j@reddit
Completely rookie question from someone trying to make an educated decision on what to purchase: can you explain why they fall away in value compared to the RTX 5070? I have seen the 5070 where I live for a decent price, and just want to find out more, since everyone says that 12GB of VRAM is a limiting factor and that it is not a good buy, while the 9060 XT 16GB and 5060 Ti 16GB naturally come with more VRAM and are cheaper. Thanks for any advice you can provide!
FreshWing2754@reddit
The 5070 and 4070 Super are the bad cards right now.
The 5060 is $300.
The 5070 is 2x the price, but it gives only 60% more performance and has only 12GB. Go for the 5070 Ti, otherwise wait for 50 series cards.
SkySplitterSerath@reddit
People looking at 5060 are nowhere near the price bracket of a 5070Ti
Jeep-Eep@reddit
And that is a good part of why that will stay in that bracket.
Muaddib_Portugues@reddit
And 12GB, while not amazing, is definitely enough for 1440p. Hell, I'm playing on a 6GB 2060 at 1440p lmao
chapstickbomber@reddit
AMD is trying to upsell them to the 9060 XT 16GB
jasswolf@reddit
It's overstated how regularly this is a factor, but 12GB does currently seem to be pushing its limits at 1440p, most notably when you utilise path tracing (and likely frame generation along with it).
Over time, neural rendering techniques are coming through that should compress game textures and assets further for 40 and 50 series cards, but there will be other ML models that demand VRAM alongside this. DLSS 4 is also slimmed down to help free up VRAM.
Additionally, there remain a number of asset streaming techniques that aren't really being used in PC games yet (e.g. sampler feedback, DirectStorage).
$549 vs $429 for a 40% performance bump today is straightforward, and there's a decent chance that slightly extends in the future as game engines continue to make inroads with the implementation of these GPU features.
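A rough sketch of that value claim, taking the quoted $549/$429 prices and the ~40% gap at face value:

```python
# Cost per unit of performance at the MSRPs quoted above,
# assuming the ~40% performance gap holds (this comment's premise).
price_5070, price_5060ti = 549.0, 429.0
perf_5070, perf_5060ti = 1.40, 1.00    # relative performance (assumed)

print(f"5070:    ${price_5070 / perf_5070:.0f} per perf unit")     # ~$392
print(f"5060 Ti: ${price_5060ti / perf_5060ti:.0f} per perf unit") # $429
# At these prices, the 5070 works out roughly 9% cheaper per frame.
```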
Noreng@reddit
Neural rendering won't be used to free up VRAM; it'll be used to increase texture quality at the existing VRAM usage.
jasswolf@reddit
There are multiple texture settings now, and supposedly maintaining the same standard of quality is about 7-8x the file size, so why couldn't both exist?
Noreng@reddit
Because the target development platform is consoles, so VRAM usage will target console levels. The tech demos that arrive with support for neural rendering will be added on top, and will therefore increase VRAM usage.
jasswolf@reddit
You mean the same consoles that flex between 10.5 GB & 12 GB memory allocation for 4K output and don't have the hardware support for cooperative vectors?
I think the issue is how far the tech can be pushed in terms of model footprint and minimum file size; this might not be something that even works on current consoles.
Noreng@reddit
It probably won't work on current gen consoles, meaning that whatever tech demo/game adds support will have to layer it on top of the existing rendering pipeline and textures.
jasswolf@reddit
It's always going to be layered on top of an existing pipeline... there's a trained model for the textures in a game, and the incentive is there for file size on disk, not just VRAM, so there's a broad cost saving to be had.
snipe_j@reddit
Okay, so from what I understood, it is not a major limiting factor now, but it could become one in the future; however, they may also have better optimisations for games in the future, meaning that less VRAM is used? I am looking to game in 1440p, so that would potentially be something that I consider. From what I have heard, multi-frame generation is also getting better, so I do not have a major issue with using it.
Regarding the AI aspect, I will 99.9% of the time utilise the GPU for gaming purposes, so I assume that will not impact me much at all regarding the limited VRAM?
So in the case of where I live, the cheapest 5060 Ti 16GB is $590, while the cheapest 5070 is $700. Would you still consider the performance bump for the 5070 worthwhile in that case? The 5070 Ti is out of my price bracket, and the 9070 is $815, so also out of price range. Thank you for all the assistance.
jasswolf@reddit
Most of the games highlighted by content creators now can be navigated through their issues without major visual quality loss by dropping some settings and/or upscaling a little extra (DLSS 4 Balanced vs DLSS 4 Quality), and in some cases should see developer work done to reduce VRAM requirements through improved asset streaming.
Neural rendering and DLSS technologies are all AI models that help improve frame rates and/or image quality, but they're much more compact than something like a typical LLM you might see someone using this GPU for outside of gaming.
The 5070 is better value for that price difference, but if that is EU/€ pricing for an entry-level model, I would wait for it to settle again, and pay attention not to wind up with an SFF model or a hot and noisy basic design, when there are typically better designs for about 7-10% extra.
Jeep-Eep@reddit
Oof, that is painful. €80 more for a really dodgy GPU brand at the moment versus the lower-tier model from what is generally considered the top-dog AMD-exclusive AIB partner.
conquer69@reddit
The cards came out yesterday. There will be more AIB variety soon lol.
teutorix_aleria@reddit
The Pulse is generally a basic MSRP model; even though Sapphire are great card makers, it's still a basic card.
Jeep-Eep@reddit
Yeah, but it's still one of the better basic cards.
teutorix_aleria@reddit
Yep, that's why i own three of them.
Strazdas1@reddit
Price is everything? Is that why the brand with more features and higher prices has 90% of the market?
Firefox72@reddit
It's not a contest when it comes to EU pricing.
RX 9060 XT 8GB: €315
RTX 5060 Ti 8GB: €369
RX 9060 XT 16GB: €369
RTX 5060 Ti 16GB: €447
Muaddib_Portugues@reddit
That looks good. In Portugal the 16GB model of the 9060 XT is €440 and the 5060 Ti is €480.
Across the board, in my country, the Radeon prices are too close to NVIDIA's. There's barely any reason to buy Radeon because of that.
Vb_33@reddit
US MSRP with tax included for the 5060 Ti 16GB would be $455 for me, except the cheapest available is $508 with tax included. German pricing looks good.
kuddlesworth9419@reddit
£314 in the UK for the cheapest 16GB model. I think it will stick as well, considering they have sold 110+ on OCUK; when the 9070 XT launched, the price rose pretty quickly.
Muaddib_Portugues@reddit
In Portugal, considering the feature set, there's no reason to buy the Radeon card.
wilkonk@reddit
What happened to the post for his video showing it vs the 8GB? People were complaining that it was 'unreasonable' to say they were the two closest-priced cards given we didn't know if the MSRPs would hold, but given that they are holding so far, it seems like a reasonable video for people with a specific budget.
Jeep-Eep@reddit
Well, in these parts there's a model at Canada Computers for literally CA$130 less than a comparable 5060 Ti, so I dare say it won on price here.
ProvenAxiom81@reddit
I would add that 2 days after launch there's a ton of stock at Canadian online retailers, if you know where to look.
Darksider123@reddit
16GB cards are at MSRP here in Norway as well.
Jeep-Eep@reddit
Like I said, the trends for things like the 5070 and 9070 may be precluding previous chicanery, because if they take the piss here the customer may be upsold to a 5070... and the 5070, while being a joke as a 1440p card because of the cache, would dominate 1080p gaming!
shugthedug3@reddit
Saw there are £315 16GB models in the UK, hopefully that price holds
wilkonk@reddit
Still in stock at that price a day later, so it looks like there's a chance this time; at these prices it's the better deal.