AMD FSR Upscaling 4.1 officially coming to Radeon RX 7000 GPUs in July, RX 6000 in 2027 - VideoCardz.com
Posted by KARMAAACS@reddit | hardware | View on Reddit | 224 comments
dc_IV@reddit
So this makes my decision to do a new to me build using FB Marketplace and Ebay a little less crazy? The RX 7900 XTX I picked up for $640 late last year will get even more life after all.
romanTincha@reddit
Saving grace for the steam machine
APES2GETTER@reddit
It’s probably the reason why it’s happening to begin with.
Seanspeed@reddit
No, AMD has obviously been working on this for a while now, in partnership with PlayStation, given that PSSR2 is literally just FSR4.1.
schneeb@reddit
except pssr is not on the base ps5 because it requires hardware support
ShadowRomeo@reddit
Steam Deck too, I wonder how this official version will compare to the leaked INT8 version?
Saneless@reddit
The delay for RDNA2 should at least suggest it has a chance to be better
jocnews@reddit
Could be waiting for FSR 4.2, if we consider that RDNA 3 had to wait for FSR 4.1.
Of course, it's not clear if it was really blocked on the overall progress of the software. Using the leaked FSR 4.0 on a Radeon RX 7700 XT, it seemed to more or less work in the one game I tried. I only used it to get better AA though, so I have no idea about the performance impact.
b0wz3rM41n@reddit
The performance problem of the leaked INT8 version is not so much on the 7000 series (though it could be better) but rather the 6000 series.
"Quality" mode offers a very inconsistent uplift that varies from game to game, ranging from a modest ~20% increase to pretty much negligible (2-3%).
However, more aggressive upscaling options like Balanced or Performance yield sharply diminishing gains: those cards' horrendous compute speed makes the upscaling pass itself slow, cancelling out most of the gains from running the game at a lower resolution.
Zarmazarma@reddit
Hmm, it's odd though. FSR4 is actually not that intensive to run. It's a very small model, like 100k params: it takes less than 100 GOPs/frame to run, so almost any card with DP4A can make meaningful use of it (say, with 25+ TOPS of INT8 performance). By the time cards get slow enough for it to stop being meaningful, you start hitting cards that are too old for modern games in general (or that run into memory bandwidth issues).
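To put those figures in frame-time terms, a quick back-of-envelope sketch (my numbers from above, purely illustrative, and assuming the card's peak INT8 throughput is fully utilized, which real workloads won't hit):

```python
# ~100 GOPs/frame for the model vs a card's peak INT8 throughput in TOPS.
def upscale_cost_ms(model_gops_per_frame: float, card_int8_tops: float) -> float:
    seconds = (model_gops_per_frame * 1e9) / (card_int8_tops * 1e12)
    return seconds * 1e3

# At the ~25 TOPS "meaningful use" threshold:
print(upscale_cost_ms(100, 25))  # 4.0 ms of pure compute per frame
```

A few milliseconds per frame is noticeable but workable; halve the throughput and it starts eating most of what upscaling was supposed to save.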
I actually made a mod to run it on a 1660 super, and with some other minor performance tweaks (replacing an expensive exponent operation that ran once for each pixel neighbor with a faster approximation), had usable results.
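For the curious, a cheap trick of that general kind (a sketch only, not necessarily the exact approximation used in the mod) replaces a true exponential with a few multiplications via repeated squaring:

```python
def fast_exp(x: float) -> float:
    # (1 + x/1024)^1024 computed with 10 squarings. A real transcendental
    # exp per pixel-neighbor adds up fast; this is just mul/add.
    x = 1.0 + x / 1024.0
    for _ in range(10):
        x *= x
    return x
```

For the small negative arguments typical of neighborhood weighting, the relative error is a fraction of a percent, which is usually invisible after blending.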
_hlvnhlv@reddit
Do you have a video about it?
It sounds really interesting.
Zarmazarma@reddit
Nothing really substantive. I have some stills from CP2077, KCD2, Satisfactory, and TOW2, plus a CP2077 video, but I recorded it streamed from my friend's PC with the 1660 super, so it's not very high quality... I've been meaning to write a post about it, fix an outstanding bug with jitter (it doesn't resolve correctly, so the frame becomes blurry every certain number of frames), and release a version people can try if they want. It's just a DLL override: you replace the existing FSR DLL (it should work with anything that supports FSR2 or later, though I've only tested a handful of games) and run the game.
Maybe next week when I get internet in my new apartment.
wizfactor@reddit
For RDNA2, I would say that the priority is image quality over performance improvements. FSR4 just smokes FSR3 in image quality at equivalent FPS, so it would be a meaningful upgrade for RDNA2 users.
Especially the Steam Deck, where FSR4 will be a much appreciated upscaler upgrade when working at 540p or 360p internal resolution.
spazturtle@reddit
The leaked version does not use the 7000 series's Int8 AI accelerator at all. It runs on the shader cores, so it takes compute resources away from the game whilst the Int8 AI accelerator sits idle.
ViniCaian@reddit
Wait, really? Where has this been documented? Did anyone test it?
b0wz3rM41n@reddit
Yeah, the leaked INT8 version has pretty poor performance on the 6000 series, they're probably taking their time to further optimize/simplify it so that it at least runs ok on those cards
perfectly_stable@reddit
I'd say it already ran somewhat ok on my 6800xt, fsr4 on performance looked better than fsr3.1 in quality mode, while running slightly better too. At least in cyberpunk. Standing still and letting fsr3 gain stability would offer higher resolution, but ain't nobody standing still in a videogame, and fsr4 had MUCH better temporal stability in motion
bubblesort33@reddit
Unless the compute cost has been massively reduced, it still seems to me like it won't be viable. You're spending so much time upscaling in frame time, you might as well be rendering natively for similar performance.
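To sketch that tradeoff with toy numbers (assuming the game is fully GPU-bound and render cost scales linearly with pixel count, which is a simplification):

```python
def upscaled_frame_ms(native_ms: float, res_scale: float, upscale_ms: float) -> float:
    # Frame time when rendering at res_scale of native pixels, then upscaling.
    return native_ms * res_scale + upscale_ms

native = 16.7  # ~60 fps at native resolution
# Quality-mode pixel count (~44% of native), cheap vs expensive upscale pass:
print(upscaled_frame_ms(native, 0.44, 1.0))  # ~8.3 ms: a clear win
print(upscaled_frame_ms(native, 0.44, 9.0))  # ~16.3 ms: barely beats native
```

Once the upscale pass itself costs most of the frame-time savings, native rendering is the better deal.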
ExplodingFistz@reddit
That's not the point. Native rendering uses TAA, which looks like a blurry, smeary mess. RDNA 2/3 cards don't support the latest upscaler models, so they end up outputting games with crappy image quality, forced to use TAA or upscalers inferior to FSR 4, like FSR 3.1 or XeSS (so basically every modern game). Sure, the performance hit from a heavier upscaling model might not be great, but that would just be a tradeoff you'd have to endure. I'm sure it'll be fine though, since the higher-end cards in the RDNA 2/3 stack can sort of brute-force it with their high raster performance, especially the 7900 XTX. Lower-end cards will suffer the most, unfortunately.
lucidludic@reddit
Depends entirely on the game in question. For what it’s worth, FSR4, DLSS etc. are themselves a form of temporal AA. Since we’re talking about Steam Deck, the resolution and screen size is relatively low, anyway. By far the primary reason players on steam deck would enable upscaling is for the improved performance and/or lower power consumption.
your_mind_aches@reddit
It really is. Just two days ago the Steam Machine basically felt cooked.
FSR4 is gonna make a huge difference.
VanceIX@reddit
Yup, if they can keep the entry level machine to less than $900 it’s definitely got its niche now
chemastico@reddit
Yup basically will make it a pretty good entry level pc, now they just have to nail down the pricing
unixmachine@reddit
AMD's marketing team needs to be studied as a case of incompetence. The company likely lost a lot of market share due to its insistence that FSR4 only worked on RDNA 4, even after this was proven false.
ElectronicStretch277@reddit
They'd given interviews where they said they were working on it but were unsure if it would perform acceptably.
Additionally, whether Radeon announces new features early (and people overhype them, then inevitably get let down because these things don't magically gain support) or keeps silent until it's sure something works, it's always seen as a loss for Radeon. So honestly, I'm fully convinced that Radeon users will never be happy with what they have.
unixmachine@reddit
In recent months, all their interviews reiterated that FSR 4 was exclusive to RDNA 4.
https://www.reddit.com/r/Amd/comments/1p5mu2g/amd_reiterates_fsr_redstone_is_only_for_radeon_rx/
https://www.reddit.com/r/hardware/comments/1qzw07o/amd_radeon_still_refuses_to_officially_release/
HandofWinter@reddit
It was then? Now they're confident enough in the INT8 version to announce it for older gens, so it isn't anymore. It wasn't great when it leaked (I tried it myself), so I definitely get not broadcasting that. I don't really see any confusion, and I say this as a 6800xt owner, so I probably care more than most people, which admittedly isn't all that much.
Nicholas-Steel@reddit
Right, so marketing said one thing, then eventually backpedaled but by then it was too late, everyone has the original stance stuck in their mind.
Then, when they did backpedal, they had no timescale for deployment and no commitment. And now there's a massive delay from the original release to support for old hardware, despite a functional build for old hardware having leaked eons ago (demonstrating that a lot of, if not all of, the groundwork was already established).
I'm sorry sir, there's no excuse they can give to justify their inaction.
HandofWinter@reddit
What do you think marketing said? There might have been something I missed, but I only remember it being announced as a 9000 series feature, nothing at all about previous generations.
Then the int8 build leaked, which made it clear that they were working on a version for older gens, but again I'm pretty sure there were no announcements or timelines. So it was obviously being worked on, so this kind of announcement was expected at some point. Now they made it. I'm honestly not sure what the confusion is.
ElectronicStretch277@reddit
There's really very clear explanations. Just because you interpreted things in the worst way doesn't mean you were correct.
ElectronicStretch277@reddit
The first article is just the initial announcement of Redstone. It doesn't say that Redstone won't come to older gens. Just that December 10 is when it will launch for the 9000 series.
The second makes no mention of exclusivity. It just states that AMD doesn't have any updates to share about the Int8 version just yet.
Neither article makes the point you say they make.
InevitableSherbert36@reddit
Neither of the linked VideoCardz articles provide evidence of this.
The second article twisted AMD's statement of "no updates to share at this time"—regarding FSR 4 INT8 support—into this absurd headline (which they've now changed):
VideoCardz "journalism" aside, AMD clearly wasn't making a statement of exclusivity about FSR 4.
Meanwhile, the first article is about FSR Redstone, a larger suite that includes FSR 4, FSR Frame Gen, FSR Ray Regeneration, and FSR Radiance Caching. Again, there's no statement of exclusivity of FSR 4; the suite as a whole being exclusive doesn't mean that each individual part is as well.
VideoCardz even reported in January that "AMD leaves the door open to FSR 4-like support on RDNA 3." Do you have any recent source where AMD actually said that FSR 4 would be exclusive to RDNA 4?
theholylancer@reddit
Well, better late than never
just, why take that double barrel to the foot with the messaging, unless seriously, at the time they were not going to support it (looking at the beta leak, likely not??)
like, a solid "hey, we want to bring this to our older cards, it will take time because resources are thin" would have prevented a lot of the salt, and a lot of the people who dumped 6000 series cards for Nvidia (likely not 7000 series, given how new it is)
KARMAAACS@reddit (OP)
This is the best news this week. Too bad RX 6000 users will need to wait longer.
imaginary_num6er@reddit
Well the RX 6000 were on "Maintenance Mode" and AMD never denied claims that they will not be supporting Day 1 driver optimizations for RDNA 1 & 2.
ZeroAnimated@reddit
I thought they rolled that back for now after all the backlash?
imaginary_num6er@reddit
They only "clarified" that they are not ending driver support, but only: "This approach helps deliver a smoother, more consistent experience for your games while insulating previous generation GPUs from rapid changes designed for newer architectures."
They never disagreed with the claim that there will be no Day 1 driver support for RDNA 1 & 2, regardless of whether they claim it is on "Extended Support" or "Maintenance Mode", etc.
glizzygobbler247@reddit
And in that clarification, they stated that new features would only come to rdna3 and 4, so the fact that rdna2 gets it is a welcome surprise
ZeroAnimated@reddit
Ah okay, thank you!
imaginary_num6er@reddit
Also fits RDNA 3 being put on Extended Maintenance Support, or whatever the hell they're going to call it, in July 2027, since AMD has been very consistent about not supporting Day 1 driver optimizations for GPU architectures ~5+ years old. For RDNA 3, that would be 2027.
Exotic_Accident3101@reddit
RDNA 3 isn't on extended Maintenance
They said both RDNA 3 and RDNA 4 are together
And RDNA 1 and 2 are together.
Seanspeed@reddit
What they call it isn't important. It's the SAME practice that Nvidia has done forever. Nvidia doesn't provide these Day 1 optimizations for architectures more than two generations old, either.
Seanspeed@reddit
And to be even more clear - when they say 'Day 1 driver support', they mean very game-specific drivers that give some minor Day 1 optimization boosts for some recent big release. They really never amounted to all that much in the first place.
It's literally the EXACT same thing Nvidia does, except Nvidia never advertised this. They don't provide Day 1 optimizations for new games for older architectures either. Just because you can access the new driver doesn't mean there's anything in it for you if you're on an older GPU.
So many people have wildly misunderstood this whole situation. Big old nothing burger.
_hlvnhlv@reddit
Tbf, I think that Nvidia does port new Vulkan and DirectX extensions to their old GPUs.
AMD doesn't do it with RDNA2
your_mind_aches@reddit
God, I upgraded from a 6600 to a 5060 Ti 16GB mere weeks before they announced that.
Radiant-Fly9738@reddit
they have continued to optimize games for older GPUs, that was just FUD spread on the internet. it's easy to verify and refute.
Bonafideago@reddit
I'll wait as long as i need to. Take your time and do it right, or don't do it at all.
dsinsti@reddit
Sounds like a "delay... wait... it's not happening" kind of thing to me
SabretoothPenguin@reddit
Isn't the article about Windows drivers?
It's possible the Linux driver timeline will be different...
b_86@reddit
Right now, the official Proton versions on Linux can already do the automatic FSR3 -> FSR4 DLL swap on RDNA4, and have been able to for a long time. You can also use unofficial forks (ProtonGE, ProtonCachy) to do the same on RDNA3 with the leaked INT8 DLL. So once the official DLL is released, it will just be a matter of days for the unofficial forks to include it, and perhaps a bit longer for Valve to validate it and include it in official Proton.
_hlvnhlv@reddit
Yes, it's about the windows drivers, and official support.
Right now on Linux, you can "emulate" the full fsr 4.1 upscaler, or use the leaked int8 dll with no issues
Seanspeed@reddit
I think Proton officially supports FSR4.0 already now? FSR4.1 should basically be a drop-in replacement.
Seanspeed@reddit
This is EXACTLY what I suggested AMD were doing with all the questions of why FSR4 wasn't being pushed to RDNA3 despite AMD doing some work to support it.
PSSR2 was really the big giveaway, as I said at the time. There was probably plenty of joint work to get PSSR2 (which is, for the most part, FSR4.1) working on RDNA3 as efficiently as possible. So instead of releasing FSR4.0 to RDNA3 only to replace it shortly after, AMD refocused efforts on getting the superior FSR4.1 working, which will provide better benefits to RDNA3 users: improved image quality means you can potentially run slightly lower resolutions and offset the heavier performance cost versus FSR4.
TheGillos@reddit
AMD working with the consoles is a real advantage over Nvidia, even in the PC gaming space.
Not that Nvidia is lacking advantages of their own, lol. But I can see Nvidia failing to focus on gaming and losing market share to a great AMD product in the future. It's happened before, and Nvidia is so balls deep in AI that they may let their gaming efforts slip.
I'd like to see more competition in PC Gaming hardware, always.
vitek2121@reddit
Amd is also balls deep in AI, while still playing catchup in the consumer GPU space.
sittingmongoose@reddit
Ps5 pro has custom hardware to get pssr to work. Mark cerny talked about it in one of his interviews at length. It’s not just plain rdna 3 at all.
Seanspeed@reddit
It's wild how many people just take Cerny at face value and don't realize how many weasel words he uses to make PlayStation hardware sound more special than it is. His original Road to PS5 talk should be infamous for this by now, but hardly anybody paid attention (it was obvious to me even at the time, and hindsight should have made it clear to everybody). A lot of the 'custom' work in PS5 was simply it really only being RDNA1.5. lol
PS5 Pro does not have custom hardware 'to get PSSR to work'. While it's not plain RDNA3, it's using a good chunk of what makes RDNA3 what it is. It's very slightly customized to the degree that there are also slight calculation differences for stuff like this, but there's enough similarity to where a lot of the work will absolutely have significant crossover and relevance and usefulness for PC RDNA3.
It's so very obvious now that this work had cross relevance; otherwise AMD would have released FSR4.0 on RDNA3 by now. The fact that they waited until FSR4.1 was ready, and that it just happened to coincide with PSSR2 (which is FSR4.1), is not some insanely wild coincidence! lol
uzzi38@reddit
No, he's right, the PS5 Pro's ML hardware wildly differs. It doesn't align with any RDNA generation, with its 3x3 convolutions per lane; FSR4 in other implementations operates on 16x16 matrices. It would require more of a rework than just requantizing the model, and that work wouldn't be applicable to standard RDNA2/RDNA3.
I'm sorry, but there's no evidence to suggest that the FSR4 INT8 model was even still being worked on when it leaked last year. The pre- and post-passes were FP8 only, the model used was half a year older than the FP8 model in the same leak, and there was a WMMA path that was totally broken (though from what we can tell, that WMMA path wasn't any different between RDNA3 and RDNA4). It's not even drastically faster than the FP8 model running via FP16 WMMA on RDNA3, which was a huge hack in and of itself: only about 25% faster on the same hardware. A native FP16 WMMA model probably would have performed almost the same as the INT8 model.
Obviously I can't be 100% certain, but all the signs suggest the reduced quality in the FSR4 INT8 model that leaked was probably almost entirely down to the older model, and the reality is AMD just made a decision early in the development cycle of FSR4 to focus on the FP8 model. RDNA4 gets the same throughput on INT8 and FP8, so the thinking was probably originally to either do a slightly reduced quality model across the RDNA3/RDNA4 stack, or do FP8 for RDNA4 only... and they decided on the latter for whatever reason.
Seanspeed@reddit
This sub continues to amaze me in its attempts to still shit on AMD no matter what. To say that AMD weren't working on this beforehand when it's very obvious they were, and to act like the work done with Playstation has no crossover, given all the timings and things we know with PSSR2 being FSR4.1, requires an absolutely INSANE level of belief in magic and coincidences.
"for whatever reason"
Key bit that you simply have no answer for, and which you need to revolve your entire argument around, despite having no actual explanation for.
My explanation makes way more sense, and it's something I was saying much earlier. But of course my explanation doesn't involve making AMD sound as bad as possible. Though don't get me wrong, I'm still with everybody who thinks they're generally well behind on this stuff and that AI upscaling came painfully late.
uzzi38@reddit
This has to be one of the first times I've been accused of shitting on AMD. Like seriously, it's usually the other way around.
That's not what I said. The existence of an INT8 codebase implies they did work on it in the past. My point was that all signs pointed to them dropping that version of the SDK.
...just because I didn't write anything doesn't mean there are no justifiable reasons why they didn't do it, just that I didn't want to speculate on them. Everything else in that post is based on the prior leak; there's no guessing. The INT8 model is clearly auto-generated from an older version of the FSR4 model (there's a comment stating as much). There is a (broken) WMMA path present in some of the code. The pre- and post-passes for FSR4 only have INT8 implementations, whereas most of the other stages have both INT8 and FP8 implementations. None of that is a guess, and all of it points to a work in progress that was abandoned.
But to answer your question directly on the reasons they would have decided on the FP8 model over the INT8 model:
- FP8 technically provides meaningfully better precision than INT8 when needed, which is better for image quality. It's also easier to quantize down to FP8 from the FP32/FP16 the model would have been written in, which reduces the chance of accuracy drift and thus further benefits image quality.
- RDNA4 handles sparsity better on FP8 than INT8, so it would also help with runtimes and efficiency.
- They probably were just too concerned with the performance impact of FSR4 on RDNA3 at the time. Remember, this was before DLSS4.5, when the higher runtime cost of FSR3 upscaling vs DLSS3 upscaling was a noticeable downside. There was probably some apprehension from management.
- The reason nobody likes: meddlesome middle managers who want to lock down support and push sales of the newest generation. I'd like to rule this one out personally, but it wouldn't be the first time the FSR team at AMD ran into issues like this. Dom has talked about there having been plans for a heavier version of FSR3 upscaling that would tackle some of its weakest points (disocclusion etc.) that was killed by management who wanted to focus solely on FSR4. It wouldn't surprise me if that wasn't the only roadblock they ran into.
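To illustrate the precision point with a toy example (this simulates rounding only; it is not AMD's actual quantization pipeline):

```python
import math

def quant_int8(x: float, scale: float) -> float:
    """Symmetric INT8 quantization: one fixed step size everywhere."""
    q = max(-128, min(127, round(x / scale)))
    return q * scale

def quant_fp8_e4m3(x: float) -> float:
    """Crude E4M3 simulation: round the mantissa to 3 bits.
    (Ignores exponent clamping and subnormals; illustration only.)"""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)      # x = m * 2**e, with 0.5 <= |m| < 1
    m = round(m * 16) / 16    # keep ~4 bits of mantissa
    return math.ldexp(m, e)

# A small activation: the INT8 step (scaled for a max of ~1.0) lands
# roughly 21% off, while the FP8 sim stays within a couple of percent,
# because its exponent adapts to the magnitude.
x = 0.013
print(quant_int8(x, 1.0 / 127), quant_fp8_e4m3(x))
```

That bounded relative error on small values is one reason FP8 tends to preserve image quality better than a straight INT8 quantization of the same weights.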
RHINO_Mk_II@reddit
PS5 marketing also said it had some magic storage solution that couldn't be matched either, and we see where that ended up.
sittingmongoose@reddit
It never ended up being used for streaming in things that were just entering the viewport. However, it’s a fast solution that allowed them to use very good decompression which had a real impact on file sizes. It also greatly reduced the cpu load of file decompression and streaming assets.
ConsistencyWelder@reddit
This sub never disappoints. Even the most positive news about AMD is spun in the most negative way possible in here 😊
KARMAAACS@reddit (OP)
AMD's fault, isn't it? I mean, nothing was stopping them from doing this earlier, considering it leaked months ago. And why is it taking so much longer to get it working on RDNA2?
ElectronicStretch277@reddit
Because you people don't realize that RDNA 2-4 are radically different architectures. RDNA3 at least has WMMA to fall back on and improve performance. RDNA2 will likely use DP4A. So essentially, quantizing the model for 3 different datatypes + quality assurance and stability + whatever projects they have in the meantime = longer time for development.
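For context, DP4A is a single instruction that computes a 4-wide INT8 dot product with 32-bit accumulation, which is why an INT8-quantized model maps cleanly onto RDNA2. In rough Python terms:

```python
def dp4a(a: list[int], b: list[int], c: int) -> int:
    """What one DP4A instruction computes: the dot product of four
    signed 8-bit values, accumulated into a 32-bit integer c."""
    assert len(a) == len(b) == 4
    assert all(-128 <= v <= 127 for v in a + b)
    return c + sum(x * y for x, y in zip(a, b))

print(dp4a([1, 2, 3, 4], [10, 20, 30, 40], 0))  # 300
```

A convolution's inner loop becomes a chain of these accumulations, one per group of four INT8 weights, rather than FP8 matrix ops, which RDNA2 lacks entirely.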
Nuck_Chorris_Stache@reddit
We know. And we also know that a version of FSR4 that uses INT8 rather than FP8 already exists and works fine on RDNA2 and RDNA3, probably because it was developed for the PS5 which also doesn't do FP8 well.
ElectronicStretch277@reddit
Because it didn't work fine. It already had a big hit to performance on RDNA3 (not to mention RDNA2) in the state it leaked, because it didn't use WMMA at that point in time. It was likely when PSSR2 got finalized and released that they had a good enough version (also, the PS5 Pro only does PSSR2 because it has triple the INT8 performance of the 7900 XTX), and now they're getting it done and fixing edge cases, most likely.
Nuck_Chorris_Stache@reddit
Yes it did, and does, which is why people do use it via Optiscaler right now.
Yet FSR4 performance mode looks and performs better than FSR 3.1 quality mode.
ElectronicStretch277@reddit
There are times where the model gives like 5 FPS gains in performance mode, and there are times where it regressed performance. Just because you accept it in a workaround doesn't mean that people would accept it in an official release. People use it in games where it works fine; AMD needs to provide a solution which works well in EVERY game. Wildly different goals, man.
And you're comparing the performance of a low quality setting to the heaviest model of another upscaler. Again, just because some people accept it doesn't mean it's the correct way to go about things.
Nuck_Chorris_Stache@reddit
Not from what I've seen, especially not the performance mode, which as I've said, performs and looks better than FSR3 quality mode at the same time.
Seanspeed@reddit
I called it ahead of time that when this day came(and I was pretty positive it would), many in this sub would STILL find a way to complain about it. Because so many of the complaints before were not legitimate, they were pure and total concern trolls. People who weren't upset by the situation, rather people who were giddy to have a talking point to bash AMD with.
It should have been obvious when most people here would simultaneously say that AMD GPUs were essentially irrelevant in the market, yet everybody and their mother was just super upset about RDNA2/3 GPUs not getting FSR4 support, as if any of them were actually affected. lol
996forever@reddit
Coming to apus on rdna2 and 3/3.5 too?
siazdghw@reddit
Considering AMD is still selling RDNA 2 and RDNA 3 mobile CPUs they better, but also by the time it releases in 2027+ the hardware will be obsolete.
They should've been supporting old architectures day 1 like Intel was doing with their XeSS updates.
996forever@reddit
Rdna3.5 APUs will be sold well into 2027 and beyond (including zen 6 APU will still be rdna3.5) so
FlatTyres@reddit
I hope so, especially in the mobile sector. My Lenovo laptop has the 860M, but I have to stick with Lenovo's OEM drivers: whenever I've removed them and installed AMD's official drivers, I got instability issues, plus an issue where the laptop would heat up when asleep with the lid closed (which doesn't happen with the OEM driver). The current latest driver was released in January, based on the September or October 2025 driver, which might mean either never getting FSR 4.1 on this laptop model or getting it months to a year later than AMD releases it. Also, Lenovo annoyingly don't provide the Adrenalin software with their driver updates (driver only), so I'm using the hidden version someone on Reddit linked from the Microsoft Store, after having uninstalled the AMD version and re-installed the Lenovo version.
996forever@reddit
Happened with my ASUS laptop with Strix Point too, one of the first laptops with it in 2024. Fucking terrible (and I did NOT make any changes to the laptop, just ran it the way it came with all the Windows and ASUS and AMD updates) for such a premium laptop, but at least it runs on the latest Adrenalin drivers without issues now.
aresthwg@reddit
Just in time after I ditched my RX 6700 XT for a RTX 5070ti, no need to thank me guys.
WhoTheHeckKnowsWhy@reddit
having gone from a 6800XT to a 5070 Ti a year ago last May, trust me, you are not missing out. It's really sad to see both Arc and Radeon mentally check out on the software front.
lifestealsuck@reddit
I ditched 3070 for 9070xt , fight me !
Dreamerlax@reddit
I mean you still get FSR4 day 1.
aresthwg@reddit
Why? You got a massive upgrade and also didn't have to sacrifice yourself for others to get something better. Enjoy the upgrade man!
Narishma@reddit
Looking forward to using it on my RX 580 in 2029 and my HD 5850 in 2033.
_hlvnhlv@reddit
You may be joking, but the RX 580 can run RT-only games like the newest Doom on Linux lol
Noble00_@reddit
By the time RDNA2 support is out will we get vulkan support (and on linux)? lol
_hlvnhlv@reddit
Have you ever heard of Valve time?
GenZia@reddit
No idea about Linux but Vulkan support on Radeon is pretty good.
I played GTA-IV with the DXVK Vulkan wrapper on my HD7790 (GCN 2.0) and it was a buttery smooth experience with a nice, healthy bump in performance.
Sadly, the card passed away before I could test DXVK in Crysis, another title that 'notoriously' gets a huge performance boost under Vulkan.
Noble00_@reddit
Oh, I know Radeon is well supported I meant FSR4 official support with Vulkan (optiscaler supports FSR4 Vulkan but nothing official from AMD)
lucasdclopes@reddit
They took their damn time. AMD can't afford to have a software team that moves that slow.
Anyway, great news. The RX 7000 are very capable GPUs. That's also good news for all those portable devices that are using RDNA3 and RDNA2 APUs.
GenZia@reddit
Too little, too late.
While I like the underdog as much as the next guy, my feelings towards AMD have cooled significantly in recent years due to locked BIOSes (Red BIOS Editor doesn't work on RDNA 3+), the janky upscaling scene (you pretty much need OptiScaler), and, frankly, garbage RT performance.
To add insult to injury, Radeon drivers aren't great, VCN isn't as good as NVENC, the lack of CUDA support is a major deal-breaker for many (myself included), and their cards are a nightmare to repair.
No wonder AMD's market share is in single digits now. At this point, I'm not even sure why AMD insists on making consumer discrete graphics.
They are basically irrelevant to most people (excluding "tech celebrities" on YouTube), though I'm glad Nvidia still has some semblance of competition, not that AMD is particularly interested in undercutting them in terms of price-to-performance ratio.
This turned out to be a bit of a rant, but... I think people need to wake up and smell the ashes.
SplitBoots99@reddit
Ok, buy Nvidia. Plenty of people will still use Radeon if the price is right. AMD doesn't seem too worried with how well their real money makers are doing anyways. I'm sitting on some good gains from investing in them 8 years ago.
Anoukroof2c@reddit
> You're spending so much time upscaling in frame time, you might as well be rendering natively for similar performance.
deezznuuzz@reddit
Yea, price/performance AMD is still ahead.
Nuck_Chorris_Stache@reddit
I mean, it's 2026, and how much does ray tracing actually improve graphics?
Most of the time the games still look very good without it, and only marginally better with it.
tommytmopar@reddit
Classic AMD. Late to the party but at least they're showing up eventually. That RX 6000 wait is rough though.
Nuck_Chorris_Stache@reddit
It's almost like they're intentionally waiting until after most people upgrade.
tommytmopar@reddit
Feels more like messy roadmap timing than strategy, but the optics are still terrible for them.
Nuck_Chorris_Stache@reddit
I mean, it seems like AMD misses opportunities so often that it doesn't seem random. If it was random, you would think they would randomly stumble into doing the right thing more often.
tommytmopar@reddit
Thats actually a really fair point.
Gambler_720@reddit
It's obvious this decision was taken due to the negative backlash; otherwise AMD would have announced support from day 1 like Nvidia does. They could have just said that FSR 4 would come later to previous generations, and there wouldn't have been any backlash.
Also, we are not even halfway through 2026, so a 2027 release isn't exactly good news. That's a pretty unreasonable timeline.
Seanspeed@reddit
Nvidia has had to do no extra work at all to support DLSS on older GPUs going back to Turing. They all literally function the exact same way, with the same calculation models (which is also why DLSS4.5 is unusable on 20/30 series GPUs, regardless of whether it technically 'supports' them or not).
The same is not true for FSR4. FSR4 was built on FP8 calculations on RDNA4. RDNA2/3 does not support this, so AMD has had to do extra work(which has been in partnership with Playstation) to get FSR4 working on RDNA2/3 using very different calculation methods.
There is no mystery here. There was no 'recent' decision taken on this, it's not something they always had in their bag and only had to flick the switch, ffs. They've been working on this for quite a while.
As I 100% predicted though, I knew that when this day came, some of y'all would still find a way to make a negative out of this. You weren't upset AMD didn't offer this support, you were giddy that you had a talking point to bash AMD with. And now that's being taken away from you. Called all this before.
Nicholas-Steel@reddit
Except a build of FSR4 using a compatible method leaked eons ago; getting from that build to whatever they're releasing next year is an insanely long delay.
ElectronicStretch277@reddit
Yes, AMD should have announced this and watched people riot over how long it would take for them to get support. Nvidia can announce support day 1 because their similar architectures enable faster backports.
Seanspeed@reddit
Not 'faster backporting', there was literally no backporting necessary at all. It's all the same model, no reworking needed.
ElectronicStretch277@reddit
By backporting I mean optimizations. Yes, the initial model can "run" on every RTX GPU, but Nvidia would still make some changes to the upscaler to achieve somewhat acceptable performance on previous generations.
Seanspeed@reddit
No, Nvidia literally doesn't do any work at all for older architectures. It's literally the exact same algorithm for all architectures.
It might be possible to rework some of the newer DLSS models to run better on older GPU's, but we'll never know cuz Nvidia isn't going to even bother to try.
stahlWolf@reddit
The fact they said they were not supporting 6750 cards made me switch back to Nvidia. 2 evils - I'll take the one that provides proper support now. Not to mention that playing with AI models is way easier with Nvidia cards.
AMD718@reddit
AMD's engineers are great but their marketing dept. is terrible. All they had to say 9 months ago was "coming to RDNA3 in mid-2026 and RDNA2 in a future update". Done. Goodwill preserved and most outrage avoided.
Nuck_Chorris_Stache@reddit
They're still taking far too long to do it.
snollygoster1@reddit
The preaching about how AMD is better long term on their CPUs because of sockets seems less and less true as time goes on. Sure, Intel just released CPUs on a dead-end socket, but there are a number of people still running LGA 1700 boards just fine because Intel didn't drop support right away.
KARMAAACS@reddit (OP)
If anything, Intel and LGA1700 turned out to be a better platform than AM5 simply because of the RAM situation. The fact you can pick up a B760 or Z690 board with DDR4 is pretty good, don't get me wrong, AM4 is very good too, but if you need multi-core perf, Intel's got a very compelling platform in LGA1700.
ghostsilver@reddit
/r/radeon users have no idea what to whine about after this
snollygoster1@reddit
The timeline, surely. It's still a crap timeline when Nvidia cards from the 20 series still get some DLSS4 support.
XtremeCSGO@reddit
Better late than never
Jejiiiiiii@reddit
Can't trust AMD until it's officially released.
YouDoNotKnowMeSir@reddit
Same can be said for literally anything else. Can’t remember a preorder or early access that went right.
diskowmoskow@reddit
The only early access game I got was Hades II, and it was great. I didn't play it after the release though; I should check it out.
RHINO_Mk_II@reddit
It's very good now.
diskowmoskow@reddit
It was already great, will check it back!
Vb_33@reddit
4090 preorder went right
NapsterKnowHow@reddit
Still waiting on Nvidia to drop Reflex 2 and Ray Reconstruction 4.5
Jejiiiiiii@reddit
Remember FSR 3 Frame gen? Took almost a year to appear in games
YouDoNotKnowMeSir@reddit
Yes, just as there was a “robust” catalogue of raytracing implementation and dlss upon release as well right?
Are we scorning AMD for developer integration? Adoption rate is a problem that exists in every new feature.
East-Today-7604@reddit
It's irrelevant to the end customer, who's buying a GPU that will benefit his experience. By buying an RTX GPU you're getting DLSS upscaling and Frame Gen in almost all modern games, plus Reflex for latency reduction, and Ray Reconstruction & Path Tracing in a handful of games. The slow adoption rate of AMD features is AMD's problem; NVIDIA GPUs are 94% of the dGPU market, and most people with RTX cards benefit from those features on day 1. The same can't be said about AMD and their feature set.
To add, people have had stable access to DLSS upscaling in modern games since 2020, while AMD released something adequate only in 2025. AMD users were limited to an inferior software set for 5+ years, and only with RDNA4 are they finally getting something decent (FSR4 upscaling), but most AMD features are still either worse, less adopted, or both.
Cultural-Accident-71@reddit
This is just marketing at this point. They should have had support from day one, and there is no excuse to the customers who are paying the price.
Seanspeed@reddit
FSR4 was never promised to RDNA3 users.
This isn't marketing, there were clearly reasons for all this in the background. The explanation of "They just want you to buy an RDNA4 part" never made any real sense.
EnthusiasmOnly22@reddit
Alright, but Nvidia continues to backport refinements to their upscalers to cards that are almost 8 years old. Sure, they don't run them as well, and Nvidia warns users as much, but at least the option is there.
spazturtle@reddit
So, put another way, Nvidia sold you hardware and then took years to actually provide the software to fully use it.
Darkknight1939@reddit
No, but this was exactly the case for AMD's “fine wine” marketing crap.
Seanspeed@reddit
Fine Wine™ was a purely community moniker. And AMD did deserve it in some respects at the time.
EnthusiasmOnly22@reddit
More like they continued to advance what could be done with a platform that could be expanded whereas AMD designed a poor platform that didn’t have extensibility. Opposite of AMD cpu sockets in a way
Seanspeed@reddit
Again, Nvidia doesn't have to 'backport' anything. It's literally zero extra work for them to make these DLSS models work on older architectures. How do y'all not understand this?
Notice that DLSS 4.5 is practically unusable on 20/30 series parts? We have no idea if Nvidia could make it work better. Maybe they can't, but we don't know.
Certainly multi frame generation doesn't work on <40 series GPUs, and y'all just blindly accept that this is OK because it could never work, even though we don't know whether it could work with some extra effort. We just have to take Nvidia's word for it, cuz they've never made any effort to try.
ElectronicStretch277@reddit
Yes, Nvidia can do that because all 4 of their gens use the same underlying architecture; there's no reason not to support them. RDNA2 doesn't have the AI hardware, so it needs much more optimization than even Nvidia's 2000 series.
Cultural-Accident-71@reddit
Never wrote anything about them needing to support it day one. Clearly they want you to buy new products, as that's how they exist. This news is clearly a marketing step as they prepare to launch a new product, and people need to talk about them for a few days to build up the hype; that's what I wrote.
Seanspeed@reddit
AMD has always been pretty good at supporting older GPU's when possible. They've never leaned on cutting off features arbitrarily for older GPU's to sell new ones.
froop@reddit
Nothing ever makes you happy, does it?
Cultural-Accident-71@reddit
No, AMD really lost it with me this generation 🥲 I used to buy the GPUs and think they were the "good one" but they're just the other end of the sausage.
froop@reddit
Well that's your own fault for ever thinking there was such a thing as 'the good one'. Just take the win.
Cultural-Accident-71@reddit
I was young and needed the money 😌
fatso486@reddit
I'm grateful that it's finally happening, but is there really a convincing technical reason why it's pushed out a year for 6000 users?
AIgoonermaxxing@reddit
The only thing I can think of is that RDNA3 has some WMMA stuff going on that RDNA2 might lack? Not sure on the specifics.
anthchapman@reddit
The current FSR4 uses cooperative matrices, via the Wave Matrix Multiply Accumulate instructions which RDNA 3 and 4 have but previous Radeon hardware doesn't. The devs who modified the Linux drivers to add FSR4 support said that FP8 emulation on RDNA3 could be done without too much performance penalty, but that VK_KHR_cooperative_matrix on RDNA2 would be too slow to be worth doing. They then implemented it and confirmed it was too slow.
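To illustrate what "FP8 emulation" actually involves at the bit level, here's a minimal Python sketch decoding the OCP E4M3 FP8 format (1 sign, 4 exponent, 3 mantissa bits, exponent bias 7), which is the kind of 8-bit float RDNA4 handles natively. This is purely an illustration of the format, not AMD's actual shader path, and NaN handling is omitted for brevity:

```python
def fp8_e4m3_to_float(b: int) -> float:
    """Decode one OCP E4M3 FP8 byte to a Python float (NaN case ignored)."""
    sign = -1.0 if (b >> 7) & 1 else 1.0
    exp = (b >> 3) & 0xF   # 4-bit exponent field, bias 7
    man = b & 0x7          # 3-bit mantissa field
    if exp == 0:
        # Subnormal: no implicit leading 1, fixed exponent of -6
        val = (man / 8) * 2.0 ** -6
    else:
        val = (1 + man / 8) * 2.0 ** (exp - 7)
    return sign * val

print(fp8_e4m3_to_float(0x38))  # 1.0
print(fp8_e4m3_to_float(0x40))  # 2.0
print(fp8_e4m3_to_float(0xC0))  # -2.0
```

Hardware without native FP8 has to do this kind of unpacking (and the matching repack) around every matrix operation, which is part of why the emulation cost matters.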
psi-storm@reddit
Games now often default to FSR. So AMD has to optimize FSR to the point where it runs smoothly on RDNA2 without graphical errors, or people will complain again. The alternative is to integrate FSR 4.1 into the driver as an opt-in with a warning: you first have to actively choose to enable it, and you can disable it again if a game isn't running well with it.
LastChancellor@reddit
But will Strix Halo get it
Acu17y@reddit
Yes. On Linux they are all already available; on Windows, from July for all RDNA3 architectures, which therefore includes RDNA 3.5 (essentially almost identical), and from January for RDNA2.
psi-storm@reddit
No. Linux doesn't have the newly announced int8 version of fsr4.1 yet. It uses fp8 emulation to run the regular fsr4.1 version for rdna4.
Acu17y@reddit
On Steam, before launching a game, set the launch options:
PROTON_FSR4_RDNA3_UPGRADE=1 %command% for FP8 with WMMA emulation (I only use this)
PROTON_FSR4_FALLBACK=1 %command% for native INT8 FSR4.
Or, if you want to do it manually, download the DLL and use:
WINEDLLOVERRIDES="amd_fidelityfx_dx12=n,b" %command%
psi-storm@reddit
The first one is the regular FSR4 with emulation; the second is the old leaked INT8 FSR build.
The new FSR 4.1 INT8 version that is coming in the summer is not here yet.
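For context on what an "INT8 version" of an FP8-trained network roughly means, here's a toy symmetric-quantization sketch in Python. This is purely illustrative of the general technique; AMD's actual quantization/retraining pipeline for FSR isn't public:

```python
def quantize_int8(weights):
    """Symmetric per-tensor quantization: floats -> int8 values plus a scale."""
    peak = max(abs(w) for w in weights)
    scale = peak / 127 if peak else 1.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Map int8 values back to approximate floats."""
    return [v * scale for v in q]

w = [0.8, -0.25, 0.031, -0.5]
q, s = quantize_int8(w)
approx = dequantize(q, s)
# Round-trip error is bounded by half a quantization step (scale / 2)
assert all(abs(a - b) <= s / 2 for a, b in zip(w, approx))
```

The point is that the math runs entirely in 8-bit integers (which RDNA2/3 accelerate via DP4A/WMMA), with the float scale applied once per tensor rather than per element.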
sooka_bazooka@reddit
My framework desktop is also wondering
Joshiie12@reddit
Hallelujah, finally. After spending quite some time running FSR 4.0.2 with frame gen on a 7900 XT in Starfield, Crimson Desert, Doom, and a couple of other games, all at over 160 FPS, I was getting pretty sour on AMD not officially releasing it for the 7000 cards. Glad OptiScaler won't be necessary soon.
ShadowRomeo@reddit
Even though by 2027 the next gen GPUs will be out, RDNA 2 will be nearly 7 years old, and most people will be ready to upgrade from it, it's still better late than never.
Ath0m1x@reddit
Have you seen the market prices recently for literally EVERYTHING?
Stoicza@reddit
Next gen GPUs should be roughly 2x the performance for the same cost. The 9070 XT was ~1.70x the performance of the previous gen mid-range 7800 XT, and the 6800 XT was only a few % slower than the 7800 XT.
I've been chilling on my 6800 XT waiting for the next gen GPUs. We have another year and a half to wait, and the cost of RAM appears to have stabilized at a plateau. We just need to hope that plateau starts on a downward path near the end of this year.
plantsandramen@reddit
Can you share any details on this? I personally haven't heard anyone say this.
Stoicza@reddit
If the 9070 XT is roughly 1.75x the performance of the 6800 XT, then a small 10% improvement in IPC, with no other improvements, would put the next gen GPUs at nearly 2x the performance.
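The arithmetic checks out as a rough estimate (both figures here are the commenter's assumptions, not benchmarks):

```python
gen_uplift = 1.75   # claimed 9070 XT vs 6800 XT uplift
ipc_gain = 1.10     # hypothetical 10% IPC improvement next gen
projected = gen_uplift * ipc_gain
print(round(projected, 3))  # 1.925, i.e. "nearly 2x" the 6800 XT
```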
ActualWeed@reddit
In what world is a 9070 xt 1.75x better than a 6800 xt
East-Today-7604@reddit
In Ray Tracing games, at 1440p native, the difference is that big.
Sapphire Radeon RX 9070 XT Nitro+ Review - Beating NVIDIA - Ray Tracing | TechPowerUp
ActualWeed@reddit
Lmao ray tracing
East-Today-7604@reddit
yeah, lmao, most people who buy 500$+ GPUs care about Ray Tracing performance, lmao.
ActualWeed@reddit
Wasn't there a poll recently showing that more than half of gamers don't care about ray tracing?
East-Today-7604@reddit
We're discussing mid to upper-mid tier GPUs, on which Ray Tracing performance is great. As for "polls", there are too many variables; you can't poll only the specific group of people who have good enough hardware, and polling randoms on the internet is not a very valid argument. At first, AMD gamers "didn't care" about upscaling; now they're pleased that AMD made FSR4, which is decent. Gamers also "didn't care" about Ray Tracing performance; AMD greatly improved it with RDNA4, and one of the selling points of that architecture is the much better Ray Tracing performance.
Gamers do care about upscaling, and gamers do care about Ray Tracing. NVIDIA's dGPU market share is at 94%, and they clearly know what they're doing and what matters to gamers. If you specifically don't care, or the visual benefits of RT and FSR4 are unimpressive to you, that's great, but your subjective experience is not more important than reality. There are games with mandatory Ray Tracing that run at ~60 FPS at native 1080p Ultra settings on mid-tier hardware from 2020, and there are many games that greatly benefit from RT.
So yeah, for most people who can afford decent GPUs like the 9070 XT, 5070 Ti and similar, Ray Tracing performance is very important, because they have the performance headroom to benefit from those features. People with weak hardware are usually limited by compute power or VRAM, and Ray Tracing is not a feasible option for them.
plantsandramen@reddit
Oh you meant 2x vs the 6800xt. Thanks for clarifying.
East-Today-7604@reddit
Not everything. High end models are noticeably more expensive than their MSRP. You can get an RTX 5070 for $600-620, which is $50-70 higher than its MSRP; it's a great GPU for 1440p gaming that will be perfectly fine for the next few years. Meanwhile, for GPUs like the 5070 Ti, you're paying at least $1,000, which is $250 over its MSRP. Cheaper models are still more expensive than they should be, but it's not nearly as bad.
Ath0m1x@reddit
"25% over its launch price is not bad" for a card that's over 1 year old.
East-Today-7604@reddit
Your math is faulty, man, or reading comprehension is not your best quality.
A 5070 at $620 is 12.7% higher than MSRP, not 25%. I literally said in my message that the 5070 Ti is $250 over MSRP, which is 25%. 12.7% over MSRP isn't as bad as 25%, or 65% in the case of the RTX 5090.
It's a non-argument; a card being 1 year old does not reduce its MSRP. When we had normal prices, before COVID and the AI chip shortages, cards became cheaper when a new generation was released, or at least when Super models launched, but not mid-generation.
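For the record, the markup math, using the commenter's own figures (roughly $550 MSRP and a $620 street price for the 5070; thread claims, not verified prices):

```python
def markup_pct(street, msrp):
    """Percent a street price sits above MSRP."""
    return (street - msrp) / msrp * 100

print(round(markup_pct(620, 550), 1))  # 12.7
```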
Ath0m1x@reddit
MSRP in Europe is 800-something Euro's.
Currently it's at 1000.
East-Today-7604@reddit
And? I'm not arguing in favor of the 5070 Ti, I'm arguing in favor of the 5070. I used the 5070 Ti as an example to show you that the higher you go, the higher the markup over MSRP. Since reading is hard for you, I'll try once more: 12.7% over MSRP for the 5070, 25% for the 5070 Ti, 30-35% for the 5080, and 65-70% or higher for the 5090.
Your first message said, "for literally EVERYTHING" which is not true - mid segment GPUs like 5070 are at "normal prices", all things considered.
Electronics in Europe were always more expensive than in the US for various reasons.
ZeroAnimated@reddit
Exactly, and as long as games are coming out for the PS5, the 6700 and up will remain totally usable GPUs for modern games.
Earthborn92@reddit
This will give RDNA2 better resale value on the used market.
APES2GETTER@reddit
I’m sorry. With what money?
lcirufe@reddit
With what RAM?
xole@reddit
With what electricity?
By then electric bills may have doubled, depending on how close they are to a new AI data center.
APES2GETTER@reddit
Exactly!
capybooya@reddit
I see people wanting to buy new PCs talking about AM4 as a 'current' platform. Not that I want to shit on it, but the price increases, shortages, and inflation really do make people completely rewrite their expectations. We get so much worse value now. Or, at least I hope they're being realistic about expectations, but I fear people will pay for really old stuff and be disappointed by its performance.
b0wz3rM41n@reddit
bro talking as if everyone was a millionaire 😭😭😭
Leo1_ac@reddit
Hahaha, nope.
CrAkKedOuT@reddit
2027 damn
TopdeckIsSkill@reddit
They probably needed more work to optimize the code that leaked, since it wasn't the final version.
Saneless@reddit
Still, it would have been trivial to say, "That was some test code; we're working hard to bring the official version to those cards, and we're planning on the last half of 2026."
TopdeckIsSkill@reddit
Before that, they needed to be sure the code was stable and good enough to ship. What if the quality is fine but it causes too many bugs?
996forever@reddit
Has that ever bothered AMD like ever?
Saneless@reddit
They're a multi-billion dollar company not named Microsoft. I would expect bugs to be something they could work out.
WJMazepas@reddit
It was possible that back then they weren't even sure they would reach a good level of performance, and wouldn't release it.
Saneless@reddit
Everyone was fine with the performance as it was with the leak last year.
ElectronicStretch277@reddit
Who was everyone? The enthusiasts? Sure. Even they complained at times. To the average person the performance hit would be seen as unacceptable.
996forever@reddit
It has less of a performance hit at matching quality than FSR 3. After that, it's only a matter of what you choose to label the "quality setting" or "performance setting".
Seanspeed@reddit
I'm guessing their partnership with Playstation in working on this prevented this, otherwise yea, that should have been the very obvious thing to do.
Saneless@reddit
I've thought this too for at least 6 months
meatwaddancin@reddit
But then imagine the lawsuits if, for some reason, they found out it's just not possible, and people sued saying they bought cards based on that expectation.
Saneless@reddit
It was already possible. Not a risk
AIgoonermaxxing@reddit
Yeah, they garnered a ton of negative PR by not saying this. Better late than never, but you'd see a lot of people online swearing off ever buying AMD again because of this
Saneless@reddit
Well sure, they had a right to be mad because it was proven it works on those cards. For AMD to keep pretending like it doesn't even exist was strange
AIgoonermaxxing@reddit
Oh, they absolutely did. I have a 7800 XT and while I didn't swear off buying Radeon, I definitely was reconsidering going back to them if I ever did want an upgrade.
Honestly, the leak and their response to it were very damaging to the brand. Things are different now that we've been promised support, but before this I genuinely thought they'd have been better off if the leak never happened. They could've just pulled what Nvidia did with the 10 and 16 series and said "yeah, sorry, this doesn't support it."
But since literal evidence was released of the cards running it fine, their complete refusal to acknowledge it really came off as them intentionally withholding it to boost 9000 series sales.
Saneless@reddit
I'm no conspiracy theorist, but I still think some of it was Sony not wanting PSSR2 to be overshadowed; perhaps they helped out with the INT8 code.
rain3h@reddit
All for longevity, but in '27 will there really be enough RDNA 2 users for it to be worthwhile?
I get 3/3.5 due to the current iGPU in the chips they sell (470 etc), but 2.0?
Seanspeed@reddit
Probably worth it just so people stop bitching about it and pushing this idea that AMD is 'totally dropping support' for older GPUs, which was a bullshit claim.
bestanonever@reddit
I kinda bought a used RX 6700 XT in November.
Yeah, I'm going to keep using this one next year, lol.
This is great news.
psi-storm@reddit
I have a 6700 XT and it does everything I want from it. When it gets FSR 4.1 it might even hold out a few years longer.
RedTuesdayMusic@reddit
6950XT is still a very good GPU, and 6800XT is better than today's budget cards. Would you say the same about a 4070Ti getting some new feature? I still have my old 6950XT which is getting put into a LAN rig.
b_86@reddit
Extremely likely if the RAM shortage continues, especially in developing countries with lower salaries where getting parts is expensive and people are still rocking 480s and 580s 8GB cards.
jenny_905@reddit
Did Valve insist on this maybe?
psi-storm@reddit
They definitely want that for the console. Maybe Microsoft also wanted it for Xbox? Since the consoles use the full RDNA2 cores, they could also run the INT8 version.
Original-Material301@reddit
2027?
I can wait. I'm not buying a new card anyway lol.
Last-Owl-8342@reddit
Already sold it; too late for me.
Leo1_ac@reddit
Hm, I don't like the delay and the "wait until 2027" thing for RDNA 2 cards.
2027 when? Jan? July? Dec?
I think they'd rather say "yeah, just wait" than "sry, no banana for RDNA 2 cards". I just don't think FSR 4.x is very likely for RDNA 2 cards no matter what they might claim.
ElectronicStretch277@reddit
Early 2027 likely means Q1. So anywhere from Jan-March.
zerinho6@reddit
This could possibly come to PS5/XSX and XSS too, right? The XSS might not keep losing to the Switch 2 if it does.
Marco-YES@reddit
That was a mouthful
Hayden247@reddit
PS5 base has no INT8 instructions so it probs can't run any FSR4 AT ALL. However Xbox Series S and X are full fat RDNA2 feature wise so yeah you'd hope AMD and MS would work together to get it going on those consoles.
fatso486@reddit
Everybody loves dunking on that poor thing. To be fair, the Series S is objectively much faster than the Switch 2, despite some outliers like Capcom games. For example, Indiana Jones is recognized as a very good, well-optimized port for the Switch 2, and yet it maxes out at half the Series S's 60 FPS.
Valmar33@reddit
While I am happy for this... AMD, geez, fix your marketing, please!
All of the confusion and frustration could have been so easily avoided with proper communication!
ElectronicStretch277@reddit
They had a couple of interviews where they said they were working on it but were unsure if it was possible. AMD has been on the wrong end of some early claims and expectations before, so they likely went full radio silence until they were sure the backport was possible.
Agloe_Dreams@reddit
2027 is an "eventually" that means never.
Deeppurp@reddit
Finally! I purchased a 7800 XT because the price was right. I just want native AA better than TAA and better than FSR3, because a lot of games I play right now run fine at native. And in the games where I do want a visual and FPS boost, I won't have to worry about the FSR3 ghosting as much.
accountformymac@reddit
add it to strix halo plz
techtimee@reddit
Multi billion dollar company by the way.
BUDA20@reddit
also, a less disgusting FSR 3 will be nice... even 2.x looks better
Nomnom_Chicken@reddit
Massive delay, but... Better than no official support, absolutely. Guess this even counts as a rare Radeon win as well.
Hifihedgehog@reddit
So this should apply to integrated graphics using RDNA 3.5, RDNA3, and RDNA2, as well, correct? And even the Steam Deck uses RDNA2 so even it should theoretically be able to tap into this once RX 6000 series gets the FSR 4.1 treatment in 2027, right?
Educational-Lynx1413@reddit
You betcha!
Hifihedgehog@reddit
Cool! Updated my comment. The article wasn't clear so I went straight to the tweet itself, and that makes ME SO HAPPY! Steam Deck alone will hopefully get a new lease on life from this.
nukleabomb@reddit
Finally. Great news for radeon users.
Mllns@reddit
Does this mean no APUs?
RumbleTheCassette@reddit
This will give some extra life to those GPUs, especially for people buying them secondhand.