Rip all the people who were saving up to get a 7900 xt
Posted by Isaac-cad@reddit | buildapc | 81 comments
I was saving up for one and was pretty close, but then AMD announced that FSR 4 is coming to the 7000 series chips, and now they are all sold out or $1000.
GarrettB117@reddit
People are still saving up for and buying 7000 series? What’s the thinking there? I mean I have a 7900xt and I love it but unless you have some specific need for the extra vram the 9070xt is a much better option.
Accomplished-Tap-888@reddit
The 7900xtx still has more raw power if you don't care about upscaling or framegen
Mikelius@reddit
With native fsr4 support would the xtx be the strongest amd card?
Sphexus@reddit
Absolutely
Responsible_Tank3822@reddit
In raw power, yeah. In ray tracing, no.
ExplodingFistz@reddit
Yep. 7900 XTX trades blows with the 9070 XT in raster, but as soon as you introduce a heavy RT workload the XTX gets decimated
Responsible_Tank3822@reddit
Not even just ray tracing, but path tracing as well. Like you said, in terms of raster they trade blows, with the XTX winning most of them by a smidge, but when it comes to RT and PT the 9070XT is honestly a performance tier above the XTX.
With ray tracing becoming more widely adopted, and path tracing being the new kid on the block, the 9070XT is just going to age that much better than the XTX.
ButterscotchFar1629@reddit
Can’t wait to see the numbers when stacked up against a 4090, or even a 5090, especially in straight raw frames. I suspect there are going to be a lot of seriously depressed Nshitia owners out there…..
Responsible_Tank3822@reddit
In terms of raw performance, yeah, but it loses out in ray tracing. It's about equal to the 9070 in terms of ray tracing.
ButterscotchFar1629@reddit
The 7900xtx was a fucking beast when it launched and still is. If it wasn't twice the cost of my 7800xt when I bought it, that's what I would have gone with.
Plus-Statistician320@reddit
7900xtx is 6% faster than a 9070xt. But gets destroyed in RT. And 16gb will be enough vram for a long time to come.
dorting@reddit
Not really, the 9070 XT is sometimes even faster
Darkknight1939@reddit
It's basically equal to the 9070XT in raw raster. I own two bazzite systems with a 7900 XTX and 9070 XT respectively.
The extra VRAM is useful for people who want to run VR or LLM's (with AMD's lower performance versus Nvidia and Apple) but for games they're largely equal in raster, with the 9070 XT being much better at ray tracing.
Official int8 FSR4 support for the 7900 XTX definitely narrows the gap a lot more, but I think 99% of users are better off opting for the 9070 XT if you want a Linux PC, or any Nvidia 5070 Ti or better GPU if you're running Windows.
ButterscotchFar1629@reddit
The 7900xt is better when not using frame gen than the 9070, which of course is optimized for all the AI shit. Same reason a lot of people still prefer their 4090's or even 3090's over anything 5000.
WizardMoose@reddit
wait what? They're going for $1000 now?.....I might sell mine....
ExplodingFistz@reddit
Do it and buy a 9070 XT lol. Cash in the extra $300, since the XT can be had for $700 these days.
ComprehensiveOil6890@reddit
Why 7900xt when 9070xt exist that is better on everything
Coady54@reddit
Less Vram is the only downside I can think of.
Merrick222@reddit
Gaming doesn’t need that much VRAM though
PigSlam@reddit
There's non-gaming applications that are wildly popular these days, and VRAM matters a lot there.
RepresentativeIcy922@reddit
Reddit does not like AI.
PigSlam@reddit
Yeah, if they downvote all mention of it, RAM and gaming GPUs will cost less.
solonit@reddit
I use Vantage (GPU-exclusive render) and it eats through every bit of VRAM it can get its mouth on. And before you ask "why not a stronger CPU with CPU render" (like Vray/Corona): RT render is significantly faster with slightly less accurate physics, but for my workflow (interior design concepts) it doesn't matter. As a freelancer, the more work I can do, the more money I make.
Reddit should, for once, stop thinking that people only use GPU for gaming.
KillEvilThings@reddit
Doesn't need 20gb yet, but with rapid enshittification, VRAM has always been the #1 factor in the longevity of the majority of cards that aren't entry-level XX60 class cards.
HereWeGooooooooooooo@reddit
Some do. VR, simulation games. I regularly get my 7900xtx over 20GB of vram usage
JoeZocktGames@reddit
VRAM usage is not linear. If you have more, the game allocates and uses more even if it's not needed. Bus width, drivers and optimization also play a role here.
AMD cards usually use 2-3 gigs more per game than Nvidia.
AHrubik@reddit
So your argument here is that AMD software typically requires 2 to 3 gigs more VRAM than Nvidia, and you're advocating the purchase of a card that has four gigs less VRAM?
JoeZocktGames@reddit
That’s not what I’m saying at all. There is a massive difference between VRAM allocation (caching) and VRAM requirement.
Modern games will grab as much VRAM as you give them to store textures ahead of time, just because the space is available. AMD's memory management tends to be more aggressive with this pre-allocation. It doesn't mean the game needs those extra 3GB to run, it just means the card is utilizing its free space. If you run the exact same game at the exact same settings on a card with less VRAM, the game adapts, allocates less, and often runs perfectly fine without hitting a bottleneck.
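That allocation-versus-requirement distinction can be sketched in a few lines. This is a hypothetical toy model (the class, texture names, and sizes are all made up for illustration, not any real driver or engine API): a texture cache opportunistically prefetches into spare VRAM, so a card with a bigger budget reports more VRAM "in use" for the exact same frame, while the true requirement is unchanged.

```python
# Toy model of VRAM allocation (opportunistic caching) vs. VRAM requirement.
# All names and sizes are invented for illustration.
from collections import OrderedDict

class TextureCache:
    def __init__(self, vram_budget_mb):
        self.budget = vram_budget_mb
        self.cache = OrderedDict()  # texture name -> size in MB, in LRU order

    def allocated(self):
        return sum(self.cache.values())

    def load(self, name, size_mb):
        """Bind a texture the frame actually needs, evicting LRU entries if full."""
        if name in self.cache:
            self.cache.move_to_end(name)    # cache hit: refresh LRU order
            return
        while self.cache and self.allocated() + size_mb > self.budget:
            self.cache.popitem(last=False)  # evict least-recently-used texture
        self.cache[name] = size_mb

    def prefetch(self, candidates):
        """Opportunistically fill spare VRAM: allocation, not requirement."""
        for name, size_mb in candidates:
            if name not in self.cache and self.allocated() + size_mb <= self.budget:
                self.cache[name] = size_mb

# The frame's working set (the true requirement) is the same on both cards.
frame = [("rock", 512), ("skybox", 1024), ("npc", 512)]
extras = [("distant_lod", 2048), ("next_area", 4096)]  # nice-to-have prefetches

roomy = TextureCache(16_000)  # card with lots of headroom
tight = TextureCache(6_000)   # smaller card, same game, same settings
for card in (roomy, tight):
    for name, size in frame:
        card.load(name, size)
    card.prefetch(extras)

# The roomy card "uses" more VRAM only because it prefetched more.
print(roomy.allocated(), tight.allocated())  # 8192 4096
```

Both caches hold the full 2048 MB working set and render the frame fine; the extra gigabytes on the roomy card are cache, not need, which is the point being made above.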
CanisLupus92@reddit
That’s not what he’s saying. Empty VRAM is wasted VRAM, so software will load more into VRAM even if it isn’t needed.
R6ckStar@reddit
Do games/software allocate more usage to vram even for stuff normally handled by generic ram? Or do they play completely different roles?
CanisLupus92@reddit
Completely different pools. In theory a GPU can fall back to regular RAM, but that has significant performance impact.
pre_pun@reddit
No, their comment is about VRAM allocation, not software overhead.
Fredasa@reddit
There are edge cases even outside VR.
The one I mention when I can is Stellar Blade. For the PC release, they provided exclusive tiers of quality for character and environment textures. It is not possible to play the game in 4K with textures at max, on a 16GB GPU. Most areas, including the two main large worlds and the hub, will, after a minute or so of walking around, saturate the VRAM and enter permanent stutter mode. No matter how thorough and knowledgeable you are at cleaning out VRAM before starting the game.
And it's just a major headache to have to fuss over closing every damn app before playing a modern game, so even if 16GB is "enough", it's not enough to be convenient.
Truenoiz@reddit
It does, VRAM size is the most important spec of the entire card. Nvidia cards age out faster than AMD because of their smaller buffers.
Coady54@reddit
No argument here, I was just saying the only real difference where the 7900XT could be considered better. It's not a practical choice if gaming is your only priority.
Traditional-Flower55@reddit
I believe it still has to use the modified INT8 version of FSR 4, which has some performance drawbacks. How much that will be in the end with the official drivers? We will see!
ChildhoodVegetable73@reddit
So now all those ai and mining bros can game and compute at the same time while still ignoring their jobs.
Can confirm, I work with several of these types
Blue-150@reddit
My thought was used, which could be found as low as $500, but OP didn't mention that, plus the extra vram. I wouldn't buy new.
Fragluton@reddit
Who exactly was saving up for that?
vitek6@reddit
It will not be the same fsr. Are you aware of that?
AgentOfSPYRAL@reddit
Isn’t it the same visually just not as efficient fps wise?
Appropriate-Math2360@reddit
In short, yes - but the NVIDIA trolls are out to win right now
Darkknight1939@reddit
It's not Nvidia trolls, those largely don't exist on Reddit.
It's the AMD stock crowd that was coping about AMD refusing to properly support their GPUs. When Nvidia brought the latest DLSS to Turing GPUs, there was really zero excuse for AMD not bringing int8 FSR4 to their old GPUs, when the community has had it running through OptiScaler for ages now.
It seems like they've finally decided to support it, but another year for the 6000 series seems insane. Better late than never, but they were dragged kicking and screaming.
The people arguing it's not a big improvement are AMD sycophants wanting to excuse their piss poor long term GPU support, not Nvidia trolls. They're too busy enjoying DLSS, lol.
RepresentativeIcy922@reddit
I dunno, the drivers are open source, I still have support for the 1GB R7 I have on the other PC on Arch :)
Appropriate-Math2360@reddit
NVIDIA also refuses to provide any generational updates to older GPUs besides DLSS. Good luck getting NVIDIA's next big advancement on the 5 series when they go 6 series. Multi frame gen, for example, lol.
There are interviews from a couple of months ago of AMD GPU developers saying they would provide support to older RDNA GPUs once they figured out the hardware constraints, and lo and behold, they did.
Blame the AMD marketing department. I have a 5070 and a 9070xt right now sitting on my desk. But that’s a cool long story mate.
kemicalkontact@reddit
The 9070 XT is better for gaming period.
The 7900 XT is only better for native rasterization performance or productivity that requires massive VRAM overhead.
Ambitious_Handle7322@reddit
It's not better for native raster; the XTX is like 5% faster than the 9070 XT, and the 7900 XT is slower.
JoeZocktGames@reddit
Not to mention the RT performance on RDNA 3 is pretty lackluster. My 9060 XT 16GB equals a 7900 XT in RT.
Liesthroughisteeth@reddit
All this for what? World domination to the first ones in on AI?
beirch@reddit
Feeling pretty good about the $400 7900 XT I got for my second PC a couple months ago.
Many-Victory-1825@reddit
How is it in the used market, where you are located?
ConsistencyWelder@reddit
Heh, I sold my 7900XT and got a 9070XT instead, not that it's much of an upgrade in performance, but because I wanted FSR4.
I'm still glad I did it though, I got the 9070XT cheap on Black Friday and sold the 7900XT for the same price. So it was a free upgrade, and the card is an absolute champ.
Sphexus@reddit
Your thermals will be much better as well
beirch@reddit
That obviously depends entirely on the cooler on your specific card. My Asus TUF Gaming 7900 XT runs cooler than my Asus Prime 9070 XT at the same power draw.
Because the 7900 XT is a fucking 353mm quad slot monster.
ConsistencyWelder@reddit
You're probably right. The 7900XT was never loud but I did hear it spin up sometimes. I never notice that with the 9070XT.
ButterscotchFar1629@reddit
Got my 7800xt a year and a half ago for 499 Canadian brand new. Then again, the 32 gigs of ram in my system was just thrown in, so things have changed…..
ItzKeezie@reddit
I’m an AI Engineer with a background of Cybersecurity so that’s why I like the extra VRAM, just a bonus on Gaming too. I usually run a 4+ monitor setup myself…
CaptMcMooney@reddit
purchased mine for the same reason, the 9070xt just didn't have enough vram. performance wise I don't think it matters much, 100fps or 99fps, I couldn't care less
Reasonable_Case4818@reddit
Y'all 9070xt owners don't miss an opportunity to defend it. It's not better than an XTX in everything. It just isn't, nor was it made to be; it's close, and in ray tracing it is better, but that's it. Besides, the XTX is from 2023 and the 9070xt is pretty damn new. FSR 4 is great, but in my opinion it's best used on lower end cards. I damn near never used FSR or DLSS unless I was tinkering. I don't know why people just can't leave them be, the cards are brothers.
Buzz_Killington_III@reddit
Every post in this sub: "Why are you buying that? This one is better (even though I have no idea of your use case or why you chose it)"
Can you just let people make a decision on their own every once in a while without it turning into a circle jerk of what you would do?
AndyIsHereBoi@reddit
I'm not seeing any price increases in the last few days yet on the 7900xtx, then again that's like $800-1000. I bought one a week ago and one 2 days ago as well, and prices there are relatively the same.
I'm only using them for AI stuff, so the 24GB of vram matters more to me
AdditionalMap5576@reddit
are they actually selling for that much? it might be time to sell mine
rewardingsnark@reddit
Gave up on PC gaming back in 2023 since I have a $399 limit; I just use the integrated GPU now
CommanderShepardFTW@reddit
I’ll sell you my 7900GRE for $999. I know what I have 🤣
Noobphobia@reddit
You are probably the only one.
joeh4384@reddit
Just because they are going to support FSR4 doesn't mean it will run as well as on a 9070xt.
yarrowy@reddit
Did you ever think it's sold out because they stopped manufacturing a 2 gen old card?
Foreign-Ad28@reddit
It’s a 2 gen old card, of course you’re not gonna find any.
Anyways why would you even consider one when the 9070xt is much cheaper and faster in gaming.
AnnieBruce@reddit
AI. I'd been running out of VRAM on a few image generation tasks I was trying before my 6800XT died, and the numbers had me just a couple gigabytes short.
For gaming, though, yeah. I do game, but when I had to replace my dead 6800XT, it was a matter of things I wanted to do not being possible at all on 16GB vram. The FPS cost compared to the 9070XT just wasn't nearly severe enough, especially considering I was otherwise happy with the 6800XT and would have just kept it for a few more years if it still worked. They were within 50USD of each other at the time, so the price premium really wasn't much of a problem either.
I do realize NVidia is better for AI and other compute workloads, but AMD is more reliable on Linux and the tooling does exist to run models on AMD hardware. Might generate a little more slowly than a comparable NVidia card, but I'll have fewer random crashes.
AnnieBruce@reddit
I would, of course, recommend the 9070XT if you don't have actual numbers demonstrating you need more VRAM, and that 4GB will actually be enough extra. I did have the actual numbers to justify the purchase.
Foreign-Ad28@reddit
It’s a 2 gen old card, of course you’re not gonna find any.
Anyways why would you even consider one when the 9070xt is much cheaper and faster in gaming.
oliverkiss@reddit
They died?
AnnieBruce@reddit
I think I paid like 700-800USD for mine after my 6800XT died.
Still annoyed about that. The 7900XT is a great card, but that 6800XT would have been viable for a few more years if it hadn't died on me. My initial diagnosis pointed to the monitor, so I replaced that first for a couple hundred; the original turned out to be fine (it's used when I need a bigger display for my laptop, and it's my Pi 400's display. 1440p 144hz is massive overkill for those use cases, but it's what I have on hand that I don't need for something more demanding).
SchmeckleHoarder@reddit
Don’t know why people are shocked when products hit end of life….
Not going to magically find one for $10. That’s just not how it works.
Yes, prices are practically made up.
Merrick222@reddit
Just get a 9070 XT… it's better.
Especially if you’re using FSR4 it’s more efficient.
So if you use FSR4 on same settings 9070 XT wins every time.
Anyone buying a 7900 XT is dumb right now for the same price or more than a 9070 XT.
ConsistencyWelder@reddit
RT is much better on the 9070XT too.
Gex2-EnterTheGecko@reddit
The 7900 xt is outclassed in nearly every way by the 9070 xt and I got one for $575 a couple months ago. Why not look for that?
goldenhokie4life@reddit
Newegg refreshed has them for anywhere from $550-700
AsManDoesManIs@reddit
So glad I bought the xtx recently, got it for LESS than a 9070xt and then a week later they announced 4.1 coming to rdna3
ToastyVoltage@reddit
There's literally 1 XFX model available on amazon rn
Rojodojo@reddit
K.