Never Fast Enough: GeForce RTX 2060 vs 6 Years of Ray Tracing
Posted by potato_panda-@reddit | hardware | View on Reddit | 310 comments
Firefox72@reddit
I do agree with what Tim is saying at the end.
We're 6 years into raytracing and it's still not really usable across the stack. Or well, it is once you rely on crutches like upscaling, which has questionable quality at resolutions like 1080p, and Frame Gen, which again has its own tradeoffs.
Nvidia made big promises about RT in 2018 but the hardware development for it has not been keeping up with the demands.
salcedoge@reddit
The biggest thing is how they literally changed their GPU branding to RTX to advertise ray tracing, meanwhile DLSS is the feature that's actually used and considered by consumers
Radiant-Fly9738@reddit
DLSS is not marketable in video trailers, but RTX is. Marketing means more sales.
MeelyMee@reddit
Surely it's easy, just a big frame rate counter.
chilan8@reddit
every single game that just uses DLSS is marketed as "RTX on", it's so fucking stupid ....
Earthborn92@reddit
It’s Nvidia marketing genius, not stupid at all.
They apply RTX to features that have nothing to do with RT - such as RTX IO.
It’s the same kind of gimmick Tesla used - market self driving when what you really want to do is sell electric cars. Self driving and electric powertrains have nothing to do with each other.
ThankGodImBipolar@reddit
DLSS was hilariously bad when Nvidia rebranded their range to RTX; they wouldn't have named those cards after that
dampflokfreund@reddit
One of the big reasons for that is insufficient VRAM. It's baffling how Nvidia pushes RT and frame gen, both of which need a ton of VRAM, while cheapskating on it so much. The same applies to AI too. It's almost as though they want you to upgrade very soon...
teutorix_aleria@reddit
This is the one that baffles me, the 7800XT trades blows in RT against similarly priced Nvidia cards because they are so memory limited.
Nvidias RT advantage literally doesn't exist in the segment where most people are buying. But the fact that a 4090 can run psycho RT on cyberpunk sells 4060s to gamers who will never be able to do anything close to that.
FinalBase7@reddit
AMD only trades blows in games that barely use RT, the moment a game uses multiple layers of RT extensively AMD falls apart. The 4070 is not memory constrained most of the time at 1440p.
Pale-West-3176@reddit
If we're talking about AMD here, I'm wondering how my RX 7700 XT trades blows with the 4060 Ti 8GB in RT, given they are priced the same on my end.
dedoha@reddit
No it doesn't lol, well unless you are looking at 4k tests...
teutorix_aleria@reddit
At 1080p it holds up against the 4060 Ti; the 4070 is 100 dollars more expensive.
dedoha@reddit
4070 is closer in price to 7800xt than 4060ti but even the latter demolishes that radeon in RT
conquer69@reddit
You mean the 4060 ti? Yeah no one should be buying that card. Nvidia basically has nothing worthwhile between the 4060 and 4070 super.
Morningst4r@reddit
That's not true at all for real RT titles, especially when you're forced to use XeSS or FSR 2.
AccomplishedLeek1329@reddit
Halo effect baby.
Amd is so bad at marketing it's depressing
balaci2@reddit
100% and I'm pretty sad about it because I like amd products
cumbrad@reddit
they’re not skimping on vram because they want you to upgrade quickly, that is just a bonus. They’re skimping on vram and locking GPUs for VMs and such because professional work, especially ai workloads, takes a lot of vram and they want those customers to buy the $10000 cards, not the $500-1600 cards
Morningst4r@reddit
We want pro users to buy pro cards as well. We've seen what happens when people can make money off gaming cards and it sucked. They need a better way to segment cards than VRAM, but I'm not sure what that would be.
JtheNinja@reddit
I really have no idea either. They used to do it by gimping the Geforce cards in functionality gamers never used, like 10bit OpenGL, stereo display out, functions only used by CAD viewports, etc. The problem is these days the workloads for consumer applications and professional use cases increasingly don't look any different to the GPU, this mostly started once unified shaders and GPGPU became a thing. (And this isn't even counting stuff like using Unreal Engine in filmmaking, where it's quite literally the same code that gamers are running)
Another way they've tried segmenting them is removing NVlink from the Geforce cards and only offering blower coolers on the pro cards, which makes the pro cards better for multi-GPU setups. But the price is so high it's often better to just live with the lack of VRAM if you can make it work. You can buy 3-5 4090 cards for the price of one RTX 6000 Ada, and if your workload fits in 24GB the 6000 Ada isn't even faster than a 4090.
wichwigga@reddit
One does not get to 3 trillion by being nice to consumers... I'll tell you that
wintrmt3@reddit
NV isn't worth 3 trillion because of the consumer gaming cards, that's all the AI craze and their enterprise lineup.
Sapiogram@reddit
You realize Nvidia needs customers to buy enterprise and AI cards too, right? It's not like they can just cast high-level alchemy on their enterprise cards to turn them into $10,000 cards.
StickiStickman@reddit
Does it? Seems to work pretty flawlessly in every example I've seen, even for 1080p DLSS works really well.
Morningst4r@reddit
DLSS quality is heavily impacted by framerate. I think a lot of people who don't rate it highly are playing at 60 or sub 60 frame rates, where DLSS will look a lot worse than at 100+
iDontSeedMyTorrents@reddit
It sounds like you're talking about frame gen and not DLSS.
Morningst4r@reddit
No, it affects upscaling quality too due to the increase in temporal information it has available.
iDontSeedMyTorrents@reddit
That's the first I've ever heard of this. Can you point me to a source verifying this?
OutrageousDress@reddit
This has the unfortunate (but unavoidable) side effect that, because DLSS performs better if the base resolution is high and the base framerate is high, it can make high-end GPUs look even better, but is kind of mediocre on low-end hardware.
KittensInc@reddit
Raytracing is incredibly powerful, but it comes at a massive cost.
Besides the obvious graphical improvements, a raytracing pipeline is just far easier to write and scales way better than rasterization. If your hardware is powerful enough to allow for 100% raytracing and ditch the entire rasterization pipeline, you're able to revolutionize the entire gaming industry. An average Computer Science student can write a basic raytracing engine in a couple of weeks which generates visuals good enough to match AAA games. Imagine what the indie games would look like with that!
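To give a sense of what a "basic raytracing engine" means here, below is a minimal toy ray tracer sketch in Python (purely illustrative and hypothetical, not from the video or any real engine): one hard-coded sphere, a pinhole camera, and Lambert shading printed as ASCII art.

```python
# Minimal toy ray tracer: one sphere, one directional light, Lambert shading.
# Illustrative only -- a real engine adds BVHs, materials, sampling, denoising, etc.
import math

WIDTH, HEIGHT = 160, 90
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, -3.0), 1.0
LIGHT_DIR = (-0.5, -1.0, -0.5)  # direction the light shines toward the scene

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def hit_sphere(origin, direction):
    """Return distance along the ray to the sphere, or None if it misses."""
    oc = tuple(o - c for o, c in zip(origin, SPHERE_CENTER))
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c  # quadratic 'a' term is 1 for a normalized direction
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render():
    light = normalize(tuple(-c for c in LIGHT_DIR))  # vector pointing toward the light
    for y in range(HEIGHT):
        row = ""
        for x in range(WIDTH):
            # Camera at the origin, simple pinhole projection onto the z = -1 plane.
            u = (x / WIDTH - 0.5) * 2.0 * (WIDTH / HEIGHT)
            v = (0.5 - y / HEIGHT) * 2.0
            direction = normalize((u, v, -1.0))
            t = hit_sphere((0.0, 0.0, 0.0), direction)
            if t is None:
                row += " "
            else:
                point = tuple(t * d for d in direction)
                normal = normalize(tuple(p - c for p, c in zip(point, SPHERE_CENTER)))
                brightness = max(0.0, sum(n * l for n, l in zip(normal, light)))
                row += " .:-=+*#%@"[min(9, int(brightness * 9.99))]
        print(row)

if __name__ == "__main__":
    render()
```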
But we don't have that hardware. We're going to need a ~100x performance increase for that. So we're left using raytracing for small visual improvements. Moreover, the parts of the GPU dedicated to raytracing can't be used by the rasterization pipeline, and because most games still primarily use rasterization our GPUs primarily contain rasterization hardware - which means they are kinda bad at raytracing.
A fully raytracing-optimized GPU is going to suck for all non-raytracing games, and for a lot of people that's going to be a dealbreaker. Until we solve that chicken-and-egg problem, I fear raytracing in games is never going to be more than a gimmick.
jcm2606@reddit
Our GPUs don't primarily contain rasterisation hardware, they primarily contain general-purpose units capable of executing any embarrassingly parallel workload. Rasterisation hardware makes up a fraction of a modern GPU, as the vast majority of hardware within a GPU is comprised of general-purpose scheduling, math, logic and memory units. Look at GPC/SM or WGP/CU breakdowns from NVIDIA or AMD respectively, and you'll see the vast majority of the hardware within the GPU is general-purpose.
As such, removing general-purpose hardware to fit in more specialised raytracing hardware won't lead to a linear performance increase, as the raytracing hardware only accelerates specific operations that are scheduled to it by the general-purpose hardware (ray-triangle/ray-box intersection tests, acceleration structure traversal, and/or micropoly geometry within the AS for opacity/displacement micromaps).
I imagine that you'd probably see a performance increase in the short term as you introduce more raytracing hardware, since current GPUs probably haven't reached an equilibrium between general-purpose and specialised raytracing hardware, but performance will eventually plateau and then drop as there's less general-purpose hardware to pick up the slack and offload raytracing operations to the raytracing hardware.
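For a concrete idea of the kind of operation those RT units accelerate, here's a sketch of the classic slab-method ray/AABB test that acceleration-structure traversal performs constantly (illustrative Python; the function name and the example box are made up for demonstration).

```python
# Slab-method ray vs. axis-aligned bounding box (AABB) test -- the kind of
# primitive operation RT cores run in fixed-function hardware during BVH
# traversal, instead of burning general-purpose shader cycles on it.
def ray_aabb_intersect(origin, inv_dir, box_min, box_max):
    """origin: ray origin, inv_dir: componentwise 1/direction (precomputed),
    box_min/box_max: corners of the AABB. Returns True if the ray hits the box."""
    t_near, t_far = 0.0, float("inf")
    for o, inv_d, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv_d
        t2 = (hi - o) * inv_d
        if t1 > t2:
            t1, t2 = t2, t1          # order the slab entry/exit distances
        t_near = max(t_near, t1)     # latest entry across all three slabs
        t_far = min(t_far, t2)       # earliest exit across all three slabs
        if t_near > t_far:
            return False             # ray misses this box, skip the whole subtree
    return True

# Example: a ray from the origin along +X against a unit box centered at (5, 0, 0).
ray_origin = (0.0, 0.0, 0.0)
ray_dir = (1.0, 1e-9, 1e-9)          # tiny components avoid division by zero
inv_dir = tuple(1.0 / d for d in ray_dir)
print(ray_aabb_intersect(ray_origin, inv_dir, (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))  # True
```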
AlphaFlySwatter@reddit
HDR is deffo the greater visual leap vs. RT.
And it comes at far less performance cost.
tukatu0@reddit
That's if the game actually offers true HDR, rather than some sh*** "turn up the brightness" HDR. If it's the latter, then it's more the monitor upgrade doing the work than HDR itself.
OutrageousDress@reddit
But not without a money cost - while HDR works on basically any GPU on the market it still does require splurging on a proper HDR display to make it look good, and midrange HDR monitor prices are comparable to midrange GPUs. At least the good news (or bad news depending on how you look at it) is that the best HDR monitors on the market right now are much cheaper than the best GPU on the market.
moofunk@reddit
It's nigh impossible to get to adequate real time raytracing without spending a few iterations on the hardware first. In 2010, it would have been argued that the hardware should be maybe 500x faster, and that might have been true if there weren't such significant advances happening on the software side as well to meet in the middle for an acceptable consumer solution.
Making these advances inside the user market is always tough, but there probably aren't many other ways to do it.
tukatu0@reddit
Jensen takes that and advertises it as if they gained 500x performance in 8 years
This is where AMD needs to learn that their marketing sucks. Except they only lie when they have a sh*** product, for some odd reason
MiyaSugoi@reddit
And games with only RT based lighting, which makes development a good bit less cumbersome, will remain the few exceptions for the foreseeable future.
With a decrease in console sales and more handheld devices, Cross-Generation and handheld-compatible scaling will be a must for nearly every developer. So they'd want a non-RT based implementation in their game.
RT is the future but I personally see a legitimately widespread use being still comparatively far away.
On the plus side, DLSS2 upscaling is fantastic and hopefully the other techs can catch up somewhat in the near future.
OutrageousDress@reddit
Yeah, over the last few years I've come to the conclusion that fully RT games are now further away than they seemed in 2020. The terrible way Nvidia has been handling their GPU prices and AMD and Intel's inability to catch up, combined as you say with the PC handheld renaissance (and Microsoft announcing that they're working on their own Xbox handheld), and also just the general breakdown of Moore's Law and the explosion of AAA game budgets, imply that unless next-gen consoles are much more powerful and affordable than I now expect them to be we are probably going to be stuck with ray tracing just out of mainstream for another ten years.
Strazdas1@reddit
Games with mandatory RT are going to be ever more common. Anything using UE5 will be such for example.
Ashamed_Phase6389@reddit
Raytracing has been perfectly usable for years... in very small doses.
I know most people disagree, but the way I see it one of the best implementations of RT is what Capcom did for Resident Evil 2 Remake: Raytracing replaces SSR and AO, nothing else. This significantly improves the look of the game without completely destroying the framerate: it runs completely fine on relatively low-end cards, even AMD ones.
Not to mention Ubisoft uses Raytracing by default these days, it's not an option you can enable or disable: Avatar looks great and runs fine on mid-range computers.
But "this feature runs well on our cards from two generations ago AND on competing cards" doesn't sell 4080s and 4090s, does it? That's why Nvidia's marketing started focusing on RTGI and Path Tracing instead, which run like shit on everything. But that's exactly what people think of when they read "Raytracing."
DanaKaZ@reddit
That's also the main application I see for RT, to alleviate the shortcomings of SSR and SSAO.
Lighting can be approximated close enough, and to me simply isn't worth the cost.
Strazdas1@reddit
RT is absolutely usable across the stack for 4000 series. I have no problem using RT on my 4070 and i know people who use it on a 4060.
ryzenat0r@reddit
not natively
996forever@reddit
That’s more to do with their intentional product segmentation than tech development, tbh. The flagship at 1440p was 70% faster than the lowest end RTX Turing desktop card (2060) and the flagship Ada is 200% faster than the lowest end RTX Ada desktop card (4060).
https://www.techpowerup.com/review/nvidia-geforce-rtx-2060-founders-edition/33.html
https://www.techpowerup.com/review/asus-geforce-rtx-4060-dual-oc/32.html
Kalmer1@reddit
Damn. I thought segmentation got worse. But not **this much** worse
Zednot123@reddit
Which is perfectly in line with Pascal
Turing can't be used as a reference point for early-node generations. TU102 could not be made larger and higher performance; it was almost at the reticle limit.
jenya_@reddit
It is almost like a Catch-22 situation for Nvidia. To get the game developers on board (primary users of the ray tracing hardware) you need to have RT hardware on most/all of the cards before the software will arrive.
Old-Benefit4441@reddit
They're doing good then!
kikimaru024@reddit
How about you base your opinion on RTX 4060 performance instead of 6-year old low-end hardware?
peakbuttystuff@reddit
???
It's been usable since the 3080 12GB released. At least at 1080p native.
996forever@reddit
They specifically mentioned “across the stack”.
An x80 tier chip isn’t very relevant.
peakbuttystuff@reddit
That's 4070 speeds, which is the old 60 tier. It pretty much won't run on AMD and 60 tier cards right now.
996forever@reddit
So it’s not the full stack which is their point
Jellyfish_McSaveloy@reddit
RT is essentially the new ultra, which was never for the full stack anyway. The real benefit of the 2060 and its RT and tensor cores was DLSS
Not_Yet_Italian_1990@reddit
True! However, it's worth noting that DLSS didn't really pay off for a while after the 2060's release. The first generation of the technology was awful.
It's sorta ironic that a lot of people bought 2060s for ray tracing features they were never really able to utilize well, but it was still a pretty decent card in the long run because DLSS improved so much and allowed the card to punch way above its weight.
Jellyfish_McSaveloy@reddit
Agreed. DLSS1 was just pitiful, it looked incredibly ugly in FF15. From the looks of it though, the people of /r/hardware don't agree that RT is essentially an ultra graphics setting. Much like supersampling, it's delusional to expect it to be a setting that will be easily usable on a 60 tier card.
PorchettaM@reddit
From the start, part of the promise with RT was that it would make developers' jobs easier by superseding the past 15 years of rasterization hacks they've been dealing with. The long-term expectation was (and still is) that it would become an always-on feature, replacing what was already there, not an extra cherry on top like supersampling. Expecting it to run competently on a 60 tier card is not delusional, it's the target that needs to be met in order to keep that promise.
Jellyfish_McSaveloy@reddit
It will become an always on feature in the future, but it's still going to be computationally expensive. It's never going to be as cheap as rasterised lighting or SSR reflections. You can see it right now. You can't turn it off in Metro Exodus enhanced or in Avatar and those games are hard to run on their respective generation of 60 cards.
In the future you're just going to see compromises elsewhere to fit the performance budget of RT. For example, if developers weren't trying to fit fancy new tech into UE5 games today and instead developed games with the fidelity of Control, you could absolutely play them with all bells and whistles enabled on a 4060.
PorchettaM@reddit
The trend with AAA games has historically been, mostly for marketing reasons, to sacrifice performance as much as tolerable before sacrificing graphics. As long as compromises need to be made, you won't see developers intentionally holding back on fidelity, but you will see lower framerates.
996forever@reddit
The 2060 is the real MVP in mobile. Desktop is meh, not too bad but not amazing.
TheNiebuhr@reddit
You're slightly exaggerating. The 2060 on laptops wasn't that good, sometimes it could lose to the 1070. The majority of them came with 80 or 90 watts of board power. 1500 or low 1600 MHz on core is awfully low for any post-Maxwell GPU. Only the refreshed vBIOS with 115W was balanced with respect to the hardware and how it should have been since the beginning.
On desktops, with proper power levels, it beat the 1060 by 60 or 70%. That's almost as big as the gap between the 960 and 1060. So the 2060 is clearly the second best 60 class GPU in memory if not of all time, and you just label it "meh, not amazing"?
Not_Yet_Italian_1990@reddit
I sorta feel the same way about the 4060 on mobile, honestly.
On desktop, I don't think I'd recommend it due to the VRAM limitations and you tend to notice DLSS artifacts more easily on a bigger screen, which is bad if you need to dip below quality.
But on a smaller screen you can even get away with balanced DLSS settings sometimes. And it has frame gen. I don't know if I'd recommend it as a primary setup unless you're strapped for cash, but if you want to have a laptop that you can do some AAA gaming on when you're traveling, or whatever, it's a pretty good option, really.
996forever@reddit
The mobile 4060 seems decent because of the massive rip off that is the 4070 mobile. But with laptops, last years models get pretty good discounts so you can pick up a good 4070 mobile for around 1200.
996forever@reddit
I’m only responding to what the chain OP brought up.
Six years has passed and it’s not all that new anymore. That’s the whole point of the video and also the comment chain.
Jellyfish_McSaveloy@reddit
Did you honestly read my comment 'new ultra' and believe it refers to the length of time that RT has been out?
996forever@reddit
Yes?
Ever since RTRT came out in games, it’s been part of maximum settings. Hence “new ultra”.
It’s been six years. So it’s been six years of RTRT being the “new ultra”.
What am I missing?
Jellyfish_McSaveloy@reddit
I'll rephrase it for you:
The comment isn't about the length of time it's been out. It's that it should be considered as part of the highest graphical preset and people shouldn't expect it to run on the full stack. Do people on here really think it should have a performance footprint of, say, anisotropic filtering?
996forever@reddit
Because it’s the most popular and commonly sold tier, both in DIY and in gaming prebuild/laptops. Ten years ago you had GT740s, 730s, 640s 630s and they were not expected to perform well at the then-new features. But they were also not commonly sold in systems marketed for gaming.
Things changed in 2024. The 4060 is the best selling Ada tier, 3060 was the best selling Ampere. They were also the lowest tier desktop part during architecture cycle. So there’s only one way to put it.
The mostly commonly sold mainstream gaming GPU, despite touting of marketing material, is not ready for a feature introduced six years ago.
Jellyfish_McSaveloy@reddit
Ok? Do you think I said that the 2060 is ready for RT and that Nvidia has very responsible marketing? I agree with you, the 2060 isn't ready and never was ready for RT because it's a feature set that should be considered as part of a maximum preset. The 2060 isn't ready for RT the same way it isn't ready for 4k or for maximum raster settings in 2019.
That doesn't quite hold true for newer 60 tier cards however. You can absolutely use a 4060 and play an older game with RT.
peakbuttystuff@reddit
It runs well in 70 tier cards lol. It's mainstream
PotentialAstronaut39@reddit
Spot the problem:
4090 VS 2080 Ti = 156% faster, 118% more VRAM
4060 VS 2060 = 38% faster, 33% more VRAM
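For the VRAM half of those numbers, the deltas follow directly from the published capacities (11 GB vs 24 GB, 6 GB vs 8 GB); a quick check:

```python
# Quick check of the VRAM deltas quoted above, using the cards' published capacities.
def pct_more(new, old):
    return (new / old - 1) * 100

print(f"4090 vs 2080 Ti VRAM: {pct_more(24, 11):.0f}% more")  # ~118% (24 GB vs 11 GB)
print(f"4060 vs 2060    VRAM: {pct_more(8, 6):.0f}% more")    # ~33% (8 GB vs 6 GB)
```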
masterfultechgeek@reddit
your first comparison is a $1600-2000 card vs a $1000ish card
your second comparison is comparing two similarly priced cards.
996forever@reddit
That’s the issue of nvidia’s own price hiking.
Both are the respective flagship of their architectures.
spiderpig_spiderpig_@reddit
Charging more for a more capable product? How dare they. Much as I would love it to be cheaper, it's hardly surprising.
Banana_Joe85@reddit
Considering the 2060 was when they already raised prices one tier, it was a serious downgrade compared to the previous 1070, which already had 8 GB of VRAM, and the 2060 wasn't that much faster AFAIK.
They fixed that issue partially with the 2060 Super, which again had 8 GB. But we have been pretty stagnant in the mid range and I think we are approaching an 'Intel 4C/8T' scenario with GPUs.
Too bad that AMDs marketing team is torpedoing any chances for them to win market share in the GPU space.
Zednot123@reddit
I mean, sure it wasn't the greatest product ever.
But it was mostly criticized for being slightly slower than the 1080 (then; now it is faster). It was clearly above the 1070.
Launch performance
Banana_Joe85@reddit
Depending on scenarios, you could already back then run into VRAM issues.
For its time, the 1070 was a decent 1440p card and for 1080p it did hold up quite nicely for a good while (I only replaced mine in 2022).
Because of the 6 GB, the 2060 could end up running into issues that the 1070 did not have, depending on game and resolution. Yes, those might be edge cases, but they did happen, even if it overall was the faster card.
theholylancer@reddit
It's also the AMD HW/SW team; the lack of a viable DLSS competitor really doesn't work wonders. PSSR shows that it really needs to be addressed, and maybe by UDNA or whatever the PS6 uses it will be, but for now their lineup is gimped horribly across the entire range because of it.
They are not willing to play on price, but their lack of features sink them when they are not willing to play on price.
tukatu0@reddit
PSSR, the one that is filled with issues and has people in the PS5 subs asking why devs are taking away mode options when PSSR isn't better than FSR? That PSSR?
theholylancer@reddit
the one where Sony, a for profit org, spent an ass load of money to try and get things done their way because it seems AMD wasn't willing to add dedicated HW only upscalers.
And there are many people who are saying PSSR is better than FSR, but worse than DLSS. For my take, PSSR improves on FSR in some situations but is certainly not great in all situations.
tukatu0@reddit
Well, I'll give you that. Sony is going to improve it further from now on. Bit meaningless if it's not outright better than FSR.
At least when the PS6 comes out and PSSR is better than the DLSS of right now, they'll probably let devs just click a button to replace the old PSSR with the latest one. Not that devs can't do that with FSR, but considering they often don't even ship with the latest one.... Oh well. Guess we will enjoy PSSR in 10 years.
FinalBase7@reddit
I mean, if you go back one generation, the 2060 was 60% faster than the 1060, which was 70% faster than the 960. We've had a few exceptionally good generations of 60 class cards, but Nvidia decided that's enough.
anival024@reddit
You should be comparing to the Titan RTX, not the 2080 Ti.
The xx90 cards have replaced the Titan cards, essentially.
tukatu0@reddit
Not true. Titans are the same as the xx80 Ti: 98% of the full die.
The 4090 is neither
PotentialAstronaut39@reddit
Results aren't much better, 109% difference. ~3 times higher than the xx60.
996forever@reddit
It’s the top -102 die, regardless what they want to call it.
hackenclaw@reddit
Remember 560 Ti vs 580? 760 vs 780 Ti, 1060 vs 1080 Ti?
Miss those days when mid range wasn't low end speed.
ClassicRoc_@reddit
Even though I see the point you're trying to make, VRAM needs don't necessarily increase with what games demand in a certain time frame.
PotentialAstronaut39@reddit
The point wasn't about VRAM.
It was about the massive gap between advances in the top-end VS advances in the midrange.
ClassicRoc_@reddit
I know. What I'm trying to say is even if a graphics card is 100% more powerful than the previous generation, a next generation game might only need one or two more GB gen over gen. Generally, you wouldn't need double the graphics memory.
That being said, I do think VRAM has fallen behind slightly starting with the 20 series. At every level and every tier, graphics cards need at least two more GB than they currently have.
THXFLS@reddit
Somewhat better if you compare the actual x60 cards:
2060S VS 4060 Ti = 47% faster, 100% (or 0%) more VRAM
nailgardener@reddit
Jensen, munching on leather: What problem?
Chris00008@reddit
human leather.
MeelyMee@reddit
2060 was more of a... 1670.
FinalBase7@reddit
The 2060 was one of the largest generational leaps for a 60 class card
MeelyMee@reddit
It has a whole lot extra, but performance wise... marginally faster than a 1660 Ti. The Super/12GB bump it up a little bit more, maybe a 1680.
Njale@reddit
Their comment section is full of people saying that ray tracing is a gimmick; they found their audience.
fogoticus@reddit
As per usual. The people enjoying RT benefits and looks are mostly silent while the people who dislike it or can't use it are the loud majority.
No surprise there.
auradragon1@reddit
The loudest people are AMD buyers who want to justify not having ray tracing/DLSS.
timorous1234567890@reddit
RT is the future but it is very hardware intensive, and in a lot of cases the improvement in IQ is not worth the performance hit. That won't always be the case, and as more games start taking advantage of RT features more and more it will become a requirement.
hackenclaw@reddit
It is like anti-aliasing in the early days. AA took a major performance hit back then. It's gonna take a few more generations before RT gets to where AA is now.
timorous1234567890@reddit
Need a 9700 Pro generation to come along and make it a default-on feature rather than a "nice but costs too much FPS" feature.
LasersAndRobots@reddit
RT at native resolutions is also implausible for most hardware configs, so everything you gain image-quality wise from RT is nullified by upscaling artifacts.
DryMedicine1636@reddit
Movies and VFX are basically all "PT" now that the hardware supports it, but it used to be a rare tech demo too. It's easier to scale up a render farm than a single GPU, plus you need to support consoles as well, with an even longer release cycle.
The hardware just isn't there yet at the moment, so adoption is rather meh. Games just design around the limitation, or at least mask it with lots of techniques or just software RT.
ResponsibleJudge3172@reddit
The 1 time they have anything bad to say about AMD r/hardware comes out of the woodwork saying "see, the only guys who disagree with them must be cultists"
b_86@reddit
No, they're discussing what pretty much everybody with half a brain and not on the Nvidia kool-aid predicted 6 years ago: that by the time RT actually became the transformative experience that Nvidia was overpromising in 2019, these cards would be too obsolete to even benefit from it. And that's exactly what happened: that moment is now, and the facts are clear. And if I had to guess, a similar analysis would probably not be very favorable to the 2070 either.
haloimplant@reddit
I run 1440p and prefer 60+ fps I don't think my 2070 traced a single ray lol
b_86@reddit
Like, let's be real for a moment. The 2060, 2070, 3050, 3060 and 4050 had absolutely no business bearing the "RTX" moniker in their name, and the cards right above them in the stack *barely* do. It was always a marketing stunt to justify price increases and stagnation in raster performance per dollar.
skycake10@reddit
I see what they mean. I still have a 2080 because I played with ray tracing when it was new, thought it was neat, but never had anything jump out at me and make me feel the need to upgrade to something that could run ray tracing better.
AcceptableFold5@reddit
To me, having reflections that aren't completely broken due to SSR occlusion is worth the upgrades. Games are so much more immersive thanks to this, at least if they support RT.
Decent-Reach-9831@reddit
For the vast majority of people, yes, it is. Normal gamers aren't buying a 5090 to do path tracing, and the number of games where RT matters is tiny.
In 10 years it will be everywhere and have near zero perf impact, but right now it absolutely is a gimmick.
I say this as someone who plans to spend $2k on a 5090 for path tracing.
Jellyfish_McSaveloy@reddit
Maybe in 20 years you'll have games built around it and you can't turn it off, but expecting it to have the performance penalty of 0 is unrealistic. Much like how 4k will always be more demanding than 1080p.
Strazdas1@reddit
You already have games built around it where you can't turn it off. Not many of them so far, but they exist.
Jellyfish_McSaveloy@reddit
Apologies, I should have said the majority of games. I do expect PT to be the primary method of lighting games in 20 years.
Strazdas1@reddit
Yeah, I can agree with such an assessment, but until then there will be some interesting techniques for ray tracing I think.
account312@reddit
Well, we don't really talk about textures as having a performance penalty. They're just part of graphics.
Jellyfish_McSaveloy@reddit
The limiter for texture quality is VRAM, assuming we don't get AI tech in the future that can upscale low quality textures on the fly into 8K or something. Until that day, the more textures you have at higher quality, the more VRAM you need. RT will continue to get computationally expensive in the future because it scales with environmental quality and complexity. You can path trace Quake because it's a basic environment; it's an entirely different story trying to do so with GTA6 for instance.
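As a rough illustration of why texture quality is mostly a VRAM problem, here's back-of-the-envelope math for a single texture at several resolutions (assuming RGBA8 uncompressed, roughly 1 byte/texel block compression, and a full mip chain; real games vary a lot):

```python
# Back-of-the-envelope VRAM cost of one texture at different resolutions.
# Assumes 4 bytes/texel uncompressed (RGBA8), ~1 byte/texel block-compressed
# (BC7-class), plus ~1/3 extra for the full mipmap chain. Real games vary widely.
def texture_mib(side, bytes_per_texel, mips=True):
    base = side * side * bytes_per_texel
    total = base * 4 / 3 if mips else base   # mip chain adds ~1/3 on top
    return total / (1024 ** 2)

for side in (1024, 2048, 4096, 8192):
    print(f"{side}x{side}: {texture_mib(side, 4):7.1f} MiB raw, "
          f"{texture_mib(side, 1):6.1f} MiB block-compressed")
```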
account312@reddit
And path tracing will stop getting more expensive once we have all the complex scattering behavior, scene detail, and ray count we could possibly want at 8K.
Jellyfish_McSaveloy@reddit
Yeah, it'll stop getting expensive once graphical fidelity has reached its maximum and we keep getting more powerful GPUs. I'm not quite sure I'll see that in my lifetime though.
account312@reddit
I think it's not happening on silicon unless something major changes. We're running out of room at the bottom, and what's left is weird and ornery.
Strazdas1@reddit
Well you don't need to do path tracing to do ray tracing, so you don't need to buy a fictional 5090 for it.
IronLordSamus@reddit
Because it is a gimmick like 3d tvs were.
Framed-Photo@reddit
Right now, outside a handful of good implementations, it's largely not worth the performance cost.
If that makes it a gimmick to some, then that's totally fair.
There's no point in judging things by what they could be in the future, if it's not that right now.
JensensJohnson@reddit
Gotta keep those patreon subscribers happy
SecreteMoistMucus@reddit
No it isn't. This reads a lot like a pathetic attempt to discredit something you don't like.
ToTTen_Tranz@reddit
Wait until you see Nintendo promoting raytracing capabilities on the Switch 2, with a fraction of the RTX2060's performance
Meekois@reddit
If any company can make beautiful graphics with shit hardware, it's Nintendo.
surf_greatriver_v4@reddit
Have you seen any switch games in the past 2 years? Developers for the switch are struggling hard to keep up
ToTTen_Tranz@reddit
They're not graphics that would traditionally gain a lot from raytracing, though.
Thingreenveil313@reddit
Yet we have Minecraft RTX, Quake 2 RTX, and Fortnite RTX. I don't really think it's the "graphics" that benefit from it, but the game.
BrkoenEngilsh@reddit
But aren't minecraft and quake path traced? That level of ray tracing is basically equivalent to AAA games.
Thingreenveil313@reddit
That's exactly my point. No specific style of graphics benefits from any real-time lighting, it's dependent on what the game is. For example, a platformer probably isn't going to benefit from real-time lighting, but it's generally seen as an improvement in shooters/first person games because of how you experience the game (perspective, etc)
BrkoenEngilsh@reddit
But my point is that they only look good with real time lighting because it's using the most advanced ray tracing. The budget ray tracing options Nintendo will likely use aren't going to be similar to minecraft or quake.
tukatu0@reddit
Who's to say they aren't, if they run at 360p ¯\_(ツ)_/¯
They can just sell DS games remastered with path tracing. The art design shouldn't have issues since it was made with 150p in mind or whatever. Same goes for 3DS games
Earthborn92@reddit
They will use RT in gameplay.
I can imagine a new Zelda with dungeons using light/shadow puzzles or mirror reflection puzzles.
fogoticus@reddit
Major difference is switch is gonna be a console. Companies optimize for consoles greatly. And switch 2 will also have games starting development with the latest version of DLSS and framegen. So upscaling on switch won't look like total ass cheeks.
tukatu0@reddit
And even if they don't optimize it, games will probably be upscaled just fine from 480p or 540p, since the Series S will run at sub 800p anyways
Darksider123@reddit
Upscaled from 240p /s
Jokes aside, I can actually see them doing something similar, since some of their games run at 720p/30hz
WhoTheHeckKnowsWhy@reddit
Nintendo sourced their past handheld SoCs from the market flop fire sale bin; I just don't see them being bothered with raytracing on Switch 2, and if it can do it, it will be next to useless like on the Steam Deck.
In fact that's the performance I expect for the Switch 2, as an older gen SoC like that would still give them good margins on each console sold and a good bump over the original Switch performance for under $400.
As a related aside: god, those "impossible ports" for the Switch DF loves to wank over are just such trash when you actually play them. Even the Skyrim port on Switch, which mostly runs well, has bad framerate dips in certain outdoor areas that made me put it down and return the cart to our country's version of Walmart.
tukatu0@reddit
I do see them. They don't really care if their games run at 540p. I mean sh**, the Series S often runs things at sub 720p. So ¯\_(ツ)_/¯
A Mario Bros. Switch special running at 540p with dense textures seems possible. Like Red Dead Redemption 2 density.
OutrageousDress@reddit
Switch 2 would be completely terrible at ray tracing if developers approached it the way they do on PC (arguably current consoles are also terrible at ray tracing because developers are approaching it the way they do on PC, and that's a mistake). There are ways to make good use of ray tracing that run very well on low-power GPUs with good resolutions and framerates if you're not greedy and don't try to make RT do more than you have power to do. There are actually multiple games in this Hardware Unboxed video that demonstrate this exact thing, running smoothly at (DLSS-Q) 1080p60 on a 2060 with great visual results. It can be done.
I trust that if Nintendo wanted to add ray tracing to their games in some capacity they would have no trouble doing it. It wouldn't be as impressive as, say, path traced Cyberpunk, but it would run smooth and look pleasant and sharp. It's when developers' eyes get bigger than their stomachs that games get bogged down with overengineered RT solutions that promise heaven and earth and run like crap.
zerinho6@reddit
I don't think it will be impossible or too weak for that, given its low res and DLSS already being so much better now than the starting version.
ResponsibleJudge3172@reddit
And a fraction of the pixel count on a small screen most players use
HyruleanKnight37@reddit
To this day, 60 class buyers are still paying for something they cannot fully utilize. The 4060 is more than capable of RT in modern games but runs out of memory at anything higher than native 1080p.
But but DLSS...
FinalBase7@reddit
Yes, but DLSS. It's a bigger selling point than RT.
I don't understand why people scrutinized the 4060 so much. It's not a great product, but AMD released the 7600 with the same 8GB 128-bit bus bullshit for $270. It was slightly slower than a 4060, so raster performance wasn't even an argument for it, but it mostly flew under the radar. Then AMD released a 16GB model with the same 128-bit bus at a massive price increase, copying Nvidia's 4060 Ti 16GB one-to-one, but again it flew under the radar.
HyruleanKnight37@reddit
Agreed, DLSS is indeed a bigger selling point than RT. Though I don't agree with some people's opinion about the 4060 being incapable of RT in games, because it absolutely is. It's only when trying to use RT with high-quality textures that you start running out of VRAM, and God forbid if you want to use FG.
Some people mitigate this by using DLSS, which logically reduces memory consumption since the input resolution is lower than 1080p. This is where my first complaint arises: DLSS upscaled from sub-1080p doesn't look great. There may be a few examples that do, but most don't. As a buyer of a current-gen $300 card I should atleast expect to be able to run my games at 1080p at the minimum, with the understanding that I could use DLSS to output a higher resolution if needed. If I need to use DLSS as a crutch just to get a 1080p output - well, that's not very different from using FG to achieve 60 fps, is it?
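For reference, here's what the commonly documented DLSS per-axis scale factors work out to for a 1080p output (illustrative; individual titles can override these values):

```python
# Internal render resolution for a 1920x1080 output at the usual DLSS scale factors
# (per-axis scaling; figures as commonly documented, games may override them).
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 1 / 2, "Ultra Performance": 1 / 3}

out_w, out_h = 1920, 1080
for mode, scale in MODES.items():
    print(f"{mode:>17}: {round(out_w * scale)}x{round(out_h * scale)}")
# Quality ends up at 1280x720, Performance at 960x540 -- well below native 1080p.
```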
Now, let's talk about AMD.
Whether the 4060 is faster than the 7600, I don't remember, but I do know they're close enough that the raster difference doesn't matter. The $30 price difference isn't anything to write home about either, and both having the same 8GB frame buffer means the 7600 has no merit over the 4060. In a world where they cost $270 and $300, nobody in their right mind should buy the 7600. This much we both agree on, I assume.
Except the 7600 goes for $250-260, often $240 even. As for the 4060, I don't think I've ever seen it go below $285, and usually hovers around $300, sometimes as high as $320. In an extreme scenario ($80 difference), there is some merit to buying the 7600, but that's as far as it goes.
Keeping this price difference in mind, people understandably do not expect great performance on the 7600. That is why nobody talks about it. And it's not just that - AMD is consistently worse than Nvidia in RT applications, which is why the lower price is justified, is it not? Why should we then shit on the 7600?
Next.
7600XT launched at $60 more ($330) than the base 7600's launch price ($270), though I'd argue the actual difference at the newer card's launch was greater than $60 because 7600s were already going for $250. Fair argument.
But prices have fallen sharply since, and now you can actually find a 7600XT for as low as $280, and there are several models at $300. How much is the 4060Ti 16GB again? $500 at launch, and today it's $450 at the minimum, though most are still being sold at $500. Yes, it is 15-17% faster, with the same amount of memory, for 60% more money. But hey, let's shit on the 7600XT too.
Admittedly, 4060Ti 16GB solves one glaring issue with the 4060Ti's insufficient VRAM by making RT and FG actually usable. 7600XT can't do shit even with all the memory in the world, so points for Nvidia. But for $170-200 more? I don't know about that, dude. I'd rather get a 7800XT at that price - yes, it's still worse in RT and FG, but it's almost 40% faster otherwise.
Final word.
Ultra/max settings is dumb, I already said it once. But you know what isn't? High-quality textures. They have no bearing on the performance on modern GPUs, so as long as you have the memory for it, you're good. I've always been an advocate for lowering graphics settings and raising texture quality because it improves your overall experience 100% of the time. Cards like the 4060 and 4060Ti fail in this regard because even with relatively modest settings, you can not use high texture quality and RT/FG at the same time in many games. You have to smear DLSS to get a stutter-free experience, at 1080p output no less. That is a bad deal at $300.
7600 doesn't suffer from the same fate despite having similar specifications because it is cheaper and has unusable RT performance, which is within expectations.
No_Guarantee7841@reddit
I have a 4070 and at 1080p, path tracing is barely on the playable side WITH dlss quality. Not sure where those delusions about even 1080p native come from regarding xx60 series gpus, when 4090 is the only card that does path tracing 60 fps on native 1080p with 4080 being close enough.
HyruleanKnight37@reddit
I said RT, not PT. I know PT is significantly more demanding than RT.
The 4060 and 4060 Ti are more than capable of RT at native 1080p in most RT games if you lower texture quality to remove that VRAM capacity bottleneck. Heck, even the 3060 12GB is capable enough at modest settings. My RX 6800, which is on par with the 3060 12GB in RT applications, can easily get ~60 fps at native 1080p in CP77 at max settings + max RT, so I know for a fact I'm not delusional.
I feel like you're making your judgement on RT capability on a select few games that are known to be absolute resource hogs, like Black Myth: Wukong. For every such game, there are many, many more that don't need a 4070 to run at native 1080p, let alone a 4090.
No_Guarantee7841@reddit
In Cyberpunk with the RT Ultra preset (not PT) the 4060 barely does 60fps with DLSS Quality at 1080p. And no, it's not VRAM restricted at those settings. https://www.youtube.com/watch?v=XFB3ea-7T5c&t=405s
GARGEAN@reddit
And going below Ultra settings on lowest card in the stack is forbidden by religion?
BighatNucase@reddit
The entitlement of modern PC gamers is actually disgusting. It used to be that Ultra was seen as this setting that should crush modern hardware and that any game which could be run on ultra was just an example of consoles dumbing down games; now if you can't run a game at Ultra 4k and get a consistent 120fps then that's a sign of hardware being shit/the game being unoptimised.
Morningst4r@reddit
I think it comes from the PS4/XB1 era where graphics settings didn't really scale up that far in most cases so ultra was more like medium and everything ran well on midrange hardware.
Strazdas1@reddit
Ultra settings should exist for future hardware to run them when you replay the game years later. A midrange card like 4070 should be expected to play games on medium settings.
BighatNucase@reddit
My issue isn't even really about whether Ultra is always "future proofing" but that people only judge based off Ultra settings.
HyruleanKnight37@reddit
Who said otherwise?
Strazdas1@reddit
Yes and? That is absolutely what one should expect when buying a 4060.
No_Guarantee7841@reddit
Don't understand what you mean by what one should expect.
Strazdas1@reddit
You buy the lowest tier card, put on ultra settings that are supposed to be meant for future GPUs, and get playable framerates. That I would call better than expected performance.
No_Guarantee7841@reddit
With that I can agree.
Raikaru@reddit
Isn't that what Frame Generation is for?
No_Guarantee7841@reddit
Frame generation input latency is good only if you are getting about 100fps or more.
StickiStickman@reddit
Frame Generation with Reflex is literally lower latency than without.
No_Guarantee7841@reddit
You are running reflex anyways even without fg so not sure what you are talking about.
Raikaru@reddit
Sure but is playing without reflex a bad experience? If not, then why would a lower input latency than that be a worse experience?
No_Guarantee7841@reddit
Its certainly a worse experience compared to having it enabled, so really no reason not to.
Raikaru@reddit
You kinda completely dodged the question there
No_Guarantee7841@reddit
Dunno, imo you are dodging the question implied by this argument about why you would want to have it disabled instead. That's like having a monitor with a 144Hz refresh rate and arguing about running it at 60Hz only because it's not a bad experience. Makes zero sense.
Raikaru@reddit
I asked the question first and YOU dodged first. Also that comparison makes literally no sense. 60hz has 0 benefits. Frame Generation has benefits over just Reflex enabled (motion smoothness). AMD GPUs can't enable Reflex at all. By this logic aren't AMD GPUs straight up inferior at similar FPS because they'll have worse input latency and therefore there's really not a reason to get them?
No_Guarantee7841@reddit
60Hz has no benefits... alright hotshot, please do tell what benefits not enabling Reflex has.. Btw, AMD has Anti-Lag for latency reduction so not sure what you're talking about... You seem to be clueless.
Raikaru@reddit
Anti Lag is not the equivalent of Reflex lol
https://www.youtube.com/watch?v=K_k1mjDeVEo
I never said not enabling reflex has benefits. I said if playing before reflex was a thing was fine why would something even better be bad?
No_Guarantee7841@reddit
People were running their pcs on hdds before too. So by your logic, everyone should stick on hdds forever because there weren't better choices in the past...
Raikaru@reddit
Is there a reason for you to keep dodging my question? Why not just answer it? Will it hurt you?
No_Guarantee7841@reddit
I already answered it. Its bad compared to having input latency technologies enabled.
Raikaru@reddit
What makes Frame Generation bad compared to just Reflex?
No_Guarantee7841@reddit
Its not bad as long as you enable it with at least a 60 fps base.
Raikaru@reddit
I've used frame generation and i completely disagree. Used it with 60 FPS before and the added motion fluidity in singleplayer titles is 100% worth it.
No_Guarantee7841@reddit
I used it too in Cyberpunk with the 4070 and when framerate drops below 90fps, input delay becomes very noticeable. At high settings with RT and dlss enabled, if you already have 60fps without FG, you should be getting around 100fps after with FG so we are not saying anything different in that regard.
HyruleanKnight37@reddit
That makes sense, he's getting 36 fps average at native 1080p Ultra + max RT. The 3060 12GB is 16.7% slower on paper, and also gets about 30 fps at the same settings, which is exactly 16.7% lower. At any rate, both are playable, unlike my AMD card which shits bricks when I raise the native resolution to 1440p.
As for VRAM usage, you're comparing 1080p DLSS Quality vs native 1080p. Go back a few seconds and you'll see at native 1080p Ultra + max RT he's basically pinned at 8GB.
StickiStickman@reddit
Obviously path tracing is more demanding, but it's perfectly playable with Frame Gen
No_Guarantee7841@reddit
Please do tell, what input latency are you getting with your perfectly playable frame gen.
Strazdas1@reddit
20 ms input latency is fine in a game that isn't a twitch shooter.
No_Guarantee7841@reddit
I agree that 20ms is fine input latency. In Cyberpunk you need around 90-100 fps without FG to achieve that though.
Strazdas1@reddit
How so? At 100 fps, assuming no other bottleneck, you'd get 10 ms of input lag in the worst case.
It's useless to consider other sources of input lag here as they will vary greatly. For example, people using a wireless controller to play the game will have >50ms input lag just from the controller sending the signal alone, so they would never achieve below 20 ms input lag.
The only portion worth talking about here is the one caused by lowered framerates, so yes, frametimes are what's important.
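The frametime part of that argument is simple arithmetic; a quick sketch of the render-interval contribution at different base framerates (ignoring the other latency sources mentioned above):

```python
# Frame-time contribution to input lag at a given base (pre-frame-gen) framerate.
# This is only the render-interval part; display, OS, and peripheral latency add on top.
def frametime_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 90, 100, 120):
    print(f"{fps:3d} fps -> {frametime_ms(fps):5.1f} ms per frame")
```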
No_Guarantee7841@reddit
The Nvidia overlay measures it so it's accurate. And even if it's not accurate for whatever reason, dropping below 90fps with FG enabled is really noticeable input latency wise.
Strazdas1@reddit
A 4060 is capable of ray tracing at proper settings and resolutions.
potato_panda-@reddit (OP)
Another example of why you should never buy hardware based on future promises.
DLSS was a win, RTX was meh
Azzcrakbandit@reddit
I bought a 3060 because the 12GB made me feel comfortable trusting its "future proofing." DLSS is just something I use to get at least 120fps in competitive games.
MeelyMee@reddit
It's weird that Nvidia recognised this with 2060 and 3060 with their 12GB models and then dropped it for 4060. Some people don't want the best of everything but a big ticket spec like 12GB VRAM is appealing.
Have found the 3060 to be an excellent, versatile card because of that spec.
Bored_Amalgamation@reddit
TBF, there have been comparisons between the 16GB and 8GB versions of the 4060 Ti, and there's barely a noticeable change in performance.
Affectionate_Rub_589@reddit
8GB cards have texture issues in games like Alan Wake 2
MeelyMee@reddit
Yeah, there won't be unless you're using more than 8GB, just like the 3060. Though I'd guess it's a good card for things that aren't gaming.
Bored_Amalgamation@reddit
What games are using more than 8 that makes a significant difference in performance?
iDontSeedMyTorrents@reddit
You're in luck, because HUB also did a video on this.
https://www.youtube.com/watch?v=ecvuRvR8Uls
Azzcrakbandit@reddit
Did you miss the whole comment where they weren't talking about gaming?
MeelyMee@reddit
Dunno
Bored_Amalgamation@reddit
So you don't really know if it makes a difference or not.
MeelyMee@reddit
Of course I do, I'm not talking about gaming.
Azzcrakbandit@reddit
Yeah, I'm in a weird spot where I have a ryzen 7900x and need Cuda support. I can't afford the go from 2x 16gb ddr5 to 2x 32gb, so the extra 12gb while running rebar prevents my pc from stuttering when my ram gets maxed out.
Hawke64@reddit
2x 24gb is the perfect sweet spot for 12 cores
Azzcrakbandit@reddit
Depends on what I'm doing. I make 3d scans of objects, and when using 0.1mm accuracy my ram maxes out very, very quickly. My two choices are getting another 2x 16gb modules for $100 and taking the performance hit of running 4 sticks of ddr5, or spending $200 and getting 2x 32gb modules.
Tuxhorn@reddit
Nvidia can get away with it.
CUDA is the name of the game, and being stingy on VRAM is an easy way to force productivity setups to spend more.
hackenclaw@reddit
VRAM is always gonna be future proofing if you intend to keep the card for as long as it lasts.
Look at those 1GB Radeon 7790/7850, 2GB 680, 3GB 1060, 4GB RX 470/480 cards. Those owners probably wish they'd paid a little extra for double the VRAM.
imaginary_num6er@reddit
I hope AMD learns this lesson from the whole RDNA3 Navi 31 "aging like wine" and Intel Arrow Lake and Alchemist
teutorix_aleria@reddit
DLSS was a win for Nvidia, not for 20 series buyers, since they got locked out of newer versions of it somewhat arbitrarily. Great way to sell 30, 40 and 50 series cards though.
Electrical_Zebra8347@reddit
That's because the hardware in Turing cards is literally worse than in the newer cards, it's not arbitrary, same as how cards before Ada and RDNA3 can't do AV1 encoding since those cards lack the hardware encoders for it. DLSS FG as it exists today is not possible on Turing cards due to the slower optical flow accelerators; they're slower and they also don't support 2x2 grid sizes. Sure, it's possible Nvidia could have done it the way AMD did it or made a worse frame gen version for Turing, but as we've seen, doing frame gen with async compute comes with its own challenges, i.e. if a game requires a lot of async compute or a card doesn't have enough async compute to run the game and FG then performance will suffer.
Morningst4r@reddit
The only DLSS the 20 series can't do is frame gen because it doesn't have the hardware, the 30 is in the same boat. Turing can use ray reconstruction.
Strazdas1@reddit
I dont think you can call "literally not having hardware to run it" arbitrary locking.
ResponsibleJudge3172@reddit
The 2060 has outlived the RX 5600 XT though
Strazdas1@reddit
I enjoy both DLSS and RTX. The promises were fulfilled.
fogoticus@reddit
Raytracing was meh*
RTX is technically DLSS & Framegen & other nvidia exclusive abilities.
Storm_treize@reddit
DLSS was a win, because RTX was meh
only_r3ad_the_titl3@reddit
AMD unboxed is back. Why are they taking the 2nd worst RT card and judging a whole technology based on it?
Robot1me@reddit
Was there a point where they were unreasonably biased, or why that name?
SecreteMoistMucus@reddit
Some Nvidia fans don't like it when people objectively analyse Nvidia's selling points. HUB does that better than anyone else.
HyruleanKnight37@reddit
It's pretty clear you didn't watch the video at all. Anything you say in criticism is therefore unintelligible background noise.
SignalButterscotch73@reddit
Because it's the biggest selling first gen RT card. The vast majority of people buy the 60 class cards and the 2060 was heavily advertised as RTX. You do know what the RT in RTX represents, yeah?
only_r3ad_the_titl3@reddit
you could buy a 1660 Ti as well.
996forever@reddit
That’s not officially labelled an RTX card. Its support of ray tracing is equivalent to 1060-1080ti.
only_r3ad_the_titl3@reddit
sure but there were other options if you did not yet want RT
996forever@reddit
This video is specifically about RT. What are you talking about?
SignalButterscotch73@reddit
Was it advertised as a raytracing card? No.
Does it have tensor cores? Y'know, the big new thing about the 20 series that supposedly made RT possible? No.
It was sold as a GTX, not an RTX.
If you're going to be an idiot, please at least be an informed idiot.
Yes, the 1660 can do raytracing, so can anything with the right software. It doesn't make them RT cards.
potato_panda-@reddit (OP)
They're not judging the whole technology on it. They have 2 other videos judging the merits of the tech where they were pretty positive about it overall. This video is just to judge the merits of buying a 2060 solely for the promise of RTX
kaden-99@reddit
I had it for a while. It was good enough to play Control and Metro Exodus with pretty good looking RT but anything other than those sucked dick. Even Quake RTX didn't run well.
TalkWithYourWallet@reddit
Nice to see them test with ultra and low raster settings before enabling RT
Ultra raster is one of my biggest issues with RT benchmarking. There are some raster settings (e.g. shadows) which have larger hits than RT
IMO, the optimal way to test RT is with optimized raster + RT settings (Like what DF use). But that would be too much work for one video
Morningst4r@reddit
Ultra shadows -30% fps for barely any effect: I sleep
RT reflections -30% fps to transform the scene: nvidia stole my frames!
tukatu0@reddit
Yeah and they charge you more money for the ultra option?
conquer69@reddit
Tim did a lot on his first video of this series. RT ambient occlusion costs 3%? Unacceptable!
Meanwhile he was running ultra settings which cost way more than 3% and look basically the same.
SecreteMoistMucus@reddit
Please quote or link the part of the video where he said 3% cost was unacceptable.
conquer69@reddit
First video where he thinks Dead Space's ray traced AO isn't worth it. https://youtu.be/DBNH0NyN8K8?t=243
The performance cost in the follow up video https://youtu.be/qTeKzJsoL3k?t=254
It's a minimal performance cost in line with all the other superfluous rasterized settings that gamers love to max out for no reason.
The worst offender though is his take on RE4 Remake https://youtu.be/DBNH0NyN8K8?t=281
That game is notorious for dog shit screen space reflections and the RT fixes it. He also complains about the RT removing incorrect screen space reflections for some reason.
Performance cost? 2%. But he still says it isn't worth it. https://youtu.be/qTeKzJsoL3k?t=296
SecreteMoistMucus@reddit
This is a lie. He doesn't say it isn't worth it, he says it makes no noticeable difference.
Completely different kind of lie this time; that reflection is not incorrect. Boat surfaces are somewhat reflective, and you would certainly see an object's reflection when it's that close. Source: I've seen a boat before.
conquer69@reddit
If the ray tracer doesn't cast a reflection, then there shouldn't be a reflection there. Screen space reflections are never accurate.
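To make that concrete, here's a minimal, illustrative sketch of a screen-space reflection march in plain Python (the function name, buffer layout and step budget are all invented for the example, not taken from any engine): the ray can only sample pixels that already exist in the frame, so anything off-screen or hidden behind closer geometry simply has nothing to reflect.

```python
# Minimal, illustrative screen-space reflection march in plain Python.
# All names, the buffer layout and the step budget are made up for the example;
# this is not any engine's actual SSR code.
def trace_ssr(origin, direction, depth_buffer, max_steps=64, step_size=0.01):
    """origin/direction are screen-space (u, v, depth); u and v in [0, 1]."""
    u, v, d = origin
    du, dv, dd = direction
    height = len(depth_buffer)
    width = len(depth_buffer[0])
    for _ in range(max_steps):
        u, v, d = u + du * step_size, v + dv * step_size, d + dd * step_size
        # The ray left the visible frame: SSR has nothing to sample there,
        # even if reflection-worthy geometry exists in the world. This is why
        # SSR reflections pop in and out as things leave the screen.
        if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
            return None
        # Crude hit test against the stored depth at this pixel.
        px = min(int(u * width), width - 1)
        py = min(int(v * height), height - 1)
        if d >= depth_buffer[py][px]:
            return (u, v)  # reflect whatever colour the frame already has here
    return None  # no hit within the step budget
```

A ray-traced reflection, by contrast, intersects actual scene geometry, so it doesn't care whether the reflected object is visible on screen - which is why SSR and RT can legitimately disagree about whether a reflection should be there at all.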
dparks1234@reddit
RT is ultimately just a graphics setting like any other. Each game needs to be tuned differently to find the right balance. Sometimes low with RT on looks better than medium with no RT.
dampflokfreund@reddit
Yeah I was pleasantly surprised by the video. He gave it a fair chance.
Icynrvna@reddit
It's usually the same story when a new gfx tech is announced. Only the top tier card can run the new gfx setting at a reasonable fps. The low end cards are more of a cash grab by having these features.
The 5200 can't run DX9 well, even the tech demos sucked.
vanBraunscher@reddit
I don't know, one of the new hotnesses before raytracing was tessellation. And while weaker cards really did struggle with it at first and adoption took a while, only a couple of years later it had become an established standard and almost every new card handled it just fine. Today it isn't even talked about anymore and just silently sits right beside all the other rendering techniques. As it should be.
Can't say the same for raytracing though. Almost seven years and three GPU generations later it's still a massive performance concern, and the market is sharply divided by GPU tier, brand, and even resolution. Also, more often than not, real path tracing falls under the deluxe++ option and most consumers have to be content with somewhat prettier shadows and reflections.
So no, I wouldn't call the current situation business as usual.
StickiStickman@reddit
That's because tessellation itself is barely used anymore. You either just do some fancy shader tricks with parallax occlusion or have something like Nanite.
OutrageousDress@reddit
This is incorrect. Just about every snow and mud track effect you've seen in the last 5 years has used tessellation - prior to that they used fancy shader tricks (mostly POM, yeah). Nanite on the whole is actually still uncommon in games, even games using UE5 don't always use Nanite. Actually mesh shading will eventually take over all the tasks that tessellation is currently used for, and it's much more powerful and flexible than tessellation, but we're not there yet. Kind of like with ray tracing really.
doneandtired2014@reddit
It's actually used pretty widely in a lot of games to round off curved or "organic" meshes, or to save a bit of time by giving a damn close approximation of what an artist intended without having to massively compromise the asset geometry for performance's sake or spend more time cooking up a convincing normal map.
Developers are just a bit more conservative in where they apply it and to what degree.
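As a rough illustration of the "to what degree" part: tessellation factors are usually driven by something like camera distance or screen-space edge size, so the extra triangles only get spent where they're visible. A toy sketch in Python (the constants and names are invented for the example, not from any real engine):

```python
import math

def tessellation_factor(edge_midpoint, camera_pos, max_factor=64.0, ref_distance=5.0):
    """Toy distance-based tessellation factor: heavy subdivision up close,
    almost none far away. Constants are invented for the example."""
    dist = math.dist(edge_midpoint, camera_pos)
    factor = max_factor * ref_distance / max(dist, 1e-6)
    return max(1.0, min(max_factor, factor))

# e.g. an edge 5 units away gets the full 64x, one 80 units away only gets 4x
print(tessellation_factor((0, 0, 5), (0, 0, 0)))   # 64.0
print(tessellation_factor((0, 0, 80), (0, 0, 0)))  # 4.0
```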
vanBraunscher@reddit
My point still stands though. Earlier resource-intensive techniques had their footprint made manageable far quicker than what's happening with ray tracing today. Five years after the first heavily tessellated games came out, nobody would have recommended that someone with a new gaming rig turn down tessellation quality first when hitting fps drops, because even midrange cards had stopped breaking a sweat over it.
Today, ray tracing heavily impacts performance even on upper-midrange GPUs, and good luck getting path tracing to deliver acceptable frame rates at higher resolutions on anything but the top end cards.
And no, dropping to 1080p and praying that some upscaler will save your hide isn't much of a gotcha either. Cause in old money that would have been the equivalent of going down to 720p and deactivating AA. Hardly a satisfying outcome.
The reasons why it is that way are numerous, and not every single one will be cuz NVIDIA be greedy lol. But to say it's like it has always been, nothing to see here, kids these days etc. is certainly not warranted.
account312@reddit
We're still probably more than an order of magnitude short of consumer GPU performance really being up to the task of real-time path tracing a AAA game at consistently good frame rates without having to faff about with accumulating rays between frames and such. And it really needs to be something consoles can do before it's a serious target for most of the industry.
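For anyone wondering what "accumulating rays between frames" means in practice: the rough idea is a per-pixel exponential moving average, where each frame's noisy low-sample result is blended into a running history instead of being displayed raw. A hand-wavy Python sketch (names and blend factor are invented; real renderers also reproject the history and reject it on disocclusion, which is skipped here):

```python
def accumulate(history, new_sample, alpha=0.1):
    """Blend this frame's noisy per-pixel sample into the running history.
    Lower alpha = smoother result but more ghosting/lag."""
    if history is None:  # first frame, or history was rejected
        return new_sample
    return tuple(h * (1.0 - alpha) + s * alpha for h, s in zip(history, new_sample))

# e.g. three noisy frames for one pixel being blended into a stable colour
pixel = None
for sample in [(0.9, 0.1, 0.1), (0.2, 0.2, 0.2), (0.5, 0.4, 0.3)]:
    pixel = accumulate(pixel, sample)
print(pixel)
```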
vanBraunscher@reddit
Agreed. I would go even further and brazenly claim that I don't see this happening until the triple-A industry drops baked-in lighting altogether.
But looking at the state of the industry rn, this would be an eternity and a half off.
And would probably necessitate the vestige of a competitor that is AMD to get their shit together and/or NVIDIA somehow be willing to ease up on their proprietary semi-monopoly. And fat chance for either of these.
account312@reddit
I think they'll probably drop baked lighting as soon as full tracing is a reasonable option for the lowest graphics setting they aim to support. Path tracing is just better and simpler. It's still too expensive though.
Strazdas1@reddit
It took a while to get tessellation going. Remember when we had issues with The Witcher 3? Despite tessellation having been in use for over 5 years by then, only the top tier cards could run it at the desired settings.
vanBraunscher@reddit
Can't say I've noticed this back then. Had a GTX 970 at the time, so at the lower end of the enthusiast bracket, and had absolutely no problems. Even with its infamously stunted VRAM.
But granted, I picked the game up around late 2016, chances are they had already sufficiently patched it up by then.
Are you sure that was because of tessellation? I think I recall that Nvidia HairWorks, ambient occlusion and ultra shadows (as always) were the culprits most talked about.
nlight@reddit
Tessellation is practically dead - it never really took off, the performance hit is not worth it, and using it triggers some slow paths in the drivers which never got optimized. As far as I'm aware none of the current gen engines support or use it.
Strazdas1@reddit
It's been replaced by shader deformation for the most part, but it was sound in theory.
QuinQuix@reddit
The 5900 couldn't run it all that well; I regretted getting it over the 9800 Pro.
bloodem@reddit
Rich bastard... I had a GeForce 2 MX at the time, and you are complaining about your "puny" FX 5900? Damn you, damn you! :-)
m3n00bz@reddit
I had a BFG 5700 Ultra and I made sure everyone knew it lol
QuinQuix@reddit
Iirc I was 16-17 and saved up during my summer job.
I think the high end cost like 300-ish euro back then, which was crazy crazy expensive.
The FX 5900 was the second best nvidia offering but the 5950 wasn't that much better.
It did hurt more that it wasn't the best choice, but to be fair it only hurt me when I was benchmarking (which back then was all the time). Halo in practice played fine, but it notoriously ran worse on the FX series than on ATI.
Not putting you down but I wasn't rich then, at least not conventionally speaking (relatively speaking most people on reddit have plenty to be thankful for).
bloodem@reddit
Hehe, I was just joking, don't worry. :-)
I was 18 in 2003, so my resources were quite limited (especially since it was just me and my mom). I remember drooling over Anandtech articles that were comparing the FX 5900/5950 Ultra and the Radeon 9800 PRO/XT, and realizing that those GPUs basically cost more than my whole computer.
Funnily enough, now I could afford to buy multiple RTX 4090s each month, and yet I barely have time to play (maybe) 2 - 3 games per year. Such is life...
QuinQuix@reddit
I'm in that same boat.
Time is far more valuable than any other resource right now.
MrCleanRed@reddit
Lmao what? DLSS was dogshit at first.
Strazdas1@reddit
DLSS 1 was okay. Something akin to FSR 2. DLSS2 was great.
MrCleanRed@reddit
It was not akin to FSR 2. FSR 1 was much, much better than DLSS 1. DLSS 1 was a ghosting-filled mess, while FSR 1 was usable in some cases. DLSS 2 was really good, and I think after 2.1 or 2.3 it became almost indistinguishable from native.
Leo9991@reddit
Where does he say it wasn't?
MrCleanRed@reddit
He doesn't, but they didn't mislead anyone. It was bad.
jnf005@reddit
Can we really call real-time ray tracing new anymore? It was introduced to us in fall 2018 with the 2000 series cards. If we look at what we had 6 years after the 5200, even the low end Tesla-era cards should run those DX9 features well.
Tech_Itch@reddit
What's going on with people's basic reading comprehension? You're the second commenter mistaking them for saying RT is new now, when they're clearly talking about the time when the RTX 2060 was released. And both of you are getting upvoted by multiple people.
PM_ME_UR_PET_POTATO@reddit
The point is that compared to other techs, adoption is very weak to the point that it looks basically new in comparison.
Strazdas1@reddit
Anything hating on RT gets upvoted, no matter whether it's logical or not. It's just a reddit thing.
996forever@reddit
How many more years until y’all stop calling it “new”? Four more and it’ll make a decade.
Tech_Itch@reddit
It was new when the RTX 2060 was released. Which is what's being talked about.
NeroClaudius199907@reddit
Nvidia's marketing department is soo good. To this day 60 class buyers are buying for ray tracing lmao
2FastHaste@reddit
Were people really buying the 2060 for RT?
Or were they just simply buying the latest budget class GPU from Nvidia?
I reckon it was probably the latter for 99% of buyers.
SecreteMoistMucus@reddit
Yes they were. There were a lot of people saying "ray tracing is the future, you will need RTX to enjoy future games, you would be an idiot for buying an AMD card without ray tracing in current year."
MeelyMee@reddit
Yeah never encountered anyone who thinks/thought they could slam up settings on the 60 class cards.
They're just good, solid performers for reasonable settings.
NeroClaudius199907@reddit
People are even buying 60 class cards today thinking they'll get usable RT
JonWood007@reddit
I mean this should be a "no crap" moment.
It was obvious when that stuff came out that RTX was mostly for rich enthusiasts and an excuse to drive up the cost of GPUs. The goal was to make shareholders happy, not consumers. They just decided "screw you, you're gonna pay more for this feature you don't need and can't even reasonably use" and I saw this from a mile away.
This is why I bought AMD this time. As someone with a 1060 who wanted a similarly priced upgrade, was I gonna spend extra on like a 3060? Heck no. I don't care about running super spicy shadows at 30 FPS instead of 20 FPS... I wanna run stuff at 60+ and I'd never be able to.
So I just went with the side that had better raster for the money. I could get a 6600 for $200ish, a 6650 XT for $240ish, or a 3050 for $280 or a 3060 for $340 at the time. It was insane. No way was I gonna buy nvidia in a situation like that.
Even now a 6650 XT is like $230 while a 4060 is $300. Does the extra RT really matter here? Not really.
Nvidia has changed. They used to offer good value mid range cards. Now what used to be mid range is budget and they're appealing to whales. They lost me as a customer until they learn how to make decently competitive GPUs for $200-300.
Sudden_Mix9724@reddit
RTX was NEVER READY in 2018...
it always felt like a "niche technology"...
imagine if, in the world of cars, the next revolution in car technology was "shiny new paint"... that's what RTX was
balaci2@reddit
it's not really ready now either
it's more like "at least I don't see PowerPoint fps anymore"
alc4pwned@reddit
If you have a top end card then it's ready. It's very nice in Cyberpunk.
balaci2@reddit
xx90tracing
account312@reddit
No, it was a fancy new engine design that Bugatti came up with and licensed scale models of for Honda to use as hood ornaments.
Snobby_Grifter@reddit
The 2060 is a 1080 with some rt cores and dlss. It was perfectly fine for control and metro exodus. No, it's not amazing today.
But neither is any other 1080 analogous card.
cheekynakedoompaloom@reddit
Even on my 4070, raytracing is really only worth it in Minecraft. I have it on in Shadow of the Tomb Raider solely because it's such a light implementation and game that I'm well over 150fps even with it on; also I can barely tell the difference.
The one game where it does make a big visual difference is Cyberpunk's reflections, but the hit is so high that I'm undecided if I'll keep it on or not. I need to get further into the game (I'm about an hour in) with busier scenes to know.
balaci2@reddit
not the same games but the same sentiment
TheAgentOfTheNine@reddit
A gimmick feature is still a gimmick feature 6 years later. Half the blame goes to how underpowered most cards are, and the other half to implementations that are either visually negligible or too taxing for 99% of the installed GPU base
2FastHaste@reddit
If ray tracing graphics is a gimmick feature, how did it manage to become the standard in offline rendering?
What incredible revolutionary secret in computer graphics are you hiding?
balaci2@reddit
as a development tool? fucking amazing
for most people, pretty screenshots yay
Brawndo_or_Water@reddit
Why did I know it would be hardware unboxed? The drama Steves of the internet.
balaci2@reddit
so true, I'll buy more Nvidia shares
constantlymat@reddit
Would have been a good opportunity to use this video to admit they themselves misled gamers about the quality of the other key feature of the RTX 2000 series: DLSS
Hardware Unboxed continued its crusade against it long after the release of the 2.0 version, which was the breakthrough that made it a valuable feature.
aminorityofone@reddit
They address that. They even talk about using the latest version of DLSS. Or did you just get a hate boner immediately?
SignalButterscotch73@reddit
First gen DLSS was dogshit. They didn't mislead in the slightest.
They very clearly state that every version of DLSS since 2 is a feature that makes it worth looking closely at buying Nvidia. Hell, it's the feature from Nvidia that HUB praises the most.
Strazdas1@reddit
It wasn't. It was about the same quality as FSR 2. Second-gen DLSS was just so much better that the first looks like dogshit in comparison.
Jellyfish_McSaveloy@reddit
One of HUB's biggest misses was actually comparing FSR1 to DLSS2. At FSR's launch they had a video comparing upscaling in Godfall and had the gall to claim that at 4K in quality mode they were relatively comparable.
kikimaru024@reddit
WTF is Godfall? ^/s
SignalButterscotch73@reddit
FSR1 was better than DLSS1 and far more comparable to DLSS2 in image quality. Not as good for sure but comparable was a fair assessment.
Jellyfish_McSaveloy@reddit
Eh that is very debatable. You can make a very solid argument that FSR3.1 hasn't quite matched up to DLSS2.5 yet (though they are getting very close), let alone bring FSR1 into the mix.
SignalButterscotch73@reddit
Seriously go back and look at some DLSS1 reviews, it was awful.
Jellyfish_McSaveloy@reddit
Yeah it was awful, I'm not disputing that. I'm saying FSR1 is awful mate.
SignalButterscotch73@reddit
And I'm saying that FSR1 was enough better than DLSS1 that comparing it to DLSS2 was reasonable.
In DLSS terms you could call FSR1 DLSS1.85, clearly better than DLSS1, not quite as good as DLSS2.
Jellyfish_McSaveloy@reddit
I guess we can agree to disagree. I find it similar to someone saying DLSS1 is far superior to integer scaling, so you can compare it to DLSS2.
SignalButterscotch73@reddit
Fair enough. Sounds like we got into a disagreement about the use of the word 'comparable' rather than the image quality of the technologies, but English is messy enough for both uses to be correct 😅
GARGEAN@reddit
FSR 1 comparable to DLSS 2? What are you smoking?..
SignalButterscotch73@reddit
Comparable, not equal.
You would have FSR1 vs DLSS2 because putting it up against DLSS1 would be bullying.
HyruleanKnight37@reddit
Man, DLSS 1.0 was bad. Like, BAD bad. Even 2.0 had glaring issues that still plague FSR 3.1 to this day.
If I remember correctly DLSS didn't start getting good until 2.2 or 2.3, long after HUB's review.
timorous1234567890@reddit
DLSS was rubbish until Control.
After that it was not in enough games to make it a killer feature.
By the 3070 launch though HUB were recommending the 3070 over the very slightly cheaper 6700XT because of DLSS.
Firefox72@reddit
DLSS sucked when it was revealed though.
Limited_Distractions@reddit
I think the honest legacy of the 20 series was mostly buried beneath the GPU mining boom and the resulting scarcity. At normal pricing it represented worse value than both Pascal and Ampere (easy to forget the reaction to the 3070 announcement, where Nvidia claimed it was better than the 2080 Ti, which was already considered insanely expensive), and the first couple of years were about watching GPUs crawl trying to play Quake II RTX.
Without DLSS and extreme market conditions, I think it might genuinely have been a GPU generation with nothing to offer most people. To me personally it also represents GPUs crossing a threshold where upselling unobtainable compute goals became the obsession of the entire GPU stack. GPU value has never really recovered.
tuvok86@reddit
Would have been nice to add the 1000 series and the contemporary AMD cards (the actual alternatives at the time), especially regarding performance in "RT-only" games
bobbie434343@reddit
HUB milking ray tracing rage.
ryzenat0r@reddit
😴 News flash: almost all cards are not powerful enough, and none are without shenanigans
Strazdas1@reddit
This is actually the correct take. A card that was powerful enough would be able to render graphics indistinguishable from reality in real time with zero frametime issues. Such a card does not exist and won't exist for many, many years. So all cards are "not powerful enough".
Meekois@reddit
They kind of have a point. I just don't use ray tracing very much due to performance issues.
Granted, I buy Nvidia for other reasons, but what are these rt cores even good for?
2FastHaste@reddit
I'm getting really good performance in Metro Exodus EE and Dying Light 2 with all RT features turned on, on my 4070S.
I mention those 2 games because they have a ridiculously transformative RT implementation due to how they handle GI. It feels like you straight up skipped 2 generations of consoles. The way the bounced lighting solves the ugly video-gamey look of older rasterized solutions is really something.
I still haven't started playing Cyberpunk 2077, but it's another example of a game where the transformation is phenomenal.
Sooner or later most games will be like that. So I'd say that starting from the 4000 series midrange, RT is becoming a big deal (at least for single player AAA games)
Emotional-Way3132@reddit
AMD's Ray Tracing performance is more misleading LMAO