BERRY_1_@reddit
I have a 2080 Ti I bought in 2018, best piece of hardware I ever bought. I avoided all the shortages and price spikes and got mine for $899. I play 1440p ultra and don't see a reason to upgrade yet; I get a constant 120 fps in most games, but I see a drop if I turn RT on at max, and I have to turn some settings down to keep about 80 fps. I could buy a 4090 but see no need, maybe the 50 series.
reddit_equals_censor@reddit
the very interesting part is that there were several cases where the 11 GB of vram was just not enough, period.
the 2080 ti's 11 GB already was NOT enough to run games at settings that would otherwise run just fine, like ratchet & clank at 1080p or 1440p with rt enabled, while the titan rtx, with enough vram, gets very playable frame rates.
12 GB is just one GB more....
let's hope that next generation amd cards will have a 16 GB vram minimum.
why amd? because i expect the worst from nvidia.
but yeah, don't forget how crucial vram is for performance and visuals. games just break without enough of it, or you have to lower the most important visual setting, textures, to try to get things playable.
and it's certainly worth taking away that if you want to enable raytracing, you NEED vram.
also worth asking:
can raytracing become the main render path for games without vram increasing noticeably on graphics cards?
that may be impossible.
conquer69@reddit
Some games are going past 16 GB too. We will see more of it before the generation is over, especially with VRAM-gobbling features like upscaling and frame gen.
Framegen increases VRAM usage in Wukong by 4 GB at 4K and 2.5 GB at 1440p. https://tpucdn.com/review/black-myth-wukong-fps-performance-benchmark/images/vram.png
reddit_equals_censor@reddit
did you mean *raytracing and fake frame gen there?
because upscaling, while using some vram itself, generally reduces vram requirements, as the much lower render resolution, even at the quality setting, more than makes up for the vram the upscaler itself uses?
conquer69@reddit
All 3 use more vram than not using them. And while the vram usage of upscaling vs native (at lower resolutions) isn't that big, I still take it into account.
Strazdas1@reddit
This is incorrect. Enabling upscaling, all else being equal, reduces VRAM usage, because the lower internal render resolution saves more than the upscaler itself adds.
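A rough back-of-envelope sketch of that trade-off (my own illustration, not a measurement; the bytes-per-pixel figures and the upscaler's fixed overhead are placeholder assumptions, and real engines vary a lot):

```python
MB = 1024 ** 2

def buffers_mb(width, height, bytes_per_pixel):
    # memory for per-frame render targets at a given resolution
    return width * height * bytes_per_pixel / MB

def frame_vram_mb(internal_res, output_res=(3840, 2160), upscaler_mb=0.0):
    internal = buffers_mb(*internal_res, bytes_per_pixel=60)  # G-buffer, depth, motion vectors...
    output = buffers_mb(*output_res, bytes_per_pixel=16)      # final color, post-processing, UI
    return internal + output + upscaler_mb

print(f"native 4K:                        ~{frame_vram_mb((3840, 2160)):.0f} MB")
print(f"quality upscale (1440p internal): ~{frame_vram_mb((2560, 1440), upscaler_mb=150):.0f} MB")
```

Most per-frame buffers scale with the internal resolution, so dropping it from 4K to 1440p frees more than the upscaler's own buffers (assumed ~150 MB here) cost, which is the point above.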
conquer69@reddit
DLSS uses more vram than the regular bilinear upscaler.
Kyrond@reddit
11 vs 12 GB is not an artificial decision; they would have put 12 on it if they could. But the GPU (i.e. the chip) is cut down by 1/12, including the part that handles VRAM, so one memory chip (of the 12) cannot be connected.
That's why there were 6 and 12 GB x60 cards, while higher tiers only had 8 GB.
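Quick arithmetic behind that (my own illustration; the bus widths are the known specs, the helper itself is just for show): capacity is the number of enabled 32-bit memory controllers times the density of the GDDR6 chip hanging off each one.

```python
def vram_gb(bus_width_bits, chip_gb=1):
    # one memory chip per 32-bit memory controller
    return (bus_width_bits // 32) * chip_gb

print(vram_gb(384))                            # full TU102: 12 controllers -> 12 GB with 1 GB chips
print(vram_gb(384 - 32))                       # 2080 Ti: one controller disabled -> 352-bit -> 11 GB
print(vram_gb(192), vram_gb(192, chip_gb=2))   # 192-bit x60-class bus: 6 GB or 12 GB
print(vram_gb(256))                            # 256-bit tiers with 1 GB chips: 8 GB
```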
callmedaddyshark@reddit
Based on the title, I expected a retrospective of the impact ray tracing has had on videogames, but we didn't really get that.
So, I ask you: did ray tracing have an impact on videogames?
Derpa_derp_1817@reddit
RT did have a decent impact on videogames, but I would argue that the most important feature from that generation was actually DLSS, not RT.
Cards from that era aren't really suitable for anything other than very limited RT effects, especially SKUs below the 2080 Ti. It's basically an experimental feature. Even the 3000 series can't do RT particularly well, outside of the 3080-level SKUs and above, and RDNA 2 & 3 are worse still. The 4000 series is when cards started getting good at RT, and RDNA 4 is supposed to be a massive improvement as well. So it's going to be another 2-3 years, still, until a majority of gamers are going to have cards that can target high-level RT effects at acceptable framerates on PC, and hopefully the consoles follow.
Raster is going to be around for another 10 years, probably. Maybe more. The next console generation will determine that around 2028-- 10 years after the 2000 series launched. If they're good, raster will start to disappear after the cross-gen period comes to a close. If they're still pretty bad at RT, then we'll have another decade or so where raster will be relevant.
But if the PS6 is capable of global illumination/path tracing at high resolutions, then it's possible we'll see RT-only games by 2030ish.
reddit_equals_censor@reddit
it didn't have a big one yet, because games are still generally designed around raster, because both graphics vendors refuse to give the average person enough performance for games to have a decent raytracing implementation as the main lighting system, with raster only being an afterthought.
and there are also console issues. you want your game to also look nice on the steam deck and the switch (if you can get it running on that dumpster fire of hardware), or WORSE, the xbox series s....
we are just seeing the first games that have raytracing on by default at all settings, and a weak version of it of course.
so the change is just slow. likely things will change a lot with the ps6 launch.
when the ps6 launches there will already be a switch 2 out (that thing has been ready to ship for well over a year now...) and a steam deck 2, both of which should be able to do basic raytracing at least half decently.
and assuming rdna4 brings massive raytracing performance increases and has enough vram, that could become the first backbone that devs can target in a few years as the minimum for a good experience.
long story short, ray tracing hasn't had a big impact on videogames YET, but a lot of groundwork was laid for a massive impact in a few years.
twhite1195@reddit
IMO, not yet tbh... There are still some years left before it becomes necessary.
VastTension6022@reddit
I don't get it, what's the point of this video without a comparison to RT impact on newer cards, or a single card under $1k?
ShadowRomeo@reddit
Compared to a modern card, an RTX 3070 is faster than a 2080 Ti in Ray Tracing | Path Tracing, and there is no doubt the same can be said of the 4070 vs the 3080. That should tell us how much Nvidia has already advanced across generations.
Noreng@reddit
Is it really that much faster? The reviews at launch didn't really show the difference as particularly significant.
https://www.techpowerup.com/review/msi-geforce-rtx-4060-gaming-x/34.html
Doesn't look like the difference between a 2080 Ti and 3070 was all that noteworthy in 2023 either
bctoy@reddit
The path-traced updates to older games like Serious Sam are where I saw the 3070 do much better than the 2080 Ti.
https://www.pcgameshardware.de/Serious-Sam-The-First-Encounter-Spiel-32399/Specials/SeSam-Ray-Traced-Benchmark-Test-1396778/2/#a1
BenchmarkLowwa@reddit
Yes, Ampere fares better in terms of raw RT throughput. But many games allocate much more VRAM with RT - this is when the old Turing-Ti shines.
reddit_equals_censor@reddit
looking at hardware unboxed's raytracing testing, the 2080 ti is exactly on par with the 3070.
HOWEVER that was 3 years ago.
what's the crucial difference now?
the 2080 ti has 11 GB vram, the 3070 has only 8 GB vram.
so the 2080 ti now performs significantly better with raytracing than the 3070, purely because 11 GB will still run fine where 8 GB becomes completely broken performance-wise or visually or both... or the game straight up won't start or will crash with 8 GB.
so the 2080 ti is the better raytracing card, just like the rx 6800 is the better raytracing card compared to a 3070 today, because lots of games need more vram, especially with raytracing on, so the theoretical advantage of a 3070 over an rx 6800 gets MORE than made up for by the 3070's missing vram.
stuff like this is crucial to keep in mind.
so nvidia didn't progress but regressed in that example. they sold the same performance, but removed 3 GB of vram, which is a major issue.
Hugejorma@reddit
Yep, I went from a 2070 Super --> 3080 Ti --> 4080 Super. The RT performance difference is always massive on newer gen GPUs. The RT cores are updated every GPU generation, and the RT performance is about double that of the generation before.
Dangerman1337@reddit
Will be interesting to see RTX 50/Blackwell, because the rumour is that it's going to be a massive jump in that regard.
Vb_33@reddit
It being on the same node is going to limit its potential vs Ada.
Hugejorma@reddit
2x RT performance vs the 40xx would be the expected jump. Also, the Tensor (AI) performance jump will be massive. The only negative side effect of the generational RT performance boost is the VRAM. It's insane how much more VRAM it takes to run modern maxed-out RT/PT features, even at low rendering resolutions.
TheAgentOfTheNine@reddit
just slightly so. The big jumps are always from 20X0 to the 30X0 and from that to the 40X0. Going 2080ti to 3070 is a very small jump and not worth the money, imo.
The 70 series of the next gen is usually about the same performance as the 80 series of the current one.
DeathDexoys@reddit
The point of this video is pretty clear though? Showing RT performance of first generation top of the line RT cards.
Sure, he doesn't have the optimal setup to avoid being CPU limited, or various GPUs to compare its performance against; he is still a small youtuber....
XenonJFt@reddit (OP)
Because he isn't a big youtuber with a stash of cards on hand for instant comparisons. It's just what you have vs what time does to it, a.k.a. showing the average person's experience of owning a card.
BenchmarkLowwa@reddit
I'd recommend this video instead. More GPUs compared: https://www.youtube.com/watch?v=leLdmttPSHA
djashjones@reddit
I thought ray tracing was a waste of time?
OliveBranchMLP@reddit
...says who???
djashjones@reddit
I can't remember, I've been to bed since then. I remember something about Shadow of the Tomb Raider being slightly noticeable if you stopped and looked. I think HUB did a video about Deathloop, and in Hogwarts it looked like a wet floor, lol.
I'm a casual gamer and I can't tell the difference to be honest.
jnf005@reddit
I think the first rt game that really wowed me was Cyberpunk; with RTGI it does look fairly realistic and cool. After that all I played was RE8, RE4 and Elden Ring, and their rt is completely pointless from what I can see.
zopiac@reddit
Elden Ring is the only game so far where I've personally found it improved anything (granted, I don't play too many higher-budget games that would have ray tracing). The screen-space AO (and/or GI perhaps?) was awfully distracting for me sometimes when running through grassier areas.
But I still dealt with it most of the time to stress my GPU less, although in some instances turning ray tracing on actually reduced total system power draw as the framerate was bottlenecked enough. This just sometimes meant sub-60.
zerinho6@reddit
Elden Ring has one of the worst RT implementations ever: it's only AO and shadows, and both have a small range and terrible quality.
zopiac@reddit
Perhaps, but "good RT" doesn't necessarily mean that more is better. It fixes much of the AO ghosting that plagues it normally, so I consider it quite a visual improvement. Moreso than the games with raytraced shadows which primarily, in my eyes, serve to add distracting ghosting instead.
PainterRude1394@reddit
You can't tell the difference if the game barely uses rt effects.
In rt heavy games like Alan wake 2, cyberpunk overdrive, black myth wukong, portal rtx, it makes a huge difference. This video details the drastic improvement:
https://m.youtube.com/watch?v=hhAtN_rRuQo
WJMazepas@reddit
Spider-Man 2, Alan Wake 2, and Hellblade 2 were made with RT in mind, and it does make a huge difference.
Ray tracing also made a huge difference in Metro Exodus, and that is an old game.
Beautiful_Ninja@reddit
From a developer's perspective it's the complete opposite. It's an enormous time saving over having to manually place light sources in every scene and hope the illusion doesn't break when you move the camera around.
leeroyschicken@reddit
It seems like none of you who peddle this have ever worked with those things.
Setting up lighting that breaks with camera movement? Like, how would that even work... You might perhaps be confusing it with lights being added in cutscenes to fit the scene. That is about cinematography; RT changes very little there.
What is often cited as problematic is resource baking, leak fixing and the lack of robust emissive surfaces, which are often replaced with manually adjusted point lights.
Baked lights are almost always RT, just done sensibly offline because of their static nature (a rough sketch of that offline loop follows below). Realtime or near-realtime techniques in the editor are good enough. Not baking at all is faster of course, but you are delegating the problem, since none of the hardware is up to the task, so what was your workflow problem is now a programming problem.
I am sure that nvidia is working hard on an acceptable "one size fits all" solution, such as their RTXGI, but I wouldn't consider those things ready at the moment. You're still dealing with a plethora of limitations that require manual adjustments, and on top of that you now have a lot to worry about regarding compatibility and performance.
That said, when RT "just works" in the future, then yeah, developers will have an easier job.
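To make the "baked lights are RT done offline" point concrete, here is a deliberately minimal sketch (my own illustration, not any engine's actual code; the scene representation, light format and sample count are placeholders): baking just runs the ray casts ahead of time, per lightmap texel, and stores the result so the runtime only reads a texture.

```python
import random

def visible(texel, light, scene):
    # placeholder for a real ray cast from the texel towards the light
    # against the static scene geometry (BVH traversal etc.)
    return random.random() > 0.2

def bake_lightmap(texels, lights, scene, samples=64):
    lightmap = []
    for texel in texels:
        irradiance = 0.0
        for light in lights:
            hits = sum(visible(texel, light, scene) for _ in range(samples))
            irradiance += light["intensity"] * hits / samples  # soft-shadow estimate
        lightmap.append(irradiance)  # written into a texture; sampled for free at runtime
    return lightmap

# toy usage: 4 texels, one light, dummy scene
print(bake_lightmap(texels=range(4), lights=[{"intensity": 1.0}], scene=None))
```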
PhoBoChai@reddit
No. It's extra work, because games are released to support raster & compute effects, with optional RT on top of or replacing those raster effects.
The time-saving argument only works when devs choose to release RT-only modes, without a fallback to raster.
How many game studios do you think do that?
conquer69@reddit
Avatar and SW Outlaws are on the snowdrop engine and both have RT lighting at all times. UE5 with Lumen is the most popular one right now.
Deepsilver, the creators of Metro Exodus, haven't released a new game yet but it will for sure have RT only too.
SomniumOv@reddit
4A Games. Deepsilver is the publisher.
djashjones@reddit
That's great. Does that mean I don't have to wait a few years to play the latest release as "developers" have more time now to fix bugs?
Chipay@reddit
No, this just means management can put them on the next project faster. Or maybe the team can be rightsized to be more lean and increase value for the shareholders.
djashjones@reddit
That sounds about right, always about the bottom line.
cp5184@reddit
Light placement by level mappers is presumably little changed with ray tracing. They have a little less work to do handling the pre-baking of lighting and things like that, but the main work of placing lights and checking what ends up illuminated and what doesn't still remains.
As a byproduct of dedicated ray tracing hardware in consumer GPUs, developers do benefit from higher-performing real-time light simulation at a lower price.
conquer69@reddit
That's not a problem. It's adjusted in real time if you have ray tracing.
chronocapybara@reddit
Minecraft RTX was the single biggest disappointment. Instead of Minecraft with RT you could just toggle on, it was released as a bunch of custom maps, with regular Minecraft untouched. Sure, you could run RT in vanilla, but then you have to download the right texture pack, of which there are hundreds, and at that point it's basically just modding the game, with worse performance than loading a whole bunch of ENBs.
ibeerianhamhock@reddit
Yeah I'm surprised it stayed in beta and we basically still don't have minecraft RTX so many years later.
It would be the perfect application for path tracing now with FG, DLSS, etc. Probably capable of 120+ fps easily on say a 4070 or up current gen.
Manordown@reddit
It’s a shame that the Titan RTX was not released as the RTX 2090 for $1,500!!!! It’s such a great card, but with no AMD competition in the high end, Nvidia can charge whatever they want.
xDragod@reddit
It still doesn't feel like it's been 6 years. I was in a bit of disbelief when I bought my 4080 Ti Super and realized I had been using my 2070 for 5 years.
illathon@reddit
This was the best-looking GPU Nvidia ever made. It's been all downhill since then.
TemporalAntiAssening@reddit
This video doesn't really make much sense; without comparisons to modern cards it doesn't tell us much. Also, the second half of the video is a joke, ofc lowering internal res increases performance.