Why are so many new AA/AAA games dropping hardware ray tracing lately?
Posted by XHellAngelX@reddit | hardware | 395 comments
Is it just me, or have a lot of recent AA/AAA titles stopped supporting hardware-based ray tracing altogether?
Take Silent Hill f, Expedition 33, Dying Light: The Beast, Split Fiction, BF6, etc. for example — no RT reflections, no RT shadows, nothing. Some studios are switching entirely to software global illumination systems like Lumen or other hybrid lighting methods, and calling it a day.
I get that hardware RT is expensive in terms of performance, but it’s been around since the RTX 20-series — we’re seven years in now. You’d think by 2025 we’d see more games pushing full path tracing or at least hybrid hardware RT.
Instead, we’re seeing the opposite:
- Hardware RT being removed or “temporarily disabled” at launch.
- “Next-gen lighting” now often just means software GI or screen-space tricks.
So what’s going on here?
Is hardware RT just too niche for mass-market AAA titles? Or are we hitting a point where software-based lighting like Lumen is “good enough” for most players?
And seriously — are all those RT cores on our GPUs just going to waste now?
Would love to hear what others think — especially from a tech/dev perspective. Are we watching hardware ray tracing quietly die before it even became standard?
Daryl_ED@reddit
Nvidia better drop the RTX from their branding lol
Sirts@reddit
Basically all games are now multiplatform, and since consoles have weak ray tracing hardware, all games and engines need to have software-based lighting anyway. Since there are now software-based GI techniques like Lumen, hardware RT doesn't have that big of an advantage.
Nvidia (and AMD) mid-range GPUs have improved at a snail's pace, slower than ever. RT launched 7 years ago, yet the RTX 5060 and sometimes even the 5070 don't perform well at 2.5K-4K + high settings + RT features, even in older games.
I don't think RT or path tracing is dying though, there are now rumors that PS6 is going to have ~5080-5090 level RT performance, so it should allow developers to focus on RT
ProfessionalB0ss@reddit
Simply not possible, no AMD card has that, and those are $1200+ cards
Look at the 9070 XT; that, at best, is where the PS6 will be
dudemanguy301@reddit
Lumen's hardware mode was heavily neglected until UE5.4 and onwards. Epic themselves didn’t start recommending hardware RT Lumen until version 5.6.
On the latest branches hardware Lumen can hit 60fps on console, and on modern PC hardware the hardware version of Lumen not only looks better than the SW version, it runs faster too.
Morningst4r@reddit
I don’t know how much recent updates have improved it but the killer for hardware lumen is usually the CPU hit. A decent GPU handles its side without losing much performance but games like OW2 hammer the CPU with it on.
Sirts@reddit
That's good to hear. Seems hardware RT on many new UE5 games like Borderlands 4 and Outer Worlds 2 is pretty useless because it performs so badly https://www.tomshardware.com/pc-components/gpus/the-outer-worlds-2-with-ray-tracing-on-performs-badly-just-like-borderlands-4-runs-below-60fps-at-540p-resolution-on-an-rtx-5090-and-9800x3d
Plank_With_A_Nail_In@reddit
GPU improvement is happening at roughly the same rate as it always has. You can measure it easily enough just by looking at the increase in memory bandwidth of each generation's 80-class cards. I think only one generation, 2080 to 3080, is an aberration with a terrible increase between them.
detectiveDollar@reddit
Yeah, one big issue is that on the Nvidia side, the penalty for enabling RT hasn't shrunk enough to make it always worth turning on. AMD has made huge strides in reducing the penalty, but that only brings them near parity with Nvidia.
kikimaru024@reddit
RTX 5060 is only mid-range if you look at price.
xx60 series is entry level.
VenditatioDelendaEst@reddit
Entry level is 5050 or APU.
ca7593@reddit
The RTX 5060 is entry-level in price and series. It’s $299 dude, that is not mid-range pricing.
The 1060 Founders was $299, and the 2060 was $349! That $299 for the 1060 is ~$403 accounting for inflation now.
kikimaru024@reddit
Yeah, that's what I'm saying.
Mostly I've seen people complain that 5060 is "mid-range" because they only remember when the (shit) GTX 960 was $199.
Petting-Kitty-7483@reddit
Yeah, until next-gen consoles take off and leave this gen behind (in, oh, probably a decade or so given how cross-gen periods work), we will be using software ray tracing like Lumen as the main thing, not hardware RT. Unless everyone learns to optimize hardware RT like id can. It's still RT, so the main benefits are there, just not as good looking.
IezekiLL@reddit
Yeah. Nvidia GPUs can reach any height, but due to the importance of the console market the real indicator is AMD. We need to wait for the PS6 on UDNA + Zen 5/6 to appear, and then we will start to see actually ray-traced games.
SireEvalish@reddit
Unreal Engine 5 makes it easy to use SW-based ray tracing, which most of the games you listed use in lieu of HW-based solutions. Of course, every one of those games would probably look/run better if they added a HW option, but I digress.
Here is a list of titles released in 2025 that contain HW RT features:
ARC Raiders
Assassin's Creed Shadows
Avowed
Cronos: The New Dawn
Doom: The Dark Ages (full path tracing available)
FBC: Firebreak
FC 26
Grand Theft Auto V Enhanced (Actually one of the better recent RT implementations, BTW)
Little Nightmares Enhanced Edition (Not 100% sure)
Little Nightmares III (Not 100% sure)
Ninja Gaiden 2 Black (though it's implemented poorly)
Oblivion Remastered
Outer Worlds 2 (though it's implemented poorly)
Rise of the Ronin (Not 100% sure. Likely since it's a PS5 port which has HW)
Spider Man 2 (technically an older title from PS5, but supported HW RT)
Dying Light: The Beast appears to be getting RT in an update in the next few weeks. We'll see what happens there.
MC_chrome@reddit
Is it not a little bleak that a 12 year old game is one of the better representations of an emerging technology?
fire2day@reddit
I find that older games get the biggest uplift from it. A lot of newer games look just as good with baked-in lighting as they do with raytracing.
Silent-Selection8161@reddit
If you ask most developers they'd agree RT is maybe a modest improvement over baked lighting at most, and what they really like about it is being able to just go and make levels quickly, of whatever size, without worrying about baking the lighting or how much disk space all the baked lighting takes up.
Plank_With_A_Nail_In@reddit
But the game still needs baked lighting as it needs to run on non RT cards so no developers are getting this improvement.
You haven't asked any developers lol you are putting words in their mouths.
No_Fennel4315@reddit
Except, games really don't have to run on non rt cards anymore. Have you seen the statistics of how many people are on cards that support hardware ray tracing? The vast, vast majority. It will show soon enough, it already has.
Flippantlip@reddit
One of DOOM's devs (I think the director? can't recall who) did a technical interview online where he goes really in depth about their tech choices and what they were excited about.
RT was a big one of em: https://youtu.be/DZfhbMc9w0Q?si=ivnnCKl8Gw4bLfq8&t=913
"Without this feature (ray tracing), we'd have to elongate the build-time by a magnitude of years".
He continues to explain that, level-design-wise, designers can now immediately see how lighting affects a scene -- and make minute changes that they would otherwise either not notice without baking, or that would take a really long time to process properly.
I.e., even if the end user does not have access to RT, the development engine does, so game designers can still exploit the feature to get a quick way to demo the level.
The feature does not prevent the devs from also baking the lighting afterwards, when they are happy with how everything looks.
(The latter bits about baking was not part of the interview, this is my assumption)
Strazdas1@reddit
If you have any dynamic light sources, baked lighting either looks awful or you have to put in a shitload of effort to hide it and bloat the game size to 50+ gigabytes.
fire2day@reddit
Oh yeah, I’m not discounting how great of a tool it can be for developers.
radspot77@reddit
GTA5E released in 2025.
Busy-Scientist3851@reddit
Nah. It's still very much a legacy non-PBR game, just has better more realistic shadows and sunlight, not much else.
SireEvalish@reddit
It's a combination of a few different things, really.
Being an older game means the rasterized rendering isn't particularly heavy by modern standards. While the game has been updated over the years, especially when they ported it to PC/PS4/XBO, the core technology was designed to run on PS3/360. Assets and geometry aren't crazy by modern standards.
The engine is being used for GTA6, so work done to get RT working in it can likely be applied to that game as well. They can basically use GTA5 as a test bed for the features in GTA6.
They weren't under any sort of time crunch or release pressure. They could spend as much time as needed (within reason, of course) to get it running well.
Rockstar implemented what I consider two of the most important RT features: Ray-traced global illumination and reflections. I'm of the belief that when done well these can have impressive boosts to overall image fidelity and "correctness" of the lighting in the world. It's especially noticeable in a game like GTA5 which has a variety of different lights, shifting times of day, and lots of reflective surfaces.
Strazdas1@reddit
The PC version wouldn't run on PS3/360 at all, the gap is huge. They also had to invent a new way of streaming data just to make it run on the PS3/360 due to how slow its data transfer rates were. A one-off technology that got abandoned for PS4/Xbone/PC as it wasn't needed there.
I think that's exactly why we got the enhanced edition.
Strazdas1@reddit
It's a remaster of literally the most popular game in a decade.
Realistic_Village184@reddit
I don't see why it would be. There are lots of older games that are used as essentially tech demos, like Portal RTX and Quake 2 RTX. What about that seems bleak to you? It's not like no new games have hardware ray-tracing. Am I missing something?
VotesDontPayMyBills@reddit
Criticizing Unreal Engine 5 because current PCs can't handle it like other engines is similar to the backlash CryEngine faced when Crysis debuted: great graphics demand substantial hardware. Moving on!
geo2160@reddit
I don't know if you're just young or just forgot how Crysis looked at the time. Crysis 1 was pretty well optimized and the jump in image quality over other games of that generation was huge. Crysis 3 was criticized for bad optimization because it rendered out-of-bounds, heavily tessellated geometry to make Nvidia look better in benchmarks.
UE5 is responsible for 90% of the badly optimized slop from the last couple of years.
Hytht@reddit
Crysis 1 was heavily single threaded
geo2160@reddit
As opposed to which games? Mainstream Quad core processors were 10 months old at that time. Even dual cores were barely 2 years old.
vanBraunscher@reddit
Yeah, the games industry was hilariously slow in adopting more than one core usage. Some franchises had been dragging their feet well into the late 2010s even.
AntLive9218@reddit
Hell, it took an embarrassingly long time just to move away from silly signed 32-bit pointers effectively limiting the address space to 31 bits (2 GiB), which was still common even when 64-bit CPUs had been around long enough to be considered the baseline.
I was more upset about that, because crashing when running out of memory was quite a bit more significant problem than "just" lower than desired performance. And while I fortunately didn't get to deal with that, the memory limitation was one of the root causes of why some endlessly growing save files couldn't be loaded anymore, making some people unable to continue long adventures.
Large companies in the gaming industry laugh at other developers getting nervous at tech debt being carried for more than a year, because they are masters of releasing just barely good enough pieces of garbage hiding often more than a decade old bad decisions.
Strazdas1@reddit
The best game ever made (victoria 2) was singlethreaded and came out in 2009.
Strazdas1@reddit
Crytek banked on single-core performance continuing to grow and went heavy on single-threaded code. It's why even modern PCs kneel when running Crysis: the single-thread bottleneck from all its Lua scripts.
illicITparameters@reddit
This comment reads like someone who hasn't been gaming for any serious length of time….
Cute-Pomegranate-966@reddit
^ false. You should go ahead and review that Crysis 3 claim and learn, so you can stop repeating it. There were in-frame objects that were tessellated heavily and their looks didn't change. But that out-of-bounds shit has never been true.
geo2160@reddit
https://youtu.be/IYL07c74Jr4?si=CXErCFOJ1NwSl1il
Cute-Pomegranate-966@reddit
And you can see that i said they had objects in frame that were over tessellated. But there was never any hidden anything.
Strazdas1@reddit
There was a bug in the release version of Crysis 2 that tessellated water underground that was never visible to the player. They later patched it, but remember that back then not everyone was connected to the internet receiving automatic patches.
Cute-Pomegranate-966@reddit
??? No it wasn't. Crysis 2 didn't have tessellation at all until it received a dx11 upgrade patch...
Strazdas1@reddit
It was Crysis 2 that did the underground tessellation. Crysis 3 looked spectacular. All Crysis games have issues with AI running on single-threaded Lua scripts, which performs very poorly.
Strazdas1@reddit
Current PCs have been able to handle RT, even on budget cards, for over 2 years.
ShadowBlaze80@reddit
Now I think UE gets some flack because you can make a good optimized game in it, companies just don't. However, CryEngine actually sucks and runs like crap anyway. I had Crysis on a Core 2 E8400 at 3GHz, 4GB of RAM, and dual 8800s, and the game still struggled on my rig, running about mid-40s at best. Lo and behold, I buy a new i5 6600 and the game still has issues crunching the CPU. We have so much computing power now that's being mismanaged; it's crazy we have to AI-upscale games from 360p to reach target frame rates. The PS5 is theoretically 10x more powerful than the PS4, yet we're still hardly reaching a solid 1080p60 or 4K60 frame target and not netting much in return. It doesn't make sense.
Jaznavav@reddit
Your argument would maybe sort of hold up if Epic wasn't optimizing core feature CPU cost by 10-20% every major update. Nanite and other headlining features have been unoptimized slop for most of the engine's lifespan.
SpotlessBadger47@reddit
Uhuh, sure, except some of us actually remember how it was back in the day. UE5 is genuinely a garbage engine.
KanedaSyndrome@reddit
Didn't know that there was a difference between hardware and software raytracing. Thought it was just raytracing and nvidia gpus supported that with hardware.
-Purrfection-@reddit
You can see the difference here. For example UE5 Lumen has a software and hardware version.
https://youtu.be/O6GC8TZbJmI?si=5wMd4YtRy3x40rTu&t=1193
Basically the software version is tracing into a low-quality version of the game world, so reflections, for example, aren't ideal with software RT, but global illumination doesn't have to be so accurate, so it works better. Software RT can be run as a generic compute shader, so any GPU can run it, but it's lower quality and less performant.
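To make that concrete, here is a minimal, hypothetical sketch (plain C++, not Lumen's actual code) of the software-style approach: sphere tracing a ray through a signed-distance-field stand-in for the scene. A hardware path would instead test the real triangle geometry through a BVH on the RT cores.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Stand-in for a mesh distance field: distance to a unit sphere at z = 5.
static float sceneSDF(Vec3 p) {
    float dx = p.x, dy = p.y, dz = p.z - 5.0f;
    return std::sqrt(dx * dx + dy * dy + dz * dz) - 1.0f;
}

// Sphere tracing: step the ray forward by the distance to the nearest surface.
// Cheap enough to run as a generic compute shader, but only as accurate as the proxy.
static bool traceSDF(Vec3 origin, Vec3 dir, float& tHit) {
    float t = 0.0f;
    for (int i = 0; i < 64; ++i) {                 // fixed step budget, like a shader loop
        float d = sceneSDF(add(origin, mul(dir, t)));
        if (d < 1e-3f) { tHit = t; return true; }  // close enough to the surface: a hit
        t += d;
        if (t > 100.0f) break;                     // ray left the scene
    }
    return false;
}

int main() {
    float t = 0.0f;
    bool hit = traceSDF({0.0f, 0.0f, 0.0f}, {0.0f, 0.0f, 1.0f}, t);
    std::printf("hit=%d t=%.3f\n", hit, t);        // expect a hit near t = 4.0
}
```

The proxy is cheap to march through on any GPU, but a reflection traced this way can only ever be as accurate as the proxy, which is why software-traced reflections look soft or blobby compared to the hardware path.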
wawasan2020BC@reddit
Software ray tracing has the advantage of being hardware-agnostic, but as always it comes at the cost of lower FPS, because there's simply no dedicated chip space to do your RT calculations.
pythonic_dude@reddit
It can always be hardware-agnostic if the driver "fakes" it by making generic compute cores do all the work (rather than just most of it); it's just really, really inefficient and slow and not worth pursuing most of the time. See: running titles requiring HW RT on Vega and RDNA1 GPUs on Linux thanks to Mesa/RADV doing exactly that.
the_nin_collector@reddit
5090 can't even get 60fps at 540p with RT on. Your statement is the textbook example of an understatement.
nWhm99@reddit
Is that the Bethesda game or the puzzler?
Filipi_7@reddit
Neither.
It's a sci-fi shooter/action RPG by Obsidian Entertainment, under the Xbox/Microsoft umbrella.
loozerr@reddit
However running it at high with RT off and FSR 4.0 quality I'm getting a pretty comfortable 100fps.
Vb_33@reddit
RT is always on. You're just using RT running on your GPU shaders instead of the hardware accelerated version on your RT cores. That's the difference between software lumen and hardware lumen.
Cheap-Plane2796@reddit
He said it runs like ass with RT on; you say it runs better with RT off. Ok?
It looks hideous with RT on and worse with RT off btw. Gross looking game.
They're shitting out games at breakneck speed and it shows in the quality. Huge step down from Avowed in polish and graphics.
Vb_33@reddit
That's because RT Lumen in The Outer Worlds 2 is RT reflections, indirect lighting and shadows. On top of enabling RT Lumen, people are then cranking up all the RT options to max (reflections, shadows, global illumination, etc.). DF has said for ages that going above high on UE5 settings is heavily punitive and not worth it for current-gen hardware; you can check out their TOW2 review to see them say something similar. Even software Lumen at very high is super punishing.
The real question is whether very high brings value for the performance it demands, and the answer for software Lumen is generally absolutely not (a couple of exceptions in TOW2). As for HW Lumen, it's better, but in TOW2 it's bugged (shadows don't work right) and very high isn't worth it currently.
Gearsper29@reddit
This is a cpu bottleneck. It has nothing to do with 5090.
the_nin_collector@reddit
no. Please read stuff first before posting false news.
"The Outer Worlds with ray tracing can't hit 60FPS at paltry 540p resolution with an RTX 5090 and 9800X3D - ray tracing 'performance' mirrors Borderlands 4 fiasco"
Just Google it. RT is straight fucked.
Digital Foundry confirms as well.
Modern CPUs barely matter these days in modern AAA games.
Even with my 14900k, CPU is running at 10-15% in this game.
cowoftheuniverse@reddit
RT has always had a CPU performance cost, but very often it stays "hidden" because it stresses the GPU even more, so many just assume it doesn't stress the CPU at all. For some reason in The Outer Worlds 2 it's not just a big hit on the CPU but a huge one.
CPU percentage usage doesn't matter if one or two cores are at 100%, choking the whole thing, while some of the cores sit idle (not all work can be shared between cores).
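A trivial illustration of that point (core count made up): one or two saturated cores barely move the overall-usage number on a modern many-core CPU, even though they are the bottleneck.

```cpp
// Quick arithmetic: a game thread pegging one or two cores still shows up as a
// tiny "overall CPU usage" figure on a many-core chip.
#include <cstdio>

int main() {
    const int totalCores = 16;                    // e.g. a 16-core desktop CPU
    for (int busyCores : {1, 2, 4}) {
        double overall = 100.0 * busyCores / totalCores;
        std::printf("%d core(s) maxed out -> ~%.1f%% overall usage reported\n",
                    busyCores, overall);
    }
}
```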
Daxius@reddit
I am unsure how you can cite the Digital Foundry video and go on to say the CPU means nothing, and even point out that it doesn't go above 10-15%, when the entire first half of the video is about how the game is extremely CPU-limited and Alex even says, "That means this game scales more with linear processor speed than it does with more cores." The hardware RT is broken, but not because of performance; it's because of the way it enables shadows and does it poorly, with denoising issues and use of the wrong pipeline.
AIgoonermaxxing@reddit
I get where he's coming from though, turning on hardware RT in that game does some really funky shit. At 1080p DLSS P with hardware RT off, the 5090 still gets over 90% GPU utilization, but at those exact same settings with hardware RT on, GPU utilization drops to 60%.
Lord knows what is wrong with this game, but with GPU utilization dropping that low comparatively to having hardware RT off I can understand why he thinks it might be CPU related.
Gearsper29@reddit
That's because it puts more weight on the CPU. Like, let's say the max potential frame rate with a 9800X3D without RT is 150.
Now you enable ray tracing and this doubles the demand on the CPU. Now the max frame rate you can achieve is 75, no matter how good your GPU is or how low a resolution you use.
In cases like that it is better to increase the resolution to at least get better image quality at the same frame rate.
That's how a CPU bottleneck works.
So yes, the game is an unoptimized mess, but it needs a stronger CPU, not a GPU.
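The same back-of-the-envelope math in code form (the numbers are made up, following the 150 -> 75 example above): frame rate is set by whichever side takes longer per frame, so once RT pushes CPU time past GPU time, neither a faster GPU nor a lower resolution helps.

```cpp
// Toy illustration of a CPU-side frame-rate cap (all numbers invented).
#include <algorithm>
#include <cstdio>

int main() {
    const double cpuMsNoRT = 1000.0 / 150.0;  // CPU can prepare 150 frames/s worth of work
    const double cpuMsRT   = cpuMsNoRT * 2.0; // RT roughly doubles CPU-side work in this example
    const double gpuMs     = 1000.0 / 120.0;  // GPU renders a frame in ~8.3 ms at this resolution

    // The slower of the two stages sets the frame time.
    double fpsNoRT = 1000.0 / std::max(cpuMsNoRT, gpuMs);
    double fpsRT   = 1000.0 / std::max(cpuMsRT,   gpuMs);
    std::printf("no RT: %.0f fps, RT on: %.0f fps (GPU now partly idle)\n", fpsNoRT, fpsRT);
}
```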
Gearsper29@reddit
Please learn to interpret what you are reading before you accuse someone of posting false news.
A significant part of ray tracing calculations is done by the CPU, so ray tracing greatly increases CPU load. The Outer Worlds has bad CPU utilization in general, and with RT it probably hammers 1 or 2 cores.
That's why with RT it has a frame cap of around 60-70 fps even with a 9800X3D, no matter the GPU or the resolution. zWORMz Gaming got around 73 fps with a 5090 at 4K DLSS quality, and the same fps at 720p but with less than half the GPU utilization. This is obviously a CPU bottleneck.
AIgoonermaxxing@reddit
Ok, that's pretty strong evidence that it's CPU related. I watched Daniel Owen's video on it and he got similarly low GPU utilization at 540p.
02Tom@reddit
at Very High settings
Silent-Selection8161@reddit
Dying Light: The Beast is/was an expansion to Dying Light 2 that they spun off into a standalone title. DL2 also got its RT update later, so same same.
Dazzling-Tadpole3239@reddit
GTA V? Tried that and didn't see much difference without RT
obiwansotti@reddit
Just a gotcha of UE5.
It was designed before HW Ray Tracing took off and it's playing catchup. It can be done in UE5, but it's harder than it needs to be, so if push comes to shove developers spend their time elsewhere.
With other engines like id Tech that have made ray tracing a priority, you see multiple games (Indy and Doom DA) with full path tracing.
Ray Tracing is the future, and software ray tracing is too slow compared to HW to get the full benefits.
jeramyfromthefuture@reddit
Because ray tracing itself is very old tech, we are looking at new techniques that give the same quality and perform at higher rates. Also, most of the world doesn't have Nvidia RTX GPUs; many of us just didn't bother to upgrade from the 2070.
When a PS5 is cheaper than the latest Nvidia GPU, expect gamers to look elsewhere.
Bannedwith1milKarma@reddit
Because their customers turn everything on, then complain about performance.
Verite_Rendition@reddit
It's a shame this is being downvoted so much. Even if the OP is (probably) reading into things a bit too much, they do raise a good topic of discussion - albeit one that's probably more about game development practices than hardware.
BF6 strikes me as a particularly interesting case. BF5 was a marquee title for hardware RT (yes, NVIDIA, we get it: BF5 has reflections!). So having it absent from BF6 is a major shift.
Cheerful_Champion@reddit
RT in BF5 was already killing performance. With how much is going on in BF6, it would be impossible to run it on anything below a 5090 with multi frame gen enabled.
Someone at EA, shockingly, decided to do the thing that makes sense and drop RT to focus on performance. I believe that once their engine is updated to handle RT better and RT hardware improves, they will start implementing it in BF games again.
MC_chrome@reddit
I’ll be the one to say it: first person shooters don’t need raytracing. You aren’t going around admiring the environment like it’s Cyberpunk 2077.
moofunk@reddit
It's more than just appearances. RT rendering is far more robust and is the ultimate way to render. You don't need 100 tricks to make your game look good, if it relies on realism.
The rendering process is simpler, it scales easily and relates more with extremely mature offline pipelines. You can use similar artistic techniques to what offline CG artists use.
That is why I think future game engines, not the behemoths that exist now, will be a lot smaller and rely fully on RT, and dump the difficult labor of fidelity on the GPU and whatever RT acceleration techniques it uses.
MC_chrome@reddit
Sure, at a massive compute cost.
Have you seen how much of the die that NVIDIA’s current raytracing implementation takes up? We are almost 6 years into the RTX era and we still don’t have multiple tiers of cards that can ray trace competently at decent frame rates.
I think raytracing will reach its ultimate potential when it can be done entirely in software without needing specialized hardware.
moofunk@reddit
Getting there will take 15-20 years, and after that not much will change, except fidelity, resolution, power consumption and more AI assist. This is a long haul project that will take 10 GPU generations to solve.
Not going to happen. The hardware is what forces it to be fast. It always has been since the 1970s. Then you can be clever with AI denoising and things like that, but the core reason we have RT in games in the first place is the extreme parallelism implemented in hardware, secondarily very good AI denoising.
Even offline raytracers have become more GPU oriented over the past 10 years, because it simply is much, much faster and scales so very simply, and only the old dogs like RenderMan or Arnold still use CPU for final render.
For a while, some 20 years ago, we also observed with offline raytracers that CPU-based ones tried to use many clever tricks to speed up global illumination, using irradiance caches and various sparse sampling tricks, allowing many different sampling settings and things like that. You can simply brute force your way through that with a GPU to get the same image using simpler methods, so those features are now gone again, i.e. less software, but more speed.
I know you think of software based "path tracers" like Lumen, but they cannot grow in accuracy and speed as fast as hardware based RT will be able to over time, because they use less robust tricks to function and are plainly less accurate. They are mostly a product of many consumer GPUs still being too slow for RT.
Strazdas1@reddit
Well fuck them then. Getting there shouldn't take even 5 years. This utter stagnation in tech adoption in games is ridiculous and developers need a good ass whooping for this.
moofunk@reddit
Physics won't allow that. Realtime raytracing is a hard number-of-samples per second problem. More samples = sharper and more stable image.
RT engines still need to be maybe 10x-100x faster to compete with offline raytracers for image quality. Then they also need to consume much less power.
We're able to touch the bottom end of it now, which is a tantalising prospect for changing game development (any kind of 3D visualization) into a very simple, ubiquitous raytracing by default method.
The good news is that all you need to do is throw transistors at the problem with no change in algorithms.
Strazdas1@reddit
Nothing to do with physics. Realtime ray tracing is viable even on budget cards of this generation. They don't need to be faster, it needs to be implemented.
moofunk@reddit
Everything to do with physics. This is a problem that can't be solved outside of throwing more transistors at it, as has been done over the past 50 years.
What you're getting in current games are only a smidgen of what you can do with raytracing.
The current top end RTX solution is a huge compromise to get responsive frame rates and must use AI upscaling and sophisticated AI denoising to work.
The criticism that modern games look as good as they do with RTX on or off lies in rasterization/raytracing hybridisation, where the rasterizer helps the raytracer with primary rays. You don't get any of the benefits of simulating real cameras with raytracing, i.e. real accurate DOF, real lens flares and real motion blur. You don't get caustics, you don't get light scattering or subsurface scattering without tricks or special arrangements. These are too compute intensive for real time.
As it is, even with low image quality demands, RTX fails the first goal of actual real time raytracing: Converge an image fully in a single frame. That's impossible at the moment, because the GPU plainly can't calculate enough samples fast enough per frame. In Quake II RTX, you can converge in about 500 frames on a 2080Ti, so maybe 125 frames on a 5090. You need to get down to 1-3 frames.
Forget indoor scenes generated with only bounced light. You can't do those in realtime at all without burying them in noise.
Play around with Omniverse for a bit and compare the RTX scene rendering with Iray scene rendering. There is a stark visual difference. RTX still takes several seconds to converge and is very noise riddled, where Iray takes about 100x longer to converge, but is also much more accurate and has much less noise. You really want Iray's several minutes long render to be the realtime goal, hence GPUs need to get 100x faster.
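For anyone who wants to see why converging in a frame or two is so brutal, here is a tiny Monte Carlo toy (not a renderer, just a 1D integral): the error only shrinks with the square root of the sample count, so 10x less noise costs roughly 100x the samples.

```cpp
// Monte Carlo convergence toy: estimate the integral of sin(x) over [0, 1]
// with increasing sample counts and print the error.
#include <cstdio>
#include <cmath>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    const double exact = 1.0 - std::cos(1.0);     // analytic value of the integral

    for (int samples : {64, 6400, 640000}) {
        double sum = 0.0;
        for (int i = 0; i < samples; ++i) sum += std::sin(u(rng));
        double estimate = sum / samples;          // Monte Carlo estimate
        std::printf("%7d samples -> error %.5f\n", samples, std::fabs(estimate - exact));
    }
}
```

Each 100x jump in samples cuts the error by roughly 10x, which is the same wall a real-time path tracer hits when it tries to produce a clean image within a single frame's sample budget.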
VastTension6022@reddit
That's like saying rasterization will reach its ultimate potential when it can be done entirely on the CPU without needing a GPU.
Wait_for_BM@reddit
We had fully software RT back in the old days, so that's the wrong conclusion. The physics and math for RT have been well known for a long time. They (e.g. Pixar) have been rendering frames that take tens of hours each for ages. No amount of wishful thinking can make it run orders of magnitude faster. RT only became more mainstream and accessible when we could do some limited RT in real time.
We need the specialized hardware in GPUs to accelerate RT. Specialized hardware always beats doing it in pure software on a general-purpose processor, i.e. without specialized hardware.
Strazdas1@reddit
Well, now that you've said it, maybe don't say it again. It's a stupid thing to say.
BighatNucase@reddit
I think that's too broad a statement really. I think it makes perfect sense for a game like Battlefield 6 to focus more on making things as smooth as possible for gamers because it needs to be a standout hit for the franchise to continue going on; it's better to have slightly dated graphics and no complaints about performance than to risk it. Super competitive shooters similarly don't really need it either for obvious reasons. Singleplayer shooters though? I think it would be silly not to try and push things a bit graphically even if it's just an option.
Western-Helicopter84@reddit
Ray tracing is not only for the visuals. Ray traced global illumination can decrease a lot of development time & cost.
For instance, id Software said they had to decrease the resolution of Doom Eternal's lightmaps to 1/4 because they took too much time to bake. They also said it would be almost impossible to develop Doom TDA that way, since each of its maps is at least 4 times larger than Eternal's.
Cheerful_Champion@reddit
Dunno mate, I definitely did admire environment in some multiplayer games with RT (war thunder, bf v, darktide)
MC_chrome@reddit
So you’re paying more attention to the environment than the other players trying to take you out? Certainly not the choice I would have made but you do you I guess
Cheerful_Champion@reddit
Don't care. I'm playing games to have fun. Stopping to see amazing looking scene is part of the fun. Getting sweaty and dropping graphical settings so some graphical fireworks don't cover enemy for a split second is not fun
dudemanguy301@reddit
Call me crazy but maybe a game with dynamic destruction should have dynamic lighting to handle how the environment changes???
They already had to rework their lighting system in less than a month and there are still problems.
R6 Siege has reworked its lighting system atleast twice that I’m aware of but I haven’t paid attention to the game in years.
The Finals uses RTGI and its destruction is incredible.
Clean_Experience1394@reddit
Cyberpunk is a first person shooter
MC_chrome@reddit
I mean yes, but it is also a very story-driven game where you can take the time to look at the environments around you…. Cyberpunk 2077 is also not a multiplayer-centric game that is solely focused on fast action.
account312@reddit
Yeah, they should stick with sprites.
sturgeon02@reddit
What are you talking about? The RT in BFV is very easy to run on modern hardware, and in general raytracing is pretty achievable on hardware much less powerful than a 5090. And one of the advantages of raytracing is that the cost is relatively fixed, just because there's a lot going on in BF6 (and by that you mean there are a lot of effects happening over a relatively static map with canned destruction animations) doesn't mean it would be impossible to run on anything but the best card.
Is raytracing absolutely needed? Obviously not, but this game cost $400M+ to make, just give people the option.
Cheerful_Champion@reddit
An RTX 3080 is the absolute minimum if you want to run the game at ultra or even high with RT enabled at 1080p. If you want to play at 1440p you are looking at an RTX 4080. How is that very easy? Enable 4K and a 5090 is needed.
sturgeon02@reddit
Don't know where you're getting those performance numbers; here's a 4060 getting 70-80 fps at 1440p/RT with no upscaling. I also used to play the game at 4K DLSS quality (so 1440p) on a 4070 and had no issues maxing the game out.
Cheerful_Champion@reddit
Tests that don't suck. Go play on these settings in Under No Flag, instead of a multiplayer scenario that produces results that can't be reproduced.
sturgeon02@reddit
Multiplayer seems like the relevant benchmark here, considering that's what the vast majority of people are playing. But what the heck, I still had the game installed, so I did some benchmarks on my 5080. At 4K with RT, the Under No Flag mission runs at around 70-90fps. It seems heavier than anything else in the game though, as the three other missions I tested all ran at 100-120fps, as did a fully populated match of Breakthrough with lots of action on screen. The 5080 is ~40% less performant than the 5090, so no, you absolutely do not need a 5090 for this game. And if you use upscaling like a reasonable person (not frame gen), you could get away with a much less powerful card.
Apprehensive-Box-8@reddit
OP is basically answering the question themselves by acknowledging a move to engine-specific lighting systems like Lumen. It’s the same reason why developers adopted RT in the first place: cut development time. Studios were able to cut time on lightmaps by integrating RT, but had to do so "by hand" and still needed a fallback for non-RT HW. With things like Lumen integrated into the graphics engine, they can cut even more dev time.
Strazdas1@reddit
Cutting lightmaps does more than cut development time. We had titles released where 90% of the file size was lightmaps because they attempted to lightmap global illumination outdoors. Sure, it looked great (for the time), but damn, downloading all that data for lightmaps when it should be generated in real time.
Flippantlip@reddit
Isn't that a touch exaggerated? How many textures are we talking about here, and in what resolution, for it to take up so much capacity?
Vb_33@reddit
No, he means software Lumen vs hardware Lumen and why so many games are shipping without HW Lumen support even though it's just a toggle in UE5. The reason is that it costs more time and money to implement 2 versions of Lumen vs 1 that works on all hardware (except Switch 2).
Petting-Kitty-7483@reddit
Heck lumen is still ray tracing. So it's not like that's dying. Just these games and engines are in flux
Darth_Ender_Ro@reddit
How the hell do you guys see the downvotes?
Verite_Rendition@reddit
At least on Old Reddit, it shows the percentage of votes that were upvotes ("% upvoted"), so you can infer the downvotes from that information.
I don't use New Reddit, so I don't know if and where that information is located there.
letsgoiowa@reddit
Wait where do you see that? I've been on Old Reddit since my account has been around and that feature vanished for me like 7 years ago.
Verite_Rendition@reddit
Top right-hand corner. Case in point, this post:
letsgoiowa@reddit
Ohhh just the posts for me not comments
Sevastous-of-Caria@reddit
BF6 did the sensible thing. BF5 RT was slow, outdated and unoptimised. Excluding it made BF6 the most optimised AND visually pleasing FPS out there.
not_a_gay_stereotype@reddit
Bfv had the easiest to run RT I've ever seen in a game and it looked great on the mud and water. 2042 only had RT global illumination and you basically couldn't tell the difference and it tanked your FPS.
Vb_33@reddit
You wish BF2042 had RT Global Illumination, in reality it only had RT Ambient Occlusion which is the cheapest (to run) and most unnoticeable RT effect of the whole batch.
At least BF5 had RT reflections which are more noticeable and in BF5 had great fidelity. also BF5 was the first RT AAA game, it paved the way for modern RT games, I run BF5 with RT reflections on all the time and it's great.
Thistlemanizzle@reddit
This alone kind of makes a case against RT. There’s good RT and super duper amazing RT. It’s too much to keep track of. Super Duper RT requires expensive hardware or heavy optimization (which costs money)
Vb_33@reddit
It's like any graphics setting, what the hell is ambient occlusion anyway? Anisotropic filtering? My favorite is the anti aliasing setting on UE5, it's so confusing. "Improves anti aliasing quality" ok so it makes DLSS better? Does it make it more demanding? If I put DLSS on performance and antiliasing on epic (max) will performance fall apart?
Turns out it doesn't do jack shit and no game ever says it. It only affects FXAA (lol), MSAA (double lol) and TSR but in what way exactly? Who knows, games never tell you.
Thistlemanizzle@reddit
How exactly do you get the top 1% commenter flair or badge?
Is this overall reddit or just this sub? I assume you have a great upvote ratio and a lot of upvotes in general?
Dangerman1337@reddit
DICE didn't even update BFV to support DLSS 2.0 :/.
not_a_gay_stereotype@reddit
That game was so easy to run you don't even need DLSS lmao
YNWA_1213@reddit
You say that like there wasn't an obscene amount of hitching in DX12 mode. Most in the community stuck to DX11 through its lifetime cause it offered a smoother experience.
not_a_gay_stereotype@reddit
I never had an issue running it, ever.
letsgoiowa@reddit
Clear your shader cache and run a campaign mission in DX12 and record the whole process. Freezes and stutters out the wazoo.
Strazdas1@reddit
I've yet to find a game that runs better in DX11 mode than DX12. In fact, in some games, where you could do it due to Unreal Engine being versatile, modding the game to run in DX12 mode improves performance.
disagreementsarenorm@reddit
That's just DX12 most of the time, isn't it?
Strazdas1@reddit
DLAA is the best AA there is though. It's always worth enabling even if there is no issue running the game.
not_a_gay_stereotype@reddit
I agree DLAA and FSR AA are great
Lars_Galaxy@reddit
Why
ArateshaNungastori@reddit
Ehh, BFV was rough performance-wise. It got fixed gradually later.
But BFV being 7 years old, with 3 generations of GPUs released since, also helps.
Sevastous-of-Caria@reddit
I'm still impressed that cards like the 5050 can extract 250 fps easily, same as CS2, while having 64-player lobbies with terrain deformation, physics, vehicles and explosions.
letsgoiowa@reddit
5050 at 250 fps???? Show me because my 3070 is certainly not doing that
ComputerUpgrader@reddit
I literally enabled ray tracing during the launch of this game on a literal gtx 1080 and at rt medium it was able to get between 30-40 fps iirc
Vb_33@reddit
Nah, BF5 looks best with RT reflections on. That's how I play it to this day. BF6 is just a PS4 game technology-wise running on current-gen consoles. They should be able to get BF6 running even on Switch 2 at reduced fidelity and fps. Think about it this way: the unlocked fps mode of BF6 on PS5 reaches 90fps in gameplay. BF6 was not built to maximize the capabilities of current consoles (disappointingly, the CPU), unlike BF3 on PS3.
It's an underspecced game that's running on overspecced hardware. I imagine the only reason it isn't on PS4 is because the Jaguar CPU would make gameplay fps wobbly.
einmaldrin_alleshin@reddit
BF5's ray tracing looked really tacked on though. Shiny cars fit in a racing game, but not in the apocalyptic hellscape of a WW2 battlefield.
It reminded me of that period in the mid 2000s where everyone put excessive bloom and god rays into their games.
digital_n01se_@reddit
2nm or 1.8nm sounds more feasible.
I think the transistor budget of 3nm will be used primarily for AI; RT will get a benefit, but the big money is in AI inference and AMD needs to catch up to NVIDIA.
However, the density of 1.8nm is massive and RT can get a healthy allocation of transistors.
I think the next PS5 will be 1.8nm, more SoC-focused, with a lot of experimental and exciting hardware features.
dabocx@reddit
1.8nm will not be ready for the PS6 if it's going to launch in 2026. It's probably going to be 3nm to keep costs down.
digital_n01se_@reddit
Honestly, I don't think we will have the PS6 by 2026, very unlikely.
2028-2029 sounds solid to me. The PS5 is a solid machine with a good CPU, decent GPU, RAM size and memory/IO bandwidth, very balanced as a whole compared to prior consoles. I'm sure it can last longer than the PS3 and PS4, and it will last longer if Sony wants.
1.8nm in 2028 at reasonable cost sounds plausible; I'm not too pessimistic about it, but you are right about the possibility of 3nm to cut costs.
dabocx@reddit
Sorry, meant 2027, and there's way too much smoke around the 2027 rumors. If it is on Zen 6 then it might be 2nm, but I think it's more likely to be Zen 5 with RDNA5/UDNA on 3nm.
Kryohi@reddit
> the next PS5 will be 1.8 nm
that's a good way to get a $1000 PS6.
Node is irrelevant here though, the next consoles will use RDNA5-derived architectures which will focus both on ML and RT. The problem is what happens with cross-gen titles, and for how long we are going to have those.
Strazdas1@reddit
They better fucking have a 1000 dollar PS6. Or are we going to get another e-waste dead on arrival console generation?
digital_n01se_@reddit
The PS5 Pro launch price was already $700.
The PS6 will cost $800-990 whether it's 1.8nm or not lol
Exciting-Ad-5705@reddit
Pro consoles are usually more expensive.
Good_luckapollo@reddit
Usually manufacturing costs go down and the pro console comes out at the old console's MSRP. The PS5 Pro is an anomaly given that that didn't happen, based on modern economic and manufacturing problems.
digital_n01se_@reddit
That was because the cost per mm2 of silicon used to decrease, but that's not the case anymore. We are getting more transistors, but those transistors are expensive, and the price rarely goes down.
A PS6 at 990 dollars would be an exaggeration, but I seriously doubt a launch price of $500. I think the PS6 will launch at the same price as the PS5 Pro ($700).
Even game prices are rising; I just prefer not to expect cheap hardware anymore.
Good_luckapollo@reddit
Likely yes, I'd imagine current prices are to maintain stock and familiarize people with the new prices so it's expected for next gen and not a shock. Clearly gaming is going back to where it was in the 90's as a luxury. It's a shame really, all of the emergent tech that's streamlined efficiency in making and selling games should still be driving costs down for games at the minimum.
digital_n01se_@reddit
Consoles will not be $500 anymore at launch, that's what I'm trying to say.
This is the first time that consoles increased their price instead of decreasing it due to inflation; they even removed physical media and increased the prices of online services.
The RTX 5090 is two grand alone.
The RTX 5080 costs more than $1000.
They will find a reason to price the PS6 at 800-990 dollars.
Noreng@reddit
Nvidia doesn't need to add more SMs or RT cores with their next generation of GPUs, they need to increase the ability to feed the existing SMs. Most games today have huge issues with feeding the SMs work, with multiple UE5 titles choking the 5090 down to 400W because the front-end isn't capable of feeding the SMs.
As for AI, maybe? Improving SM utilization would already be a significant help
digital_n01se_@reddit
you mean, more cache?
out of order execution and branch prediction?
interesting
Noreng@reddit
Not necessarily more cache, but GPCs
digital_n01se_@reddit
I think modern workloads have become "too general purpose" to fit into the classic massively parallel GPU scheme; perhaps GPUs would benefit from a more complex front-end capable of proper branch prediction and out-of-order execution, or a smarter cache design.
I'm concerned about the power draw: the 3090, 4090 and 5090 increased the power to 350W, 450W and 575W TDP.
If someone can clarify my ignorance, I would be glad.
Noreng@reddit
The entire point of GPUs is SIMD, if you're trying to not use SIMD on a modern GPU it's a skill issue rather than a hardware issue.
I don't know exactly what's causing the poor utilization in Ada and Blackwell, but it's likely a case of front-end.
digital_n01se_@reddit
Imagine a branch inside SIMD code, some sequential thing that would lose performance if you executed it on the CPU and then had to travel across PCI Express, due to latency.
Some workloads waste performance due to this CPU -> PCIe -> GPU travel; GPUs don't work well with sequential code, and furthermore some tasks can't be vectorized.
I think that's why some tasks fit better on an AVX-capable CPU, where the SIMD units are close to the CPU cores, minimizing latency.
digital_n01se_@reddit
u/Noreng
I realized that it's far more expensive to develop such complex GPUs than to optimize code.
A GPU GPC capable of that would be much larger, which means fewer GPCs per die and therefore less performance, and the performance uplift is questionable.
thanks for your clarification
KennKennyKenKen@reddit
Never made sense a multiplayer shooter was pushing the boundaries of graphics.
They did the right thing this year by focusing on optimisation. And they did an amazing job.
It doesn't get enough credit. My 5 year old 3080 can run the game at nearly 200fps. Insanity, so pleasantly surprised considering how many unoptimized games have been coming out
Techhead7890@reddit
EA being EA, the cynical part of me thinks there was a business deal or some weird business strategy to include it and the mainline devs couldn't convince the producers and managers to drop it lol.
And then again I think you also discount the amount of campaign only folk. It's not as bad as in RTS where competitive PVP is brutal and tiny, but sometimes people are okay with less performance on a singleplayer experience that looks good.
But yeah glad to hear the performance is better when you do go into pvp!
Strazdas1@reddit
Multiplayer shooters used to always push the boundaries of graphics, until we got into the whole competitive scene with Counter-Strike, and then you had a bunch of people at internet cafes running the worst possible hardware playing them.
iBoMbY@reddit
The Frostbite Engine has been a complete mess for a long time, especially since Johan Andersson left.
Strazdas1@reddit
What a way to ruin such a good engine :(
Bmotley@reddit
As anyone who plays Madden (especially on PC) will tell you.
cookieblair@reddit
To be fair, the reflections kinda messed up BF5's multiplayer because only some players could see them.
studyinformore@reddit
Well, we have to remember all the really important stuff isn't 3nm. All of the actual gates will still be in the 17 to 24nm range. It's only some features like the interconnects and maybe cache that'll be 3nm.
SERIVUBSEV@reddit
This is simply a case of AMD being terrible at marketing, and having to catch up with whatever nonsense the jacket man makes up to hype his products.
Ray tracing looks good, but it performs 100x better if done during the game development phase (baked) instead of in real time while someone is playing the game.
Remember, real-time RT is only needed on shiny surfaces and water. It is not the groundbreaking tech that gamers have been made to think it is.
Glebun@reddit
That's not true, though. It makes a difference on all kinds of surfaces, and it can only be baked in if the light sources and the surfaces that the light falls on are stationary (or move along predetermined paths).
It makes a difference with regular non-glossy surfaces. Watch RTX on-off comparisons for any game, including Cyberpunk - you will see that the light and its color affects surfaces like walls (just like it does in real life).
account312@reddit
Which, to be clear, includes NPCs and the player character.
SomniumOv@reddit
dynamic lighting, day and night cycles, modular prefab games like the Elder Scrolls ?
Taeyangsin@reddit
Or in games with very variable props/destruction, hard to bake lighting into something like Teardown I'd imagine.
macholusitano@reddit
I agree. Here’s an upvote.
Doikor@reddit
This doesn't really help as the price per transistor hasn't really gone down (and nobody expects it to).
Yes there is space for it on the die but nobody will buy a $1500 PS6
Keep-Darwin-Going@reddit
It was more that Nvidia pushed for it at the initial launch. When left to market forces, it makes sense not to use it. It only works well on the best GPUs now and costs too much for the majority of gamers. Why would you spend so much effort to let 1% of your target audience enjoy a little bit more, instead of on better gameplay?
theoutsider95@reddit
Because OP cherry-picked a few titles that didn't come with HW RT. If you check the games released this year, most of them have HW RT.
From-UoM@reddit
Software Lumen is the reason UE5 performance sucks ass
jonydevidson@reddit
I see you never used Unreal Engine.
More than likely the culprit is bad programming and low usage of instanced static meshes because if you don't do ISM, every copy of a mesh that you might have in the loaded level will have its own draw call every frame.
Lumen is kind of set-it-and-forget-it; bounces are bounces, they don't care about geometry, so in theory it scales very well. That's why your empty UE5 project with Lumen on runs at 200 FPS - what tanks it over the course of development isn't Lumen.
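As a hypothetical illustration of the ISM point (standard UE5 C++, but the class and layout are made up; the mesh asset itself would be assigned in the editor): putting 400 copies of a prop into one UInstancedStaticMeshComponent submits them as a single instanced draw, instead of 400 separate per-actor draw calls.

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/InstancedStaticMeshComponent.h"
#include "RockField.generated.h"

UCLASS()
class ARockField : public AActor
{
    GENERATED_BODY()

public:
    ARockField()
    {
        // One component holds every instance; the renderer batches them together.
        Rocks = CreateDefaultSubobject<UInstancedStaticMeshComponent>(TEXT("Rocks"));
        RootComponent = Rocks;
    }

    virtual void OnConstruction(const FTransform& Transform) override
    {
        Super::OnConstruction(Transform);
        Rocks->ClearInstances();
        for (int32 X = 0; X < 20; ++X)
        {
            for (int32 Y = 0; Y < 20; ++Y)
            {
                // 400 rocks, one instanced draw (per material section),
                // instead of 400 individual static mesh draw calls.
                Rocks->AddInstance(FTransform(FVector(X * 200.f, Y * 200.f, 0.f)));
            }
        }
    }

    UPROPERTY(VisibleAnywhere)
    UInstancedStaticMeshComponent* Rocks;
};
```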
temo987@reddit
Doesn't Nanite consolidate them?
jonydevidson@reddit
Why would it? Nanite is just a rendering mechanism for automatic LOD. Creating an ISM takes away control since it groups the meshes into one. If you want to control them separately, you have to split them up.
temo987@reddit
IIRC Nanite consolidates all (Nanite) meshes into a single (big) draw call. It's the reason behind all the "big base cost but scales well" talk. After you pay the performance cost, you can do whatever you want.
OoFTheMeMEs@reddit
Are we smoking crack nowadays? Since when is
Calneon@reddit
Software Lumen doesn't use HWRT so not sure what relevance your point has.
Strazdas1@reddit
That's the point: it does not use the fast way to do RT (hardware), instead choosing the slow way (software).
From-UoM@reddit
It does. Software RT was a terrible idea and was done for compatibility reasons.
And the worst part? Even the hardware RT Lumen implementation still runs parts of software RT Lumen.
So a double whammy.
UE6 has to ditch software-based RT completely and go full hardware acceleration.
binosin@reddit
Iirc the only parts of software RT remaining are unfortunately core to how Lumen itself works. Lumen retains low quality world space lighting in the surface cache for stable lighting. It traces into the cache using real geometry (hardware Lumen) or SDFs (software Lumen) to gather and mixes screen traces where possible to clean up reflections, hide uncached lighting and cover missing or misaligned geometry. If Im getting this right the only leftover software parts running in HW mode are screen space gathering and trace mixing which is kind of core to the technique in general because of how poor the surface cache is. Even Cyberpunk PT needs screen traces for missing stuff like smoke.
Imo Lumen itself needs a bit of a rethink to change how it's caching lighting. The cached lighting basically needs screen traces because it's so low quality. I don't know of any solution except almost path tracing (which could work with the cache to reduce variance) but that's still extremely expensive. Lumen was a bit too designed to make software tracing bearable
Different_Lab_813@reddit
Explain how software RT is using HWRT. Just because it's using compute shaders doesn't mean it's using RT cores.
From-UoM@reddit
Stuff like Surface Cache and screen tracing are still done on the shader.
Calneon@reddit
Nobody is denying that HWRT uses some components of SWRT.
I said that SWRT doesn't use HWRT (which is true). You said it does. /u/Different_Lab_813 asked you to clarify how SWRT uses HWRT, and you respond to say how HWRT uses SWRT.
HWRT using SWRT is not the same as SWRT using HWRT. There's some miscommunication going on here.
From-UoM@reddit
Huh?
I never said anything about Swrt using hwrt
Go look at my comment. You misread it
Calneon@reddit
I said:
> Software Lumen doesn't use HWRT
You said:
> It does.
My original comment was questioning why bringing up SWRT is relevant when this thread is about HWRT.
From-UoM@reddit
Ah shit. My bad.
I MISREAD. Thought your original comment was HWRT doesnt use SWRT
My mistake.
Calneon@reddit
Aah OK that makes sense, no problem :) glad to work it out.
From-UoM@reddit
Yeah. Sorry about that. All good now.
From-UoM@reddit
Oh shit, I just realized what happened. Sorry, it wasn't you that misread it.
It was u/Different_Lab_813 who did.
Here is my comment he replied to:
>Even the hardware RT Lumen implementation still runs parts of software RT Lumen
I never said anything about SWRT using HWRT.
https://www.reddit.com/r/hardware/comments/1on70rj/comment/nmuq92g/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button
alelo@reddit
i guess SW RT is like SW PhysX?
From-UoM@reddit
Not quite. The quality of Software RT is lower.
PhysX on sw and hw runs the same simulations afaik. One just slower than the other. I could be wrong here though
alelo@reddit
Yeah, but I guess you could run SW RT at the same fidelity that HW RT runs at; it would just be as slow as SW PhysX is compared to HW PhysX.
Calneon@reddit
They are ditching it. Basically support is being dropped and they are recommending developers to focus on the HWRT path.
battler624@reddit
Won't happen.
They should make it default to full hardware Lumen and further optimize its performance.
_OccamsChainsaw@reddit
That is the point. Poorly implemented software lumen is the reason my 5090 can still struggle with some of the more egregious UE5 titles. Everyone gets bad stutter.
Calneon@reddit
With all due respect, you don't know what you're talking about. Software Lumen doesn't cause stuttering. There are many things that can cause UE5 games to run poorly; you have no idea if it's Lumen or any number of other reasons without doing in-depth profiling, which would require a development build of the game and tools.
Software Lumen isn't perfect, maybe it's not even great. But it's trying to do something that hasn't been done before with wide hardware compatibility.
Petting-Kitty-7483@reddit
Yeah, software Lumen was one of the only good things about UE5 until 5.6 launched. The stutter has been there since UE3. It just wasn't until this gen, with larger maps, that it became that apparent to everyone.
Dangerman1337@reddit
And it only started happening once DX12 was adopted.
Petting-Kitty-7483@reddit
The stutter predates software Lumen by at least two gens. UE3 had it if you pushed the maps large enough, even. It just didn't reach the breaking point until devs started pushing the engine hard like UE5 maps do.
Sevastous-of-Caria@reddit
There was a great video showing the GPU execution pipeline and its execution-time problems with UE5.
https://youtu.be/c3zZtVBspzU?si=pIi_G2Anl68-ZbyN
Aggrokid@reddit
Threat Interactive seems like a sketchy channel to me. He claims to be a game developer but all he seems to do is UE ragebait
Seanspeed@reddit
He is in no way a game developer. He's an armchair tinkerer at best.
The guy has been constantly dunked on by actual game developers, trying to call out his nonsense whenever idiots go parroting his videos/claims on social media.
LockingSlide@reddit
If he was just confidently incorrect he'd be easy to ignore, but the guy behaves like a cult leader, constantly referring to himself as "we" to trick people into believing it's not just him working on the videos, and saying there's a "movement" which he is apparently the self-appointed leader of.
In the linked video he flashes names of Lumen developers and says they should be "held accountable", borderline psychopathic behavior and even a little dangerous, someone could harass or do something worse to these people.
Seanspeed@reddit
The guy has also tried to push his followers on harassment campaigns against others, like Digital Foundry. Guy is a scumbag on top of being a hack.
ChangeRemote7569@reddit
Proof? Because in his Alan Wake 2 video one of the devs commented and said it was well made and they would send it to the tech team
Seanspeed@reddit
Well that's probably cuz he was praising Alan Wake 2 for something or the other.
The problem is when he's criticizing stuff, which is like 95% of his videos.
And no, I've not saved any of the receipts, but it's happened lots of times. The way the guy just throws out tons of "Oh if they just used 'x' technique that I found some paper on", as if it was at all that simple. The guy has no clue about the vast majority of things he talks about, and has no actual practical experience DOING any of the things he proclaims will just magically fix AAA games.
What's more likely: that the entire industry of professional game developers has become incompetent and couldn't come up with these 'easy' solutions that Threat Interactive talks about, or that things aren't at all as Threat Interactive claims and he doesn't really know what he's talking about?
ChangeRemote7569@reddit
No, it was a critical video saying its graphics were overhyped. The rest of your response is a nothingburger; you still haven't provided any actual details. Using the "what's more likely" hypothetical as an argument suggests to me you don't actually know what you're talking about.
veryrandomo@reddit
He had/has a whole goal of trying to raise $900k so he could "fix" UE5, but when I looked at some of his messages on the Graphics Programming discord he outright said he only had entry level knowledge at best, and that it didn't matter if he understands what he's talking about because his viewers don't either
His whole shtick is tossing around a bunch of jargon to make himself sound smart, then only responding/arguing against random comments while copyright-striking anyone else so he can play the "my critics don't know what they're talking about" card.
kuddlesworth9419@reddit
He comes across as a douche but he makes some valid points. Especially on image noise which is so prevalent these days.
Seanspeed@reddit
Because he's an absolute hack who rarely ever *actually* knows what he's talking about. But people who know less than him are unaware of that, so think he does.
From-UoM@reddit
He is egotistical, but UE has severe problems
Sevastous-of-Caria@reddit
It is a sketchy channel, yeah. But that render pipeline isn't defensible.
hishnash@reddit
Lumen sucks ass whether it is using RT HW or not.
From-UoM@reddit
Because even if you turn on RT HW, parts of the SW RT are still running.
Unless they completely ditch every SW RT aspect, performance will continue to suffer.
hishnash@reddit
not just that, also just how it works under the hood means it spends a LOT of extra compute (HW or SW) in places where it does not need to.
reveil@reddit
The quality gain is not worth the performance cost on the vast majority of hardware. Until we get a mainstream xx60 class card that can do RT well there is no point in optimizing for the 2% of customers that have a 5090. It makes much more sense to lower min hardware requirements to sell to a wider audience.
Strazdas1@reddit
a 4060 can do ray tracing without issue, so we've had mainstream-class cards capable of RT for two years now.
reveil@reddit
I have one and it can but very poorly with bad performance even on 1080p. Cyberpunk with path tracing recommends at least a 4070 for a reason.
Strazdas1@reddit
First, do you know the difference between path tracing and ray tracing?
reveil@reddit
Yes and the path traced cyberpunk looks incredible while the normal rt just has higher requirements and I'm not even sure if it even looks any better than the rasterized version.
Morningst4r@reddit
Rasterised cyberpunk looks terrible. It’s like a different game.
reveil@reddit
Non path traced normal rt looks almost the same as rasterized just a tad darker. If you don't have water reflections most people would fail a blind test. Path traced looks so much better tho.
Strazdas1@reddit
Good, then go back and read again what i said. I was talking about ray tracing, not path tracing.
exscape@reddit
The 5060 can run RT in quite a few titles depending on your requirements. (Since it's the second-lowest-end NVIDIA GPU you obviously shouldn't expect 100+ fps.)
For example (all in 1080p, all native/no DLSS): Cyberpunk RT Ultra 45 fps, Veilguard Ultra High 52 fps, F1 24 72 fps, Resident Evil 4 98 fps, SW: Jedi Survivor 64 fps
Games in the same test suite that weren't playable with those settings: SW Outlaws 22 fps, Alan Wake II (RT High) 19 fps, Black Myth: Wukong (full RT, path tracing I think?) 14 fps.
5/8 ain't bad for such high settings on a fairly low-end GPU. Could probably hit 60 fps in those games with DLSS Quality and some non-ultra settings.
Vb_33@reddit
All RT titles run well on a 5060. Hell, most RT titles are built to run on a 2060 Super and PS5 (see Assassin's Creed Shadows).
fuzzynyanko@reddit
This is a good point. The most popular card on Steam is still probably the RTX 3060.
Heck, my laptop has the laptop RTX 3060 because I just wanted a decent gaming machine that plays 75% or more of my games (my desktop has a beefier GPU and there the number is closer to 99%)
reveil@reddit
The top 5 cards are: 1. 3060, 2. 4060 laptop, 3. 4060, 4. 3050, 5. 1650. None of these are capable of good RT performance.
dannybates@reddit
I have a 5090 and I still don't use RT. It's just not worth it imo.
DabuXian@reddit
Gamers have become incredibly hostile towards graphics technology. Anything that pushes graphics forward, like Crysis did, tends to get negative reviews and poor sales. Meanwhile games that proudly announce they only use old rendering methods like BF6 are universally praised and get free social media engagement from happy users. I'm not surprised devs increasingly avoid using ray tracing.
The_Countess@reddit
Gamers became hostile to that because the cost of the fastest GPU available has increased 10x, while midrange performance has increased MUCH slower, while still costing a lot more than it used to.
Strazdas1@reddit
No, it hasn't. A 5060 costs less than a 1060 did at launch.
The_Countess@reddit
Just because they are both called xx60 doesn't mean they are the same. Twice now Nvidia has downgraded what the xx60 class of GPU is, moving it lower and lower in the stack.
To illustrate this, there used to be a 1030, but there isn't a 5030. What would have been a 1030 is now a 5050. (And I'm being generous and ignoring the 1010.)
So while the xx60 used to be the middle of the stack, now it's just second to last.
Strazdas1@reddit
This is nonsense. What used to be the 1030 is called integrated graphics now. It's simply not worth making GPUs that are only as good as integrated graphics.
It was never the middle of the stack.
It always was second to last.
skycake10@reddit
That's just a reasonable priority for a game like BF6. It's an online competitive game where good and consistent performance is king for most users. There's no reason to use RT if half your users are going to turn it off for performance reasons.
DabuXian@reddit
Competitive gamers don't care about graphics at all; if you click through BF6 Twitch streams almost everyone plays on the lowest settings. You could use that argument to advocate for all MP games to look like BattleBit or Minecraft, so why even bother with graphics at all?
Strazdas1@reddit
yet another reason to not like the sweat boys.
Darrelc@reddit
Are you sure? There's a lot of throaters out there
Key-Pace2960@reddit
Multiplatform releases mean that you have to develop for consoles that use RDNA2, which doesn't really have any RT acceleration hardware worth mentioning and is pretty much just brute forcing it.
Unfortunately that means that software RT solutions like Lumen are often prioritized, despite not running meaningfully better (if not outright worse) than hardware RT solutions on NVIDIA, Intel and current-gen AMD cards, while also looking a lot worse.
That being said hardware RT isn't going anywhere, if anything we're starting to see games with mandatory hardware RT.
As much as I like it, I also don't think it always makes sense to implement RT outside of an enthusiast POV. Battlefield 6 for example is a fast paced shooter that needs to run well for a broad audience. I definitely would have appreciated RT, but I get why it wasn't a priority. Most people probably wouldn't have used RT anyway in order to maximize their framerates in a competitive shooter.
While RT is considerably easier to implement than a traditional raster approach, you still have to do both to some extent until we get to a point where the lowest common denominator can run a comprehensive suite of RT effects with decent performance. So you have to do many things twice, and as such the resources aren't always allocated to RT.
Dangerman1337@reddit
Thing is, BF6 doesn't even use Mesh Shaders even though they would give current hardware a performance boost! The GTX 10 user base is crazy vocal and tears into devs who use newer tech properly.
Strazdas1@reddit
consoles can't run mesh shaders. consoles holding gaming back as usual.
Dangerman1337@reddit
Xbox Series can and PS5 has Primitive Shaders.
Strazdas1@reddit
Yes, they do. Those are prototypes of mesh shaders. But it's not the same thing and not the same capability.
Kemaro@reddit
Because HW runs worse and often looks worse or at best very similar to SW since most devs don't implement a good denoiser.
aronmayo@reddit
FWIW Lumen IS raytracing. Hardware Lumen is hardware RT. Software lumen is software RT.
The reason the hardware stuff is less prominent now is that it’s tough to get running on consoles and they are still the primary development target for most AAA games.
Dangerman1337@reddit
These days when a game developer does push hardware, like AWII (not RT only, but Mesh Shaders) or Doom The Dark Ages, they get monstered by GTX 10 owners who scream at them for being unoptimised and lazy. And even when developers like id explain, it gets ignored.
I mean, how many games actually use Mesh Shaders? I don't think it's more than you can count on one hand.
F0czek@reddit
But why? You get nothing out of mesh shaders (that game runs like ass and looks like ass too), and Doom The Dark Ages also runs unimpressively, but hey, you get semi-realistic lights. BUT WHY? It's not like it looks amazing, it looks pretty much like any other modern game.
Morningst4r@reddit
AW2 is incredibly well optimised on the CPU, which will at least partly be due to mesh shaders
Dangerman1337@reddit
AWII would've run worse without mesh shaders. And Doom The Dark Ages upped the scale, with more interactive environments etc., so there would've been a performance hit regardless, and it also has smooth frame times, which is a bigger tell of optimisation than pure frame rate.
F0czek@reddit
No it wouldn't, donkey. Games don't need mesh shaders to fucking run well, it's just tech that restricts older cards for the sake of it. If it was genuinely better than traditional methods you would see it being used, but it is not.
9897969594938281@reddit
Update your card, bro
Petting-Kitty-7483@reddit
Imagine living in 2025 and having a GPU worse than the steam deck like that poor sod does.
F0czek@reddit
Imagine being a worthless pig that is clueless about GPUs and their perf
hardware-ModTeam@reddit
Thank you for your comment! Unfortunately, your comment has been removed for the following reason:
Please read the subreddit rules before continuing to post. If you have any questions, please feel free to message the mods.
F0czek@reddit
5070 ti not enough i suppose
hardware-ModTeam@reddit
Thank you for your comment! Unfortunately, your comment has been removed for the following reason:
Please read the subreddit rules before continuing to post. If you have any questions, please feel free to message the mods.
Dangerman1337@reddit
By that logic Monolith was being lazy mandating DX 9.0 cards for FEAR in 2005?
Sure, the GPU market is bad because of the crypto boom and then the AI boom and AMD not releasing competitive architectures until now, but it's not unreasonable for devs to drop nearly 10 year old hardware.
Different_Lab_813@reddit
Tell me you know nothing about mesh shaders without telling me. Mesh shaders have nothing to do with the image being blurry or games running worse. Quite the contrary, using them improves performance; they offer way better geometry culling.
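To make the culling point concrete, here is a rough C++ sketch of the per-meshlet tests a mesh shader pipeline enables; the struct layout, function names and the simplified cone test are illustrative only, not taken from any particular engine or API:

```cpp
// Illustrative only: geometry is pre-split into small "meshlets", each with a
// bounding sphere and a normal cone, so whole clusters can be rejected on the
// GPU before any per-vertex or per-triangle work happens.
struct Meshlet {
    float center[3];   // bounding sphere center
    float radius;      // bounding sphere radius
    float coneAxis[3]; // average facing direction of the cluster's triangles
    float coneCutoff;  // cos(half-angle) of the normal cone
};

// Returns true if the whole meshlet can be skipped. In a real mesh shader
// pipeline this runs per meshlet on the GPU; here it is plain C++ for clarity,
// and the backface test is simplified (it ignores perspective).
bool CullMeshlet(const Meshlet& m, const float viewDir[3],
                 const float frustumPlanes[6][4]) {
    // 1) Normal-cone backface test: if every triangle in the cluster faces
    //    away from the viewer, nothing in it can be visible.
    float facing = m.coneAxis[0] * viewDir[0] +
                   m.coneAxis[1] * viewDir[1] +
                   m.coneAxis[2] * viewDir[2];
    if (facing >= m.coneCutoff)
        return true;

    // 2) Frustum test: reject if the bounding sphere lies fully outside any
    //    of the six frustum planes (plane = {nx, ny, nz, d}).
    for (int i = 0; i < 6; ++i) {
        const float* p = frustumPlanes[i];
        float dist = p[0] * m.center[0] + p[1] * m.center[1] +
                     p[2] * m.center[2] + p[3];
        if (dist < -m.radius)
            return true;
    }
    return false; // potentially visible, let the mesh shader emit its triangles
}
```

The old vertex-shader path has no natural place to do this kind of cluster-level rejection, which is roughly where the "way better geometry culling" claim comes from.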
KennKennyKenKen@reddit
I mean the last few dooms were a benchmark of optimisation, running at like hundreds and hundreds of fps even on modest hardware.
Super jarring this new one runs like shit.
Morningst4r@reddit
The new one runs amazingly considering how good it looks. If you max out the settings on a midrange GPU it’s not going to be great but it’s good to have some scaling.
Strazdas1@reddit
to be fair AW2 got fucked over by being Epic exclusive, which already means 99% of the audience lost.
Dangerman1337@reddit
Yeah true but Epic was the reason why the game exists in the first place, wish Remedy and Epic would reach a compromise to get it on Steam :/.
scytheavatar@reddit
Blame Nvidia, gamers are justified at screaming at these devs when you look at GPU prices.
Strazdas1@reddit
Same prices as always (even less when accounting for inflation) for budget GPUs that run new games better than they did back in the 1000 generation.
Nicholas-Steel@reddit
Well, afaik the PS5 and maybe the PS5 Pro also don't support Mesh Shaders so if a game is releasing on those consoles and gets a PC port that performs like ass on high end 1000 series cards... of course it's gonna get bashed when Mesh Shaders discussion is brought up.
Strazdas1@reddit
Then maybe it's time to stop supporting that obsolete console hardware.
Doikor@reddit
PS5 has the predecessor of mesh shaders called primitive shaders.
The problem with using the same gen AMD card on Windows is that there is no way to access the primitive shaders through DirectX. Nvidia truly had nothing like it in their hardware until Turing (16xx/20xx).
smarlitos_@reddit
I prefer software GI and screen-space tricks
There’s been too much of a focus on needing a $2K PC to run the game at the best settings when the focus should be on the game being fun.
fkrkz@reddit
Handhelds are on the rise, the next billion dollar addressable market. None of them are capable of RT.
Also, RT is just eye candy that does not improve gameplay
InbetweenTheLayers@reddit
I mainly miss reflections. I feel like I never see rtx reflections offered now; it's all screenspace garbage whose quirks I've grown very allergic to.
BighatNucase@reddit
Even games with full on hardware RT seem to use screen space for some reflections which is frustrating.
DanaKaZ@reddit
Avoiding Screen Space reflections and SSAO is the primary use case for RT for me.
The rest isn't really worth the performance hit to me.
OoFTheMeMEs@reddit
Simple answer: laziness/cost-cutting. Hardware-accelerated Lumen (as atrocious as Lumen is) exists and it has significantly better results for similar/better performance. It is just a result of UE5's feature set (or lack thereof), the ecosystem around it and its pricing structure. Promoting development cost minimization while giving you the ability to make games look "AAA" is its selling point.
NGGKroze@reddit
My best guess is development cost/time.
RT is rarely a priority, and you either build it in (like Doom, which has RT all the time) or it's a toggle and you have to implement it separately from the raster path. So Lumen, being already part of UE5, might be the easier choice for developers.
From-UoM@reddit
RT implementation is much faster than raster.
It's doing both that takes time.
The most sensible thing is to drop raster altogether, especially with the 10 series now officially no longer getting driver support.
Seanspeed@reddit
What does driver support have anything to do with this? :/
RomBinDaHouse@reddit
Obsolete → dropped from dev/publisher targets → low settings creep up and leave old GPUs behind
Seanspeed@reddit
Again, I don't understand what driver support has to do with this. This will all happen naturally from older/weaker specs and lack of modern features.
Sevastous-of-Caria@reddit
Path tracing demos' fps counters disagree. As far as I understand, PT ignores raster altogether for lighting? And it doesn't push over 20fps on a 5090.
leeroyschicken@reddit
Not for lighting.
Rasterization has nothing to do with lighting - it's how geometry is translated to an image.
With path tracing, rasterization would be redundant (it's still used for UI elements, but that's beside the point), because you already get the shapes by launching all those traces.
If you begin by rasterizing your image, you get only the shapes, and you still have to fill them in. At this point you can use RT as well.
Technically PT doesn't have to be realistic or complete, but ending at first hit would pretty much defeat the entire point of doing it.
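To illustrate the "you already get the shapes by launching all those traces" point, here is a tiny self-contained C++ sketch: one primary ray per pixel against a single hard-coded sphere, printed as ASCII. It is purely conceptual and not how any real engine is structured.

```cpp
#include <cstdio>

// One ray per pixel against one sphere: the hit/miss pattern is the same
// visibility information a rasterizer would produce by projecting geometry.
// A path tracer then keeps bouncing from each hit point to compute lighting,
// which is where the real cost lives.
struct Vec { float x, y, z; };
static Vec   sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Ray-sphere intersection test for ray origin o and (unnormalized) direction d.
static bool hitSphere(Vec o, Vec d, Vec center, float radius) {
    Vec oc = sub(o, center);
    float a = dot(d, d);
    float b = 2.0f * dot(oc, d);
    float c = dot(oc, oc) - radius * radius;
    return b * b - 4.0f * a * c >= 0.0f; // real roots => the ray hits the sphere
}

int main() {
    const int W = 48, H = 24;
    Vec cam{0, 0, 0};
    Vec sphere{0, 0, -3};
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Simple pinhole camera: map the pixel to a ray direction.
            Vec dir{(x - W / 2.0f) / W, (H / 2.0f - y) / H, -1.0f};
            std::putchar(hitSphere(cam, dir, sphere, 1.0f) ? '#' : '.');
        }
        std::putchar('\n');
    }
    return 0;
}
```

Everything past that first hit (shadows, bounces, reflections) is extra rays and extra shading, which is what actually separates "a bit of RT" from full path tracing.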
From-UoM@reddit
You can have RT without doing every pixel like PT does.
Games like Doom The Dark Ages and Indiana Jones do only hardware RT and offer PT as a bonus.
larso0@reddit
Path tracing is just a specific way to do ray tracing. I don't understand why we are talking about this like they're distinctly different things.
From-UoM@reddit
I know. It's semantics really.
Dangerman1337@reddit
Should be, but people have crazy expectations of very high framerates in every game. I mean, I hope those RDNA 5 rumours are true and the AT3 SKU comes in at like 300 US dollars worldwide with RX 9070 performance and potentially 4090-level RT and PT performance. Then the Pascal die-hards can just shut up and have no excuses any more.
Doubleyoupee@reddit
What about "It just works"?
relxp@reddit
Pretty simple when the 5090 is struggling to run many titles nowadays.
Enigm4@reddit
Probably because the added fidelity just isn't worth the performance hit. Unless you have a $1000+ modern graphics card, RT/PT just isn't that great. Small improvement in fidelity at the cost of the game running like crap. Games can be made to look more than good enough just with traditional rendering.
Valuable_Impress_192@reddit
The number of games with a sensible implementation in those 6 years can be counted on one hand. In every other game it adds less than it takes from the experience.
Not having RT exponentially increases the amount of hardware capable of running the game.
More PCs capable = likely more players = likely more money.
Not rocket science
Strazdas1@reddit
that's some mutated hand with dozens of fingers you have.
Valuable_Impress_192@reddit
I said sensible brother
epraider@reddit
I personally disable it in nearly every title, despite having a rig that can handle it reasonably well. It's just not worth the performance hit, and in many games I actually just completely dislike how it makes the game look.
From-UoM@reddit
OP is specifically talking about hardware RT.
RT is pretty much in every UE5 game with Lumen software RT.
Petting-Kitty-7483@reddit
And several other games like Kingdom Come 2 with their software RT implementation. Plus it's not like most of the games that have hardware RT required it.
loozerr@reddit
Maybe the hype is finally dying and developers are realising that slightly more accurate lighting isn't worth halving your fps for.
Strazdas1@reddit
Maybe developers dont want to deal with nonsensical lies like your comment.
loozerr@reddit
Point out the lie. Ray tracing isn't worth enabling even in games which do it relatively well like cyberpunk.
Strazdas1@reddit
here are two lies in your comment:
Darrelc@reddit
It's fine because not only will they sell you the problem, they'll sell you the solution too (FG)
Seanspeed@reddit
Except plenty of developers are proving that when they custom tailor their own tech for RTGI, they can achieve fantastic visuals without any massive performance costs.
Also, we're getting lots of UE5 games now, with RT as standard.
Dangerman1337@reddit
Problem is that games are taking longer and sometimes you have to make things look uneven with raster to meet deadlines.
loozerr@reddit
That's a management problem.
MarcCDB@reddit
RT is not ready for prime time yet. Hardware is too weak for proper implementation.
Strazdas1@reddit
The HW is not too weak unless you are on console.
scytheavatar@reddit
EA said it themselves, a significant number of players use potatoes to play their games. Ignoring them would be a fucking stupid idea. The failure of Doom The Dark Ages should have made it clear hardware ray tracing has lost the war.
Full path-tracing is not practical in 2025, it's probably not practical until the PS7 era.
Strazdas1@reddit
Not catering to the worst players would not be a stupid idea at all.
leeroyschicken@reddit
Adding optional features wouldn't mean ignoring those players. There must be another explanation, even a simple one like they just didn't bother.
It's hard to pin that on the technology when the game itself was just underwhelming. As far as I am aware, it runs just fine on consoles, which would make up the bulk of the player base anyway.
That sounds about right, with high-end PCs enjoying it a few years prior.
Personally I am of the opinion that the tech will be used as a baseline more in the future, and that will allow software to catch up and provide us with more performant solutions. For example, reflections are not slow because of the ray tracing calls (though those do have overhead), but by virtue of simply adding a lot of extra shading; the game engine could somewhat reliably guess the clarity of such a reflection and then downgrade the quality of shading as much as possible, or even exclude things from it completely.
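A rough sketch of that "guess the clarity, then cheap out on shading" idea; the tiers, names and thresholds below are made up for illustration, though real engines do something in this spirit (for example falling back to prefiltered probes on rough surfaces):

```cpp
// Illustrative quality selection for ray traced reflections. Not from any
// engine; the thresholds are placeholders.
enum class ReflectionQuality {
    ProbeOnly,      // very rough surface: a prefiltered cubemap is indistinguishable
    RayLowShading,  // blurry reflection: trace the ray, but shade the hit cheaply
    RayFullShading  // mirror-like: trace and run full material/lighting at the hit
};

ReflectionQuality PickReflectionQuality(float roughness, float screenCoverage) {
    // Tiny on-screen contribution never justifies expensive shading.
    if (screenCoverage < 0.01f)
        return ReflectionQuality::ProbeOnly;
    if (roughness > 0.6f)   // the reflection will be heavily blurred anyway
        return ReflectionQuality::ProbeOnly;
    if (roughness > 0.25f)  // visible but soft: the ray is cheap, the shading isn't
        return ReflectionQuality::RayLowShading;
    return ReflectionQuality::RayFullShading;
}
```

The ray itself is the cheap part; what this avoids is running full material and lighting evaluation at hit points whose result will be blurred away anyway.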
EiffelPower76@reddit
"hardware ray tracing has lost the war"
No, ray tracing will win in a few years
meshreplacer@reddit
Waste of resources to work on something that requires a hard-to-get video card that costs as much as a midrange workstation.
Strazdas1@reddit
It's easier to develop for RT, and the capable GPUs cost less than they did back in the 1000 generation.
utimagus@reddit
I’m waiting on a sound engine to use ray tracing…
Roph@reddit
It's been tried, nobody cared
utimagus@reddit
People and their lack of proper sound. At least Wwise can do it, and it seems modern consoles/engines can too.
Strazdas1@reddit
Kinda this. You can ray trace sound, but if the player is going to be listening to it via a mono speaker in their TV across the room, then all of that is pointless.
Strazdas1@reddit
actually there have been many attempts, including in some games played right now (Tarkov for example), and the community usually loves it; it just does not get adopted into the big titles.
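For what it's worth, the core of "ray traced audio" in most implementations is conceptually small. A hedged C++ sketch follows; the engine-side IsPathBlocked query and the numbers are assumptions, and real middleware (Steam Audio, Wwise plugins, etc.) does far more, such as reverb and diffraction:

```cpp
// Illustrative sketch: fire a small bundle of rays between listener and sound
// source and use the blocked fraction to drive occlusion/muffling.
struct Vec3 { float x, y, z; };

// Assumed to be provided by the host engine's physics or RT backend:
// returns true if the i-th (jittered) ray from 'from' toward 'to' hits
// level geometry before reaching 'to'.
bool IsPathBlocked(const Vec3& from, const Vec3& to, int jitterIndex);

// Returns 0.0 (fully audible) .. 1.0 (fully occluded).
float EstimateOcclusion(const Vec3& listener, const Vec3& source, int rayCount = 16) {
    int blocked = 0;
    for (int i = 0; i < rayCount; ++i)
        if (IsPathBlocked(listener, source, i))
            ++blocked;
    return static_cast<float>(blocked) / static_cast<float>(rayCount);
}

// The result would then feed a volume curve or low-pass filter, e.g.
//   float gain = 1.0f - 0.8f * EstimateOcclusion(listener, source);
```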
NotYourSonnyJim@reddit
I'm sure the Digital Foundry review mentioned that the Ubisoft Avatar game has ray traced audio? Haven't played it, so can't vouch for it personally.
BighatNucase@reddit
Didn't the Dead Space Remake use something like that?
ReasonableAnything@reddit
Just a guess, but Nvidia stopped paying game devs to implement it, and putting resources into it never made financial sense for the studios?
Strazdas1@reddit
It's more that it's just simpler to slap Lumen software RT in and do no actual development. Why hire a programmer when you can hire 5 artists for the same price and sell cut content as microtransactions.
garbo2330@reddit
The new Resident Evil game will have path tracing and is partnered with NVIDIA.
Vb_33@reddit
Yea and Doom TDA has Path Tracing and was partnered with Nvidia. I believe the new F1 also has path tracing and is partnered with Nvidia.
inyue@reddit
Wow, we will finally have a native dlss implementation? It was just ABSURD to only have the dogshit fsr in the past games. Glad we could mod it in thanks to modders.
fuzzynyanko@reddit
Agreed. Nvidia's money maker right now is AI, and since DLSS is based on AI, it has probably been getting more focus lately.
SporksInjected@reddit
I think this has something to do with it. Nvidia is doing pretty well today and it has nothing to do with games lol
PrairieNihilist@reddit
Playability and optimization.
Chramir@reddit
We're waiting for consoles to catch up.
RomBinDaHouse@reddit
Yeah, but PCs need to catch up with consoles first — less than half of those GPUs on the right are even close in performance:
https://www.reddit.com/r/pcmasterrace/s/JirVgZuhqT
Thistlemanizzle@reddit
They can likely add it later. The market is too small. I think we’ve had enough data to determine if it moves units, unfortunately enthusiasts are a tiny market and they only buy a $70* game once.
*This isn’t about the rising cost of games and live service monetization. I think legitimately RT doesn’t have a good ROI right now. It sucks because it’s awesome.
bubblesort33@reddit
Consoles run badly with hardware-based RT until UE 5.6 or newer. I think CDPR mentioned they are working to get hardware RT to the performance level of software RT on consoles with The Witcher 4 and all the changes they made to Unreal Engine; I'm not sure it's totally there yet. And maybe when porting current console releases to PC, no one bothers to get hardware RT to work. The Outer Worlds 2 even has hardware RT that is broken and looks horrible on PC, according to Digital Foundry.
CSFFlame@reddit
I've been getting shit for this for the last decade... but it's a gimmick.
Tuned_Out@reddit
Because having both a ray tracing mode and a baked in mode is a lot of effort. Especially since many games are multiplatform and have to work on a range of hardware to reach the maximum userbase. Ray tracing has been pushed/marketed for almost a decade now but the hardware hasn't really been truly ready for the mass majority of gamers despite all of Nvidias marketing since the 2000 series.
All that FOMO marketed at consumers since Nvidia's 2000 series hasn't really translated into a mass of games that can properly use it. It's been a really cool feature and will eventually be the future of graphics, but over 5 years later we still have people who threw down a ton of cash quoting games that have been out for half a decade when justifying their purchasing decisions (but but... cyberpunk), never mind that 90% of games still don't give a shit when it comes to implementing anything close to it. It's STILL a showcase tech for hardware most people don't have.
Despite what the reddit hive mind might drool out of its mouth, most people are on hardware that is console-level ray tracing compatible (barely) or less. It's just like anything else: marketing over-promises. Remember that 3090 Jensen pulled out of the oven and said was 8k ready? They've done this numerous times before with different tech, sometimes only to drop it years later. PhysX or HairWorks, for example.
We're almost there and it will happen, but we're not there yet. People need to catch up and that takes time. Especially when graphics cards are more expensive than ever and most game development is moving at the pace of what a console can push out... not what a 5090 can push out.
Ray tracing is amazing but raster still has a ton of time where it will be dominant. With the next gen of consoles likely being the most expensive we've ever seen and the performance uplift per dollar likely being pitiful compared to previous gens...raster isn't going anywhere.
FryToastFrill@reddit
Many games are switching to some form of dynamic GI. A notable one I know of is DL: The Beast, which uses a voxel GI solution when RT is disabled (which it is right now, since RT isn't released yet :( )
However as you've said, yes, some form of raster is still gonna kick around for a while. I would however imagine that maybe mid PS6 cycle we could see many games switch to using an RTGI system of some sort to speed up dev time but keep raster around for most other parts of the image, though that would likely depend on how adoption in the PC market plays out. I would hope that in 5-6 years we still aren't trying to use 1080 Tis to play the newest games.
Nachyobelgrande@reddit
Because they don't wanna work at optimizing their games
Vb_33@reddit
The real reason is that in UE5 there are 2 paths: Software Lumen (simplified RT running on your GPU's shaders) and Hardware Lumen (enhanced Lumen with hardware-accelerated RT). Consoles struggle with Hardware Lumen currently, so devs default to using Software Lumen. Enabling HW Lumen on top of that requires another round of QA, because now you have to playtest the whole game again to make sure nothing has broken with HW Lumen. TOW2 has broken RT shadows in its HW Lumen mode currently, for example.
In other words, enabling HW Lumen makes for a better looking game that scales better than Software Lumen (Software Lumen hits a hard wall at higher settings, where further scaling becomes unsustainable performance-wise). Hardware Lumen at times even runs faster in certain scenes (due to hardware acceleration), but it also means more work ($$$), because you'll need to support 2 lighting systems if you're gonna ship on console or pre-RTX PCs. This is why many devs forgo HW Lumen. This is changing with UE 5.6 and 5.7, where Epic has labelled Software Lumen as deprecated since HW Lumen can finally achieve 60fps on console hardware, but we won't see UE 5.6 and 5.7 games for a while.
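For reference, on PC the software/hardware Lumen split usually comes down to a handful of UE console variables, which is why "force hardware Lumen" mods are typically just an Engine.ini override. A minimal sketch of what that looks like; the path and cvar names are as of recent UE5 versions and may vary, and it only does anything if the game shipped with the ray tracing data:

```ini
; Engine.ini override, usually under
; %LOCALAPPDATA%\<GameName>\Saved\Config\Windows\Engine.ini for shipped UE5 games.
[SystemSettings]
; 1 = use Lumen for global illumination
r.DynamicGlobalIlluminationMethod=1
; 1 = use Lumen for reflections (instead of screen-space reflections)
r.ReflectionMethod=1
; 0 = software tracing against distance fields, 1 = use the GPU's RT hardware
r.Lumen.HardwareRayTracing=1
```

Whether that actually helps depends on the title; as noted above, some games ship HW Lumen paths that were never really QA'd.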
vexargames@reddit
RT development was getting funding from Nvidia to support the hardware for years.
If you look at the data, the share of people actually using RT for PC gaming is much less than 1%. Might be like .00001%.
I know this program because I am a game dev and was asked to take my personal indie project and convert it to RT, to maybe get supported either through direct funding or marketing assistance.
It took me a few hours in UE5 to tune the lighting and my project was setup to take advantage of RT if it ever became a thing so my art content worked well. I did have to fix a few transparency issues.
Large teams can support it for the few customers that want it, but it does add a lot of cost if your project isn't set up for it. If I was patching a game every few months and had to pay for extra time to tune the lighting for RT years after the launch of the title, I might start skipping it and removing it as a supported feature. Depends on how well the lighting team is set up to do this type of tweaking.
Personally, as a gamer and a dev, it is a cool toy. I have been waiting for it since 1990, when I first read the term Ray Tracing in the Graphics Gems books, and then again when we were using the Pixar RenderMan system to create pre-rendered frames for spaceships. Again when working at DreamWorks in 2007, Intel was promising this for Project Larrabee, so I was excited about it personally. We don't even have video cards that can support true 4k at high frame rates yet, so give it another 10 years. What will really trigger a major change is once consoles like the PS6 or PS7 support it. Then the work will serve more customers.
ekortelainen@reddit
Well made lighting looks as good as ray tracing, but runs 10 times better. There is no reason to ever use RT unless the game otherwise looks bad, which it shouldn't. RT shouldn't be a thing in gaming at all.
sdk5P4RK4@reddit
they keep getting absolutely slammed because their games aren't performant at launch. Embark seems to be the only dev capable of utilizing UE5 competently. Makes sense they just step back from the shiny object.
NPPraxis@reddit
Not an expert, but my gut feeling here is that because: (A) a lot of the early GeForce RTX ray tracing hardware can only do a little bit without it becoming the bottleneck, and (B) most consoles don't support ray tracing, and (C) AMD got decent ray tracing hardware VERY late
Game makers cannot assume decent ray tracing hardware. This means they need to optimize their game and art style for a LOT of different hardware and need to make sure it looks good with ray tracing, without ray tracing, and with only a little ray tracing for weak hardware (Switch 2, GeForce 20-series, etc).
That’s a LOT of testing and art choices. It’s probably easier to just forgo it.
When all of the consoles support ray tracing it might get more ubiquitous.
Good_luckapollo@reddit
Console spec leaks are showing that next gen should have hardware ray tracing on par with the 5090, so it's doubtful we're seeing hardware RT going away. Really it seems to be about art direction more so than abandoning the tech.
c0ldhardcash@reddit
Ray tracing was just a selling gimmick from Nvidia and it has done well for sure, but I really wish they hadn't pushed raytracing so early given the awful performance.
AssCabbage22@reddit
The cards that provide a decent RT experience are used by a very small percentage of PC gamers. We're probably about 5-10 years off from it becoming ubiquitous.
FauxReal@reddit
I wonder what percentage of users have hardware RT turned on? Maybe it's not worth the effort vs software RT?
Mule2121@reddit
Raytracing is pointless that's why
fuzzynyanko@reddit
Since analytics are integrated into so many games, companies are realizing that many people are choosing performance mode over quality mode. People want the frames, not more blades of grass
Hardware ray tracing is still probably being worked on, but if things are zooming past your character at over 100 FPS at high speed, or shit's just blowing up around you, you are less likely to tell if something is rasterized vs ray traced. The AMD Radeon GPUs in the consoles are based on the 6000 series, and AMD's ray tracing was weak in that generation.
AMD has improved RT in later generations, but the consoles are based on the 6000 series, so AAA games are probably targeting that. We are also getting the Steam Deck and Steam Deck-like devices, which are also powered by AMD; many of those are based on the Radeon 6000 series. The Asus ROG Xbox Ally X is based on the 7000 series, a good improvement, but still limited.
The major factor is probably Nvidia. Nvidia's biggest profit center right now is AI, and DLSS is based on AI while ray tracing isn't. One of Nvidia's biggest selling points for a while was ray tracing.
Brickfrog90@reddit
I wouldn't be surprised if industry telemetrics were indicating that relatively few users bother to turn on ray-tracing.
Even with frame generation, the FPS hit that comes with ray-tracing is noticeable enough on my 4070 Ti Super that I almost always turn it off. I'm sure plenty of others are doing the same.
I would prefer they spend more die space on rasterizers or frame gen accelerators (if that exists. I don't know how frame gen works but I assume there's a special core for it?)
__some__guy@reddit
It's a gimmick until the average GPU is fast enough to do full hardware raytracing, without any lightmapping tricks, at a constant 60+ FPS.
Right now even high-end GPUs are just too slow to realistically replace traditional lighting techniques.
You basically have to implement both, which is twice the work, for a bit of eye candy.
fixminer@reddit
Lumen can use either software RT, or hardware RT if the developers enable that.
Lighthouse_seek@reddit
Imo it's the reality that a lot of PC players don't have high spec machines and these studios need enough of an addressable market to make money
ammar_sadaoui@reddit
ray tracing is still early-access technology, for me at least
it will stabilize by the rtx 8060 or 8070, when games like cp2077 can run ultra RT at 180fps. until then i prefer high frame rates over ray tracing visuals
MumrikDK@reddit
No, that's basically impossible. RT isn't something that's just a matter of personal tastes. The current RT trade-offs are.
RT isn't some gimmick Nvidia made up and pushed. They just jumped in early. RT has basically since the beginning of 3D acceleration been viewed as the obvious long-term goal for realtime 3D rendering. We'll get there somehow no matter what.
devinprocess@reddit
Companies don’t want to spend money on a feature that a lot of gamers cannot use, and thus not get a return on their investment.
kwirky88@reddit
It’s going to go the way of vr: the hardware is simply too expensive and the experience has got a low ceiling in quality. I’d rather play a game without the latency and artifacts of frame interpolation and at very high refresh rates than have slightly better reflections.
I've written ray tracers in a few programming languages, I know what it is, but the hardware hit is just too hard. I always turn it off.
Makaveli789@reddit
The industry is DEI-ing from within.
pgriffith@reddit
If hardware RT looked AMAZING compared to software-rendered lighting then maybe it might be worth it. But it just doesn't; it only looks a little bit better, and even then it's really only noticeable in comparison screenshots with RT on/off. Once you're actually playing a game and in the middle of a firefight, no one's noticing RT being on.
As for making it easier for devs to implement real lighting in their games and not having to 'fake' it: the thought process of "how much is this going to tank performance" needs to be removed from the equation; the hardware needs to be powerful enough that it's not even a consideration anymore.
constantlymat@reddit
Spiderman 2's excessive development costs were in large part blamed on spending so much money on increasing the graphical fidelity of the game. It went from a $75m budget to roughly a quarter billion.
I think in a time when the consensus is that games are too expensive and take too long to develop, scaling down on the highend of graphics is an easy way to cut cost.
haloimplant@reddit
This right here. When you drop a quarter billion and then watch a game like Megabonk top the sales charts, maybe rethink how much money gets blown on fancy graphics for every game.
Aettyr@reddit
I personally never cared for it. The only game where I'd say I genuinely loved it was Cyberpunk; that game was absolutely improved by the lighting. However, most games don't have that level of care.
veckans@reddit
I don't know why some studios opt out from using ray tracing but here is my guess:
Ray tracing is an extremely heavy effect to drive, which means only players with absolutely top specs (i.e. a 5090) can play the games. At least if we are talking about transformative RT like full path tracing.
They could do smaller amounts of RT, but as tests from Hardware Unboxed have shown, those are barely even noticeable or even make the game look worse.
Furthermore, RT is something that most gamers leave off or turn off. Not at all like the success of DLSS4/FSR4.
If the graphical impact is negligible, and the performance impact is huge, then why bother?
zaza991988@reddit
The Lumen lighting system supports both software-based and hardware-accelerated ray tracing (RT). On consoles, developers frequently opt for the software variant because it exerts a smaller performance penalty; hardware RT modes tend to be reserved for higher-quality (e.g., “quality” 30 FPS) modes on consoles—or as optional settings on PC.
From a hardware architecture standpoint, AMD's real-time ray tracing performance has historically lagged Nvidia's. With the introduction of the Radeon RX 9000 Series (based on the RDNA 4 architecture), AMD made a significant improvement in RT performance.
However, despite these advancements, AMD still trails in some of the most demanding RT scenarios (for example games with aggressive ray-tracing workloads) or path tracing.
On the console front, upcoming information suggests that the next generation of consoles probably called PlayStation 6 and the next-gen Microsoft surface Xbox Copilot AI Gaming+365 X, are expected to include dedicated RT hardware blocks (sometimes described as “Radiance Cores” in joint AMD/PlayStation announcements) rather than relying on general-purpose shader cores for RT tasks.
If these dedicated cores deliver as projected, one can reasonably expect a substantial uplift in console RT capability, which in turn could shift game development toward RT-first design (i.e., games built from day one with ray tracing as a core rendering path rather than an optional extra).
SEI_JAKU@reddit
It's a good thing too. RT will remain a meme for some years.
If you're the kind of person who insists on maxing out RT in every game like your life depends on it, instead of doing the exact opposite and turning that garbage off, you are wasting/destroying your hardware.
Zaptruder@reddit
rt is in a bit of a death valley right now thanks to AI. it's an economic issue... but basically there aren't enough devices capable of rt at good speeds around to afford putting it front and center. This will change with the next gen PS and Xbox consoles, helping to push rt hardware as standard for developers... but even then the transition will take a few years, as games take that long to develop and as adoption ramps up.
interestingly cloud gaming might help with that adoption as well... as 20 a month gets you access to a 2 to 3k spec machine that performs 90% as good as native. is that the tipping point? maybe... especially if next gen games are rt and pt focused and gamers on older gpus cant afford the jump to newer rt optimized gpus.
the other issue is that they need to market it better... consumers still don't understand the differences well... which isn't helped by the fact that rt has been a very piecemeal kinda thing for years (I.e. it's not raster vs path tracing, it's raster vs mostly raster with a sprinkle of ray tracing).
TechaNima@reddit
It was always a gimmick anyway. All it does is drop your FPS by a significant amount, while not adding anything a well-lit environment could not offer.
On paper it's cool tech that should make development easier, but it's not really helping anything as much as it's hindering.
I say good riddance until it can be used without a huge performance hit with minimal effect vs the old tried and tested way of doing things
Qsand0@reddit
Ray tracing is largely placebo. You don't appreciate it unless it's compared side by side with traditional lighting, and then the interest is driven by the need to justify the $$$ spent on stronger hardware.
It's pointless if you ask me. And bottom of the barrel of nice-to-haves.
Seansong82@reddit
Because all the people with shitty hardware will flock to the internet saying the game is unplayable lol.
Petting-Kitty-7483@reddit
Because these particular ones weren't designed with it in mind and they aren't spending the money to go back and do it.
Fwiw software Lumen is still RTGI, just not hardware RT
Thermatix@reddit
Thing is, I hope Hardware RT doesn't die. Not because of "REAListIC GRAaPHics!" but because it actually has some other uses.
I know of a voxel game that uses the RT hardware for calculations that allow for millions of voxels to be displayed at a decent FPS that just wouldn't be possible without it.
EiffelPower76@reddit
RT will never die, it's here to stay forever
reveil@reddit
The 5/8 on a brand new card is not just bad, it is totally horrible. A brand new, newest-generation card should handle 1080p ultra settings at 60 fps without any issues in 100% of games. Plus, assigning 45 fps a passing score is just not right; 60 fps is the minimum that should qualify for a pass. So this is exactly the problem I'm describing. Entry level cards need to reach a level of performance that can run path traced Cyberpunk at 1080p ultra without DLSS at a stable 60fps. Until then RT is not worth it for the vast majority of people. Look at the Steam hardware survey to see what people are still running.
ibeerianhamhock@reddit
I think what you're observing is actually the opposite of what you think. Games are dropping support altogether for pre-calculated light maps and are either going 100% forced ray tracing or a software based solution like Lumen (which of course also has a hardware version).
You’ll see more and more games either have software 100% dynamic lighting or hardware 100% dynamic lighting from here on out imo. I think most games with software dynamic lighting just don’t bother with a hardware option.
EiffelPower76@reddit
"are all those RT cores on our GPUs just going to waste now?"
"Are we watching hardware ray tracing quietly die before it even became standard?"
Not at all. RT/PT is here to stay
Simply, not every video game uses the same technology, and that's perfectly normal.
Developers are free to use whatever technique they want to achieve a graphics engine that runs great on the most configs and looks good.
You will always have games like Cyberpunk 2077 that make use of RT.
teutorix_aleria@reddit
Consoles. Lumen runs on everything out of the box so anything on UE5 can rely on it without making separate HW-accelerated RT options.
larso0@reddit
Pretty graphics are overrated. They look good in screenshots, but the physics and gameplay are the most important. I'd take early 2000s graphics with good physics any day, over ray traced beautiful graphics where I can't interact with the environment.
There probably exist games that do it all, but they probably won't run well on my GPU anyways. I'm not going to buy a very expensive electric space heater of a GPU and upgrade the electrical grid in my apartment just to play a handful of games.
ClerkProfessional803@reddit
The problem is we never really needed RT in the first place. Probe based GI, compute based lighting, and PBR did all the heavy lifting after the x360 gen ended, and we had an amazing performance vs visual balance up until nvidia decided to force the agenda. The industry followed suit, but AAA is now basically UE5 vs everything else, and software Lumen is simply easier to implement.
ChangeRemote7569@reddit
RT reflections are useful to get rid of the abomination that is SSR but otherwise I agree
III-V@reddit
When it came out, it was a marketing thing that companies more or less had to jump on the bandwagon with everyone else, or they'd be left behind. A lot of games became more of a "look at what we can do" to draw people to them. When you have a mtx-driven revenue scheme, you especially need to try to draw the people with cutting-edge hardware, because those are likely going to be buying more mtx. The novelty of it has worn off, so things are going back to a more balanced approach, where games are less focused on being an art demo, and more on gameplay and other facets.
exodusTay@reddit
I don't know about other titles but BF6 definitely dropped RT because they want it to be widely accessible. It comes with a F2P battle royale after all.
And as a gamer, more often than not I value (real) frames matching my monitor's refresh rate over pretty lighting. On team red, until FSR4 things looked real bad with resolution scaling and frame gen. Besides, frame gen sucks if you don't already have decent FPS to begin with.
ParanoidalRaindrop@reddit
Maybe MR. Leather Jacket's "It just works" didn't actually just work.
ihatetool@reddit
Most studios don't want to bother implementing ray tracing, because they're primarily developing for consoles, and if they put it in their pc port, they fear their game's gonna be flagged as unoptimized.
Some studios even try to sell their laziness as a feature.
noiserr@reddit
RT is not ready for prime time. Most gamers can't even run poorly optimized games in raster, let alone RT.
SERIVUBSEV@reddit
Well Nvidia used to pay game developers to implement RT, so that their cards sold more than competitors who had weak RT performance.
Much the same way they paid for proprietary upscaling, physx and hairworks.
~$20 million spent on 4-5 AAA games a year to essentially have an RT graphics mod for showcase was enough to corner 90%+ of GPU market share.
Now I guess they stopped sending payments because what are they going to do, cross 100% market share?
garbo2330@reddit
New Resident Evil will be using NVIDIA path tracing.
Exzerios@reddit
Not sure about the rest, but Expedition uses Lumen, which has a hw-accelerated mode. In Dying Light, RT is expected in one of the updates - they promised it but didn't manage to deliver it at release. For games like Silent Hill you don't really need RT, as you can already bake in very high quality lighting given their static nature.
garbo2330@reddit
Expedition 33 only uses software lumen by default (like the vast majority of UE5 titles). You can force the hardware RT on with a mod.
CrispyDave@reddit
It failed because unless you want to pay $1k for a GPU, most people are better off turning it off.
I don't think the effect is worth the hardware overhead.
allthebaseareeee@reddit
As someone on their 3rd RTX card, I can count on half a hand the times I have turned it on, and it's similar for all my friends.
It's great for SP games but no FPS game is worth the hit right now.
team56th@reddit
My take is as follows:
Ray tracing isn’t necessarily better; it’s just a different approach to lighting vs. traditional raster. It has advantages over raster in terms of more complete glitch-free look, but it also has drawbacks; it’s fundamentally more expensive, people are less experienced with it, getting over these two requires substantial effort to rewrite the whole stack.
I think Death Stranding 2 was the best example of this. Some people asked why DS2 didn't opt for RT. The answer is, RT is not needed. DS2 looks great, it's made by competent artists that are masters of raster-based graphics, and RT merely as a sauce on top is just taxing for questionable gains. Same with the likes of Battlefield 6. Performance is king; RT as a mere addition was a fancy flash during the last few years, but not anymore.
The good way to approach RT is what id Software and its partners are doing; replace majority of raster with RT, make it look comparable to traditional raster, and as a useful tool that accelerates graphics development. Indiana Jones and Doom TDA are made in a relatively short production cycle compared to other AAA games, and it runs great on a modern hardware. It’s not flashy sauce that RT initially promised, but this is the way forward.
So, RT is closer to us than ever. But it’s not in a way that it was advertised.
Seanspeed@reddit
Death Stranding 2 gets away without RT, because most of it takes place on big wide open, barren outdoor areas, where RT's impact would be very minimized. As soon as you get into indoor scenes, the outdated raster lighting is immediately noticeable.
Seanspeed@reddit
Dying Light The Beast is the only game I'm aware of where RT was 'removed' or 'temporarily disabled'.
Also, Lumen is ray tracing and we're getting loads of games with it. I don't understand why it matters so much whether it's 'hardware RT' or not. Obviously consoles aren't very strong at hardware RT, which is why a slightly watered-down software RT solution is commonly implemented. It's not a complicated situation.
FitCress7497@reddit
You're just cherry picking. The only big release from the list you gave is BF6 and it's a competitive title.
The same way, I can pick AC Shadows, Doom TDA, and a bunch of RTX-enhanced titles this year and say we're heading toward a future with RT fully baked in. Listing just several games is very misleading.
MrPrevedmedved@reddit
Lumen is RT. Hardware or software is a matter of technical implementation, but it's still RT. Lumen handles almost all RT effects except shadows. What you see now are games that started development 3-5 years ago, and for most developers it's their first experience with real-time RT. When they started their projects, most PC RT capabilities were on par with modern consoles and could handle 1-2 RT effects at a time. Even now, path tracing is mostly reserved for high-end PCs. So when you combine a lack of experience implementing RT with weak hardware that can't handle a lot of RT effects no matter how hard you optimize, you end up with software Lumen: an easy-to-use multiplatform system to add RTGI and reflections that has been available since UE5's launch. And yeah, BF6 is a competitive multiplayer shooter, it should run for as many people as possible; a half-empty lobby is 10 times worse than a lack of fancy lighting.
insolentrus@reddit
I don't know why you are asking. The answer is obvious. It's useless shit that divides fps by 2.
reddanit@reddit
Whenever you look at hardware RT support/performance on the GPUs and consoles that people have at home today, a surprisingly large chunk of the customer base will not be able to run heavy RT.
As a game developer, this means that unless you want to severely limit your customer base, your game is required to have a decently looking rendering pipeline that doesn't need strong hardware RT acceleration to work.
With that being the case, you can now follow one of two major paths:
A large part of the RT promise, from a game development perspective, is the ability to greatly simplify lighting workflows (when developing the game). This would lead to a reduction in development costs, but those savings can be realized only if you do not need to build RT on top of standard baked-in lighting.
AutisticMisandrist@reddit
When raytracing first came out it was considered a fad, until Nvidia came along and marketed their fad heavily. In gaming at least it's a fad because rasterized lighting is at such a high level that it's a waste of resources to gobble up so much performance. The upscaling on the other hand is way more useful.
sircod@reddit
I am guessing after that first generation of RT supported games the devs looked at the stats and saw very few people actually had RT enabled and decided it wasn't worth the effort. Even if you have an RT supported GPU it still isn't worth it most of the time unless you have a top-tier GPU which is a very small fraction of gamers.
F0czek@reddit
Cuz Nvidia didn't sponsor them and RT doesn't make the game better.
Touma_Kazusa@reddit
Not true, even console games have gone all in on RT (e.g. Ghost of Yotei recently, with a 60 fps RT mode on PS5 Pro). The thing is, most of these games are AA/indie games that don't have the biggest budget, or esports games which can't take the performance hit. IMO big AAA games will still carry on with hardware RT, and more AA games will move to hardware RT during the next console generation, where it makes sense to drop pre-baked lighting support.
ReasonableNetwork255@reddit
lighting 'gimmicks' have been a thing since the early 2000s... it got you to buy it, didn't it? lol
From-UoM@reddit
He is talking about hardware RT.
Almost all UE5 games, and there are a lot of them, use Lumen software RT.
quizical_llama@reddit
Not sure exactly but I would love some data on how many people actually enable it. It's the first thing I turn off in any game.
Maybe devs are just not seeing the benefits of adding it unless they are being directly sponsored by daddy Nvidia