We tested supercharging the RTX 5090 in PhysX games using an RTX 5060 as a secondary GPU — SLI may be dead, but how much can dual GPUs boost performance in classic PhysX titles?
Posted by RTcore@reddit | hardware | View on Reddit | 67 comments
Solaihs@reddit
More interesting is using multi card frame gen, where one gpu is only doing frame generation and another is doing traditional rendering of the game
poorlycooked@reddit
PhysX is something that runs independently. There's no particular reason to run it on the GPU together with the graphics; it's just because the GPU is good at it. Thus using another GPU as a math coprocessor makes sense.
DLSS framegen is integrated in the render pipeline; if you use a second GPU for framegen, it will be akin to SLI, in other words jank and not supported anyway.
Post-processing effects like Lossless Scaling do make good use of dual GPU, but of course it has none of the latency benefits.
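The distinction can be sketched with a toy timing model (all numbers are hypothetical, purely for illustration): an independent coprocessor task like PhysX overlaps with rendering, while an in-pipeline stage like frame generation serializes behind it.

```python
# Toy frame-time model (illustrative numbers, not measurements).
# Offloaded PhysX runs concurrently with rendering on a second GPU,
# so the frame takes as long as the slower of the two tasks.
# An in-pipeline stage like frame generation must wait for the
# rendered frame, so its cost adds on top.

render_ms = 8.0     # hypothetical render time on the primary GPU
physx_ms = 5.0      # hypothetical PhysX time on the secondary GPU
framegen_ms = 3.0   # hypothetical frame-generation time

def offloaded_frame_time(render, coprocessor):
    # Independent work overlaps: bounded by the slower task.
    return max(render, coprocessor)

def pipelined_frame_time(render, stage):
    # Pipeline stages serialize: costs accumulate.
    return render + stage

print(offloaded_frame_time(render_ms, physx_ms))    # 8.0
print(pipelined_frame_time(render_ms, framegen_ms)) # 11.0
```

The point is only the shape of the math: offloading independent work hides it entirely once the second GPU keeps up, whereas a dependent pipeline stage always adds latency.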
Sciencebitchs@reddit
Care to elaborate?
Solaihs@reddit
You can use software like Lossless Scaling with a multi-GPU setup to have one card do the rendering and the other do pure framegen.
Here is the video showcasing it
Sciencebitchs@reddit
Thank you for the reply
Lars_Fletcher@reddit
Wouldn’t using a 4060 be a better idea? It has native PhysX support, right?
Sutanreyu@reddit
If you read the article, at least with the range of games that were tested, there was a Dec 2025 driver update that re-added 32-bit PhysX support to the 50-series cards.
excaliburxvii@reddit
Emulated, not native hardware support, slower.
Atopos2025@reddit
SLI never died. Nvidia just stopped including SLI profiles in their drivers; game developers could then add them to their games/software to enable SLI instead.
Multi-card setups are still possible today with a dx 11 tech called Explicit Multi-Adapter. It allows the cards to work together like SLI/Crossfire did, but without needing a physical bridge. This tech also allows two other really cool things: it lets the VRAM from both cards be pooled AND it permits mixing AMD and Nvidia cards - but note that this is something developers have to add to their games/software to work.
coltonbyu@reddit
"but note that this is something developers have to add to their games/software to work."
So in other words it's dead
Atopos2025@reddit
Nope, it never died.
Remember those DLSS 4 renderings? Made with two RTX 5090s using this tech. There are also games like Ashes of the Singularity that use it, to give some examples.
Traditional SLI still works with the latest Nvidia drivers too. Again, it never died. People just didn't understand what Nvidia posted when they announced they were no longer including SLI profiles in their drivers, and YouTube tech enthusiasts perpetuated the misunderstanding, leading folks like you to keep claiming it's dead.
Even despite what I typed above.
PaulTheMerc@reddit
I never understood why. Such an ugly game. The tech is awesome, but feels so wasted.
reallynotnick@reddit
It’s not dead so much as so infinitely niche and unsupported that it may as well be dead.
ElectricalFeature328@reddit
claude, you are an super experienced games developer. write da driver for me and make sure you're definitely not going to brick my cards. ensure your output is less than 500 tokens, daddy's about to hit his limit
RandoCommentGuy@reddit
Certainly, first flash this bios file to your RTX 5090's... 'pulled_pork_recipe.rom'
zghr@reddit
hmm... something's off...
shouldn't that be pulled_pork_recipe.exe ?
nevermind i'll just rename it myself and we're good to go :)
zghr@reddit
Do you know of any boutique/arcade systems that use it? Could it use, for example, three or even four 5090s for some wild experiences?
xb9j@reddit
Ashes of the Benchmark is over a decade old
Benji998@reddit
Yeah that's true dude, the homing pigeon isn't dead as a communication method. There is a family in Estonia that still uses it.
pixelpoet_nz@reddit
You joke but it's still better than South African internet: https://www.jeffgeerling.com/blog/2023/pigeon-still-faster-internet/
Send_heartfelt_PMs@reddit
Hey it even supports Quality of Service
JJ3qnkpK@reddit
The SNES isn't dead - I was just playing Super Mario World the other day!
RumbleTheCassette@reddit
Maybe I'm misunderstanding, but what games released in the last two or three years have measurably improved FPS when dual GPUs are used?
Aw3som3Guy@reddit
I don’t think any in the last two or three, but I want to say one of the Tomb Raider trilogy games supported it? And potentially one of the Total War Warhammer games. Idk how well they supported it.
To give you some idea, the last time I saw any review properly look into this, they were using dual RDNA2 GPUs.
pythonic_dude@reddit
The last of the trilogy, Shadow, came out in 2018, 8 years ago. RDNA2 arrived alongside the Nvidia 3000 series 5.5 years ago, and sure, the 3090 even still included a physical SLI connector; that's the point everyone understands as the one after which SLI was killed.
kwirky88@reddit
> also there’s games like ashes of the singularity that use it
Is that game even fun or is it a game for people to play with their gpus?
stdvector@reddit
Ashes of the Singularity is a game-benchmark. It was used back in the day to promote AMD’s graphics API (Mantle).
Hamza9575@reddit
Ashes of singularity is fun. For some players.
roehnin@reddit
What games support it?
airfryerfuntime@reddit
Lol it definitely died.
coltonbyu@reddit
If none of my games support it, it might as well be dead, is our point
Zixinus@reddit
It would be better said that its use went from turbo-charging gaming to more non-consumer uses. It is no longer a feature that new games bother to add. So for enterprise or prosumers it might not be dead-dead, but it might as well be for most users.
randomkidlol@reddit
Multi-GPU setups are extremely common for non-gaming workloads. The drivers definitely support it, and a lot of enterprise software definitely works and benefits greatly from multi-GPU systems.
hackenclaw@reddit
Actually it is a DirectX 12-only feature, not 11.
JapariParkRanger@reddit
And importantly, it is distinct from SLI.
trisanachandler@reddit
Any example games?
hellomistershifty@reddit
RDR2 was the only game I can remember playing with my two 3090s. It still ran and looked better with one card and DLSS. Frame pacing is kind of weird with SLI, and DLSS doesn't work with multiple cards (or at least it doesn't in RDR2).
hackenclaw@reddit
Ashes of the Singularity, Civilization 6, etc. You can google which old games support Explicit Multi-Adapter.
chesherkat@reddit
So dual 5090s is the new meta?
Z3r0sama2017@reddit
5090/4090 more like. The best rendering card paired with the fastest 32/64bit enabled physx card
Ninja_Weedle@reddit
Would still rather use a 40 series card for this for the native support tbh, probably even bigger gains when using a 4060
Z3r0sama2017@reddit
Yep. Cards that support 32/64bit physx natively will go bbrrrrrrrrrr compared to the kneecapped 5k series.
zghr@reddit
It doesn't make financial sense for Nvidia or AMD to push for multi-GPU adoption. It's inherently cannibalizing.
turtleship_2006@reddit
It doesn't make sense for them to encourage people to buy multiple GPUs ?
joeygreco1985@reddit
Jesus this takes me back. I had SLI GTX 670s in the early 2010s and sometimes set the 2nd card to be a dedicated physx card. It was amazing
RepresentativeRun71@reddit
The problem with the article is that the author doesn’t realize how ubiquitous PhysX is. It’s actually commonplace middleware in Unreal Engine 4 and 5 as well as anything based on Unity. The article would’ve been more interesting and informative if he had tested more games than just the Batman games nobody plays.
RTcore@reddit (OP)
None of those games use GPU accelerated PhysX, which makes testing them with multiple GPUs pointless.
RepresentativeRun71@reddit
Wow, you’re so confidently wrong it’s not funny. Any game built with tools where the PhysX SDK is the default physics engine runs it on the GPU and thus benefits from it being offloaded to a second GPU, provided that GPU isn’t so old that it actually slows down the newer one. PUBG and Fortnite are both popular games that benefit from it and should’ve been tested.
Plank_With_A_Nail_In@reddit
Lol at the irony of this post, well done I guess.
Xera1@reddit
Have you ever even touched UE?
UE4 did not use GPU acceleration by default. It had to be enabled, and it accelerated particles and cloth, not the full sim. Rigid body still happens on the CPU for compatibility and speed. Readback is slow as fuck.
UE5 uses Chaos, also not GPU accelerated except for particle effects. Same reasons.
shing3232@reddit
I would be most interested if Rt can operate on secondary card.
1vim@reddit
Nvidia really said SLI is dead then watched everyone bring it back for PhysX anyway.
shadowmage666@reddit
Wow, the 25 32-bit PhysX games that aren’t compatible. Who cares? The 5090 can still brute-force high frame rates in them anyway, and it still does 64-bit PhysX in hardware.
Nicholas-Steel@reddit
Yep, just 3+ grand for a graphics card that can struggle to reach frame rates that ancient graphics cards manage in those games.
Die4Ever@reddit
Well this just sounds like unoptimized code to me? 27% of a 5060 should not be a 76% boost on top of a 5090
battler624@reddit
Because PhysX isn't supported in hardware in the 50 series.
Nvidia should just give up and open-source it instead, maybe we would see better emulation for it.
ResponsibleJudge3172@reddit
It's only specifically 32 bit physx that is affected
BarKnight@reddit
PhysX is supported by the 50 series, which is why they are using a 5060.
Physx is also open source https://github.com/NVIDIA-Omniverse/PhysX/discussions/384
battler624@reddit
I stand corrected on the open-source matter, but on the hardware matter, no. 32-bit isn't supported in hardware. It's being emulated to run on 50-series hardware (so you don't have to use CPU PhysX in those 32-bit games), but it isn't native support, unlike 64-bit PhysX.
Noreng@reddit
Well, if you consider WOW64 and WINE to be emulators, then I guess PhysX is also emulated on the 50-series...
randomkidlol@reddit
I don't think they've open-sourced the old versions that some of these old games are using.
zshift@reddit
It’s not just the utilization of the 5060 that affects performance. In high performance software, waiting for anything to happen is a huge bottleneck, and this includes the physics calculations or having to reach out to VRAM to get physics data for a large number of objects. By moving that onto the 5060, the 5090 is able to keep more rendering data in cache, which avoids the relatively massive latency you get from going to VRAM. This alone can account for the speed up, regardless of the GPU used for PhysX. This of course depends entirely on the code the GPUs are running. Unoptimized software can easily bottleneck any hardware.
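zshift's cache-residency argument can be sketched with a toy latency model (all latencies and hit rates below are hypothetical, chosen only to show the shape of the effect): if physics data competes for cache on one GPU, more rendering accesses miss and pay VRAM latency, so moving physics to a second GPU can save more time than the physics work itself costs.

```python
# Toy model of cache pressure (hypothetical numbers, not measurements).
# When physics data evicts rendering data from cache, render-side
# memory accesses miss more often and pay full VRAM latency.
# Offloading physics to a second GPU raises the render hit rate.

accesses = 1_000_000          # rendering memory accesses per frame
cache_ns, vram_ns = 20, 400   # hypothetical access latencies (ns)

def frame_cost(hit_rate):
    # Average memory cost of a frame at a given cache hit rate (ns).
    return accesses * (hit_rate * cache_ns + (1 - hit_rate) * vram_ns)

shared = frame_cost(0.80)     # physics data evicting render data
offloaded = frame_cost(0.95)  # physics moved to the second GPU
print(f"speedup from cache residency alone: {shared / offloaded:.2f}x")
```

Even a modest improvement in hit rate produces an outsized saving here, because each avoided miss replaces a slow VRAM round-trip with a fast cache hit; that is why the gain can exceed what the secondary GPU's utilization figure alone would suggest.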
Time-Maintenance2165@reddit
Sometimes it's not that. Sometimes it's that the percentage used isn't accurate for all compute paths.
DuhPai@reddit
Welcome back GTX 470 PhysX Edition
Ploddit@reddit
This is still an issue? I thought they added 50 series support for all the Arkham games.
Die4Ever@reddit
it's not an issue, both GPUs in the test are 50 series, and they're comparing the performance