AMD and Nvidia cards on same rig
Posted by deathcom65@reddit | LocalLLaMA | View on Reddit | 5 comments
Hey guys
I have an AMD and an Nvidia GPU lying around, and I'm wondering if it's possible to use them at the same time and split a model across them.
I know they have different backends, but can a unifying backend like Vulkan take advantage of both? It's just hardware I have on hand, and I'd like to make the most of it. I have a 7900 XTX and a few 3060s.
Let me know if any of you have experimented with this sort of setup and what your results were.
Sharp_Classroom9686@reddit
Don't go with CUDA, use Vulkan. The t/s will be limited, but yes, it can work. How many 3060s do you have? It might be better to just go with the 3060s.
taking_bullet@reddit
I'm rocking an RTX 5070 Ti and an RX 9070 under the hood (32GB of VRAM combined). Just remember to switch to Vulkan.
Training_Visual6159@reddit
Any gotchas? How's the performance for models that have to split their VRAM between the two?
What about vLLM?
taking_bullet@reddit
~27 t/s running Qwen 27B Q8_0 in LM Studio.
Icy_Bid6597@reddit
Never tried it myself, but it seems llama.cpp with the Vulkan backend should be able to mix different cards without any big issues.
Mixing ROCm and CUDA also seems possible.
There will be some performance penalty, but from a VRAM perspective it should work.
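A minimal sketch of what that could look like with a Vulkan build of llama.cpp. The model path is a placeholder, and the exact flag names may vary between llama.cpp releases, so check `--help` on your build; the split ratios below assume a 24GB 7900 XTX plus two 12GB 3060s.

```shell
# List the devices this llama.cpp build can see
# (the 7900 XTX and the 3060s should all appear under the Vulkan backend)
./llama-server --list-devices

# Offload all layers to GPU (-ngl 99), split whole layers across devices,
# and weight the split roughly by each card's VRAM (24GB, 12GB, 12GB)
./llama-server -m ./models/your-model-Q4_K_M.gguf \
  -ngl 99 --split-mode layer --tensor-split 24,12,12
```

With `--split-mode layer`, each GPU holds a contiguous block of layers and tokens flow through them in sequence, so you pay a transfer penalty at the layer boundaries but every card's VRAM contributes to the total.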