Is it possible for different brand GPUs to work together?
Posted by WizardlyBump17@reddit | LocalLLaMA | View on Reddit | 5 comments
I have an Arc B580 and a GTX 1650. I plan to get a new motherboard with 2 PCIe slots and use both cards. Is it possible to get both GPUs to work together?
Right now I use qwen2.5-coder:14b and nomic-embed-text:v1.5 through ollama, and I use Tabby as a code-completion tool.
I added 4 repositories as context providers and 1 whole Javadoc to Tabby, and my 12GB of VRAM fills up pretty quickly. I make Minecraft plugins, so I have to keep the game open to see what I am doing, but I have to keep it at 800x600 to stay under 12GB of VRAM. Sometimes I also need a second Minecraft instance, but I can't open it because my VRAM is already 100% used; if I open it anyway, the screen freezes and I have to kill some processes.
If it is possible to make different-brand GPUs work together, I would have Minecraft use the 1650 and run AI on the B580.
I am on Ubuntu 25.04 and I am using ollama right now. >!I have seen some people saying things along the lines of "you use ollama? lol", but I don't get it. Is ollama bad? I like it because I can use its CLI to easily manage models. A few days ago I tried to run a llama.cpp container made for Intel GPUs, but the performance there was worse than ollama's.!<
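Splitting the workloads like this is generally done per process on Linux with environment variables rather than anything at the driver level. A rough sketch, assuming the proprietary NVIDIA driver with PRIME render offload support and an ollama build using the oneAPI/SYCL backend (the device selector value and launcher command are assumptions; check your actual devices with `nvidia-smi -L` and `sycl-ls`):

```shell
# Route Minecraft's OpenGL rendering to the GTX 1650 via NVIDIA PRIME
# render offload, leaving the B580 free for inference:
__NV_PRIME_RENDER_OFFLOAD=1 \
__GLX_VENDOR_LIBRARY_NAME=nvidia \
minecraft-launcher

# If the inference backend is oneAPI/SYCL based, restrict it to the Arc
# card so it never allocates on the 1650:
ONEAPI_DEVICE_SELECTOR="level_zero:0" ollama serve
```

Whether ollama's Intel path respects `ONEAPI_DEVICE_SELECTOR` depends on the build, so treat this as a starting point to verify with your own VRAM monitoring.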
prusswan@reddit
You will have far fewer headaches sticking to just one of them.
mustafar0111@reddit
Yes, but it does make things more complicated and difficult, especially on Windows-based systems.
I generally keep the GPU hardware all the same in each system. Right now I have one Nvidia system and one AMD rig. In the future I'm probably going to replace the older Nvidia P100s with Intel or AMD cards, but I'm waiting to see what gets released by the end of this year.
WizardlyBump17@reddit (OP)
What if I deploy a server for each GPU? Will I still run into limitations, or will each server work as if its GPU were the only one in the system?
mustafar0111@reddit
It'll be easier in terms of inference engine selection and drivers, but you'll lose pooled access to all the VRAM on a single system.
I believe there are ways to do this with network clusters or hordes, but I've never experimented with that.
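The one-server-per-GPU setup can be sketched with per-process device masking, using two llama.cpp server builds. The model filenames and ports here are assumptions; the flags (`-m`, `--port`, `--embedding`) are standard `llama-server` options:

```shell
# Server 1: SYCL build of llama.cpp pinned to the Arc B580,
# serving the coding model.
ONEAPI_DEVICE_SELECTOR="level_zero:0" \
./llama-server -m qwen2.5-coder-14b-q4_k_m.gguf --port 8080 &

# Server 2: CUDA build pinned to the GTX 1650, serving the small
# embedding model (the 1650's 4GB is enough for that).
CUDA_VISIBLE_DEVICES=0 \
./llama-server -m nomic-embed-text-v1.5-f16.gguf --embedding --port 8081 &
```

A client like Tabby could then point completions at port 8080 and embeddings at port 8081. Each server only ever sees its own GPU, which is exactly the "works as if there was only that GPU" behavior asked about above, at the cost of never combining the two cards' VRAM for one model.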
WizardlyBump17@reddit (OP)
This is what my VRAM usage looks like normally (I will upload more in the comments):
qwen2.5-coder:14b and Minecraft 1.21.1 fullscreen in the background. When Minecraft is the main window, the VRAM usage is about 11.5GB, and about 11.3GB at 800x600.