Is it possible for different brand GPUs to work together?

Posted by WizardlyBump17@reddit | LocalLLaMA | View on Reddit | 5 comments

I have an Arc B580 and a GTX 1650. I plan to get a new motherboard with two PCIe slots and use both cards. Is it possible to get both GPUs to work together?

Right now I use qwen2.5-coder:14b and nomic-embed-text:v1.5 through ollama, and I use Tabby as my code completion tool. \
I added 4 repositories as context providers and one whole Javadoc to Tabby, and my 12 GB of VRAM fills up pretty quickly. I make Minecraft plugins, so I have to keep the game open to see what I'm doing, but I have to run it at 800x600 to stay under the 12 GB of VRAM. Sometimes I also need a second Minecraft instance, but I can't open it because my VRAM is already 100% used; if I open it anyway, the screen freezes and I have to kill some stuff. \

If it is possible to make different-brand GPUs work together, I would have Minecraft use the 1650 and run the AI on the B580.
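For what it's worth, splitting workloads like this is mostly a per-process setting on Linux rather than anything the GPUs negotiate between themselves. A rough sketch, with the caveat that the exact variables depend on your drivers (DRI_PRIME applies to Mesa/nouveau, the __NV_* variables to NVIDIA's proprietary PRIME render offload, and ONEAPI_DEVICE_SELECTOR only if your ollama build uses the SYCL/oneAPI backend), and the device indices here are assumptions you'd need to check with `sdpa`-style tooling like `sycl-ls` or `glxinfo`:

```shell
# Render Minecraft on the GTX 1650.
# With the proprietary NVIDIA driver (PRIME render offload):
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia minecraft-launcher
# Or, with Mesa/nouveau, select the secondary GPU instead:
# DRI_PRIME=1 minecraft-launcher

# Pin the LLM runtime to the Arc B580, assuming a SYCL/oneAPI-based
# build; "level_zero:0" is a guess at the Arc's device index.
ONEAPI_DEVICE_SELECTOR=level_zero:0 ollama serve
```

Since each process picks its own GPU, nothing special is needed from the motherboard beyond the second PCIe slot.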

I am on Ubuntu 25.04 and I am using ollama right now. >!I have seen some people saying things along the lines of "you use ollama? lol", but I don't get it. Is ollama bad? I like it because I can use its CLI to easily manage the models. A few days ago I tried to run a llama.cpp container made for Intel GPUs, but the performance there was worse than ollama's.!<