A mid-range PC build for dual-GPU local LLMs and SLMs.
Posted by iammhk@reddit | LocalLLaMA | 5 comments
I want to build a mid-range desktop to fine-tune and host small language models and LLMs. I'm thinking of using two AMD Radeon RX 9060 XT 16 GB cards to reach 32 GB of VRAM on a budget. Will it help? 32 GB cards like the Nvidia RTX 5090 are absurdly expensive. What are your suggestions for the motherboard and CPU for my build? Or should I go for a Mac Mini M4 cluster, or some other single-board computer cluster, to achieve high VRAM? I am in India, btw.
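Whether two 16 GB cards "help" mostly comes down to whether the software splits one model across both of them, and layer-wise splitting is well supported. A minimal sketch using Hugging Face transformers with accelerate's `device_map="auto"` (the model name and memory caps below are placeholders, not recommendations; on ROCm builds of PyTorch the Radeon cards still show up under the `torch.cuda` namespace):

```python
# Minimal sketch: shard one model across two 16 GB GPUs with
# transformers + accelerate. Model name and memory caps are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-14B-Instruct"  # placeholder; pick any model that fits

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",                     # spreads layers across both GPUs
    max_memory={0: "15GiB", 1: "15GiB"},   # leave headroom on each card
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```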
StupidityCanFly@reddit
Remember that Apple's M processors aren't that great at prompt processing; it takes them quite a while. So if you're impatient, I'd recommend taking the Radeon route.
I went for dual RX 7900 XTXs. I chose the Gigabyte Z890 Aero G motherboard, as it has two CPU-attached PCIe slots that run at x8/x8 when both are populated. This opens up the possibility of adding two more GPUs later on.
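If you want to confirm the cards actually negotiated x8/x8 once both are installed, one rough way on Linux (not specific to this board) is to read the link width straight out of sysfs:

```python
# Rough Linux-only sketch: print the PCIe link width/speed for
# display-class devices, to verify both GPUs negotiated x8.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    try:
        if not (dev / "class").read_text().startswith("0x03"):
            continue  # 0x03xxxx = display controller class
        width = (dev / "current_link_width").read_text().strip()
        speed = (dev / "current_link_speed").read_text().strip()
        print(f"{dev.name}: x{width} @ {speed}")
    except OSError:
        continue  # some devices don't expose link attributes
```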
AppearanceHeavy6724@reddit
Any i5 newer than 9th gen will do.
iammhk@reddit (OP)
I do want to play games on it too, you know. 😅 I'll go for a mid-range CPU from the current generation. But what about the dual-GPU setup? Do you think it will work?
AppearanceHeavy6724@reddit
Of course. I have two GPUs; it works like a charm.
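For what it's worth, with llama.cpp-based tooling the dual-GPU split is usually a single parameter. A hedged sketch with llama-cpp-python (the model path is a placeholder, and Radeon cards need a ROCm/HIP build of the library):

```python
# Hedged sketch: run one GGUF model split evenly across two GPUs.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-model-q4_k_m.gguf",  # placeholder path
    n_gpu_layers=-1,          # offload every layer to the GPUs
    tensor_split=[0.5, 0.5],  # put half the weights on each card
    n_ctx=8192,
)

print(llm("Q: Why dual GPUs? A:", max_tokens=64)["choices"][0]["text"])
```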
Aggressive_Dream_294@reddit
With a Mac Mini you'll be stuck with macOS. If you just want to run models, it's great. If you're training models, you want a more Linux-compatible system. The Apple ecosystem has improved a lot for ML, but it's still not on par with Linux. So AMD will be better in my view, and if you can afford it, Nvidia is still the best for ML work. You could get a 4090 instead of a 5090.
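On the training point: the realistic budget route at 32 GB total is parameter-efficient fine-tuning rather than full fine-tuning. A rough sketch with Hugging Face peft (the model name and hyperparameters are illustrative only):

```python
# Rough sketch: LoRA fine-tuning with peft, the usual low-VRAM approach.
# Model name and hyperparameters are illustrative, not recommendations.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-7B-Instruct",  # placeholder
    torch_dtype=torch.float16,
    device_map="auto",            # spread the base model over both GPUs
)

lora = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # adapt attention projections only
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # typically well under 1% of the base model
# ...then train with transformers.Trainer or trl's SFTTrainer as usual.
```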