A mid-range PC build for dual-GPU local LLMs and SLMs.

Posted by iammhk@reddit | LocalLLaMA | View on Reddit | 5 comments

I want to build a mid-range desktop to fine-tune and host small language models and LLMs. I am thinking of using two AMD Radeon RX 9060 XT 16GB cards to reach 32 GB of VRAM on a budget. Will that work? 32GB cards like the NVIDIA RTX 5090 are absurdly expensive. What motherboard and CPU would you suggest for this build? Or should I go for a Mac Mini M4 cluster, or some other single-board computer cluster, to get high memory capacity? I am in India, btw.
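As a rough sanity check on whether 32 GB of pooled VRAM is enough, here is a back-of-envelope sketch (not a definitive sizing tool): it estimates weight memory from parameter count and bits per weight. The ~4.5 bits/weight figure for 4-bit GGUF-style quants, the ~1 GB per-card overhead, and the model sizes are all assumptions; real usage also depends on context length and KV-cache size.

```python
# Back-of-envelope VRAM estimate for hosting a quantized model on two GPUs.
# All constants below are assumptions, not measured values.

def weight_vram_gb(n_params_b: float, bits_per_weight: float) -> float:
    """Approximate GB of VRAM needed just for the model weights."""
    return n_params_b * 1e9 * bits_per_weight / 8 / 1e9

# Two hypothetical 16 GB cards, minus ~1 GB each for runtime overhead.
usable_vram_gb = 2 * (16 - 1)

for name, params_b, bits in [
    ("7B model, 4-bit",  7,  4.5),  # ~4.5 bits/weight is typical of Q4_K_M-style quants
    ("14B model, 4-bit", 14, 4.5),
    ("32B model, 4-bit", 32, 4.5),
    ("70B model, 4-bit", 70, 4.5),
]:
    need = weight_vram_gb(params_b, bits)
    verdict = "fits" if need < usable_vram_gb else "does NOT fit"
    print(f"{name}: ~{need:.1f} GB weights -> {verdict} in {usable_vram_gb} GB")
```

By this estimate, 4-bit models up to roughly the 30B class fit in 32 GB with room for KV cache, while 70B-class models do not; leave headroom for long contexts.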