Planning to build an AI PC, does my build make sense?

Posted by germaniiifelisarta@reddit | LocalLLaMA | View on Reddit | 13 comments

Hi, so I've been looking all around and there seems to be a shortage of GPU guides for building a PC for AI inference; the only viable references I could find are GPU benchmarks and build posts from here.

So I'm planning to build an AI "box". Based on my research, the best bang-for-the-buck consumer-level GPUs are the RTX xx90 24GB cards. But when I browsed my local marketplace, those things are so dang expensive. So I looked for an alternative and found the RTX xx60 16GB line, which has less VRAM but is more in my price range.

I also found that I could run multiple GPUs together (not sure if "clustering" is the correct word, something like SLI) to pool their VRAM.
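From what I've read, this kind of multi-GPU setup doesn't need SLI for inference; tools like llama.cpp can just split a model's weights across cards. A minimal sketch of what I mean (untested, and the model path is just a placeholder):

```shell
# Sketch (untested): llama.cpp's llama-server splitting one model across two GPUs.
# "model.gguf" is a placeholder file name.
# --n-gpu-layers 99 offloads all layers to the GPUs,
# --tensor-split 1,1 puts roughly half the weights on each card,
# so two 16GB GPUs can hold a model that wouldn't fit on one.
./llama-server -m ./models/model.gguf --n-gpu-layers 99 --tensor-split 1,1
```

If that's roughly right, two 16GB cards would act like ~32GB of pooled VRAM for model weights, minus some per-GPU overhead.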

I was aiming to use the AI box purely for inference, so I would be loading up LLMs, VLMs, and trying Stable Diffusion (not all at the same time, though).

Sooo, based on the above, I have a few questions:

  1. Do the RTX xx60 non-Ti/Ti 16GB models have acceptable performance for my use case?

  2. If not, is it possible to do the multi-GPU setup if I buy two RTX xx60 non-Ti/Ti 16GB cards?

  3. Am I making sense?

All help is appreciated, thanks! If you think there is a better sub, please let me know and I'll ask there too.