Is 2x NVLink P6000 24GB (48 GB total) a bad choice for LLMs?

Posted by Infamous_Charge2666@reddit | LocalLLaMA | View on Reddit | 11 comments

I have the opportunity to buy 2 x Quadro P6000 for $500. Do you think it's a good idea to NVLink them (48 GB VRAM total) and use them for training LLMs? I know they'll fare just fine for inference, but I'm more concerned with training performance...
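For a rough sense of whether 48 GB is enough for training, here is a sketch using a common rule of thumb (an assumption, not a measured figure): full fine-tuning with Adam in mixed precision costs roughly 16 bytes per parameter (fp16 weights + fp16 gradients + fp32 master weights + two fp32 optimizer moments), before counting activations and framework overhead.

```python
# Rule-of-thumb VRAM estimate for full fine-tuning with Adam in
# mixed precision. ~16 bytes/param covers fp16 weights, fp16 grads,
# fp32 master weights, and two fp32 Adam moments. Activations and
# framework overhead come on top, so treat this as a lower bound.

def training_vram_gb(n_params_billion: float, bytes_per_param: int = 16) -> float:
    """Approximate training-state memory in GiB for a model of the given size."""
    return n_params_billion * 1e9 * bytes_per_param / 1024**3

for size in (1, 3, 7):
    print(f"{size}B params: ~{training_vram_gb(size):.0f} GB before activations")
```

By this estimate a 3B model already sits near the 48 GB budget and a 7B model exceeds it, which is why people asking about training on cards like these usually end up doing LoRA/QLoRA-style parameter-efficient fine-tuning instead of full training.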