Is 2x NVLinked P6000 24 GB (48 GB total) a bad choice for LLMs?
Posted by Infamous_Charge2666@reddit | LocalLLaMA
I have the opportunity to buy 2x Quadro P6000 for $500. Do you think it's a good idea to NVLink them (48 GB VRAM) and use them for training LLMs? I know they'll fare just fine for inference, but I'm more concerned with training performance...
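Edit: to sanity-check the setup, here's a minimal sketch (assuming PyTorch; nothing here is P6000-specific) of how I'd verify both cards are visible and that peer-to-peer access works over the bridge:

```python
import torch

# Both P6000s should show up as separate CUDA devices.
print("Devices:", torch.cuda.device_count())
for i in range(torch.cuda.device_count()):
    print(i, torch.cuda.get_device_name(i))

# NVLink doesn't merge the cards into one 48 GB pool; it only speeds up
# GPU-to-GPU transfers. Peer access being True suggests the bridge works.
if torch.cuda.device_count() >= 2:
    print("Peer access 0->1:", torch.cuda.can_device_access_peer(0, 1))
    print("Peer access 1->0:", torch.cuda.can_device_access_peer(1, 0))
```

Worth noting either way: frameworks still see two 24 GB GPUs, not one 48 GB device.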
Revolutionary_Ebb769@reddit
Hi, have you purchased them? How are they for training? Can you give some feedback? I'm planning to buy used ones; if you can still get new ones, please let me know.
No-Plastic-4640@reddit
Better to get a used 3090 with 24 GB VRAM. Size is moot if it's impractically slow. And why training? For academic purposes, go smaller. For professional use, … it's not that.
ForsookComparison@reddit
It's an awkward price point.
On the one hand, $500 for a P6000 (assuming that's each) is a solid deal relative to their usual going rate. 48 GB is a lot of VRAM, and the blower-style cards mean you can chuck them into any old case and probably not give a damn about thermals.
On the other, 48 GB at ~450 GB/s is awkward. Anything that takes advantage of that size will be slooowww.
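Rough math, assuming decode is purely memory-bandwidth-bound and every generated token streams the full weights once (a first-order ceiling; real numbers land below it):

```python
# First-order decode-speed ceiling for a memory-bound LLM.
bandwidth_gb_s = 450   # approximate P6000 memory bandwidth
model_size_gb = 40     # hypothetical quant big enough to need the 48 GB

# Each generated token reads the full weights from VRAM once.
tokens_per_s = bandwidth_gb_s / model_size_gb
print(f"best case ~{tokens_per_s:.0f} tok/s")  # ~11 tok/s, before any overhead
```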
Infamous_Charge2666@reddit (OP)
I'm getting them for $250 each, brand new (not sure if it matters, used vs. new), but they were old stock, never installed, and the price is $500 for the set.
hainesk@reddit
Worth it. Nothing else with that amount of VRAM will cost anything close to that. Even 3090s are getting quite expensive these days. With these you’ll at least be able to play around with what you want to use AI/LLMs for and then plan an upgrade later when you know what you need.
Infamous_Charge2666@reddit (OP)
thank you
ForsookComparison@reddit
Oh wtf, ignore everyone and just get them then. That's a great deal.
DeltaSqueezer@reddit
They are useless for training.
Infamous_Charge2666@reddit (OP)
thank you
segmond@reddit
Training will suck; it's basically an overclocked P40 with a blower fan.
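If you want to see why (assuming PyTorch), the card reports Pascal compute capability 6.1: no tensor cores, crippled fp16, no bf16, which is exactly what makes training crawl:

```python
import torch

# Pascal (P6000/P40, GP102) reports compute capability (6, 1):
# no tensor cores, roughly 1/64-rate fp16, and no bf16 support.
print(torch.cuda.get_device_capability(0))  # expect (6, 1) on a P6000
print(torch.cuda.is_bf16_supported())       # expect False on Pascal
```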
Infamous_Charge2666@reddit (OP)
thank you