Should I buy Tesla K80 for 70€ or Tesla M10 for 110€?
Posted by Similar-Republic149@reddit | LocalLLaMA | 21 comments
I've heard they're somewhat okay for LLMs, and at a little less than half the price of a 3060 they seem pretty enticing, but I need some advice on whether I should buy one of these two or pass on them.
EnvironmentalRow996@reddit
P40 is a newer option. Or M40.
Similar-Republic149@reddit (OP)
I just checked, and it seems the P40s are going for around 300€. Is that a good deal or not?
OutlandishnessIll466@reddit
They used to be 150-200 euros. They work, I have 3, but they're not that fast, and you need to spend another 30 euros on a fan.
An RTX 3090 costs 2x the price of a P40 for 4x the performance, so it's twice the performance per euro.
AppearanceHeavy6724@reddit
Tesla M10 is e-waste, a retro device, and not worth $10, let alone 110 euro. If you want the cheapest possible 16 GiB, buy two P104-100s, $25-$40 each depending on the local market. They still work fine and are way less hassle to use.
Forward_Somewhere249@reddit
I am following https://www.reddit.com/r/LocalLLaMA/comments/1hpg2e6/budget_aka_poor_man_local_llm/ and waiting for my shipment.
AppearanceHeavy6724@reddit
do not forget to cap them at 150W
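A minimal sketch of one way to apply that cap with `nvidia-smi` from Python; the GPU index and the 150 W figure from the comment above are assumptions, adjust for your own cards:

```python
import subprocess

# Enable persistence mode so the driver stays loaded and the limit sticks
# (both commands typically need root).
subprocess.run(["nvidia-smi", "-i", "0", "-pm", "1"], check=True)

# Cap GPU 0 at 150 W, per the suggestion above.
subprocess.run(["nvidia-smi", "-i", "0", "-pl", "150"], check=True)
```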
Mother_Context_2446@reddit
Why not use a cloud solution? It would be much better. $110 on RunPod would get you a 48GB L40S for approximately 130 hours of GPU usage.
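Rough arithmetic behind that estimate, assuming an on-demand L40S rate of about $0.85/hour (the rate is my assumption, not stated in the comment):

```python
budget = 110.0   # dollars
rate = 0.85      # assumed $/hour for a 48 GB L40S on RunPod
print(f"{budget / rate:.0f} hours")  # ~129 hours, close to the ~130 quoted
```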
Similar-Republic149@reddit (OP)
I was considering the cloud, but I realized that learning how to manage the hardware is part of the fun, so I really just want to self-host on a budget.
ComprehensiveBird317@reddit
I respect this very much
a_beautiful_rhind@reddit
Better off getting an MI50 than those ancient cards. Any CUDA advantage for something that obsolete is long gone.
No-Refrigerator-1672@reddit
I wouldn't say that Maxwell's CUDA advantage is completely gone; e.g. something like the M40 is still considerably faster than a CPU, and if it were selling for 50 eur per piece, it would be a good deal. But yeah, as of today, the MI50 is definitely better bang/buck.
a_beautiful_rhind@reddit
His M10 has less memory bandwidth than my Xeon.
They're trying to sell M40s for more than old P40 and MI50 prices on eBay :(
henk717@reddit
I'd get neither; I almost never use the M40 I have because it's quite slow.
Keep in mind CUDA 13 is not that far off, which means the end of driver support for Maxwell.
K80s are even worse off: KoboldCpp supports them in the oldpc build, but not much other software out there does.
mmowg@reddit
Speaking honestly, avoid the K80 entirely; it's too old, almost the same age as a Quadro K2000. I have a lot of K2000s and they are collecting dust now. The M10 is better, comparable to the Quadro M2000 series, but my advice is to save the money for a Quadro P2000/P2200 or P4000; they are the entry level for the whole AI world. With good hardware knowledge you can do some interesting things with a single P2000, because it's still supported by the latest NVIDIA drivers for Linux and Windows. The Quadro K series is totally outdated now, the M series is slightly better, but the P series is the entry level today.
Similar-Republic149@reddit (OP)
Ok, thanks for the advice, I'll be looking into the P series.
fizzy1242@reddit
Honestly man, neither. Both are very old.
Tenzu9@reddit
You are better off saving a little more money and getting an RX 6800 XT: 16 GB of VRAM, supported by ROCm, and you can run it in LM Studio on Windows too. Its Vulkan runtime is nothing to scoff at either.
offlinesir@reddit
Those cards are fanless, so you need to set up a cooling solution, and they are also very old. I would recommend against buying them. That money could instead be spent on a better, current GPU (3060 12GB) or API credits at OpenRouter.
Similar-Republic149@reddit (OP)
Would you say that the M10 is half as good as a 3060 12 gig?
offlinesir@reddit
No, and to put that into perspective, it was released in 2016, based on the Maxwell architecture, which was released in 2014. That means the tech behind the card is... 11 years old. It's just not supported anymore by some AI tools. VRAM is not everything; just because it has two-thirds the VRAM of the 3060 12GB doesn't mean it's even half as good.
Similar-Republic149@reddit (OP)
Ok, thanks for the advice, I think I'll pass on this then.