"Cheap" 24GB GPU options for fine-tuning?

Posted by deus119@reddit | LocalLLaMA | 18 comments

I'm currently weighing up options for a GPU to fine-tune larger LLMs (DeepSeek 70B) while also giving me reasonable inference performance. I'm willing to trade speed for VRAM capacity.

I was initially considering a 3090, but after some digging there seem to be a lot more NVIDIA cards with potential (P40, etc.), and I'm a little overwhelmed.
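
For context, the usual way people squeeze fine-tuning of big models onto a single 24 GB card is QLoRA: 4-bit quantized base weights plus small trainable adapters. Below is a minimal sketch of that setup, assuming the Hugging Face transformers/peft/bitsandbytes stack; the model id and hyperparameters are placeholders, not a specific recommendation. Keep in mind that even in 4-bit, a 70B model's weights alone are roughly 35 GB, so a 70B run on one 24 GB card would also rely on CPU offload or a smaller base model.

```python
# Rough QLoRA-style sketch (placeholder model id and hyperparameters).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "meta-llama/Llama-2-7b-hf"  # placeholder; swap in the model you actually want

# 4-bit NF4 quantization keeps the frozen base weights small in VRAM.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # spills layers to CPU/RAM if the GPU runs out
)

# Only the small LoRA adapter weights are trained, so optimizer state stays tiny.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```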