Best home hardware for an AI rig

Posted by maofan@reddit | LocalLLaMA | View on Reddit | 22 comments

I'm currently spending £90 a month with Anthropic and thinking of going to the next tier, which is £200; that's about the same whether I stick with Anthropic or go for Codex or similar. I could buy an RTX 3090 24GB card, and I already have an RTX 4070 12GB. I'm currently running a desktop with 64GB RAM and an AMD Ryzen 7 9700X.

| Model | 36GB VRAM experience | Speed |
|---|---|---|
| Qwen 3.5 Coder (35B) | Fits 100% on GPU with huge 32k context | |
| Llama 4 (70B) | Fits ~80% on GPU; small spill to 64GB RAM | |
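For anyone wanting to sanity-check those fit estimates, here's a rough back-of-envelope sketch. The 4-bit quantization width and the ~20% overhead factor for KV cache and activations are assumptions, not measurements; real usage depends on quant format and context length:

```python
# Rough VRAM estimate for a quantized model (assumptions noted below).
def vram_gb(params_b: float, bits: int = 4, overhead: float = 1.2) -> float:
    """params_b: parameter count in billions; bits: quantization width.
    overhead: ~20% extra for KV cache/activations (assumed; grows with context)."""
    return params_b * bits / 8 * overhead

# 35B at 4-bit: ~21 GB -> comfortably inside 36 GB
print(round(vram_gb(35), 1))  # 21.0
# 70B at 4-bit: ~42 GB -> spills past 36 GB, consistent with the table
print(round(vram_gb(70), 1))  # 42.0
```

This lines up with the table: the 35B model fits entirely on 36GB of VRAM, while the 70B model needs to spill into system RAM.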

I'm thinking I could stay on the 5x tier and spend 7-8 months' worth of subscription on an RTX 3090. If that goes well, I could sell my 4070 and get another RTX 3090 and a new power supply!
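The 7-8 month figure checks out with some quick arithmetic, assuming a used 3090 around £650 (a hypothetical price; electricity and a PSU upgrade not counted):

```python
# Payback period for a used RTX 3090 vs the current subscription spend.
gpu_cost = 650     # assumed used-market price in GBP (varies)
monthly_sub = 90   # current monthly Anthropic spend from the post
months = gpu_cost / monthly_sub
print(round(months, 1))  # 7.2
```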

My workflow is usually Opus for planning and Sonnet for execution. For anyone who has made this jump: could I get close to Sonnet-level reasoning with 36GB, or would I need to go the whole way up to 48GB?

Is it even worth it? With models improving all the time, I'm wondering if more and more memory will be required.