Top hardware stacks for local compute over the coming few months? (3-10K USD range)

Posted by IamFondOfHugeBoobies@reddit | LocalLLaMA | View on Reddit | 37 comments

I'm one of the 200-dollar-a-month-plan Claude users currently tearing his hair out over how a company can offer a service this unstable and annoying (there are... many of us at the moment). And I'm thinking it might be time to just drop 3-10k USD on local AI.

I'm running GPT-OSS-20B on my gaming desktop atm and it is... way better than expected (also giving me a better experience than Gemma, which was wtf, but whatever).

Thing is, I'm not a hardware guy. I can program my own local AI tools easily enough. But hardware? Help, please.

Currently I'm planning to wait for the new Apple releases, likely announced in June, then look at the Mac Studio line-up. But I'm sure there are people here who know a LOT more about this than me.

What are the current top-of-the-line solutions for local AI in my price range? What are the trade-offs in terms of power consumption and things like ROCm on Linux (never, never, NEVER again, oh god, I value my sanity too much to try that again, PURGE WITH FIRE)?

I prefer the freedom of Linux, but I'm fine with Apple. Windows is a no-go for me. Too much bloat; Windows and I are permanently divorced.

Do note: context is very important for me. It's not enough to just get a model to load. I need to be able to use its full context well too.
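For anyone sizing hardware around this: the memory cost of long context is dominated by the KV cache, and it's easy to estimate before buying. Here's a rough back-of-the-envelope sketch; the model shape below (24 layers, 8 KV heads with GQA, head dim 64, fp16 cache) is a made-up mid-size example, not the specs of any particular model:

```python
def kv_cache_bytes(n_layers, n_kv_heads, head_dim, context_len, bytes_per_elem=2):
    """Rough KV-cache size: K and V each store one vector per layer,
    per KV head, per cached token (factor of 2 = K plus V)."""
    return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_elem

# Hypothetical mid-size model at full 128K context, fp16 cache:
size = kv_cache_bytes(n_layers=24, n_kv_heads=8, head_dim=64, context_len=131072)
print(f"{size / 2**30:.1f} GiB")  # 6.0 GiB -- on top of the weights themselves
```

So "it loads" and "it loads with the context you actually want" can differ by several GiB of RAM/VRAM, which is exactly why unified-memory boxes like the Mac Studio come up in these threads.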

I've labelled this thread a discussion since I suspect there will be a few different opinions, and I'd love to get a good, productive discussion going.