Realistic local LLM rig under $6500? Dev with heavy RAM needs

Posted by TeachTall3390@reddit | LocalLLaMA | 27 comments

Hey everyone,

I'm a developer looking for practical hardware recommendations under $6500 for local LLM work. My usage breaks down like this: anything heavy, I just rent GPU clusters for or use work resources.

I usually run 40-50 services at once, so I need a ton of RAM. Video editing would be a nice bonus but not required. Linux or macOS is fine.
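For anyone wondering how much RAM those 40-50 services actually eat, here's the quick sketch I use to ballpark it: sum the resident set size (RSS) of every running process. This assumes a Linux box with standard `ps` and `awk` (macOS `ps` accepts the same flags, though RSS double-counts shared pages either way, so treat it as an upper bound):

```shell
# Ballpark total RAM in use: sum RSS (in KiB) across all processes,
# then convert to GiB. Over-counts shared libraries, so it's a ceiling.
ps -eo rss= | awk '{ sum += $1 } END { printf "%.1f GiB\n", sum / 1024 / 1024 }'
```

I run that with my usual service stack up, then add whatever the model weights plus KV cache need on top, and size RAM from there with headroom.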

What builds are actually worth it right now? Thanks!