Is local AI the actual endgame? (M5 Mac Studio vs. Dual 3090s)

Posted by Party-Log-1084@reddit | LocalLLaMA | 100 comments

Hey everyone,

I currently use Gemini and NotebookLM a lot, but I really want to transition to local AI for privacy and access to uncensored models. Before dropping serious cash, though, I have to ask: is local AI the actual future for power users, or will the big cloud models just permanently outpace us? Or is there something else coming soon that I don't even know about?

If you were to invest long-term right now, what would be the smartest move? Should I wait for an M5 Mac Studio Ultra, even if it costs $4-7k, just for the massive unified memory? Or is it better to build a classic setup with two used RTX 3090s? I've got an old Dell Precision T5810 with an Intel Xeon E5-2680 v4 and 128 GB of RAM.
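For context on how I'm comparing the two options, here's the back-of-envelope math I've been using to check whether a model fits in memory. The 1.2× overhead factor for KV cache and activations is my own rough assumption, not an exact figure:

```python
# Rough memory estimate for running a quantized LLM.
# Rule of thumb: weights take params * (bits / 8) bytes, plus some
# overhead for KV cache and activations (1.2x is an assumption here).

def est_memory_gb(params_b: float, bits: int, overhead: float = 1.2) -> float:
    """params_b: parameter count in billions; bits: bits per weight."""
    weights_gb = params_b * bits / 8  # 1B params at 8 bits is ~1 GB
    return round(weights_gb * overhead, 1)

for model, params in [("8B", 8), ("32B", 32), ("70B", 70)]:
    for bits in (4, 8):
        print(f"{model} @ Q{bits}: ~{est_memory_gb(params, bits)} GB")
```

By that estimate, a 70B model at 4-bit (~42 GB) just squeezes into the 48 GB of two 3090s, while a big unified-memory Mac would have headroom for larger models, presumably at lower token speed. Happy to be corrected if my numbers are off.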

Or is there a third option: just wait? Software and quantization seem to be improving so fast. Are we reaching a point where we'll be able to run amazing models on much cheaper hardware soon anyway?

Is the heavy hardware investment worth it right now? Would love to hear your realistic thoughts.