Question about llama.cpp and OpenCode

Posted by Able_Limit_7634@reddit | LocalLLaMA | 16 comments

I see a lot of people using llama.cpp with OpenCode, but I don’t really understand why they don’t just use LM Studio or Ollama. What are the advantages?
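From what I've gathered, the pairing usually works something like this (a sketch only; the model path and flag values below are placeholders, not a recommendation):

```shell
# llama.cpp ships a built-in HTTP server, llama-server.
# -m: path to a GGUF model file (placeholder path here)
# -c: context window size in tokens
# --port: port for the HTTP API
# It exposes an OpenAI-compatible API at http://localhost:8080/v1,
# which OpenCode (or any OpenAI-compatible client) can be pointed at.
llama-server -m ~/models/some-model.gguf -c 8192 --port 8080
```

So as far as I can tell it's the same client-server shape as LM Studio or Ollama, which is why I'm asking what the actual advantage is.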

Also, which model would you recommend for a MacBook Pro (M4 Pro, 48GB of RAM) if my main use case is coding in Dart?