Cursor-like tools that work with llama.cpp

Posted by ga239577@reddit | LocalLLaMA

I recently started using llama.cpp instead of LM Studio and want to try vibe coding with local LLMs.

I've found several threads and videos about setting up various tools to use Ollama, but I can't seem to find any good information on pointing them at llama.cpp instead. I also saw a guide on setting up Cursor to use local LLMs, but that approach still sends data back to Cursor's servers, which kind of defeats the purpose and is a pain. The kind of setup I'm picturing is sketched below.
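For reference, my understanding is that llama.cpp's `llama-server` exposes an OpenAI-compatible API, so in principle any coding assistant that lets you set a custom OpenAI base URL should be able to talk to it directly. Here's a minimal sketch of what I mean (the model file, port, and model name are just placeholders, not a tested config):

```python
# Start the server first, e.g.:
#   llama-server -m ./models/your-coder-model.gguf --port 8080
# (the GGUF path is a placeholder -- use whatever model you have)

from openai import OpenAI

# llama-server speaks the OpenAI chat completions API; the API key is not checked
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="local-model",  # llama-server serves whatever model it was started with
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
)
print(response.choices[0].message.content)
```

So ideally a Cursor-like tool would just need that base URL and a dummy API key, instead of being hard-wired to Ollama or a cloud endpoint.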

Does anyone know how to do this, or of any resources explaining how to set it up?