Cursor-like tools that work with llama.cpp
Posted by ga239577@reddit | LocalLLaMA | 4 comments
Recently started using llama.cpp instead of LM Studio, and I want to try vibe coding with local LLMs.
I've found several threads and videos about setting up various tools to use Ollama, but I can't seem to find any good information on setting them up to use llama.cpp. I also saw a guide on how to set up Cursor to use local LLMs, but it requires sending data back to Cursor's servers, which kind of defeats the purpose and is a pain.
Does anyone know how to do this or of any resources explaining how to set this up?
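In case it helps anyone searching later: llama.cpp ships its own server binary, `llama-server`, which exposes an OpenAI-compatible API, so most tools that can target Ollama or a custom OpenAI endpoint can be pointed at it directly. A rough sketch (the model path, port, and context size below are placeholders to adjust for your setup):

```shell
# Start llama.cpp's built-in server with an OpenAI-compatible API.
# -m:     path to your GGUF model (placeholder path below)
# -c:     context size; agentic coding tools want a large context
# --port: where the API will listen
llama-server -m ./models/your-model.gguf -c 16384 --port 8080

# Any tool that speaks the OpenAI API can then use the base URL:
#   http://localhost:8080/v1
# Quick sanity check from another terminal:
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"hello"}]}'
```

In the coding tool's settings, choose an OpenAI-compatible provider, set the base URL to `http://localhost:8080/v1`, and use any placeholder API key (llama-server doesn't require one by default).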
ForsookComparison@reddit
Roo Code comes pretty close in a lot of ways. It's not a drop-in replacement (Cursor compressing your repo on their servers into a makeshift RAG DB is genuinely unique), but it's solid for agentic coding or just as an editor tool.
ga239577@reddit (OP)
Thank you for the response. I will be trying this next. A while after making this post I found a different solution - Cline + VSCode.
Producing very nice looking results so far with GLM 4.5 Air. Speed is very decent considering this is all happening locally.
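For anyone else going the Cline route: Cline offers an OpenAI-compatible provider option in its settings, so you can point it at llama.cpp's server. A quick way to sanity-check that the endpoint is up before configuring Cline (this assumes llama-server is already running on the default localhost:8080; adjust the URL for your setup):

```shell
# List the model(s) the server is exposing via the OpenAI-compatible
# /v1/models endpoint. If this returns JSON with your model's id,
# Cline's "OpenAI Compatible" provider should work with:
#   Base URL: http://localhost:8080/v1
#   API key:  any placeholder string (not checked by default)
curl http://localhost:8080/v1/models
```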
chisleu@reddit
GLM 4.5 Air is a fantastic model for its size. Cline is the perfect agent for local LLMs right now.
yazoniak@reddit
Roo Code + flexllama to manage and switch between multiple models automatically.