Best local LLM Setup for IntelliJ / coding assistance?
Posted by Lost_Fox__@reddit | LocalLLaMA | View on Reddit | 6 comments
I'm looking for a setup that can integrate with IntelliJ, where I can feed my entire code base into a Local LLM, ask it questions about the code base, and ask it to perform actions.
I saw something called Cline for VS Code, and I thought it was a really cool idea. I was curious if anyone had something similar that didn't require VS Code, and ideally integrated with IntelliJ / Android Studio.
DinoAmino@reddit
I tried Continue with IntelliJ and it was not good. I'm sure it's great with VS Code, at least from what I keep hearing. But that was a while back and maybe they have better IntelliJ support now. The CodeGPT plugin is what I use in the IDE. It supports many providers, including Ollama.
Lost_Fox__@reddit (OP)
It doesn't run locally though, so even if you could use your local code as context, I assume you'd burn through credits like crazy.
DinoAmino@reddit
?? not sure I understand. You asked about using it with a local LLM. CodeGPT is a local plugin for your local IDE, and you can use local models via Ollama, llama.cpp, or a "custom OpenAI" provider, which can point at any local inference engine (like vLLM) that exposes OpenAI-compatible endpoints.
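To make the "custom OpenAI" point concrete, here's a minimal sketch of the kind of request such a plugin sends to a local OpenAI-compatible server. The base URL (Ollama's `/v1` API on its default port), the model name, and the prompt are all assumptions for illustration, not CodeGPT's actual internals:

```python
import json
import urllib.request

def build_chat_request(base_url, model, question, code_context):
    # Any OpenAI-compatible server (Ollama, llama.cpp's server, vLLM)
    # accepts this /chat/completions payload shape.
    payload = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You answer questions about the user's code base."},
            {"role": "user",
             "content": f"{question}\n\n```\n{code_context}\n```"},
        ],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request(
    "http://localhost:11434/v1",  # Ollama's default OpenAI-compatible base URL
    "qwen2.5-coder:7b",           # assumed local model name
    "What does this function do?",
    "def add(a, b):\n    return a + b",
)
# Actually sending it (urllib.request.urlopen(req)) requires a running server.
```

Swapping the base URL is all it takes to switch between backends, which is why the "custom OpenAI" option covers engines the plugin doesn't list by name.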
Lost_Fox__@reddit (OP)
It can use your entire code base as context?
DinoAmino@reddit
If you have the VRAM for it, I suppose. I don't know how it handles it under the hood, but you can select files, folders, or git branches.
stddealer@reddit
Check out Continue (continue.dev). (It also works with VS Code.)
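For anyone trying Continue with a local model, a sketch of its `config.json` pointing at Ollama might look like this. The model name is an assumption, and the exact config format varies by Continue version, so check their docs:

```json
{
  "models": [
    {
      "title": "Local Qwen via Ollama",
      "provider": "ollama",
      "model": "qwen2.5-coder:7b"
    }
  ]
}
```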