GitHub Copilot CLI goes BYOK with local models
Posted by KvickaN@reddit | LocalLLaMA | View on Reddit | 5 comments

GitHub now lets Copilot CLI use local and BYOK models through Ollama, vLLM, Azure OpenAI, Anthropic, or OpenAI, with an offline mode and optional GitHub auth. Read the in-depth article: https://ainewssilo.com/articles/github-copilot-cli-byok-local-models
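For readers unfamiliar with how a BYOK backend like Ollama plugs into tools like this: Ollama exposes an OpenAI-compatible API on localhost, which any OpenAI-style client can point at. A minimal sketch of that endpoint (this assumes Ollama is running locally with a model such as `llama3` already pulled; the model name is illustrative, and this is Ollama's documented compatibility layer, not a Copilot CLI command):

```shell
# Assumes `ollama serve` is running and `ollama pull llama3` has completed.
# Ollama's OpenAI-compatible endpoint lives under /v1 on its default port 11434.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3",
        "messages": [
          {"role": "user", "content": "Say hello in one word."}
        ]
      }'
```

Any client that lets you override the OpenAI base URL can be aimed at `http://localhost:11434/v1` the same way, which is what makes wiring local models into OpenAI-compatible tooling straightforward.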
deejeycris@reddit
That's actually a good move from them: profiting from Anthropic's short-sighted blunders.
Rim_smokey@reddit
I wouldn't be so confident they're being honest with their use of the term "offline". This company only has its own interests in mind, and its product is big data.
jon23d@reddit
I’ve been using opencode for some time now. How does copilot cli compare?
Potential-Net-9375@reddit
This is fantastic!! I've been waiting for this, finally an alternative to trying to run Roo
draconisx4@reddit
Solid move with Copilot's local support. It's a smart way to lock down model access and keep your data off the cloud, making oversight way easier for anyone running production AI.