What's the current best code autocomplete LLM for local deployment (as of April 2026)?

Posted by danielecappuccio@reddit | LocalLLaMA | 14 comments

I know this question has probably been asked a thousand times already, but... what's the best (or close-to-best) model I can use with Continue for local, IDE-like code autocomplete? Assume a reasonable amount of VRAM to work with (~16GB, so no GLM or similar trillion-parameter models).
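For context, here's roughly what the Continue setup looks like once you've picked a model. This is a hedged sketch: it assumes Ollama as the local backend and Continue's JSON config (`~/.continue/config.json`); exact field names and the model tag (`qwen2.5-coder:7b-base` is just the commonly recommended example) may differ across Continue versions.

```json
{
  "tabAutocompleteModel": {
    "title": "Local autocomplete",
    "provider": "ollama",
    "model": "qwen2.5-coder:7b-base"
  }
}
```

Whatever model ends up being recommended here would just swap into the `model` field.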

Answers to similar questions still point to Qwen2.5-Coder, which is by now a two (almost three) generations old model.

Also, do I need Base models only, or am I also fine with Instruct ones?