Help on SLMs

Posted by Mission_Big_7402@reddit | LocalLLaMA | View on Reddit | 3 comments

I am building a context-aware terminal wrapper that suggests command completions (like VS Code code suggestions, but for shell commands). I've finished the local bash-history part: it auto-completes to the last matching command, shown in gray first.

Now I'm trying to use an SLM to predict or complete the user's command while also understanding the context, which is stored in a CONTEXT.md file and sent on every keystroke. But most of the SLMs I've tried are either slow or just generate random output that makes no sense.
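For what it's worth, one thing that helps small models a lot here is framing the request as fill-in-the-middle rather than chat. A rough sketch of assembling the per-keystroke prompt, assuming Qwen 2.5 Coder's FIM tokens (other models use different ones), with purely illustrative names and layout:

```python
# Hedged sketch: build a fill-in-the-middle prompt from CONTEXT.md plus the
# partially typed command. The <|fim_*|> tokens are Qwen 2.5 Coder's FIM
# markers; the prompt layout itself is just one possible choice.

def build_fim_prompt(context_md: str, typed: str) -> str:
    prefix = (
        "# Session context\n"
        f"{context_md}\n"
        "# Complete the shell command\n"
        f"$ {typed}"
    )
    # Empty suffix: we only want the model to finish the current command.
    return f"<|fim_prefix|>{prefix}<|fim_suffix|>\n<|fim_middle|>"

prompt = build_fim_prompt("cwd: /home/user/project (a git repo)", "git ch")
print(prompt.endswith("<|fim_middle|>"))  # → True
```

Also, sending a request on literally every keystroke will swamp a small model; debouncing (e.g. only query after ~150 ms of no typing), capping max tokens, and stopping at the first newline tends to cut both latency and the rambling output.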

I've tried Qwen 2.5 Coder (1.5B & 0.5B) and Llama 3.2 (1B), which are lightweight.

Are there any other good models out there, and is what I'm trying to build even feasible?

Share your thoughts and suggestions. I'm just trying to build something and learn.