Suggestions, kind people, for a simple local chatbot on mobile.
Posted by zenith-czr@reddit | LocalLLaMA | View on Reddit | 4 comments
I am currently using Llama-3.2-1B-Instruct-q4f16_1-MLC via WebLLM v0.2.82. It's a completely local feature that builds a personalised meal plan from the user's diet goal, even without internet, so they don't have to wade through emails and other notifications first thing in the morning just to get, say, a vegan breakfast for heart health. Llama works fine for this, but once the conversation goes a little deeper it starts to become strange. I was thinking about Qwen 3.5 0.8B, but would love to hear from you all, given you'd have more experience.
Blindax@reddit
Gemma 4 e2b is quite good if it can run.
zenith-czr@reddit (OP)
Unfortunately it's not currently available for WebLLM.
qubridInc@reddit
For mobile, keep it simple: try Qwen 3.5 0.8B/1.8B (Q4) for better coherence than Llama 1B, and keep prompts tight and scoped to avoid drift.
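One way to keep prompts "tight and scoped" on a 1B-class model is to pin a narrow system prompt and trim the chat history to the last few turns before every request, so stale context can't pull the model off-topic. Here's a minimal sketch in plain JavaScript; the function name, prompt wording, and `maxTurns` value are illustrative, not part of WebLLM's API, though the `{role, content}` message shape matches what WebLLM's chat-completion interface accepts.

```javascript
// Sketch of "tight + scoped" prompting for a small on-device model.
// Names here are illustrative, not part of any specific library API.

// A narrow system prompt keeps a 1B-class model on task.
const SYSTEM_PROMPT =
  "You are a meal-planning assistant. Only discuss meals, ingredients, " +
  "and nutrition for the user's stated diet goal. Politely refuse other topics.";

// Keep only the last few turns so old context can't cause topic drift.
function buildMessages(history, userInput, maxTurns = 3) {
  const recent = history.slice(-maxTurns * 2); // each turn = user + assistant msg
  return [
    { role: "system", content: SYSTEM_PROMPT },
    ...recent,
    { role: "user", content: userInput },
  ];
}

// Example: a long history gets trimmed before being sent to the model.
const history = [];
for (let i = 0; i < 10; i++) {
  history.push({ role: "user", content: `question ${i}` });
  history.push({ role: "assistant", content: `answer ${i}` });
}
const messages = buildMessages(history, "Vegan breakfast for heart health?");
console.log(messages.length); // 1 system + 6 recent + 1 new user = 8
```

The resulting `messages` array is what you'd pass to the engine's chat-completion call; re-stating the scope in the system prompt on every request matters more on small models than on large ones.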
srigi@reddit
On mobile you can run the Gemma 4 model locally with the Edge Gallery app: https://developers.googleblog.com/bring-state-of-the-art-agentic-skills-to-the-edge-with-gemma-4/