Built a terminal chatbot in Python that uses Ollama + Qwen3.5:4b — fully offline, beginner project but works well

Posted by Beneficial-Job-3082@reddit | LocalLLaMA | View on Reddit | 5 comments

Hey everyone, I've been exploring Python and wanted to build something with local LLMs instead of relying on the OpenAI API.

Built a simple terminal chat app. It's nothing fancy, but it was a great way to learn how Ollama's API works under the hood.
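For anyone curious what "under the hood" looks like, here's a minimal sketch of the core loop using only the standard library and Ollama's `/api/chat` endpoint on its default port (11434). The model tag is an assumption; substitute whatever `ollama list` shows on your machine:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default chat endpoint
MODEL = "qwen3.5:4b"  # assumption: use the tag from `ollama list`

def build_payload(history, user_msg):
    """Append the user's turn to the history and build a non-streaming request body."""
    history.append({"role": "user", "content": user_msg})
    return {"model": MODEL, "messages": history, "stream": False}

def chat_once(history, user_msg):
    """Send one turn to Ollama, record the assistant reply, and return its text."""
    body = json.dumps(build_payload(history, user_msg)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())["message"]
        history.append(reply)  # keep the full history so the model has memory
        return reply["content"]

if __name__ == "__main__":
    history = []
    while True:
        try:
            user_msg = input("you> ")
        except EOFError:
            break  # Ctrl-D exits cleanly
        print(chat_once(history, user_msg))
```

The key design point is that the API is stateless: conversational memory only exists because the client resends the whole `messages` list every turn.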

GitHub: https://github.com/Aditya-rc4/localai_chat

Happy to hear any feedback or suggestions for improvements!