LLM on Android
Posted by gokuchiku@reddit | LocalLLaMA | View on Reddit | 12 comments
Is it possible to run LLMs locally on an Android phone? If so, please tell me how. Thanks.
logosvil@reddit
Best so far is Google AI Edge Gallery; it runs the new Gemma 3n E2B model.
RareAd5942@reddit
Termux & llama.cpp
gokuchiku@reddit (OP)
These are Android apps? Do I need both to run models?
ML-Future@reddit
I use it that way too. Install Termux, then run: pkg install llama-cpp
Then you can use llama.cpp to run models.
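A minimal sketch of the steps above, assuming a Termux environment (the F-Droid build is the maintained one) and the `llama-cpp` package mentioned in this comment; the model URL is just an illustrative small GGUF, substitute any model your phone's RAM can hold:

```shell
# Inside Termux: update packages and install llama.cpp + wget
pkg update && pkg install -y wget llama-cpp

# Download a small quantized model (example file; any small GGUF works)
wget -O model.gguf \
  "https://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct-GGUF/resolve/main/qwen2.5-0.5b-instruct-q4_k_m.gguf"

# One-shot generation from the command line
llama-cli -m model.gguf -p "Hello, what can you do?" -n 64

# Or run a local OpenAI-compatible server you can hit from other apps
llama-server -m model.gguf --port 8080
```

Smaller quants (Q4 and below) of sub-2B models are the practical range on most phones; larger models will swap or get killed by Android's memory manager.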
gokuchiku@reddit (OP)
Thank you very much. I will try.
RareAd5942@reddit
ask LLM bro
Comfortable_Ebb7015@reddit
Edge Gallery runs Gemma 3n very well!
qwen_next_gguf_when@reddit
PocketPal
gokuchiku@reddit (OP)
Is the app on the Play Store?
qwen_next_gguf_when@reddit
Yes
gokuchiku@reddit (OP)
Thanks, I will try.
cyborgolympia@reddit
Download Layla AI from the Google Play Store.