Is there any way to voice chat with an AI locally (on iPhone)?
Posted by Tobias783@reddit | LocalLLaMA | 4 comments
Is there any way to voice chat with a model locally? Is it possible on iPhone?
ThatHavenGuy@reddit
By "locally", I assume you mean running a model on the iPhone itself. But if you just mean you already have something serving AI models locally that you want to interact with from your iPhone, there are a few options out there where all you need is a web browser to have voice calls with the model. Open WebUI is one example I'm familiar with.
Not too familiar with running models ON iPhones, though. That might be a tougher nut to crack, but I know folks have gotten AI models running on Android, so I wouldn't be surprised if there's something out there for iPhones too.
FrostyMisa@reddit
Check out the Layla app on the App Store. It has a lot of features, including voice chat with AI. It works best on the latest phones with 8 GB of RAM.
GilAbides@reddit
I think your best bet is to run your model and TTS together on a local machine and host them as a secured server you can access from your phone.
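As a rough illustration only (an untested sketch, not a full solution): this assumes Ollama is already serving a model on localhost:11434 and uses pyttsx3 for offline TTS. The model name, port, and file paths are placeholders, and you'd still want real auth/TLS (or a VPN like Tailscale) in front of it before exposing it beyond your LAN.

```python
# Rough sketch: a tiny LAN server that takes a text prompt, asks a local
# LLM for a reply, speaks it with an offline TTS engine, and returns the
# audio file. Assumes Ollama on localhost:11434; model name and paths
# are placeholders -- swap in whatever you actually run.
import requests
import pyttsx3
from fastapi import FastAPI
from fastapi.responses import FileResponse

OLLAMA_URL = "http://localhost:11434/api/generate"  # assumed local Ollama endpoint
MODEL = "llama3"  # placeholder model name

app = FastAPI()

@app.get("/chat")
def chat(prompt: str):
    # Ask the local model for a single, non-streamed completion.
    reply = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=120,
    ).json()["response"]

    # Synthesize the reply offline; output format can vary by platform/driver.
    engine = pyttsx3.init()
    engine.save_to_file(reply, "reply.wav")
    engine.runAndWait()

    # Return the audio so the phone's browser can play it.
    return FileResponse("reply.wav", media_type="audio/wav")

# Run with: uvicorn server:app --host 0.0.0.0 --port 8000
# then open http://<your-machine's-LAN-IP>:8000/chat?prompt=hello on the phone.
```

From the phone's browser, hitting that URL on your home network should play (or download) the model's spoken reply; a proper voice-call loop would also need speech-to-text on the input side.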
SocksAreShoesToo@reddit
I don’t believe we have any local TTS on iPhone