Msty - Free Local + Remote AI Chat App (w/ support for Ollama/HF) has just hit its 1.0 release!
Posted by Decaf_GT@reddit | LocalLLaMA | View on Reddit | 51 comments
Such_Advantage_6949@reddit
It looks nice, but it's not open source
maxihash@reddit
Yes, I'm afraid this is going to become paid software once the features turn badass.
KingPinX@reddit
as is tradition :) once we are done beta testing it for them it will most likely go paid.
Pristine-Cake3062@reddit
Doesn't look like it
Wrong-Act-2882@reddit
I just uninstalled this app upon realizing this, and because of the lack of 'how-to' guides.
ninja2ninja@reddit
Best local LLM solution. I have all my LLM APIs connected as well.
Icy-Employee@reddit
Is there any guide on how to use the API locally? http://192.168.1.13:10000/api/models does not work, while http://192.168.1.13:10000 says that Ollama is running :<
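For what it's worth, stock Ollama doesn't expose `/api/models`; its model-listing endpoint is `GET /api/tags`. A minimal sketch, assuming Msty is serving a stock Ollama API on the address from the comment above (the helper names are mine):

```python
import json
import urllib.request

def tags_url(base_url: str) -> str:
    # Ollama lists installed models at /api/tags, not /api/models
    return base_url.rstrip("/") + "/api/tags"

def list_models(base_url: str) -> list:
    # Returns the names of models installed on the Ollama server
    with urllib.request.urlopen(tags_url(base_url)) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

# Example (requires the server to be reachable):
# list_models("http://192.168.1.13:10000")
```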
jojotonamoto@reddit
Anyone had luck setting up and using MSTY's RAG (Knowledge Stacks) setup? I set one up and turned it on, but Llama only seems to recognize the pdf files.
Mysterious_Ayytee@reddit
How's the pricing? If it's free, how do they make a living? I don't see any donation possibilities, that's sus.
Odd_Matter_8666@reddit
Indian closed-source app that deletes itself unless you turn off Windows Defender 😂
Mysterious_Ayytee@reddit
Since you necroposted, I looked it up again, and they now have a fair and decent pricing model. I wish the other software I use had such a nice pricing system. The $179 one-time payment makes me nostalgic; that's like buying a boxed version.
Odd_Matter_8666@reddit
If my observation is correct, the app gets deleted by Windows after being installed on my device. I tried to install it multiple times and it keeps getting deleted automatically by the system.
hail_the_tripod@reddit
Is there any support for it? So far I could only find this thread. Their knowledge base creation keeps failing without any error or debug output... but still incurs charges.
nishant-seo@reddit
I have been using Msty for a few months now and it is quite impressive. The RAG setup is hassle-free and works well. If anybody else is using it, I'm looking for help: do you know how to set it up for RAG-based responses only? I know we select the Knowledge Stack in the chat options, but how do I configure it so that the LLM only gives me responses based on RAG and doesn't use its own creativity?
SirCabbage@reddit
That's really good. As an individual user, how many devices can I use, and do I get the "free forever" guarantee, if I join the Aurum Club?
Also, it says "Professional or Business use requires a paid license." What about educational use? Can I, as a teacher, use it freely, or does it count as professional use because I am a professional?
uhuge@reddit
only x64 builds, no ARM? Booo!
Mr_Tbot@reddit
I know! An ARM version would allow me to run this on my Android using an Ubuntu layer in Termux... Currently the best option is "Private AI", but it doesn't have chat history or anything close to the feature set Msty has.
Bring on the ARM version!
PopularPrivacyPeople@reddit
Can it use GGUF files directly, not through Ollama?
Evening_Ad6637@reddit
You have to trick it into using your own GGUF. First start a download, then stop it immediately; you will find the new incomplete file named with a hash. Note that hash, remove or rename the file, then `ln -s /path/to/your/model-file.gguf ./previous-hash`
But unfortunately it only uses ollama under the hood, hiding it behind a file called msty. And since it's closed-source and a lot of the code seems to be pretty hard-coded, I had no success replacing ollama with llama.cpp. So I stopped using Msty, which is a real shame, as the application itself is pretty cool and offers very useful features.
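The workaround described above, sketched in a scratch directory (the hash name and file paths are made up for illustration; the real hash name comes from the interrupted download):

```python
import os
import tempfile

workdir = tempfile.mkdtemp()
my_model = os.path.join(workdir, "model-file.gguf")  # stand-in for your real GGUF
partial = os.path.join(workdir, "3f2a9c1d")          # hypothetical hash-named partial download

open(my_model, "wb").close()   # pretend this is your real model
open(partial, "wb").close()    # pretend Msty left this incomplete file

os.rename(partial, partial + ".bak")  # remove/rename the incomplete download
os.symlink(my_model, partial)         # point the hash name at your own model

assert os.path.islink(partial)
```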
AnticitizenPrime@reddit
They do allow OpenAI-compatible local providers. I'm using it in conjunction with LM Studio as the server at the moment (because I already had it set up as a server for the devices on my local network).
AnticitizenPrime@reddit
They seem to have added GGUF import in the latest release.
masonjames@reddit
You can download from Hugging Face and ollama.ai, but there's no direct GGUF import.
Decaf_GT@reddit (OP)
I'm not 100% sure how you directly import GGUFs, but I was able to use the built-in tool to download a GGUF directly from HF (I pasted in the HF URL).
trajo123@reddit
Yeah... I am hesitant to put all my API keys into a free closed-source app. How do I know they are not harvesting API keys?
ThatsALovelyShirt@reddit
You don't. I suppose you could reverse it if you wanted. I haven't downloaded it, but it has the look of an Electron app, and those are pretty easy to reverse engineer with an ASAR unpacker and a JS debugger and deobfuscator.
Southern_Sun_2106@reddit
The interface looks nice. But RAG is useless with local LLMs. I understand it is hard to make it work well with local models, but there are other RAG/AI-notes integration solutions out there that actually work well. I would be very tempted to switch to Msty if RAG actually worked.
Creative_Bottle_3225@reddit
I can't load my GGUF models and I find this annoying
nikeshparajuli@reddit
Hi, we released v1.0.1 yesterday where you can import your own gguf files.
Southern_Sun_2106@reddit
It is going to hold your hand because, you know, it is for 'your own good'...
from the web search feature news example: "For a comprehensive [news] update, I recommend checking out reputable news sources like CNN.com, NBCNews.com, ABCNews.com, or APNews.com for breaking news and in-depth coverage." No, thank you.
Eliiasv@reddit
Exciting update! I tried it a while ago, and while it was a bit janky, it was overall quite good UI-wise. They have to fix their logo, though; the icon looks out of place on macOS.
shadowdog000@reddit
no tts and no voice calling feature = no interest
-Ellary-@reddit
Well, I've been using Msty for quite some time.
Not really as an LLM server, but as a nice UI for all my local servers.
-It can connect to local setups of LM Studio, KoboldCpp, and Oobabooga WebUI using the ChatGPT-compatible API.
-It has an advanced chat history organization system with folders, etc.
-It supports advanced editing of user messages, LLM messages, chat branching, etc.
-Many other things like RAG, image processing, the Ollama backend, etc.
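Connecting a front end to one of those backends generally just means pointing it at the backend's OpenAI-compatible endpoint. A hedged sketch of the request shape such an API expects (the URL and model name are examples, not Msty settings):

```python
import json
import urllib.request

def chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    # Builds a request for an OpenAI-compatible /v1/chat/completions endpoint,
    # as served locally by LM Studio, KoboldCpp, Oobabooga WebUI, etc.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        base_url.rstrip("/") + "/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

# Example (requires a running local server):
# req = chat_request("http://localhost:1234", "local-model", "Hello!")
# print(urllib.request.urlopen(req).read())
```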
thankyoufatmember@reddit
I like it so far. There's a whole set of pre-made prompts. Is there any way to add one's own prompts to that menu for quick selection?
masonjames@reddit
There is!
Click on the prompt library icon in the left-hand sidebar. In the new window there's a "Custom Prompt" option which allows you to create your own prompt with tags, etc.
When you hit the prompt library icon in the future you can quickly search and reuse it.
thankyoufatmember@reddit
Thank you very much buddy!
masonjames@reddit
I have tested a whole heap of local apps for working with LLMs and Msty has been my top pick for months.
Just give the interface a try. It's so good.
micseydel@reddit
Do you have a public write up of your findings? I haven't tinkered much yet but I've been keeping an eye out for FOSS versions of this, and it looks like the highest quality thing I've seen, FOSS or not.
masonjames@reddit
I wrote about them a couple months ago here: https://masonjames.com/4-free-local-tools-for-ai-chats-agents/
Jan is on the list - it was a late entry because it's so new, but it is the best OSS one available imo.
I still use Msty as my daily driver because the interface is just so good (especially once you want to start testing prompts across models), and its RAG implementation is better than any other I've tested.
micseydel@reddit
Thanks so much for sharing!
Decaf_GT@reddit (OP)
The only competitor to this that I can think of in terms of quality and "all-in-one" and UI/UX would be Jan.ai.
micseydel@reddit
That wasn't on my radar, thanks!
vasileer@reddit
is it open-source?
KingPinX@reddit
nope
adrazzer@reddit
It's a superb little app
micseydel@reddit
This looks very impressive - not just another wrapper. Have you done or do you know of anyone who's done live demos with Obsidian vaults?
I also wasn't sure what "If the world runs out of coffee, blame our CloudStack, LLC Team." meant at the end of the page. I get it's a joke, but I'm not sure what "CloudStack, LLC Team" is supposed to mean specifically here.
yxs@reddit
I had no success setting it up with my Obsidian vault (errored out with a cryptic message when I tried to add it to the app). Would also be interested in other experiences.
yxs@reddit
Actually, with 1.0 the error message seems to be gone. But a short test with Llama 3 and my (~2000 notes) Obsidian vault was very underwhelming. It referenced files that had nothing to do with the query and also wildly fabricated things. Going to uninstall now and maybe try again in a few months.
thankyoufatmember@reddit
The website made me believe I was over at https://pinokio.computer/ at first
aLong2016@reddit
I like to use Msty
nntb@reddit
It does not have an APK