Why can't most models on Hugging Face be run on Ollama?
Posted by KaKi_87@reddit | LocalLLaMA | View on Reddit | 13 comments
And how can I get a list of only those that can?
https://huggingface.co/models?library=ollama doesn't work, because I've already stumbled on models that didn't have the tag yet but can still run on it.
Thanks
jacek2023@reddit
Start by switching from Ollama to llama.cpp, then search for GGUFs.
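To expand on the "search for GGUFs" part: most Hugging Face repos ship raw safetensors/PyTorch checkpoints, which llama.cpp (and Ollama) can't load directly; only the `.gguf` files run as-is. A minimal sketch of telling the two apart by filename (the file names below are hypothetical examples):

```python
# Sketch: identify which files in a model repo are GGUF weights.
# Filenames are hypothetical examples, not a real repo listing.
def gguf_files(filenames):
    """Return only the files llama.cpp/Ollama could load directly."""
    return [f for f in filenames if f.lower().endswith(".gguf")]

repo_files = [
    "model.safetensors",       # raw weights: needs conversion first
    "tokenizer.json",          # metadata, not runnable weights
    "llama-7b.Q4_K_M.gguf",    # quantized GGUF: runs as-is
]
print(gguf_files(repo_files))  # → ['llama-7b.Q4_K_M.gguf']
```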
KaKi_87@reddit (OP)
Too complicated.
erazortt@reddit
How is that too complicated? You just need ten minutes of reading, max! Anybody who has ever used a PC can do that.
Ambitious-Profit855@reddit
I got bad news for you...
InvertedVantage@reddit
Try LM Studio then.
random-tomato@reddit
+1; LM Studio is better than Ollama in almost every way
KaKi_87@reddit (OP)
most.
chibop1@reddit
Ollama can run pretty much all the text-to-text models on Hugging Face (not vision-language ones) in GGUF format that llama.cpp supports.
GloomyPop5387@reddit
Ollama can use GGUFs.
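One way Ollama consumes a local GGUF is through a Modelfile whose `FROM` line points at the file (the path below is a hypothetical example):

```
# Modelfile — point Ollama at a locally downloaded GGUF
FROM ./llama-7b.Q4_K_M.gguf
```

Then `ollama create my-llama -f Modelfile` registers it, and `ollama run my-llama` starts it.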
Ok_Stranger_8626@reddit
What are you talking about?? I've downloaded hundreds of models from Hugging Face, and they've all run on Ollama without any issue.
Amon_star@reddit
https://huggingface.co/models?apps=ollama&sort=trending You're just using the search incorrectly.
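The same filter the web search uses is available through the Hub's public REST API (`/api/models`), which makes it easy to script a list of GGUF-only repos. A sketch that just builds the query URL; the `filter`, `search`, `sort`, and `limit` parameters are from the Hub API, but the exact sort value you want may differ:

```python
from urllib.parse import urlencode

# Base endpoint of the public Hugging Face Hub API for listing models.
BASE = "https://huggingface.co/api/models"

def gguf_search_url(query="", limit=20):
    """Build an API URL that lists only repos tagged 'gguf'."""
    params = {"filter": "gguf", "sort": "downloads", "limit": limit}
    if query:
        params["search"] = query
    return f"{BASE}?{urlencode(params)}"

# → https://huggingface.co/api/models?filter=gguf&sort=downloads&limit=20&search=llama
print(gguf_search_url("llama"))
```

Fetching that URL (e.g. with `requests.get`) returns JSON model listings you can feed to `ollama pull` or llama.cpp.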
AccordingRespect3599@reddit
Any GGUF.