What happened to MLX-LM? What are the alternatives?
Posted by Solus23451@reddit | LocalLLaMA | 4 comments
Support seems non-existent and the last proper release was over a month ago. Compared with llama.cpp, the difference in activity and support is miles apart. Is there an alternative, or should I just use llama.cpp on my MacBook?
chibop1@reddit
It's still going. Look at the history.
https://github.com/ml-explore/mlx-lm/commits
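For anyone wondering whether it still works day to day, this is roughly the quickstart from the mlx-lm README; the model name is just an example from the mlx-community hub, swap in whatever you actually run:

```python
# Minimal mlx-lm sketch (example model name; replace with your own).
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

prompt = "Write a haiku about Apple Silicon."
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

# verbose=True prints the generated text and tokens/sec as it streams.
response = generate(model, tokenizer, prompt=prompt, verbose=True)
```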
FoxiPanda@reddit
Llama.cpp is not bad on macOS via the Homebrew release. It's reasonably performant and well supported.
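Rough sketch of that workflow: `brew install llama.cpp` gives you `llama-server`, which exposes an OpenAI-compatible HTTP API. The model path and port below are placeholders, assuming you started it with something like `llama-server -m your-model.gguf --port 8080`:

```python
# Minimal sketch: query a locally running llama-server via its
# OpenAI-compatible /v1/chat/completions endpoint (default port 8080).
import json
import urllib.request

payload = {
    "messages": [{"role": "user", "content": "Hello from my MacBook"}],
    "max_tokens": 64,
}
req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["choices"][0]["message"]["content"])
```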
Koalababies@reddit
I've been happy with omlx
sammcj@reddit
Yeah oMLX is shaping up well.