What happened to MLX-LM? What are the alternatives?

Posted by Solus23451@reddit | LocalLLaMA | View on Reddit | 4 comments

Support seems non-existent and the last proper release was over a month ago. Compared with llama.cpp, the difference in activity and support is miles apart. Is there an alternative, or should I just use llama.cpp on my MacBook?