SmolLM3 has day-0 support in MistralRS!

Posted by EricBuehler@reddit | LocalLLaMA

It's a SoTA 3B model with hybrid reasoning and 128k context.

Hits ⚡105 T/s with AFQ4 quantization on an M3 Max.

Link: https://github.com/EricLBuehler/mistral.rs
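If you want to try the AFQ4 setup programmatically, mistral.rs also exposes a Rust API. The sketch below is adapted from the repo's Rust examples, so treat the exact names (TextModelBuilder, with_isq, IsqType::AFQ4, send_chat_request) as assumptions and check the examples directory for the current API:

use anyhow::Result;
use mistralrs::{IsqType, TextMessageRole, TextMessages, TextModelBuilder};

// Cargo.toml (assumed): mistralrs pulled from the GitHub repo, plus tokio and anyhow.
#[tokio::main]
async fn main() -> Result<()> {
    // Load SmolLM3-3B and quantize it in place with AFQ 4-bit, the setup the
    // ~105 T/s number above refers to (AFQ is Metal-only as far as I know).
    let model = TextModelBuilder::new("HuggingFaceTB/SmolLM3-3B")
        .with_isq(IsqType::AFQ4)
        .with_logging()
        .build()
        .await?;

    // It's a hybrid reasoning model, so a plain chat request is enough to see it think.
    let messages = TextMessages::new().add_message(
        TextMessageRole::User,
        "Explain what hybrid reasoning means in one paragraph.",
    );

    let response = model.send_chat_request(messages).await?;
    println!("{}", response.choices[0].message.content.as_ref().unwrap());

    Ok(())
}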

Using MistralRS means it's super easy to run:

./mistralrs-server -i run -m HuggingFaceTB/SmolLM3-3B
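If you start the server with a port instead of -i, mistral.rs serves an OpenAI-compatible HTTP API, so any OpenAI-style client can talk to it. Here's a minimal Rust client sketch using reqwest; the port (1234) and the exact serve flags are my assumptions, so check mistralrs-server --help:

use serde_json::{json, Value};

// Cargo.toml (assumed): reqwest 0.12 with the "json" feature, serde_json 1, tokio 1 ("full").
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::Client::new();

    // Standard OpenAI chat-completions payload; the model field should match
    // whatever the server loaded (HuggingFaceTB/SmolLM3-3B here).
    let body = json!({
        "model": "HuggingFaceTB/SmolLM3-3B",
        "messages": [
            { "role": "user", "content": "Give me a one-line summary of Rust's ownership model." }
        ],
        "max_tokens": 128
    });

    let resp: Value = client
        .post("http://localhost:1234/v1/chat/completions")
        .json(&body)
        .send()
        .await?
        .json()
        .await?;

    // Print just the assistant's reply text.
    println!("{}", resp["choices"][0]["message"]["content"].as_str().unwrap_or(""));

    Ok(())
}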

What's next for MistralRS? Full Gemma 3n support, multi-device backend, and more. Stay tuned!

Demo video: https://reddit.com/link/1luy32e/video/kkojaflgdpbf1/player