When are we gonna get more 1-Bit models (Medium & Large size)?

Posted by pmttyji@reddit | LocalLLaMA | View on Reddit | 9 comments

Obviously this thought came after Prism ML's recent Bonsai 8B model.

That thread has what seems like honest feedback on the Bonsai-8B model. A few commenters mentioned that hallucination happened a few times. Hope future 1-bit models come with more improvements.

There's a recent thread on simulation for Qwen3.5 models, which looks awesome for tiny GPUs. I also mentioned the size ratio for medium/big/large models (on some other thread), which seems nice. Pasting the size ratio below.

(Parameters : Size in GB)

Wouldn't it be nice to have more 1-bit models in the above sizes? I could run 50B models with just 8GB VRAM, 100B models with just 24GB VRAM, ... which seems like a miracle.
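For anyone wanting to sanity-check those VRAM numbers, here's a rough back-of-envelope sketch (my own, not from the thread). It only counts weight storage at a given bits-per-weight and ignores KV cache, activations, and any higher-precision embedding/output layers, so real usage will be somewhat higher:

```python
def weight_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight storage in GB (decimal) for a model with
    `params_billions` parameters at `bits_per_weight`.
    Ignores KV cache, activations, and mixed-precision layers."""
    # params * bits -> bits, / 8 -> bytes; the 1e9 factors cancel into GB
    return params_billions * bits_per_weight / 8

# Rough comparison across common bit widths (1.58 = ternary BitNet-style)
for params in (8, 50, 100):
    for bits in (16, 4, 1.58, 1.0):
        print(f"{params:>4}B @ {bits:>5}-bit ~ {weight_size_gb(params, bits):6.2f} GB")
```

By this estimate a 50B model at 1 bit is ~6.25 GB of weights (plausible in 8GB VRAM with some headroom), and 100B at 1.58 bits is ~19.75 GB (plausible in 24GB), which roughly matches the claim above.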

Our dude is cooking something for us. Hope we get some soon.

> Qwen 3 8B. I’m cooking the 397B right now, since you guys have such an appetite for bitnets. - u/Party-Special-5177

Anyone else cooking something like this? Please share.