FINAL-Bench/Darwin-36B-Opus · Hugging Face

Posted by jacek2023@reddit | LocalLLaMA | View on Reddit | 20 comments

https://huggingface.co/bartowski/FINAL-Bench_Darwin-36B-Opus-GGUF

Darwin-36B-Opus is a 36-billion-parameter mixture-of-experts (MoE) language model produced by the Darwin V7 evolutionary breeding engine from two publicly available parent models, referred to as the Mother and the Father.

Darwin V7 recombines these two parents into a single descendant that preserves the Mother's distilled chain-of-thought behavior while keeping the Father's expert topology structurally intact. The breeding process is fully automated and produces a deployable bfloat16 checkpoint in under an hour on a single GPU.
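The Darwin V7 merge rule itself is not public, so the following is only a conceptual sketch of what per-tensor recombination of two parent checkpoints could look like: reasoning-related tensors taken from the Mother, everything else (including the expert weights) from the Father. The `breed` function, the prefix list, and the tensor names are all hypothetical, not the engine's actual implementation.

```python
def breed(mother: dict, father: dict,
          mother_prefixes: tuple = ("embed", "lm_head")) -> dict:
    """Sketch of checkpoint recombination: build a child state dict by
    selecting, per tensor, which parent it comes from.

    Tensors whose names start with `mother_prefixes` come from the Mother
    (standing in for her chain-of-thought behavior); all remaining tensors,
    including the MoE expert weights, come from the Father so his expert
    topology is carried over unchanged.
    """
    child = {}
    for name, father_weight in father.items():
        if name.startswith(mother_prefixes) and name in mother:
            child[name] = mother[name]    # inherit from the Mother
        else:
            child[name] = father_weight   # inherit from the Father
    return child


# Toy usage with scalar "tensors" in place of real weight matrices:
mother = {"embed.weight": 1.0, "experts.0.weight": 2.0}
father = {"embed.weight": 3.0, "experts.0.weight": 4.0}
child = breed(mother, father)
```

In this toy run the child takes `embed.weight` from the Mother and `experts.0.weight` from the Father; a real engine would presumably apply a far more sophisticated, possibly learned or evolved, selection rule over thousands of tensors.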

On the GPQA Diamond benchmark — 198 graduate-level questions in physics, chemistry, and biology — Darwin-36B-Opus achieves 88.4%, establishing it as the highest-performing model in the Darwin family and extending the series' record of producing state-of-the-art open models through evolution rather than retraining.