Is this the largest "No synthetic data" open weight LLM? (142B)

Posted by AaronFeng47@reddit | LocalLLaMA | View on Reddit | 40 comments

From the model card at https://huggingface.co/rednote-hilab/dots.llm1.base