Z.ai Releases GLM-5.1 Open-Source: 754B MoE Tops SWE-Bench Pro at 58.4, Surpassing GPT-5.4 and Claude Opus 4.6; MIT License

On April 7, 2026, Z.ai (formerly Zhipu AI) released GLM-5.1, a 754-billion-parameter mixture-of-experts model, under the MIT license on Hugging Face. The model scored 58.4 on SWE-Bench Pro, surpassing GPT-5.4 (57.7) and Claude Opus 4.6 (57.3), and ranked #1 among open-source models on LLM Arena based on over 11,000 community votes. GLM-5.1 is a follow-on to GLM-5, which Z.ai trained entirely on Huawei Ascend 910B chips.