nebulaResearch posted an update Sep 26, 2025
We’re thrilled to announce the release of Zagros-1.0-Quick on Hugging Face – a 30.5B-parameter multilingual Mixture-of-Experts (MoE) language model with a Persian heart! Built on the Zagros architecture, it delivers efficient, high-performance NLP for text generation, translation, and more across English, Persian, Arabic, and beyond.
Open-source and ready for your projects – download it now and help us test it on real-world tasks!
Check it out: darsadilab/zagros-1.0-quick
#ZagrosLLM #PersianAI #MultilingualNLP #MoE