---
license: apache-2.0
language:
- en
tags:
- zen
- zenlm
- hanzo-ai
- multilingual
- translation
- CJK
pipeline_tag: text-generation
library_name: transformers
base_model: zenlm/zen-next-80b-instruct
---

# Zen Multilingual

> **Parameters**: 32B | **Architecture**: Zen 3 | **Context**: 128K | **License**: Apache 2.0 | **Released**: 2024-12-01

Multilingual generation across 30+ languages: English, Chinese, Japanese, Korean, Arabic, Spanish, French, German, Portuguese, Russian, and more. Strong at cross-lingual reasoning, code-switching, and multilingual instruction following.

Base weights: [zenlm/zen-next-80b-instruct](https://huggingface.co/zenlm/zen-next-80b-instruct)

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("zenlm/zen-next-80b-instruct", torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained("zenlm/zen-next-80b-instruct")

messages = [{"role": "user", "content": "Your domain-specific prompt here"}]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(text, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=1024)
print(tokenizer.decode(output[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True))
```

---

## The Zen LM Family

Joint research between **Hanzo AI** (Techstars '17), **Zoo Labs Foundation** (a 501(c)(3) nonprofit), and **Lux Partners Limited**. All weights are released under Apache 2.0: download, run locally, fine-tune, and deploy commercially.

[HuggingFace](https://huggingface.co/zenlm) · [Chat](https://hanzo.chat) · [API](https://api.hanzo.ai) · [Docs](https://zenlm.org)
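For translation specifically, a plain instruction in the user message works with the chat template shown in the quickstart above. The helper below is a minimal sketch (the function name and prompt wording are our own, not part of the model's API) that builds such a message list for any source/target language pair:

```python
def translation_messages(text: str, source_lang: str, target_lang: str) -> list[dict]:
    """Build a chat message list asking the model to translate `text`.

    Hypothetical convenience helper; the prompt phrasing is an assumption,
    not an official template for this model.
    """
    prompt = (
        f"Translate the following {source_lang} text into {target_lang}. "
        f"Reply with the translation only.\n\n{text}"
    )
    return [{"role": "user", "content": prompt}]


# Example: Chinese -> English. Pass the result to
# tokenizer.apply_chat_template(...) exactly as in the snippet above.
messages = translation_messages("机器学习正在改变世界。", "Chinese", "English")
```

The same pattern extends to code-switching or cross-lingual reasoning prompts: only the instruction text in the single user message changes.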