
Yunmo_v0 (Feasibility Study)

📊 Model Information

  • 🏗️ Architecture: Transformer
  • 🧱 Layers: 16 Transformer blocks
  • 🧠 Hidden dimension: 768
  • 🧩 Attention heads: 8
  • 🔡 Vocabulary size: 6400
  • 📏 Max sequence length: 512 (RoPE not used)
  • 🔢 Parameters: ~108M
  • 💾 Weight format: Transformers
  • 📚 Training corpus: high-quality Chinese text covering dialogue, knowledge, and task-oriented content
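A rough back-of-the-envelope check of the parameter count can be sketched from the specs above. The card does not state the MLP width, attention layout, or whether embeddings are tied, so the following assumes full multi-head attention, a 4x MLP expansion, and tied input/output embeddings; the gap to the reported ~108M would be explained by a narrower MLP or grouped-query attention:

```python
# Back-of-the-envelope parameter estimate for the specs above.
# Assumptions (not stated on the card): full multi-head attention,
# a 4x MLP expansion, and tied input/output embeddings (so the
# LM head adds no extra weights). Norm/bias parameters are negligible.
hidden = 768
layers = 16
vocab = 6400

embed = vocab * hidden                     # token embeddings (tied with LM head)
attn_per_layer = 4 * hidden * hidden       # Q, K, V and output projections
mlp_per_layer = 2 * hidden * (4 * hidden)  # up and down projections
total = embed + layers * (attn_per_layer + mlp_per_layer)

print(f"~{total / 1e6:.0f}M parameters")   # same order as the reported ~108M
```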

🚀 Usage - Hugging Face Transformers

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code is required because the repository ships custom modeling code
model = AutoModelForCausalLM.from_pretrained("yuhuanstudio/Yunmo_v0", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("yuhuanstudio/Yunmo_v0", trust_remote_code=True)

prompt = "你好,請自我介紹。"  # "Hello, please introduce yourself."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

⚠️ Notes

  • For academic research, technical validation, and internal testing only
  • Do not use in commercial or production environments
  • Do not use to generate illegal, sensitive, or inappropriate content

🤝 Contact and Acknowledgements

  • 📧 Author: Yuhuan
  • 🙏 Thanks to the open-source community and dataset contributors
