
Model Card for Marble-3B-Instruct

Marble-3B-Instruct is the instruction-tuned version of Marble-3B (the Traditional Chinese continued-pretraining (CPT) version of Granite 3.1 3B-A800M MoE). It was fine-tuned with SFT on Traditional Chinese dialogue data and provides Taiwan-oriented Traditional Chinese conversational ability on a MoE architecture.

⚠️ Key specifications: This is a 3B Mixture-of-Experts (MoE) model and is text-only (single modality).

Model Details

The MoE architecture activates only a subset of expert weights at inference time, lowering inference cost while preserving model capacity. This model is instruction-tuned on top of the Marble-3B Traditional Chinese CPT base, with the goal of delivering stable, usable conversational ability in Traditional Chinese and Taiwan-specific contexts under a MoE architecture. A minimal usage sketch follows the key features list below.

Key Features

  1. MoE chat model: lower inference cost than a dense model of comparable capacity.
  2. Taiwan-context alignment: training data consists mainly of Traditional Chinese and tasks common in Taiwan.
  3. Edge-friendly deployment: the 3B scale runs comfortably on consumer-grade hardware.
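The sketch below is not part of the original card; it shows one common way to chat with the model through Hugging Face transformers. The chat template, prompt, and sampling parameters are illustrative assumptions; only the repository id lianghsun/Marble-3B-Instruct comes from this card.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lianghsun/Marble-3B-Instruct"  # gated repo: accept the access conditions on the Hub first

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "請用繁體中文簡單介紹台北101。"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# At inference only a subset of the MoE experts is activated per token,
# so decoding is cheaper than a dense model with the same total parameter count.
outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))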

Model Description

Model Sources

Citation

@misc{marble_3b_instruct,
  title        = {Marble-3B-Instruct: A Traditional Chinese Instruction-Tuned Granite 3B-A800M MoE Model for Taiwan},
  author       = {Huang, Liang Hsun},
  year         = {2025},
  howpublished = {\url{https://huggingface.co/lianghsun/Marble-3B-Instruct}}
}

Acknowledgements

  • We thank APMIC for providing compute support.

Model Card Authors

Huang Liang Hsun

Model Card Contact

Huang Liang Hsun
