
Model Card for Marble-3B

Marble-3B is an MoE base model built on ibm-granite/granite-3.1-3b-a800m-base (IBM Granite 3.1 3B-A800M MoE) and continually pretrained (CPT) for Traditional Chinese and the Taiwanese (R.O.C.) context. It serves as the Traditional Chinese foundation for downstream models such as Marble-3B-Instruct.

⚠️ Key specifications: this is a 3B Mixture-of-Experts (MoE) base model; it is text-only (single modality), has undergone CPT only, and is not instruction-tuned. You must run SFT yourself before it gains conversational ability.
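Because it is a CPT-only base model, Marble-3B does plain text continuation rather than chat. The sketch below shows one way to load it for greedy completion with 🤗 Transformers, assuming a version recent enough to support Granite MoE checkpoints; the prompt is illustrative only.

```python
# Minimal completion sketch for a base (non-instruct) model: feed raw
# Traditional Chinese text and let the model continue it.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lianghsun/Marble-3B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the checkpoint ships in BF16
    device_map="auto",
)

prompt = "臺灣最高的山是"  # a plain continuation prompt, not a chat turn
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```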

Model Details

The IBM Granite 3.1 series introduces an MoE architecture (3B total parameters, 800M activated), offering an interesting trade-off between inference cost and capability. Marble-3B injects Traditional Chinese and Taiwan-context corpora into Granite 3.1 3B-A800M so that downstream tasks gain a Traditional Chinese foundation on the MoE architecture, balancing inference efficiency with multi-domain coverage.
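To verify the MoE geometry inherited from the Granite base, the model config exposes it. The field names below follow the GraniteMoe config in 🤗 Transformers and are an assumption about this checkpoint rather than documented specifics of Marble-3B.

```python
# Inspect the MoE layout without downloading the full weights.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("lianghsun/Marble-3B")
print(cfg.model_type)           # expected to be a Granite MoE variant, e.g. "granitemoe"
print(cfg.num_local_experts)    # total experts per MoE layer
print(cfg.num_experts_per_tok)  # experts routed per token (the "activated" slice)
```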

Key Features

  1. MoE Traditional Chinese foundation: only 800M activated parameters, yet the capacity of 3B total parameters, making deployment more efficient than comparable dense models.
  2. Multi-domain adaptability: the MoE structure naturally routes knowledge across domains; combined with Traditional Chinese CPT, it can serve as a common base for fine-tuning in law, education, everyday life, and other domains.
  3. Downstream fine-tuning: a starting point for SFT/DPO toward Instruct and domain-specific applications (see the SFT sketch after this list).
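As a rough illustration of point 3, here is a minimal SFT sketch using TRL's SFTTrainer. The dataset path and hyperparameters are placeholders, not a recipe published for this model; it assumes a local JSONL file with a "text" column.

```python
# Hedged SFT sketch: turn the CPT base into a chat-capable model.
# "chat_sft.jsonl" is a hypothetical local dataset with a "text" column.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

dataset = load_dataset("json", data_files="chat_sft.jsonl", split="train")

trainer = SFTTrainer(
    model="lianghsun/Marble-3B",  # start from the Traditional Chinese CPT base
    train_dataset=dataset,
    args=SFTConfig(
        output_dir="marble-3b-sft",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        bf16=True,  # match the BF16 base weights
    ),
)
trainer.train()
```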

Model Description

Model Sources

Citation

@misc{marble_3b,
  title        = {Marble-3B: A Traditional Chinese Continued-Pretrained Granite 3B-A800M MoE Model for Taiwan},
  author       = {Huang, Liang Hsun},
  year         = {2025},
  howpublished = {\url{https://huggingface.co/lianghsun/Marble-3B}}
}

Acknowledgements

  • Special thanks to APMIC for providing compute support.

Model Card Authors

Huang Liang Hsun

Model Card Contact

Huang Liang Hsun
