OLMo3-190M-zh-full

Full model for lesson L04 of the zero-to-hero AI LLM training camp (llm001): 190M parameters, a 20-step test training run.

Model configuration

  • hidden_size: 768, num_layers: 12, num_heads: 12, intermediate_size: 3072
  • vocab_size: 48000, sliding_window: 4096
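As a sanity check, the configuration above is consistent with the "190M" in the model name. The sketch below is a back-of-envelope parameter count assuming a gated (SwiGLU-style) MLP and untied input/output embeddings, as in the OLMo family; biases and norm parameters are ignored, so the figure is approximate.

```python
# Hedged parameter estimate from the config above (assumptions:
# SwiGLU MLP with gate/up/down projections, untied embeddings,
# biases and layer norms ignored).
hidden, layers, inter, vocab = 768, 12, 3072, 48000

embed = vocab * hidden          # input embedding table
lm_head = vocab * hidden        # untied output projection
attn = 4 * hidden * hidden      # q, k, v, o projections per layer
mlp = 3 * hidden * inter        # gate, up, down projections per layer
total = embed + lm_head + layers * (attn + mlp)
print(f"~{total / 1e6:.0f}M params")  # ~187M, i.e. roughly 190M
```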

Training configuration

  • Data: cmz1024/llm101-olmo3-zh-demo-data (500M tokens)
  • Training: H100, max_steps=20, bs=16×8=128, lr=5e-4, bf16
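For scale, the token budget of this test run can be estimated from the batch settings above. The sequence length is not stated on the card; the sketch below assumes it matches the sliding_window of 4096.

```python
# Hedged token-budget estimate for the 20-step test run.
# Assumption: packed sequence length = 4096 (sliding_window); the
# actual training sequence length is not stated on the card.
per_device_bs, grad_accum = 16, 8   # bs = 16 × 8 = 128 sequences/step
seq_len, steps = 4096, 20

global_bs = per_device_bs * grad_accum
tokens = global_bs * seq_len * steps
print(f"~{tokens / 1e6:.1f}M tokens")  # ~10.5M of the 500M-token dataset
```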

Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the checkpoint and its tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("cmz1024/olmo3-190m-zh-full")
tok = AutoTokenizer.from_pretrained("cmz1024/olmo3-190m-zh-full")

# Quick generation smoke test (a 20-step checkpoint will produce rough output)
inputs = tok("你好,", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=32)
print(tok.decode(out[0], skip_special_tokens=True))
```