koreankiwi99/M2_dpo_model_base_Math-Step-DPO-10K

Tags: TensorBoard · Safetensors · qwen3
Commit History (branch: main)
Upload folder using huggingface_hub
1de6541 (verified) · koreankiwi99 committed on Jun 5, 2025

initial commit
524c300 (verified) · koreankiwi99 committed on Jun 5, 2025
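The commit hashes above can be used to pin downloads to an exact repository state. As a minimal sketch, the Hub serves files at `https://huggingface.co/{repo_id}/resolve/{revision}/{filename}`; the helper below builds such a URL for this repo at commit `1de6541` (the filename `model.safetensors` is an assumed example, not confirmed by the file listing here):

```python
def resolve_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Build the Hub's direct-download URL for a file at a given revision."""
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

# Pin to the "Upload folder using huggingface_hub" commit from the history above.
url = resolve_url(
    "koreankiwi99/M2_dpo_model_base_Math-Step-DPO-10K",
    "model.safetensors",  # hypothetical filename for illustration
    revision="1de6541",
)
print(url)
```

In practice, `huggingface_hub.snapshot_download(repo_id=..., revision="1de6541")` accepts the same revision argument and fetches the whole repo at that commit.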