Hugging Face
koreankiwi99/M2_dpo_model_base_Math-Step-DPO-10K
Tags: TensorBoard · Safetensors · qwen3
Branch: main · M2_dpo_model_base_Math-Step-DPO-10K / runs (99.2 kB)
1 contributor · History: 1 commit

koreankiwi99: Upload folder using huggingface_hub (commit 1de6541, verified, 11 months ago)
Jun05_14-17-48_47dafa9566a9 · Upload folder using huggingface_hub · 11 months ago