Boko99/qwen2_c_dpo
Safetensors model
Files and versions
Branch: main
qwen2_c_dpo: 301 MB, 2 contributors, 2 commits
Latest commit: "update" (5534b4a) by ByungOh-Ko, 2 months ago
DPO_Beta_0.1_LR_2.0e-6/ -- "update", 2 months ago
.gitattributes -- 1.52 kB, "initial commit", 2 months ago