Boko99/qwen2_c_dpo
Safetensors
301 MB
2 contributors · History: 2 commits
ByungOh-Ko · update · 5534b4a · 5 months ago
DPO_Beta_0.1_LR_2.0e-6 · update · 5 months ago
.gitattributes · 1.52 kB · initial commit · 5 months ago