Boko99/qwen2_dpo
Safetensors model repository (branch: main)
1 contributor, 2 commits

History:
5347601 (verified), Boko99: Upload folder using huggingface_hub, 4 months ago

Files:
vanilla_DPO_Beta_0.1_LR_2.0e-6/  (Upload folder using huggingface_hub, 4 months ago)
.gitattributes  1.52 kB  (initial commit, 4 months ago)