Hugging Face
nomadrp/dpo-v1
Tags: Transformers · Safetensors · Generated from Trainer · trl · dpo · arxiv:2305.18290
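The model is tagged `trl` and `dpo` and cites arXiv:2305.18290, the Direct Preference Optimization paper, so it was presumably trained with TRL's DPO trainer. As a minimal sketch of the objective from that paper (the function name and inputs here are illustrative, not part of this repository), the per-example DPO loss compares the policy's and a frozen reference model's log-probabilities on a chosen vs. rejected completion:

```python
import math

def dpo_loss(pi_logp_chosen, pi_logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """Per-example DPO loss: -log sigmoid(beta * (policy margin - reference margin))."""
    logits = beta * ((pi_logp_chosen - ref_logp_chosen)
                     - (pi_logp_rejected - ref_logp_rejected))
    return -math.log(1.0 / (1.0 + math.exp(-logits)))  # -log σ(logits)

# At initialization the policy equals the reference, so the implicit reward
# margin is 0 and the loss is -log σ(0) = log 2 ≈ 0.693.
loss = dpo_loss(-5.0, -7.0, -5.0, -7.0)
```

Training pushes the policy's log-probability margin for the chosen completion above the reference's, which drives the loss below log 2; `beta` controls how strongly deviations from the reference are rewarded.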
dpo-v1 · Commit History
dpo-v1 · a8f41bc (verified) · nomadrp committed on Apr 19, 2025
Training in progress, step 50 · 4eff429 (verified) · nomadrp committed on Apr 19, 2025
initial commit · db8375e (verified) · nomadrp committed on Apr 19, 2025