GingerBled/DPOV2
Safetensors · qwen3
main / DPOV2 / generation_config.json
Commit History
Add full-parameter DPO weights (2025-06-04)
a495970 (verified) · bouchonnn committed on Jun 4, 2025