GingerBled/DPOV3 · Safetensors · qwen3
DPOV3 / config.json (branch: main)
Commit History
Add full-parameter DPO weights (2025-06-05)
bb7e8e5 (verified) · bouchonnn committed on Jun 5, 2025