GingerBled/DPOV7
Safetensors · qwen3
Commit History
Add full-parameter DPO weights (2025-06-06)
00552e5 · verified · bouchonnn committed on Jun 6, 2025
initial commit
0734c79 · verified · bouchonnn committed on Jun 6, 2025