acbueff/dpo-base
Safetensors · qwen3
dpo-base/config.json — Commit History
Upload dpo_base model · 1f273f7 (verified) · acbueff committed 11 days ago