Jennny/dpo_acc
dpo_acc/optimizer.pt (branch: main)
Commit History
upload checkpoint
7b59958 (verified)
Jennny committed on Sep 3, 2025