Bgoood/ass4_DPO_ft_model
PEFT · Safetensors · arxiv:1910.09700
Commit History
Upload 7 files · 0c05526 (verified) · committed by Bgoood on May 11, 2025
Upload 7 files · bcd1095 (verified) · committed by Bgoood on May 11, 2025
initial commit · c1e07a3 (verified) · committed by Bgoood on May 11, 2025