COURSEMO/dpo_adapter_400
Safetensors
Commit History
Upload folder using huggingface_hub
4b70764 · verified · Ziang Huang committed on Jul 23, 2025
initial commit
d841c37 · verified · Ziang Huang committed on Jul 23, 2025