Hugging Face
RedMist137/DPO-Zephyr-7B
Tags: Safetensors · mistral · trl · dpo · Generated from Trainer
DPO-Zephyr-7B / generation_config.json (branch: main)
Commit History
Model save · 069c484 (verified) · RedMist137 committed on Mar 21, 2025
Model save · e11b85e (verified) · RedMist137 committed on Oct 17, 2024