sssssungjae's Collections: SFT · DPO · SFT Dataset · DPO Dataset
DPO
DPO model · updated Sep 26, 2025
sssssungjae/qwen2.5-dpo-shi3 · Text Generation · 8B · Updated Sep 24, 2025
sssssungjae/qwen2.5-dpo-shi2 · Text Generation · 8B · Updated Sep 22, 2025 · 1