sssssungjae's Collections: SFT • DPO • SFT Dataset • DPO Dataset

DPO Dataset
Updated Sep 26, 2025
sssssungjae/dpo_shiba_10K — Viewer available • Updated Sep 24, 2025 • 10k rows • 4 likes
sssssungjae/dpo_shiba_safety1 — Viewer available • Updated Sep 21, 2025 • 10k rows • 5 likes
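
Datasets in this collection can typically be loaded with the Hugging Face `datasets` library. A minimal sketch, assuming the datasets are public and expose a default "train" split; the column names of DPO preference data (e.g. prompt/chosen/rejected) are not confirmed by this page, so the sketch inspects the schema rather than assuming it:

```python
# Minimal sketch: load one of the collection's DPO datasets.
# Assumptions (not confirmed by this page): public access, a "train" split.
from datasets import load_dataset

ds = load_dataset("sssssungjae/dpo_shiba_10K", split="train")

print(ds)           # summary: features and number of rows
print(ds.features)  # actual column schema of this dataset
print(ds[0])        # first example
```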