sssssungjae's Collections
SFT
DPO
SFT Dataset
DPO Dataset

DPO

updated Sep 26, 2025

DPO models

  • sssssungjae/qwen2.5-dpo-shi3

    Text Generation • 8B • Updated Sep 24, 2025

  • sssssungjae/qwen2.5-dpo-shi2

    Text Generation • 8B • Updated Sep 22, 2025 • 1