Kukedlc/NeutriX7000-7b-DPO

Tags: Merge · mergekit · #dpo · MaximeLabonne · #mergeofmerge
NeutriX7000-7b-DPO · 3.67 kB
1 contributor · History: 4 commits

Latest commit: Kukedlc — Update README.md — 950bcbc (verified), about 2 years ago
  • .gitattributes — 1.52 kB — initial commit, about 2 years ago
  • README.md — 2.15 kB — Update README.md, about 2 years ago