Hugging Face
naiweizi/dpo-harmless_helpful-mixed
dpo-harmless_helpful-mixed (29.6 MB)
1 contributor · History: 2 commits
Latest commit: a336ee6 (verified) by naiweizi, "Initial upload", about 1 year ago
final_checkpoint/ · "Initial upload" · about 1 year ago
.gitattributes · 1.52 kB · "initial commit" · about 1 year ago