thorirhrafn / gpt7B_DPO_model (main branch)
1 contributor · History: 1 commit
Initial commit by thorirhrafn (4a0ac9e, verified) · almost 2 years ago

.gitattributes · 1.52 kB · initial commit · almost 2 years ago