sumo43/SOLAR-10.7B-Instruct-DPO-v2.0
Commit History

initial commit · 0f40421 · sumo43 committed on Dec 22, 2023