minsik-oh/dpo-model-sample
Likes: 0
Tags: PEFT · Safetensors · trl · dpo · Generated from Trainer
License: llama3.1
dpo-model-sample/.config — 84.8 kB, 1 contributor, history: 1 commit
Latest commit by minsik-oh: "minsik-oh/dpo-model-sample" (2f4c3be, verified, about 1 year ago)

All entries below share that same commit message and age (about 1 year ago):

configurations/                                                  (folder)
logs/                                                            (folder)
.last_opt_in_prompt.yaml                                          3 Bytes
.last_survey_prompt.yaml                                         37 Bytes
.last_update_check.json                                         135 Bytes
active_config                                                     7 Bytes
config_sentinel                                                   0 Bytes
default_configs.db                                                12.3 kB
gce                                                               5 Bytes
hidden_gcloud_config_universe_descriptor_data_cache_configs.db    12.3 kB