Hugging Face model repository: GingerBled/qwen-DPO
Tags: Safetensors · qwen3 · dpo · License: apache-2.0
Files and versions (branch: main): 2.4 GB, 1 contributor, 3 commits
Latest commit: bouchonnn, "Upload README.md with huggingface_hub" (54bf461, verified, 10 months ago)
File                              Size       Last commit message                           Updated
.gitattributes                    1.57 kB    Add final DPO fine-tuned checkpoint (merged)  10 months ago
README.md                         684 Bytes  Upload README.md with huggingface_hub         10 months ago
added_tokens.json                 707 Bytes  Add final DPO fine-tuned checkpoint (merged)  10 months ago
config.json                       760 Bytes  Add final DPO fine-tuned checkpoint (merged)  10 months ago
generation_config.json            117 Bytes  Add final DPO fine-tuned checkpoint (merged)  10 months ago
merges.txt                        1.67 MB    Add final DPO fine-tuned checkpoint (merged)  10 months ago
model-00001-of-00002.safetensors  1.99 GB    Add final DPO fine-tuned checkpoint (merged)  10 months ago
model-00002-of-00002.safetensors  390 MB     Add final DPO fine-tuned checkpoint (merged)  10 months ago
model.safetensors.index.json      25.5 kB    Add final DPO fine-tuned checkpoint (merged)  10 months ago
special_tokens_map.json           496 Bytes  Add final DPO fine-tuned checkpoint (merged)  10 months ago
tokenizer.json                    11.4 MB    Add final DPO fine-tuned checkpoint (merged)  10 months ago
tokenizer_config.json             9.7 kB     Add final DPO fine-tuned checkpoint (merged)  10 months ago
vocab.json                        2.78 MB    Add final DPO fine-tuned checkpoint (merged)  10 months ago

(The two safetensors shards, tokenizer.json, and merges.txt are stored via xet-backed large-file storage.)
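The checkpoint is split across two safetensors shards, tied together by model.safetensors.index.json: its `weight_map` maps each tensor name to the shard file that holds it, so a loader can open each shard once and read only the tensors mapped to it. A minimal sketch of resolving that mapping (the tensor names and total size below are illustrative stand-ins, not values taken from this repository's actual index file):

```python
import json

# Illustrative stand-in for model.safetensors.index.json; the real file
# maps every tensor in the model to one of the two shards listed above.
index_json = json.dumps({
    "metadata": {"total_size": 2382000000},
    "weight_map": {
        "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
        "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
        "lm_head.weight": "model-00002-of-00002.safetensors",
    },
})

def shard_for(index: dict, tensor_name: str) -> str:
    """Return the shard file that stores a given tensor."""
    return index["weight_map"][tensor_name]

def tensors_by_shard(index: dict) -> dict:
    """Group tensor names by the shard file that contains them."""
    groups = {}
    for name, shard in index["weight_map"].items():
        groups.setdefault(shard, []).append(name)
    return groups

index = json.loads(index_json)
print(shard_for(index, "lm_head.weight"))
for shard, names in tensors_by_shard(index).items():
    print(shard, len(names))
```

In practice you never touch this file directly: `transformers.AutoModelForCausalLM.from_pretrained("GingerBled/qwen-DPO")` reads the index and loads both shards for you.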