AhmadHatam/Qwen-Python-DPO-Adapter

Tags: Transformers · Safetensors · English · text-generation-inference · unsloth · qwen2 · trl
License: apache-2.0
Commit History — Qwen-Python-DPO-Adapter/README.md (branch: main)

Upload README.md with huggingface_hub
f3ee545 (verified) · AhmadHatam committed 14 days ago