qayemmehdi/MNLP_M2_dpo_model
Safetensors · qwen3
Commit History
Update README.md · 70780cf (verified) · qayemmehdi committed on Jun 8, 2025
Create README.md · 741fd49 (verified) · qayemmehdi committed on Jun 8, 2025
Upload folder using huggingface_hub · 0171bb3 (verified) · qayemmehdi committed on May 23, 2025
initial commit · f4fbf7f (verified) · qayemmehdi committed on May 23, 2025