derko83/dpo_longstep_3_4_4
Tags: Safetensors · qwen3
Commit History
Upload folder using huggingface_hub · a78a799 (verified) · derko83 committed on Jun 2, 2025
initial commit · f5bdb9d (verified) · derko83 committed on Jun 2, 2025