samitizerxu/Deepseek-R1-Distil-7B-Qwen-DPO-keep-v2
Safetensors · qwen2
Commit History
Upload folder using huggingface_hub
2ef25b8 · verified · samitizerxu committed on Mar 30, 2025

Upload folder using huggingface_hub
c9d144a · verified · samitizerxu committed on Mar 30, 2025

initial commit
ae11b86 · verified · samitizerxu committed on Mar 30, 2025