BIMU233/SPPO_7B
Safetensors · qwen2
Commit History (main)
Upload folder using huggingface_hub
e963a89 · verified · BIMU233 committed on Jan 19

initial commit
53dfdda · verified · BIMU233 committed on Jan 19
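
For context, "Upload folder using huggingface_hub" is the stock commit message the huggingface_hub library typically produces when a folder is pushed without a custom message. A minimal sketch of such an upload, assuming a hypothetical local checkpoint directory ./SPPO_7B and an already-authenticated session (neither is stated on this page):

from huggingface_hub import HfApi

api = HfApi()  # picks up the token saved by `huggingface-cli login`
api.upload_folder(
    folder_path="./SPPO_7B",    # hypothetical local folder, not taken from this page
    repo_id="BIMU233/SPPO_7B",  # target model repository
    repo_type="model",
    commit_message="Upload folder using huggingface_hub",
)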