OpenMOSE/RWKV-GLM-4.7-Flash-exp
Pipeline: Text Generation · Transformers · Safetensors
Languages: English, Chinese, Japanese
Tags: rwkv07i_moe, rwkv, rwkv7, hybrid, linear-attention, distillation, tica, Mixture of Experts, conversational, custom_code
License: apache-2.0
Commit History (branch: main)
aff7c6c (verified) · Upload folder using huggingface_hub · OpenMOSE committed 4 days ago
9af1ba0 · Upload folder using huggingface_hub · OpenMOSE committed 5 days ago
411272a · Upload folder using huggingface_hub · OpenMOSE committed 6 days ago
8f164da · Update README.md · OpenMOSE committed 6 days ago
9c83829 · Update README.md · OpenMOSE committed 6 days ago
470942f · Update README.md · OpenMOSE committed 6 days ago
d54676f · initial commit · OpenMOSE committed 6 days ago