Hugging Face
OpenMOSE / RWKV-GLM-4.7-Flash-exp
Tags: Text Generation · Transformers · Safetensors · English · Chinese · Japanese · rwkv07i_moe · rwkv · rwkv7 · hybrid · linear-attention · distillation · tica · Mixture of Experts · conversational · custom_code
License: apache-2.0
main / RWKV-GLM-4.7-Flash-exp / test_client_api.py
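The contents of `test_client_api.py` are not shown on this page, so the following is only a hedged sketch of what a client-side API test for a served chat model like this one might look like. It assumes an OpenAI-compatible `/v1/chat/completions` endpoint; the base URL, model name string, and helper names below are hypothetical placeholders, not taken from the repository.

```python
# Hedged sketch of a chat-API test client. The endpoint shape, base URL,
# and model name are assumptions, not details from the actual repository.
import json
import urllib.request


def build_chat_payload(prompt, model="RWKV-GLM-4.7-Flash-exp", max_tokens=128):
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }


def send_chat_request(base_url, payload):
    """POST the payload to the server and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    # Assumes a locally running inference server; adjust the URL as needed.
    payload = build_chat_payload("Hello! Briefly introduce yourself.")
    print(send_chat_request("http://127.0.0.1:8000", payload))
```

Splitting payload construction from the network call keeps the request shape testable without a running server.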
Commit History
Upload folder using huggingface_hub (411272a) by OpenMOSE, committed 5 days ago