OpenMOSE/RWKV-GLM-4.7-Flash-exp
Tags: Text Generation · Transformers · Safetensors · English · Chinese · Japanese · rwkv07i_moe · rwkv · rwkv7 · hybrid · linear-attention · distillation · tica · Mixture of Experts · conversational · custom_code
License: apache-2.0
Branch: main, path: RWKV-GLM-4.7-Flash-exp/__pycache__ (8.7 kB, 1 contributor, 1 commit)
Latest commit: 411272a, "Upload folder using huggingface_hub" by OpenMOSE, 4 days ago
test_openai_api.cpython-312.pyc (8.7 kB), "Upload folder using huggingface_hub", 4 days ago