OpenMOSE / RWKV-GLM-4.7-Flash-exp

Text Generation · Transformers · Safetensors
Languages: English, Chinese, Japanese
Tags: rwkv07i_moe, rwkv, rwkv7, hybrid, linear-attention, distillation, tica, Mixture of Experts, conversational, custom_code
License: apache-2.0
main / RWKV-GLM-4.7-Flash-exp / __pycache__ / test_openai_api.cpython-312.pyc
OpenMOSE · Upload folder using huggingface_hub · 411272a · 5 days ago
Safe · 8.7 kB
This file contains binary data. It cannot be displayed, but you can still download it.