OpenMOSE / HRWKV7-Reka-Flash3-Preview
Pipeline: Text Generation
Tags: Transformers · causal-lm · linear-attention · rwkv · reka · knowledge-distillation · multilingual
arXiv: 2505.03005
License: apache-2.0
HRWKV7-Reka-Flash3-Preview · 42.7 GB · 2 contributors · History: 3 commits
Latest commit: dcec4e6 (verified) · OpenMOSE · Update README.md · 7 months ago
Files:
.gitattributes · 1.52 kB · initial commit · 7 months ago
README.md · 63 Bytes · Update README.md · 7 months ago
hxa079-reka-flash3-stage2-hybrid.pth · 42.7 GB · Upload hxa079-reka-flash3-stage2-hybrid.pth with huggingface_hub · 7 months ago