OpenMOSE/HRWKV7-Reka-Flash3.1-Preview
Pipeline: Text Generation
Library: Transformers
Tags: rwkv, linear-attention, reka, distillation, knowledge-distillation, hybrid-architecture, language-model
arXiv: 2505.03005
License: apache-2.0
2 contributors · History: 1 commit
OpenMOSE · initial commit · 3878d8d (verified) · 9 months ago
.gitattributes    1.52 kB     initial commit · 9 months ago
README.md         31 Bytes    initial commit · 9 months ago