# latestissue/rwkv-4-code-7b-world-32k-ggml-quantized

License: apache-2.0
---
license: apache-2.0
---
Source: https://huggingface.co/xiaol/RWKV-Code-7B-world-32k