Vortex5/Mystic-Rune-v2-12B-Q4_K_M-GGUF

Tags: Transformers · GGUF · mergekit · Merge · llama-cpp · gguf-my-repo
Repository size: 7.48 GB, 1 contributor, 3 commits.
Latest commit: 0fa9ab5 (verified) by Vortex5, "Upload README.md with huggingface_hub", 5 months ago.
Files:

.gitattributes                  1.59 kB   Upload mystic-rune-v2-12b-q4_k_m.gguf with huggingface_hub   5 months ago
README.md                       1.79 kB   Upload README.md with huggingface_hub                        5 months ago
mystic-rune-v2-12b-q4_k_m.gguf  7.48 GB   Upload mystic-rune-v2-12b-q4_k_m.gguf with huggingface_hub   5 months ago