jeiku / spare (GGUF)
Branch: main · 61.2 GB · 1 contributor · History: 13 commits
Latest commit: jeiku, "Upload Aura_v2-Q4_K.gguf" (ab76ddb, verified), almost 2 years ago
File                     Size      Last commit message                                    Committed
.gitattributes           2.19 kB   Upload Aura_v2-Q4_K.gguf                               almost 2 years ago
2x9B-Q3_K_S.gguf         6.94 GB   Rename ggml-model-Q3_K_S.gguf to 2x9B-Q3_K_S.gguf      almost 2 years ago
Ashera-Q4_K.gguf         4.37 GB   Upload Ashera-Q4_K.gguf                                almost 2 years ago
Aura-Q4_K.gguf           4.37 GB   Upload Aura-Q4_K.gguf                                  almost 2 years ago
Aura_v2-Q4_K.gguf        4.37 GB   Upload Aura_v2-Q4_K.gguf                               almost 2 years ago
Garbage-Q4_K_S.gguf      5.13 GB   Upload Garbage-Q4_K_S.gguf                             almost 2 years ago
LunaIntern-Q3_K_S.gguf   8.76 GB   Upload LunaIntern-Q3_K_S.gguf                          almost 2 years ago
Melinoe-Q4_K.gguf        4.37 GB   Upload Melinoe-Q4_K.gguf                               almost 2 years ago
Minerva-Q4_K_S.gguf      5.13 GB   Upload Minerva-Q4_K_S.gguf                             almost 2 years ago
RPMix-4x7B-Q3_K.gguf     11.6 GB   Upload RPMix-4x7B-Q3_K.gguf                            almost 2 years ago
Spiced_2x7B-Q3_K.gguf    6.21 GB   Rename ggml-model-Q3_K.gguf to Spiced_2x7B-Q3_K.gguf   almost 2 years ago
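Each file in the listing can be fetched directly via Hugging Face's standard "resolve" URL pattern (`https://huggingface.co/<repo_id>/resolve/<revision>/<filename>`). A minimal sketch, assuming that pattern; the `gguf_url` helper is a hypothetical convenience, not part of the repository:

```python
# Build direct-download URLs for files in the jeiku/spare repo.
# Assumes the standard Hugging Face resolve URL layout.
REPO_ID = "jeiku/spare"
REVISION = "main"  # branch shown on the page


def gguf_url(filename: str) -> str:
    """Return the direct-download URL for a file in the repo."""
    return f"https://huggingface.co/{REPO_ID}/resolve/{REVISION}/{filename}"


print(gguf_url("Aura-Q4_K.gguf"))
# → https://huggingface.co/jeiku/spare/resolve/main/Aura-Q4_K.gguf
```

For large GGUF files like these, the `huggingface_hub` library's `hf_hub_download` is the usual alternative, since it handles caching and resumable downloads.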