PrunaAI/codegemma-7b-GGUF-smashed

Pruna AI · GGUF

Files and versions
3 contributors · History: 14 commits

Latest commit by johnrachwanpruna: "Upload codegemma-7b.Q4_K_S.gguf with huggingface_hub" (3a86b37, verified, almost 2 years ago)
| File | Size | Last commit message | Last updated |
| --- | --- | --- | --- |
| .gitattributes | 2.18 kB | Upload codegemma-7b.Q4_K_S.gguf with huggingface_hub | almost 2 years ago |
| README.md | 12 kB | Upload README.md with huggingface_hub | almost 2 years ago |
| codegemma-7b.IQ3_M.gguf | 4.11 GB | Upload codegemma-7b.IQ3_M.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.IQ3_S.gguf | 3.98 GB | Upload codegemma-7b.IQ3_S.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.IQ3_XS.gguf | 3.8 GB | Upload codegemma-7b.IQ3_XS.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.Q2_K.gguf | 3.48 GB | Upload codegemma-7b.Q2_K.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.Q3_K_M.gguf | 4.37 GB | Upload codegemma-7b.Q3_K_M.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.Q3_K_S.gguf | 3.98 GB | Upload codegemma-7b.Q3_K_S.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.Q4_K_M.gguf | 5.33 GB | Upload codegemma-7b.Q4_K_M.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.Q4_K_S.gguf | 5.05 GB | Upload codegemma-7b.Q4_K_S.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.Q5_0.gguf | 5.98 GB | Upload codegemma-7b.Q5_0.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.Q5_K_S.gguf | 5.98 GB | Upload codegemma-7b.Q5_K_S.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.Q8_0.gguf | 9.08 GB | Upload codegemma-7b.Q8_0.gguf with huggingface_hub | almost 2 years ago |
| codegemma-7b.fp16.bin | 17.1 GB | Upload codegemma-7b.fp16.bin with huggingface_hub (pickle scan: no problematic imports detected) | almost 2 years ago |