# AaryanK/GLM-4.6V-Flash-GGUF

Tags: Image-Text-to-Text · GGUF · Chinese · English · zai · glm-4 · vlm · multimodal · Mixture of Experts · conversational
License: MIT
## Files and versions

Branch: main · Total size: 87.8 GB
1 contributor · History: 5 commits
Latest commit: AaryanK — "Delete GLM-4.6V-Flash.fp16.gguf" (e646806, verified, 2 days ago)
| File | Size | Last commit message | Last updated |
| --- | --- | --- | --- |
| .gitattributes | 2.45 kB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q2_k.gguf | 4.01 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q3_k_l.gguf | 5.2 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q3_k_m.gguf | 4.97 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q3_k_s.gguf | 4.59 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q4_0.gguf | 5.46 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q4_1.gguf | 6.01 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q4_k_m.gguf | 6.17 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q4_k_s.gguf | 5.76 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q5_0.gguf | 6.56 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q5_1.gguf | 7.11 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q5_k_m.gguf | 7.05 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q5_k_s.gguf | 6.7 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q6_k.gguf | 8.27 GB | Upload folder using huggingface_hub | 5 months ago |
| GLM-4.6V-Flash.q8_0.gguf | 10 GB | Upload folder using huggingface_hub | 5 months ago |
| README.md | 1.35 kB | Update README.md | 5 months ago |