ggml-org/GLM-4.6V-Flash-GGUF
Tags: GGUF · conversational
Branch: main · repository size: 17.1 GB · 2 contributors · 3 commits
Latest commit: 8166025 (verified) by ggerganov (HF Staff), 3 months ago — "Upload GLM-4.6V-Flash-Q8_0.gguf with huggingface_hub"
Files:

.gitattributes                    1.71 kB    Upload GLM-4.6V-Flash-Q8_0.gguf with huggingface_hub    3 months ago
GLM-4.6V-Flash-Q4_K_M.gguf        6.17 GB    Upload folder using huggingface_hub                     4 months ago
GLM-4.6V-Flash-Q8_0.gguf          10 GB      Upload GLM-4.6V-Flash-Q8_0.gguf with huggingface_hub    3 months ago
README.md                         274 Bytes  Upload folder using huggingface_hub                     4 months ago
mmproj-GLM-4.6V-Flash-Q8_0.gguf   980 MB     Upload folder using huggingface_hub                     4 months ago