Hugging Face
Mungert / GLM-4.7-Flash-GGUF
2 likes
Tags: Text Generation · Transformers · GGUF · English · Chinese · conversational · arxiv:2508.06471 · License: mit
Files and versions
GLM-4.7-Flash-GGUF (343 GB)
1 contributor · History: 15 commits
Latest commit 1c99fbd (verified, 3 months ago) by Mungert: "Upload GLM-4.7-Flash-q8_0.gguf with huggingface_hub"
All entries below were uploaded with huggingface_hub and last updated 3 months ago; the GGUF files are stored with xet.

f16/                            (folder; latest upload: f16/GLM-4.7-Flash-f16-00002-of-00002.gguf)
.gitattributes                  2.42 kB
GLM-4.7-Flash-bf16_q8_0.gguf    38.1 GB
GLM-4.7-Flash-f16_q8_0.gguf     38.1 GB
GLM-4.7-Flash-q2_k_m.gguf       11.9 GB
GLM-4.7-Flash-q3_k_m.gguf       15.5 GB
GLM-4.7-Flash-q4_k_l.gguf       18.6 GB
GLM-4.7-Flash-q4_k_m.gguf       18.5 GB
GLM-4.7-Flash-q4_k_s.gguf       17.8 GB
GLM-4.7-Flash-q5_k_l.gguf       22 GB
GLM-4.7-Flash-q5_k_m.gguf       21.8 GB
GLM-4.7-Flash-q6_k_l.gguf       24.8 GB
GLM-4.7-Flash-q6_k_m.gguf       24.6 GB
GLM-4.7-Flash-q8_0.gguf         31.8 GB
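The quantizations above trade file size for output fidelity, so a common task is picking the largest variant that fits available memory. As a minimal sketch (the sizes are copied from the listing; the `pick_quant` helper and its selection heuristic are illustrative, not part of this repository), one might choose and then fetch a file like this:

```python
# Sizes in GB of the single-file GGUF quantizations listed above.
QUANT_SIZES_GB = {
    "q2_k_m": 11.9, "q3_k_m": 15.5, "q4_k_s": 17.8, "q4_k_m": 18.5,
    "q4_k_l": 18.6, "q5_k_m": 21.8, "q5_k_l": 22.0, "q6_k_m": 24.6,
    "q6_k_l": 24.8, "q8_0": 31.8, "bf16_q8_0": 38.1, "f16_q8_0": 38.1,
}

def pick_quant(budget_gb: float):
    """Return the name of the largest quant whose file fits within
    budget_gb, or None if even the smallest does not fit.
    (Hypothetical helper; file size is only a rough proxy for the
    memory needed at inference time.)"""
    fitting = [(size, name) for name, size in QUANT_SIZES_GB.items()
               if size <= budget_gb]
    return max(fitting)[1] if fitting else None

# A chosen file could then be downloaded with the real
# huggingface_hub API, e.g.:
#   from huggingface_hub import hf_hub_download
#   path = hf_hub_download("Mungert/GLM-4.7-Flash-GGUF",
#                          f"GLM-4.7-Flash-{pick_quant(22.0)}.gguf")
```

Note that the budget should leave headroom beyond the file size itself, since the KV cache and runtime buffers also consume memory.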