GLM-4.7-Flash

#1709
by xesole3024 - opened

https://huggingface.co/zai-org/GLM-4.7-Flash

New 30B A3B MoE, please Quant this :3

ERROR:hf-to-gguf:Model Glm4MoeLiteForCausalLM is not supported
Hi, it seems this isn't supported yet, but support might be added later

Support was merged into llama.cpp last night: https://github.com/ggml-org/llama.cpp/pull/18936
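For anyone wanting to try locally in the meantime, a rough sketch of re-running the conversion with a freshly pulled llama.cpp (paths, file names, and the chosen quant type are illustrative, not the maintainers' actual pipeline):

```shell
# Pull the latest llama.cpp, which now includes Glm4MoeLiteForCausalLM support
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
pip install -r requirements.txt

# Convert the HF checkpoint to GGUF (model path and outfile are placeholders)
python convert_hf_to_gguf.py /path/to/GLM-4.7-Flash \
    --outfile glm-4.7-flash-f16.gguf --outtype f16

# Build the quantize tool and produce e.g. a Q4_K_M quant
cmake -B build && cmake --build build --target llama-quantize
./build/bin/llama-quantize glm-4.7-flash-f16.gguf \
    glm-4.7-flash-Q4_K_M.gguf Q4_K_M
```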

Could you have another look?

yeah, we are not that fast at updating =)

@nicoboss can you please requeue this once llama.cpp is updated?
