Downloads last month: 23
Format: MLX
Quantization: 4-bit

This model is not deployed by any Inference Provider.

Model tree for mlx-community/Qwen3-32B-4bit-DWQ

Base model: Qwen/Qwen3-32B
Quantized (1): this model
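The card above gives no usage instructions, so here is a minimal sketch of loading this 4-bit MLX quantization with the `mlx-lm` package. It assumes `mlx-lm` is installed (`pip install mlx-lm`) and that you are on Apple silicon, which MLX requires; the prompt text and generation call are illustrative, not from the original card.

```python
# Sketch: load the 4-bit DWQ quantization of Qwen3-32B with mlx-lm.
# Assumes `pip install mlx-lm` on an Apple-silicon Mac; downloading the
# weights requires roughly 18 GB of disk and unified memory to match.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Qwen3-32B-4bit-DWQ")

messages = [{"role": "user", "content": "Explain 4-bit quantization briefly."}]
# Qwen3 ships a chat template; apply it so the model sees the expected format.
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

text = generate(model, tokenizer, prompt=prompt, verbose=True)
```

Because the base model is `Qwen/Qwen3-32B`, the same chat-template flow used for the full-precision model should apply here; only the weights are quantized.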