Fully working quantization of the alpha version of the Minthy/RouWei-0.8-16ch-v0.1alpha model.

WARNING: running this model in ComfyUI requires a custom loader node. The PR may already have been accepted, in which case you can simply use the standard node from the manager. Otherwise, you will need to install the repository separately into `custom_nodes`: https://github.com/YakovSava/ComfyUI-gguf-SDXL-16ch-loader
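The manual installation step above can be sketched as follows. This is a minimal sketch assuming a standard ComfyUI layout; the `ComfyUI` directory path is an assumption and should be adjusted to your actual install location.

```shell
# Assumed path — replace with the location of your ComfyUI installation.
cd ComfyUI/custom_nodes

# Clone the custom GGUF loader node for 16-channel SDXL models.
git clone https://github.com/YakovSava/ComfyUI-gguf-SDXL-16ch-loader

# Restart ComfyUI afterwards so the new node is registered.
```

After restarting, the loader node should appear in the ComfyUI node search.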

Format: GGUF
Model size: 3B params
Architecture: sdxl

Available quantizations: 4-bit, 5-bit, 8-bit

Model repository: YakovSava/quantized-RouWei-0.8-16ch-v0.1alpha