Fully working quantization of the alpha version of the Minthy/RouWei-0.8-16ch-v0.1alpha model
WARNING! To run this model in ComfyUI, you will need a custom node. The PR may already have been merged, in which case you can simply install the standard node via the Manager. Otherwise, you will need to install the repository into your custom_nodes directory separately: https://github.com/YakovSava/ComfyUI-gguf-SDXL-16ch-loader
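If the PR has not been merged yet, the manual install is a standard custom-node clone. A minimal sketch, assuming ComfyUI lives at `~/ComfyUI` (adjust the path to your own setup):

```shell
# Assumed ComfyUI install location; change if yours differs.
COMFYUI_DIR="${COMFYUI_DIR:-$HOME/ComfyUI}"

# Clone the loader node into custom_nodes, then restart ComfyUI
# so the new GGUF SDXL 16ch loader node is registered.
git clone https://github.com/YakovSava/ComfyUI-gguf-SDXL-16ch-loader \
  "$COMFYUI_DIR/custom_nodes/ComfyUI-gguf-SDXL-16ch-loader"
```

After restarting ComfyUI, the loader node should appear in the node search and can be used to load the quantized model files from this repository.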
Available quantizations:
- 4-bit
- 5-bit
- 8-bit
Model tree for YakovSava/quantized-RouWei-0.8-16ch-v0.1alpha:
- Base model: KBlueLeaf/kohaku-xl-beta5
- Finetuned: Minthy/RouWei-0.6
- Finetuned: Minthy/RouWei-0.7
- Finetuned: Minthy/RouWei-0.8
- Finetuned: Minthy/RouWei-0.8-16ch-v0.1alpha