drbaph committed on
Commit e7539ad · verified · 1 Parent(s): 624041d

Update README.md

Files changed (1):
  1. README.md +3 -1

README.md CHANGED
@@ -14,7 +14,7 @@ tags:
 
 # 🔢 FP8 Quantized Version - ComfyUI Compatible
 
-This is the **fp8_e4m3fn** and **fp8_e5m2** quantized version of the Z-Image model, optimized for ComfyUI workflows. These quantized formats significantly reduce VRAM requirements while maintaining high image quality, making the model more accessible for consumer-grade GPUs.
+This is the **fp8-e4m3fn-scaled** / **fp8-e4m3fn** and **fp8-e5m2-scaled** / **fp8_e5m2** quantized versions of the Z-Image model, optimized for ComfyUI workflows. These quantized formats significantly reduce VRAM requirements while maintaining high image quality, making the model more accessible for consumer-grade GPUs.
 
 **Quantization Formats:**
 - `fp8-e4m3fn-scaled`
@@ -22,6 +22,8 @@ This is the **fp8_e4m3fn** and **fp8_e5m2** quantized version of the Z-Image mod
 - `fp8_e5m2-scaled`
 - `fp8_e5m2`
 
+[Comfyui Workflow](https://huggingface.co/drbaph/Z-Image-fp8/resolve/main/z-img_fp8-workflow.json?download=true)
+
 ## 📸 Example Outputs
 
 <div align="center">
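The README's claim that fp8 weights significantly reduce VRAM can be sketched with simple arithmetic: fp8 formats (e4m3fn, e5m2) store one byte per weight versus two for fp16, so weight storage roughly halves. A minimal sketch follows; the parameter count is a hypothetical example, not the actual size of the Z-Image model.

```python
# Rough estimate of weight storage at different precisions.
# NOTE: the 6B parameter count below is a hypothetical example,
# not the real Z-Image model size.

def weight_gib(num_params: int, bits_per_weight: int) -> float:
    """Return weight storage in GiB for the given precision."""
    return num_params * bits_per_weight / 8 / 2**30

params = 6_000_000_000          # hypothetical 6B-parameter model
fp16_gib = weight_gib(params, 16)
fp8_gib = weight_gib(params, 8)  # e4m3fn and e5m2 are both 8-bit

print(f"fp16: {fp16_gib:.2f} GiB, fp8: {fp8_gib:.2f} GiB")
```

This only covers weight storage; activations, the scale tensors used by the `-scaled` variants, and framework overhead add to the real VRAM footprint.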