Requirements For Use:

PyTorch 2.10+cu130 is recommended, and is required for mxfp8.
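A quick way to sanity-check the installed build before relying on mxfp8 is to compare the PyTorch version tuple. This is a minimal sketch; the helper name and comparison are assumptions, not part of any package:

```python
def torch_meets_requirement(version: str, minimum=(2, 10)) -> bool:
    """Return True if a PyTorch version string satisfies the minimum.

    Strips any local build tag (e.g. "+cu130") before comparing
    the (major, minor) release numbers.
    """
    release = version.split("+")[0]
    major, minor = (int(p) for p in release.split(".")[:2])
    return (major, minor) >= minimum

# To check your installed build:
# import torch
# assert torch_meets_requirement(torch.__version__), "PyTorch 2.10+ needed for mxfp8"
```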

ComfyUI-QuantOps

Custom fork of comfy-kitchen

Either clone my fork, check out the branch "fix/int8-tensor-backend", and then, with CUDA Toolkit 13.0 installed, build and install it.

Alternatively, install a prebuilt wheel from my repo, choosing the wheel that matches your Python version and OS.
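The two install paths above can be sketched as shell commands. The fork URL and wheel filename below are placeholders, not the actual artifacts; substitute the real values from the repo page:

```shell
# Option 1: build from source (assumes CUDA Toolkit 13.0 is already installed).
# <fork-url> is a placeholder -- use the actual fork URL.
git clone <fork-url> comfy-kitchen
cd comfy-kitchen
git checkout fix/int8-tensor-backend
pip install .   # builds and installs the package

# Option 2: install a prebuilt wheel downloaded from the repo,
# picking the file matching your Python version and OS.
# Example placeholder filename:
pip install ./comfy_kitchen-<version>-cp312-cp312-linux_x86_64.whl
```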


Model tree for silveroxides/LTX-2.3-Quants

Base model: Lightricks/LTX-2.3
Quantized: this model (one of 3 quantized versions)