GGUF versions?
#3
by
fawogin598
- opened
Happy new year, and thank you very much for your work.
Without you, I couldn't run it on my 3060.
But even with the fp8 version, I need to swap blocks to RAM, and it's slow.
Would it be possible to have it in GGUF format?
AI tells me about convert-hf-to-gguf.py, but I don't know anything about it.
It should be possible in theory, but I don't know the details. Usually it's not enough to just convert the model: each model also needs its own tag (model type) and support in the main ComfyUI-GGUF repo. So if you really need it, you can ask there!