Ming-flash-omni-2.0-i1-GGUF?
#1860
by
Rebis - opened
Hi,
Is it possible to make an i1-GGUF version of Ming-flash-omni-2.0?
https://huggingface.co/inclusionAI/Ming-flash-omni-2.0
Thank you in advance.
If it's supported, it will work; it's queued =)
You can check the progress at http://hf.tst.eu/status.html, or regularly check the model
summary page at https://hf.tst.eu/model#Ming-flash-omni-2.0-GGUF for quants to appear.