This is amazing.

#1 by maithink - opened

I hope you guys will release the 14B version. Nice work.

Thanks for your interest. Training a 14B-parameter model is relatively expensive for us. We may explore low-cost fine-tuning approaches such as LoRA to develop a 14B version, but this is not guaranteed.
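(For readers unfamiliar with why LoRA is the low-cost route: it freezes the base weights and trains only a small low-rank update per layer. Here's a minimal PyTorch sketch of the idea; the layer sizes are arbitrary and this is purely illustrative, not this repo's training code:)

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wrap a frozen linear layer with a trainable low-rank update:
    y = W x + (alpha / r) * B(A(x)), with A: d_in -> r and B: r -> d_out."""
    def __init__(self, base: nn.Linear, r: int = 16, alpha: int = 32):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False           # freeze the expensive base weights
        self.lora_a = nn.Linear(base.in_features, r, bias=False)
        self.lora_b = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)    # start as an exact no-op
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))

# Only the small A/B matrices are trained -- a tiny fraction of the
# full model's weights, which is what keeps 14B-scale tuning affordable.
layer = LoRALinear(nn.Linear(4096, 4096))
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(trainable, "trainable params vs", 4096 * 4096, "frozen")
```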

This model is great; it does what it promises.
Now that Wan 2.2 5B is out, maybe it could be trained for that?
If this could be done and quantized to INT4 (with DeepCompressor), it would be awesome.
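(For context, the core idea behind INT4 weight quantization is mapping each weight group to 16 integer levels. A minimal round-to-nearest sketch in PyTorch follows; DeepCompressor's real pipeline adds calibration and error compensation, so treat this as illustrative only:)

```python
import torch

def fake_quant_int4(w: torch.Tensor) -> torch.Tensor:
    """Simulate symmetric per-output-channel INT4 quantization.

    Each row of `w` is rounded to integers in [-8, 7], then dequantized,
    so the returned tensor shows the precision loss INT4 would introduce.
    """
    scale = w.abs().amax(dim=1, keepdim=True).clamp_min(1e-8) / 7.0
    q = torch.clamp(torch.round(w / scale), -8, 7)
    return q * scale

# Example: measure quantization error on a random weight matrix.
w = torch.randn(4096, 4096)
err = (fake_quant_int4(w) - w).abs().mean()
print(f"mean abs error: {err:.5f}")
```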
