Possibility of switching to Flux2 VAE
I am aware that switching the VAE/the whole latent space will cost a large amount of compute, but I believe that using Flux2's VAE would be very beneficial for the model overall and for its longevity.
For reference, a small team managed to adapt SDXL (NoobAI v-pred) to rectified flow and the Flux2 VAE through donations, in 6 epochs on the original NoobAI dataset (Danbooru only): https://huggingface.co/CabalResearch/NoobAI-Flux2VAE-RectifiedFlow-0.3
While the details are still sparse, it works and has come quite far in just 6 epochs.
I'd like to know if maybe there's a chance for CircleStone to consider training this model to use the Flux2 VAE in a future iteration.
Isn't the qwen image vae basically the same as flux2 vae in terms of quality?
No. (Anzhc, the VAE/YOLO guy or whatever he's called, did some benchmarking: the Qwen VAE's quality is about on the level of the Flux1 VAE, particularly in semantic clustering, i.e. learning ability... it's really bad, a trap.)
Well, it's still better than the SDXL VAE tbf, but switching to the Flux2 VAE would be a great boon for both quality and training, if it's possible.
I see, didn't know that, thanks.
https://bfl.ai/research/representation-comparison
BFL's own chart also illustrates this (no Qwen VAE included, but still).
The Qwen VAE is pretty much on par with the FLUX.1 VAE in this regard, unfortunately.
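For anyone wanting to sanity-check reconstruction-quality claims like these on their own images, a common first metric is PSNR between an image and its VAE round-trip. This is a minimal sketch using NumPy only; the actual encode/decode step is assumed to come from whatever VAE library you use (here it is faked with synthetic noise so the snippet runs standalone):

```python
import numpy as np

def psnr(original: np.ndarray, reconstruction: np.ndarray, max_val: float = 1.0) -> float:
    """Peak signal-to-noise ratio in dB; higher means a closer reconstruction."""
    mse = np.mean((original - reconstruction) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

# Stand-ins for real VAE round-trips: in practice you would encode/decode the
# same image through each VAE and compare. Noise levels are arbitrary here.
rng = np.random.default_rng(0)
image = rng.random((256, 256, 3))
recon_a = np.clip(image + rng.normal(0.0, 0.01, image.shape), 0.0, 1.0)  # hypothetical "better" VAE
recon_b = np.clip(image + rng.normal(0.0, 0.05, image.shape), 0.0, 1.0)  # hypothetical "worse" VAE

print(psnr(image, recon_a), psnr(image, recon_b))
```

Note that PSNR only measures pixel-level fidelity; the semantic-clustering/learnability differences discussed above need separate probes (e.g. linear probing on latents) and won't show up in this number alone.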