Flux 2 GGUF merged!
#1
by
Rafaelldestilo
- opened
The LoRA is fantastic: at 8 steps it performs better than the base model at 20. But applying the LoRA at generation time makes inference very slow. The right approach would be models with the LoRA merged into the weights. Could you create a Flux 2 GGUF with the LoRA merged, at Q3 or Q4? That would give a fast model without the runtime burden of the LoRA, which greatly increases generation time; the 8-step model with the LoRA attached takes a long time. I'm sure Flux 2 Turbo merged models would be very beneficial for the community.
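For context on why merging removes the runtime cost: a LoRA adds a low-rank update `(alpha / rank) * B @ A` alongside each adapted weight, so an unmerged setup pays extra matmuls on every step. Folding that update into the base weight once, offline, gives identical outputs with no per-step overhead. Below is a minimal NumPy sketch of that math on a toy matrix; `merge_lora` is a hypothetical name for illustration, not the actual tooling used to produce these GGUFs, and real Flux layers are of course far larger and merged before quantization.

```python
import numpy as np

def merge_lora(W, A, B, alpha, rank):
    """Fold a LoRA update into the base weight: W' = W + (alpha / rank) * B @ A.

    After merging, a forward pass needs only W', so the extra LoRA
    matmuls disappear from generation time. (Illustrative sketch.)
    """
    return W + (alpha / rank) * (B @ A)

# Toy dimensions; real Flux 2 layers are far larger.
rng = np.random.default_rng(0)
d_out, d_in, rank, alpha = 8, 6, 2, 4.0
W = rng.standard_normal((d_out, d_in))
A = rng.standard_normal((rank, d_in))   # LoRA "down" projection
B = rng.standard_normal((d_out, rank))  # LoRA "up" projection
x = rng.standard_normal(d_in)

merged = merge_lora(W, A, B, alpha, rank)

# One matmul with the merged weight equals base path + LoRA path.
two_pass = W @ x + (alpha / rank) * (B @ (A @ x))
one_pass = merged @ x
assert np.allclose(one_pass, two_pass)
```

The merged weight is then quantized (e.g. to Q3/Q4 GGUF) as a single tensor, which is why a merged model runs at full speed while a quantized base plus a separate LoRA does not.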