---
license: apache-2.0
pipeline_tag: text-to-image
library_name: gguf
tags:
- flex
- flux
- gguf
- safetensors
base_model:
- ostris/Flex.2-preview
---

# Info

Various quantizations of [ostris/Flex.2-preview](https://huggingface.co/ostris/Flex.2-preview/).

# Safetensors

| Filename | Quant Type | File Size | Description | Example Image |
| -------- | ---------- | --------- | ----------- | ------------- |
| [Flex.2-preview-fp8_e4m3fn_scaled.safetensors](https://huggingface.co/Eviation/Flex.2-preview/blob/main/Flex.2-preview-fp8_e4m3fn_scaled.safetensors) | F8_E4M3FN | 8.17GB | Scale per weight tensor | - |
| [Flex.2-preview-fp8_e5m2_scaled.safetensors](https://huggingface.co/Eviation/Flex.2-preview/blob/main/Flex.2-preview-fp8_e5m2_scaled.safetensors) | F8_E5M2 | 8.17GB | Scale per weight tensor | - |
| [Flex.2-preview-fp8_e4m3fn.safetensors](https://huggingface.co/Eviation/Flex.2-preview/blob/main/Flex.2-preview-fp8_e4m3fn.safetensors) | F8_E4M3FN | 8.16GB | - | - |
| [Flex.2-preview-fp8_e5m2.safetensors](https://huggingface.co/Eviation/Flex.2-preview/blob/main/Flex.2-preview-fp8_e5m2.safetensors) | F8_E5M2 | 8.16GB | - | - |
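
The "scale per weight tensor" variants store one scale factor per weight tensor so that values fit the FP8 range. A minimal numpy sketch of the idea (448 is the largest finite `e4m3fn` value; function and variable names here are illustrative, not taken from the actual conversion script):

```python
import numpy as np

FP8_E4M3FN_MAX = 448.0  # largest finite value representable in float8 e4m3fn

def scale_per_tensor(weight: np.ndarray):
    """Compute a single scale so the whole tensor fits the FP8 e4m3fn range."""
    amax = np.abs(weight).max()
    scale = amax / FP8_E4M3FN_MAX if amax > 0 else 1.0
    # Values are divided by the scale before casting to FP8; the scale is
    # stored alongside the tensor and multiplied back in at load time.
    scaled = np.clip(weight / scale, -FP8_E4M3FN_MAX, FP8_E4M3FN_MAX)
    return scaled, scale

w = np.array([[900.0, -0.5], [0.25, 448.0]], dtype=np.float32)
scaled, scale = scale_per_tensor(w)
```

The unscaled fp8 files simply cast each weight, so any value outside the fp8 range saturates; per-tensor scaling avoids that at the cost of storing one extra scalar per tensor.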
# Pure GGUF

- pure conversion from the BF16 safetensors via an F32 GGUF intermediate
- architecture: `flex.2` (not all tensor shapes match FLUX)
- no importance matrix (imatrix) was used for quantization
- biases and norms: F32
- `img_in.weight`: BF16 (its tensor shape is incompatible with the quantization block sizes)
- everything else quantized according to the file type
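
The per-tensor rules above can be sketched as a small selection function (a sketch of the stated rules; names are illustrative, this is not the actual conversion code):

```python
def pick_quant_type(tensor_name: str, file_type: str) -> str:
    """Pick the storage type for one tensor, per the rules above."""
    # Biases and normalization weights stay in full precision.
    if tensor_name.endswith(".bias") or "norm" in tensor_name:
        return "F32"
    # img_in.weight keeps BF16: its shape is incompatible with the
    # quantization block sizes.
    if tensor_name == "img_in.weight":
        return "BF16"
    # Everything else follows the file's quant type.
    return file_type

print(pick_quant_type("img_in.weight", "Q4_0"))                        # BF16
print(pick_quant_type("img_in.bias", "Q4_0"))                          # F32
print(pick_quant_type("double_blocks.0.img_attn.qkv.weight", "Q4_0"))  # Q4_0
```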

| Filename | Quant Type | File Size | Description / L2 Loss Step 25 | Example Image |
| -------- | ---------- | --------- | ----------------------------- | ------------- |
| [Flex.2-preview-BF16.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/Flex.2-preview-BF16.gguf) | BF16 | 16.3GB | - | - |
| [Flex.2-preview-Q8_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q8_0.gguf) | Q8_0 | 8.68GB | TBC | - |
| [Flex.2-preview-Q6_K.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q6_K.gguf) | Q6_K | 6.70GB | TBC | - |
| [Flex.2-preview-Q5_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q5_1.gguf) | Q5_1 | 6.13GB | TBC | - |
| [Flex.2-preview-Q5_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q5_0.gguf) | Q5_0 | 5.62GB | TBC | - |
| [Flex.2-preview-Q4_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q4_1.gguf) | Q4_1 | 5.11GB | TBC | - |
| [Flex.2-preview-IQ4_NL.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-IQ4_NL.gguf) | IQ4_NL | 4.60GB | TBC | - |
| [Flex.2-preview-Q4_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q4_0.gguf) | Q4_0 | 4.60GB | TBC | - |
| [Flex.2-preview-Q3_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q3_K_S.gguf) | Q3_K_S | 3.52GB | TBC | - |

# Fluxified GGUF

- conversion from the BF16 safetensors via an F32 GGUF intermediate
- `img_in.weight` truncated to the first 16 latent channels
- loses the ability to inpaint and to process control images
- should be a drop-in replacement for FLUX
- architecture: `flux`
- dynamic quantization?
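
The truncation amounts to slicing the input dimension of `img_in.weight`. Assuming FLUX's 2×2 patchify, the 16 latent image channels correspond to the first 16 × 4 = 64 input features; the shapes below are illustrative, and the dropped columns stand in for Flex.2's extra inpaint/control conditioning inputs:

```python
import numpy as np

PATCH = 2 * 2          # 2x2 patchify: each latent channel yields 4 input features
LATENT_CHANNELS = 16   # latent image channels kept after truncation

# Illustrative img_in.weight of shape [hidden_size, in_features]; the columns
# beyond the first 64 carry the conditioning inputs that get dropped.
hidden_size, extra = 32, 132
w_flex = np.random.randn(hidden_size, LATENT_CHANNELS * PATCH + extra).astype(np.float32)

# Keep only the columns corresponding to the 16 latent image channels.
w_flux = w_flex[:, : LATENT_CHANNELS * PATCH]
print(w_flux.shape)  # (32, 64)
```

After this slice the tensor has the same shape as FLUX's `img_in.weight`, which is why the result loads under the `flux` architecture but can no longer consume inpaint or control inputs.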

| Filename | Quant Type | File Size | Description / L2 Loss Step 25 | Example Image |
| -------- | ---------- | --------- | ----------------------------- | ------------- |
| [Flex.2-preview-fluxified-Q8_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q8_0.gguf) | Q8_0 | 8.39GB | TBC | - |
| [Flex.2-preview-fluxified-Q6_K.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q6_K.gguf) | Q6_K | 6.74GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q5_1.gguf) | Q5_1 | 6.19GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q5_0.gguf) | Q5_0 | 5.70GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q5_K_S.gguf) | Q5_K_S | 5.67GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q4_1.gguf) | Q4_1 | 5.22GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q4_0.gguf) | Q4_0 | 4.72GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q4_K_S.gguf) | Q4_K_S | 4.58GB | TBC | - |
| [Flex.2-preview-fluxified-Q3_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q3_K_S.gguf) | Q3_K_S | 3.52GB | TBC | - |

# Fluxified GGUF Imatrix

- Fluxified GGUF quantized with an importance matrix (imatrix)
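
An importance matrix records per-weight activation statistics from calibration data, and quantization then minimizes the activation-weighted error instead of the plain rounding error. A toy numpy sketch of that idea (a deliberately simplified scale search, not llama.cpp's actual imatrix algorithm; all names are illustrative):

```python
import numpy as np

def quantize_weighted(w, importance, nmax=7):
    """Toy symmetric quantizer: pick the scale minimizing importance-weighted error."""
    base = np.abs(w).max() / nmax        # naive max-based scale
    best_scale, best_err = base, np.inf
    for f in np.linspace(0.8, 1.2, 41):  # search around the naive scale
        scale = base * f
        q = np.clip(np.round(w / scale), -nmax, nmax)
        err = np.sum(importance * (w - q * scale) ** 2)
        if err < best_err:
            best_scale, best_err = scale, err
    q = np.clip(np.round(w / best_scale), -nmax, nmax)
    return q, best_scale

rng = np.random.default_rng(0)
w = rng.normal(size=256)
importance = rng.uniform(size=256)       # stand-in for calibration activation stats
q, s = quantize_weighted(w, importance)

# Baseline: the naive max-based scale, with no weighted search.
base = np.abs(w).max() / 7
q0 = np.clip(np.round(w / base), -7, 7)
weighted_err = np.sum(importance * (w - q * s) ** 2)
baseline_err = np.sum(importance * (w - q0 * base) ** 2)
```

Since the naive scale is one of the search candidates, the weighted search can only match or reduce the importance-weighted error; this is why imatrix quants tend to preserve quality better at low bit widths.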

| Filename | Quant Type | File Size | Description / L2 Loss Step 25 | Example Image |
| -------- | ---------- | --------- | ----------------------------- | ------------- |
| [Flex.2-preview-fluxified-Q8_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q8_0.gguf) | Q8_0 | 8.39GB | TBC | - |
| [Flex.2-preview-fluxified-Q6_K.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q6_K.gguf) | Q6_K | 6.74GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q5_1.gguf) | Q5_1 | 6.19GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q5_0.gguf) | Q5_0 | 5.70GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q5_K_S.gguf) | Q5_K_S | 5.67GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q4_1.gguf) | Q4_1 | 5.22GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q4_0.gguf) | Q4_0 | 4.72GB | TBC | - |
| [Flex.2-preview-fluxified-IQ4_NL.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-IQ4_NL.gguf) | IQ4_NL | 4.58GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q4_K_S.gguf) | Q4_K_S | 4.58GB | TBC | - |
| [Flex.2-preview-fluxified-IQ4_XS.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-IQ4_XS.gguf) | IQ4_XS | 4.37GB | TBC | - |
| [Flex.2-preview-fluxified-Q3_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q3_K_S.gguf) | Q3_K_S | 3.52GB | TBC | - |
| [Flex.2-preview-fluxified-Q2_K.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q2_K.gguf) | Q2_K | 2.82GB | TBC | - |