---
license: apache-2.0
pipeline_tag: text-to-image
library_name: gguf
tags:
- flex
- flux
- gguf
- safetensors
base_model:
- ostris/Flex.2-preview
---
# Info
Various quantizations of [ostris/Flex.2-preview](https://huggingface.co/ostris/Flex.2-preview/).
# Safetensors
| Filename | Quant Type | File Size | Description | Example Image |
| -------- | ---------- | --------- | ----------- | ------------- |
| [Flex.2-preview-fp8_e4m3fn_scaled.safetensors](https://huggingface.co/Eviation/Flex.2-preview/blob/main/Flex.2-preview-fp8_e4m3fn_scaled.safetensors) | F8_E4M3FN | 8.17GB | Scale per weight tensor | - |
| [Flex.2-preview-fp8_e5m2_scaled.safetensors](https://huggingface.co/Eviation/Flex.2-preview/blob/main/Flex.2-preview-fp8_e5m2_scaled.safetensors) | F8_E5M2 | 8.17GB | Scale per weight tensor | - |
| [Flex.2-preview-fp8_e4m3fn.safetensors](https://huggingface.co/Eviation/Flex.2-preview/blob/main/Flex.2-preview-fp8_e4m3fn.safetensors) | F8_E4M3FN | 8.16GB | - | - |
| [Flex.2-preview-fp8_e5m2.safetensors](https://huggingface.co/Eviation/Flex.2-preview/blob/main/Flex.2-preview-fp8_e5m2.safetensors) | F8_E5M2 | 8.16GB | - | - |
# Pure GGUF
- Pure conversion from the safetensors BF16 weights via an F32 GGUF intermediate
- Architecture: `flex.2` (not all tensor shapes match `flux`)
- No importance matrix (imatrix) was used for quantization
- Biases and norms: F32
- `img_in.weight`: BF16 (its tensor shape is incompatible with the quant block sizes)
- All other tensors follow the file's quant type

| Filename | Quant Type | File Size | Description / L2 Loss Step 25 | Example Image |
| -------- | ---------- | --------- | ----------------------------- | ------------- |
| [Flex.2-preview-BF16.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/Flex.2-preview-BF16.gguf) | BF16 | 16.3GB | - | - |
| [Flex.2-preview-Q8_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q8_0.gguf) | Q8_0 | 8.68GB | TBC | - |
| [Flex.2-preview-Q6_K.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q6_K.gguf) | Q6_K | 6.70GB | TBC | - |
| [Flex.2-preview-Q5_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q5_1.gguf) | Q5_1 | 6.13GB | TBC | - |
| [Flex.2-preview-Q5_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q5_0.gguf) | Q5_0 | 5.62GB | TBC | - |
| [Flex.2-preview-Q4_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q4_1.gguf) | Q4_1 | 5.11GB | TBC | - |
| [Flex.2-preview-IQ4_NL.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-IQ4_NL.gguf) | IQ4_NL | 4.60GB | TBC | - |
| [Flex.2-preview-Q4_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q4_0.gguf) | Q4_0 | 4.60GB | TBC | - |
| [Flex.2-preview-Q3_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/pure/Flex.2-preview-Q3_K_S.gguf) | Q3_K_S | 3.52GB | TBC | - |
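The per-tensor policy above can be sketched as a small selector. This is a hypothetical helper, not the actual conversion script; tensor names follow the FLUX/Flex naming convention, and `file_type` stands for the file's overall quant type:

```python
def tensor_quant_type(name: str, shape: tuple, file_type: str) -> str:
    """Illustrative per-tensor type selection for the pure GGUF quants."""
    if name.endswith(".bias") or "norm" in name:
        return "F32"        # biases and norms stay at F32
    if name == "img_in.weight":
        return "BF16"       # shape incompatible with quant block sizes
    return file_type        # everything else follows the file type

# e.g. for a Q4_0 file:
print(tensor_quant_type("img_in.weight", (3072, 132), "Q4_0"))
print(tensor_quant_type("double_blocks.0.img_mlp.0.weight", (12288, 3072), "Q4_0"))
```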
# Fluxified GGUF
- Conversion from the safetensors BF16 weights via an F32 GGUF intermediate
- `img_in.weight` truncated to the first 16 latent channels
- Loses the ability to inpaint and to process a control image
- Should be a drop-in replacement for FLUX
- Architecture: `flux`
- Dynamic quantization (to be confirmed)

| Filename | Quant Type | File Size | Description / L2 Loss Step 25 | Example Image |
| -------- | ---------- | --------- | ----------------------------- | ------------- |
| [Flex.2-preview-fluxified-Q8_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q8_0.gguf) | Q8_0 | 8.39GB | TBC | - |
| [Flex.2-preview-fluxified-Q6_K.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q6_K.gguf) | Q6_K | 6.74GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q5_1.gguf) | Q5_1 | 6.19GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q5_0.gguf) | Q5_0 | 5.70GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q5_K_S.gguf) | Q5_K_S | 5.67GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q4_1.gguf) | Q4_1 | 5.22GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q4_0.gguf) | Q4_0 | 4.72GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q4_K_S.gguf) | Q4_K_S | 4.58GB | TBC | - |
| [Flex.2-preview-fluxified-Q3_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified/Flex.2-preview-fluxified-Q3_K_S.gguf) | Q3_K_S | 3.52GB | TBC | - |
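The truncation step can be sketched as a column slice of `img_in.weight`. The exact Flex.2 input width is an assumption here (illustrated as 132 input features, i.e. latent plus control and mask channels packed into 2×2 patches); FLUX's `img_in` expects 16 latent channels × 2×2 patch = 64 input features:

```python
import numpy as np

PATCH_FEATURES = 16 * 2 * 2   # FLUX img_in: 16 latent channels x 2x2 patch

def fluxify_img_in(weight: np.ndarray) -> np.ndarray:
    """Keep only the input columns for the first 16 latent channels,
    dropping the control-image and inpaint-mask columns (sketch only)."""
    return weight[:, :PATCH_FEATURES]

# Assumed Flex.2 shape (3072, 132) -> FLUX-compatible (3072, 64)
w = np.zeros((3072, 132), dtype=np.float32)
print(fluxify_img_in(w).shape)
```

Because only input columns are removed, every other tensor keeps its FLUX-compatible shape, which is why the result can load under the `flux` architecture.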
# Fluxified GGUF Imatrix
- Fluxified GGUF quantized with an importance matrix (imatrix)

| Filename | Quant Type | File Size | Description / L2 Loss Step 25 | Example Image |
| -------- | ---------- | --------- | ----------------------------- | ------------- |
| [Flex.2-preview-fluxified-Q8_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q8_0.gguf) | Q8_0 | 8.39GB | TBC | - |
| [Flex.2-preview-fluxified-Q6_K.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q6_K.gguf) | Q6_K | 6.74GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q5_1.gguf) | Q5_1 | 6.19GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q5_0.gguf) | Q5_0 | 5.70GB | TBC | - |
| [Flex.2-preview-fluxified-Q5_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q5_K_S.gguf) | Q5_K_S | 5.67GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_1.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q4_1.gguf) | Q4_1 | 5.22GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_0.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q4_0.gguf) | Q4_0 | 4.72GB | TBC | - |
| [Flex.2-preview-fluxified-IQ4_NL.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-IQ4_NL.gguf) | IQ4_NL | 4.58GB | TBC | - |
| [Flex.2-preview-fluxified-Q4_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q4_K_S.gguf) | Q4_K_S | 4.58GB | TBC | - |
| [Flex.2-preview-fluxified-IQ4_XS.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-IQ4_XS.gguf) | IQ4_XS | 4.37GB | TBC | - |
| [Flex.2-preview-fluxified-Q3_K_S.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q3_K_S.gguf) | Q3_K_S | 3.52GB | TBC | - |
| [Flex.2-preview-fluxified-Q2_K.gguf](https://huggingface.co/Eviation/Flex.2-preview/blob/main/fluxified-imat/Flex.2-preview-fluxified-Q2_K.gguf) | Q2_K | 2.82GB | TBC | - |
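An importance matrix biases the quantizer toward preserving weights whose columns see large activations. A minimal sketch of the idea (not llama.cpp's actual implementation) is an importance-weighted quantization error that the scale search minimizes:

```python
import numpy as np

def weighted_quant_error(w: np.ndarray, q: np.ndarray,
                         importance: np.ndarray) -> float:
    """Importance-weighted squared error between original and quantized
    weights; imatrix quantization picks scales minimizing this (sketch)."""
    return float(np.sum(importance * (w - q) ** 2))

w = np.array([1.0, -2.0, 0.5])
q = np.array([1.0, -1.9, 0.4])
imp = np.array([10.0, 1.0, 1.0])   # hypothetical per-column importance
print(weighted_quant_error(w, q, imp))
```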