Update README.md
README.md
@@ -15,6 +15,7 @@ For more information (including how to compress models yourself), check out http
 
 Feel free to request other models for compression as well, although compressing models that do not use the Flux architecture might be tricky for me.
 
+This compressed model was made from the BF16 quantization of [rockerBOO/flux.1-dev-SRPO](https://huggingface.co/rockerBOO/flux.1-dev-SRPO). Thanks to rockerBOO; without that BF16 checkpoint I would not have been able to work with the model directly. (My PC only has 48 GB of VRAM, too little to work with a 12B model in FP32 precision.)
 
 ### How to Use
 