Update README.md
README.md CHANGED

@@ -50,7 +50,8 @@ widget:
 ### **review**
 - use tag/word(s) as input for more accurate results with these legacy models; not very convenient (compared to the recent models) at the very beginning
 - credits should be given to the contributors from the civitai platform
-- fp8 scaled
+- fast-illustrious gguf was quantized from the fp8 scaled safetensors, while illustrious gguf was quantized from the original bf16
+- the fp8 scaled files work fine with this model, including vae and clips
 - good to run on old machines, i.e., 9xx series or before (legacy mode [--disable-cuda-malloc --lowvram] supported); compatible with the new gguf-node

 ### **reference**
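The legacy-mode flags mentioned in the review are passed to ComfyUI at launch time. A minimal sketch, assuming a standard ComfyUI checkout with `main.py` at the repository root (the flag names come from the README; everything else here is illustrative):

```shell
# Launch ComfyUI in legacy mode for older GPUs (e.g. the 9xx series or before).
# --disable-cuda-malloc : skip the CUDA async malloc allocator, which older
#                         cards/drivers may not support
# --lowvram             : offload model weights aggressively to system RAM
#                         to fit in limited VRAM
python main.py --disable-cuda-malloc --lowvram
```

With the gguf-node pack installed, the quantized gguf checkpoint can then be selected from the loader node in the usual way.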