cmh committed
Commit bdcf548 · verified · 1 Parent(s): c495e3e

Update README.md

Files changed (1): README.md (+10 −10)
README.md CHANGED
@@ -25,16 +25,16 @@ llama-quantize --allow-requantize --output-tensor-type bf16 --token-embedding-ty
 llama-quantize --allow-requantize --pure phi-4.bf16.gguf phi-4.bf16.q8p.gguf q8_0'
 ```
 
-| | Quant type | File Size | ~Vram*| |
-| -------- | ---------- | --------- | -------- |-------- |
-| [phi-4.q8.q4](https://huggingface.co/cmh/test/blob/main/phi-4.q8.q4.gguf) | 4 bits per weight | 9.43 GB | **12.9 GB** | |
-| [phi-4.bf16.q5](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q5.gguf) | 5 bits per weight | 11.9 GB | **14.2 GB** | recommended |
-| [phi-4.bf16.q5.im](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q5.im.gguf) | 5 bits per weight | 11.9 GB | **14.2 GB** | |
-| [phi-4.bf16.q6](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q6.gguf) | 6 bits per weight | 13.2 GB | **15.5 GB** | |
-| [phi-4.bf16.q6.im](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q6.im.gguf) | 6 bits per weight | 13.2 GB | **15.5 GB** | |
-| [phi-4.bf16.q8](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q8.gguf) | 8 bits per weight | 16.5 GB | **18.5 GB** | |
-| [phi-4.bf16.q8p](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q8p.gguf) | 8 bits per weight | 15.6 GB | **18.6 GB** | |
-| [phi-4.bf16](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.gguf) | 16 bits per weight | 29.3 GB | |
+| | Quant type | File Size | ~Vram*|
+| -------- | ---------- | --------- | -------- |
+| [phi-4.q8.q4](https://huggingface.co/cmh/test/blob/main/phi-4.q8.q4.gguf) | 4 bits per weight | 9.43 GB | **12.9 GB** |
+| [phi-4.bf16.q5](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q5.gguf) | 5 bits per weight | 11.9 GB | **14.2 GB** |
+| [phi-4.bf16.q5.im](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q5.im.gguf) | 5 bits per weight | 11.9 GB | **14.2 GB** |
+| [phi-4.bf16.q6](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q6.gguf) | 6 bits per weight | 13.2 GB | **15.5 GB** |
+| [phi-4.bf16.q6.im](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q6.im.gguf) | 6 bits per weight | 13.2 GB | **15.5 GB** |
+| [phi-4.bf16.q8](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q8.gguf) | 8 bits per weight | 16.5 GB | **18.5 GB** |
+| [phi-4.bf16.q8p](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q8p.gguf) | 8 bits per weight | 15.6 GB | **18.6 GB** |
+| [phi-4.bf16](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.gguf) | 16 bits per weight | 29.3 GB |
 
 <sub>*approximate value at **16k context, FP16 cache**.<sup>
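As context for the ~Vram footnote in the diffed table: the gap between a GGUF's file size and its VRAM use at 16k context is dominated by the FP16 KV cache. Below is a rough back-of-the-envelope sketch; the model-shape numbers (40 layers, 10 KV heads, head dimension 128) are assumptions taken from phi-4's published configuration, not from this README.

```python
# Rough FP16 KV-cache size for phi-4 at 16k context.
# layers / kv_heads / head_dim are assumed from phi-4's config, not this README.
layers, kv_heads, head_dim = 40, 10, 128
ctx = 16 * 1024        # 16k tokens
bytes_fp16 = 2         # FP16 cache: 2 bytes per stored value

# K and V each hold kv_heads * head_dim values per layer per token,
# hence the leading factor of 2.
kv_bytes = 2 * layers * kv_heads * head_dim * ctx * bytes_fp16
print(f"KV cache: {kv_bytes / 2**30:.2f} GiB")  # ~3.1 GiB
```

That ~3 GB is in the same ballpark as the table's file-size-to-VRAM deltas (e.g. 12.9 − 9.43 ≈ 3.5 GB for the q4 file), with the remainder being runtime overhead.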