cmh committed · Commit 65246fb · verified · 1 parent: 25d0bec

Update README.md

Files changed (1): README.md (+2 −0)
README.md CHANGED
@@ -27,7 +27,9 @@ llama-quantize --allow-requantize --pure phi-4.bf16.gguf phi-4.bf16.q8_p.gguf q8
  | -------- | ---------- | --------- | -------- |
  | [phi-4.q8.q4](https://huggingface.co/cmh/test/blob/main/phi-4.q8.q4.gguf) | 4.00 bits per weight | 9.43 GB | **12.9 GB** |
  | [phi-4.bf16.q5](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q5.gguf) | 5.00 bits per weight | 11.9 GB | **14.2 GB** |
+ | [phi-4.bf16.q5.im](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q5.im.gguf) | 5.00 bits per weight | 11.9 GB | **14.2 GB** |
  | [phi-4.bf16.q6](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q6.gguf) | 6.00 bits per weight | 13.2 GB | **15.5 GB** |
+ | [phi-4.bf16.q6.im](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q6.im.gguf) | 6.00 bits per weight | 13.2 GB | **15.5 GB** |
  | [phi-4.bf16.q8](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q8.gguf) | 8.00 bits per weight | 16.5 GB | **18.5 GB** |
  | [phi-4.bf16.q8_p](https://huggingface.co/cmh/test/blob/main/phi-4.bf16.q8_p.gguf) | 8.00 bits per weight | 15.6 GB | **18.6 GB** |
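The "bits per weight" column above is the nominal rating of each quant type; the effective file-level bpw can be backed out from file size and parameter count. A minimal sketch, assuming phi-4's roughly 14.7 B parameters and decimal-GB file sizes (the gap above the nominal rating comes from tensors kept at higher precision plus GGUF metadata):

```python
def effective_bpw(file_size_gb: float, n_params: float) -> float:
    """Average bits actually stored per model weight: total file bits / parameter count."""
    return file_size_gb * 1e9 * 8 / n_params

# phi-4 has ~14.7B parameters (assumption; adjust for the exact checkpoint).
N_PARAMS = 14.7e9
for name, size_gb in [("phi-4.q8.q4", 9.43), ("phi-4.bf16.q6", 13.2), ("phi-4.bf16.q8", 16.5)]:
    print(f"{name}: {effective_bpw(size_gb, N_PARAMS):.2f} bpw")
```

For example, the 4.00-bpw-rated file works out to just over 5 effective bits per weight, which is typical of mixed-precision GGUF quants.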