Update README.md

README.md
EXL2 quants of [Mistral-Large-Instruct-2407](https://huggingface.co/mistralai/Mistral-Large-Instruct-2407)

[2.30 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/2.3bpw)

[2.50 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/2.5bpw)

[2.75 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/2.75bpw)

[3.00 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/3.0bpw)

[3.50 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/3.5bpw)

[4.00 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/4.0bpw)

[4.50 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/4.5bpw)

[4.75 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/4.75bpw)

[5.00 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/5.0bpw)

[6.00 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/6.0bpw)

[measurement.json](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/blob/main/measurement.json)
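When picking a branch, a rough rule of thumb is that the weights alone occupy about `params × bpw / 8` bytes, before KV cache and activation overhead. The sketch below applies that estimate to each quant listed above, assuming the 123B parameter count implied by the repo name; the exact on-disk sizes will differ slightly because of quantization metadata and unquantized layers.

```python
# Rough weight-only size estimate for each EXL2 quant of a ~123B-parameter
# model. Assumes the parameter count from the repo name; ignores KV cache,
# activations, and quantization metadata, so treat the numbers as lower bounds.
PARAMS = 123e9

def weight_gb(bpw: float) -> float:
    """Approximate size of the quantized weights in gigabytes."""
    return PARAMS * bpw / 8 / 1e9

for bpw in (2.3, 2.5, 2.75, 3.0, 3.5, 4.0, 4.5, 4.75, 5.0, 6.0):
    print(f"{bpw:>4} bpw ~ {weight_gb(bpw):5.1f} GB")
```

For example, the 4.00 bpw branch works out to roughly 61.5 GB of weights, which is why the lower-bpw branches exist for smaller multi-GPU or single-GPU setups.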