Update README.md
README.md (changed)

@@ -17,8 +17,6 @@ language:

  EXL2 quants of [Mistral-Large-Instruct-2407](https://huggingface.co/mistralai/Mistral-Large-Instruct-2407/edit/main/README.md)

- _**This model requires the dev branch of ExLlamaV2 for now. New release coming soon with the necessary changes.**_
-
  [3.00 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/3.0bpw)
  [3.50 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/3.5bpw)
  [4.00 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/4.0bpw)