Update README.md
---
license: apache-2.0
---

This is a hacked-together version of the new Mistral-7B-v0.2 FP16 weights, downloaded directly from Mistral's CDN.

The conversion was done by converting the monolithic pickle file directly to safetensors and building the safetensors index, which is suboptimal.

Credit to Mistral AI and the amazing team over there, and to Cognitive Computations, especially Eric Hartford, for tutelage and help navigating the LLM landscape.

~~As this is a mix of Mistral 7b v0.1 and Mistral 7b v0.2 files, it is to be considered a pre-alpha.~~

This conversion is suboptimal; I would use https://huggingface.co/alpindale/Mistral-7B-v0.2-hf for the FP16 weights until Mistral AI does their official release.