This is my glorious attempt to understand the Mistral 7B model. Because the people from Mistral AI have open-sourced their model code, I tried to replicate a small version of the model. Like... really small. A whopping one million parameters. Needless to say, the model is useless for anything.

The model was trained on a handful of examples from the Cosmopedia dataset, which is an open-source, high-quality textbook-style dataset in a similar style to the data used for the Phi models.
Check out my GitHub to see the code used: https://github.com/LeonardPuettmann/understanding-mistral

### How to use
Please don't. You should probably use Mistral 7B instead: [mistralai/Mistral-7B-v0.3](https://huggingface.co/mistralai/Mistral-7B-v0.3)
Or if you are (very) GPU rich, you can try to run their model yourself: https://github.com/mistralai/mistral-inference

In the folder `inference` you will find a small script that lets you chat with the 7B parameter model. All you need is a free Hugging Face API token.
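The repo's actual script isn't reproduced here, but a minimal sketch of such a script might look like the following, querying Mistral 7B through the free Hugging Face Inference API using only the Python standard library. The model ID, prompt, and the `HF_TOKEN` environment variable are illustrative assumptions, not taken from the repo:

```python
import json
import os
import urllib.request

# Assumed instruct variant of Mistral 7B on the Hugging Face Hub.
API_URL = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3"

def build_request(prompt: str, token: str) -> urllib.request.Request:
    """Build an authenticated POST request for the Inference API."""
    payload = json.dumps({"inputs": prompt, "parameters": {"max_new_tokens": 128}})
    return urllib.request.Request(
        API_URL,
        data=payload.encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",  # free HF API token
            "Content-Type": "application/json",
        },
    )

def chat(prompt: str, token: str) -> str:
    """Send one prompt and return the generated text."""
    with urllib.request.urlopen(build_request(prompt, token)) as resp:
        return json.load(resp)[0]["generated_text"]

if __name__ == "__main__":
    # Requires a (free) token in the HF_TOKEN environment variable.
    print(chat("Explain attention in one sentence.", os.environ["HF_TOKEN"]))
```

The same call could also be made with the `huggingface_hub` client library; the stdlib version is shown only to keep the sketch dependency-free.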

---
license: apache-2.0
---