Update README.md
README.md CHANGED
@@ -28,6 +28,42 @@ EduMixtral-4x7B is a Mixture of Experts (MoE) made with the following models usi
* [mistralai/Mathstral-7B-v0.1](https://huggingface.co/mistralai/Mathstral-7B-v0.1)
* [FPHam/Writing_Partner_Mistral_7B](https://huggingface.co/FPHam/Writing_Partner_Mistral_7B)

+## 🧩 Configuration
+
+```yaml
+base_model: mlabonne/NeuralDaredevil-7B
+gate_mode: hidden
+experts:
+  - source_model: mlabonne/NeuralDaredevil-7B
+    positive_prompts:
+      - "hello"
+      - "help"
+      - "question"
+      - "explain"
+      - "information"
+  - source_model: BioMistral/BioMistral-7B
+    positive_prompts:
+      - "medical"
+      - "health"
+      - "biomedical"
+      - "clinical"
+      - "anatomy"
+  - source_model: mistralai/Mathstral-7B-v0.1
+    positive_prompts:
+      - "math"
+      - "calculation"
+      - "equation"
+      - "geometry"
+      - "algebra"
+  - source_model: FPHam/Writing_Partner_Mistral_7B
+    positive_prompts:
+      - "writing"
+      - "creative process"
+      - "story structure"
+      - "character development"
+      - "plot"
+```
+
## 💻 Usage

It is recommended to load the model in 8-bit or 4-bit quantization.
@@ -69,40 +105,4 @@ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
>To find the total number of pages she read in two days, we add the pages she read on the first day and the second day:
>\[ 30 \text{ pages} + 48 \text{ pages} = 78 \text{ pages} \]
>Therefore, Xiaoli read a total of 78 pages in two days.
->Final answer: Xiaoli read 78 pages in total
-
-## 🧩 Configuration
-
-```yaml
-base_model: mlabonne/NeuralDaredevil-7B
-gate_mode: hidden
-experts:
-  - source_model: mlabonne/NeuralDaredevil-7B
-    positive_prompts:
-      - "hello"
-      - "help"
-      - "question"
-      - "explain"
-      - "information"
-  - source_model: BioMistral/BioMistral-7B
-    positive_prompts:
-      - "medical"
-      - "health"
-      - "biomedical"
-      - "clinical"
-      - "anatomy"
-  - source_model: mistralai/Mathstral-7B-v0.1
-    positive_prompts:
-      - "math"
-      - "calculation"
-      - "equation"
-      - "geometry"
-      - "algebra"
-  - source_model: FPHam/Writing_Partner_Mistral_7B
-    positive_prompts:
-      - "writing"
-      - "creative process"
-      - "story structure"
-      - "character development"
-      - "plot"
-```
+>Final answer: Xiaoli read 78 pages in total
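
In the `## 🧩 Configuration` section above, `gate_mode: hidden` has mergekit initialize each expert's router weights from hidden-state representations of that expert's `positive_prompts`, so a token whose hidden state resembles, say, the "math" prompts is routed to the Mathstral expert. The snippet below is a toy Python sketch of that routing idea only, not mergekit's actual implementation; the dimensions, seed, and tensors are made-up stand-ins.

```python
# Toy sketch of hidden-state routing (illustrative only, not mergekit code).
import torch

hidden_dim, num_experts = 64, 4  # stand-in sizes, not the real model's
torch.manual_seed(0)

# Stand-ins for the mean hidden state of each expert's positive prompts;
# with `gate_mode: hidden` these come from running the prompts through the model.
prompt_means = torch.randn(num_experts, hidden_dim)
gate = prompt_means / prompt_means.norm(dim=-1, keepdim=True)  # router rows

def route(h: torch.Tensor, top_k: int = 2):
    """Score a token's hidden state against each expert and keep the top-k."""
    logits = gate @ h                        # similarity to every expert
    weights = torch.softmax(logits, dim=-1)  # normalized routing weights
    top_w, top_i = weights.topk(top_k)
    return top_i.tolist(), (top_w / top_w.sum()).tolist()

# A token whose hidden state resembles expert 2's prompts gets routed there.
token_hidden = prompt_means[2] + 0.1 * torch.randn(hidden_dim)
print(route(token_hidden))
```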
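
The Usage section above recommends loading in 8-bit or 4-bit quantization. A minimal loading sketch with 🤗 Transformers and bitsandbytes follows; the repo id is a placeholder, and the prompt and generation settings are illustrative, so adjust them to the actual model card.

```python
# Minimal sketch: load the model quantized, then generate (assumes the
# `transformers`, `accelerate`, and `bitsandbytes` packages are installed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "<namespace>/EduMixtral-4x7B"  # placeholder: use the real repo id

# 4-bit NF4 quantization; for 8-bit use BitsAndBytesConfig(load_in_8bit=True).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

prompt = ("Xiaoli read 30 pages of a book on the first day and 48 pages "
          "on the second day. How many pages did she read in total?")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```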