## Results

#### Dense

🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-0.5B" target="_blank">Apollo2-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-1.5B" target="_blank">Apollo2-1.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-2B" target="_blank">Apollo2-2B</a>

🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-3.8B" target="_blank">Apollo2-3.8B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-7B" target="_blank">Apollo2-7B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-9B" target="_blank">Apollo2-9B</a>

</details>

#### Post-MoE

🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MoE-0.5B" target="_blank">Apollo-MoE-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MoE-1.5B" target="_blank">Apollo-MoE-1.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MoE-7B" target="_blank">Apollo-MoE-7B</a>

<details>

## Usage Format

##### Apollo2
- 0.5B, 1.5B, 7B: `User:{query}\nAssistant:{response}<|endoftext|>`
- 2B, 9B: `User:{query}\nAssistant:{response}<eos>`
- 3.8B: `<|user|>\n{query}<|end|><|assistant|>\n{response}<|end|>`

##### Apollo-MoE
- 0.5B, 1.5B, 7B: `User:{query}\nAssistant:{response}<|endoftext|>`
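As a minimal sketch, the templates above can be filled in with a small helper. The `TEMPLATES` mapping and the `format_prompt` function are illustrative assumptions for this README, not part of the Apollo codebase:

```python
# Illustrative sketch of the per-model prompt templates listed above.
# The dictionary keys and the helper name are assumptions, not Apollo APIs.
TEMPLATES = {
    # Apollo2 0.5B/1.5B/7B and Apollo-MoE 0.5B/1.5B/7B
    "0.5B/1.5B/7B": "User:{query}\nAssistant:{response}<|endoftext|>",
    # Apollo2 2B/9B
    "2B/9B": "User:{query}\nAssistant:{response}<eos>",
    # Apollo2 3.8B
    "3.8B": "<|user|>\n{query}<|end|><|assistant|>\n{response}<|end|>",
}

def format_prompt(size: str, query: str, response: str = "") -> str:
    """Fill one dialogue turn into the template for the given model size.

    Leave `response` empty when building a generation prompt; the model is
    expected to continue from the assistant marker.
    """
    return TEMPLATES[size].format(query=query, response=response)
```

For generation, pass only the query and let the model produce the text after the assistant marker, stopping at the template's end-of-turn token (`<|endoftext|>`, `<eos>`, or `<|end|>` depending on the model).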

## Dataset & Evaluation