Update README.md

The config looks like this...(detailed version is in the files and versions):
- [mlabonne/Beagle14-7B](https://huggingface.co/mlabonne/Beagle14-7B) - expert #3
- [mlabonne/Beagle14-7B](https://huggingface.co/mlabonne/Beagle14-7B) - expert #4

[Join our Discord!](https://discord.gg/CAfWPV82)

# "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"

### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
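For intuition, here is a minimal, illustrative sketch of the top-k gating idea behind MoE layers as described in the linked blog post. It is not this model's actual routing code; the expert functions, gate weights, and sizes below are made up for the example.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route input x to the top-k experts by gate score and
    combine their outputs, weighted by renormalized gate values."""
    logits = gate_w @ x                      # one gate score per expert
    scores = np.exp(logits - logits.max())
    scores /= scores.sum()                   # softmax over experts
    top = np.argsort(scores)[-k:]            # indices of the top-k experts
    weights = scores[top] / scores[top].sum()  # renormalize over chosen experts
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy setup: 4 "experts", each just a fixed random linear map.
rng = np.random.default_rng(0)
experts = [lambda x, W=rng.normal(size=(3, 3)): W @ x for _ in range(4)]
gate_w = rng.normal(size=(4, 3))
y = moe_forward(rng.normal(size=3), gate_w, experts)
```

Only the selected experts run for a given token, which is why an MoE model can have many more parameters than it activates per forward pass.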