Kquant03 committed · Commit 971b309 · Parent(s): 79d4822

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -8,7 +8,7 @@ language:
 # I am become death, destroyer of worlds.


- ...32 experts in one model...at glorious 7B. Uses AIDC-ai-business/Marcoroni-7B-v3, Toten5/Marcoroni-neural-chat-7B-v2, HuggingFaceH4/zephyr-7b-beta, NurtureAI/neural-chat-7b-v3-16k, mlabonne/NeuralPipe-7B-ties, mlabonne/NeuralHermes-2.5-Mistral-7B, cognitivecomputations/dolphin-2.6-mistral-7b-dpo, SanjiWatsuki/Silicon-Maid-7B and xDAN-AI/xDAN-L1-Chat-RL-v1.
+ ...32 experts in one frankenMoE...at glorious 7B. Uses AIDC-ai-business/Marcoroni-7B-v3, Toten5/Marcoroni-neural-chat-7B-v2, HuggingFaceH4/zephyr-7b-beta, NurtureAI/neural-chat-7b-v3-16k, mlabonne/NeuralPipe-7B-ties, mlabonne/NeuralHermes-2.5-Mistral-7B, cognitivecomputations/dolphin-2.6-mistral-7b-dpo, SanjiWatsuki/Silicon-Maid-7B and xDAN-AI/xDAN-L1-Chat-RL-v1.

 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
 ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
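
For context on the wording change ("model" → "frankenMoE"): a frankenMoE of this kind is usually assembled with mergekit's `mergekit-moe` tool, which copies each donor model's MLP weights in as experts and initializes a router over them. The sketch below is illustrative only, assuming mergekit was the toolchain (the commit doesn't say); the `gate_mode` and `positive_prompts` values are placeholder assumptions, not the author's actual settings, and repeating donor entries is one assumed way to reach 32 experts from nine donors.

```yaml
# Hypothetical mergekit-moe config sketching how the nine listed 7B donors
# could be combined into one frankenMoE. The routing prompts below are
# illustrative assumptions, not taken from the actual model.
base_model: mistralai/Mistral-7B-v0.1   # assumed shared Mistral-7B backbone
gate_mode: hidden                        # route on hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: AIDC-ai-business/Marcoroni-7B-v3
    positive_prompts: ["follow instructions precisely"]
  - source_model: Toten5/Marcoroni-neural-chat-7B-v2
    positive_prompts: ["general conversation"]
  - source_model: HuggingFaceH4/zephyr-7b-beta
    positive_prompts: ["helpful assistant"]
  - source_model: NurtureAI/neural-chat-7b-v3-16k
    positive_prompts: ["summarize a long document"]
  - source_model: mlabonne/NeuralPipe-7B-ties
    positive_prompts: ["reason step by step"]
  - source_model: mlabonne/NeuralHermes-2.5-Mistral-7B
    positive_prompts: ["explain a concept"]
  - source_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo
    positive_prompts: ["write code"]
  - source_model: SanjiWatsuki/Silicon-Maid-7B
    positive_prompts: ["roleplay a character"]
  - source_model: xDAN-AI/xDAN-L1-Chat-RL-v1
    positive_prompts: ["solve a math problem"]
  # ...repeat donor entries as needed until the experts list reaches 32.
```

A config like this is run roughly as `mergekit-moe config.yaml ./output-model`, and the result loads as a Mixtral-style MoE: only the router-selected experts run per token, which is why a many-expert merge can still advertise 7B-class per-expert inference cost.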