vikhyatk committed · Commit 6397c64 · verified · 1 Parent(s): 1ac5e95

Update README.md

Files changed (1): README.md +1 -2

README.md CHANGED
@@ -6,12 +6,11 @@ library_name: transformers
 
 [✨ Demo](https://moondream.ai/c/playground)   ·   [☁️ Cloud API](https://moondream.ai/c/docs/quickstart)   ·   _📝 Release notes_ (coming soon)
 
-### Examples
 
 ![](https://huggingface.co/moondream/moondream3-preview/resolve/main/structured_outputs.png)
 ![](https://huggingface.co/moondream/moondream3-preview/resolve/main/open_vocab_detect.png)
 
-### Architecture
+## Architecture
 
 1. 24 layers; the first four are dense, the rest have MoE FFNs with 64 experts, 8 activated per token
 2. MoE FFNs have GeGLU architecture, with inner/gate dim of 1024. The model's hidden dim is 2048.
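The architecture numbers in the README (hidden dim 2048, GeGLU inner/gate dim 1024, 64 experts with 8 active per token) can be checked with a short sketch. This is a minimal NumPy illustration, not the model's actual code: the weight initialization, function names, and the tanh GELU approximation are assumptions made for the example.

```python
import numpy as np

# Dimensions quoted in the README (the code itself is an illustrative sketch)
HIDDEN = 2048   # model hidden dim
INNER = 1024    # GeGLU inner/gate dim
N_EXPERTS = 64  # experts per MoE FFN layer
TOP_K = 8       # experts activated per token

# One GeGLU expert holds a gate projection and a value projection (both
# HIDDEN -> INNER) plus a down projection (INNER -> HIDDEN); biases omitted.
params_per_expert = 2 * HIDDEN * INNER + INNER * HIDDEN
per_layer = params_per_expert * N_EXPERTS   # FFN params stored per MoE layer
per_token = params_per_expert * TOP_K       # FFN params touched per token
print(params_per_expert, per_layer, per_token)
# → 6291456 402653184 50331648

def gelu(x):
    # tanh approximation of GELU
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def geglu_ffn(x, w_gate, w_val, w_down):
    # GeGLU: GELU-activated gate branch multiplied into a linear value branch
    return (gelu(x @ w_gate) * (x @ w_val)) @ w_down

rng = np.random.default_rng(0)
w_gate = 0.02 * rng.standard_normal((HIDDEN, INNER), dtype=np.float32)
w_val = 0.02 * rng.standard_normal((HIDDEN, INNER), dtype=np.float32)
w_down = 0.02 * rng.standard_normal((INNER, HIDDEN), dtype=np.float32)

x = rng.standard_normal((4, HIDDEN), dtype=np.float32)  # a batch of 4 tokens
y = geglu_ffn(x, w_gate, w_val, w_down)
assert y.shape == (4, HIDDEN)  # the FFN maps back to the hidden dim
```

So each expert holds about 6.3M FFN parameters; a MoE layer stores roughly 403M of them, but a token only passes through the ~50M belonging to its 8 routed experts, which is the usual sparse-MoE trade of capacity for per-token compute.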