HeavensHackDev committed facd25a (verified) · 1 parent: f09efc3

Update README.md

Files changed (1): README.md (+2 −1)
README.md CHANGED
@@ -13,7 +13,8 @@ tags:
 **HCAE-21M** is a mid-scale (21 Million parameters) text embedding model combining Depthwise Separable Convolutions and Self-Attention layers. It achieves high performance on Semantic Textual Similarity and Retrieval tasks while remaining extremely memory-efficient.
 
 
-![Group 2 (1)](https://cdn-uploads.huggingface.co/production/uploads/680c9127408ea47e6c1dd6e8/TZz2drDoVsxfPQu3ExwDh.png)
+
+![Group 2 (2)](https://cdn-uploads.huggingface.co/production/uploads/680c9127408ea47e6c1dd6e8/0KKsVpqg2Id01nxh8zRjO.png)
 
 ## Architecture Description
 - **Size:** ~21M parameters (d_model=384)
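The README describes a hybrid of Depthwise Separable Convolutions and Self-Attention at d_model=384. As an illustrative sketch only (the model's actual layer code, kernel size, head count, and weights are not shown in this diff; all shapes and random weights below are assumptions), the two building blocks compose roughly like this in NumPy:

```python
import numpy as np

def depthwise_separable_conv1d(x, dw_kernel, pw_weight):
    """Depthwise conv (one filter per channel) followed by a 1x1
    pointwise conv that mixes channels.
    x: (seq_len, d_model); dw_kernel: (k, d_model); pw_weight: (d_model, d_model)."""
    k, d = dw_kernel.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))  # same-length output
    # Depthwise step: each channel convolved with its own filter
    # (kernel flipped so np.convolve performs cross-correlation).
    out = np.stack(
        [np.convolve(xp[:, c], dw_kernel[::-1, c], mode="valid") for c in range(d)],
        axis=1,
    )
    # Pointwise step: a 1x1 conv is just a matmul across channels.
    return out @ pw_weight

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product self-attention over the sequence."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(x.shape[1])
    # Numerically stable row-wise softmax.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
d_model, seq_len, ksize = 384, 8, 3   # d_model=384 per the README; ksize is assumed
x = rng.standard_normal((seq_len, d_model))
h = depthwise_separable_conv1d(
    x,
    rng.standard_normal((ksize, d_model)),
    rng.standard_normal((d_model, d_model)),
)
y = self_attention(h, *(rng.standard_normal((d_model, d_model)) for _ in range(3)))
print(y.shape)  # (8, 384): sequence length and embedding width are preserved
```

The point of the pairing is efficiency: a depthwise separable conv captures local patterns at a fraction of a full convolution's parameter cost, while the attention layer mixes information globally across the sequence, which is consistent with the memory-efficiency claim above.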