lvkaokao committed · 9dead9b · Parent(s): ef0a5f3

update doc.
README.md
CHANGED
@@ -2866,4 +2866,42 @@ model-index:
       value: 87.15717597277224
     - type: max_f1
       value: 79.71815316150567
 ---
+
+## Model Details
+
+This embedding model is a 10.7B-parameter LLM fine-tuned from [upstage/SOLAR-10.7B-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-v1.0) on the Intel Gaudi 2 processor.
+
+## Date
+
+July 2024
+
+## Training Details
+
+Two-stage training:
+
+- General Text Embedding Training
+- Domain-specific Embedding Training
+
+More technical details will be updated later.
+
+## Evaluation
+
+Results on the English [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard (average over 56 datasets):
+
+| Model Name | MTEB (56) |
+|:----|:---------:|
+| [bge-base-en-1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 64.23 |
+| [bge-large-en-1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 63.55 |
+| [gte-large-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5) | 65.39 |
+| [gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) | 64.11 |
+| [mxbai-embed-large-v1](https://huggingface.co/mixedbread-ai/mxbai-embed-large-v1) | 64.68 |
+| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 59.45 |
+| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 61.50 |
+| [e5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct) | 66.63 |
+| [gte-Qwen1.5-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct) | 67.34 |
+| [NV-Embed-v1](https://huggingface.co/nvidia/NV-Embed-v1) | 69.32 |
+| [**neural-embedding-v1**](https://huggingface.co/Intel/neural-embedding-v1) | **69.94** |
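In practice, an embedding model scored on benchmarks like these is used for retrieval by ranking documents by cosine similarity to a query embedding. The sketch below is model-agnostic: the vectors are hand-written stand-ins, whereas a real pipeline would obtain them by encoding text with the released model.

```python
import numpy as np

def rank_by_cosine(query_emb, doc_embs):
    """Return document indices sorted by descending cosine similarity to the query."""
    q = query_emb / np.linalg.norm(query_emb)
    d = doc_embs / np.linalg.norm(doc_embs, axis=1, keepdims=True)
    sims = d @ q                     # cosine similarity of each doc to the query
    return np.argsort(-sims), sims

# Stand-in embeddings; a real pipeline would encode text with the model.
query = np.array([0.9, 0.1, 0.0])
docs = np.array([[1.0, 0.0, 0.0],   # nearly parallel to the query
                 [0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])
order, sims = rank_by_cosine(query, docs)
```

Here `order[0]` is the index of the best-matching document; MTEB's retrieval tasks score rankings produced exactly this way.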