# alime-embedding-large-zh

The alime embedding model.
## Usage (Sentence-Transformers)

Using this model is straightforward once sentence-transformers is installed:

```bash
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer

# "Where is West Lake?" / "The West Lake scenic area is in Hangzhou, Zhejiang Province"
sentences = ["西湖在哪?", "西湖风景名胜区位于浙江省杭州市"]

model = SentenceTransformer('Pristinenlp/alime-embedding-large-zh')
embeddings = model.encode(sentences, normalize_embeddings=True)
print(embeddings)
```
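Because `normalize_embeddings=True` returns unit-length vectors, cosine similarity between two embeddings reduces to a plain dot product. A minimal sketch with toy vectors (hypothetical values, not actual model outputs):

```python
import numpy as np

# Toy unit-norm vectors standing in for two sentence embeddings
# (hypothetical values, not produced by alime-embedding-large-zh).
a = np.array([0.6, 0.8])
b = np.array([0.8, 0.6])

# With L2-normalized embeddings, cosine similarity == dot product.
cos_sim = float(np.dot(a, b))
print(cos_sim)  # 0.96
```

With the real model, the same dot product over `embeddings[0]` and `embeddings[1]` scores the relevance of the query to the passage.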
## Evaluation results

All scores are self-reported on MTEB.

| Dataset (split) | Metric | Score |
|---|---|---|
| AFQMC (validation) | cos_sim_pearson | 49.648 |
| AFQMC (validation) | cos_sim_spearman | 54.733 |
| AFQMC (validation) | euclidean_pearson | 53.063 |
| AFQMC (validation) | euclidean_spearman | 54.733 |
| AFQMC (validation) | manhattan_pearson | 53.048 |
| AFQMC (validation) | manhattan_spearman | 54.729 |
| ATEC (test) | cos_sim_pearson | 48.659 |
| ATEC (test) | cos_sim_spearman | 55.125 |
| ATEC (test) | euclidean_pearson | 55.734 |
| ATEC (test) | euclidean_spearman | 55.125 |
| ATEC (test) | manhattan_pearson | 55.712 |
| ATEC (test) | manhattan_spearman | 55.122 |
| AmazonReviewsClassification (zh) (test) | accuracy | 46.950 |
| AmazonReviewsClassification (zh) (test) | f1 | 45.344 |
| BQ (test) | cos_sim_pearson | 62.927 |
| BQ (test) | cos_sim_spearman | 64.888 |
| BQ (test) | euclidean_pearson | 63.314 |
| BQ (test) | euclidean_spearman | 64.888 |
| BQ (test) | manhattan_pearson | 63.222 |
| BQ (test) | manhattan_spearman | 64.798 |