Sentence Similarity

Tags: sentence-transformers · ONNX · Safetensors · Transformers.js · bert · feature-extraction · mteb · arctic · snowflake-arctic-embed · Eval Results (legacy) · text-embeddings-inference
Instructions for using Snowflake/snowflake-arctic-embed-m with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - sentence-transformers

How to use Snowflake/snowflake-arctic-embed-m with sentence-transformers:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Snowflake/snowflake-arctic-embed-m")

sentences = [
    "That is a happy person",
    "That is a happy dog",
    "That is a very happy person",
    "Today is a sunny day",
]
embeddings = model.encode(sentences)

similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [4, 4]
```

- Transformers.js
How to use Snowflake/snowflake-arctic-embed-m with Transformers.js:

```js
// npm i @huggingface/transformers
import { pipeline } from '@huggingface/transformers';

// Transformers.js has no 'sentence-similarity' task; sentence embeddings
// are produced with a 'feature-extraction' pipeline instead.
const extractor = await pipeline('feature-extraction', 'Snowflake/snowflake-arctic-embed-m');
const embeddings = await extractor(
  ['That is a happy person', 'That is a happy dog'],
  { pooling: 'cls', normalize: true },
);
```

- Inference
- Notebooks
  - Google Colab
  - Kaggle
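Both snippets above produce one embedding vector per sentence; sentence similarity is then the cosine of the angle between those vectors (which is what `model.similarity` computes in the sentence-transformers example). As a minimal NumPy sketch of that operation, with toy 2-D vectors standing in for the model's real 768-dimensional embeddings:

```python
import numpy as np

def cosine_similarity_matrix(embeddings):
    """Pairwise cosine similarity: normalize each row, then take dot products."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    normalized = embeddings / norms
    return normalized @ normalized.T

# Toy 2-D "embeddings" in place of real model output
toy = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
sims = cosine_similarity_matrix(toy)
print(sims.shape)  # (3, 3)
```

Each diagonal entry is 1.0 (every vector is maximally similar to itself), and off-diagonal entries range from -1 to 1.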
Fix incorrect hyperlink in Model Card (#3)
Commit: c6fe7775e88189d823c330f10853c79ac8d3f60c
Co-authored-by: Tom Aarsen <tomaarsen@users.noreply.huggingface.co>
README.md CHANGED

```diff
@@ -2844,7 +2844,7 @@ The models are trained by leveraging existing open-source text representation mo
 | [snowflake-arctic-embed-s](https://huggingface.co/Snowflake/snowflake-arctic-embed-s/) | 51.98 | 33 | 384 |
 | [snowflake-arctic-embed-m](https://huggingface.co/Snowflake/snowflake-arctic-embed-m/) | 54.90 | 110 | 768 |
 | [snowflake-arctic-embed-m-long](https://huggingface.co/Snowflake/snowflake-arctic-embed-m-long/) | 54.83 | 137 | 768 |
-| [snowflake-arctic-embed-
+| [snowflake-arctic-embed-l](https://huggingface.co/Snowflake/snowflake-arctic-embed-l/) | 55.98 | 335 | 1024 |
 
 
 Aside from being great open-source models, the largest model, [snowflake-arctic-embed-l](https://huggingface.co/Snowflake/snowflake-arctic-embed-l/), can serve as a natural replacement for closed-source embedding, as shown below.
```
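The table rows in the diff above list one figure of quality and two of cost per variant; assuming the columns are (score, parameter count in millions, embedding dimension), a hypothetical helper for picking the strongest variant under a parameter budget might look like:

```python
# Figures copied from the table rows above; column meaning is an assumption:
# (score, parameters in millions, embedding dimension)
MODELS = {
    "snowflake-arctic-embed-s": (51.98, 33, 384),
    "snowflake-arctic-embed-m": (54.90, 110, 768),
    "snowflake-arctic-embed-m-long": (54.83, 137, 768),
    "snowflake-arctic-embed-l": (55.98, 335, 1024),
}

def best_model(max_params_m):
    """Hypothetical helper: highest-scoring variant within a parameter budget."""
    candidates = {name: v for name, v in MODELS.items() if v[1] <= max_params_m}
    if not candidates:
        return None
    return max(candidates, key=lambda name: candidates[name][0])

print(best_model(150))   # snowflake-arctic-embed-m
print(best_model(1000))  # snowflake-arctic-embed-l
```

Under a 150M-parameter cap the medium model wins despite the long-context variant's extra parameters; with no practical cap, the large model's higher score makes it the natural pick.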