Instructions to use optimum/all-MiniLM-L6-v2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- sentence-transformers
How to use optimum/all-MiniLM-L6-v2 with sentence-transformers:
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("optimum/all-MiniLM-L6-v2")

sentences = [
    "That is a happy person",
    "That is a happy dog",
    "That is a very happy person",
    "Today is a sunny day",
]

embeddings = model.encode(sentences)
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [4, 4]
```
- Notebooks
- Google Colab
- Kaggle
Is this ONNX?
I don't see ONNX mentioned anywhere on the model card. Am I missing something?
There is a model.onnx file, but I guess it can only be read with optimum. I want to load it with sentence-transformers, and I'm not able to.
Did you find any solution?
Hey folks, yes, this is an ONNX model converted from sentence-transformers/all-MiniLM-L6-v2 by optimum. You can use it for inference with the APIs in optimum; more details here.
I had tried that code already and it failed: https://github.com/philschmid/optimum-transformers-optimizations/issues/2
Hi everyone, here's a minimal example to use sentence-transformers with optimum:
https://colab.research.google.com/drive/1ieB91hNJPooS6_VN_dgq0y87bm0q52_k?usp=sharing
The key component is ORTModelForFeatureExtraction, which is equivalent to AutoModel in the model card example.