Instructions to use BAAI/bge-m3 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- sentence-transformers
How to use BAAI/bge-m3 with sentence-transformers:
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BAAI/bge-m3")

sentences = [
    "That is a happy person",
    "That is a happy dog",
    "That is a very happy person",
    "Today is a sunny day",
]

embeddings = model.encode(sentences)
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [4, 4]
```
- Inference
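The `model.similarity` call above returns the pairwise cosine-similarity matrix of the embeddings. The same numbers can be reproduced with plain NumPy — a minimal sketch using small dummy vectors in place of real model embeddings:

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    # L2-normalize each row, then pairwise dot products are cosine similarities
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return normed @ normed.T

# Dummy 2-dimensional "embeddings" for illustration only
emb = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
sim = cosine_similarity_matrix(emb)
print(sim.shape)  # (3, 3)
```

With real `model.encode` output, the resulting matrix matches `model.similarity(embeddings, embeddings)` up to floating-point tolerance, since sentence-transformers defaults to cosine similarity.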
- Notebooks
- Google Colab
- Kaggle
Adding ONNX file of this model
Beep boop, I am the ONNX export bot 🤖🏎️. On behalf of yashvardhan7, I would like to add the ONNX-converted model to this repository.
What is ONNX? It stands for "Open Neural Network Exchange", and is the most commonly used open standard for machine learning interoperability. You can find out more at onnx.ai!
The exported ONNX model can then be consumed by various backends such as TensorRT or TVM, or used in a few lines of code with 🤗 Optimum through ONNX Runtime; check out how here!
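Until an official recipe is posted, here is a hedged sketch of consuming an exported file with plain `onnxruntime`, bypassing Optimum's wrapper classes. It assumes the export exposes a `last_hidden_state`-style output and that BGE-style dense embeddings are the L2-normalized CLS vector; the file path `onnx/model.onnx` and the output ordering are assumptions, not verified against this particular export:

```python
import numpy as np

def dense_embedding(last_hidden_state: np.ndarray) -> np.ndarray:
    """BGE-style dense vector: take the CLS token and L2-normalize it."""
    cls = last_hidden_state[:, 0, :]
    return cls / np.linalg.norm(cls, axis=1, keepdims=True)

# Sketch of the surrounding inference code (requires `onnxruntime` and
# `transformers`, plus the downloaded onnx/model.onnx file — not run here,
# and the output names depend on how the model was exported):
#
#   from transformers import AutoTokenizer
#   import onnxruntime as ort
#   tok = AutoTokenizer.from_pretrained("BAAI/bge-m3")
#   sess = ort.InferenceSession("onnx/model.onnx")
#   enc = tok(["That is a happy person"], return_tensors="np")
#   (hidden,) = sess.run(None, dict(enc))
#   vec = dense_embedding(hidden)

# Pooling demonstrated on random dummy hidden states (batch=2, seq=5, dim=8)
hidden = np.random.rand(2, 5, 8).astype(np.float32)
vecs = dense_embedding(hidden)
print(vecs.shape)  # (2, 8)
```

The resulting vectors have unit norm, so their dot products are directly cosine similarities.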
@yashvardhan7, could you provide an example of how to use this ONNX model? It seems that this ONNX version cannot be used with the ORTModelForFeatureExtraction class: https://huggingface.co/BAAI/bge-m3/discussions/50