Instructions for using cross-encoder/ms-marco-MiniLM-L12-v2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- sentence-transformers
How to use cross-encoder/ms-marco-MiniLM-L12-v2 with sentence-transformers:
```python
from sentence_transformers import CrossEncoder

model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L12-v2")

query = "Which planet is known as the Red Planet?"
passages = [
    "Venus is often called Earth's twin because of its similar size and proximity.",
    "Mars, known for its reddish appearance, is often referred to as the Red Planet.",
    "Jupiter, the largest planet in our solar system, has a prominent red spot.",
    "Saturn, famous for its rings, is sometimes mistaken for the Red Planet.",
]

scores = model.predict([(query, passage) for passage in passages])
print(scores)
```
- Transformers
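The scores returned by `predict` are relevance logits, one per query-passage pair; a common next step is to sort the passages by score. A minimal sketch, using hard-coded illustrative scores (actual values depend on the model output):

```python
passages = [
    "Venus is often called Earth's twin because of its similar size and proximity.",
    "Mars, known for its reddish appearance, is often referred to as the Red Planet.",
    "Jupiter, the largest planet in our solar system, has a prominent red spot.",
    "Saturn, famous for its rings, is sometimes mistaken for the Red Planet.",
]
# Illustrative logits standing in for model.predict() output; not real model values.
scores = [-5.2, 9.1, -3.4, -4.8]

# Rank passages from most to least relevant (highest logit first).
ranked = sorted(zip(scores, passages), key=lambda pair: pair[0], reverse=True)
for score, passage in ranked:
    print(f"{score:+.2f}  {passage}")
```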
How to use cross-encoder/ms-marco-MiniLM-L12-v2 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("cross-encoder/ms-marco-MiniLM-L12-v2")
model = AutoModelForSequenceClassification.from_pretrained("cross-encoder/ms-marco-MiniLM-L12-v2")
```
- Notebooks
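Loading the model is only half the story; with plain Transformers you also tokenize each query-passage pair and read the single relevance logit from the classification head. A minimal sketch of that scoring loop (the query and passages are reused from the example above; higher logit means more relevant):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("cross-encoder/ms-marco-MiniLM-L12-v2")
model = AutoModelForSequenceClassification.from_pretrained("cross-encoder/ms-marco-MiniLM-L12-v2")
model.eval()

query = "Which planet is known as the Red Planet?"
passages = [
    "Mars, known for its reddish appearance, is often referred to as the Red Planet.",
    "Saturn, famous for its rings, is sometimes mistaken for the Red Planet.",
]

# Tokenize each (query, passage) pair as a single cross-encoder input.
features = tokenizer([query] * len(passages), passages,
                     padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    # The sequence-classification head outputs one logit per pair.
    scores = model(**features).logits.squeeze(-1)

print(scores)
```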
- Google Colab
- Kaggle
Add TF weights
Validated by the pt_to_tf CLI. Max crossload hidden state difference=1.669e-06; Max converted hidden state difference=1.669e-06.
Hi there 👋
I'm a TF maintainer at Hugging Face, and this is your most downloaded model whose weights can be automatically converted to TensorFlow using our tools. We believe having TF weights would be of interest to the community and would further boost the model's visibility.
I also don't want to be a source of spam! Let me know if you are interested in merging these TF weights, and whether you would like me to open PRs with TF weights for other models that you own 🤗
Hi again 👋
My apologies -- our automatic conversion tool was missing the conversion of some model heads, and this was one of the incomplete conversions. We also added much stricter equivalence tests (https://github.com/huggingface/transformers/pull/17588), to ensure TF users enjoy the exact same model experience as PT users.