Instructions for using rockmiin/QMSum-dpr-passage-encoder with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use rockmiin/QMSum-dpr-passage-encoder with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="rockmiin/QMSum-dpr-passage-encoder")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("rockmiin/QMSum-dpr-passage-encoder")
model = AutoModel.from_pretrained("rockmiin/QMSum-dpr-passage-encoder")
```

A sketch showing how the directly loaded model can be turned into passage embeddings appears after the notebook links below.

- Notebooks
- Google Colab
- Kaggle
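The snippets above only load the model; the following is a minimal sketch of how the loaded encoder might be used to embed QMSum-style meeting passages. It is not from the model card: it assumes the checkpoint behaves as a DPR-style passage (context) encoder and that the [CLS] position of the output serves as the passage vector, and the example passages are illustrative only.

```python
# Minimal sketch (assumptions noted in comments): embed passages with the
# encoder loaded via AutoModel above.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "rockmiin/QMSum-dpr-passage-encoder"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

# Illustrative meeting-transcript passages (QMSum-style content).
passages = [
    "Project Manager: let's keep the remote control under twelve euros fifty.",
    "Industrial Designer: I'd suggest a curved case with rubber buttons.",
]

inputs = tokenizer(passages, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Assumed pooling: take the [CLS] position as the passage representation,
# as DPR-style encoders conventionally do. DPR-architecture checkpoints
# return this directly as pooler_output instead of a last_hidden_state.
if getattr(outputs, "last_hidden_state", None) is not None:
    embeddings = outputs.last_hidden_state[:, 0, :]
else:
    embeddings = outputs.pooler_output

print(embeddings.shape)  # (num_passages, hidden_size)
```

In a typical DPR retrieval setup, passage vectors like these are indexed (for example with FAISS) and scored by inner product against query vectors produced by a matching question encoder.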