Instructions to use ielabgroup/Rank-R1-3B-v0.1 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ielabgroup/Rank-R1-3B-v0.1 with Transformers:
How to use ielabgroup/Rank-R1-3B-v0.1 with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("ielabgroup/Rank-R1-3B-v0.1", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
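As a text-ranking model, Rank-R1 is prompted with a query and a set of candidate passages and asked to identify the most relevant one. The model's actual prompt template is not shown on this page, so the helper below is only a hypothetical sketch of how such a setwise ranking prompt might be assembled; the template wording and the function name `build_ranking_prompt` are assumptions.

```python
# Hypothetical sketch: assembling a setwise ranking prompt for an
# LLM-based reranker. The template wording is an assumption, not the
# model's official prompt format.
def build_ranking_prompt(query: str, passages: list[str]) -> str:
    # Number each candidate passage so the model can answer by index.
    numbered = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        f'Given the query: "{query}"\n'
        f"and the following passages:\n{numbered}\n"
        "Which passage is most relevant to the query? "
        "Answer with the passage number."
    )

prompt = build_ranking_prompt(
    "what is the capital of France?",
    ["Paris is the capital of France.", "Berlin is in Germany."],
)
print(prompt)
```

The resulting string would then be tokenized and passed to the model for generation.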
Update model metadata to set pipeline tag to the new `text-ranking`
#1
by tomaarsen - opened
README.md changed:

```diff
@@ -3,6 +3,7 @@ library_name: transformers
 license: apache-2.0
 datasets:
 - Tevatron/msmarco-passage
+pipeline_tag: text-ranking
 ---

 # Rank-R1
```