How to use webis/monoelectra-large with Lightning IR:

```python
# Install Lightning IR from https://github.com/webis-de/lightning-ir
from lightning_ir import CrossEncoderModule

model = CrossEncoderModule("webis/monoelectra-large")
model.score("query", ["doc1", "doc2", "doc3"])
```
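The `score` call returns one relevance score per passed document. A minimal sketch of turning such scores into a re-ranked list, using plain Python with placeholder scores standing in for the model output (the values below are hypothetical, not produced by the model):

```python
# Documents to re-rank for a query
docs = ["doc1", "doc2", "doc3"]

# Placeholder relevance scores, one per document (hypothetical values;
# in practice these would come from the cross-encoder's score output)
scores = [0.12, 0.87, 0.45]

# Re-rank: sort documents by descending relevance score
reranked = [d for d, s in sorted(zip(docs, scores), key=lambda p: p[1], reverse=True)]
print(reranked)  # ['doc2', 'doc3', 'doc1']
```

This is the standard final step of cross-encoder re-ranking: a first-stage retriever supplies candidate documents, and the cross-encoder's scores determine their final order.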
Add pipeline tag, library name, and link to code and paper #1
by nielsr (HF Staff) - opened
README.md CHANGED

```diff
@@ -1,3 +1,9 @@
 ---
 license: apache-2.0
----
+pipeline_tag: question-answering
+library_name: transformers
+---
+
+This repository contains the model described in the paper [A Systematic Investigation of Distilling Large Language Models into Cross-Encoders for Passage Re-ranking](https://arxiv.org/abs/2405.07920).
+
+The code for training and evaluation can be found at https://github.com/webis-de/msmarco-llm-distillation.
```