---
license: apache-2.0
library_name: transformers
pipeline_tag: question-answering
---
This model was introduced in the paper [A Systematic Investigation of Distilling Large Language Models into Cross-Encoders for Passage Re-ranking](https://arxiv.org/abs/2405.07920). It is a cross-encoder for passage re-ranking, distilled from a large language model.

For code, usage examples, and more, please visit https://github.com/webis-de/msmarco-llm-distillation.
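For a quick sketch of how a cross-encoder re-ranker like this one is typically used with `transformers`, the snippet below scores query-passage pairs and sorts passages by score. This is a minimal illustration, not the authors' recommended pipeline: the model id `webis/monoelectra-base` and the sequence-classification loading path are assumptions — consult the linked repository for the intended usage.

```python
# Minimal sketch: re-ranking passages with a cross-encoder via transformers.
# The model id below is an assumption; substitute the actual checkpoint id.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "webis/monoelectra-base"  # assumed id, see model card header

def rerank(query: str, passages: list[str], model_id: str = MODEL_ID):
    """Return passages sorted by relevance score (highest first)."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    model.eval()
    # Cross-encoders see the query and each candidate passage jointly.
    inputs = tokenizer(
        [query] * len(passages),
        passages,
        padding=True,
        truncation=True,
        return_tensors="pt",
    )
    with torch.no_grad():
        scores = model(**inputs).logits.squeeze(-1)
    return sorted(
        zip(passages, scores.tolist()), key=lambda pair: pair[1], reverse=True
    )

if __name__ == "__main__":
    ranked = rerank(
        "what is passage re-ranking?",
        [
            "Re-ranking reorders an initial list of retrieved passages.",
            "Bananas are yellow fruit.",
        ],
    )
    for passage, score in ranked:
        print(f"{score:.3f}\t{passage}")
```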