# monoelectra-base
---
license: apache-2.0
library_name: transformers
pipeline_tag: question-answering
---

This model was introduced in the paper *A Systematic Investigation of Distilling Large Language Models into Cross-Encoders for Passage Re-ranking*.

For code, examples, and more, please visit https://github.com/webis-de/msmarco-llm-distillation.
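
As a cross-encoder, the model scores a query and a passage jointly and the resulting scores are used to re-rank candidate passages. Below is a minimal sketch of that pattern with the `transformers` sequence-classification API; the model id `webis/monoelectra-base` and the presence of a classification head are assumptions, so consult the linked repository for the exact loading code.

```python
# Hedged sketch: re-rank passages for a query with a cross-encoder.
# The model id and the single-logit relevance head are assumptions,
# not confirmed by this model card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "webis/monoelectra-base"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

query = "how do cross-encoders re-rank passages?"
passages = [
    "Cross-encoders jointly encode a query and a passage to produce a relevance score.",
    "The Eiffel Tower is located in Paris, France.",
]

# Encode each (query, passage) pair together; take the first logit as the score.
inputs = tokenizer(
    [query] * len(passages), passages,
    padding=True, truncation=True, return_tensors="pt",
)
with torch.no_grad():
    scores = model(**inputs).logits[:, 0].tolist()

# Higher score = more relevant; sort candidates accordingly.
ranked = sorted(zip(scores, passages), key=lambda x: x[0], reverse=True)
```

Cross-encoders like this are typically applied to the top-k candidates from a cheaper first-stage retriever (e.g. BM25), since each pair requires a full forward pass.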