---
library_name: transformers
tags: []
---
# Model Card for RankMistral
RankMistral is a listwise passage reranker finetuned from [Mistral-7B-v0.3](https://huggingface.co/mistralai/Mistral-7B-v0.3) on the [rank_llm dataset](https://huggingface.co/datasets/castorini/rank_llm_data).
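Listwise rerankers of this kind are typically prompted with a query and a set of numbered passages, and respond with a ranking string such as `[2] > [1] > [3]`. The sketch below illustrates the idea; the exact prompt wording and output format are assumptions for illustration, not the precise template used by RankLLM or this model:

```python
import re

def build_prompt(query: str, passages: list[str]) -> str:
    # Number each candidate passage and ask the model for a ranking.
    # The wording here is illustrative, not the exact RankLLM template.
    lines = [f"I will provide you with {len(passages)} passages, each labeled [i]."]
    for i, passage in enumerate(passages, start=1):
        lines.append(f"[{i}] {passage}")
    lines.append(f"Rank the passages above by relevance to the query: {query}")
    lines.append("Answer only with the ranking, e.g. [2] > [1] > [3].")
    return "\n".join(lines)

def parse_ranking(model_output: str, num_passages: int) -> list[int]:
    # Extract bracketed indices in the order the model emitted them,
    # dropping duplicates and out-of-range labels.
    seen: list[int] = []
    for match in re.findall(r"\[(\d+)\]", model_output):
        idx = int(match)
        if 1 <= idx <= num_passages and idx not in seen:
            seen.append(idx)
    # Append any indices the model omitted, keeping the original order.
    seen += [i for i in range(1, num_passages + 1) if i not in seen]
    return seen

print(parse_ranking("[3] > [1]", 3))  # [3, 1, 2]
```

The fallback in `parse_ranking` matters in practice: a generative reranker can emit a truncated or malformed permutation, so unmentioned passages are appended in their original retrieval order rather than dropped.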
## Results
Results from [QPP-RA: Aggregating Large Language Model Rankings](https://dl.acm.org/doi/pdf/10.1145/3731120.3744575), obtained with the [RankLLM library](https://github.com/castorini/rank_llm).
![image/png](https://cdn-uploads.huggingface.co/production/uploads/658daf028965a503497d87c0/dXIl10uMW7rvBimCiIuQS.png)
## Citation
If you use this model, please cite:
```bibtex
@inproceedings{10.1145/3731120.3744575,
author = {Betello, Filippo and Russo, Matteo and D\"{u}tting, Paul and Leonardi, Stefano and Silvestri, Fabrizio},
title = {QPP-RA: Aggregating Large Language Model Rankings},
year = {2025},
isbn = {9798400718618},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3731120.3744575},
doi = {10.1145/3731120.3744575},
booktitle = {Proceedings of the 2025 International ACM SIGIR Conference on Innovative Concepts and Theories in Information Retrieval (ICTIR)},
pages = {103–114},
numpages = {12},
keywords = {llm, query performance prediction, rank aggregation},
location = {Padua, Italy},
series = {ICTIR '25}
}
```