DSI-large-NQ10k

This repository contains one of the models analyzed in our paper Reverse-Engineering the Retrieval Process in GenIR Models (see our website for a quick overview).

Training

The model is based on T5-large and was trained on a randomly selected subset of 10k documents from Natural Questions as an atomic GenIR model, reproducing DSI. The dataset can be found here.
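In the atomic DSI setup, each indexed document receives its own identifier token, and training maps queries (or document text) to that single token. A minimal sketch of how such training pairs might be constructed (the `doc_{i}` token names are hypothetical, not the actual identifiers used by this model):

```python
# Sketch (assumption): atomic DSI assigns each of the 10k documents its own
# identifier token; training pairs map query or document text to that token.
num_docs = 10_000
docid_tokens = [f'doc_{i}' for i in range(num_docs)]  # hypothetical token names

# A training example is then simply (input text, target docid token):
example = ('who sang the song', docid_tokens[42])
```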

Usage

Here is a quick example of using the model for retrieval:

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_path = 'AnReu/DSI-large-NQ10k'
model = AutoModelForSeq2SeqLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)

query = 'this is a test query'
input_ids = tokenizer(query, return_tensors='pt').input_ids

# Atomic DSI decodes the document identifier in a single step,
# so we start decoding from a single pad token (id 0 for T5).
decoder_input_ids = torch.zeros([1, 1], dtype=torch.int64)
output = model(input_ids, decoder_input_ids=decoder_input_ids)
```
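Since each atomic docid is a single token, a ranking over documents can be read directly off the logits of the first decoding step. A minimal sketch, with a small dummy logits tensor standing in for `output.logits` from the forward pass above (the real tensor has the full vocabulary size in its last dimension):

```python
import torch

# Dummy logits shaped [batch, decoding_step, vocab]; in the real model these
# come from `output.logits` in the forward pass above.
logits = torch.tensor([[[0.1, 2.0, -1.0, 3.5, 0.0, 1.2, -0.5, 0.7]]])

# Rank docid tokens by their step-0 scores, best first.
topk = torch.topk(logits[0, 0], k=3)
ranked_token_ids = topk.indices.tolist()

# With the real model, `tokenizer.convert_ids_to_tokens(ranked_token_ids)`
# would map these ids back to the docid tokens.
```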

Model Overview

| Model | Huggingface URL |
|---|---|
| NQ10k | DSI-large-NQ10k |
| NQ100k | DSI-large-NQ100k |
| NQ320k | DSI-large-NQ320k |
| Trivia-QA | DSI-large-TriviaQA |
| Trivia-QA QG | DSI-large-TriviaQA QG |

Citation

@inproceedings{Reusch2025Reverse,
  author = {Reusch, Anja and Belinkov, Yonatan},
  title = {Reverse-Engineering the Retrieval Process in GenIR Models},
  year = {2025},
  isbn = {9798400715921},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3726302.3730076},
  doi = {10.1145/3726302.3730076},
  booktitle = {Proceedings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval},
  pages = {668–677},
  numpages = {10},
  location = {Padua, Italy},
  series = {SIGIR '25}
}