RePOPE (Hugging Face Version)
This dataset is a Hugging Face formatted version of the RePOPE benchmark, uploaded to make it easier to use in evaluation pipelines and multimodal research.
RePOPE is a corrected relabeling of the POPE benchmark, which is commonly used to evaluate object hallucination in Vision-Language Models (VLMs). The dataset fixes annotation errors in the original POPE benchmark and removes ambiguous examples.
Original paper: *RePOPE: Impact of Annotation Errors on the POPE Benchmark*, Yannic Neuhaus and Matthias Hein (2025)
Dataset Overview
RePOPE evaluates whether a model incorrectly claims that an object exists in an image.
Example question format:

```
Question: Is there a car in the image?
Answer: Yes / No
```
The model must answer correctly based on the image.
Incorrect answers may indicate object hallucination, where the model claims objects exist that are not present.
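Predictions against these yes/no labels are typically scored with POPE-style metrics, treating "yes" as the positive class. The following is a minimal sketch of such a scorer; the `normalize` heuristic and the example predictions are illustrative assumptions, not part of the dataset.

```python
def normalize(reply):
    # Assumption: the model's reply starts with its verdict ("Yes, ..." / "No, ...").
    return "yes" if reply.strip().lower().startswith("yes") else "no"

def pope_metrics(preds, labels):
    # Precision/recall/F1 with "yes" as the positive class, as in POPE-style evaluation.
    tp = sum(p == "yes" and l == "yes" for p, l in zip(preds, labels))
    fp = sum(p == "yes" and l == "no" for p, l in zip(preds, labels))
    fn = sum(p == "no" and l == "yes" for p, l in zip(preds, labels))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = sum(p == l for p, l in zip(preds, labels)) / len(labels)
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}

# Toy replies and gold labels for illustration.
preds = [normalize(r) for r in ["Yes, there is.", "No.", "yes", "No, I don't see one."]]
labels = ["yes", "no", "no", "no"]
print(pope_metrics(preds, labels))
```

A false "yes" (claiming an absent object) lowers precision, which is why hallucination-prone models show low precision under this scoring.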
Dataset Statistics
- Total examples: 8185
- Answers changed from the original POPE labels: 494
- Ambiguous POPE questions removed: 815
Image source: COCO 2014 validation set
Categories:
- random
- popular
- adversarial
These splits follow the structure of the original POPE benchmark.
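Since every row carries a `category` field, the splits can be scored separately by partitioning the rows first (with the real data, `datasets.Dataset.filter` does the same job). A minimal sketch, with toy rows standing in for dataset examples:

```python
from collections import defaultdict

# Toy rows standing in for RePOPE examples; real rows carry the same fields.
rows = [
    {"question": "Is there a car in the image?", "answer": "no", "category": "adversarial"},
    {"question": "Is there a dog in the image?", "answer": "yes", "category": "random"},
    {"question": "Is there a cup in the image?", "answer": "no", "category": "popular"},
    {"question": "Is there a person in the image?", "answer": "yes", "category": "adversarial"},
]

# Partition rows by split so each category can be evaluated on its own.
by_split = defaultdict(list)
for row in rows:
    by_split[row["category"]].append(row)

for split in ("random", "popular", "adversarial"):
    print(split, len(by_split[split]))
```

Reporting per-split numbers matters because the adversarial split (objects that frequently co-occur with present ones) is where hallucination rates are usually highest.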
Dataset Features
| Feature | Type | Description |
|---|---|---|
| `id` | string | Unique identifier for the sample |
| `question_id` | string | Identifier linking to the question instance |
| `question` | string | Yes/no question about objects in the image |
| `answer` | string | Correct label after RePOPE relabeling |
| `pope_old_answer` | string | Original answer from the POPE benchmark |
| `image_source` | string | COCO image identifier |
| `image` | image | The corresponding COCO image |
| `category` | string | Split category (random, popular, adversarial) |
Example
```python
from datasets import load_dataset

dataset = load_dataset("SushantGautam/RePOPE")
print(dataset["test"][0])
```
Example output:
```python
{
  'id': '...',
  'question_id': '...',
  'question': 'Is there a car in the image?',
  'answer': 'no',
  'pope_old_answer': 'yes',
  'image_source': 'COCO_val2014_000000310196',
  'category': 'adversarial',
  'image': <PIL.Image>
}
```
Why RePOPE?
The original POPE dataset contains annotation errors and ambiguous samples. These issues can significantly impact evaluation metrics such as F1 score.
Key findings from the RePOPE paper:
- Incorrect "yes" labels: 9.3%
- Incorrect "no" labels: 1.7%
This imbalance can distort hallucination evaluation results.
RePOPE fixes these issues by:
- correcting incorrect labels
- removing ambiguous samples
- preserving compatibility with POPE evaluation pipelines
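Because each row ships with both the corrected label (`answer`) and the original POPE label (`pope_old_answer`), the relabeling can be inspected directly. A sketch with toy rows (the field names match the dataset; the values are illustrative, and with the real data you would iterate over `dataset["test"]`):

```python
# Toy rows using the dataset's field names; values are illustrative only.
rows = [
    {"answer": "no", "pope_old_answer": "yes"},  # a corrected "yes" label
    {"answer": "yes", "pope_old_answer": "yes"},
    {"answer": "no", "pope_old_answer": "no"},
    {"answer": "no", "pope_old_answer": "yes"},
]

# Rows where RePOPE disagrees with the original POPE annotation.
changed = [r for r in rows if r["answer"] != r["pope_old_answer"]]
yes_to_no = sum(r["pope_old_answer"] == "yes" and r["answer"] == "no" for r in rows)

print(f"{len(changed)} of {len(rows)} labels changed; {yes_to_no} flipped from yes to no")
```

Running the same comparison over the full dataset recovers the paper's finding that corrections fall overwhelmingly on the "yes" side.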
Intended Use
This dataset is intended for:
- Evaluating vision-language models
- Studying object hallucination
- Benchmarking multimodal systems
- Comparing hallucination mitigation techniques
Example models evaluated with POPE/RePOPE:
- LLaVA
- Qwen-VL
- BLIP
- GPT-4V
Dataset Source
Original annotations from the official RePOPE repository:
https://github.com/YanNeu/RePOPE
Images come from the MS COCO 2014 validation set.
Citation
If you use this dataset, please cite the original paper:
```bibtex
@article{neuhaus2025repope,
  title={RePOPE: Impact of Annotation Errors on the POPE Benchmark},
  author={Neuhaus, Yannic and Hein, Matthias},
  journal={arXiv preprint arXiv:2504.15707},
  year={2025}
}
```
Acknowledgements
Thanks to the authors of:
- POPE – Evaluating object hallucination in large vision-language models
- RePOPE – Correcting annotation errors in the POPE benchmark