## Results
This section reports the full experimental results. The metrics **Prec.**, **Rec.**, **F1**, **R.L.**, **B.S.**, **Rel.**, **Eff.**, **Comp.**, **Pos.**, and **Avg.** denote image precision, image recall, image F1 score, ROUGE-L, BERTScore, image relevance, image effectiveness, comprehensive score, image position score, and average score, respectively; **Ord.** denotes the image ordering score.
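
The image-level columns (Prec., Rec., F1) are set-overlap metrics over the images a framework attaches to its answer, while R.L. and B.S. score the answer text against the reference. As a rough illustration only (this README does not ship the evaluation script, so the function and variable names below are assumptions, not the official code), they could be computed like this:

```python
# Minimal sketch of the image-level metrics, assuming standard set-overlap
# definitions; `predicted` and `reference` are hypothetical lists of image
# identifiers for a single answer. Not the official evaluation code.
def image_prf(predicted: list[str], reference: list[str]) -> tuple[float, float, float]:
    pred, gold = set(predicted), set(reference)
    if not pred or not gold:
        return 0.0, 0.0, 0.0
    hits = len(pred & gold)                        # images present in both sets
    precision, recall = hits / len(pred), hits / len(gold)
    f1 = 2 * precision * recall / (precision + recall) if hits else 0.0
    return precision, recall, f1

# The text metrics can be reproduced with the widely used `rouge-score`
# and `bert-score` packages:
#   from rouge_score import rouge_scorer
#   rl = rouge_scorer.RougeScorer(["rougeL"]).score(ref_text, pred_text)["rougeL"].fmeasure
#   from bert_score import score
#   P, R, F1 = score([pred_text], [ref_text], lang="en")
```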
### Comprehensive performance results on Wit-MQA+ (Wit Dataset).

| Framework  | Model                  | Prec. | Rec.  | F1    | R.L.  | B.S.  | Rel.  | Eff.  | Comp. | Pos.  | Avg.  |
|------------|------------------------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|
| Rule-Based | GPT-4o                 | 49.50 | 49.67 | 49.56 | 56.23 | 92.27 | 43.67 | 39.50 | 77.00 | 50.08 | 56.39 |
|            | GPT-4o-mini            | 42.83 | 42.83 | 42.83 | 48.55 | 89.52 | 38.30 | 34.83 | 76.90 | 43.33 | 51.10 |
|            | Claude-3.5-Sonnet      | 50.08 | 50.50 | 50.22 | 53.37 | 92.53 | 44.03 | 39.93 | 79.20 | 50.58 | 56.72 |
|            | Gemini-1.5-Pro         | 28.83 | 29.00 | 28.89 | 39.47 | 84.96 | 25.20 | 22.83 | 75.50 | 29.08 | 40.42 |
|            | DeepSeek-V3            | 57.67 | 58.00 | 57.78 | 58.71 | 93.65 | 51.00 | 46.13 | 79.37 | 58.17 | 62.28 |
|            | Qwen2-VL-7B-Instruct   | 51.67 | 51.83 | 51.72 | 53.23 | 91.14 | 45.97 | 41.53 | 74.97 | 52.25 | 57.15 |
|            | Qwen2-VL-72B-Instruct  | 40.83 | 41.00 | 40.89 | 46.80 | 88.20 | 36.17 | 32.73 | 73.73 | 41.58 | 49.10 |
|            | InternVL-2.5-8B        | 37.25 | 37.33 | 37.28 | 42.09 | 86.57 | 32.43 | 29.20 | 72.10 | 37.42 | 45.74 |
|            | InternVL-2.5-78B       | 43.25 | 43.50 | 43.33 | 47.52 | 88.58 | 37.53 | 34.20 | 76.20 | 43.42 | 50.84 |
|            | Llama-3.1-8B-Instruct  | 24.07 | 25.50 | 24.46 | 26.50 | 80.51 | 21.97 | 20.47 | 59.40 | 25.92 | 34.31 |
|            | Llama-3.3-70B-Instruct | 53.58 | 53.83 | 53.67 | 56.50 | 92.42 | 46.97 | 42.43 | 78.47 | 54.25 | 59.12 |
| MLLM-Based | GPT-4o                 | 83.50 | 84.00 | 83.67 | 54.84 | 93.32 | 74.67 | 68.13 | 81.50 | 84.33 | 78.66 |
|            | GPT-4o-mini            | 64.61 | 86.83 | 71.27 | 47.62 | 92.48 | 74.60 | 69.60 | 74.27 | 67.75 | 72.11 |
|            | Claude-3.5-Sonnet      | 93.83 | 96.17 | 94.61 | 40.00 | 91.73 | 86.07 | 79.03 | 82.20 | 95.67 | 84.37 |
|            | Gemini-1.5-Pro         | 94.11 | 96.17 | 94.78 | 50.84 | 91.56 | 83.67 | 75.40 | 78.80 | 95.14 | 84.50 |
|            | Qwen2-VL-7B-Instruct   | 22.92 | 34.67 | 25.90 | 35.14 | 83.90 | 29.07 | 26.90 | 57.40 | 27.36 | 38.14 |
|            | Qwen2-VL-72B-Instruct  | 60.92 | 65.17 | 62.19 | 49.95 | 92.34 | 57.53 | 53.20 | 78.37 | 62.62 | 64.70 |
|            | InternVL-2.5-8B        | 44.71 | 68.17 | 51.33 | 41.24 | 89.07 | 59.07 | 55.53 | 67.10 | 56.34 | 59.17 |
|            | InternVL-2.5-78B       | 77.15 | 82.17 | 78.75 | 44.01 | 91.63 | 72.87 | 66.67 | 80.13 | 80.71 | 74.90 |
| LLM-Based  | GPT-4o                 | 73.75 | 73.83 | 73.78 | 52.80 | 93.02 | 66.13 | 60.03 | 82.70 | 74.42 | 72.27 |
|            | GPT-4o-mini            | 61.39 | 91.33 | 70.54 | 42.85 | 91.80 | 78.90 | 72.63 | 76.80 | 63.03 | 72.14 |
|            | Claude-3.5-Sonnet      | 91.53 | 94.83 | 92.61 | 44.24 | 92.58 | 84.60 | 77.57 | 82.37 | 92.11 | 83.60 |
|            | Gemini-1.5-Pro         | 96.08 | 96.67 | 96.28 | 53.93 | 92.45 | 84.73 | 77.40 | 80.20 | 96.42 | 86.02 |
|            | DeepSeek-V3            | 93.81 | 96.83 | 94.78 | 43.64 | 92.48 | 86.43 | 79.23 | 82.10 | 94.75 | 84.89 |
|            | Llama-3.1-8B-Instruct  | 32.75 | 40.50 | 34.87 | 32.51 | 82.06 | 37.87 | 35.70 | 54.77 | 36.14 | 43.02 |
|            | Llama-3.3-70B-Instruct | 86.58 | 96.00 | 89.09 | 44.83 | 92.87 | 81.93 | 75.33 | 78.90 | 88.15 | 81.52 |
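
Across the rows above, the **Avg.** column is consistent with an unweighted arithmetic mean of the nine preceding metrics. A quick sanity check on the Rule-Based GPT-4o row:

```python
# Assumption: Avg. is the plain mean of Prec., Rec., F1, R.L., B.S.,
# Rel., Eff., Comp., and Pos.; verified here against one row of the table.
row = [49.50, 49.67, 49.56, 56.23, 92.27, 43.67, 39.50, 77.00, 50.08]
print(round(sum(row) / len(row), 2))  # 56.39, matching the reported Avg.
```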

### Comprehensive performance results on Wiki-MQA+ (Web Dataset).

| Framework  | Model                  | Prec. | Rec.  | F1    | R.L.  | B.S.  | Rel.  | Eff.  | Comp. | Pos.  | Avg.  |
|------------|------------------------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|
| Rule-Based | GPT-4o                 | 53.00 | 53.00 | 53.00 | 54.62 | 95.15 | 46.60 | 42.56 | 82.24 | 53.00 | 59.24 |
|            | GPT-4o-mini            | 49.60 | 49.60 | 49.60 | 53.39 | 94.87 | 42.52 | 39.12 | 82.04 | 49.60 | 56.70 |
|            | Claude-3.5-Sonnet      | 37.80 | 37.80 | 37.80 | 49.32 | 94.06 | 32.60 | 30.00 | 82.88 | 37.80 | 48.90 |
|            | Gemini-1.5-Pro         | 41.20 | 41.20 | 41.20 | 47.18 | 92.46 | 35.76 | 32.64 | 80.44 | 41.20 | 50.36 |
|            | DeepSeek-V3            | 56.20 | 56.40 | 56.27 | 53.33 | 95.28 | 49.36 | 44.80 | 83.00 | 56.20 | 61.20 |
|            | Qwen2-VL-7B-Instruct   | 53.50 | 53.60 | 53.53 | 48.15 | 93.12 | 46.04 | 41.60 | 76.08 | 53.50 | 57.68 |
|            | Qwen2-VL-72B-Instruct  | 51.50 | 51.60 | 51.53 | 48.08 | 92.81 | 44.76 | 40.76 | 77.72 | 51.50 | 56.70 |
|            | InternVL-2.5-8B        | 50.00 | 50.20 | 50.07 | 48.06 | 93.32 | 43.64 | 40.08 | 78.20 | 50.20 | 55.97 |
|            | InternVL-2.5-78B       | 54.00 | 54.20 | 54.07 | 51.42 | 94.61 | 46.40 | 42.60 | 81.44 | 54.10 | 59.20 |
|            | Llama-3.1-8B-Instruct  | 21.60 | 21.80 | 21.67 | 27.74 | 84.65 | 18.76 | 17.28 | 59.96 | 22.20 | 32.85 |
|            | Llama-3.3-70B-Instruct | 53.70 | 53.80 | 53.73 | 53.02 | 94.91 | 46.76 | 43.00 | 80.80 | 53.70 | 59.27 |
| MLLM-Based | GPT-4o                 | 71.30 | 71.60 | 71.40 | 53.34 | 95.70 | 63.32 | 58.28 | 83.32 | 71.40 | 71.07 |
|            | GPT-4o-mini            | 49.83 | 81.40 | 58.56 | 49.99 | 95.51 | 70.36 | 64.60 | 74.00 | 51.32 | 66.17 |
|            | Claude-3.5-Sonnet      | 91.90 | 94.20 | 92.67 | 44.42 | 94.41 | 83.68 | 76.00 | 82.36 | 92.50 | 83.57 |
|            | Gemini-1.5-Pro         | 92.10 | 93.80 | 92.67 | 50.05 | 94.34 | 82.08 | 74.60 | 79.76 | 92.20 | 83.51 |
|            | Qwen2-VL-7B-Instruct   | 24.22 | 31.60 | 26.28 | 32.02 | 87.45 | 26.76 | 25.20 | 56.24 | 26.63 | 37.38 |
|            | Qwen2-VL-72B-Instruct  | 53.64 | 59.60 | 55.29 | 47.93 | 94.63 | 54.28 | 49.32 | 79.92 | 54.19 | 60.98 |
|            | InternVL-2.5-8B        | 46.92 | 72.40 | 53.97 | 44.69 | 93.12 | 61.76 | 57.64 | 71.08 | 53.79 | 61.71 |
|            | InternVL-2.5-78B       | 67.12 | 72.40 | 68.66 | 45.43 | 94.85 | 64.32 | 58.96 | 81.24 | 69.33 | 69.15 |
| LLM-Based  | GPT-4o                 | 81.40 | 81.60 | 81.47 | 51.53 | 95.66 | 72.28 | 65.76 | 83.72 | 81.40 | 77.20 |
|            | GPT-4o-mini            | 44.47 | 86.80 | 56.05 | 47.98 | 95.20 | 72.40 | 67.04 | 73.68 | 45.02 | 65.40 |
|            | Claude-3.5-Sonnet      | 93.70 | 94.80 | 94.07 | 45.78 | 94.63 | 83.68 | 76.60 | 82.48 | 93.80 | 84.39 |
|            | Gemini-1.5-Pro         | 95.90 | 96.00 | 95.93 | 50.92 | 94.84 | 81.76 | 75.20 | 79.72 | 96.10 | 85.15 |
|            | DeepSeek-V3            | 90.03 | 95.80 | 91.90 | 45.71 | 95.18 | 84.40 | 77.00 | 82.16 | 90.13 | 83.59 |
|            | Llama-3.1-8B-Instruct  | 23.50 | 28.00 | 24.79 | 35.66 | 85.16 | 23.04 | 21.68 | 51.16 | 23.90 | 35.21 |
|            | Llama-3.3-70B-Instruct | 70.61 | 94.40 | 76.35 | 47.86 | 95.47 | 78.16 | 71.84 | 76.96 | 71.46 | 75.90 |

### Comprehensive performance results on Web-MQA+ (Web Dataset).

| Framework  | Model                  | Prec. | Rec.  | F1    | R.L.  | B.S.  | Rel.  | Eff.  | Comp. | Pos.  | Avg.  |
|------------|------------------------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|