## Results
In this section, we present the full experimental results. The metrics **Prec.**, **Rec.**, **F1**, **R.L.**, **B.S.**, **Rel.**, **Eff.**, **Comp.**, **Pos.**, and **Avg.** denote image precision, image recall, image F1 score, ROUGE-L, BERTScore, image relevance, image effectiveness, comprehensive score, image position score, and average score, respectively. The metric **Ord.** denotes the image ordering score.
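As a rough illustration of the image-level metrics, the sketch below computes image precision, recall, and F1 from the predicted and ground-truth image lists. This is an assumed set-based definition for illustration only, not the benchmark's official scoring script, which may handle duplicates or partial matches differently.

```python
# Illustrative sketch only: assumes set-based matching over image identifiers;
# the benchmark's official scoring code may differ (e.g., duplicate handling).
def image_prf(predicted: list[str], reference: list[str]) -> tuple[float, float, float]:
    """Compute image precision, recall, and F1 over image identifier sets."""
    pred, ref = set(predicted), set(reference)
    if not pred or not ref:
        return 0.0, 0.0, 0.0
    hits = len(pred & ref)            # correctly inserted images
    precision = hits / len(pred)      # fraction of inserted images that are correct
    recall = hits / len(ref)          # fraction of ground-truth images recovered
    f1 = 2 * precision * recall / (precision + recall) if hits else 0.0
    return precision, recall, f1

# Example: two of the three inserted images match the four ground-truth images.
print(image_prf(["img_1", "img_2", "img_5"], ["img_1", "img_2", "img_3", "img_4"]))
# (0.666..., 0.5, 0.571...)
```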
### Comprehensive performance results on Wit (Web Dataset)

### Comprehensive performance results on Manual-MQA+ (Lifestyle Dataset)
| Framework | Model | Prec. | Rec. | F1 | R.L. | B.S. | Ord. | Rel. | Eff. | Comp. | Pos. | Avg. |
|------------|------------------------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|
| Rule-Based | GPT-4o | 36.45 | 47.97 | 38.32 | 50.82 | 91.51 | 32.10 | 75.79 | 73.44 | 79.08 | 71.66 | 59.71 |
| | GPT-4o-mini | 37.29 | 47.18 | 38.22 | 50.40 | 91.05 | 32.83 | 73.28 | 71.79 | 78.87 | 69.70 | 59.06 |
| | Claude-3.5-Sonnet | 39.27 | 50.38 | 41.13 | 48.17 | 91.69 | 32.82 | 73.18 | 71.23 | 77.44 | 71.43 | 59.67 |
| | Gemini-1.5-Pro | 40.54 | 45.17 | 39.84 | 46.69 | 90.40 | 33.01 | 73.13 | 70.67 | 76.56 | 73.51 | 58.95 |
| | DeepSeek-V3 | 32.75 | 48.83 | 36.28 | 51.57 | 92.05 | 31.29 | 76.92 | 75.08 | 79.23 | 69.42 | 59.34 |
| | Qwen2-VL-7B-Instruct | 33.18 | 43.48 | 34.32 | 46.68 | 89.32 | 27.61 | 71.79 | 69.90 | 75.54 | 71.14 | 56.30 |
| | Qwen2-VL-72B-Instruct | 35.58 | 44.42 | 35.38 | 46.30 | 89.73 | 28.86 | 70.72 | 68.31 | 74.82 | 69.46 | 56.36 |
| | InternVL-2.5-8B | 29.53 | 45.93 | 32.06 | 42.30 | 89.64 | 24.17 | 72.10 | 69.23 | 74.15 | 70.72 | 54.98 |
| | InternVL-2.5-78B | 32.96 | 48.63 | 36.00 | 48.26 | 91.10 | 29.72 | 75.74 | 73.44 | 78.10 | 71.66 | 58.56 |
| | Llama-3.1-8B-Instruct | 32.07 | 27.50 | 26.58 | 30.90 | 82.93 | 15.10 | 50.87 | 49.44 | 62.00 | 55.84 | 43.32 |
| | Llama-3.3-70B-Instruct | 34.53 | 44.35 | 35.60 | 49.50 | 91.22 | 30.26 | 73.13 | 71.03 | 75.74 | 69.26 | 57.46 |
| MLLM-Based | GPT-4o | 35.07 | 33.78 | 32.44 | 44.68 | 91.16 | 24.50 | 75.49 | 73.28 | 79.59 | 73.38 | 56.34 |
| | GPT-4o-mini | 23.43 | 32.24 | 25.16 | 43.60 | 91.05 | 17.33 | 72.92 | 71.13 | 75.23 | 62.22 | 51.43 |
| | Claude-3.5-Sonnet | 25.17 | 39.24 | 28.47 | 40.32 | 91.02 | 19.94 | 80.51 | 78.10 | 80.41 | 75.12 | 55.83 |
| | Gemini-1.5-Pro | 36.01 | 44.68 | 37.14 | 48.87 | 90.99 | 28.76 | 76.62 | 74.62 | 79.79 | 66.32 | 58.38 |
| | Qwen2-VL-7B-Instruct | 13.32 | 15.05 | 13.48 | 41.07 | 86.02 | 3.09 | 13.38 | 12.82 | 57.74 | 10.46 | 26.65 |
| | Qwen2-VL-72B-Instruct | 22.13 | 24.92 | 21.62 | 44.36 | 90.34 | 12.95 | 49.08 | 47.13 | 73.44 | 41.23 | 42.72 |
| | InternVL-2.5-8B | 17.23 | 26.63 | 18.65 | 39.71 | 89.33 | 9.34 | 47.38 | 46.26 | 71.23 | 39.90 | 40.57 |
| | InternVL-2.5-78B | 19.70 | 23.19 | 19.37 | 42.90 | 91.01 | 11.36 | 55.95 | 55.18 | 73.28 | 45.90 | 43.78 |
| LLM-Based | GPT-4o | 34.02 | 46.48 | 36.78 | 45.99 | 91.46 | 35.80 | 77.59 | 75.64 | 78.05 | 71.65 | 59.35 |
| | GPT-4o-mini | 36.94 | 31.87 | 32.64 | 45.77 | 91.35 | 25.46 | 55.33 | 54.05 | 81.79 | 55.56 | 51.08 |
| | Claude-3.5-Sonnet | 45.21 | 44.59 | 43.20 | 42.68 | 91.64 | 40.39 | 75.08 | 72.67 | 82.62 | 74.73 | 61.28 |
| | Gemini-1.5-Pro | 46.23 | 49.69 | 45.43 | 50.21 | 91.58 | 39.87 | 76.62 | 74.36 | 80.36 | 73.40 | 62.77 |
| | DeepSeek-V3 | 34.71 | 47.89 | 37.82 | 43.80 | 91.38 | 36.81 | 81.08 | 78.77 | 80.67 | 71.65 | 60.46 |
| | Llama-3.1-8B-Instruct | 12.65 | 13.12 | 12.38 | 22.27 | 76.31 | 3.03 | 10.56 | 10.46 | 35.59 | 10.06 | 20.64 |
| | Llama-3.3-70B-Instruct | 25.74 | 50.15 | 31.26 | 39.80 | 91.31 | 28.03 | 76.72 | 74.36 | 75.95 | 62.56 | 55.59 |
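The **Avg.** column is consistent with an unweighted mean of the ten preceding metrics; this averaging rule is inferred from the numbers rather than stated in the table, and can be checked against the Rule-Based GPT-4o row:

```python
# Inferred rule (not documented): Avg. = unweighted mean of the ten metrics.
# Checked against the Rule-Based GPT-4o row of the Manual-MQA+ table.
scores = [36.45, 47.97, 38.32, 50.82, 91.51, 32.10, 75.79, 73.44, 79.08, 71.66]
print(round(sum(scores) / len(scores), 2))  # 59.71, matching the Avg. column
```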
## Contact