### Comprehensive performance results on Wit (Web Dataset).
### Comprehensive performance results on Web-MQA+ (Web Dataset).

| Framework | Model | Web-MQA+ | | | | | | | | | |
|------------|------------------------|-----------|-------|-------|-------|-------|-------|-------|-------|-------|-------|
| | | Prec. | Rec. | F1 | R.L. | B.S. | Rel. | Eff. | Comp. | Pos. | Avg. |
| Rule-Based | GPT-4o | 32.47 | 16.93 | 22.11 | 39.17 | 90.56 | 29.47 | 27.81 | 73.87 | 32.80 | 40.58 |
| | GPT-4o-mini | 26.89 | 14.27 | 18.46 | 34.88 | 89.84 | 24.53 | 23.44 | 72.72 | 27.40 | 36.94 |
| | Claude-3.5-Sonnet | 52.27 | 29.20 | 36.89 | 49.74 | 93.69 | 47.12 | 44.75 | 80.27 | 53.07 | 54.11 |
| | Gemini-1.5-Pro | 26.60 | 15.00 | 18.87 | 28.75 | 85.91 | 24.80 | 23.55 | 70.56 | 27.60 | 35.74 |
| | DeepSeek-V3 | 53.27 | 31.00 | 38.40 | 50.29 | 93.71 | 48.83 | 46.13 | 78.96 | 54.53 | 55.01 |
| | Qwen2-VL-7B-Instruct | 16.69 | 8.67 | 11.33 | 33.36 | 90.12 | 15.04 | 14.13 | 64.43 | 16.96 | 30.08 |
| | Qwen2-VL-72B-Instruct | 18.87 | 10.47 | 13.27 | 29.15 | 86.36 | 17.20 | 16.56 | 66.53 | 19.07 | 30.83 |
| | InternVL-2.5-8B | 12.80 | 6.67 | 8.71 | 23.42 | 84.23 | 12.03 | 11.47 | 62.56 | 13.20 | 26.12 |
| | InternVL-2.5-78B | 25.09 | 14.13 | 17.77 | 36.30 | 90.46 | 22.99 | 21.49 | 69.31 | 25.56 | 35.90 |
| | Llama-3.1-8B-Instruct | 25.20 | 15.73 | 18.61 | 24.90 | 83.01 | 25.41 | 23.95 | 56.56 | 28.97 | 33.59 |
| | Llama-3.3-70B-Instruct | 41.80 | 24.00 | 29.93 | 44.60 | 91.86 | 38.13 | 36.11 | 74.77 | 43.13 | 47.15 |
| MLLM-Based | GPT-4o | 89.78 | 83.80 | 85.47 | 52.09 | 95.14 | 94.27 | 90.08 | 91.25 | 93.74 | 86.18 |
| | GPT-4o-mini | 87.71 | 88.60 | 87.82 | 53.13 | 95.66 | 93.49 | 89.44 | 90.03 | 91.49 | 86.37 |
| | Claude-3.5-Sonnet | 88.50 | 91.33 | 89.45 | 50.48 | 94.89 | 95.68 | 92.88 | 93.20 | 92.96 | 87.71 |
| | Gemini-1.5-Pro | 83.51 | 83.73 | 82.91 | 37.06 | 91.10 | 94.05 | 90.05 | 90.43 | 87.01 | 82.21 |
| | Qwen2-VL-7B-Instruct | 30.85 | 31.73 | 29.53 | 37.55 | 90.45 | 36.83 | 34.56 | 67.01 | 34.95 | 43.72 |
| | Qwen2-VL-72B-Instruct | 62.64 | 57.60 | 58.82 | 42.56 | 91.67 | 67.25 | 64.11 | 82.59 | 65.44 | 65.85 |
| | InternVL-2.5-8B | 62.98 | 59.67 | 59.98 | 46.92 | 93.31 | 70.59 | 67.12 | 78.45 | 69.95 | 67.66 |
| | InternVL-2.5-78B | 79.78 | 74.13 | 75.77 | 52.47 | 95.29 | 81.65 | 78.48 | 89.20 | 83.28 | 78.89 |
| LLM-Based | GPT-4o | 86.18 | 78.73 | 81.15 | 54.87 | 95.96 | 86.21 | 82.37 | 89.52 | 87.02 | 82.45 |
| | GPT-4o-mini | 92.86 | 93.40 | 92.95 | 53.50 | 95.82 | 93.20 | 89.28 | 89.95 | 94.59 | 88.39 |
| | Claude-3.5-Sonnet | 92.40 | 92.47 | 92.16 | 54.27 | 95.51 | 94.48 | 91.07 | 91.68 | 94.23 | 88.70 |
| | Gemini-1.5-Pro | 90.16 | 90.13 | 89.82 | 45.64 | 93.38 | 94.13 | 90.16 | 90.75 | 91.38 | 86.17 |
| | DeepSeek-V3 | 94.52 | 94.27 | 94.20 | 56.25 | 96.10 | 94.27 | 90.11 | 90.80 | 95.93 | 89.61 |
| | Llama-3.1-8B-Instruct | 29.34 | 26.27 | 26.31 | 33.70 | 81.16 | 32.08 | 30.48 | 51.81 | 32.38 | 38.17 |
| | Llama-3.3-70B-Instruct | 66.83 | 95.80 | 75.47 | 47.98 | 94.79 | 92.03 | 88.03 | 88.93 | 69.34 | 79.91 |

### Comprehensive performance results on Arxiv-MQA+ (Academic Paper Dataset).

| Framework | Model | Arxiv-MQA+ | | | | | | | | | |
|------------|------------------------|-------------|-------|-------|-------|-------|-------|-------|-------|-------|-------|
| | | Prec. | Rec. | F1 | R.L. | B.S. | Rel. | Eff. | Comp. | Pos. | Avg. |
| Rule-Based | GPT-4o | 55.42 | 63.04 | 57.70 | 44.96 | 94.67 | 69.10 | 67.30 | 84.20 | 75.75 | 68.02 |
| | GPT-4o-mini | 51.71 | 59.29 | 53.80 | 44.21 | 94.36 | 67.50 | 64.80 | 85.20 | 73.75 | 66.07 |
| | Claude-3.5-Sonnet | 55.17 | 62.79 | 57.37 | 42.78 | 94.09 | 69.10 | 66.10 | 84.20 | 75.75 | 67.48 |
| | Gemini-1.5-Pro | 52.43 | 56.29 | 53.10 | 42.18 | 93.85 | 64.20 | 61.80 | 83.20 | 70.28 | 64.15 |
| | DeepSeek-V3 | 56.12 | 67.29 | 59.34 | 45.74 | 94.90 | 74.00 | 70.30 | 84.30 | 78.46 | 70.05 |
| | Qwen2-VL-7B-Instruct | 49.17 | 52.17 | 49.57 | 39.32 | 92.09 | 60.70 | 58.90 | 78.90 | 67.08 | 60.88 |
| | Qwen2-VL-72B-Instruct | 45.42 | 48.71 | 45.68 | 39.86 | 92.39 | 60.30 | 58.10 | 79.60 | 65.42 | 59.50 |
| | InternVL-2.5-8B | 39.20 | 48.29 | 41.51 | 40.36 | 91.42 | 61.60 | 59.20 | 76.70 | 65.17 | 58.16 |
| | InternVL-2.5-78B | 52.21 | 62.00 | 55.28 | 43.66 | 94.51 | 71.00 | 68.70 | 85.40 | 75.38 | 67.57 |
| | Llama-3.1-8B-Instruct | 21.50 | 23.08 | 21.90 | 26.61 | 85.92 | 26.20 | 25.00 | 58.70 | 29.00 | 35.32 |
| | Llama-3.3-70B-Instruct | 53.00 | 58.17 | 53.97 | 44.14 | 94.39 | 65.80 | 63.50 | 83.60 | 73.42 | 65.55 |
| MLLM-Based | GPT-4o | 60.39 | 74.29 | 64.23 | 44.25 | 95.15 | 89.40 | 86.20 | 87.50 | 90.39 | 76.87 |
| | GPT-4o-mini | 36.17 | 74.79 | 46.78 | 42.48 | 95.08 | 83.60 | 80.80 | 83.20 | 74.66 | 68.62 |
| | Claude-3.5-Sonnet | 47.12 | 83.50 | 57.68 | 40.60 | 94.65 | 89.30 | 86.70 | 87.60 | 86.38 | 74.84 |
| | Gemini-1.5-Pro | 58.13 | 80.25 | 64.74 | 41.84 | 94.30 | 85.10 | 82.40 | 85.90 | 83.61 | 75.14 |
| | Qwen2-VL-7B-Instruct | 1.63 | 4.00 | 2.18 | 33.01 | 84.62 | 5.20 | 5.10 | 49.80 | 4.46 | 21.11 |
| | Qwen2-VL-72B-Instruct | 31.99 | 44.87 | 35.22 | 40.54 | 93.53 | 57.90 | 56.60 | 84.20 | 55.16 | 55.56 |
| | InternVL-2.5-8B | 12.22 | 27.87 | 15.78 | 31.99 | 83.72 | 30.40 | 29.50 | 58.10 | 28.49 | 35.34 |
| | InternVL-2.5-78B | 36.62 | 55.00 | 41.77 | 37.99 | 94.47 | 68.10 | 66.20 | 84.80 | 64.11 | 61.01 |
| LLM-Based | GPT-4o | 65.28 | 76.54 | 68.54 | 44.13 | 95.23 | 86.00 | 82.70 | 88.90 | 84.84 | 76.91 |
| | GPT-4o-mini | 37.69 | 83.33 | 49.90 | 41.23 | 95.01 | 85.90 | 82.60 | 84.50 | 69.07 | 69.91 |
| | Claude-3.5-Sonnet | 62.17 | 88.00 | 70.16 | 41.04 | 94.37 | 90.90 | 88.60 | 89.60 | 88.17 | 79.22 |
| | Gemini-1.5-Pro | 59.85 | 78.63 | 65.22 | 42.41 | 94.32 | 84.60 | 82.20 | 87.60 | 80.15 | 75.00 |
| | DeepSeek-V3 | 46.57 | 81.13 | 56.69 | 39.48 | 94.70 | 90.30 | 86.40 | 87.50 | 70.01 | 72.53 |
| | Llama-3.1-8B-Instruct | 1.50 | 2.00 | 1.67 | 25.78 | 80.61 | 3.30 | 3.00 | 43.40 | 4.00 | 18.36 |
| | Llama-3.3-70B-Instruct | 38.78 | 84.88 | 48.56 | 37.83 | 95.01 | 85.50 | 81.80 | 83.40 | 64.59 | 68.93 |

### Comprehensive performance results on Recipe-MQA+ (Lifestyle Dataset).

| Framework | Model | Recipe-MQA+ | | | | | | | | | | |
|------------|------------------------|--------------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|
| | | Prec. | Rec. | F1 | R.L. | B.S. | Ord. | Rel. | Eff. | Comp. | Pos. | Avg. |
| Rule-Based | GPT-4o | 48.79 | 66.11 | 52.76 | 51.80 | 92.10 | 45.30 | 77.80 | 74.64 | 79.19 | 78.04 | 66.65 |
| | GPT-4o-mini | 51.10 | 63.88 | 53.43 | 49.49 | 91.14 | 45.58 | 75.42 | 72.42 | 79.40 | 76.91 | 65.88 |
| | Claude-3.5-Sonnet | 52.15 | 62.95 | 53.48 | 47.13 | 92.08 | 44.94 | 75.36 | 72.53 | 79.84 | 77.21 | 65.77 |
| | Gemini-1.5-Pro | 50.61 | 51.46 | 47.23 | 40.71 | 87.97 | 39.61 | 71.09 | 68.31 | 78.40 | 73.08 | 60.85 |
| | DeepSeek-V3 | 26.13 | 59.00 | 33.36 | 50.51 | 92.48 | 22.96 | 74.58 | 71.92 | 73.36 | 64.49 | 56.88 |
| | Qwen2-VL-7B-Instruct | 45.55 | 63.81 | 48.46 | 50.79 | 91.85 | 41.36 | 77.99 | 74.92 | 78.06 | 78.36 | 65.11 |
| | Qwen2-VL-72B-Instruct | 31.20 | 50.10 | 34.40 | 46.33 | 89.91 | 24.99 | 73.41 | 70.50 | 72.61 | 71.18 | 56.46 |
| | InternVL-2.5-8B | 29.37 | 52.39 | 32.92 | 42.87 | 90.19 | 23.14 | 73.58 | 70.92 | 72.53 | 71.38 | 55.93 |
| | InternVL-2.5-78B | 20.90 | 70.86 | 29.26 | 51.20 | 92.37 | 17.82 | 75.20 | 72.77 | 74.43 | 54.05 | 55.89 |
| | Llama-3.1-8B-Instruct | 27.59 | 37.70 | 25.17 | 25.89 | 81.02 | 18.83 | 64.42 | 61.52 | 65.64 | 61.73 | 46.95 |
| | Llama-3.3-70B-Instruct | 29.56 | 51.38 | 34.55 | 51.56 | 93.19 | 24.57 | 74.31 | 71.50 | 72.64 | 69.57 | 57.28 |
| MLLM-Based | GPT-4o | 45.20 | 46.49 | 42.25 | 45.74 | 92.72 | 33.70 | 77.31 | 74.64 | 81.65 | 78.01 | 61.77 |
| | GPT-4o-mini | 30.31 | 50.26 | 33.86 | 40.16 | 91.81 | 22.67 | 77.97 | 75.49 | 77.52 | 71.23 | 57.13 |
| | Claude-3.5-Sonnet | 30.04 | 54.21 | 35.01 | 34.54 | 90.90 | 22.18 | 80.56 | 78.18 | 79.75 | 74.75 | 58.01 |
| | Gemini-1.5-Pro | 39.01 | 59.50 | 43.50 | 43.43 | 89.89 | 32.49 | 81.94 | 79.22 | 81.64 | 70.42 | 62.10 |
| | Qwen2-VL-7B-Instruct | 9.06 | 15.17 | 9.48 | 34.47 | 84.65 | 4.44 | 18.81 | 18.08 | 55.62 | 17.17 | 26.69 |
| | Qwen2-VL-72B-Instruct | 19.19 | 26.47 | 19.70 | 43.26 | 91.35 | 12.27 | 43.25 | 41.57 | 74.52 | 39.73 | 41.13 |
| | InternVL-2.5-8B | 23.01 | 39.81 | 23.89 | 33.22 | 89.42 | 15.34 | 67.19 | 64.96 | 74.45 | 63.44 | 49.47 |
| | InternVL-2.5-78B | 21.72 | 30.07 | 21.22 | 36.60 | 91.13 | 13.87 | 56.60 | 54.66 | 75.79 | 52.99 | 45.46 |
| LLM-Based | GPT-4o | 49.70 | 65.03 | 51.91 | 44.75 | 92.42 | 43.59 | 82.58 | 79.38 | 81.02 | 81.88 | 67.23 |
| | GPT-4o-mini | 45.59 | 39.32 | 39.61 | 47.56 | 92.91 | 32.04 | 51.78 | 49.82 | 83.47 | 54.86 | 53.70 |
| | Claude-3.5-Sonnet | 62.24 | 67.73 | 61.48 | 38.65 | 91.49 | 53.23 | 81.15 | 78.30 | 84.96 | 83.87 | 70.31 |
| | Gemini-1.5-Pro | 64.87 | 71.43 | 64.43 | 47.01 | 90.70 | 56.89 | 82.39 | 79.16 | 83.55 | 80.69 | 72.11 |
| | DeepSeek-V3 | 47.53 | 70.82 | 51.92 | 39.83 | 91.84 | 40.90 | 84.38 | 81.46 | 82.97 | 77.92 | 66.96 |
| | Llama-3.1-8B-Instruct | 11.56 | 12.69 | 10.89 | 24.61 | 75.21 | 6.70 | 17.71 | 17.04 | 41.86 | 18.32 | 23.66 |
| | Llama-3.3-70B-Instruct | 36.87 | 72.52 | 44.31 | 38.38 | 91.99 | 31.00 | 81.84 | 79.19 | 80.84 | 71.99 | 62.89 |
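The **Avg.** column in these tables appears to be the unweighted mean of a row's individual metric scores (Prec. through Pos.); for example, averaging the nine metric values of the Rule-Based GPT-4o row on Web-MQA+ reproduces its reported 40.58. A minimal sketch of this recomputation (row values copied from the table above):

```python
# Recompute the Avg. column as the unweighted mean of a row's metric scores.
# Values below: Rule-Based GPT-4o on Web-MQA+
# (Prec., Rec., F1, R.L., B.S., Rel., Eff., Comp., Pos.).
row = [32.47, 16.93, 22.11, 39.17, 90.56, 29.47, 27.81, 73.87, 32.80]

avg = round(sum(row) / len(row), 2)
print(avg)  # 40.58, matching the reported Avg.
```

The same rule holds for Recipe-MQA+, where the mean is taken over ten metrics including Ord. (e.g. Rule-Based GPT-4o averages to the reported 66.65).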
### Comprehensive performance results on Manual-MQA+ (Lifestyle Dataset).

| Framework | Model | Manual-MQA+ | | | | | | | | | | |