Update README.md

README.md CHANGED

@@ -199,21 +199,21 @@ Please refer to the evaluation guidelines in our [github repo](https://github.co

| Model | Avg.(%) | Type |
|----------------------------|---------|-------------|
| 🥇 Gemini 2.5 Flash | 38.81 | Proprietary |
| 🥈 Gemini 2.5 Flash (Thinking) | 38.21 | Proprietary |
| 🥉 O3 | 37.61 | Proprietary |
| Doubao-1.5-thinking | 37.05 | Proprietary |
| InternVL3-78B | 35.52 | Open-Source |
| GPT-5 | 35.22 | Proprietary |
| Gemini 3 Pro | 35.22 | Proprietary |
| O4-mini | 34.33 | Proprietary |
| QwenVL2.5-72B | 34.33 | Open-Source |
| Seed-1.6-vision | 33.04 | Proprietary |
| Claude-haiku-4.5 | 32.84 | Proprietary |
| InternVL2.5-38B | 31.94 | Open-Source |
| InternVL3-8B | 31.94 | Open-Source |
| GPT-4o | 31.94 | Proprietary |
| QwenVL3-30B (Thinking) | 31.64 | Open-Source |
| QwenVL2.5-32B | 31.04 | Open-Source |
| LLaVA-Video-72B | 31.04 | Open-Source |
| InternVL3-38B | 30.45 | Open-Source |

@@ -225,6 +225,7 @@ Please refer to the evaluation guidelines in our [github repo](https://github.co

| InternVideo2.5-8B | 27.76 | Open-Source |
| LLaVA-Video-7B | 27.16 | Open-Source |

</details>

*Note: For the three sub-benchmarks, we take the higher score of each model across the two settings for easier presentation.*
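The aggregation rule in the note — report, for each model, the higher of its two evaluation-setting scores — can be sketched as follows. The model names and numbers here are hypothetical placeholders, not values from the leaderboard:

```python
# Hedged sketch of the note's aggregation: per model, keep the max score
# across the two evaluation settings. All values below are hypothetical.
scores = {
    "ModelA": (31.2, 34.3),  # (setting-1 score, setting-2 score)
    "ModelB": (38.8, 36.0),
}

# Reported score = higher of the two settings for each model.
reported = {model: max(pair) for model, pair in scores.items()}
print(reported)
```

This mirrors how the table's Avg.(%) column would be populated under the stated rule, assuming each model was evaluated once per setting.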