rbler committed on
Commit
292bcb0
·
verified ·
1 Parent(s): f211c7e

Update README.md

Files changed (1):
  1. README.md +12 -11
README.md CHANGED
@@ -199,21 +199,21 @@ Please refer to the evaluation guidelines in our [github repo](https://github.co
 
 | Model | Avg.(%) | Type |
 |----------------------------|---------|-------------|
-| 🥇gemini-2.5-flash | 38.81 | Proprietary |
-| 🥈gemini-2.5-flash(thinking) | 38.21 | Proprietary |
-| 🥉o3 | 37.61 | Proprietary |
-| doubao-1-5-thinking | 37.05 | Proprietary |
+| 🥇Gemini 2.5 Flash | 38.81 | Proprietary |
+| 🥈Gemini 2.5 Flash (Thinking) | 38.21 | Proprietary |
+| 🥉O3 | 37.61 | Proprietary |
+| Doubao-1.5-thinking | 37.05 | Proprietary |
 | InternVL3-78B | 35.52 | Open-Source |
-| gpt-5 | 35.22 | Proprietary |
-| gemini-3-pro | 35.22 | Proprietary |
-| o4-mini | 34.33 | Proprietary |
+| GPT-5 | 35.22 | Proprietary |
+| Gemini 3 Pro | 35.22 | Proprietary |
+| O4-mini | 34.33 | Proprietary |
 | QwenVL2.5-72B | 34.33 | Open-Source |
-| seed-1-6-vision | 33.04 | Proprietary |
-| claude-haiku-4-5 | 32.84 | Proprietary |
+| Seed-1.6-vision | 33.04 | Proprietary |
+| Claude-haiku-4.5 | 32.84 | Proprietary |
 | InternVL2.5-38B | 31.94 | Open-Source |
 | InternVL3-8B | 31.94 | Open-Source |
-| gpt-4o | 31.94 | Proprietary |
-| QwenVL3-30B(Thinking) | 31.64 | Open-Source |
+| GPT-4o | 31.94 | Proprietary |
+| QwenVL3-30B (Thinking) | 31.64 | Open-Source |
 | QwenVL2.5-32B | 31.04 | Open-Source |
 | LLaVA-Video-72B | 31.04 | Open-Source |
 | InternVL3-38B | 30.45 | Open-Source |
@@ -225,6 +225,7 @@ Please refer to the evaluation guidelines in our [github repo](https://github.co
 | InternVideo2.5-8B | 27.76 | Open-Source |
 | LLaVA-Video-7B | 27.16 | Open-Source |
 
+
 </details>
 
 *Note: For the three sub-benchmarks, we take the higher score of each model across the two settings for easier presentation.*