EliYuan00 committed
Commit 336d1f6 · 1 Parent(s): 5706b44

Update README.md

Files changed (1):
  README.md (+54, -43)

README.md CHANGED
@@ -14,17 +14,20 @@ size_categories:
 - 1K<n<10K
 ---
 
-# Video-MME-v2
-
 <p align="center">
 <img src="assets/logo.png" width="100%" height="100%">
 </p>
 
-<font size=7><div align='center' > [[🍎 Project Page](https://video-mme-v2.netlify.app/)] [[📖 Paper](https://github.com/MME-Benchmarks/Video-MME-v2)] [[🤗 Dataset](https://huggingface.co/datasets/MME-Benchmarks/Video-MME-v2)] [[🏆 Leaderboard](https://video-mme-v2.netlify.app/#leaderboard)] </div></font>
+<p align="center">
+<a href="https://video-mme-v2.netlify.app/">🍎 Project Page</a> |
+<a href="https://github.com/MME-Benchmarks/Video-MME-v2">📖 Paper</a> |
+<a href="https://huggingface.co/datasets/MME-Benchmarks/Video-MME-v2">🤗 Dataset</a> |
+<a href="https://video-mme-v2.netlify.app/#leaderboard">🏆 Leaderboard</a>
+</p>
 
 ---
 
-### 🤗 About This Repo
+# 🤗 About This Repo
 
 This repository contains annotation data for "Video-MME-v2: Towards the Next Stage in Benchmarks
 for Comprehensive Video Understanding". It mainly consists of three parts: `videos/`, `test.parquet`, and `subtitle.zip`.
@@ -36,12 +39,12 @@ for Comprehensive Video Understanding". It mainly consists of three parts: `vide
 - `subtitle.zip` contains **800 JSONL files**, named `{001..800}.jsonl`, each corresponding to a video id. Each file contains word-level entries with timestamps.
 
 <p align="center">
-<img src="assets/teaser.pdf" width="100%" height="100%">
+<img src="assets/teaser.png" width="100%" height="100%">
 </p>
 
 ---
 
-### 👀 Dataset Intro
+# 👀 Dataset Intro
 
 In 2024, our [**Video-MME**](https://video-mme.github.io/) benchmark became a standard evaluation set for frontier models like Gemini and GPT. However, as model capabilities rapidly evolve, scores on existing benchmarks are saturating, yet a clear gap remains between **leaderboard performance and actual user experience**. This indicates that current evaluation paradigms fail to capture true video understanding abilities. To address this, we spent a year redesigning the evaluation system from first principles and now introduce **Video-MME v2**, a progressive and robust benchmark designed to drive the next generation of video understanding models.
 
@@ -74,49 +77,57 @@ In 2024, our [**Video-MME**](https://video-mme.github.io/) benchmark became a st
 
 ---
 
-### 🙌 Example
+# 🙌 Example
+
+<p align="center">
+<a href="https://huggingface.co/datasets/MME-Benchmarks/Video-MME-v2/resolve/main/assets/demo.mp4">
+<img src="assets/demo_cover.png" width="90%" alt="Demo video"/>
+</a>
+</p>
 
-<video controls width="100%" src="assets/demo.mp4"></video>
+<p align="center">
+Click the image above to open the demo video.
+</p>
 
-**Q1:** Does the ball exist underneath any of the shells?
-A. No.
-B. Yes.
+<p>
+<strong>Q1:</strong> Does the ball exist underneath any of the shells?<br>
+A. No.<br>
+B. Yes.<br>
 C. Cannot be determined.
+</p>
 
-**Q2:** Underneath which shell is the ball located at the end?
-A. There is no ball under any shell.
-B. The third shell.
-C. The sixth shell.
-D. The second shell.
-E. The seventh shell.
-F. The fifth shell.
-G. The fourth shell.
+<p>
+<strong>Q2:</strong> Underneath which shell is the ball located at the end?<br>
+A. There is no ball under any shell.<br>
+B. The third shell.<br>
+C. The sixth shell.<br>
+D. The second shell.<br>
+E. The seventh shell.<br>
+F. The fifth shell.<br>
+G. The fourth shell.<br>
 H. The first shell.
+</p>
 
-**Q3:** The host performed a total of two shell swaps (defining a single swap as an instance where all shells return to an approximately straight line). Underneath which shell was the ball located after the first swap?
-A. There is no ball under any shell.
-B. The seventh shell.
-C. The fourth shell.
-D. The fifth shell.
-E. The sixth shell.
-F. The second shell.
-G. The third shell.
+<p>
+<strong>Q3:</strong> The host performed a total of two shell swaps (defining a single swap as an instance where all shells return to an approximately straight line). Underneath which shell was the ball located after the first swap?<br>
+A. There is no ball under any shell.<br>
+B. The seventh shell.<br>
+C. The fourth shell.<br>
+D. The fifth shell.<br>
+E. The sixth shell.<br>
+F. The second shell.<br>
+G. The third shell.<br>
 H. The first shell.
+</p>
 
-**Q4:** The host performed a total of two shell swaps (defining a single swap as an instance where all shells return to an approximately straight line). Underneath which shell was the ball located initially?
-
-A. The seventh shell.
-
-B. The fourth shell.
-
-C. The fifth shell.
-
-D. The third shell.
-
-E. The second shell.
-
-F. There is no ball under any shell.
-
-G. The first shell.
-
+<p>
+<strong>Q4:</strong> The host performed a total of two shell swaps (defining a single swap as an instance where all shells return to an approximately straight line). Underneath which shell was the ball located initially?<br>
+A. The seventh shell.<br>
+B. The fourth shell.<br>
+C. The fifth shell.<br>
+D. The third shell.<br>
+E. The second shell.<br>
+F. There is no ball under any shell.<br>
+G. The first shell.<br>
 H. The sixth shell.
+</p>
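The README above describes `subtitle.zip` as 800 JSONL files named `{001..800}.jsonl`, each holding word-level entries with timestamps. A minimal sketch of consuming one such file, assuming a hypothetical per-line schema with `word`, `start`, and `end` fields (the actual field names are not specified in this diff):

```python
import io
import json

# Hypothetical word-level subtitle entries; the real schema of the
# {001..800}.jsonl files is not given here, so "word"/"start"/"end"
# (seconds) are assumptions for illustration only.
sample_jsonl = "\n".join([
    json.dumps({"word": "the", "start": 0.0, "end": 0.2}),
    json.dumps({"word": "ball", "start": 0.2, "end": 0.5}),
    json.dumps({"word": "moves", "start": 0.5, "end": 0.9}),
])

def words_in_window(jsonl_text: str, t0: float, t1: float) -> str:
    """Join the words whose time span overlaps the [t0, t1] clip window."""
    words = []
    for line in io.StringIO(jsonl_text):
        entry = json.loads(line)
        if entry["end"] >= t0 and entry["start"] <= t1:
            words.append(entry["word"])
    return " ".join(words)

# Subtitle files use zero-padded video ids: 001.jsonl .. 800.jsonl
fname = f"{42:03d}.jsonl"

print(fname)                                   # 042.jsonl
print(words_in_window(sample_jsonl, 0.1, 0.6)) # the ball moves
```

In practice one would read each line from the extracted `.jsonl` file instead of the inline sample; word-level timestamps make it easy to align subtitle text with a sampled frame window like this.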