ch-chenyu committed on
Commit 4fc6651 · verified · 1 Parent(s): 0395002

Update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -9,7 +9,7 @@ size_categories:
 <h1>Seeing from Another Perspective: Evaluating Multi-View Understanding in MLLMs</h1>
 
 
-<a href='https://next-gpt.github.io/'><img src='https://img.shields.io/badge/Project-Page-Green'></a>
+<a href='https://danielchyeh.github.io/All-Angles-Bench/'><img src='https://img.shields.io/badge/Project-Page-Green'></a>
 <a href='https://arxiv.org/pdf/xxxx.10727'><img src='https://img.shields.io/badge/Paper-PDF-orange'></a>
 <a href='https://arxiv.org/pdf/xxxx.10727'><img src='https://img.shields.io/badge/Arxiv-Page-purple'></a>
 <a href="https://github.com/Chenyu-Wang567/All-Angles-Bench/tree/main"><img src='https://img.shields.io/badge/Code-Github-red'></a>
@@ -30,6 +30,8 @@ The dataset presents a comprehensive benchmark consisting of over 2,100 human-an
 - **[EgoHumans](https://github.com/rawalkhirodkar/egohumans)** - Egocentric multi-view human activity understanding dataset
 - **[Ego-Exo4D](https://github.com/facebookresearch/Ego4d)** - Large-scale egocentric and exocentric video dataset for multi-person interaction understanding
 
+Thanks for their wonderful data.
+
 
 
 ## Usage