Tasks: Other
Modalities: Video
Languages: English
Size: < 1K
Libraries: Datasets
ragor committed · Commit eb79d7f (verified) · 1 parent: 49a80d1

Update README.md

Files changed (1): README.md (+4 −4)
README.md CHANGED
@@ -4,7 +4,7 @@ language:
 - en
 ---
 
-# OWM-95 Benchmark
+# OWM Benchmark
 
 [![Project Page](https://img.shields.io/badge/Project-Page-blue)](https://compvis.github.io/myriad)
 [![Paper](https://img.shields.io/badge/arXiv-paper-b31b1b)](_blank)
@@ -12,10 +12,10 @@ language:
 
 ## Abstract
 
-The OWM-95 benchmark was proposed in the paper [Envisioning the Future, One Step at a Time](_blank) and used to evaluate the [MYRIAD](https://huggingface.co/CompVis/myriad/) model.
+The OWM benchmark was proposed in the paper [Envisioning the Future, One Step at a Time](_blank) and used to evaluate the [MYRIAD](https://huggingface.co/CompVis/myriad/) model.
 
-OWM-95 is a benchmark of 95 curated videos with motion annotations, where the distribution of motion is constrained to make evaluation of probabilistic motion prediction methods feasible.
-Videos are obtained from Pexels ([Pexels License](https://www.pexels.com/license/)), we manually annotate relevant objects, and the type of motion observed. We use an off-the-shelf tracker to obtain motion trajectories and manually verify correctness.
+OWM is a benchmark of 95 curated videos with motion annotations, with the distribution of motion constrained to enable the evaluation of probabilistic motion prediction methods.
+Videos are obtained from Pexels ([Pexels License](https://www.pexels.com/license/)). We manually annotate relevant objects and the type of motion observed. We use an off-the-shelf tracker to obtain motion trajectories and manually verify correctness.
 
 ## Project Page and Code
 