nielsr (HF Staff) committed commit 7eea8b5 · verified · 1 parent(s): 3b928f5

Update dataset card metadata, paper link and usage


This PR improves the dataset card by:
1. Adding `task_categories: [other]` to the YAML metadata.
2. Fixing the broken paper link in the Markdown content.
3. Updating the sample usage command to match the instructions provided in the official GitHub repository.
4. Correcting a typo in the license reference (from `44.0` to `4.0`).
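
For reference, the card's complete YAML front matter after this PR (taken directly from the diff below) is:

```yaml
---
language:
- en
license: cc-by-nc-sa-4.0
task_categories:
- other
---
```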

Files changed (1)
  1. README.md +11 -7
README.md CHANGED

@@ -1,18 +1,20 @@
 ---
-license: cc-by-nc-sa-4.0
 language:
 - en
+license: cc-by-nc-sa-4.0
+task_categories:
+- other
 ---
 
 # OWM Benchmark
 
 [![Project Page](https://img.shields.io/badge/Project-Page-blue)](https://compvis.github.io/myriad)
-[![Paper](https://img.shields.io/badge/arXiv-paper-b31b1b)](https://arxiv.org/abs/2604.09527)
+[![Paper](https://img.shields.io/badge/arXiv-paper-b31b1b)](https://huggingface.co/papers/2604.09527)
 [![MYRIAD Weights](https://img.shields.io/badge/HuggingFace-Weights-orange)](https://huggingface.co/CompVis/myriad)
 
 ## Abstract
 
-The OWM benchmark was proposed in the paper [Envisioning the Future, One Step at a Time](_blank) and used to evaluate the [MYRIAD](https://huggingface.co/CompVis/myriad/) model.
+The OWM benchmark was proposed in the paper [Envisioning the Future, One Step at a Time](https://huggingface.co/papers/2604.09527) and used to evaluate the [MYRIAD](https://huggingface.co/CompVis/myriad/) model.
 
 OWM is a benchmark of 95 curated videos with motion annotations, with the distribution of motion constrained to enable the evaluation of probabilistic motion prediction methods.
 Videos are obtained from Pexels ([Pexels License](https://www.pexels.com/license/)). We manually annotate relevant objects and the type of motion observed. We use an off-the-shelf tracker to obtain motion trajectories and manually verify correctness.
@@ -24,21 +26,23 @@ Videos are obtained from Pexels ([Pexels License](https://www.pexels.com/license
 
 ![OWM samples](https://compvis.github.io/myriad/static/images/paper-svg/owm-qualitative.svg)
 
-*OWM samples are include complex real-world scenes with different motion types and complexities.*
+*OWM samples include complex real-world scenes with different motion types and complexities.*
 
 ## Usage
 
 We provide code to run the OWM evaluation in our [GitHub repository](https://github.com/CompVis/flow-poke-transformer).
 
-To run the evaluation, first download the data by running `hf download CompVis/owm-95 --repo-type dataset` then run the evaluation script via
+To run the evaluation, first download the data by running `hf download CompVis/owm-95 --repo-type dataset`.
+
+Then run the evaluation script via:
 ```shell
-python -m scripts.eval.myriad_eval.openset_prediction --checkpoint-path path/to/checkpoint -data-path path/to/data
+python -m scripts.myriad_eval.openset_prediction --data-root path/to/data --ckpt-path path/to/checkpoint --dataset-name owm
 ```
 
 ## License
 
 - Videos are sourced from Pexels and thus licensed under the [Pexels License](https://www.pexels.com/license/)
-- Metadata and motion annotations are provided under the [CC-BY-NC-SA-44.0](https://creativecommons.org/licenses/by-nc-sa/4.0/deed.en) license
+- Metadata and motion annotations are provided under the [CC-BY-NC-SA-4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/deed.en) license
 
 ## Citation
 
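
Since the card states OWM contains exactly 95 curated videos, a local sanity check after `hf download` can be sketched as below. This is a minimal sketch only: `count_videos`, the extension list, and the directory layout are assumptions for illustration, not part of the official evaluation tooling.

```python
from pathlib import Path

# Hypothetical helper: count video files under the downloaded dataset
# directory. The accepted extensions and the nested layout are assumptions;
# adjust them to the actual contents of CompVis/owm-95.
def count_videos(data_root: str, exts: tuple[str, ...] = (".mp4", ".webm", ".mov")) -> int:
    root = Path(data_root)
    # rglob walks all subdirectories; suffix comparison is case-insensitive.
    return sum(1 for p in root.rglob("*") if p.suffix.lower() in exts)

# For the OWM benchmark one would expect:
# count_videos("path/to/data") == 95
```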