---
language:
- en
license: cc-by-nc-sa-4.0
size_categories:
- n<1K
task_categories:
- other
pretty_name: MYRIAD-Physics
---
# MYRIAD-Physics
[Project Page](https://compvis.github.io/myriad) | [arXiv](https://arxiv.org/abs/2604.09527) | [Paper on Hugging Face](https://huggingface.co/papers/2604.09527) | [Code](https://github.com/CompVis/flow-poke-transformer) | [Model](https://huggingface.co/CompVis/myriad) | [OWM-95 Dataset](https://huggingface.co/datasets/CompVis/owm-95)
MYRIAD-Physics extends Physics-IQ and Physion with motion annotations and object tracks for evaluating probabilistic future trajectory forecasting under physical interactions. It was presented in the paper [Envisioning the Future, One Step at a Time](https://huggingface.co/papers/2604.09527).
## Abstract
MYRIAD-Physics extends Physics-IQ and Physion with motion annotations and object tracks (following the same approach proposed in [`CompVis/owm-95`](https://huggingface.co/datasets/CompVis/owm-95)) for evaluating probabilistic future trajectory forecasting under physical interactions.
Unlike [`CompVis/owm-95`](https://huggingface.co/datasets/CompVis/owm-95), which distributes videos together with annotations, this repository provides only the additional metadata: annotations and trajectories for videos that must be obtained separately using [this download script](https://github.com/CompVis/flow-poke-transformer/blob/main/scripts/myriad_eval/download_datasets.sh).
We manually annotate the relevant objects and the type of motion observed in each video, and we use an off-the-shelf point tracker to obtain motion trajectories, manually verifying their correctness.
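To illustrate how object tracks of this kind are typically consumed, the sketch below computes a simple endpoint-error metric between a predicted and an annotated trajectory. The `(T, N, 2)` array layout and the metric itself are illustrative assumptions, not the repository's actual file format or official evaluation.

```python
import numpy as np


def endpoint_error(pred: np.ndarray, gt: np.ndarray) -> float:
    """Mean Euclidean distance between predicted and ground-truth track
    endpoints. Both arrays are assumed to have shape (T, N, 2):
    T timesteps, N tracked points, (x, y) pixel coordinates.
    NOTE: illustrative only -- not the benchmark's official metric."""
    assert pred.shape == gt.shape and pred.shape[-1] == 2
    # Compare only the final frame of each track.
    diff = pred[-1] - gt[-1]                 # shape (N, 2)
    return float(np.linalg.norm(diff, axis=-1).mean())


# Toy example: 5 frames, 3 tracked points, constant (3, 4) pixel offset.
gt = np.zeros((5, 3, 2))
pred = gt + np.array([3.0, 4.0])
print(endpoint_error(pred, gt))              # -> 5.0
```

The actual evaluation script in the GitHub repository should be treated as the reference implementation; this snippet only conveys the general shape of trajectory-based comparison.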
## Project Page and Code
- Project Page: https://compvis.github.io/myriad
- GitHub Repository: https://github.com/CompVis/flow-poke-transformer
## Usage
We provide code and instructions to download the dataset and run the MYRIAD evaluation in our [GitHub repository](https://github.com/CompVis/flow-poke-transformer).
To run the benchmark evaluation for this dataset, you can use the following command:
```shell
python -m scripts.myriad_eval.openset_prediction --data-root path/to/data --ckpt-path path/to/checkpoint --dataset-name [owm | physion | physics-iq]
```
## Citation
If you find our data or code useful, please cite our paper:
```bibtex
@inproceedings{baumann2026envisioning,
  title={Envisioning the Future, One Step at a Time},
  author={Baumann, Stefan Andreas and Wiese, Jannik and Martorella, Tommaso and Kalayeh, Mahdi M. and Ommer, Bj{\"o}rn},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2026}
}
```