
Camera Motion Dataset and Benchmark

This repository packages the dataset and benchmark release for our paper "Geometry-Guided Camera Motion Understanding in VideoLLMs" as a Hugging Face dataset repo. It is derived from the KlingTeam/MultiCamVideo-Dataset.

It contains:

  • WebDataset tar shards for train, val, and test
  • Split manifests (train.json, val.json, test.json) that map source clips to sample ids
  • A multiple-choice VQA benchmark (cam-motion.jsonl)
  • Small example scripts under examples/

Files

Path                         Description
train-*.tar                  Training split WebDataset shards
val-*.tar                    Validation split WebDataset shards
test-*.tar                   Test split WebDataset shards
train.json                   Training manifest with video_path, clip_index, label, and id
val.json                     Validation manifest with the same schema
test.json                    Test manifest with the same schema
cam-motion.jsonl             Multiple-choice benchmark for camera-motion understanding
examples/load_webdataset.py  Minimal WebDataset loader example
examples/inspect_tar.py      Prints the internal structure of one tar shard
examples/read_benchmark.py   Reads benchmark rows and maps them back to manifests

Dataset Format

Each WebDataset sample stores 8 RGB frames resized to 560 x 560 and a JSON metadata record:

<sample_id>.jpg-00
<sample_id>.jpg-01
...
<sample_id>.jpg-07
<sample_id>.json
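This layout can be sanity-checked with nothing but the standard library's tarfile module. The snippet below builds a tiny in-memory shard with the same member naming (the sample id is the illustrative one used later in this card, and the frame payloads are placeholder bytes, not real JPEGs):

```python
import io
import json
import tarfile

# Build a tiny in-memory shard that mimics the documented member layout.
# The sample id below is the illustrative one used later in this card.
sample_id = "00ae5c50b2b54704ad833996c20c055d"
suffixes = [f"jpg-{i:02d}" for i in range(8)] + ["json"]

buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tf:
    for suffix in suffixes:
        if suffix == "json":
            payload = json.dumps(
                {"label": "dolly in, truck right", "id": sample_id}
            ).encode()
        else:
            payload = b"\xff\xd8"  # placeholder bytes standing in for JPEG data
        info = tarfile.TarInfo(name=f"{sample_id}.{suffix}")
        info.size = len(payload)
        tf.addfile(info, io.BytesIO(payload))

# Listing the members recovers the 8-frames-plus-sidecar layout.
buf.seek(0)
with tarfile.open(fileobj=buf) as tf:
    member_names = [m.name for m in tf.getmembers()]
print(member_names)
```

Running examples/inspect_tar.py against a real shard should show the same pattern.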

The JSON sidecar has the form:

{
  "label": "dolly in, truck right",
  "id": "00ae5c50b2b54704ad833996c20c055d"
}

The manifests provide the lookup back to source provenance:

{
  "video_path": "path/to/source/video.mp4",
  "clip_index": 2,
  "label": "dolly in, truck right",
  "id": "00ae5c50b2b54704ad833996c20c055d"
}
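Because shard samples and manifest rows share the id field, a dict keyed by id is enough to join them. A minimal sketch, using a hypothetical single-row manifest with the schema above:

```python
import json

# Hypothetical manifest rows following the schema above; a real manifest
# would be loaded with json.load(open("train.json")).
manifest = [
    {
        "video_path": "path/to/source/video.mp4",
        "clip_index": 2,
        "label": "dolly in, truck right",
        "id": "00ae5c50b2b54704ad833996c20c055d",
    },
]

# Index rows by sample id so shard metadata can be joined back to provenance.
by_id = {row["id"]: row for row in manifest}
row = by_id["00ae5c50b2b54704ad833996c20c055d"]
print(row["video_path"], row["clip_index"])
```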

cam-motion.jsonl contains multiple-choice benchmark rows like:

{
  "video": "path/to/source/video.mp4",
  "clip_index": 2,
  "conversations": [
    {
      "from": "human",
      "value": "<video>\nIdentify the camera motion depicted in the video using standard cinematographic terminology.\nOptions:\n(A) ...\n(B) ...\n(C) ...\n(D) ..."
    },
    {
      "from": "gpt",
      "value": "Answer: (A) ..."
    }
  ]
}
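For scoring, the gold letter can be pulled out of the gpt turn's value string. A minimal sketch, assuming answers always follow the "Answer: (X) ..." format shown above:

```python
import re

# Hypothetical gold answer string following the "Answer: (X) ..." format above.
answer_value = "Answer: (A) pan left"

m = re.match(r"Answer:\s*\(([A-D])\)", answer_value)
letter = m.group(1) if m else None
print(letter)  # -> A
```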

The video field in the benchmark is a provenance string copied from dataset construction. After release, it should be matched against video_path in the manifest files rather than treated as a valid local path.
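That matching can be done with a dict keyed on (video_path, clip_index); the rows below are hypothetical stand-ins for one benchmark line and one manifest entry:

```python
# Hypothetical rows illustrating the join; real data comes from
# cam-motion.jsonl (one json.loads per line) and the split manifests.
bench_row = {"video": "path/to/source/video.mp4", "clip_index": 2}
manifest = [
    {
        "video_path": "path/to/source/video.mp4",
        "clip_index": 2,
        "label": "dolly in, truck right",
        "id": "00ae5c50b2b54704ad833996c20c055d",
    },
]

# Join on (video_path, clip_index) instead of opening "video" as a file.
lookup = {(m["video_path"], m["clip_index"]): m for m in manifest}
match = lookup[(bench_row["video"], bench_row["clip_index"])]
print(match["id"], match["label"])
```

examples/read_benchmark.py performs this mapping across the full benchmark.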

Current Split Layout

  • Manifest rows: train=9819, val=1227, test=1228 (12274 total)
  • Materialized shard samples: match the manifests exactly (0 missing)
  • Benchmark rows: 12274 (one row per unique clip)

All splits are deduplicated and non-overlapping.

The example and validation scripts in this repo show how to inspect these counts locally before publishing.
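A quick version of that check needs only the manifests; the sketch below assumes each manifest is a JSON array of rows sitting in the download directory (the expected numbers come from this card):

```python
import json
from pathlib import Path

# Expected counts from this card.
expected = {"train": 9819, "val": 1227, "test": 1228}

def manifest_counts(root: Path) -> dict:
    # Count rows in each split manifest that exists under root.
    counts = {}
    for split in expected:
        path = root / f"{split}.json"
        if path.exists():
            counts[split] = len(json.loads(path.read_text()))
    return counts

print(manifest_counts(Path(".")), "expected total:", sum(expected.values()))
```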

Usage

Download the dataset repo locally, then point webdataset at the shard files:

import io
import json
from pathlib import Path

from PIL import Image
import webdataset as wds

dataset_root = Path("camera-motion-dataset-and-benchmark")
shards = sorted(str(p) for p in dataset_root.glob("train-*.tar"))

def decode_sample(sample):
    # Each sample carries eight JPEG frames keyed jpg-00 ... jpg-07
    # plus a JSON sidecar holding the label and sample id.
    frames = [
        Image.open(io.BytesIO(sample[f"jpg-{i:02d}"])).convert("RGB")
        for i in range(8)
    ]
    meta = json.loads(sample["json"])
    return {
        "id": meta["id"],
        "label": meta["label"],
        "frames": frames,
    }

dataset = wds.WebDataset(shards, shardshuffle=False).map(decode_sample)
sample = next(iter(dataset))
print(sample["id"], sample["label"], len(sample["frames"]))

To inspect the benchmark:

python examples/read_benchmark.py --dataset-root . --show 3

To inspect one tar shard:

python examples/inspect_tar.py --tar train-000000.tar

Suggested Hugging Face Workflow

  1. Stage a clean dataset repo from the source project using the local helper scripts.
  2. If you rebuilt corrected manifests first, point stage_release.py at that dataset directory with --dataset-dir.
  3. Create a private dataset repo first.
  4. Upload the staged folder.
  5. Inspect the dataset card, files, and example scripts on the Hub.
  6. Make the repo public only after the release audit looks correct.
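Steps 3 and 4 can be scripted with huggingface_hub; a sketch, assuming a logged-in token and a hypothetical repo id (the function is defined but not invoked here, since it performs network calls):

```python
def publish_private(folder: str, repo_id: str) -> str:
    """Sketch of steps 3-4: create a private dataset repo, then upload
    the staged folder. Requires huggingface_hub and a logged-in token."""
    from huggingface_hub import HfApi  # third-party; imported lazily

    api = HfApi()
    api.create_repo(repo_id, repo_type="dataset", private=True, exist_ok=True)
    api.upload_folder(folder_path=folder, repo_id=repo_id, repo_type="dataset")
    return f"https://huggingface.co/datasets/{repo_id}"

# Example (hypothetical repo id; not executed here):
# publish_private("staged/", "your-name/camera-motion-dataset-and-benchmark")
```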

Citation

TBD
