Improve dataset card: Add metadata, paper, and project page links

#1
by nielsr HF Staff - opened
Files changed (1)
  1. README.md +36 -2
README.md CHANGED
@@ -1,6 +1,40 @@
 ---
 license: mit
+task_categories:
+- image-to-image
+library_name:
+- datasets
+tags:
+- point-tracking
+- optical-flow
+- video
+- dense-correspondence
 ---
-This repo contains the data we produced/postprocessed as part of **AllTracker: Efficient Dense Point Tracking at High Resolution**.
-
-This data is used by the training scripts in our [github repo](https://github.com/aharley/alltracker/), and leads to the models in our [model page](https://huggingface.co/aharley/alltracker), which you can test in our [Gradio demo](https://huggingface.co/spaces/aharley/alltracker).
+
+# AllTracker: Efficient Dense Point Tracking Dataset
+
+This repository contains the data produced/postprocessed as part of [**AllTracker: Efficient Dense Point Tracking at High Resolution**](https://huggingface.co/papers/2506.07310).
+
+AllTracker is a model that estimates long-range point tracks by estimating the flow field between a query frame and every other frame of a video. This dataset supports the training and evaluation of such models, providing high-resolution and dense correspondence fields.
+
+**Project Page:** [https://alltracker.github.io](https://alltracker.github.io)
+**GitHub Repository (Code):** [https://github.com/aharley/alltracker/](https://github.com/aharley/alltracker/)
+**Hugging Face Model Page:** [https://huggingface.co/aharley/alltracker](https://huggingface.co/aharley/alltracker)
+**Gradio Demo:** [https://huggingface.co/spaces/aharley/alltracker](https://huggingface.co/spaces/aharley/alltracker)
+
+## Dataset Usage and Preparation
+
+This data is used by the training scripts in the associated [GitHub repository](https://github.com/aharley/alltracker/). For detailed instructions on how to download, prepare, and use this dataset for training, please refer to the [**"Data prep" section in the GitHub repository's README**](https://github.com/aharley/alltracker/#data-prep).
+
+## Citation
+
+If you use this dataset or the associated code for your research, please cite the paper:
+
+```bibtex
+@inproceedings{harley2025alltracker,
+author = {Adam W. Harley and Yang You and Xinglong Sun and Yang Zheng and Nikhil Raghuraman and Yunqi Gu and Sheldon Liang and Wen-Hsuan Chu and Achal Dave and Pavel Tokmakov and Suya You and Rares Ambrus and Katerina Fragkiadaki and Leonidas J. Guibas},
+title = {All{T}racker: {E}fficient Dense Point Tracking at High Resolution},
+booktitle = {ICCV},
+year = {2025}
+}
+```