dirtmaxim committed on
Commit cc9e3c1 · verified · 1 Parent(s): 402cf62

Update README.md

Files changed (1):
  1. README.md +10 -11

README.md CHANGED
@@ -42,12 +42,11 @@ configs:
 
 ## Dataset Description
 - **Homepage:** [BaboonLand Site](https://baboonland.xyz/)
-- **Repository:** https://huggingface.co/datasets/imageomics/BaboonLand
-- **Paper:** [https://link.springer.com/article/10.1007/s11263-025-02493-5](https://link.springer.com/article/10.1007/s11263-025-02493-5)
+- **Paper:** [BaboonLand Dataset: Tracking Primates in the Wild and Automating Behaviour Recognition from Drone Videos](https://link.springer.com/article/10.1007/s11263-025-02493-5)
 - **arXiv:** [https://arxiv.org/pdf/2405.17698](https://arxiv.org/pdf/2405.17698)
 
 ### Dataset Summary
-BaboonLand is an aerial drone video dataset of wild olive baboons (*Papio anubis*) collected over 21 consecutive days in Laikipia (Mpala Research Centre), Kenya, following three troops during morning and evening movements to and from sleeping sites. The dataset contains UAV footage across diverse environments (e.g., sleeping tree, river, rock, open savannah, cliff), with up to ~70 individuals per frame, yielding dense multi-object scenes from an overhead viewpoint.
+BaboonLand is an aerial drone video dataset of wild olive baboons (*Papio anubis*) collected over 21 consecutive days in Laikipia ([Mpala Research Centre](https://mpala.org)), Kenya, following three troops during morning and evening movements to and from sleeping sites. The dataset contains UAV footage across diverse environments (e.g., sleeping tree, river, rock, open savannah, cliff), with up to ~70 individuals per frame, yielding dense multi-object scenes from an overhead viewpoint.
 
 The dataset supports three core subtasks: detection, multi-object tracking, and behavior recognition. It includes (1) a detection dataset derived from ~5.3K-resolution frames via multi-scale tiling (≈30K images), (2) ~0.5 hours of dense tracking annotations, and (3) ~20 hours of behavior “mini-scenes” annotated into 12 behavior classes and an additional category for occlusions.
 
@@ -283,13 +282,13 @@ BaboonLand includes task-specific evaluation sets:
 
 ####
 ```
-@misc{hdr_imageomics_institute_2026,
-author = { Isla Duporge and Maksim Kholiavchenko and Roi Harel and Scott Wolf and Daniel Rubenstein and Tanya Berger-Wolf and Margaret Crofoot and Stephen Lee and Julie Barreau and Jenna Kline and Michelle Ramirez and Charles Stewart },
-title = { BaboonLand Dataset: Tracking Primates in the Wild and Automating Behaviour Recognition from Drone Videos },
+@misc{duporge2026baboonland_hf,
+title = {BaboonLand Dataset: Tracking Primates in the Wild and Automating Behaviour Recognition from Drone Videos},
+author = {Isla Duporge and Maksim Kholiavchenko and Roi Harel and Scott Wolf and Daniel Rubenstein and Margaret Crofoot and Tanya Berger-Wolf and Stephen Lee and Julie Barreau and Jenna Kline and Michelle Ramirez and Charles Stewart},
 year = 2026,
-url = { https://huggingface.co/datasets/imageomics/BaboonLand },
-doi = { 10.57967/hf/7470 },
-publisher = { Hugging Face }
+url = {https://huggingface.co/datasets/imageomics/BaboonLand},
+doi = {10.57967/hf/7470},
+publisher = {Hugging Face}
 }
 ```
 
@@ -297,7 +296,7 @@ BaboonLand includes task-specific evaluation sets:
 ```
 @article{duporge2025baboonland,
 title={BaboonLand Dataset: Tracking Primates in the Wild and Automating Behaviour Recognition from Drone Videos},
-author={Duporge, Isla and Kholiavchenko, Maksim and Harel, Roi and Wolf, Scott and Rubenstein, Daniel I and Crofoot, Margaret C and Berger-Wolf, Tanya and Lee, Stephen J and Barreau, Julie and Kline, Jenna and others},
+author={Duporge, Isla and Kholiavchenko, Maksim and Harel, Roi and Wolf, Scott and Rubenstein, Daniel I and Crofoot, Margaret C and Berger-Wolf, Tanya and Lee, Stephen J and Barreau, Julie and Kline, Jenna and Ramirez, Michelle and Stewart, Charles},
 journal={International Journal of Computer Vision},
 pages={1--12},
 year={2025},
@@ -307,4 +306,4 @@ BaboonLand includes task-specific evaluation sets:
 
 ### Contributions / Acknowledgments
 
-This material is based upon work supported by the National Science Foundation under Award No. 2118240 and Award No. 2112606. ID was supported by the National Academy of Sciences Research Associate Program and the United States Army Research Laboratory while conducting this study. ID collected all the UAV data on a Civil Aviation Authority Drone License CAA NQE Approval Number: 0216/1365 in conjunction with authorization from a KCAA operator under a Remote Pilot License. The data was gathered at the Mpala Research Centre in Kenya, in accordance with Research License No. NACOSTI/P/22/18214. The data collection protocol adhered strictly to the guidelines set forth by the Institutional Animal Care and Use Committee under permission No. IACUC 1835F.
+This material is based upon work supported by the US National Science Foundation (NSF) through the [Imageomics Institute](https://imageomics.org) (NSF HDR program) under [Award No. 2118240](https://www.nsf.gov/awardsearch/showAward?AWD_ID=2118240) (“Imageomics: A New Frontier of Biological Information Powered by Knowledge-Guided Machine Learning”) and through the [AI Institute for Intelligent Cyberinfrastructure with Computational Learning in the Environment (ICICLE)](https://icicle.osu.edu/) under [Award No. 2112606](https://www.nsf.gov/awardsearch/showAward?AWD_ID=2112606). ID was supported by the National Academy of Sciences Research Associate Program and the United States Army Research Laboratory while conducting this study. ID collected all the UAV data on a Civil Aviation Authority Drone License CAA NQE Approval Number: 0216/1365 in conjunction with authorization from a KCAA operator under a Remote Pilot License. The data were gathered at the [Mpala Research Centre](https://mpala.org/) in Kenya, in accordance with Research License No. NACOSTI/P/22/18214. The data collection protocol adhered strictly to the guidelines set forth by the Institutional Animal Care and Use Committee under permission No. IACUC 1835F. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
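The Dataset Summary above mentions deriving the detection set from ~5.3K-resolution frames via multi-scale tiling. As a rough illustration of that idea only — the frame dimensions (5312×2988), tile sizes, and 20% overlap below are assumptions, not the parameters used for BaboonLand — tiling a frame at several scales can be sketched as:

```python
# Sketch of multi-scale tiling: split one high-resolution frame into
# overlapping square tiles at several tile sizes. All numeric parameters
# here are illustrative assumptions, not BaboonLand's actual settings.

def tile_grid(length, tile, stride):
    """Start offsets of tiles covering `length`; the last tile is clamped
    so it ends exactly at the edge instead of spilling past it."""
    if length <= tile:
        return [0]
    starts = list(range(0, length - tile + 1, stride))
    if starts[-1] + tile < length:
        starts.append(length - tile)
    return starts

def multiscale_tiles(width, height, tile_sizes, overlap=0.2):
    """Return (x, y, size) boxes for every tile at every scale."""
    boxes = []
    for t in tile_sizes:
        stride = max(1, int(t * (1 - overlap)))
        for y in tile_grid(height, t, stride):
            for x in tile_grid(width, t, stride):
                boxes.append((x, y, t))
    return boxes

# e.g. tiles for one ~5.3K frame at three assumed scales
boxes = multiscale_tiles(5312, 2988, tile_sizes=[640, 1280, 2560])
```

Smaller tiles keep distant baboons at a usable pixel size for the detector, while larger tiles preserve context; overlapping strides avoid cutting an animal at every tile boundary.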