mateosss committed · Commit 22c148e · verified · 1 parent: 9a4877a

Add header with quick links to README.md

Files changed (1): README.md (+19 −0)
README.md CHANGED

@@ -2,6 +2,8 @@
 license: cc-by-4.0
 ---
 
+<span style="display:block; margin:auto; text-align:center">📄 [**arXiv**](https://arxiv.org/abs/2508.00088) – 🎥 [**video**](https://youtu.be/-lsHpsTI-Q0) – ⬇️ [**get**](#table-of-sequences) – 📊 [**results**](#results) – 💬 [**contact**](#contact)</span>
+
 <img alt="Monado SLAM Datasets cover image"
   src="/datasets/collabora/monado-slam-datasets/resolve/main/M_monado_datasets/extras/cover.png"
   style="width: 720px;margin: auto;">
@@ -53,6 +55,8 @@ For additional documentation see:
 - [**Samsung Odyssey+ documentation**](https://huggingface.co/datasets/collabora/monado-slam-datasets/blob/main/M_monado_datasets/MO_odyssey_plus/README.md)
 - [**Post-processing walkthrough video**](https://www.youtube.com/watch?v=0PX_6PNwrvQ)
 
+### Results
+
 To replicate the results from the paper "The Monado SLAM Dataset for Egocentric
 Visual-Inertial Tracking", you can download all the relevant runs, evaluations,
 and plotting scripts from
@@ -199,6 +203,13 @@ and plotting scripts from
 
 #### Table of sequences
 
+You can preview and download individual sequences from these tables. If you want
+the full dataset, you can use the [hf
+CLI](https://huggingface.co/docs/huggingface_hub/en/guides/cli#download-an-entire-repository)
+or clone the repository with git lfs. Calibration information can be found in
+the `extras` directory of each headset (e.g., for the
+[Index](https://huggingface.co/datasets/collabora/monado-slam-datasets/tree/main/M_monado_datasets/MI_valve_index/extras)).
+
 <table>
 <thead>
 <tr>
@@ -999,6 +1010,14 @@ and plotting scripts from
 
 </details>
 
+## Contact
+
+If you have any additional questions, you can reach out by:
+
+- Opening a [new discussion](https://huggingface.co/datasets/collabora/monado-slam-datasets/discussions) thread on Hugging Face.
+- Joining Monado's Discord server and tagging me (`@mateosss`) in the #SLAM channel for more flexible interaction.
+- Contacting me directly at <mateo.demayo@tum.de>.
+
 ## License
 
 This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.
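
The full-dataset download mentioned in this commit's README changes can be sketched as a shell session. This is only a sketch: the repo id comes from the links in the diff, while the local directory name `msd` and the installed tooling (the Hugging Face CLI from `huggingface_hub`, plus `git-lfs`) are assumptions:

```shell
# Two ways to fetch the entire dataset repository, per the README note.
# Assumes `pip install -U huggingface_hub` and git-lfs are available;
# "msd" is just an example target directory.

# Option 1: Hugging Face CLI, downloads the whole dataset repo
huggingface-cli download collabora/monado-slam-datasets \
  --repo-type dataset --local-dir msd

# Option 2: plain git clone, with git-lfs fetching the large files
git lfs install
git clone https://huggingface.co/datasets/collabora/monado-slam-datasets msd
```

Per-headset calibration would then sit under e.g. `msd/M_monado_datasets/MI_valve_index/extras`, matching the paths linked from the README.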