AGarioud committed (verified)
Commit c2381ab · Parent(s): f83e1b7

Update README.md

Files changed (1): README.md (+87 −3)
README.md CHANGED
@@ -4,7 +4,7 @@ pipeline_tag: image-segmentation
 tags:
 - semantic segmentation
 - pytorch
-- SSL
+- landcover
 library_name: pytorch
 ---
 
@@ -55,7 +55,91 @@ Benchmark results on 4 datasets :
 
 <hr>
 
-## Usage
+## 🚀 Getting Started
+
+First, set up the module with [Poetry](https://python-poetry.org/).
+
+```bash
+# 1. Change directory
+cd MAESTRO
+
+# 2. Install dependencies with Poetry
+poetry install
+```
+
+Then, you can start from the following minimal examples.
+
+Intra-dataset MAESTRO on TreeSatAI-TS:
+```bash
+# pre-train, probe and finetune on TreeSatAI-TS
+poetry run python main.py \
+    model.model=mae model.model_size=medium \
+    opt_pretrain.epochs=100 opt_probe.epochs=10 opt_finetune.epochs=50 \
+    datasets.name_dataset=treesatai_ts \
+    datasets.root_dir=/path/to/dataset/dir datasets.treesatai_ts.rel_dir=TreeSatAI-TS \
+    run.exp_dir=/path/to/experiments/dir run.exp_name=mae-m_treesat
+```
+
+Intra-dataset MAESTRO on PASTIS-HD:
+```bash
+# pre-train, probe and finetune on PASTIS-HD
+poetry run python main.py \
+    model.model=mae model.model_size=medium \
+    opt_pretrain.epochs=100 opt_probe.epochs=10 opt_finetune.epochs=50 \
+    datasets.name_dataset=pastis_hd \
+    datasets.root_dir=/path/to/dataset/dir datasets.pastis_hd.rel_dir=PASTIS-HD \
+    run.exp_dir=/path/to/experiments/dir run.exp_name=mae-m_pastis
+```
+
+Intra-dataset MAESTRO on FLAIR-HUB:
+```bash
+# pre-train, probe and finetune on FLAIR-HUB
+poetry run python main.py \
+    model.model=mae model.model_size=medium \
+    opt_pretrain.epochs=100 opt_probe.epochs=15 opt_finetune.epochs=100 \
+    datasets.name_dataset=flair \
+    datasets.root_dir=/path/to/dataset/dir datasets.flair.rel_dir=FLAIR-HUB \
+    run.exp_dir=/path/to/experiments/dir run.exp_name=mae-m_flair
+```
+
+Cross-dataset MAESTRO from S2-NAIP urban to TreeSatAI-TS:
+```bash
+# pre-train on S2-NAIP urban
+poetry run python main.py \
+    model.model=mae model.model_size=medium \
+    opt_pretrain.epochs=15 opt_probe.epochs=0 opt_finetune.epochs=0 \
+    datasets.name_dataset=s2_naip \
+    datasets.root_dir=/path/to/dataset/dir datasets.s2_naip.rel_dir=s2-naip-urban \
+    run.exp_dir=/path/to/experiments/dir run.exp_name=mae-m_s2-naip && \
+# probe and finetune on TreeSatAI-TS
+poetry run python main.py \
+    model.model=mae model.model_size=medium \
+    opt_pretrain.epochs=0 opt_probe.epochs=10 opt_finetune.epochs=50 \
+    datasets.name_dataset=treesatai_ts \
+    datasets.treesatai_ts.aerial.image_size=240 datasets.treesatai_ts.aerial.patch_size.mae=16 \
+    datasets.treesatai_ts.s1_asc.name_embed=s1 datasets.treesatai_ts.s1_des.name_embed=s1 \
+    datasets.root_dir=/path/to/dataset/dir datasets.treesatai_ts.rel_dir=TreeSatAI-TS \
+    run.exp_dir=/path/to/experiments/dir run.load_name=mae-m_s2-naip run.exp_name=mae-m_s2-naip-x-treesat
+```
 
-<hr>
+<hr>
+
+## Reference
+
+If you use this code, please cite:
+
+```bibtex
+@article{labatie2025maestro,
+  title={MAESTRO: Masked AutoEncoders for Multimodal, Multitemporal, and Multispectral Earth Observation Data},
+  author={Labatie, Antoine and Vaccaro, Michael and Lardiere, Nina and Garioud, Anatol and Gonthier, Nicolas},
+  journal={arXiv preprint arXiv:2508.10894},
+  year={2025}
+}
+```
+
+<hr>
+
+## Acknowledgement
 
+The experiments in the paper were conducted using HPC/AI resources from GENCI-IDRIS (allocations A0181013803, A0161013803, and AD010114597R1).
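
The cross-dataset recipe above chains two `main.py` invocations that must agree on their shared overrides. As a minimal sketch of one way to keep the two stages in sync (the `MODEL`/`COMMON`/`STAGE1`/`STAGE2` variables are hypothetical helpers, not part of MAESTRO; the override keys are copied from the README excerpt, with the extra TreeSatAI-TS patch-size overrides omitted for brevity), the shared flags can be factored into shell variables:

```shell
#!/bin/sh
# Sketch: factor the overrides shared by both cross-dataset stages into
# variables. STAGE1/STAGE2 are hypothetical helpers, not MAESTRO API;
# the override keys themselves come from the README examples.
MODEL="model.model=mae model.model_size=medium"
COMMON="datasets.root_dir=/path/to/dataset/dir run.exp_dir=/path/to/experiments/dir"

# Stage 1: pre-train only on S2-NAIP urban
STAGE1="poetry run python main.py $MODEL $COMMON \
opt_pretrain.epochs=15 opt_probe.epochs=0 opt_finetune.epochs=0 \
datasets.name_dataset=s2_naip datasets.s2_naip.rel_dir=s2-naip-urban \
run.exp_name=mae-m_s2-naip"

# Stage 2: probe and finetune on TreeSatAI-TS, loading the stage-1 weights
STAGE2="poetry run python main.py $MODEL $COMMON \
opt_pretrain.epochs=0 opt_probe.epochs=10 opt_finetune.epochs=50 \
datasets.name_dataset=treesatai_ts datasets.treesatai_ts.rel_dir=TreeSatAI-TS \
run.load_name=mae-m_s2-naip run.exp_name=mae-m_s2-naip-x-treesat"

# Dry run: print the two commands; drop the echo (or eval them) to launch.
echo "$STAGE1"
echo "$STAGE2"
```

Since both stages expand the same `$MODEL` and `$COMMON` strings, editing the model size or the experiment root in one place updates both commands.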