---
license: apache-2.0
datasets:
- FM4CS/THOR-Pretrain
pipeline_tag: image-feature-extraction
tags:
- NR
- ESA
- Foundation Model
- Earth Observation
- Geospatial
- Remote Sensing
- Sentinel-1
- Sentinel-2
- Sentinel-3
- SAR
- Multispectral
- Climate
---

[![Website](https://img.shields.io/badge/Website-THOR-0F62FE)](https://thor-model.notion.site/THOR-Foundation-Model-Showcase-2ee64c7f3cb78087bf77feb6350bdcc6)
[![arXiv](https://img.shields.io/badge/arXiv-2601.16011-b31b1b?logo=arxiv)](https://arxiv.org/abs/2601.16011)
[![Code](https://img.shields.io/badge/Code-GitHub-181717?logo=github)](https://github.com/FM4CS/THOR)
[![TerraTorch Extension](https://img.shields.io/badge/TerraTorch-Extension-EE4B2B?logo=github)](https://github.com/FM4CS/thor_terratorch_ext)
[![Dataset](https://img.shields.io/badge/Dataset-HuggingFace-FFD21E?logo=huggingface)](https://huggingface.co/datasets/FM4CS/THOR-Pretrain)

# THOR Tiny

THOR (Transformer-based foundation model for Heterogeneous Observation and Resolution) is a compute-adaptive geospatial foundation model developed by the Norwegian Computing Center (NR), UiT The Arctic University of Norway, and ESA Φ-lab.

## Model Description

THOR unifies data from the Copernicus Sentinel-1, -2, and -3 (OLCI & SLSTR) satellites, processing their native 10 m to 1000 m resolutions in a single model. THOR is pre-trained with a novel randomized patch- and input-image-size strategy, so at inference it can be deployed with any patch size, trading computational cost against feature resolution without retraining.
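The flexible-patching idea can be sketched in plain NumPy (a conceptual illustration only; `patchify` is a hypothetical helper, not part of THOR's code):

```python
import numpy as np

def patchify(img: np.ndarray, patch: int) -> np.ndarray:
    """Split a square (H, W, C) image into non-overlapping patch tokens."""
    h, w, c = img.shape
    g = h // patch                                    # patches per side
    x = img[: g * patch, : g * patch]                 # drop any remainder
    x = x.reshape(g, patch, g, patch, c).swapaxes(1, 2)
    return x.reshape(g * g, patch * patch * c)        # (num_tokens, token_dim)

# A 288 x 288 px scene with 5 bands (e.g. BLUE, GREEN, RED, VV, VH):
img = np.zeros((288, 288, 5), dtype=np.float32)
print(patchify(img, 8).shape)    # 36 x 36 grid -> (1296, 320)
print(patchify(img, 16).shape)   # 18 x 18 grid -> (324, 1280)
```

Doubling the patch size quarters the token count (1296 → 324), which is where the compute versus feature-resolution trade-off comes from.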

Key features:
- **Multi-sensor support**: Sentinel-1 (SAR), Sentinel-2 (MSI), Sentinel-3 OLCI & SLSTR
- **Flexible resolution**: 10 m to 1000 m native resolutions
- **Compute-adaptive**: flexible patch sizes and ground covers from 1,000 m to 100,000+ m
- **Data-efficient**: state-of-the-art performance in data-limited regimes
- **Model type**: Vision Transformer (FlexiViT)

## Usage

THOR is designed for fine-tuning on downstream tasks such as land cover classification, crop mapping, and flood detection. Its flexible architecture lets users adapt the model to a wide range of geospatial applications while leveraging its multi-sensor capabilities.

For downstream applications, we recommend using the [terratorch](https://github.com/terrastackai/terratorch) framework with our [THOR terratorch extension](https://github.com/FM4CS/thor_terratorch_ext).

### Terratorch backbone loading example

```python
# Example: loading a THOR ViT backbone with terratorch

# Importing our thor_terratorch_ext module registers the THOR backbones
import thor_terratorch_ext  # noqa: F401

from terratorch import BACKBONE_REGISTRY

# List the available THOR backbones
print([b for b in list(BACKBONE_REGISTRY) if "thor" in b])

# Build a THOR ViT backbone with a specific set of bands
model = BACKBONE_REGISTRY.build(
    "thor_v1_tiny",
    pretrained=True,
    model_bands=["BLUE", "GREEN", "RED", "VV", "VH"],
    input_params=dict(  # optional input parameters
        # Ground cover in meters; typically
        # input image size [px] * input image resolution [m/px]
        ground_covers=[2880],
        flexivit_patch_size_seqs=[8],  # patch size in pixels
    ),
)
```
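As a back-of-the-envelope check on the `ground_covers` and `flexivit_patch_size_seqs` arguments (pure arithmetic, not terratorch API): a 2880 m ground cover at Sentinel-2's 10 m/px resolution corresponds to a 288 x 288 px input, and an 8 px patch size yields a 36 x 36 token grid.

```python
def token_count(ground_cover_m: int, resolution_m: int, patch_size_px: int) -> int:
    """ViT token count for a square scene (illustrative helper, not THOR API)."""
    image_px = ground_cover_m // resolution_m       # input image side in pixels
    patches_per_side = image_px // patch_size_px    # token grid side
    return patches_per_side ** 2

# Sentinel-2 scene, 2880 m ground cover at 10 m/px:
print(token_count(2880, 10, 8))    # 36 * 36 = 1296 tokens
print(token_count(2880, 10, 16))   # 18 * 18 = 324 tokens
```

Larger patch sizes shrink the token grid quadratically, so the same pretrained weights can be run much more cheaply at the cost of coarser features.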

## Training Details

### Training Data

THOR is pre-trained on [THOR-Pretrain](https://huggingface.co/datasets/FM4CS/THOR-Pretrain), a large-scale multi-sensor dataset containing paired observations from the Sentinel-1, Sentinel-2, and Sentinel-3 satellites, as well as auxiliary land cover and elevation data and meteorological variables.

### Training Procedure

For the training configuration, see the config file: [thor-base.yaml](https://github.com/FM4CS/THOR/blob/33842760f061063baf9fe3f748008c84a663fa8b/thor/config/pretrain/final/thor-base.yaml)

### Compute Infrastructure

The model was trained on the LUMI supercomputer in Finland on 4 nodes, each equipped with 4 AMD MI250X GPUs, for a total of 32 GCDs (each MI250X exposes two GCDs).

## Evaluation

### Results

THOR demonstrates highly competitive performance on the PANGAEA benchmark, particularly in data-limited regimes. With only 10% of the training data, THOR-Base achieves the best average rank across all datasets.

| Model | HLS Burns | MADOS | PASTIS | Sen1Floods11 | FBP | DynEarthNet | CropMap | SN7 | AI4Farms |
|-------|-----------|-------|--------|--------------|-----|-------------|---------|-----|----------|
| CROMA | 76.44 | 32.44 | 32.80 | *87.22* | 37.39 | 36.08 | 36.77 | 42.15 | 38.48 |
| DOFA | 71.98 | 23.77 | 27.68 | 82.84 | 27.82 | **39.15** | 29.91 | 46.10 | 27.74 |
| Prithvi | 77.73 | 21.24 | 33.56 | 86.28 | 29.98 | 32.28 | 27.71 | 36.78 | 35.04 |
| SpectralGPT | **83.35** | 20.29 | 34.53 | 83.12 | 39.51 | 35.33 | 31.06 | 36.31 | 37.35 |
| Terramind-B | 77.39 | **44.06** | **39.96** | 84.43 | *54.00* | *37.35* | 35.65 | 43.21 | 38.59 |
| UNet Baseline | *79.46* | 24.30 | 29.53 | **88.55** | 52.58 | 35.59 | 13.88 | 46.08 | 34.84 |
| ViT Baseline | 75.92 | 10.18 | 38.44 | 81.85 | **56.53** | 35.39 | 27.76 | 36.01 | **39.20** |
| THOR-B | 76.90 | 40.67 | *38.93* | 86.29 | 42.80 | 35.21 | **42.23** | *55.94* | *38.90* |
| THOR-T | 75.98 | *41.65* | 36.26 | 82.70 | 42.81 | 34.03 | *37.82* | **58.52** | 38.56 |

*Results in mIoU on the PANGAEA benchmark with 10% training data. **Bold** = best, *italic* = second-best.*

## Attribution

The development of THOR was funded and supported by the European Space Agency (ESA) Φ-lab (FM4CS project, contract no. 4000143489/24/I-DT) and the Research Council of Norway (KnowEarth, project no. 337481).

## Citation

If you use THOR in your research, please cite the [paper](https://arxiv.org/abs/2601.16011):

**BibTeX:**

```bibtex
@article{forgaard2026thor,
  title={THOR: A Versatile Foundation Model for Earth Observation Climate and Society Applications},
  author={Theodor Forgaard and Jarle H. Reksten and Anders U. Waldeland and Valerio Marsocci and Nicolas Longépé and Michael Kampffmeyer and Arnt-Børre Salberg},
  year={2026},
  eprint={2601.16011},
  archivePrefix={arXiv},
  primaryClass={eess.IV},
  url={https://arxiv.org/abs/2601.16011},
}
```

## Contact

Theodor Forgaard, Norwegian Computing Center (NR), tforgaard@nr.no