JOtholt committed dc2325d (verified, parent: ce34bef)

Update README.md

Files changed (1): README.md (+29 −2)
README.md

# Cerrora: A High-Resolution Regional Weather Model for Europe

This repository contains the trained checkpoints for Cerrora, an AI-based regional weather model for Europe based on [Microsoft's Aurora](https://huggingface.co/microsoft/aurora).
The model is trained on the [Copernicus European Regional Reanalysis (CERRA)](https://doi.org/10.1002/qj.4764) dataset provided by ECMWF.
You can find the training and inference code, as well as information about how to use the checkpoints, in our [GitHub repository](https://github.com/HPI-DeepLearning/Cerrora).
For more detailed information about the model, the training procedure, and the evaluation results, read our technical report (TODO: MISSING LINK).

## Model Architecture

Aurora is a 1.3B-parameter model consisting of an encoder that projects the input into a fixed-size latent representation, a backbone that processes the latent, and a decoder that recreates the original data shape.
The encoder and decoder use a Perceiver architecture, while the backbone is a Swin Transformer.
Aurora is pretrained on a variety of weather and climate datasets, with the goal of providing a foundation model that can be finetuned for diverse downstream tasks.
For more information, read the [Aurora paper](https://www.nature.com/articles/s41586-025-09005-y).
Cerrora is based on the 0.25° pretrained model with a 6h lead time.
We mostly leave the model architecture unchanged, with the exception of the patch size, which we increase from 4 to 8.
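
Doubling the patch size from 4 to 8 cuts the number of latent tokens the Swin backbone processes by roughly a factor of four. The sketch below illustrates this arithmetic; the 1069 × 1069 grid size (CERRA's native ~5.5 km grid) and the ceiling-based padding are assumptions for illustration, not taken from this repository:

```python
import math

def num_patches(height: int, width: int, patch_size: int) -> int:
    # Assumes the input is padded so each dimension is divisible
    # by the patch size, hence the ceiling division.
    return math.ceil(height / patch_size) * math.ceil(width / patch_size)

h = w = 1069                          # assumed native CERRA grid size
tokens_p4 = num_patches(h, w, 4)      # patch size of pretrained Aurora
tokens_p8 = num_patches(h, w, 8)      # patch size used by Cerrora
print(tokens_p4, tokens_p8, tokens_p4 / tokens_p8)  # 71824 17956 4.0
```

The 4× reduction in tokens is what makes the larger patch size attractive at high input resolutions.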

## Training Procedure

We adopt a two-stage training procedure consisting of a 6h pretraining stage and a rollout finetuning stage.
In the first stage, the model is trained on CERRA data to forecast the weather state 6 hours ahead.
This is done to adapt the model to the change in data domain and input resolution.
The 6h model is published as `cerrora-base.ckpt`.
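
Conceptually, the first stage pairs each input state with the reanalysis state 6 hours later. A minimal sketch of building such training pairs, assuming 3-hourly analysis times so that a 6h lead corresponds to an offset of two indices (check the dataset for the actual cadence; the function name is hypothetical):

```python
def make_pairs(series, steps_ahead=2):
    """Pair each state with the state `steps_ahead` indices later,
    i.e. 6 hours ahead for an assumed 3-hourly series."""
    return [(series[t], series[t + steps_ahead])
            for t in range(len(series) - steps_ahead)]

states = list(range(6))   # stand-in for six consecutive CERRA states
pairs = make_pairs(states)
print(pairs)  # [(0, 2), (1, 3), (2, 4), (3, 5)]
```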

In the second stage, we finetune the model to autoregressively roll out predictions for lead times of up to 30 hours.
Since we train a regional model, the CERRA input data lacks the global context necessary for longer forecasts.
We address this by using the IFS-HRES forecasts provided in the [WeatherBench2 GCP bucket](https://weatherbench2.readthedocs.io/en/latest/data-guide.html#ifs-hres) as lateral boundary conditions.
The rollout-trained model is published as `cerrora-rollout.ckpt`.
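
The rollout with lateral boundary conditions can be sketched as follows. All names, shapes, and the boundary width here are hypothetical placeholders (the real implementation is in the Cerrora GitHub repository); the idea is that after each 6h model step, a frame at the domain edge is overwritten with the IFS-HRES forecast valid at the same time:

```python
import numpy as np

def apply_boundary(pred: np.ndarray, boundary: np.ndarray, width: int) -> np.ndarray:
    """Replace a `width`-cell frame around the domain with boundary data."""
    out = pred.copy()
    out[:width, :] = boundary[:width, :]
    out[-width:, :] = boundary[-width:, :]
    out[:, :width] = boundary[:, :width]
    out[:, -width:] = boundary[:, -width:]
    return out

def rollout(step_fn, state, boundaries, width=2):
    """Autoregressive rollout: one entry in `boundaries` per 6h step,
    so five boundary fields give a 30h lead time."""
    states = []
    for boundary in boundaries:
        state = apply_boundary(step_fn(state), boundary, width)
        states.append(state)
    return states

# Toy usage: identity "model" on an 8x8 grid, 5 steps of 6h = 30h.
state = np.zeros((8, 8))
boundaries = [np.full((8, 8), t) for t in range(1, 6)]
traj = rollout(lambda s: s, state, boundaries)
print(len(traj))  # 5
```

Only the edges are forced toward the global forecast; the interior remains the model's own prediction, which is what lets the regional model stay consistent with the large-scale flow over longer rollouts.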