JOtholt committed 2a37ae6 (verified) · 1 parent: 5313fca

Update README.md

Files changed (1): README.md (+12, −0)
README.md CHANGED
@@ -15,6 +15,18 @@ The model is trained on the [Copernicus European Regional Reanalysis (CERRA)](ht
 
  You can find the training and inference code, as well as information about how to use the checkpoints, in our [GitHub repository](https://github.com/HPI-DeepLearning/Cerrora).
  For more detailed information about the model, the training procedure, and the evaluation results, read our technical report (TODO: MISSING LINK).
 
+ In addition, this repository contains a small preprocessed excerpt from the CERRA dataset and the IFS boundary conditions to facilitate easy testing of the models.
+ Both CERRA and IFS-HRES are provided by ECMWF under the Creative Commons Attribution 4.0 International license.
+ The full text of this license can be found [here](https://creativecommons.org/licenses/by/4.0/legalcode).
+ Compared to the original versions, our data is altered in some respects; for example, we replace some variables with derived ones for compatibility with the Aurora model.
+ This concerns the 10 m wind, which CERRA supplies as speed and direction and which we convert to u and v components.
+ We also calculate the specific humidity from the relative humidity supplied in CERRA.
+ Additionally, we add a soil type static variable by interpolating the ERA5 soil type variable.
+ The scripts to replicate this preprocessing can be found in the GitHub repository.
+
+
  ## Model Architecture
 
  Aurora is a 1.3B parameter model consisting of an encoder that projects the input into a fixed-size latent,
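The two variable conversions mentioned in the diff above can be sketched roughly as follows. This is a minimal illustration, not the repository's actual preprocessing code: it assumes the usual meteorological "from" convention for wind direction and uses the Magnus approximation for saturation vapour pressure; the exact formulas and function names in the Cerrora scripts may differ.

```python
import math

def wind_components(speed, direction_deg):
    """Convert wind speed and meteorological direction (degrees the wind
    blows FROM, clockwise from north) into eastward (u) and northward (v)
    components."""
    d = math.radians(direction_deg)
    u = -speed * math.sin(d)
    v = -speed * math.cos(d)
    return u, v

def specific_humidity(rh_percent, temp_c, pressure_hpa):
    """Approximate specific humidity (kg/kg) from relative humidity (%),
    temperature (degrees C), and pressure (hPa), using the Magnus formula
    for saturation vapour pressure over water."""
    e_s = 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))  # hPa
    e = rh_percent / 100.0 * e_s  # actual vapour pressure, hPa
    return 0.622 * e / (pressure_hpa - 0.378 * e)
```

For instance, a 10 m/s wind from due west (270°) yields u ≈ +10 and v ≈ 0, i.e. a purely eastward flow.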