ojpv committed
Commit 6214201 · verified · Parent(s): fcb12f6

Upload README.md

Files changed (1): README.md (+108 −11)

README.md CHANGED
@@ -1,32 +1,129 @@
- # ClimX: A challenge for extreme-aware climate model emulation
-
- ClimX is a competition focused on developing **fast and accurate machine learning emulators** for the NorESM2-MM Earth System Model, with evaluation centered on **climate extremes**.
-
- ## What’s in this dataset?
-
- This Hugging Face dataset hosts the ClimX **full-resolution training data** (historical + projections). A lightweight, \\(16\times\\) spatially coarsened variant is provided separately for rapid prototyping (hosted on Kaggle).
-
- The full dataset is distributed in **NetCDF-4** format to support broad compatibility with common climate tooling.
-
- ## Problem summary
-
- Participants train emulators that take **forcing trajectories** (greenhouse gases + aerosols) and optionally past predicted state to produce daily climate fields at the native NorESM2-MM grid (\\(192 \times 288\\)). The **benchmark target** is not the raw fields themselves, but **15 extreme indices** derived from daily temperature and precipitation.
-
- Primary leaderboard metric (mean standardized MAE over indices):
-
  $$
- S = \frac{1}{15}\sum_{i=1}^{15}\frac{\mathrm{MAE}(\hat{Y}_i, Y_i)}{\sigma_i}
  $$
-
- Here \\(\sigma_i\\) is computed from the ground-truth \\(Y_i\\) values for the evaluation split (public/private) and held fixed for all submissions on that split.
-
  ## Links

  - [Kaggle competition](https://www.kaggle.com/competitions/climx)
  - [Public code repository (challenge materials)](https://github.com/IPL-UV/ClimX)
  - [Website](https://ipl-uv.github.io/ClimX/)

  ## License and usage

- This dataset is provided for the ClimX competition and associated research/education use. Please follow the competition rules regarding external data/model restrictions and data redistribution.
+ ---
+ language:
+ - en
+ pretty_name: "ClimX: extreme-aware climate model emulation"
+ tags:
+ - climate
+ - earth-system-model
+ - machine-learning
+ - emulation
+ - extremes
+ - netcdf
+ license: mit
+ task_categories:
+ - regression
+ - time-series-forecasting
+ ---
+
+ <!-- NOTE: Math formatting convention: inline math uses \\( ... \\) and math blocks use $$ ... $$. -->
+
+ # ClimX: a challenge for extreme-aware climate model emulation
+
+ ClimX is a challenge focused on building **fast and accurate machine learning emulators** for the NorESM2-MM Earth System Model, with evaluation centered on **climate extremes** rather than mean climate alone.
+
+ ## Dataset summary
+
+ This dataset contains the **full-resolution** ClimX data in **NetCDF-4** format (targets + forcings, depending on split) on the native \\(192 \times 288\\) grid (about \\(1^\circ\\) resolution), as well as a **lite-resolution** version on a coarsened \\(12 \times 18\\) grid (about \\(16^\circ\\) resolution):
+
+ - **Lite-resolution**: <1 GB, **\\(16\times\\)** spatially coarsened, meant for rapid prototyping.
+ - **Full-resolution**: ~200 GB, native-grid data for large-scale training.
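For intuition, \\(16\times\\) spatial coarsening can be pictured as a block average over \\(16 \times 16\\) patches of the native grid. A minimal numpy sketch (illustrative only; the exact operator used to produce the lite variant is not specified here):

```python
import numpy as np

def block_mean(field: np.ndarray, factor: int = 16) -> np.ndarray:
    """Coarsen a (lat, lon) field by averaging non-overlapping blocks."""
    lat, lon = field.shape
    assert lat % factor == 0 and lon % factor == 0
    return field.reshape(lat // factor, factor, lon // factor, factor).mean(axis=(1, 3))

# Native NorESM2-MM grid is 192 x 288; 16x coarsening gives 12 x 18.
native = np.random.rand(192, 288)
lite = block_mean(native)
print(lite.shape)  # (12, 18)
```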
+
+ ## What you will do (high level)
+
+ You train an emulator that predicts **daily** 2D fields for 7 surface variables:
+
+ - `tas`, `tasmax`, `tasmin`
+ - `pr`, `huss`, `psl`, `sfcWind`
+
+ However, the **benchmark targets are 15 extreme indices** derived from daily temperature and precipitation (ETCCDI-style indices). The daily fields are an **intermediate output** your emulator produces (useful for diagnostics and for computing the indices).
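The official index definitions are in the challenge materials; as an illustration only, two common ETCCDI-style indices (TXx, the annual maximum of daily `tasmax`, and Rx1day, the annual maximum 1-day precipitation) can be computed from daily fields like this (numpy sketch on synthetic data; shapes and units are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily fields for one year on a tiny grid: (time, lat, lon).
tasmax = 290.0 + 10.0 * rng.random((365, 12, 18))   # K
pr = rng.exponential(2e-5, (365, 12, 18))           # kg m-2 s-1

# TXx: annual maximum of daily maximum temperature (per grid cell).
txx = tasmax.max(axis=0)

# Rx1day: annual maximum 1-day precipitation total, converted to mm/day.
rx1day = (pr * 86400.0).max(axis=0)

print(txx.shape, rx1day.shape)  # (12, 18) (12, 18)
```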
+
+ Conceptually:
+
  $$
+ x_t = g(f_t, f_{t-1}, \dots, f_{t-\alpha}, x_{t-1}, x_{t-2}, \dots, x_{t-\beta})
  $$
+
+ where \\(f_t\\) are forcings (greenhouse gases + aerosols) and \\(x_t\\) is the climate state.
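An autoregressive rollout of such an emulator can be sketched as follows (the linear `g`, history lengths, and toy grid are placeholders, not the challenge baseline):

```python
import numpy as np

def g(f_hist: np.ndarray, x_hist: np.ndarray) -> np.ndarray:
    """Toy emulator step: next state from forcing history and state history."""
    return 0.9 * x_hist[-1] + 0.1 * f_hist[-1]

alpha, beta = 2, 2               # forcing / state history lengths
T, grid = 100, (12, 18)          # days, toy lat x lon grid
forcings = np.random.rand(T, *grid)
x = [np.zeros(grid)] * beta      # initial state history

# Autoregressive rollout: each day feeds the predicted state back in.
for t in range(beta, T):
    x.append(g(forcings[max(0, t - alpha): t + 1], np.stack(x[-beta:])))

states = np.stack(x)
print(states.shape)  # (100, 12, 18)
```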
+
+ ## Dataset structure
+
+ ### Spatial and temporal shape
+
+ Full-resolution daily fields:
+
+ - **Historical**: `lat: 192, lon: 288, time: 60224`
+ - **Projections**: `lat: 192, lon: 288, time: 31389`
+
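These shapes are what drive the ~200 GB footprint. A back-of-the-envelope estimate, assuming uncompressed float32 (actual NetCDF-4 files are chunked and compressed, so on-disk sizes differ):

```python
# Rough in-memory size of the full-resolution daily fields (float32).
LAT, LON = 192, 288
T_HIST, T_PROJ = 60224, 31389
N_VARS = 7   # tas, tasmax, tasmin, pr, huss, psl, sfcWind
BYTES = 4    # float32

def gib(cells: int) -> float:
    return cells * BYTES / 2**30

hist = gib(LAT * LON * T_HIST * N_VARS)   # historical, all variables
proj = gib(LAT * LON * T_PROJ * N_VARS)   # one projection scenario

print(f"historical ~ {hist:.0f} GiB, one scenario ~ {proj:.0f} GiB")
```

With three training scenarios plus the historical period, this lands in the same ballpark as the stated ~200 GB.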
+
+ ### Splits and scenarios (official challenge setup)
+
+ Training uses historical + several SSP scenarios; testing is on the held-out **SSP2-4.5** scenario:
+
+ - **Train**: historical (1850–2014) + `ssp126`, `ssp370`, `ssp585` (2015–2100)
+ - **Test (held-out)**: `ssp245` (2015–2100)
+
+ To avoid leakage, **targets for `ssp245` are withheld** in the official evaluation; only the **forcings** are provided for that scenario. The full outputs will be released after the competition.
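The previous version of this README defined the leaderboard score as a mean standardized MAE over the 15 indices, \\(S = \frac{1}{15}\sum_{i=1}^{15}\mathrm{MAE}(\hat{Y}_i, Y_i)/\sigma_i\\). A numpy sketch of that score (here \\(\sigma_i\\) is illustratively the std of the ground-truth index; the official evaluation fixes \\(\sigma_i\\) per split):

```python
import numpy as np

def standardized_mae(pred: np.ndarray, truth: np.ndarray) -> float:
    """Mean standardized MAE: S = (1/15) * sum_i MAE_i / sigma_i.

    pred, truth: shape (n_indices, ...), one leading row per extreme index.
    """
    mae = np.abs(pred - truth).reshape(pred.shape[0], -1).mean(axis=1)
    sigma = truth.reshape(truth.shape[0], -1).std(axis=1)
    return float((mae / sigma).mean())

rng = np.random.default_rng(1)
truth = rng.normal(size=(15, 192, 288))           # toy ground-truth indices
pred = truth + 0.1 * rng.normal(size=truth.shape) # toy prediction
print(f"S = {standardized_mae(pred, truth):.3f}")
```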
+
+ ## How to load the data
+
+ This dataset is distributed as **NetCDF-4** files. There are two common ways to load it.
+
+ ### Option 1 (recommended): clone the ClimX code and use the helper loader
+
+ The ClimX repository includes a helper module (`src/data/climx_hf.py`) that downloads the dataset from Hugging Face and opens it as three lazily loaded “virtual” xarray datasets:
+
+ ```bash
+ git clone https://github.com/IPL-UV/ClimX.git
+ cd ClimX
+ pip install -U "huggingface-hub" xarray netcdf4 dask
+ ```
+
+ ```python
+ from src.data.climx_hf import download_climx_from_hf, open_climx_virtual_datasets
+
+ # Download NetCDF artifacts from HF into a local cache directory.
+ root = download_climx_from_hf("/path/to/hf_cache", variant="full")
+
+ # Open as three virtual datasets (lazy / dask-friendly).
+ ds = open_climx_virtual_datasets(root, variant="full")  # or "lite"
+
+ ds.hist           # historical (targets + forcings)
+ ds.train          # projection training scenarios (targets + forcings; excludes `ssp245`)
+ ds.test_forcings  # `ssp245` forcings only (no targets)
+ ```
+
+ ### Option 2: download NetCDF files and open with xarray directly
+
+ You can also download individual files from Hugging Face and open them with **xarray**:
+
+ ```python
+ from huggingface_hub import hf_hub_download
+ import xarray as xr
+
+ # Fetch a single NetCDF file from the dataset repo.
+ path = hf_hub_download(
+     repo_id="isp-uv-es/ClimX",
+     repo_type="dataset",
+     filename="PATH/TO/A/FILE.nc",  # replace with an actual file in this dataset repo
+ )
+ ds = xr.open_dataset(path)
+ print(ds)
+ ```
+
  ## Links

  - [Kaggle competition](https://www.kaggle.com/competitions/climx)
+ - [Full dataset (this page)](https://huggingface.co/datasets/isp-uv-es/ClimX)
  - [Public code repository (challenge materials)](https://github.com/IPL-UV/ClimX)
  - [Website](https://ipl-uv.github.io/ClimX/)

  ## License and usage

+ The dataset is released under the **MIT** license. If you are participating in the ClimX competition, please also follow the competition rules (notably the restrictions on external climate training data and on redistribution of competition data).
+
+ ## Citation
+
+ If you use this dataset in academic work, please cite the ClimX challenge materials and the underlying NorESM2 model description:
+
+ - Seland, Ø., Bentsen, M., Graff, L. S., et al. (2020). *The Norwegian Earth System Model, NorESM2 – Evaluation of the CMIP6 DECK and historical simulations*. Geoscientific Model Development.