---
license: mit
task_categories:
- image-classification
- time-series-forecasting
language:
- en
tags:
- wildfire
- remote-sensing
- earth-observation
- canada
- benchmark
- hard-negative-mining
size_categories:
- n>1T
---
# FireMPC: A Pan-Canadian Wildfire Forecasting Benchmark
FireMPC is a pan-Canadian wildfire risk benchmark covering approximately one
billion hectares across all fifteen Canadian terrestrial ecozones at 1 km daily
resolution from 2000 to 2025, integrating 55 drivers across fuel, terrain,
anthropogenic, and meteorological families.
This release provides four pre-built sample caches that share the same
underlying data cube but differ in their training and test sample construction
strategies, enabling controlled study of FWI-guided Hard Negative Mining
(FWI-HNM) versus random negative sampling.
## Repository Contents
```
MPCFire/
├── cache_A/ # Train: random sampling | Test: FWI-HNM
├── cache_Y/ # Train: random sampling | Test: random sampling
├── cache_G/ # Train: FWI-HNM | Test: FWI-HNM
├── cache_H/ # Train: FWI-HNM | Test: random sampling
└── Entire_Canada_Maps/ # Raw driver rasters, 13 modalities x 26 years (yearly tar archives)
```
Each cache directory contains exactly three files:
| File | Size | Description |
| --- | --- | --- |
| `windows_<hash>.h5` | ~14 GB | Pre-extracted 10-day input windows + labels for every sample (positives and negatives). One HDF5 file per cache. |
| `samples_variant_<X>.json` | ~4.4 MB | Sample index: train / val / test split assignments, sample identifiers, and metadata. |
| `norm_stats.npz` | ~2 KB | Channel-wise mean and standard deviation used for input normalisation. Skips the fire-mask channel and the categorical land-cover channel. |
## Variant Design
The four caches form a 2x2 ablation grid that decouples the negative-sampling
strategy used during training from the strategy used during evaluation:
| | Test = FWI-HNM | Test = Random |
| ------- | -------------- | ------------- |
| **Train = Random** | `cache_A` | `cache_Y` |
| **Train = FWI-HNM** | `cache_G` | `cache_H` |
* **FWI-HNM (FWI-guided Hard Negative Mining)** scores every non-fire candidate
with a calibrated six-component CFFDRS composite (FFMC, DMC, DC, ISI, BUI,
FWI), partitions the pool at the median, and combines hard negatives
(fire-weather-matched non-ignitions) with representative negatives
(low-danger baseline) in equal proportions.
* **Random sampling** draws negatives uniformly from the non-fire candidate
pool (MODIS FireMask class 5, i.e. non-fire land pixels).
Comparing rows isolates the effect of the training-pool construction; comparing
columns isolates the effect of the evaluation-pool construction. The diagonal
pair (`cache_A`, `cache_G`) corresponds to the standard production setup; the
off-diagonal pair (`cache_Y`, `cache_H`) is used to verify that any
FWI-HNM advantage reflects genuine boundary hardening rather than train-test
distributional alignment.
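The FWI-HNM procedure described above can be sketched as follows. This is an illustrative reconstruction, not the released implementation: the function name and the way the composite score is passed in are our assumptions.

```python
import numpy as np

def fwi_hnm_sample(scores, n_neg, rng=None):
    """Illustrative median-split hard negative sampler.

    scores: composite fire-danger score for every non-fire candidate
            (e.g. a calibrated combination of FFMC, DMC, DC, ISI, BUI, FWI).
    n_neg:  total number of negatives to draw (half hard, half representative).
    Returns indices into the candidate pool.
    """
    rng = np.random.default_rng(rng)
    scores = np.asarray(scores)
    median = np.median(scores)
    hard = np.flatnonzero(scores >= median)  # fire-weather-matched non-ignitions
    easy = np.flatnonzero(scores < median)   # low-danger baseline
    k = n_neg // 2
    return np.concatenate([
        rng.choice(hard, size=k, replace=False),
        rng.choice(easy, size=n_neg - k, replace=False),
    ])

# Example: score 1,000 candidates, draw 100 negatives in equal proportions.
idx = fwi_hnm_sample(np.random.rand(1000), n_neg=100, rng=0)
```

The equal-proportion split is the key design choice: hard negatives sharpen the decision boundary under fire-conducive weather, while representative negatives keep the model calibrated on the low-danger bulk of the landscape.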
## Raw Driver Maps (`Entire_Canada_Maps/`)
`Entire_Canada_Maps/` provides the underlying raster stack used to build the
sample caches above. It covers the full Canadian landmass for 2000-2025 (26
years) and is organised into 13 modality subfolders that correspond to the
drivers listed in Table 1 of the paper. Each subfolder contains one tar
archive per year; each archive holds that year's daily (or annual / static)
GeoTIFFs.
All rasters are stored with **integer scaling (scaledInt)** to reduce volume.
### Subfolder Index
| Folder | Channels | Source | Cadence | Notes |
| --- | --- | --- | --- | --- |
| `DEMs/` | DEM, Slope, Aspect (sin/cos), Hillshade, TPI, TWI (7 channels) | ASTER GDEM | Static | Elevation and derived terrain indices; broadcast across the daily axis |
| `ERA5/` | temperature_2m, u/v_component_of_wind_10m, snow_cover, total_precipitation_sum, surface_latent_heat_flux_sum, dewpoint_temperature_2m, surface_pressure, volumetric_soil_water_layer_1-4, temperature_2m_max, skin_temperature_max, potential_evaporation_sum, total_evaporation_sum, skin_reservoir_content, surface_net_solar_radiation_sum (18 channels) | ERA5-Land Daily Aggregate (`ECMWF/ERA5_LAND/DAILY_AGGR`) | Daily | Atmospheric reanalysis fields |
| `FWI/` | FFMC, DMC, DC, ISI, BUI, FWI (6 channels) | CFFDRS (ERA5-driven) | Daily | Canadian Forest Fire Weather Index components |
| `MCD09CMG/` | Coarse Resolution Brightness Temperature Bands 20 / 21 / 31 / 32 (4 channels) | MOD/MYD09CMG | Daily | Coarse-resolution composite brightness temperatures |
| `MCD09GA/` | Bands 1, 2, 3, 7 (4 channels) | MOD/MYD09GA | Daily | QA-filtered, gap-filled surface reflectance |
| `MCD11A1/` | LST_Day_1km, LST_Night_1km, Emis_31, Emis_32 (4 channels) | MOD/MYD11A1 | Daily | QA-filtered, gap-filled land surface temperature and emissivity |
| `MCD12Q1/` | Land Cover Class | MCD12Q1 | Annual | Land cover / land use class; broadcast across the daily axis |
| `MCD14A1/` | Active Fire (binary) | MOD/MYD14A1 | Daily | Supervision target only; **not** included as a model input channel |
| `MCD15A3H/` | LAI, FPAR (2 channels) | MCD15A3H | Daily (from 4-day composite, interpolated) | Leaf area index / fraction of absorbed PAR |
| `NDVI_EVI/` | NDVI, EVI (2 channels) | MODIS-derived | Daily | Vegetation activity indices |
| `OSMs/` | Road / Powerline / Building / Water Density (4 channels) | OSM-derived | Static | Infrastructure accessibility; broadcast across the daily axis |
| `VPD/` | Vapor Pressure Deficit | ERA5-derived | Daily | Atmospheric moisture demand |
| `Worldpop/` | Population Density | WorldPop | Annual | Human population density; broadcast across the daily axis |
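As a sanity check, the per-modality channel counts in the table sum to the 55 drivers quoted in the introduction (MCD14A1 is the supervision target, so model inputs total 54):

```python
# Channel counts per modality, as documented in the table above.
CHANNELS = {
    "DEMs": 7, "ERA5": 18, "FWI": 6, "MCD09CMG": 4, "MCD09GA": 4,
    "MCD11A1": 4, "MCD12Q1": 1, "MCD14A1": 1, "MCD15A3H": 2,
    "NDVI_EVI": 2, "OSMs": 4, "VPD": 1, "Worldpop": 1,
}
assert sum(CHANNELS.values()) == 55  # 55 drivers total
assert len(CHANNELS) == 13           # 13 modality subfolders
```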
### File Naming
```
Entire_Canada_Maps/<modality>/<modality>__<YYYY>.tar
```
After extraction, individual files are named `YYYY_MM_DD.tif` for daily
modalities or `<modality>_YYYY.tif` for static / annual modalities.
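For daily modalities, the expected member names for a year can be enumerated directly from the naming convention; the helper below is ours, but the `YYYY_MM_DD.tif` pattern is as documented above:

```python
from datetime import date, timedelta

def daily_tif_names(year):
    """Expected member names for one year of a daily modality (YYYY_MM_DD.tif)."""
    d = date(year, 1, 1)
    names = []
    while d.year == year:
        names.append(f"{d:%Y_%m_%d}.tif")
        d += timedelta(days=1)
    return names

names = daily_tif_names(2020)  # 2020 is a leap year: 366 files expected
```

This is handy for verifying that an extracted archive is complete before building derived products.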
### Download and Extraction
```python
from huggingface_hub import hf_hub_download
import tarfile, pathlib

tar_path = hf_hub_download(
    repo_id="AnonymousData4NeurIPS/MPCFire",
    repo_type="dataset",
    filename="Entire_Canada_Maps/ERA5/ERA5__2020.tar",
)
out = pathlib.Path("./ERA5_2020")
with tarfile.open(tar_path) as tf:
    tf.extractall(out)
```
To pull an entire modality:
```python
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="AnonymousData4NeurIPS/MPCFire",
    repo_type="dataset",
    allow_patterns=["Entire_Canada_Maps/ERA5/*"],
)
```
## Splits
A temporal hold-out is used throughout:
| Split | Years |
| ----- | ----- |
| Train | 2000 - 2019 |
| Validation | 2020 - 2022 |
| Test | 2023 - 2025 |
The three-year test window deliberately covers the record-setting 2023 fire
season alongside the more typical 2024 and 2025 seasons. All splits maintain a
fixed 1:2 positive-to-negative ratio.
Sample identifiers and split assignments are stored in
`samples_variant_<X>.json` inside each cache.
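Because the hold-out is purely temporal, the split assignment reduces to a year lookup. The function below restates the table; the function name is ours:

```python
def split_for_year(year):
    """Map a sample's year to its temporal split, per the table above."""
    if 2000 <= year <= 2019:
        return "train"
    if 2020 <= year <= 2022:
        return "val"
    if 2023 <= year <= 2025:
        return "test"
    raise ValueError(f"year {year} outside dataset coverage (2000-2025)")
```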
## File Format
`windows_<hash>.h5` is a single HDF5 file with pre-extracted 10-day driver
windows and binary labels. The hash in the filename identifies the windowing
configuration (10-day backward window, 1 km patches) and is shared across all
four caches because the underlying input cube is identical; only the
positive/negative selection differs per variant.
`norm_stats.npz` provides per-channel mean and standard deviation arrays. The
fire-label channel and the categorical land-cover channel are excluded from
z-score normalisation.
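Applying the stored statistics might look like the sketch below. The channel-first layout, the per-channel indexing of the mean/std arrays, and the specific skipped indices are assumptions to verify against the actual `norm_stats.npz` contents:

```python
import numpy as np

def normalise(window, mean, std, skip=()):
    """Z-score a channel-first (C, T, H, W) window, leaving channels listed
    in `skip` untouched (e.g. fire-mask and categorical land-cover)."""
    out = window.astype(np.float32)
    for c in range(out.shape[0]):
        if c in skip:
            continue
        out[c] = (out[c] - mean[c]) / std[c]
    return out
```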
## Reading the Data
Quick example using `huggingface_hub`:
```python
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="AnonymousData4NeurIPS/MPCFire",
    repo_type="dataset",
    allow_patterns=["cache_G/*"],  # download a single variant
)
```
```
Then load the HDF5 file and the sample index:
```python
import json
import os

import h5py
import numpy as np

cache = os.path.join(local_dir, "cache_G")
with open(os.path.join(cache, "samples_variant_G.json")) as f:
    samples = json.load(f)
norm = np.load(os.path.join(cache, "norm_stats.npz"))

# The HDF5 filename embeds a configuration hash, so locate it by extension.
# os.listdir returns bare filenames, so rejoin with the cache directory.
h5_name = next(p for p in os.listdir(cache) if p.endswith(".h5"))
h5 = h5py.File(os.path.join(cache, h5_name), "r")
```
## Citation
This dataset accompanies a paper currently under anonymous peer review. A
citation entry will be added on acceptance.
## License
Released under the MIT License. The dataset is built from publicly available
products (MODIS, ERA5-Land, ASTER, WorldPop, OpenStreetMap); please consult the
licenses of the upstream sources for redistribution of derivative products.