
Prithvi Burn Scar Severity Dataset

A curated dataset of Sentinel-2 multi-temporal satellite imagery chips for wildfire burn scar severity segmentation, pre-processed for the Prithvi EO 2.0 foundation model.

Author: Tushar Thokdar

Dataset Description

This dataset contains 108 image chips derived from Sentinel-2 L2A satellite imagery covering wildfire-affected regions. Each chip is a multi-temporal, multi-spectral array ready for semantic segmentation training with the NASA-IBM Prithvi EO 2.0 Vision Transformer.

Key Features

  • Novel Delta Channel: Each chip includes a computed difference frame (Delta = Clip(Post - Pre, -1, 1)) that explicitly encodes spectral change magnitude
  • Multi-temporal: 3 time steps per chip (Pre-fire, Post-fire, Delta)
  • Multi-spectral: 6 Sentinel-2 bands per time step
  • dNBR-derived labels: Ground truth severity classification from Differenced Normalized Burn Ratio
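
A minimal NumPy sketch of how the Delta frame is assembled (the pre/post arrays here are random placeholders, not real chips):

```python
import numpy as np

# Placeholder pre- and post-fire frames, shape (6, 224, 224):
# 6 spectral bands, reflectance already normalized to [0, 1].
rng = np.random.default_rng(0)
pre = rng.random((6, 224, 224), dtype=np.float32)
post = rng.random((6, 224, 224), dtype=np.float32)

# Delta = Clip(Post - Pre, -1, 1): explicit spectral-change frame
delta = np.clip(post - pre, -1.0, 1.0)

# Stack into the chip layout used by this dataset: [Pre, Post, Delta]
chip = np.stack([pre, post, delta], axis=0)
print(chip.shape)  # (3, 6, 224, 224)
```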

Data Format

Temporal Images (temporal_images/chip_*.npy)

  • Shape: (3, 6, 224, 224) per chip
  • Dtype: float32
  • Dimensions:
    • Axis 0 — Temporal frame: [Pre-fire, Post-fire, Delta]
    • Axis 1 — Spectral band: [B2(Blue), B3(Green), B4(Red), B8A(NIR), B11(SWIR1), B12(SWIR2)]
    • Axis 2,3 — Spatial: 224 x 224 pixels at 20m resolution
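
Given those axis conventions, individual frames and bands can be pulled out by index (a sketch using a zero array in place of a loaded chip):

```python
import numpy as np

# Stand-in for np.load("temporal_images/chip_000000.npy")
chip = np.zeros((3, 6, 224, 224), dtype=np.float32)

pre, post, delta = chip  # axis 0: temporal frames

BANDS = ["B2", "B3", "B4", "B8A", "B11", "B12"]  # axis 1 order
nir_post = post[BANDS.index("B8A")]  # (224, 224) post-fire NIR band
```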

Masks (masks/chip_*.npy)

  • Shape: (224, 224) per chip
  • Dtype: int64
  • Values:
| Value | Class | Description |
|-------|-------|-------------|
| 0 | Unburned | No fire damage detected |
| 1 | Low Severity | Minor vegetation damage |
| 2 | Moderate-Low | Partial canopy loss |
| 3 | Moderate-High | Significant vegetation loss |
| 4 | High Severity | Complete vegetation destruction |
| 255 | Ignore | Invalid/cloud/no-data pixels |
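
When training, pixels labeled 255 should be excluded from the loss and metrics. A sketch with a synthetic mask standing in for a real masks/chip_*.npy file:

```python
import numpy as np

IGNORE_INDEX = 255  # invalid/cloud/no-data pixels

# Synthetic mask standing in for a real chip mask
mask = np.zeros((224, 224), dtype=np.int64)
mask[:10] = IGNORE_INDEX        # pretend the top rows are cloud-covered
mask[100:120, 100:120] = 4      # a patch of high-severity pixels

valid = mask != IGNORE_INDEX
counts = np.bincount(mask[valid], minlength=5)  # per-class pixel counts, classes 0-4
```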

Class Distribution

| Class | Pixels | Percentage |
|-------|--------|------------|
| Unburned | 1,369,197 | 27.00% |
| Low Severity | 771,342 | 15.21% |
| Moderate-Low | 924,961 | 18.24% |
| Moderate-High | 822,241 | 16.22% |
| High Severity | 1,182,633 | 23.32% |

Total labeled pixels: 5,070,374 across 108 chips
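
The distribution is only mildly imbalanced, but inverse-frequency class weights are a common recipe for segmentation losses. This sketch derives them from the counts in the table above (the weighting scheme is a suggestion, not part of the dataset):

```python
import numpy as np

# Per-class pixel counts, copied from the distribution table
counts = np.array([1_369_197, 771_342, 924_961, 822_241, 1_182_633], dtype=np.float64)
assert counts.sum() == 5_070_374  # total labeled pixels

weights = counts.sum() / counts   # inverse frequency
weights /= weights.mean()         # normalize so the average weight is 1
```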

Data Source

  • Satellite: Sentinel-2 L2A (ESA Copernicus)
  • Resolution: 20 meters per pixel
  • Area of Interest: Northern California wildfire region
  • AOI Bounds: [-121.90, 39.80, -121.50, 40.30] (West, South, East, North)
  • Pre-fire window: 2024-05-15 to 2024-06-30
  • Post-fire window: 2024-09-15 to 2024-10-30
  • Cloud thresholds: Pre=20%, Post=30%

Processing Pipeline

  1. Google Earth Engine — Sentinel-2 cloud masking (SCL band), median compositing, 6-band extraction
  2. dNBR Calculation — NBR = (NIR - SWIR2) / (NIR + SWIR2), dNBR = NBR_pre - NBR_post
  3. Severity Classification — Thresholded dNBR values mapped to 5 severity classes
  4. Reflectance Normalization — Raw DN scaled to [0, 1] surface reflectance
  5. Delta Channel Generation — Delta = Clip(Post - Pre, -1.0, 1.0)
  6. Tiling — 224x224 pixel chips with quality filtering (at least 95% valid pixels and at least 1% labeled pixels per chip)
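
Steps 2 and 3 can be sketched as follows. The dNBR breakpoints shown are the commonly cited USGS-style values, used here only as placeholders since the dataset's exact thresholds are not listed:

```python
import numpy as np

def nbr(nir, swir2, eps=1e-6):
    """Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2)."""
    return (nir - swir2) / (nir + swir2 + eps)

rng = np.random.default_rng(0)
# Placeholder reflectance bands (B8A = NIR, B12 = SWIR2), values in (0, 1)
nir_pre, swir2_pre, nir_post, swir2_post = rng.uniform(0.05, 0.95, (4, 224, 224))

dnbr = nbr(nir_pre, swir2_pre) - nbr(nir_post, swir2_post)

# Placeholder severity breakpoints; bins map to 0=Unburned ... 4=High Severity
severity = np.digitize(dnbr, [0.1, 0.27, 0.44, 0.66])
```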

Usage

import numpy as np
from huggingface_hub import hf_hub_download

# Download a single chip
image_path = hf_hub_download(
    repo_id="Tushar365/prithvi-burn-scar-dataset",
    filename="temporal_images/chip_000000.npy",
    repo_type="dataset",
)
mask_path = hf_hub_download(
    repo_id="Tushar365/prithvi-burn-scar-dataset",
    filename="masks/chip_000000.npy",
    repo_type="dataset",
)

image = np.load(image_path)  # (3, 6, 224, 224)
mask = np.load(mask_path)    # (224, 224)

# RGB visualization (Pre-fire)
pre_rgb = image[0][[2, 1, 0]].transpose(1, 2, 0)  # B4, B3, B2
pre_rgb = np.clip(pre_rgb * 3.5, 0, 1)
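
A matching sketch for the mask: map the five classes through a color palette (the colors are arbitrary choices, not part of the dataset) while leaving ignore pixels black:

```python
import numpy as np

# Arbitrary RGB palette for classes 0-4
PALETTE = np.array([
    [ 26, 152,  80],   # 0 Unburned
    [166, 217, 106],   # 1 Low Severity
    [254, 224, 139],   # 2 Moderate-Low
    [244, 109,  67],   # 3 Moderate-High
    [165,   0,  38],   # 4 High Severity
], dtype=np.uint8)

mask = np.zeros((224, 224), dtype=np.int64)  # stand-in for np.load(mask_path)
mask[100:120, 100:120] = 4
mask[:10] = 255

rgb = np.zeros((224, 224, 3), dtype=np.uint8)  # ignore pixels stay black
valid = mask != 255
rgb[valid] = PALETTE[mask[valid]]
```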

Citation

If you use this dataset, please cite:

@misc{thokdar2024burnscardataset,
    title={Prithvi Burn Scar Severity Dataset with Delta Channel},
    author={Tushar Thokdar},
    year={2024},
    note={Sentinel-2 derived burn severity dataset for Prithvi EO 2.0}
}