---
license: cc-by-4.0
pretty_name: Hyperheight Data Cube Denoising and Super-Resolution
task_categories:
- image-to-image
annotations_creators:
- author-generated
language:
- en
source_datasets:
- NEON AOP discrete return LiDAR (DP1.30003.001)
tags:
- lidar
- remote-sensing
- neon
- canopy
- compressive-sensing
size_categories:
- 10K<n<100K
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: cube
    dtype:
      array3_d:
        shape:
        - 128
        - 48
        - 48
        dtype: float32
  - name: filename
    dtype: string
  splits:
  - name: train
    num_bytes: 75339604065
    num_examples: 62535
  - name: validation
    num_bytes: 16143770600
    num_examples: 13400
  - name: test
    num_bytes: 16144975359
    num_examples: 13401
  download_size: 4642822201
  dataset_size: 107628350024
---

## Hyperheight Data Cube Denoising and Super-Resolution

![image/png](hhdc.png)

## Dataset Summary

- Generation code and pipeline: https://github.com/Anfera/HHDC-Creator (HHDC-Creator repo).
- 3-D photon-count waveforms (Hyperheight data cubes) built from NEON discrete-return LiDAR using the HHDC pipeline (`hhdc/cube_generator.py`).
- Each cube stores a high-resolution canopy volume (default: 0.5 m vertical bins over a 64 m height range, footprints every 2 m) across a 96 m × 96 m tile. In the HHDC-Creator pipeline the exact settings are recorded per sample in metadata, but this HF dataset exposes only the processed cubes and filenames.
- Inputs for learning are simulated observations from the physics-based forward imaging model (`hhdc/forward_model.py`), which emulates the Concurrent Artificially-intelligent Spectrometry and Adaptive Lidar System (CASALS) by applying Gaussian beam aggregation, distance-based photon loss, and mixed Poisson + Gaussian noise to downsample and perturb the cube.
- Targets are the clean, high-resolution cubes. The pairing supports denoising and spatial super-resolution; the recommended settings are 10 m diameter footprints sampled on a 3 m × 6 m grid (along/across swath), and users can adjust these parameters as needed.
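
The noise stages of the forward model can be illustrated with a simplified sketch. This is a hedged stand-in, not the repo's `LidarForwardImagingModel`: it skips beam aggregation and distance-based losses and only shows the mixed Poisson + Gaussian stage, with illustrative `photon_scale` and `gauss_sigma` parameters.

```python
import numpy as np

def simulate_noisy_cube(clean_cube, photon_scale=20.0, gauss_sigma=0.5, seed=0):
    """Toy mixed Poisson + Gaussian noise model for an HHDC.

    Simplified stand-in for the forward model: only the photon-counting
    (Poisson) and detector (Gaussian) noise stages are shown.
    """
    rng = np.random.default_rng(seed)
    # Scale normalized returns to an expected photon count per bin.
    expected = np.clip(clean_cube, 0, None) * photon_scale
    # Photon-counting noise: each bin is a Poisson draw.
    counts = rng.poisson(expected).astype(np.float32)
    # Additive Gaussian detector/readout noise.
    counts += rng.normal(0.0, gauss_sigma, size=counts.shape).astype(np.float32)
    return counts

cube = np.zeros((128, 48, 48), dtype=np.float32)
cube[10:20, :, :] = 0.5  # a fake canopy layer
noisy = simulate_noisy_cube(cube)
print(noisy.shape)  # (128, 48, 48)
```

The actual forward model additionally changes the spatial grid (see the shapes below), so treat this only as an intuition for the noise statistics.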

## Supported Tasks

- Denoising of LiDAR photon-count Hyperheight data cubes.
- Super-resolution / resolution enhancement of LiDAR volumes.
- Robust reconstruction under realistic sensor noise simulated by the forward model.

## Dataset Structure

### Storage and splits

- **Format on the Hub:** Apache Arrow / Parquet, managed by 🤗 Datasets.
- **Access:** via `load_dataset("anfera236/HHDC", split=...)`.
- **Splits:** `train`, `validation`, `test` (see `dataset_info` for exact sizes).

### Per-sample fields

Each sample in this Hugging Face dataset contains:

- **`cube`** — `float32`, shape `[128, 48, 48]`
  High-resolution Hyperheight data cube (channel-first: `[bins, H, W]`), derived from NEON discrete-return LiDAR using the HHDC-Creator pipeline.
- **`filename`** — `string`
  Identifier for the source tile / sample (matches the tile-level naming used in HHDC-Creator).

Additional fields produced by the HHDC-Creator pipeline (e.g. `x_centers`, `y_centers`, `bin_edges`, `footprint_counts`, `metadata`) are **not stored** in this HF dataset. They can be regenerated from NEON AOP LiDAR using the code in the HHDC-Creator repository.

### Typical shapes and forward model

With the default cube configuration (e.g. `cube_config_sample.json`, `cube_length = 96 m`, `footprint_separation = 2 m`):

- **Clean high-res cube (`cube`):** `[128, 48, 48]`
  - 64 m vertical extent / 0.5 m bins → 128 height bins
  - 96 m × 96 m tile / 2 m grid → 48 × 48 footprints

Low-resolution, noisy measurements are **generated on the fly** using the physics-based forward model (`LidarForwardImagingModel` in HHDC-Creator). For example, with `output_res_m=(3.0, 6.0)`:

- **Noisy cube (model output, not stored in the dataset):** `[128, 32, 16]`
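
The shape arithmetic above can be checked in a few lines (the truncating division used here is an assumption; the actual pipeline may round grid sizes differently):

```python
# Derive HHDC shapes from the stated configuration.
height_extent_m, bin_size_m = 64.0, 0.5
tile_size_m = 96.0
hi_res_m = 2.0          # footprint_separation for the clean cube
lo_res_m = (3.0, 6.0)   # output_res_m for the simulated measurement

n_bins = int(height_extent_m / bin_size_m)               # 128 height bins
hi_grid = int(tile_size_m / hi_res_m)                    # 48 x 48 footprints
lo_grid = tuple(int(tile_size_m / r) for r in lo_res_m)  # (32, 16)

print((n_bins, hi_grid, hi_grid))  # (128, 48, 48) -> clean cube shape
print((n_bins, *lo_grid))          # (128, 32, 16) -> noisy cube shape
```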

Users are expected to:

1. Load `cube` from this dataset as the clean target.
2. Apply the forward model to obtain noisy / low-res inputs for denoising and super-resolution experiments.

If you want to replicate our **exact** results, you can use the reference cube provided at `SampleCube/gt2.npz`.

## Usage

```python
from datasets import load_dataset
import torch

from hhdc.forward_model import LidarForwardImagingModel  # adjust the import path to your setup (see the scripts folder in the HHDC-Creator repo)

# Load the clean high-resolution cubes
ds = load_dataset("anfera236/HHDC", split="train")
ds.set_format(type="torch", columns=["cube"])

# Instantiate the LiDAR forward model (substitute your actual parameters)
forward_model = LidarForwardImagingModel(
    input_res_m=(2.0, 2.0),
    output_res_m=(3.0, 6.0),
    footprint_diameter_m=10.0,
    b=0.1,    # set to zero for no background photons
    eta=0.5,  # set to zero for no Gaussian noise
    ref_altitude=500.0,
    ref_photon_count=20.0,
)

sample = ds[0]

# High-res "clean" HHDC: [bins, H, W]
clean = sample["cube"]

# Low-res noisy measurement generated by the forward model: [bins, H_low, W_low]
noisy = forward_model(clean)

# Example: train a denoising/super-resolution model (my_model: noisy -> clean)
pred = my_model(noisy.unsqueeze(0))       # expected shape: [1, bins, H, W]
loss = loss_fn(pred, clean.unsqueeze(0))  # shapes must match
loss.backward()
```

## Evaluation

![image/png](Results.png)

- Recommended metrics: PSNR and SSIM on the canopy height model (CHM), digital terrain model (DTM), and 50th percentile height maps (all derivable via `hhdc.canopy_plots.create_chm` in the HHDC-Creator repo).
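
As a reference for the metric setup, a minimal PSNR computation on a CHM-like height map might look like the sketch below. The peak-value convention is an assumption (the reference maximum is used by default; a fixed physical peak such as the 64 m vertical extent is another reasonable choice), and SSIM would typically come from a library such as `skimage.metrics.structural_similarity`.

```python
import numpy as np

def psnr(reference, estimate, peak=None):
    """PSNR in dB between two height maps (e.g. CHMs or DTMs)."""
    reference = np.asarray(reference, dtype=np.float64)
    estimate = np.asarray(estimate, dtype=np.float64)
    mse = np.mean((reference - estimate) ** 2)
    if mse == 0:
        return float("inf")
    if peak is None:
        peak = reference.max()  # assumed convention; a fixed peak also works
    return 10.0 * np.log10(peak ** 2 / mse)

chm_ref = np.full((48, 48), 30.0)        # toy 30 m canopy everywhere
chm_est = chm_ref + 1.0                  # estimate biased by 1 m
print(round(psnr(chm_ref, chm_est), 2))  # 10*log10(900/1) = 29.54
```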

## Limitations and Risks

- Forward model parameters (beam diameter, noise levels, output resolution, altitude) control task difficulty; we recommend documenting the values you use per experiment (e.g., in your own metadata/config). In the original HHDC-Creator pipeline these are stored per sample in metadata, but this HF dataset does not include that field.
- Outputs are simulated; real sensor artifacts (boresight errors, occlusions, calibration drift) are not modeled.
- NEON LiDAR is collected over North America; models may not generalize to other biomes or sensor geometries without adaptation.

## Licensing

- Derived from NEON AOP discrete-return LiDAR (DP1.30003.001). Follow the NEON Data Usage and Citation Policy and cite the original survey months/sites used.
- Include the citation for the Hyperheight paper when publishing results that use this dataset.

## Citation

```
@article{ramirez2024hyperheight,
  title={Hyperheight lidar compressive sampling and machine learning reconstruction of forested landscapes},
  author={Ramirez-Jaime, Andres and Pena-Pena, Karelia and Arce, Gonzalo R and Harding, David and Stephen, Mark and MacKinnon, James},
  journal={IEEE Transactions on Geoscience and Remote Sensing},
  volume={62},
  pages={1--16},
  year={2024},
  publisher={IEEE}
}

@article{ramirez2025super,
  title={Super-Resolved 3D Satellite Lidar Imaging of Earth Via Generative Diffusion Models},
  author={Ramirez-Jaime, Andres and Porras-Diaz, Nestor and Arce, Gonzalo R and Stephen, Mark},
  journal={IEEE Transactions on Geoscience and Remote Sensing},
  year={2025},
  publisher={IEEE}
}

@inproceedings{ramirez2025denoising,
  title={Denoising and Super-Resolution of Satellite Lidars Using Diffusion Generative Models},
  author={Ramirez-Jaime, Andres and Porras-Diaz, Nestor and Arce, Gonzalo R and Stephen, Mark},
  booktitle={2025 IEEE Statistical Signal Processing Workshop (SSP)},
  pages={1--5},
  year={2025},
  organization={IEEE}
}
```

## Maintainers

- Andres Ramirez-Jaime — aramjai@udel.edu