download_size: 4642822201
dataset_size: 107628350024
---
## Hyperheight Data Cube Denoising and Super-Resolution
## Dataset Summary
- Generation code and pipeline: https://github.com/Anfera/HHDC-Creator (HHDC-Creator repo).
- 3-D photon-count waveforms (hyperheight data cubes, HHDCs) built from NEON discrete-return LiDAR using the HHDC pipeline (`hhdc/cube_generator.py`).
- Each cube stores a high-resolution canopy volume (default: 0.5 m vertical bins over 64 m of height, footprints every 2 m) across a 96 m × 96 m tile; the actual settings are recorded per sample in `metadata`.
- Inputs for learning are simulated observations from the physics-based forward imaging model (`hhdc/forward_model.py`), which emulates the Concurrent Artificially-intelligent Spectrometry and Adaptive Lidar System (CASALS) by applying Gaussian beam aggregation, distance-based photon loss, and mixed Poisson + Gaussian noise to downsample and perturb the cube.
- Targets are the clean, high-resolution cubes. The pairing supports denoising and spatial super-resolution; recommended settings use 10 m diameter footprints sampled on a 3 m × 6 m grid (along/across swath), and users can adjust these parameters as needed.
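The measurement process can be illustrated with a minimal sketch. This is illustrative only: the reference implementation is `hhdc/forward_model.py`, and the average pooling here is a simple stand-in for Gaussian beam aggregation; `photon_scale` and `read_noise_std` are made-up parameters.

```python
import torch

def simulate_measurement(clean_cube: torch.Tensor,
                         pool=(2, 2),
                         photon_scale: float = 0.5,
                         read_noise_std: float = 0.1) -> torch.Tensor:
    """Toy stand-in for the CASALS forward model: spatial aggregation,
    photon loss, Poisson shot noise, then additive Gaussian noise."""
    x = clean_cube.float().unsqueeze(0)           # [1, bins, H, W]
    # Spatial aggregation (average pooling stands in for Gaussian footprints)
    x = torch.nn.functional.avg_pool2d(x, kernel_size=pool)
    # Distance-based photon loss: scale expected counts down
    rate = photon_scale * x.clamp(min=0.0)
    # Poisson shot noise on photon counts, plus Gaussian detector noise
    noisy = torch.poisson(rate) + read_noise_std * torch.randn_like(rate)
    return noisy.squeeze(0)                       # [bins, H/2, W/2]

cube = torch.randint(0, 30, (128, 48, 48))        # synthetic HHDC-sized cube
meas = simulate_measurement(cube)
print(meas.shape)                                 # torch.Size([128, 24, 24])
```

The real forward model additionally accounts for footprint diameter, reference altitude, and detector efficiency (see the Usage section for its parameters).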

## Supported Tasks
- Denoising of LiDAR photon-count hyperheight data cubes.
- Super-resolution / resolution enhancement of LiDAR volumes.
- Robust reconstruction under realistic sensor noise simulated by the forward model.

## Dataset Structure
- **File layout:** NPZ archives, one per tile. Recommended HF repo structure: `train/`, `val/`, and `test/` folders containing NPZs (or store them directly, with split metadata in the HF dataset script).
- **Per-sample contents (NPZ keys):**
  - `clean_cube`: `int32`, shape `[bins, H, W]` — high-res histogram (after swapping to channel-first).
  - `x_centers`, `y_centers`: `float64` — footprint centers (meters, projected CRS from the NEON tiles).
  - `bin_edges`: `float64` — height bin edges (meters).
  - `footprint_counts`: `int32` — raw point counts per footprint before histogramming.
  - `metadata`: JSON string with the cube config, tile bounds, tile indices, outlier quantile used, altitude, and source file name.
- **Typical shapes (example with `cube_config_sample.json`, `cube_length=96 m`, and `footprint_separation=2 m`):**
  - `clean_cube`: `[128, 48, 48]` (64 m / 0.5 m vertical bins; 96 m tile / 2 m footprint grid).
  - `noisy_cube`: `[128, 32, 16]` when using `output_res_m=(3.0, 6.0)` in the forward model.
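A tile can be inspected offline with plain NumPy. The sketch below writes a tiny synthetic NPZ with the documented keys (a stand-in for a real tile file) and reads it back:

```python
import io
import json
import numpy as np

# Synthetic stand-in for one tile, using the documented NPZ keys
buf = io.BytesIO()
np.savez(
    buf,
    clean_cube=np.zeros((128, 48, 48), dtype=np.int32),
    x_centers=np.linspace(0.0, 94.0, 48),
    y_centers=np.linspace(0.0, 94.0, 48),
    bin_edges=np.arange(0.0, 64.5, 0.5),          # 129 edges -> 128 bins
    footprint_counts=np.zeros((48, 48), dtype=np.int32),
    metadata=json.dumps({"cube_length": 96, "footprint_separation": 2}),
)
buf.seek(0)

with np.load(buf, allow_pickle=False) as npz:
    cube = npz["clean_cube"]                      # [bins, H, W]
    meta = json.loads(str(npz["metadata"]))       # JSON string -> dict
    print(cube.shape, len(npz["bin_edges"]))      # (128, 48, 48) 129
```

For a real tile, replace `buf` with the NPZ path; the key names and dtypes follow the list above.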

## Usage
```python
from datasets import load_dataset
import torch

from hhdc.forward_model import LidarForwardImagingModel  # or your actual import path

# Load the dataset
ds = load_dataset("anfera236/HHDC", split="train")

# Instantiate the LiDAR forward model (use your actual parameters)
forward_model = LidarForwardImagingModel(
    input_res_m=(2.0, 2.0),
    output_res_m=(3.0, 6.0),
    footprint_diameter_m=10.0,
    b=0.1,
    eta=0.5,
    ref_altitude=500.0,
    ref_photon_count=20.0,
)

sample = ds[0]

# High-res "clean" HHDC: [bins, H, W]
clean = torch.tensor(sample["clean_cube"])

# Low-res noisy measurement generated by the forward model: [bins, H_low, W_low]
noisy = forward_model(clean)

# Example: train a denoising / super-resolution model (my_model: noisy -> clean)
pred = my_model(noisy.unsqueeze(0))       # [1, bins, H, W] ideally
loss = loss_fn(pred, clean.unsqueeze(0))  # shapes must match
loss.backward()
```
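For batched training, samples can be wrapped in a standard `torch.utils.data.Dataset`. A minimal sketch, assuming each record exposes a `clean_cube` array as in the NPZ layout (synthetic records stand in for the HF dataset here; `HHDCDataset` is a hypothetical helper, not part of the repo):

```python
import numpy as np
import torch
from torch.utils.data import DataLoader, Dataset

class HHDCDataset(Dataset):
    """Wraps any indexable collection of records with a `clean_cube` key."""
    def __init__(self, records):
        self.records = records

    def __len__(self):
        return len(self.records)

    def __getitem__(self, idx):
        cube = np.asarray(self.records[idx]["clean_cube"], dtype=np.float32)
        return torch.from_numpy(cube)             # [bins, H, W]

# Synthetic records; in practice pass load_dataset("anfera236/HHDC", split="train")
records = [{"clean_cube": np.zeros((128, 48, 48), dtype=np.int32)} for _ in range(4)]
loader = DataLoader(HHDCDataset(records), batch_size=2, shuffle=True)
batch = next(iter(loader))
print(batch.shape)                                # torch.Size([2, 128, 48, 48])
```

The noisy input can then be generated per batch with the forward model inside the training loop, so each epoch sees a fresh noise realization.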

## Evaluation
- Recommended metrics: PSNR and SSIM on the canopy height model (CHM), digital terrain model (DTM), and 50th-percentile height maps (all derivable via `hhdc.canopy_plots.create_chm` in the HHDC-Creator repo).
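As a sketch of the metric pipeline (illustrative only; `hhdc.canopy_plots.create_chm` in the repo is the reference implementation), a CHM can be taken as the height of the highest occupied bin per footprint, with PSNR computed on the resulting 2-D map. This assumes bin 0 sits at the ground and a 0.5 m bin size:

```python
import numpy as np

def chm_from_cube(cube: np.ndarray, bin_size_m: float = 0.5) -> np.ndarray:
    """Height of the highest nonzero bin per footprint (0 where empty).
    Assumes cube is [bins, H, W] with bin 0 at the ground."""
    occupied = cube > 0                                   # [bins, H, W]
    # Highest occupied bin index: argmax over the height axis, reversed
    top = cube.shape[0] - 1 - np.argmax(occupied[::-1], axis=0)
    any_hit = occupied.any(axis=0)
    return np.where(any_hit, (top + 1) * bin_size_m, 0.0)

def psnr(ref: np.ndarray, est: np.ndarray, peak: float) -> float:
    mse = np.mean((ref - est) ** 2)
    return float(10.0 * np.log10(peak ** 2 / mse)) if mse > 0 else float("inf")

cube = np.zeros((128, 4, 4), dtype=np.int32)
cube[40, :, :] = 5                                        # canopy returns in bin 40
chm = chm_from_cube(cube)
print(chm[0, 0])                                          # 20.5 (m)
print(psnr(chm, chm + 0.5, peak=64.0))                    # PSNR of a 0.5 m bias
```

The same pattern applies to DTM and percentile-height maps by changing which bin statistic is extracted.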

## Limitations and Risks
- Forward-model parameters (beam diameter, noise levels, output resolution, altitude) control task difficulty; the values used for each sample are recorded in `metadata`.
- Outputs are simulated; real sensor artifacts (boresight errors, occlusions, calibration drift) are not modeled.
- NEON LiDAR is collected over North America; models may not generalize to other biomes or sensor geometries without adaptation.

## Licensing
- Derived from NEON AOP discrete-return LiDAR (DP1.30003.001). Follow the NEON Data Usage and Citation Policy and cite the original survey months/sites used.
- Include the citation for the Hyperheight paper when publishing results that use this dataset.

## Citation
```bibtex
@article{ramirez2024hyperheight,
  title={Hyperheight lidar compressive sampling and machine learning reconstruction of forested landscapes},
  author={Ramirez-Jaime, Andres and Pena-Pena, Karelia and Arce, Gonzalo R and Harding, David and Stephen, Mark and MacKinnon, James},
  journal={IEEE Transactions on Geoscience and Remote Sensing},
  volume={62},
  pages={1--16},
  year={2024},
  publisher={IEEE}
}
```

## Maintainers
- Andres Ramirez-Jaime — aramjai@udel.edu