---
language:
- en
license: cc-by-4.0
tags:
- semantic-segmentation
- scene-understanding
- 3d-point-clouds
- multimodal
- drone-imagery
pretty_name: NEST3D - Sociable Weaver Nest 3D Dataset
---

# NEST3D: A High-Resolution Multimodal Dataset of Sociable Weaver Tree Nests

## Dataset Description

NEST3D is a multimodal dataset of 104 sociable weaver nests, combining drone-based RGB and multispectral imagery with semantically annotated 3D RGB point clouds. Each scene captures a tree hosting a nest, acquired through drone-based remote sensing, providing rich spatial and spectral information to benchmark and advance scene-level semantic segmentation methods for computer vision and ecological monitoring applications.

### Key Characteristics

- **Modality**: Multimodal (RGB imagery, multispectral bands, 3D point clouds)
- **Task**: Scene-level semantic segmentation
- **Scale**: Multiple tree-nest scenes with consistent spatial and spectral coverage
- **Annotation**: Point-level semantic labels for 3D point clouds
- **Data Source**: Drone-based RGB and multispectral imagery
- **Application Domain**: Ecological monitoring, wildlife management, 3D semantic segmentation, 3D reconstruction

## Dataset Organization

The dataset is organized into modality-specific directories to support flexible access and reuse:

### Directory Structure

```
NEST3D/
├── train/
│   ├── sample_001/
│   │   ├── RGB/                       # RGB drone images
│   │   │   ├── sample001_RGB_001.JPG
│   │   │   └── ...
│   │   ├── MS/                        # Multispectral imagery
│   │   │   ├── Green/
│   │   │   │   ├── sample001_G_001.TIF
│   │   │   │   └── ...
│   │   │   ├── Red/
│   │   │   │   ├── sample001_R_001.TIF
│   │   │   │   └── ...
│   │   │   ├── Red_Edge/
│   │   │   │   ├── sample001_RE_001.TIF
│   │   │   │   └── ...
│   │   │   └── NIR/
│   │   │       ├── sample001_NIR_001.TIF
│   │   │       └── ...
│   │   └── sample001.npy              # 3D point cloud with labels
│   └── sample_002/
│       └── ...
└── test/
    ├── sample_084/
    │   ├── RGB/
    │   ├── MS/
    │   └── sample084.npy
    └── ...
```
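Because file locations follow this fixed naming scheme, per-scene paths can be built programmatically. The sketch below is a minimal helper, assuming the layout above; the function name `scene_paths` and the integer `scene_id` argument are our own conventions, not part of the dataset.

```python
from pathlib import Path

def scene_paths(root, split, scene_id):
    """Build the expected paths for one scene, following the directory
    layout above. `root` is the local dataset directory and `scene_id`
    is an integer, e.g. 1 for `sample_001` (a hypothetical convention)."""
    scene_dir = Path(root) / split / f"sample_{scene_id:03d}"
    name = f"sample{scene_id:03d}"
    return {
        "rgb_dir": scene_dir / "RGB",
        "ms_dirs": {band: scene_dir / "MS" / band
                    for band in ("Green", "Red", "Red_Edge", "NIR")},
        "point_cloud": scene_dir / f"{name}.npy",
    }

paths = scene_paths("NEST3D", "train", 1)
print(paths["point_cloud"].as_posix())  # NEST3D/train/sample_001/sample001.npy
```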

### Data Modalities

#### 1. **RGB Imagery**

- Raw drone images from aerial acquisition
- Format: JPEG
- Organized by data split and scene identifier
- Example path: `train/sample_001/RGB/sample001_RGB_119.JPG`

#### 2. **Multispectral Imagery**

- Four spectral bands from the same acquisitions as RGB
- Organized into four band-specific folders:
  - **Green (G)**: Green channel imagery
  - **Red (R)**: Red channel imagery
  - **Red Edge (RE)**: Red Edge channel for vegetation analysis
  - **NIR**: Near-Infrared channel for vegetation health assessment
- Format: GeoTIFF (.TIF)
- Example paths:
  - `train/sample_001/MS/Green/sample001_G_119.TIF`
  - `train/sample_001/MS/Red/sample001_R_119.TIF`
  - `train/sample_001/MS/Red_Edge/sample001_RE_119.TIF`
  - `train/sample_001/MS/NIR/sample001_NIR_119.TIF`

#### 3. **3D Point Clouds**

- One NumPy file per scene containing the complete 3D reconstruction
- Format: `.npy` (NumPy binary format)
- Per-point attributes: `[x, y, z, r, g, b, label]`
  - **x, y, z**: 3D spatial coordinates (meters)
  - **r, g, b**: RGB color values (0-255)
  - **label**: Semantic class label (integer)
- Example path: `train/sample_001/sample001.npy`

## Data Splits

The dataset is divided into fixed training and test sets of 83 and 21 scenes, respectively:

- **Training Set**: Used for model training and development
- **Test Set**: Reserved for model evaluation and benchmarking

Each split contains a fixed collection of scenes to ensure reliable, reproducible evaluation.

## Usage

### Loading 3D Point Clouds

```python
import numpy as np

# Load the (N, 7) point cloud with semantic labels
point_cloud = np.load('train/sample_001/sample001.npy')

# Extract coordinates (x, y, z)
xyz = point_cloud[:, :3]

# Extract colors (r, g, b)
rgb = point_cloud[:, 3:6]

# Extract semantic labels as integers
labels = point_cloud[:, 6].astype(int)
```
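A common first step after loading a scene is inspecting its class distribution, e.g. to derive class-balanced loss weights. The sketch below uses a synthetic stand-in for one scene's `(N, 7)` array, since the real file requires downloading the dataset; the class count of 4 is illustrative, not the dataset's actual label set.

```python
import numpy as np

# Synthetic stand-in for one scene's (N, 7) array; with the real data,
# use: point_cloud = np.load('train/sample_001/sample001.npy')
rng = np.random.default_rng(0)
point_cloud = np.hstack([
    rng.uniform(0, 10, size=(1000, 3)),    # x, y, z in meters
    rng.integers(0, 256, size=(1000, 3)),  # r, g, b in 0-255
    rng.integers(0, 4, size=(1000, 1)),    # integer class labels (illustrative)
])

labels = point_cloud[:, 6].astype(int)

# Per-class point counts
classes, counts = np.unique(labels, return_counts=True)
for c, n in zip(classes, counts):
    print(f"class {c}: {n} points")
```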

### Loading Multispectral Imagery

```python
from PIL import Image
import numpy as np

# Load all four bands for a given image
green = np.array(Image.open('train/sample_001/MS/Green/sample001_G_001.TIF'))
red = np.array(Image.open('train/sample_001/MS/Red/sample001_R_001.TIF'))
red_edge = np.array(Image.open('train/sample_001/MS/Red_Edge/sample001_RE_001.TIF'))
nir = np.array(Image.open('train/sample_001/MS/NIR/sample001_NIR_001.TIF'))

# Stack into a single multiband image of shape (H, W, 4)
multispectral = np.stack([green, red, red_edge, nir], axis=-1)
```
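The Red and NIR bands loaded above are exactly what the Normalized Difference Vegetation Index (NDVI), a standard vegetation-health measure, requires: NDVI = (NIR − Red) / (NIR + Red). The sketch below computes it on small synthetic arrays standing in for the real `.TIF` bands; NDVI itself is a standard index, but its use here is our suggestion rather than part of the dataset.

```python
import numpy as np

# Synthetic Red and NIR bands standing in for the real .TIF files;
# with the dataset downloaded, load them as shown above.
red = np.array([[50.0, 100.0], [80.0, 60.0]])
nir = np.array([[200.0, 120.0], [90.0, 180.0]])

# NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]; higher values
# indicate denser / healthier vegetation.
ndvi = (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero
print(ndvi)
```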

### Using with Hugging Face Datasets Library

```python
from datasets import load_dataset

# Load the dataset from the Hugging Face Hub
dataset = load_dataset('NEST3D/dataset')
```

## Downloading the Dataset

### Using the Hugging Face Hub

```bash
pip install huggingface_hub

huggingface-cli download NEST3D/dataset --repo-type dataset --local-dir ./NEST3D
```

## Dataset Information

- **Total Size**:
- **Number of Scenes**: 104 (83 train / 21 test)
- **Modalities**: RGB, Multispectral (4 bands), 3D Point Clouds
- **Image Format**: JPEG (RGB), GeoTIFF (Multispectral)
- **Point Cloud Format**: NumPy arrays
- **Annotation Type**: Per-point semantic labels

## Acknowledgments

This work was funded by:

- **European Union's Horizon Europe** research and innovation programme through the Marie Skłodowska-Curie project **"WildDrone – Autonomous Drones for Nature Conservation"** (grant agreement no. 101071224)
- **EPSRC-funded** "Autonomous Drones for Nature Conservation Missions" grant (EP/X029077/1)
- **Swiss State Secretariat for Education, Research and Innovation (SERI)** under contract number 22.00280

We extend our gratitude to our collaborators and field partners in Namibia for their invaluable support during data collection.

## Contact & Support

For questions, issues, or contributions, please visit the [dataset discussion forum](https://huggingface.co/datasets/NEST3D/dataset/discussions).

**Last Updated**: February 2026

**Dataset Version**: 1.0
|