---
language:
- en
size_categories:
- n<1K
tags:
- robotics
- drone-navigation
- vision-language-navigation
- open-vocabulary-detection
- embodied-ai
- habitat-sim
license: cc-by-nc-4.0
---
Yonder Sample — NeurIPS 2026 Reviewer Preview
This is a ~500 MB sample of the Yonder drone navigation dataset, intended for fast inspection by NeurIPS reviewers and others who want to verify the data format and content before downloading the full ~3.3 TB release.
What's included
- One HSSD scene: `hssd-102343992`
- 50 consecutive waypoint NPZs (`wp0000` through `wp0049`)
- All 12 yaw orientations per waypoint
- All sensor modalities present in the full dataset: stereo RGB (left/right), forward depth, landing camera, up-IR, down-IR, 360° LiDAR, position, orientation
- Semantic segmentation for every waypoint (50 matching `*_semantics.npz` files)
- ~50 × ~10 MB sensor + 50 × ~25 KB semantics ≈ 500 MB total
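After downloading (see the quick start below), these counts can be sanity-checked with a quick glob. This is a minimal sketch; it assumes the flat file layout implied by the filenames shown on this card:

```python
from pathlib import Path

def count_sample_files(root):
    """Count sensor NPZs and semantics NPZs for the sample scene.

    Expected for this sample: (50, 50).
    """
    root = Path(root)
    # The sensor glob also matches semantics files, so filter them out.
    sensor = [p for p in root.glob("hssd-102343992_wp*.npz")
              if not p.name.endswith("_semantics.npz")]
    semantics = list(root.glob("hssd-102343992_wp*_semantics.npz"))
    return len(sensor), len(semantics)
```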
What's NOT in this sample
- Multiple scenes. By design, this sample is a single-scene slice. The full release spans 167 HSSD scenes, all with semantic annotations. Other scene sources (ReplicaCAD, Replica, HM3D) considered during early collection have been excluded from the public release for license-compatibility reasons; see the main dataset card for details.
- COCO bounding-box annotations. These are derived programmatically from the semantic channels and shipped per-scene under `annotations/` in the main repo.
Quick start
```python
from huggingface_hub import snapshot_download
import numpy as np

local = snapshot_download(repo_id="astralhf/yonder-sample", repo_type="dataset")

# Sensor data
data = np.load(f"{local}/hssd-102343992_wp0000.npz")
print(sorted(data.keys())[:10])
# ['down_ir', 'lidar360', 'orientation', 'position', 'up_ir',
#  'yaw000_forward_depth', 'yaw000_landing_cam', 'yaw000_left_rgb',
#  'yaw000_right_rgb', 'yaw001_forward_depth']

print("left_rgb yaw000:", data["yaw000_left_rgb"].shape, data["yaw000_left_rgb"].dtype)
# left_rgb yaw000: (480, 640, 3) uint8
print("forward_depth yaw000:", data["yaw000_forward_depth"].shape, data["yaw000_forward_depth"].dtype)
# forward_depth yaw000: (480, 640) float16
print("lidar360:", data["lidar360"].shape, data["lidar360"].dtype)
# lidar360: (1024, 16) float32

# Semantic segmentation (one file per waypoint, 12 yaw keys)
sem = np.load(f"{local}/hssd-102343992_wp0000_semantics.npz")
print(sorted(sem.keys())[:4])
# ['yaw000_semantic', 'yaw030_semantic', 'yaw060_semantic', 'yaw090_semantic']
print("semantic yaw000:", sem["yaw000_semantic"].shape, sem["yaw000_semantic"].dtype)
# semantic yaw000: (480, 640) uint16 (per-pixel instance ID)
```
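The COCO annotations in the main repo are derived programmatically from these semantic channels. As an illustration of one plausible derivation (per-instance pixel extents; the actual annotation pipeline may differ), here is a minimal sketch:

```python
import numpy as np

def instance_boxes(mask, background=0):
    """Compute COCO-style [x, y, w, h] boxes from a per-pixel instance-ID mask.

    `background` is an assumed ID for unlabeled pixels and is skipped.
    """
    boxes = {}
    for inst_id in np.unique(mask):
        if inst_id == background:
            continue
        ys, xs = np.nonzero(mask == inst_id)
        x0, y0 = xs.min(), ys.min()
        boxes[int(inst_id)] = [int(x0), int(y0),
                               int(xs.max() - x0 + 1), int(ys.max() - y0 + 1)]
    return boxes

# Usage on the sample's semantics:
# boxes = instance_boxes(sem["yaw000_semantic"])
```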
Visualizing a frame
```python
import numpy as np
import matplotlib.pyplot as plt

# Paths assume the NPZ files are in the working directory
# (or prefix them with the snapshot path from the quick start).
data = np.load("hssd-102343992_wp0000.npz")
sem = np.load("hssd-102343992_wp0000_semantics.npz")

fig, axes = plt.subplots(1, 4, figsize=(20, 5))
axes[0].imshow(data["yaw000_left_rgb"]); axes[0].set_title("Left RGB")
axes[1].imshow(data["yaw000_right_rgb"]); axes[1].set_title("Right RGB")
axes[2].imshow(data["yaw000_forward_depth"], cmap="plasma")
axes[2].set_title("Forward depth (m)")
axes[3].imshow(sem["yaw000_semantic"], cmap="tab20")
axes[3].set_title("Semantic instance IDs")
for a in axes:
    a.axis("off")
plt.tight_layout()
plt.savefig("yonder_sample.png", dpi=150)
```
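The forward depth maps are metric (float16, metres). For quick inspection outside matplotlib, a depth map can be normalized to 8-bit. This sketch assumes zero pixels mark missing returns, a convention not confirmed by this card:

```python
import numpy as np

def depth_to_uint8(depth):
    """Normalize a metric depth map to uint8 for quick visualization.

    Zero pixels are treated as invalid and mapped to 0 (an assumption).
    """
    d = depth.astype(np.float32)  # float16 -> float32 for stable math
    valid = d > 0
    if not valid.any():
        return np.zeros(d.shape, dtype=np.uint8)
    lo, hi = d[valid].min(), d[valid].max()
    scaled = np.zeros_like(d)
    scaled[valid] = (d[valid] - lo) / max(hi - lo, 1e-6)
    return (scaled * 255).astype(np.uint8)

# Usage: img = depth_to_uint8(data["yaw000_forward_depth"])
```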
Going to the full dataset
```python
from huggingface_hub import snapshot_download

# Single scene from the full repo (~25 GB sensor + semantics)
snapshot_download(
    repo_id="astralhf/yonder",
    repo_type="dataset",
    allow_patterns=[
        "indoor/drone-data/augmented/hssd-102343992/*.npz",
        "semantics/hssd-102343992/*.npz",
        "annotations/hssd-102343992/*.json",
    ],
)

# Just the manifests (a few MB) to plan a custom download
snapshot_download(
    repo_id="astralhf/yonder",
    repo_type="dataset",
    allow_patterns="indoor/drone-data/augmented/*/manifest.json",
)
```
License
CC-BY-NC-4.0 (inheriting HSSD's NonCommercial restriction). See the main dataset card for full licensing and Responsible AI considerations.
Citation
```bibtex
@inproceedings{anonymous2026yonder,
  title     = {Yonder: A 4.65M-Frame Drone Navigation Dataset and the Cross-Simulator Generalization Gap},
  author    = {Anonymous Author(s)},
  booktitle = {NeurIPS Datasets and Benchmarks Track},
  year      = {2026},
  note      = {Anonymized for double-blind review.}
}
```