# CosFly-Track

## Overview
This is the pre-release open-source dataset for the paper CosFly-Track: A Large-Scale Multi-Modal Dataset for UAV Visual Tracking via Multi-Constraint Trajectory Optimization. We currently release a filtered local snapshot, and additional data will be published here in future updates.
CosFly-Track contains simulated UAV visual tracking trajectories for target tracking and waypoint prediction. Each trajectory records RGB observations, depth and segmentation artifacts, drone poses, target annotations, camera projection metadata, and navigation waypoints.
The current local snapshot contains:
- 263 parent trajectory directories.
- 526 trace directories.
- 526 `trajectory.json` files.
- Trace types observed: `ORI` and `aug_001`.
- Towns observed: `Town01`, `Town02`, `Town03`, `Town04`, `Town05`, `Town06`, `Town07`, `Town10HD`, and their `_Opt` variants.

Per-trace counts:
- `ORI`: 263
- `aug_001`: 263
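These counts can be reproduced with a small glob sketch over the `data_v7` layout (the function name is illustrative; the path pattern follows the directory layout described below):

```python
from collections import Counter
from pathlib import Path


def count_traces(root: Path) -> Counter:
    """Count trace directories (e.g. ORI / aug_001) that contain a trajectory.json."""
    return Counter(
        trace.name
        for trace in sorted(root.glob("Town*/trajectory_*/*"))
        if trace.is_dir() and (trace / "trajectory.json").exists()
    )


if __name__ == "__main__":
    # For the released snapshot this should report {'ORI': 263, 'aug_001': 263}.
    print(dict(count_traces(Path("data_v7"))))
```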
Each released trajectory contains both a clean `ORI` trace and an augmented `aug_001` trace. `ORI` is the clean reference; `aug_001` perturbs the drone pose while keeping frame indices aligned. The augmented RGB observations and current pose can be used to train for robustness, while the clean `ORI` future waypoints can serve as denoising supervision when desired.
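A hedged sketch of that pairing: load the perturbed `aug_001` points as inputs and look up the clean `ORI` point with the same frame `index` as supervision. The per-point key names (`index`, `drone_pose`, `nav_waypoint`) follow the schema listed later in this card; the function name and generator shape are illustrative.

```python
import json
from pathlib import Path


def paired_points(parent_dir: Path):
    """Yield (augmented point, clean ORI point) pairs aligned by frame index.

    parent_dir is a trajectory_<id> directory containing ORI/ and aug_001/.
    """
    ori = json.loads((parent_dir / "ORI" / "trajectory.json").read_text())
    aug = json.loads((parent_dir / "aug_001" / "trajectory.json").read_text())

    # Index the clean trace by frame index for O(1) lookup.
    ori_by_index = {p["index"]: p for p in ori.get("points", [])}

    for aug_point in aug.get("points", []):
        clean = ori_by_index.get(aug_point["index"])
        if clean is not None:
            # aug_point carries the perturbed input pose; clean carries
            # the ORI waypoints usable as denoising supervision.
            yield aug_point, clean
```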
## Directory Layout

The filtered release is organized by town, parent trajectory, and trace type. The original date level has been removed from the release layout.
```
CosFly-Track/
├── README_ZH.md
├── data_sample/
│   └── trajectory_<id>/
│       ├── ORI/
│       │   ├── trajectory.json
│       │   └── frames_playback/
│       │       └── frame_<index>/
│       │           ├── rgb.png
│       │           ├── depth.npy
│       │           ├── instance.png
│       │           ├── debug.png
│       │           └── meta.json
│       └── aug_001/
│           ├── trajectory.json
│           ├── perturbation_report.json
│           └── frames_playback/
│               └── frame_<index>/
│                   ├── rgb.png
│                   ├── depth.npy
│                   ├── instance.png
│                   ├── debug.png
│                   └── meta.json
└── data_v7/
    └── Town<id>/
        └── trajectory_<id>/
            ├── augmentation_summary.json
            ├── ORI/
            │   ├── trajectory.json
            │   └── frames_playback/
            │       └── frame_<index>/
            │           ├── rgb.png
            │           ├── depth.npy
            │           ├── instance.png
            │           ├── debug.png
            │           └── meta.json
            └── aug_001/
                ├── trajectory.json
                ├── perturbation_report.json
                └── frames_playback/
                    └── frame_<index>/
                        ├── rgb.png
                        ├── depth.npy
                        ├── instance.png
                        ├── debug.png
                        └── meta.json
```
## Data Files

Each trace contains one trajectory-level JSON file:

```
data_v7/<Town>/trajectory_<id>/<trace_dir>/trajectory.json
```

Example:

```
data_v7/Town01/trajectory_1777083733/aug_001/trajectory.json
```
Observed top-level keys include:
- `camera`
- `dataset_format`
- `frames_dir`
- `pedestrian_blueprint`
- `points`
- `schema`
- `schema_version`
- `source`
- `trace_dir`
Additional augmentation metadata is stored at the parent trajectory level:

```
data_v7/<Town>/trajectory_<id>/augmentation_summary.json
data_v7/<Town>/trajectory_<id>/aug_001/perturbation_report.json
```
The `points` array stores per-frame annotations. Observed per-point keys include:
- `drone_pose`
- `index`
- `is_perturbed`
- `nav_waypoint`
- `perturbation`
- `target`
- `timing`
- `world_to_camera`
Important nested fields:
- `drone_pose`: UAV position and attitude, including `x`, `y`, `z`, `pitch`, `yaw`, and `roll`.
- `target`: target actor metadata and visual annotations, including visibility, image coordinates, depth, and 3D bounding box.
- `nav_waypoint`: navigation waypoint annotations in both world and image coordinates.
- `world_to_camera`: transformation matrix for projecting world coordinates into the camera frame.
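As an illustration of how `world_to_camera` might be applied, the sketch below projects a homogeneous world point into the camera frame. The 4x4 column-vector convention is an assumption; verify it against the `camera` metadata before relying on it.

```python
import numpy as np


def world_to_camera_point(world_to_camera, world_xyz):
    """Project a 3D world point into camera coordinates.

    Assumes world_to_camera is a 4x4 matrix applied to column vectors;
    check the dataset's camera metadata to confirm the convention.
    """
    m = np.asarray(world_to_camera, dtype=float).reshape(4, 4)
    p = np.append(np.asarray(world_xyz, dtype=float), 1.0)  # homogeneous coords
    cam = m @ p
    return cam[:3] / cam[3]  # back to 3D (cam[3] is 1 for rigid transforms)
```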
Each playback frame directory contains:
- `rgb.png`: RGB observation, usually 1280 x 720.
- `depth.npy`: depth array in NumPy format.
- `instance.png`: instance segmentation image.
- `debug.png`: visualization/debug image with overlays.
- `meta.json`: per-frame metadata aligned with the corresponding `points` entry.
## Minimal Loader Example

```python
from pathlib import Path
import json

import numpy as np
from PIL import Image

root = Path("data_v7")

for traj_json in sorted(root.glob("Town*/trajectory_*/*/trajectory.json")):
    with traj_json.open("r", encoding="utf-8") as f:
        traj = json.load(f)

    trace_dir = traj.get("trace_dir")
    points = traj.get("points", [])
    if not points:
        continue

    # Load the first annotated frame of this trace.
    first = points[0]
    frame_index = first["index"]
    frame_dir = traj_json.parent / "frames_playback" / f"frame_{frame_index:05d}"

    rgb = Image.open(frame_dir / "rgb.png")
    depth = np.load(frame_dir / "depth.npy")
    with (frame_dir / "meta.json").open("r", encoding="utf-8") as f:
        meta = json.load(f)

    print(
        trace_dir,
        len(points),
        rgb.size,
        depth.shape,
        first["drone_pose"],
        first["target"],
        meta["frame_id"],
    )
    break
```
## Filtering and Quality Criteria

The current release is generated from filtered v7 data. The main checks include:
- Integrity checks: `ORI` and `aug_*` traces exist, required frame files are readable, and `meta.json` contains `frame_id`.
- Basic trajectory quality: average target-drone distance <= 35, max distance <= 50, max target height <= 2, target visible ratio > 55%, and adjacent drone z-step <= 5.
- Drone collision checks: drone poses must not enter town-specific map 3D bounding boxes.
- Target collision checks: target 3D bounding boxes must not overlap map objects beyond the configured threshold.
- Town-map consistency: map bounding boxes are selected according to the Town name in each trajectory path.
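The basic trajectory-quality thresholds can be expressed as a small predicate over per-frame values. How those values are extracted from `drone_pose` and `target` is left out here, since the exact nested key names vary; this sketch only encodes the thresholds themselves.

```python
def passes_quality_checks(distances, target_heights, visible_flags, drone_zs):
    """Apply the release's basic trajectory-quality thresholds.

    distances: target-drone distance per frame; target_heights: target height
    per frame; visible_flags: bool per frame; drone_zs: drone z per frame.
    """
    if not distances:
        return False
    avg_dist = sum(distances) / len(distances)
    visible_ratio = sum(visible_flags) / len(visible_flags)
    # Largest altitude change between adjacent frames.
    max_z_step = max(
        (abs(b - a) for a, b in zip(drone_zs, drone_zs[1:])), default=0.0
    )
    return (
        avg_dist <= 35
        and max(distances) <= 50
        and max(target_heights) <= 2
        and visible_ratio > 0.55
        and max_z_step <= 5
    )
```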
## Recommended Quality Checks

Before packaging a public release, run the following checks:
- Ensure every parent trajectory contains both `ORI` and `aug_001`.
- Ensure every trace has a readable `trajectory.json`.
- Ensure every frame referenced by `trajectory.json` has `meta.json`, `rgb.png`, `depth.npy`, `instance.png`, and `debug.png`.
- Optionally decode all RGB and debug images with Pillow to catch corrupted images.
- Verify that frame indices between `ORI` and `aug_001` are aligned.
- Verify that public manifests and JSON metadata do not expose unintended internal machine paths.
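The frame-completeness check can be scripted. A minimal sketch that walks the release and reports referenced files that are absent (the function name is illustrative; file names and path pattern follow the layout described above):

```python
import json
from pathlib import Path

FRAME_FILES = ("rgb.png", "depth.npy", "instance.png", "debug.png", "meta.json")


def missing_files(root: Path):
    """Return frame files that trajectory.json references but that are absent."""
    missing = []
    for traj_json in sorted(root.glob("Town*/trajectory_*/*/trajectory.json")):
        traj = json.loads(traj_json.read_text(encoding="utf-8"))
        for point in traj.get("points", []):
            frame_dir = (
                traj_json.parent / "frames_playback" / f"frame_{point['index']:05d}"
            )
            missing.extend(
                frame_dir / name
                for name in FRAME_FILES
                if not (frame_dir / name).exists()
            )
    return missing
```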
## Coordinate System

The trajectory fields `x`, `y`, `z`, `pitch`, `yaw`, and `roll` follow CARLA / Unreal-style world coordinates and Euler angles. For Three.js visualization, the project maps a position `(x, y, z)` to `(x, z, -y)`.
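For reference, that position mapping as a one-line helper (the function name is illustrative; it covers position only, not the Euler angles):

```python
def carla_to_threejs(x: float, y: float, z: float) -> tuple:
    """Map a CARLA/Unreal world position to the Three.js convention (x, z, -y)."""
    return (x, z, -y)
```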
## Intended Uses
CosFly-Track can be used for:
- UAV visual target tracking.
- Waypoint prediction and visual navigation.
- Robustness training with augmented UAV poses.
- Multi-modal learning from RGB, depth, segmentation, and structured trajectory metadata.
- Trajectory prediction, visibility modeling, and simulator-based evaluation.
## Release Checklist
The following items should be finalized before public open-source release:
- Public Hugging Face dataset URL.
- Dataset license.
- Paper authors and final citation.
- Train/validation/test split manifest.
- Benchmark metric definition and evaluation script path.
- Known limitations and simulator domain-gap notes.
## Citation

Citation information is pending. Replace this section with the final BibTeX entry before release.

```bibtex
@misc{cosfly_track,
  title        = {CosFly-Track: A Large-Scale Multi-Modal Dataset for UAV Visual Tracking via Multi-Constraint Trajectory Optimization},
  author       = {TBD},
  year         = {2026},
  howpublished = {TBD}
}
```