|
|
--- |
|
|
license: mit |
|
|
pipeline_tag: robotics |
|
|
tags: |
|
|
- autonomous-driving |
|
|
- imitation-learning |
|
|
- carla |
|
|
- transfuser |
|
|
pretty_name: LEAD Carla Leaderboard 2.0 |
|
|
size_categories: |
|
|
- 1M<n<10M |
|
|
--- |
|
|
|
|
|
# LEAD: Minimizing Learner–Expert Asymmetry in End-to-End Driving |
|
|
|
|
|
[**Project Page**](https://ln2697.github.io/lead) | [**Paper**](https://huggingface.co/papers/2512.20563) | [**Code**](https://github.com/autonomousvision/lead) |
|
|
|
|
|
Official CARLA dataset accompanying our paper *LEAD: Minimizing Learner–Expert Asymmetry in End-to-End Driving*.
|
|
|
|
|
> We release the complete pipeline required to achieve state-of-the-art closed-loop performance on the Bench2Drive benchmark. Built around the CARLA simulator, the stack features a data-centric design with: |
|
|
> |
|
|
> - Extensive visualization suite and runtime type validation for easier debugging. |
|
|
> - Optimized storage format that packs 72 hours of driving into ~200 GB.
|
|
> - Native support for NAVSIM and the Waymo Vision-based E2E benchmark, extending both through closed-loop simulation and synthetic data for additional supervision during training.
|
|
|
|
|
Find more information at [https://github.com/autonomousvision/lead](https://github.com/autonomousvision/lead).
|
|
|
|
|
## Format |
|
|
|
|
|
Each route is stored as a sequence of synchronized frames. All sensor modalities are ego-centric and time-aligned. |
|
|
In addition to the nominal sensor suite, we provide a second, perturbed sensor stack rendered from a counterfactual ego state, used for recovery supervision (the `*_perturbated` directories below).
|
|
|
|
|
```text
├── bboxes/                 # Per-frame 3D bounding boxes for all actors
├── depth/                  # Compressed depth maps (for auxiliary supervision only)
├── depth_perturbated/      # Depth maps rendered from the perturbed ego state
├── hdmap/                  # Ego-centric rasterized HD map
├── hdmap_perturbated/      # HD map aligned to the perturbed ego pose
├── lidar/                  # LiDAR point clouds
├── metas/                  # Per-frame metadata and ego state
├── radar/                  # Radar detections
├── radar_perturbated/      # Radar detections from the perturbed ego state
├── rgb/                    # Front-facing RGB images
├── rgb_perturbated/        # RGB images rendered from the perturbed ego state
├── semantics/              # Semantic segmentation maps
├── semantics_perturbated/  # Semantic maps rendered from the perturbed ego state
└── results.json            # Route-level summary and evaluation metadata
```
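Since all modalities share frame identifiers, iterating over a route reduces to globbing one directory and deriving sibling paths. Below is a minimal loading sketch; the route path and file extensions are illustrative assumptions, so consult the data loaders in the [lead repository](https://github.com/autonomousvision/lead) for the exact formats.

```python
import json
from pathlib import Path

# Hypothetical location of one unzipped route; adjust to your local layout.
route = Path("data/carla_leaderboard2/routes/route_00000")

# All modalities are time-aligned, so the per-frame files in metas/
# index the whole route.
for meta_path in sorted((route / "metas").glob("*.json")):  # assumed extension
    frame_id = meta_path.stem
    meta = json.loads(meta_path.read_text())  # per-frame metadata and ego state

    # Sibling modalities share the same frame id, e.g. (extensions assumed):
    rgb_path = route / "rgb" / f"{frame_id}.jpg"                        # nominal view
    rgb_perturbed_path = route / "rgb_perturbated" / f"{frame_id}.jpg"  # counterfactual ego state
    # bboxes/, depth/, lidar/, radar/, semantics/ follow the same pattern.
```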
|
|
|
|
|
## Download |
|
|
|
|
|
You can either download a **single route** (useful for quick inspection / debugging) or **clone the full dataset** via Git LFS and unzip all routes. |
|
|
|
|
|
**Note:** Set up the [lead repository](https://github.com/autonomousvision/lead) before downloading, since the helper scripts below are part of it.
|
|
|
|
|
### Option 1: Download a single route |
|
|
|
|
|
```bash
bash scripts/download_one_route.sh
```
|
|
|
|
|
### Option 2: Download all routes (Git LFS) |
|
|
|
|
|
Clone the dataset repository directly into the expected directory: |
|
|
|
|
|
```bash
git lfs install
git clone https://huggingface.co/datasets/ln2697/lead_carla data/carla_leaderboard2/zip
```
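Alternatively, if you prefer the `huggingface_hub` Python client over Git LFS, a snapshot download into the same directory should be equivalent (a sketch, assuming `huggingface_hub` is installed):

```python
from huggingface_hub import snapshot_download

# Fetch the full dataset into the directory the lead repository expects.
snapshot_download(
    repo_id="ln2697/lead_carla",
    repo_type="dataset",
    local_dir="data/carla_leaderboard2/zip",
)
```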
|
|
|
|
|
### Unzip routes |
|
|
|
|
|
Run:
|
|
|
|
|
```bash
bash scripts/unzip_routes.sh
```
|
|
|
|
|
## Citation |
|
|
|
|
|
If you find this work useful, please cite: |
|
|
|
|
|
```bibtex
@article{Nguyen2025ARXIV,
  title={LEAD: Minimizing Learner-Expert Asymmetry in End-to-End Driving},
  author={Nguyen, Long and Fauth, Micha and Jaeger, Bernhard and Dauner, Daniel and Igl, Maximilian and Geiger, Andreas and Chitta, Kashyap},
  journal={arXiv preprint arXiv:2512.20563},
  year={2025}
}
```
|
|
|
|
|
## License |
|
|
|
|
|
This dataset is released under the [MIT License](LICENSE).