Update README.md
README.md
```diff
@@ -16,12 +16,12 @@ paperswithcode_id: lisu
 configs:
 - config_name: default
   data_files:
+  - split: train
+    path:
+    - Town01/**/*.parquet
+    - Town03/**/*.parquet
+    - Town05/**/*.parquet
+    - Town07/**/*.parquet
   - split: test
     path:
     - Town02/**/*.parquet
@@ -47,4 +47,74 @@ dataset_info:
     dtype: float32
   - name: surf_norm_z
     dtype: float32
+---
```

# LiSu: A Dataset for LiDAR Surface Normal Estimation

LiSu provides synthetic LiDAR point clouds, each annotated with surface normal vectors.
This dataset is generated using the [CARLA](https://github.com/carla-simulator/carla) simulator, ensuring diverse environmental conditions for robust training and evaluation.
Below is an example from LiSu, where surface normals are linearly mapped to the RGB color space for intuitive visualization:
<div style="display: flex;">
<img src="https://raw.githubusercontent.com/malicd/LiSu/refs/heads/master/docs/imgs/ex1.png" style="width: 49%; margin-right: 10px;">
<img src="https://raw.githubusercontent.com/malicd/LiSu/refs/heads/master/docs/imgs/ex2.png" style="width: 49%;">
</div>
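
The linear mapping used in these visualizations can be sketched in a few lines of NumPy (a minimal sketch; the function name and the clipping step are illustrative, not part of the dataset tooling):

```python
import numpy as np

def normals_to_rgb(normals: np.ndarray) -> np.ndarray:
    """Linearly map unit surface normals from [-1, 1] to RGB values in [0, 255]."""
    rgb = (normals + 1.0) * 0.5 * 255.0  # [-1, 1] -> [0, 255]
    return np.clip(rgb, 0.0, 255.0).astype(np.uint8)

# A normal pointing straight up (+z) maps to a blue-dominant color.
print(normals_to_rgb(np.array([[0.0, 0.0, 1.0]])))  # [[127 127 255]]
```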

## Dataset Details

### Dataset Description

We generate our dataset using [CARLA](https://github.com/carla-simulator/carla), a simulation framework based on the Unreal Engine.
Specifically, we leverage nine of CARLA's twelve pre-built maps, excluding two reserved for the CARLA Autonomous Driving Challenges and one undecorated map with low geometric detail (i.e. without buildings, sidewalks, etc.).
These selected maps represent diverse urban and rural environments, including downtown areas, small towns, and multi-lane highways.
For each simulation, we populate the scenes with a large number of dynamic actors, such as vehicles (cars, trucks, buses, vans, motorcycles, bicycles) and pedestrians (adults, children, police officers), as well as static props (barrels, garbage cans, road barriers, etc.).
The dynamic actors exhibit realistic movement patterns governed by the underlying physics engine and adhere to real-world traffic rules, such as driving on designated roads and obeying traffic signals.

To capture realistic driving scenarios, we employ a virtual LiDAR sensor mounted atop a car operating in autopilot mode.
The LiDAR sensor is configured to emit 64 laser beams, with a 10° upper and a -30° lower field of view.
This common sensor configuration strikes a balance between sparsity and density, providing a challenging yet fair evaluation environment.
To further mimic real-world conditions, we set the maximum range to 100 meters and introduce Gaussian noise with a standard deviation of 0.02 meters to the LiDAR point cloud.
The sensor captures data at a rate of 10 Hz.
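
A rough NumPy sketch of this measurement model (range cut-off plus additive range noise); CARLA applies its own noise internally, so the names and the exact perturbation here are illustrative assumptions:

```python
import numpy as np

MAX_RANGE_M = 100.0  # maximum sensor range used for LiSu
NOISE_STD_M = 0.02   # standard deviation of the Gaussian range noise (meters)

def simulate_returns(points: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Drop points beyond the maximum range, then jitter the rest along their rays."""
    dist = np.linalg.norm(points, axis=1)
    kept = points[(dist > 0.0) & (dist <= MAX_RANGE_M)]
    dist_kept = np.linalg.norm(kept, axis=1, keepdims=True)
    noise = rng.normal(0.0, NOISE_STD_M, size=(kept.shape[0], 1))
    return kept * (1.0 + noise / dist_kept)  # scale each point along its viewing ray

pts = np.array([[5.0, 0.0, 0.0], [150.0, 0.0, 0.0]])  # second point is out of range
print(simulate_returns(pts, np.random.default_rng(0)).shape)  # (1, 3)
```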

CARLA's default LiDAR sensor implementation is limited to position and intensity channels.
To enable surface normal collection, we extend CARLA's ray tracer to query surface normals at each intersection point between a ray and a mesh object.
These surface normals are then transformed into the sensor's coordinate frame and appended to the LiDAR data.
This requires modifications to both CARLA's C++ backend and Python frontend, adding three extra channels to store the x, y, and z components of the normal vector for each LiDAR point.
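
Since normals are directions rather than positions, only the rotation part of the sensor pose is involved in that transformation. A minimal NumPy sketch (the rotation here is an arbitrary example, not an actual CARLA pose):

```python
import numpy as np

def world_normals_to_sensor(normals_w: np.ndarray, R_sensor_to_world: np.ndarray) -> np.ndarray:
    """Rotate world-frame unit normals into the sensor frame.

    Directions ignore translation; only the inverse (transpose) of the
    sensor-to-world rotation is applied.
    """
    return normals_w @ R_sensor_to_world  # row-vector form of R.T @ n

# Example: sensor yawed 90 degrees about z; a world +x normal becomes sensor -y.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
print(world_normals_to_sensor(np.array([[1.0, 0.0, 0.0]]), R))  # [[ 0. -1.  0.]]
```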

For each map, we conduct eleven randomly initialized and independent simulation runs.
A simulation is terminated early if traffic halts for a prolonged period (e.g. at a red light).
On average, each simulation lasts approximately 50 seconds, resulting in a total of 50,045 labeled frames.
To ensure rigorous evaluation, we partition our dataset into training, validation, and testing sets.
We assign each map to exactly one split, preventing data leakage (i.e. the same "city" appearing in multiple splits).
One map is designated for validation, while the remaining eight maps are divided equally between the training and testing sets.
This results in 25,053 training, 22,167 testing, and 2,825 validation frames.
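
These counts can be sanity-checked against the simulation setup described above (nine maps, eleven runs each, roughly 50 s of data at 10 Hz):

```python
train, test, val = 25053, 22167, 2825
print(train + test + val)  # 50045 frames in total

# Rough cross-check: 9 maps x 11 runs x (~50 s x 10 Hz) frames per run.
print(9 * 11 * 50 * 10)  # ~49500, close to the reported 50045
```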

### Dataset Sources

<div style="display: flex;">
<a href="https://github.com/malicd/LiSu" target="_blank">
<img src="https://img.shields.io/badge/GitHub-%23121011.svg?logo=github&logoColor=white" alt="GitHub">
</a>
<img alt="GitHub Repo stars" src="https://img.shields.io/github/stars/malicd/lisu" style="margin-right: 20px;">
<a href="https://arxiv.org/abs/2503.08601" target="_blank">
<img src="https://img.shields.io/badge/Paper-arXiv-red" alt="Paper arXiv">
</a>
</div>

### CARLA Pull Request

<div style="display: flex;">
<a href="https://github.com/carla-simulator/carla/pull/8773" target="_blank">
<img alt="GitHub pull request status" src="https://img.shields.io/github/status/s/pulls/carla-simulator/carla/8773?style=flat&label=PR">
</a>
</div>

## Citation

```bibtex
@inproceedings{cvpr2025lisu,
  title={{LiSu: A Dataset and Method for LiDAR Surface Normal Estimation}},
  author={Du\v{s}an Mali\'c and Christian Fruhwirth-Reisinger and Samuel Schulter and Horst Possegger},
  booktitle={IEEE/CVF Computer Vision and Pattern Recognition Conference (CVPR)},
  year={2025}
}
```