Commit ddcd443 · Parent(s): 73a83f5
update
This view is limited to 50 files because it contains too many changes.
- README.md +43 -35
- {test/seq → seq}/001/LICENSE.txt +0 -0
- {test/seq → seq}/001/lidar/00.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/01.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/02.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/03.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/04.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/05.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/06.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/07.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/08.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/09.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/10.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/11.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/12.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/13.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/14.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/15.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/16.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/17.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/18.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/19.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/20.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/21.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/22.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/23.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/24.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/25.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/26.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/27.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/28.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/29.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/30.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/31.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/32.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/33.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/34.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/35.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/36.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/37.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/38.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/39.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/40.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/41.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/42.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/43.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/44.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/45.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/46.pkl.gz +0 -0
- {test/seq → seq}/001/lidar/47.pkl.gz +0 -0
README.md
CHANGED
@@ -1,22 +1,6 @@
----
-license: cc-by-nc-sa-4.0
-task_categories:
-- object-detection
-- zero-shot-object-detection
-tags:
-- 3d-object-detection
-- few-shot-learning
-- zero-shot-learning
-- autonomous-driving
-- lidar
-- multimodal
-- auto-annotation
-- cvpr2026-workshop-challenge
----
-
 # CVPR 2026 Workshop Challenge: Auto-Annotation with Expert-Crafted Guidelines for 3D LiDAR Detection
 
-Welcome to the
+Welcome to the CVPR 2026 Auto-Annotation Challenge, organized under the [AutoExpert workshop](https://autoexpert-arena.github.io/).
 
 Inspired by recent advancements in foundation models and the critical bottleneck of data annotation in autonomous driving, this challenge introduces a novel paradigm: **Auto-Annotation from Expert-Crafted Guidelines**.
 
@@ -34,15 +18,18 @@ The repository is organized as follows to support the auto-annotation task. It c
 cvpr-workshop-challenge-annoexpert-public/
 ├── annotator_instructions/   # Textual guidelines for each category
 │   └── instructions.pdf      # Detailed definition & rules per class
+├── seq/                      # Shared sequence data (loaded via PandaSet Devkit)
+│   └── {seq_id}/             # Standard PandaSet sequence structure
+│       └── lidar/            # Raw LiDAR point clouds
+├── test/                     # Multimodal Testing Set (inputs only)
+│   └── images/               # Multi-view test images (200 target frames)
 ├── train/                    # Few-Shot 2D Examples (Federated Annotation)
 │   ├── images/               # Exemplar images for each category
-│   └──
-├──
-│   └──
-│
-│
-│   │   ├── lidar/            # LiDAR point clouds
-│   │   └── meta/             # Calibration and metadata
+│   └── 2D_annotations/       # 2D bounding box annotations
+├── val/                      # Validation Set
+│   ├── 2D_annotations/       # 2D bounding box annotations for validation frames
+│   ├── 3D_annotations/       # Ground truth 3D bounding boxes for local evaluation
+│   └── images/               # Multi-view validation images
 ├── .gitattributes            # Git LFS configuration
 └── README.md                 # Dataset documentation
 ```
@@ -51,20 +38,40 @@ cvpr-workshop-challenge-annoexpert-public/
 ## 2. Task Formulation & Data Formats
 
 ### 2.1 Expert Guidelines & Few-Shot Examples (Training)
+
 Participants must rely on the provided guidelines and few-shot examples to understand the 25 target categories.
 
 * **Annotator Instructions (`annotator_instructions/`)**: Contains the expert-crafted definitions and rules for annotating each class (e.g., whether to include a rider within a bicycle bounding box).
-* **2D Visual Examples (`few_shot_examples/`)**:
-  * **Naming Convention (`images/`)**: `{category_name}&{seq_id}_{camera_name}_{frames_id}.[ext]`
-  * **Federated Annotation (`labels_2d/`)**: Note that these 2D examples are annotated in a *federated way*. In a given image, **only objects belonging to the target `{category_name}` are annotated**, while objects of other classes are intentionally ignored.
-  * **Label Format (`.txt`)**: `x y w h cls` (where `x`, `y` are the left-top coordinates, `w`, `h` are width and height).
 
-
-
-
-
-* **
+* **2D Visual Examples (`train/`)**:
+
+  * **Naming Convention (`images/`)**: `{category_name}&{seq_id}_{camera_name}_{frames_id}.[ext]`
+
+  * **Federated Annotation (`2D_annotations/`)**: Note that these 2D examples are annotated in a *federated way*. In a given image, **only objects belonging to the target `{category_name}` are annotated**, while objects of other classes are intentionally ignored.
+
+  * **Label Format (`.txt`)**: `x y w h cls` (where `x`, `y` are the left-top coordinates, `w`, `h` are width and height).
+
+### 2.2 Validation Set (`val/`)
+
+To help participants validate their models before submitting to the evaluation server, a validation set is provided with full annotations.
+
+* **Images (`val/images/`)**: Multi-view images for the validation frames.
+
+* **2D Annotations (`val/2D_annotations/`)**: Comprehensive 2D bounding box annotations.
+
+* **3D Annotations (`val/3D_annotations/`)**: Ground truth 3D bounding boxes. Since the training data lacks 3D references, you can use this set to locally evaluate your model's 3D detection metrics (mAP, NDS).
+
+* **LiDAR & Calibration (`seq/`)**: **Crucially**, the corresponding raw LiDAR point clouds and sensor poses/intrinsics for these validation frames must be loaded via the **PandaSet Devkit** from the shared root `seq/{seq_id}/` directory.
+
+### 2.3 Test Sensor Data (Evaluation)
+
+The evaluation focuses on **200 specific keyframes** in the test set.
+
+* **Images (`test/images/`)**: Contains multi-view test images (inputs only).
+
+* **LiDAR & Calibration (`seq/`)**: Just like the validation set, the raw LiDAR sweeps and necessary calibration metadata for the test frames are provided within the shared root `seq/{seq_id}/` directory.
+
+* **Data Access via Devkit**: You **must** use the official **PandaSet Devkit** API to read point clouds and sensor calibration. For example, use `sequence.camera[camera_name].poses[frame_idx]` for extrinsics, `sequence.camera[camera_name].intrinsics` for intrinsics, and `sequence.lidar[frame_idx]` for point clouds.
 
 
@@ -75,6 +82,7 @@ For the evaluation server to process your predictions, you must submit your 3D d
 Participants must generate a single `submission.json` file containing all predictions for the 200 test images. The JSON file should contain a **list of dictionaries**, where each dictionary represents a single predicted 3D bounding box.
 
 **JSON Structure Example:**
+
 ```json
 [
   {
@@ -83,7 +91,7 @@ Participants must generate a single `submission.json` file containing all predic
     "frame_token": "001_front_left_camera_000029",
     "label": "Car",
     "score": 0.8,
-    "box_3d": [
+    "box_3d": [10.5, -3.2, -1.0, 4.5, 1.8, 1.5, 0.12]
   },
   {
     "seq_id": "001",
@@ -91,7 +99,7 @@ Participants must generate a single `submission.json` file containing all predic
     "frame_token": "001_front_left_camera_000029",
     "label": "Pedestrian",
     "score": 0.9,
-    "box_3d": [
+    "box_3d": [12.1, -1.5, -0.8, 0.5, 0.6, 1.7, 0.05]
   }
 ]
 ```
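The few-shot 2D label format in the diff (`x y w h cls`, with `x`, `y` the left-top corner and `w`, `h` the box size) can be read with a short helper. This is a sketch under assumptions: each line of a label `.txt` file holds five whitespace-separated fields, and `cls` is kept as an opaque string because the diff does not say whether it is a class name or an index.

```python
# Hypothetical parser for the few-shot 2D label format "x y w h cls".
# (x, y) is the top-left corner, (w, h) the box size; cls is kept as a
# string since the README excerpt does not define its type.
def load_2d_labels(text: str) -> list[dict]:
    boxes = []
    for line in text.strip().splitlines():
        x, y, w, h, cls = line.split()
        boxes.append({"xywh": [float(x), float(y), float(w), float(h)], "cls": cls})
    return boxes

# Illustrative input; real labels live in train/2D_annotations/.
labels = load_2d_labels("100 50 40 80 Car\n12 34 5 6 Pedestrian")
print(labels)
```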
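The Devkit accessors the README names can be wrapped into a single loader. A minimal sketch, assuming the `pandaset` package (the official PandaSet Devkit) is installed and the `seq/` directory has been downloaded; the root path, sequence id, camera name, and frame index in the usage comment are illustrative.

```python
# Sketch of reading one frame via the PandaSet Devkit accessors named in the
# README: sequence.lidar[i], sequence.camera[name].poses[i], and
# sequence.camera[name].intrinsics.
def load_frame(root: str, seq_id: str, camera_name: str, frame_idx: int):
    from pandaset import DataSet  # pip install pandaset-devkit

    seq = DataSet(root)[seq_id]
    seq.load_lidar().load_camera()
    cam = seq.camera[camera_name]
    # Point cloud for the frame, camera extrinsics (pose), and intrinsics.
    return seq.lidar[frame_idx], cam.poses[frame_idx], cam.intrinsics

# Example call (requires the downloaded seq/ data):
# points, pose, K = load_frame("seq", "001", "front_left_camera", 29)
```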
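The submission format above can be assembled with the standard `json` module. A minimal sketch using the two example boxes from the diff; note that the meaning of the seven `box_3d` numbers is not stated in this excerpt, so the `[x, y, z, l, w, h, yaw]` reading in the comment is an assumption to confirm against the official challenge definition.

```python
import json

# Minimal sketch of writing submission.json as a list of per-box dicts.
# box_3d ordering (assumed [x, y, z, l, w, h, yaw]) is NOT confirmed by
# this README excerpt; check the challenge's official definition.
predictions = [
    {
        "seq_id": "001",
        "frame_token": "001_front_left_camera_000029",
        "label": "Car",
        "score": 0.8,
        "box_3d": [10.5, -3.2, -1.0, 4.5, 1.8, 1.5, 0.12],
    },
    {
        "seq_id": "001",
        "frame_token": "001_front_left_camera_000029",
        "label": "Pedestrian",
        "score": 0.9,
        "box_3d": [12.1, -1.5, -0.8, 0.5, 0.6, 1.7, 0.05],
    },
]

with open("submission.json", "w") as f:
    json.dump(predictions, f, indent=2)
```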
{test/seq → seq}/001/LICENSE.txt and {test/seq → seq}/001/lidar/00.pkl.gz through 47.pkl.gz
RENAMED (files without content changes)