spanichella committed
Commit e487450 · verified · 1 Parent(s): 1621ee4

Upload 6 files

Files changed (6)
  1. LICENSE +201 -0
  2. README.md +237 -3
  3. Requierement.txt +10 -0
  4. UAV.zip +3 -0
  5. UAV_notebook.ipynb +0 -0
  6. dataset_gen.ipynb +614 -0
LICENSE ADDED
@@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.

"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:

(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.

Copyright [2026] [loubet--bonino Grégory]

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
README.md CHANGED
@@ -1,3 +1,237 @@
- ---
- license: gpl-3.0
- ---
# UAV Testing Competition

Unmanned Aerial Vehicles (UAVs) equipped with onboard cameras and various sensors have already demonstrated the feasibility of autonomous flight in real environments, generating great interest in application scenarios such as crop monitoring, surveillance, and medical and food delivery.

Over the years, support for UAV developers has grown through open-access software and hardware projects, such as the autopilot support provided by [PX4](https://github.com/PX4/PX4-Autopilot) and [Ardupilot](https://github.com/ArduPilot/ardupilot).
However, despite the need to systematically test such complex and automated systems to ensure their safe operation in real-world environments, there has been relatively limited investment in this direction so far.

The UAV Testing Competition, organized jointly by the [International Conference on Software Testing, Verification and Validation (ICST)](https://conf.researchr.org/home/icst-2026) and the [Search-Based and Fuzz Testing (SBFT) workshop](https://search-based-and-fuzz-testing.github.io/sbft26/), is an initiative designed to inspire and encourage the software testing community to direct its attention toward UAVs as a rapidly emerging and crucial domain. The joint call lets interested authors/participants reduce travel costs by selecting the closest and most convenient venue.

## Files

| File | Description |
|------|-------------|
| `UAV.db` | SQLite database — all flight data |
| `UAV_notebook.ipynb` | Exploration notebook — visualizations, metrics, ML export, and tools to add new missions |
| `Dataset_gen.ipynb` | Python pipeline — scans flight folders and populates the database |
| `requirements.txt` | Pinned library versions for reproducible environment setup |

Place all files in the same directory before running the notebook.

---

## What's in the database?

The database currently holds flights from PX4 simulations. Each flight contains two types of data:

**Dynamic data** — sensor time-series recorded during the flight, stored as ULog topics:
- `vehicle_local_position` — position in NED frame: x, y, z (meters) and velocity (m/s)
- `vehicle_attitude` — orientation as quaternion (w, x, y, z); dimensionless
- `sensor_combined` — raw IMU measurements: accelerometer (m/s²), gyroscope (rad/s)
- `vehicle_status` — flight mode and arming state

**Static data** — simulation conditions set before the flight:
- Wind parameters: mean velocity (m/s), direction (unit vector), gusts and variance (m/s)
- Obstacle positions (meters) and dimensions — height, length, width (meters) — up to N obstacles per flight
- PX4 flight controller parameters: navigation acceptance radius (meters), takeoff altitude (meters), etc.

### Database schema

```
missions
└── flights                one row per simulation flight
    ├── flight_context     static conditions (wind, obstacles, PX4 params)
    └── topics             ULog topics recorded during the flight
        ├── topic_fields   field names and data types
        └── topic_data     time-series values (long format)
```

`topic_data` uses **long format** — each row is one `(topic_id, row_index, field_name, value)` tuple.
The notebook's `get_topic()` helper pivots this into a wide DataFrame automatically.
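
As an illustration, the long-to-wide pivot can be sketched as follows. This is a minimal stand-in written against the schema described above, not the notebook's actual implementation — the real `get_topic()` may differ:

```python
import sqlite3

import pandas as pd


def get_topic_wide(con: sqlite3.Connection, iter_number: int, topic: str) -> pd.DataFrame:
    """Fetch one topic's long-format rows and pivot them to wide.

    Hypothetical helper: assumes the flights/topics/topic_data tables
    described above; the notebook's own get_topic() may differ.
    """
    long_df = pd.read_sql(
        """
        SELECT td.row_index, td.field_name, td.value
        FROM topic_data td
        JOIN topics t ON t.id = td.topic_id
        JOIN flights f ON f.id = t.flight_id
        WHERE f.iter_number = ? AND t.name = ?
        """,
        con,
        params=(iter_number, topic),
    )
    # One column per field_name, one row per timestep
    return long_df.pivot(index="row_index", columns="field_name", values="value")
```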

---

## Quickstart

### Requirements

Install dependencies using the provided `requirements.txt` to ensure version compatibility:

```bash
pip install -r requirements.txt
```

### Open the notebook

```bash
jupyter notebook UAV_notebook.ipynb
```

The notebook connects to `UAV.db` directly — no additional setup needed.

### Query the database directly

```python
import sqlite3
import pandas as pd

con = sqlite3.connect("UAV.db")

# All flights with duration
flights = pd.read_sql(
    "SELECT iter_number, duration_s FROM flights ORDER BY iter_number", con
)

# Position time-series for flight iteration #2 (long → wide)
pos_raw = pd.read_sql("""
    SELECT td.row_index, td.field_name, td.value
    FROM topic_data td
    JOIN topics t ON t.id = td.topic_id
    JOIN flights f ON f.id = t.flight_id
    WHERE f.iter_number = 2
      AND t.name = 'vehicle_local_position'
    ORDER BY td.row_index
""", con)

pos = pos_raw.pivot(index="row_index", columns="field_name", values="value")

# Simulation conditions for every flight (wide format)
context = pd.read_sql("""
    SELECT f.iter_number, c.key, c.value
    FROM flights f
    JOIN flight_context c ON c.flight_id = f.id
""", con)

context_wide = context.pivot_table(
    index="iter_number", columns="key", values="value"
).reset_index()
```
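
Building on a wide position frame like `pos` above, simple per-flight metrics follow directly. For example, maximum altitude (remember `z` is NED, so altitude is `-z`) — a small illustrative helper, not part of the notebook:

```python
import pandas as pd


def max_altitude_m(pos: pd.DataFrame) -> float:
    """Max altitude above origin (meters) from a wide
    vehicle_local_position frame. NED: z points down, so altitude = -z."""
    return float((-pos["z"]).max())
```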

---

## ML export

Use `build_ml_dataset()` (or run Section 7 of the notebook) to export a flat Parquet file ready for training.
Every row is one timestep. Static features are repeated on all rows belonging to the same flight.

| Group | Example columns | Note |
|-------|-----------------|------|
| Time | `timestamp` | Microseconds; resampled to a uniform grid |
| Position | `vehicle_local_position__x/y/z` | NED frame (meters) — use `-z` for altitude |
| Attitude | `vehicle_attitude__q[0..3]` | Quaternion, scalar-first (w, x, y, z); dimensionless |
| Wind | `wind_velocity_mean`, `wind_dir_x/y/z` | Static per flight; velocity in m/s, direction as unit vector |
| Obstacles | `obs0_x/y/z/h/l/w`, `obs1_...` | Static per flight; position and dimensions in meters |
| PX4 params | `px4_NAV_ACC_RAD`, `px4_MIS_TAKEOFF_ALT` | Static per flight; distances in meters |
| Flight ID | `iter_number`, `iter_name` | Use to group or split by flight |
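
Because timesteps within one flight are strongly correlated, train/test splits over this export should be made per flight, not per row. One way to do this with scikit-learn (already pinned in `requirements.txt`); the function name here is ours, not part of the notebook:

```python
import pandas as pd
from sklearn.model_selection import GroupShuffleSplit


def split_by_flight(df: pd.DataFrame, test_size: float = 0.2, seed: int = 0):
    """Split a timestep-level dataset so that each flight (iter_number)
    lands entirely in either train or test, avoiding leakage between
    correlated timesteps of the same flight."""
    splitter = GroupShuffleSplit(n_splits=1, test_size=test_size, random_state=seed)
    train_idx, test_idx = next(splitter.split(df, groups=df["iter_number"]))
    return df.iloc[train_idx], df.iloc[test_idx]
```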

---

## Notebook

The notebook gathers everything useful for exploring the data: summary statistics, example queries, an overview of how the database is structured, and the code used to build it.

---

## Adding new flights

Section 8 of the notebook walks through the full import process. In short:

1. Organize your flights under a root folder (one subfolder per iteration, each containing a `.ulg` file and optionally a `.yaml` config)
2. Set `NEW_ROOT_DIR` in the notebook
3. Run the discovery cell to preview what will be imported
4. Run the pipeline cell to write to `UAV.db`

You can also call the pipeline directly from Python:

```python
from Dataset_gen import run_pipeline

run_pipeline(
    root_dir="./my_flights",
    db_path="UAV.db",
    ml_topics=["vehicle_local_position", "vehicle_attitude"],
    resample_us=100_000,  # 10 Hz
    downsample_factor=10,
    skip_existing=True,
)
```

---

## Notes

- **NED frame**: `z` points downward — use `-z` to get altitude above ground (meters)
- **Quaternion convention**: `q[0]` = scalar part (w), `q[1..3]` = vector part (x, y, z)
- `topic_data` stores values in long format; use `pivot()` or the `get_topic()` helper to work with it
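
To make the quaternion convention concrete, here is the standard formula for extracting yaw (heading) from a scalar-first quaternion — a generic conversion, not code taken from the notebook:

```python
import math


def yaw_from_quaternion(w: float, x: float, y: float, z: float) -> float:
    """Yaw (rotation about the NED down-axis) in radians from a
    scalar-first quaternion (w, x, y, z)."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
```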

## Full dataset

The full dataset is available on Zenodo:
https://zenodo.org/records/18727376

## References

If you use this tool in your research, please cite the following papers:

- **Sajad Khatiri**, Sebastiano Panichella, and Paolo Tonella, "Simulation-based Testing of Unmanned Aerial Vehicles with Aerialist," *In 2024 International Conference on Software Engineering (ICSE)*. [Link](https://dl.acm.org/doi/10.1145/3639478.3640031).

````{code-block} bibtex
@inproceedings{icse2024Aerialist,
  title={Simulation-based Testing of Unmanned Aerial Vehicles with Aerialist},
  author={Khatiri, Sajad and Panichella, Sebastiano and Tonella, Paolo},
  booktitle={International Conference on Software Engineering (ICSE)},
  year={2024},
}
````

- **SBFT Tool competition report**

````{code-block} bibtex
@inproceedings{SBFT-UAV2026,
  author = {Ramazan Erdem Uysal and Ali Javadi and Prakash Aryan and Aren Babikian and Dmytro Humeniuk and Sajad Mazraehkhatiri and Sebastiano Panichella},
  title = {{SBFT} Tool Competition 2026 – UAV Testing Track},
  booktitle = {International Workshop on Search-Based and Fuzz Testing, SBFT@ICSE 2026},
  year = {2026}
}
````

- **ICST Tool competition report**

````{code-block} bibtex
@inproceedings{ICST-UAV2026,
  author = {Ramazan Erdem Uysal and Ali Javadi and Prakash Aryan and Aren Babikian and Dmytro Humeniuk and Sajad Mazraehkhatiri and Sebastiano Panichella},
  title = {{ICST} Tool Competition 2026 – UAV Testing Track},
  booktitle = {International Conference on Software Testing, Verification and Validation (ICST)},
  year = {2026}
}
````

- **Sajad Khatiri**, Sebastiano Panichella, and Paolo Tonella, "Simulation-based Test Case Generation for Unmanned Aerial Vehicles in the Neighborhood of Real Flights," *In 2023 IEEE 16th International Conference on Software Testing, Verification and Validation (ICST)*. [Link](https://ieeexplore.ieee.org/document/10132225).

````{code-block} bibtex
@inproceedings{khatiri2023simulation,
  title={Simulation-based test case generation for unmanned aerial vehicles in the neighborhood of real flights},
  author={Khatiri, Sajad and Panichella, Sebastiano and Tonella, Paolo},
  booktitle={2023 16th IEEE International Conference on Software Testing, Verification and Validation (ICST)},
  year={2023},
}
````

## License

The software we developed is distributed under the Apache 2.0 license. See the [LICENSE](./LICENSE) file.

## Contacts

Please refer to the [FAQ page](https://github.com/skhatiri/UAV-Testing-Competition/wiki/Home) in the Wiki.

You may also refer to (and contribute to) the [Discussions Page](https://github.com/skhatiri/UAV-Testing-Competition/discussions), where you may find user-submitted questions and corresponding answers.

You can also contact us directly by email:

- Ramazan Erdem Uysal, University of Bern, ramazan.uysal@unibe.ch
- Ali Javadi, University of Bern, ali.javadi@unibe.ch
- Prakash Aryan, University of Bern, prakash.aryan@unibe.ch
- Loubet--Bonino Grégory, University of Bern, gregory.loubet-bonino@unibe.ch
- Aren Babikian, University of Toronto, aren.babikian@utoronto.ca
- Dmytro Humeniuk, Polytechnique Montréal, dmytro.humeniuk@polymtl.ca
- Sajad Mazraehkhatiri, University of Bern, sajad.mazraehkhatiri@unibe.ch
- Sebastiano Panichella, University of Bern, sebastiano.panichella@unibe.ch
Requierement.txt ADDED
@@ -0,0 +1,10 @@
pandas==2.2.2
numpy==1.26.4
matplotlib==3.9.0
seaborn==0.13.2
scikit-learn==1.5.0
pyarrow==16.1.0
pyulog==0.9.0
pyyaml==6.0.1
tqdm==4.66.4
jupyter==1.0.0
UAV.zip ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:31b5c49f16e65233361eb284b0989ed25a1f22cde72f39f926f5327dad23d2da
size 135
UAV_notebook.ipynb ADDED
The diff for this file is too large to render. See raw diff
 
dataset_gen.ipynb ADDED
@@ -0,0 +1,614 @@
{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "211b4872-8a32-461f-b66e-df285aae26f9",
   "metadata": {},
   "outputs": [],
   "source": [
    "import sqlite3\n",
    "import yaml\n",
    "import numpy as np\n",
    "import pandas as pd\n",
    "from pyulog import ULog\n",
    "from pathlib import Path\n",
    "from tqdm import tqdm\n",
    "\n",
    "\n",
    "\n",
    "def flatten_yaml(yaml_path: Path) -> dict:\n",
    "    \"\"\"Flatten YAML config into scalar key-value pairs.\"\"\"\n",
    "    with open(yaml_path, \"r\") as f:\n",
    "        cfg = yaml.safe_load(f)\n",
    "\n",
    "    flat = {}\n",
    "\n",
    "    # PX4 params\n",
    "    params = cfg.get(\"robot\", {}).get(\"params\", {})\n",
    "    if hasattr(params, '__dict__'):\n",
    "        params = params.__dict__\n",
    "    for k, v in params.items():\n",
    "        flat[f\"px4_{k}\"] = float(v) if v is not None else float(\"nan\")\n",
    "\n",
    "    # Mission command\n",
    "    commands = cfg.get(\"mission\", {}).get(\"commands\", [])\n",
    "    if commands:\n",
    "        cmd = commands[0]\n",
    "        flat[\"cmd_mode\"] = float(cmd.get(\"mode\", float(\"nan\")))\n",
    "        flat[\"cmd_x\"] = float(cmd.get(\"x\", float(\"nan\")))\n",
    "        flat[\"cmd_y\"] = float(cmd.get(\"y\", float(\"nan\")))\n",
    "        flat[\"cmd_z\"] = float(cmd.get(\"z\", float(\"nan\")))\n",
    "        flat[\"cmd_r\"] = float(cmd.get(\"r\", float(\"nan\")))\n",
    "\n",
    "    # Wind\n",
    "    wind = cfg.get(\"simulation\", {}).get(\"wind\", {})\n",
    "    flat[\"wind_velocity_mean\"] = float(wind.get(\"velocity_mean\", float(\"nan\")))\n",
    "    flat[\"wind_velocity_max\"] = float(wind.get(\"velocity_max\", float(\"nan\")))\n",
    "    flat[\"wind_velocity_variance\"] = float(wind.get(\"velocity_variance\", float(\"nan\")))\n",
    "    flat[\"wind_direction_variance\"] = float(wind.get(\"direction_variance\", float(\"nan\")))\n",
    "    flat[\"wind_gust_velocity_mean\"] = float(wind.get(\"gust_velocity_mean\", float(\"nan\")))\n",
    "    flat[\"wind_gust_velocity_max\"] = float(wind.get(\"gust_velocity_max\", float(\"nan\")))\n",
    "    flat[\"wind_gust_duration\"] = float(wind.get(\"gust_duration\", float(\"nan\")))\n",
    "    flat[\"wind_gust_start\"] = float(wind.get(\"gust_start\", float(\"nan\")))\n",
    "\n",
    "    direction = wind.get(\"direction\", [float(\"nan\")] * 3)\n",
    "    flat[\"wind_dir_x\"] = float(direction[0]) if len(direction) > 0 else float(\"nan\")\n",
    "    flat[\"wind_dir_y\"] = float(direction[1]) if len(direction) > 1 else float(\"nan\")\n",
    "    flat[\"wind_dir_z\"] = float(direction[2]) if len(direction) > 2 else float(\"nan\")\n",
    "\n",
    "    gust_dir = wind.get(\"gust_direction\", [float(\"nan\")] * 3)\n",
    "    flat[\"wind_gust_dir_x\"] = float(gust_dir[0]) if len(gust_dir) > 0 else float(\"nan\")\n",
    "    flat[\"wind_gust_dir_y\"] = float(gust_dir[1]) if len(gust_dir) > 1 else float(\"nan\")\n",
    "    flat[\"wind_gust_dir_z\"] = float(gust_dir[2]) if len(gust_dir) > 2 else float(\"nan\")\n",
    "\n",
    "    # Obstacles\n",
    "    obstacles = cfg.get(\"simulation\", {}).get(\"obstacles\", [])\n",
    "    flat[\"obs_count\"] = float(len(obstacles))\n",
    "    shape_map = {\"box\": 0, \"cylinder\": 1, \"sphere\": 2}\n",
    "\n",
    "    for i, obs in enumerate(obstacles):\n",
    "        prefix = f\"obs{i}_\"\n",
    "        pos = obs.get(\"position\", {})\n",
    "        size = obs.get(\"size\", {})\n",
    "        flat[f\"{prefix}x\"] = float(pos.get(\"x\", float(\"nan\")))\n",
    "        flat[f\"{prefix}y\"] = float(pos.get(\"y\", float(\"nan\")))\n",
    "        flat[f\"{prefix}z\"] = float(pos.get(\"z\", float(\"nan\")))\n",
    "        flat[f\"{prefix}r\"] = float(pos.get(\"r\", float(\"nan\")))\n",
    "        flat[f\"{prefix}h\"] = float(size.get(\"h\", float(\"nan\")))\n",
    "        flat[f\"{prefix}l\"] = float(size.get(\"l\", float(\"nan\")))\n",
    "        flat[f\"{prefix}w\"] = float(size.get(\"w\", float(\"nan\")))\n",
    "        flat[f\"{prefix}shape\"] = float(shape_map.get(obs.get(\"shape\", \"\"), float(\"nan\")))\n",
    "\n",
    "    return flat\n",
    "\n",
    "\n",
    "def discover_all_missions(root_dir: str) -> dict[str, list[dict]]:\n",
    "    \"\"\"\n",
    "    Scan root folder and auto-discover all missions.\n",
    "\n",
    "    Returns:\n",
    "        {\n",
    "            \"mission2\": [iterations],\n",
    "            \"mission3\": [iterations],\n",
    "            ...\n",
    "        }\n",
    "    \"\"\"\n",
    "    root = Path(root_dir)\n",
    "    if not root.exists():\n",
    "        raise FileNotFoundError(f\"Root directory not found: {root_dir}\")\n",
    "\n",
    "    missions = {}\n",
    "\n",
    "    for mission_folder in sorted(root.iterdir()):\n",
    "        if not mission_folder.is_dir():\n",
    "            continue\n",
    "\n",
    "        mission_name = mission_folder.name\n",
    "        iterations = []\n",
    "\n",
    "        for iter_folder in sorted(mission_folder.iterdir()):\n",
    "            if not iter_folder.is_dir():\n",
    "                continue\n",
    "\n",
    "            iter_number = -1\n",
    "            if iter_folder.name.startswith(\"iter\"):\n",
    "                try:\n",
    "                    iter_number = int(iter_folder.name[4:7])\n",
    "                except (ValueError, IndexError):\n",
    "                    pass\n",
    "            elif iter_folder.name.startswith(\"test-\"):\n",
    "                try:\n",
    "                    iter_number = int(iter_folder.name.split(\"-\")[1])\n",
    "                except (ValueError, IndexError):\n",
    "                    pass\n",
    "\n",
    "            ulg_files = list(iter_folder.glob(\"*.ulg\"))\n",
    "            yaml_files = list(iter_folder.glob(\"*.yaml\"))\n",
    "\n",
    "            iterations.append({\n",
    "                \"iter_name\": iter_folder.name,\n",
    "                \"iter_number\": iter_number,\n",
    "                \"iter_dir\": iter_folder,\n",
    "                \"ulg_files\": ulg_files,\n",
    "                \"yaml_path\": yaml_files[0] if yaml_files else None,\n",
    "            })\n",
    "\n",
    "        if iterations:\n",
    "            missions[mission_name] = iterations\n",
    "\n",
    "    return missions\n",
    "\n",
    "\n",
    "class ULogToDB:\n",
    "\n",
    "    def __init__(self, db_path: str = \"ulg_database.db\"):\n",
    "        self.db_path = db_path\n",
    "        self._migrate_and_init()\n",
    "\n",
    "    def _get_existing_tables(self, conn) -> set:\n",
    "        return {r[0] for r in conn.execute(\n",
    "            \"SELECT name FROM sqlite_master WHERE type='table'\"\n",
    "        ).fetchall()}\n",
    "\n",
    "    def _get_existing_columns(self, conn, table: str) -> set:\n",
    "        return {r[1] for r in conn.execute(f\"PRAGMA table_info({table})\").fetchall()}\n",
    "\n",
    "    def _migrate_and_init(self):\n",
    "        with sqlite3.connect(self.db_path) as conn:\n",
    "            tables = self._get_existing_tables(conn)\n",
    "\n",
    "            if \"missions\" not in tables:\n",
    "                conn.execute(\"\"\"\n",
    "                    CREATE TABLE missions (\n",
    "                        id INTEGER PRIMARY KEY AUTOINCREMENT,\n",
    "                        name TEXT UNIQUE,\n",
    "                        root_dir TEXT\n",
    "                    )\"\"\")\n",
    "                if \"flights\" in tables:\n",
    "                    conn.execute(\n",
    "                        \"INSERT OR IGNORE INTO missions (id, name, root_dir) VALUES (1, 'legacy', '')\"\n",
    "                    )\n",
    "\n",
    "            if \"flights\" not in tables:\n",
    "                conn.execute(\"\"\"\n",
    "                    CREATE TABLE flights (\n",
    "                        id INTEGER PRIMARY KEY AUTOINCREMENT,\n",
    "                        mission_id INTEGER,\n",
    "                        iter_name TEXT,\n",
    "                        iter_number INTEGER,\n",
    "                        filename TEXT,\n",
    "                        duration_s REAL,\n",
    "                        start_timestamp_us INTEGER,\n",
    "                        FOREIGN KEY(mission_id) REFERENCES missions(id)\n",
    "                    )\"\"\")\n",
    "            else:\n",
    "                cols = self._get_existing_columns(conn, \"flights\")\n",
    "                if \"mission_id\" not in cols:\n",
    "                    conn.execute(\"ALTER TABLE flights ADD COLUMN mission_id INTEGER DEFAULT 1\")\n",
    "                if \"iter_name\" not in cols:\n",
    "                    conn.execute(\"ALTER TABLE flights ADD COLUMN iter_name TEXT\")\n",
    "                if \"iter_number\" not in cols:\n",
    "                    conn.execute(\"ALTER TABLE flights ADD COLUMN iter_number INTEGER DEFAULT -1\")\n",
    "\n",
    "            if \"flight_context\" not in tables:\n",
    "                conn.execute(\"\"\"\n",
    "                    CREATE TABLE flight_context (\n",
    "                        id INTEGER PRIMARY KEY AUTOINCREMENT,\n",
    "                        flight_id INTEGER,\n",
    "                        key TEXT,\n",
    "                        value REAL,\n",
    "                        FOREIGN KEY(flight_id) REFERENCES flights(id)\n",
    "                    )\"\"\")\n",
    "\n",
    "            conn.executescript(\"\"\"\n",
    "                CREATE TABLE IF NOT EXISTS topics (\n",
206
+ " id INTEGER PRIMARY KEY AUTOINCREMENT,\n",
207
+ " flight_id INTEGER,\n",
208
+ " name TEXT,\n",
209
+ " multi_id INTEGER,\n",
210
+ " message_count INTEGER,\n",
211
+ " FOREIGN KEY(flight_id) REFERENCES flights(id)\n",
212
+ " );\n",
213
+ "\n",
214
+ " CREATE TABLE IF NOT EXISTS topic_fields (\n",
215
+ " id INTEGER PRIMARY KEY AUTOINCREMENT,\n",
216
+ " topic_id INTEGER,\n",
217
+ " field_name TEXT,\n",
218
+ " dtype TEXT,\n",
219
+ " FOREIGN KEY(topic_id) REFERENCES topics(id)\n",
220
+ " );\n",
221
+ "\n",
222
+ " CREATE TABLE IF NOT EXISTS topic_data (\n",
223
+ " id INTEGER PRIMARY KEY AUTOINCREMENT,\n",
224
+ " topic_id INTEGER,\n",
225
+ " row_index INTEGER,\n",
226
+ " field_name TEXT,\n",
227
+ " value REAL,\n",
228
+ " FOREIGN KEY(topic_id) REFERENCES topics(id)\n",
229
+ " );\n",
230
+ "\n",
231
+ " CREATE INDEX IF NOT EXISTS idx_topic_data_topic_id ON topic_data(topic_id);\n",
232
+ " CREATE INDEX IF NOT EXISTS idx_flights_mission ON flights(mission_id);\n",
233
+ " CREATE INDEX IF NOT EXISTS idx_topics_flight ON topics(flight_id);\n",
234
+ " CREATE INDEX IF NOT EXISTS idx_ctx_flight ON flight_context(flight_id);\n",
235
+ " \"\"\")\n",
236
+ "\n",
+ "    def get_or_create_mission(self, name: str, root_dir: str) -> int:\n",
+ "        with sqlite3.connect(self.db_path) as conn:\n",
+ "            cur = conn.cursor()\n",
+ "            cur.execute(\"SELECT id FROM missions WHERE name = ?\", (name,))\n",
+ "            row = cur.fetchone()\n",
+ "            if row:\n",
+ "                return row[0]\n",
+ "            cur.execute(\"INSERT INTO missions (name, root_dir) VALUES (?, ?)\", (name, str(root_dir)))\n",
+ "            return cur.lastrowid\n",
+ "\n",
+ "    def is_already_imported(self, filename: str, iter_name: str, mission_id: int) -> bool:\n",
+ "        with sqlite3.connect(self.db_path) as conn:\n",
+ "            row = conn.execute(\n",
+ "                \"SELECT id FROM flights WHERE filename = ? AND iter_name = ? AND mission_id = ?\",\n",
+ "                (filename, iter_name, mission_id),\n",
+ "            ).fetchone()\n",
+ "            return row is not None\n",
+ "\n",
+ "    def insert_flight(self, ulog: ULog, ulg_path: Path, mission_id: int,\n",
+ "                      iter_name: str, iter_number: int) -> int:\n",
+ "        duration = (ulog.last_timestamp - ulog.start_timestamp) / 1e6\n",
+ "        with sqlite3.connect(self.db_path) as conn:\n",
+ "            cur = conn.cursor()\n",
+ "            cur.execute(\n",
+ "                \"\"\"INSERT INTO flights\n",
+ "                   (mission_id, iter_name, iter_number, filename, duration_s, start_timestamp_us)\n",
+ "                   VALUES (?, ?, ?, ?, ?, ?)\"\"\",\n",
+ "                (mission_id, iter_name, iter_number, ulg_path.name, duration, ulog.start_timestamp),\n",
+ "            )\n",
+ "            return cur.lastrowid\n",
+ "\n",
+ "    def insert_yaml_context(self, flight_id: int, flat: dict):\n",
+ "        with sqlite3.connect(self.db_path) as conn:\n",
+ "            conn.executemany(\n",
+ "                \"INSERT INTO flight_context (flight_id, key, value) VALUES (?, ?, ?)\",\n",
+ "                [(flight_id, k, v) for k, v in flat.items()],\n",
+ "            )\n",
+ "\n",
+ "    def insert_topics_and_data(self, ulog: ULog, flight_id: int, topics_filter: list[str] | None = None):\n",
+ "        \"\"\"\n",
+ "        Insert topics + data with optional filtering.\n",
+ "\n",
+ "        Args:\n",
+ "            topics_filter: If provided, only store these topics to reduce DB size\n",
+ "        \"\"\"\n",
+ "        with sqlite3.connect(self.db_path) as conn:\n",
+ "            cur = conn.cursor()\n",
+ "            for dataset in ulog.data_list:\n",
+ "                if topics_filter and dataset.name not in topics_filter:\n",
+ "                    continue  # Skip this topic entirely\n",
+ "\n",
+ "                cur.execute(\n",
+ "                    \"INSERT INTO topics (flight_id, name, multi_id, message_count) VALUES (?, ?, ?, ?)\",\n",
+ "                    (flight_id, dataset.name, dataset.multi_id, len(dataset.data[\"timestamp\"])),\n",
+ "                )\n",
+ "                topic_id = cur.lastrowid\n",
+ "\n",
+ "                for field_name, values in dataset.data.items():\n",
+ "                    cur.execute(\n",
+ "                        \"INSERT INTO topic_fields (topic_id, field_name, dtype) VALUES (?, ?, ?)\",\n",
+ "                        (topic_id, field_name, str(values.dtype)),\n",
+ "                    )\n",
+ "                    cur.executemany(\n",
+ "                        \"INSERT INTO topic_data (topic_id, row_index, field_name, value) VALUES (?, ?, ?, ?)\",\n",
+ "                        [(topic_id, i, field_name, float(v)) for i, v in enumerate(values)],\n",
+ "                    )\n",
+ "\n",
+ "    def process_single(self, ulg_path: Path, yaml_flat: dict,\n",
+ "                       mission_id: int, iter_name: str, iter_number: int,\n",
+ "                       topics_filter: list[str] | None = None) -> int:\n",
+ "        ulog = ULog(str(ulg_path))\n",
+ "        flight_id = self.insert_flight(ulog, ulg_path, mission_id, iter_name, iter_number)\n",
+ "        if yaml_flat:\n",
+ "            self.insert_yaml_context(flight_id, yaml_flat)\n",
+ "        self.insert_topics_and_data(ulog, flight_id, topics_filter)\n",
+ "        return flight_id\n",
+ "\n",
+ "\n",
+ "class ULogExporter:\n",
+ "\n",
+ "    def __init__(self, db_path: str = \"ulg_database.db\"):\n",
+ "        self.db_path = db_path\n",
+ "\n",
+ "    def _get_yaml_context(self, flight_id: int) -> dict:\n",
+ "        with sqlite3.connect(self.db_path) as conn:\n",
+ "            rows = conn.execute(\n",
+ "                \"SELECT key, value FROM flight_context WHERE flight_id = ?\", (flight_id,)\n",
+ "            ).fetchall()\n",
+ "            return {k: v for k, v in rows}\n",
+ "\n",
+ "    def topic_to_dataframe(self, topic_id: int, downsample_factor: int = 1) -> pd.DataFrame:\n",
+ "        \"\"\"\n",
+ "        Load topic data with optional downsampling.\n",
+ "\n",
+ "        Args:\n",
+ "            downsample_factor: Keep every Nth sample (1 = no downsampling, 10 = 10x reduction)\n",
+ "        \"\"\"\n",
+ "        with sqlite3.connect(self.db_path) as conn:\n",
+ "            if downsample_factor == 1:\n",
+ "                df_raw = pd.read_sql_query(\n",
+ "                    \"SELECT row_index, field_name, value FROM topic_data WHERE topic_id = ? ORDER BY row_index\",\n",
+ "                    conn, params=(topic_id,),\n",
+ "                )\n",
+ "            else:\n",
+ "                # Downsample at query time\n",
+ "                df_raw = pd.read_sql_query(\n",
+ "                    \"\"\"SELECT row_index, field_name, value\n",
+ "                       FROM topic_data\n",
+ "                       WHERE topic_id = ? AND row_index % ? = 0\n",
+ "                       ORDER BY row_index\"\"\",\n",
+ "                    conn, params=(topic_id, downsample_factor),\n",
+ "                )\n",
+ "\n",
+ "        if df_raw.empty:\n",
+ "            return pd.DataFrame()\n",
+ "        df = df_raw.pivot(index=\"row_index\", columns=\"field_name\", values=\"value\")\n",
+ "        df.columns.name = None\n",
+ "        return df.reset_index(drop=True)\n",
+ "\n",
+ "    def build_ml_dataset(\n",
+ "        self,\n",
+ "        mission_name: str,\n",
+ "        topics: list[str],\n",
+ "        output_path: str = \"ml_dataset.parquet\",\n",
+ "        resample_us: int = 100_000,  # ← 100ms = 10Hz instead of 10ms\n",
+ "        downsample_factor: int = 10,  # ← Skip 9 out of 10 samples\n",
+ "    ) -> pd.DataFrame:\n",
+ "        \"\"\"\n",
+ "        Build optimized ML dataset with aggressive size reduction.\n",
+ "\n",
+ "        Args:\n",
+ "            downsample_factor: Subsample raw data (10 = keep 1 in 10 samples)\n",
+ "            resample_us: Target resampling period in microseconds (100000 = 10Hz)\n",
+ "        \"\"\"\n",
+ "        with sqlite3.connect(self.db_path) as conn:\n",
+ "            mission_row = conn.execute(\"SELECT id FROM missions WHERE name = ?\", (mission_name,)).fetchone()\n",
+ "            if not mission_row:\n",
+ "                raise ValueError(f\"Unknown mission: {mission_name}\")\n",
+ "            mission_id = mission_row[0]\n",
+ "\n",
+ "            flights = conn.execute(\n",
+ "                \"SELECT id, iter_name, iter_number, filename FROM flights WHERE mission_id = ? ORDER BY iter_number, filename\",\n",
+ "                (mission_id,),\n",
+ "            ).fetchall()\n",
+ "\n",
+ "        all_dfs = []\n",
+ "        print(f\"\\n🔧 Building optimized ML dataset for {mission_name}\")\n",
+ "        print(f\"   Downsample factor : {downsample_factor}x\")\n",
+ "        print(f\"   Resample period   : {resample_us/1000:.0f}ms ({1e6/resample_us:.0f}Hz)\")\n",
+ "        print(f\"   Processing {len(flights)} flights...\")\n",
+ "\n",
+ "        for flight_id, iter_name, iter_number, filename in tqdm(flights, desc=\"Merging\"):\n",
+ "            with sqlite3.connect(self.db_path) as conn:\n",
+ "                topic_rows = conn.execute(\n",
+ "                    \"SELECT id, name FROM topics WHERE flight_id = ?\", (flight_id,)\n",
+ "                ).fetchall()\n",
+ "            topic_map = {name: tid for tid, name in topic_rows}\n",
+ "\n",
+ "            dfs = {}\n",
+ "            for name in topics:\n",
+ "                if name not in topic_map:\n",
+ "                    continue\n",
+ "                df = self.topic_to_dataframe(topic_map[name], downsample_factor)\n",
+ "                if \"timestamp\" not in df.columns or df.empty:\n",
+ "                    continue\n",
+ "                df = df.rename(columns={c: f\"{name}__{c}\" if c != \"timestamp\" else c for c in df.columns})\n",
+ "                dfs[name] = df.set_index(\"timestamp\")\n",
+ "\n",
+ "            if not dfs:\n",
+ "                continue\n",
+ "\n",
+ "            merged = pd.concat(dfs.values(), axis=1, join=\"outer\").sort_index()\n",
+ "\n",
+ "            # Resample to uniform grid\n",
+ "            t_min, t_max = merged.index.min(), merged.index.max()\n",
+ "            grid = np.arange(t_min, t_max, resample_us)\n",
+ "\n",
+ "            if len(grid) > 0:\n",
+ "                merged = (\n",
+ "                    merged.reindex(merged.index.union(grid))\n",
+ "                    .interpolate(\"index\")\n",
+ "                    .loc[grid]\n",
+ "                    .reset_index()\n",
+ "                    .rename(columns={\"index\": \"timestamp\"})\n",
+ "                )\n",
+ "            else:\n",
+ "                merged = merged.reset_index().rename(columns={\"index\": \"timestamp\"})\n",
+ "\n",
+ "            # Static YAML features\n",
+ "            for key, value in self._get_yaml_context(flight_id).items():\n",
+ "                merged[key] = value\n",
+ "\n",
+ "            merged[\"iter_number\"] = iter_number\n",
+ "            merged[\"iter_name\"] = iter_name\n",
+ "            merged[\"filename\"] = filename\n",
+ "            all_dfs.append(merged)\n",
+ "\n",
+ "        if not all_dfs:\n",
+ "            print(\"⚠️ No data found.\")\n",
+ "            return pd.DataFrame()\n",
+ "\n",
+ "        final = pd.concat(all_dfs, ignore_index=True)\n",
+ "        out = Path(output_path)\n",
+ "        out.parent.mkdir(parents=True, exist_ok=True)\n",
+ "        final.to_parquet(out, index=False, compression=\"gzip\")  # Add gzip compression\n",
+ "\n",
+ "        size_mb = out.stat().st_size / (1024**2)\n",
+ "\n",
+ "        print(f\"\\n✅ ML dataset: {out}\")\n",
+ "        print(f\"   File size  : {size_mb:.1f} MB\")\n",
+ "        print(f\"   Shape      : {final.shape}\")\n",
+ "        print(f\"   Iterations : {final['iter_name'].nunique()}\")\n",
+ "        print(f\"   Flights    : {final['filename'].nunique()}\")\n",
+ "        return final\n",
+ "\n",
+ "\n",
+ "# ======================================================\n",
+ "# PIPELINE\n",
+ "# ======================================================\n",
+ "\n",
+ "def run_pipeline(\n",
+ "    root_dir: str,\n",
+ "    db_path: str = \"test.db\",\n",
+ "    ml_topics: list[str] | None = None,\n",
+ "    resample_us: int = 100_000,\n",
+ "    downsample_factor: int = 10,\n",
+ "    skip_existing: bool = True,\n",
+ "    export_per_mission: bool = True,\n",
+ "):\n",
+ "    \"\"\"\n",
+ "    Optimized pipeline with aggressive size reduction.\n",
+ "\n",
+ "    Size reduction strategies:\n",
+ "    1. Only store ml_topics (not all topics)\n",
+ "    2. Downsample raw data by downsample_factor\n",
+ "    3. Resample to lower frequency (resample_us)\n",
+ "    4. Compress Parquet with gzip\n",
+ "    \"\"\"\n",
+ "    print(f\"\\n{'='*60}\")\n",
+ "    print(\"SURREALIST MULTI-MISSION PIPELINE (OPTIMIZED)\")\n",
+ "    print(f\"{'='*60}\")\n",
+ "    print(f\"Root directory    : {root_dir}\")\n",
+ "    print(f\"Database          : {db_path}\")\n",
+ "    print(f\"Topics to store   : {ml_topics}\")\n",
+ "    print(f\"Downsample factor : {downsample_factor}x\")\n",
+ "    print(f\"Target frequency  : {1e6/resample_us:.0f}Hz\\n\")\n",
+ "\n",
+ "    missions_data = discover_all_missions(root_dir)\n",
+ "\n",
+ "    total_iters = sum(len(iters) for iters in missions_data.values())\n",
+ "    total_ulg = sum(len(it[\"ulg_files\"]) for iters in missions_data.values() for it in iters)\n",
+ "\n",
+ "    print(\"=== DISCOVERY ===\")\n",
+ "    print(f\"  Missions found   : {len(missions_data)}\")\n",
+ "    print(f\"  Total iterations : {total_iters}\")\n",
+ "    print(f\"  Total .ulg files : {total_ulg}\")\n",
+ "    for mission_name, iters in missions_data.items():\n",
+ "        n_ulg = sum(len(it[\"ulg_files\"]) for it in iters)\n",
+ "        print(f\"    {mission_name:<15} : {len(iters)} iters, {n_ulg} .ulg\")\n",
+ "\n",
+ "    print(f\"\\n=== IMPORT → {db_path} ===\")\n",
+ "    db = ULogToDB(db_path=db_path)\n",
+ "\n",
+ "    imported = skipped = failed = 0\n",
+ "\n",
+ "    for mission_name, iterations in missions_data.items():\n",
+ "        mission_id = db.get_or_create_mission(mission_name, root_dir)\n",
+ "        print(f\"\\n  [{mission_name}]\")\n",
+ "\n",
+ "        for it in tqdm(iterations, desc=f\"  {mission_name}\", leave=False):\n",
+ "            yaml_flat = {}\n",
+ "            if it[\"yaml_path\"] and it[\"yaml_path\"].exists():\n",
+ "                try:\n",
+ "                    yaml_flat = flatten_yaml(it[\"yaml_path\"])\n",
+ "                except Exception as e:\n",
+ "                    print(f\"  ⚠️ YAML parse error in {it['iter_name']}: {e}\")\n",
+ "\n",
+ "            for ulg_path in it[\"ulg_files\"]:\n",
+ "                if skip_existing and db.is_already_imported(ulg_path.name, it[\"iter_name\"], mission_id):\n",
+ "                    skipped += 1\n",
+ "                    continue\n",
+ "                try:\n",
+ "                    db.process_single(\n",
+ "                        ulg_path, yaml_flat, mission_id, it[\"iter_name\"], it[\"iter_number\"],\n",
+ "                        topics_filter=ml_topics  # Only store ML topics\n",
+ "                    )\n",
+ "                    imported += 1\n",
+ "                except Exception as e:\n",
+ "                    print(f\"  ❌ {ulg_path.name}: {e}\")\n",
+ "                    failed += 1\n",
+ "\n",
+ "    print(f\"\\n  Imported : {imported}\")\n",
+ "    print(f\"  Skipped  : {skipped}\")\n",
+ "    print(f\"  Failed   : {failed}\")\n",
+ "\n",
+ "    if ml_topics:\n",
+ "        print(\"\\n=== ML EXPORT (OPTIMIZED) ===\")\n",
+ "        exporter = ULogExporter(db_path=db_path)\n",
+ "\n",
+ "        if export_per_mission:\n",
+ "            for mission_name in missions_data.keys():\n",
+ "                exporter.build_ml_dataset(\n",
+ "                    mission_name=mission_name,\n",
+ "                    topics=ml_topics,\n",
+ "                    output_path=f\"exports/{mission_name}/ml_dataset.parquet\",\n",
+ "                    resample_us=resample_us,\n",
+ "                    downsample_factor=downsample_factor,\n",
+ "                )\n",
+ "\n",
+ "    print(f\"\\n{'='*60}\")\n",
+ "    print(\"PIPELINE COMPLETE\")\n",
+ "    print(f\"{'='*60}\\n\")\n",
+ "\n",
+ "\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "b41f2ee7-a073-4ca3-8ed2-1eac49b92150",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "\n",
+ "if __name__ == \"__main__\":\n",
+ "\n",
+ "    run_pipeline(\n",
+ "        root_dir=\"./Surrealist/Surrealist-evaluated-tests\",\n",
+ "        db_path=\"UAV.db\",\n",
+ "\n",
+ "        ml_topics=[\n",
+ "            \"vehicle_local_position\",\n",
+ "            \"vehicle_attitude\",\n",
+ "            \"sensor_combined\",\n",
+ "            \"vehicle_status\",\n",
+ "        ],\n",
+ "\n",
+ "        resample_us=100_000,\n",
+ "        downsample_factor=10,\n",
+ "        skip_existing=True,\n",
+ "        export_per_mission=True,\n",
+ "    )"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "33410dc3",
+ "metadata": {},
+ "outputs": [],
+ "source": []
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "base",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.13.5"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+ }