Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown below.
The dataset generation failed
Error code:   DatasetGenerationError
Exception:    ArrowNotImplementedError
Message:      Cannot write struct type 'value' with no child field to Parquet. Consider adding a dummy child field.
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1871, in _prepare_split_single
                  writer.write_table(table)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 641, in write_table
                  self._build_writer(inferred_schema=pa_table.schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 456, in _build_writer
                  self.pa_writer = self._WRITER_CLASS(self.stream, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 1010, in __init__
                  self.writer = _parquet.ParquetWriter(
                File "pyarrow/_parquet.pyx", line 2157, in pyarrow._parquet.ParquetWriter.__cinit__
                File "pyarrow/error.pxi", line 154, in pyarrow.lib.pyarrow_internal_check_status
                File "pyarrow/error.pxi", line 91, in pyarrow.lib.check_status
              pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'value' with no child field to Parquet. Consider adding a dummy child field.
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1887, in _prepare_split_single
                  num_examples, num_bytes = writer.finalize()
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 660, in finalize
                  self._build_writer(self.schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 456, in _build_writer
                  self.pa_writer = self._WRITER_CLASS(self.stream, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 1010, in __init__
                  self.writer = _parquet.ParquetWriter(
                File "pyarrow/_parquet.pyx", line 2157, in pyarrow._parquet.ParquetWriter.__cinit__
                File "pyarrow/error.pxi", line 154, in pyarrow.lib.pyarrow_internal_check_status
                File "pyarrow/error.pxi", line 91, in pyarrow.lib.check_status
              pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'value' with no child field to Parquet. Consider adding a dummy child field.
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1433, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1050, in convert_to_parquet
                  builder.download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 925, in download_and_prepare
                  self._download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1001, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1742, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1898, in _prepare_split_single
                  raise DatasetGenerationError("An error occurred while generating the dataset") from e
              datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset


Columns (each of type dict): rows, cols, levels, obstacle_prob, goods_prob, shelves, aisles, obstacles, goods, vacant_area, person, drones, forklifts, tasks
rows: { "value": 6, "description": "The number of rows in the warehouse" }
cols: { "value": 6, "description": "The number of columns in the warehouse" }
levels: { "value": 3, "description": "The number of shelf levels in the warehouse" }
obstacle_prob: { "value": 0, "description": "Probability of obstacle generation" }
goods_prob: { "value": 0, "description": "Probability of goods generation" }
shelves: { "description": "The information of all shelves in the warehouse, with the key being shelf ID and the value being detailed shelf information, including ID, capacity, coordinates, type, floor height, and cargo", "value": { "0": { "id": 0, "capacity": 1, "coord": [ 1, 1, 0 ], "type": "double_shelf", "height": 0, "goods": [] }, "1": { "id": 1, "capacity": 1, "coord": [ 1, 1, 1 ], "type": "double_shelf", "height": 1, "goods": [] }, "2": { "id": 2, "capacity": 1, "coord": [ 1, 1, 2 ], "type": "double_shelf", "height": 2, "goods": [] }, "3": { "id": 3, "capacity": 1, "coord": [ 1, 2, 0 ], "type": "double_shelf", "height": 0, "goods": [] }, "4": { "id": 4, "capacity": 1, "coord": [ 1, 2, 1 ], "type": "double_shelf", "height": 1, "goods": [] }, "5": { "id": 5, "capacity": 1, "coord": [ 1, 2, 2 ], "type": "double_shelf", "height": 2, "goods": [] }, "6": { "id": 6, "capacity": 1, "coord": [ 2, 1, 0 ], "type": "double_shelf", "height": 0, "goods": [] }, "7": { "id": 7, "capacity": 1, "coord": [ 2, 1, 1 ], "type": "double_shelf", "height": 1, "goods": [] }, "8": { "id": 8, "capacity": 1, "coord": [ 2, 1, 2 ], "type": "double_shelf", "height": 2, "goods": [] }, "9": { "id": 9, "capacity": 1, "coord": [ 2, 2, 0 ], "type": "double_shelf", "height": 0, "goods": [] }, "10": { "id": 10, "capacity": 1, "coord": [ 2, 2, 1 ], "type": "double_shelf", "height": 1, "goods": [] }, "11": { "id": 11, "capacity": 1, "coord": [ 2, 2, 2 ], "type": "double_shelf", "height": 2, "goods": [] }, "12": { "id": 12, "capacity": 1, "coord": [ 3, 1, 0 ], "type": "double_shelf", "height": 0, "goods": [] }, "13": { "id": 13, "capacity": 1, "coord": [ 3, 1, 1 ], "type": "double_shelf", "height": 1, "goods": [] }, "14": { "id": 14, "capacity": 1, "coord": [ 3, 1, 2 ], "type": "double_shelf", "height": 2, "goods": [] }, "15": { "id": 15, "capacity": 1, "coord": [ 3, 2, 0 ], "type": "double_shelf", "height": 0, "goods": [] }, "16": { "id": 16, "capacity": 1, "coord": [ 3, 2, 1 ], "type": "double_shelf", "height": 1, "goods": [] }, "17": { "id": 17, "capacity": 1, "coord": [ 3, 2, 2 ], "type": "double_shelf", "height": 2, "goods": [] }, "18": { "id": 18, "capacity": 1, "coord": [ 4, 1, 0 ], "type": "double_shelf", "height": 0, "goods": [] }, "19": { "id": 19, "capacity": 1, "coord": [ 4, 1, 1 ], "type": "double_shelf", "height": 1, "goods": [] }, "20": { "id": 20, "capacity": 1, "coord": [ 4, 1, 2 ], "type": "double_shelf", "height": 2, "goods": [] }, "21": { "id": 21, "capacity": 1, "coord": [ 4, 2, 0 ], "type": "double_shelf", "height": 0, "goods": [] }, "22": { "id": 22, "capacity": 1, "coord": [ 4, 2, 1 ], "type": "double_shelf", "height": 1, "goods": [] }, "23": { "id": 23, "capacity": 1, "coord": [ 4, 2, 2 ], "type": "double_shelf", "height": 2, "goods": [] } } }
aisles: { "description": "The information of all aisles in the warehouse, divided into internal aisles and perimeter aisles", "value": { "internal": [ [ 4, 3 ], [ 2, 3 ], [ 3, 3 ], [ 1, 3 ] ], "perimeter": [ [ 4, 0 ], [ 5, 4 ], [ 5, 1 ], [ 0, 2 ], [ 0, 5 ], [ 1, 0 ], [ 2, 5 ], [ 3, 0 ], [ 4, 5 ], [ 5, 0 ], [ 5, 3 ], [ 0, 1 ], [ 0, 4 ], [ 1, 5 ], [ 3, 5 ], [ 5, 2 ], [ 5, 5 ], [ 0, 0 ], [ 0, 3 ], [ 2, 0 ] ] } }
obstacles: { "description": "Information on all obstacles in the warehouse, with the key being obstacle ID and the value being detailed obstacle information, including ID, category, and coordinates", "value": { "0": { "id": "0", "category": "good", "position": [ 4, 14, 0 ] }, "1": { "id": "1", "category": "sundries", "position": [ 1, 0, 0 ] }, "2": { "id": "2", "category": "sundries", "position": [ 9, 7, 0 ] } } }
goods: { "description": "The information of all goods in the warehouse, with the key being the goods ID and the value being the detailed information of the goods, including category, name, ID, and coordinates", "value": {} }
vacant_area: { "value": [ [ 1, 4 ], [ 2, 4 ], [ 3, 4 ], [ 4, 4 ] ], "description": "Coordinate list of each cell in the vacant area of the warehouse" }
person: { "description": "The information of all people in the warehouse, with the key being the person ID and the value being the detailed person information, including name, position, and face matching degree", "value": { "0": { "name": "Jack", "position": [ 8, 12 ], "face_match": 0.7640500840262894 }, "1": { "name": "Grace", "position": [ 8, 12 ], "face_match": 0.6472275034292652 }, "2": { "name": "David", "position": [ 0, 2 ], "face_match": 0.8982275745860175 } } }
drones: { "description": "The information of all drones in the warehouse, with the key being the drone ID and the value being the detailed information of the drone, including ID, coordinates, and mounted devices", "value": { "1": { "id": 1, "coord": [ 0, 0, 0 ], "mount": "RFID" } } }
forklifts: { "description": "The information of all forklifts in the warehouse, with the key being the forklift ID and the value being the detailed information of the forklift, including ID and coordinates. The third dimension of the forklift coordinates is the height of the fork", "value": { "1": { "id": 1, "coord": [ 1, 0, 0 ] } } }
tasks: { "value": { "1": { "task": "The obstacle good at [4, 14, 0] is in the warehouse; recommend moving it somewhere else, such as [2, 13] or anywhere else in the vacant_area", "related_device": "forklift", "task_completion_criteria": "The obstacle good at [4, 14, 0] has been moved somewhere else in the vacant_area" }, "2": { "task": "The obstacle sundries at [1, 0, 0] is in the warehouse; recommend moving it somewhere else, such as [8, 13] or anywhere else in the vacant_area", "related_device": "forklift", "task_completion_criteria": "The obstacle sundries at [1, 0, 0] has been moved somewhere else in the vacant_area" }, "3": { "task": "The person suspected of being Grace has a face matching degree of only 0.6472275034292652 at [8, 12]. It is recommended to drive them away", "related_device": "drone", "task_completion_criteria": "The person named Grace at [8, 12] with a face matching degree less than 0.7 is no longer in the warehouse" }, "4": { "task": "The obstacle sundries at [9, 7, 0] is in the warehouse; recommend moving it somewhere else, such as [6, 13] or anywhere else in the vacant_area", "related_device": "forklift", "task_completion_criteria": "The obstacle sundries at [9, 7, 0] has been moved somewhere else in the vacant_area" } }, "description": "The current tasks of the warehouse; each contains the task, the related device to solve the task, and the task completion criteria." }
README.md exists but content is empty.
Downloads last month: -