Dataset Viewer
The dataset viewer is not available for this split.
Cannot load the dataset split (in streaming mode) to extract the first rows.
Error code: StreamingRowsError
Exception: RuntimeError
Message: Disallowed deserialization of 'arrow.py_extension_type':
storage_type = list<item: list<item: float>>
serialized = b'\x80\x04\x95K\x00\x00\x00\x00\x00\x00\x00\x8c\x1adatasets.features.features\x94\x8c\x14Array2DExtensionType\x94\x93\x94K\x02K\x03\x86\x94\x8c\x07float32\x94\x86\x94R\x94.'
pickle disassembly:
0: \x80 PROTO 4
2: \x95 FRAME 75
11: \x8c SHORT_BINUNICODE 'datasets.features.features'
39: \x94 MEMOIZE (as 0)
40: \x8c SHORT_BINUNICODE 'Array2DExtensionType'
62: \x94 MEMOIZE (as 1)
63: \x93 STACK_GLOBAL
64: \x94 MEMOIZE (as 2)
65: K BININT1 2
67: K BININT1 3
69: \x86 TUPLE2
70: \x94 MEMOIZE (as 3)
71: \x8c SHORT_BINUNICODE 'float32'
80: \x94 MEMOIZE (as 4)
81: \x86 TUPLE2
82: \x94 MEMOIZE (as 5)
83: R REDUCE
84: \x94 MEMOIZE (as 6)
85: . STOP
highest protocol among opcodes = 4
Reading of untrusted Parquet or Feather files with a PyExtensionType column
allows arbitrary code execution.
If you trust this file, you can enable reading the extension type by one of:
- upgrading to pyarrow >= 14.0.1, and call `pa.PyExtensionType.set_auto_load(True)`
- disable this error by running `import pyarrow_hotfix; pyarrow_hotfix.uninstall()`
We strongly recommend updating your Parquet/Feather files to use extension types
derived from `pyarrow.ExtensionType` instead, and register this type explicitly.
See https://arrow.apache.org/docs/dev/python/extending_types.html#defining-extension-types-user-defined-types
for more details.
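The pickle disassembly shown in the message can be reproduced locally with the standard library's `pickletools`, which decodes opcodes without executing the pickle (unlike `pickle.loads`, which is what makes an untrusted payload dangerous). A minimal sketch using the serialized bytes quoted above:

```python
import io
import pickletools

# Serialized PyExtensionType payload copied from the error message above.
serialized = (b'\x80\x04\x95K\x00\x00\x00\x00\x00\x00\x00'
              b'\x8c\x1adatasets.features.features\x94'
              b'\x8c\x14Array2DExtensionType\x94\x93\x94'
              b'K\x02K\x03\x86\x94\x8c\x07float32\x94\x86\x94R\x94.')

# pickletools.dis only disassembles; it never imports or calls anything,
# so it is safe to run on untrusted pickles.
out = io.StringIO()
pickletools.dis(serialized, out=out)
print(out.getvalue())
```

The output lists the same opcodes as the message (`STACK_GLOBAL` resolving `datasets.features.features.Array2DExtensionType`, `REDUCE` calling it with `((2, 3), 'float32')`), which is exactly the arbitrary-import-and-call behavior the hotfix blocks.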
Traceback:
Traceback (most recent call last):
File "/src/services/worker/src/worker/job_runners/split/first_rows.py", line 323, in compute
compute_first_rows_from_parquet_response(
File "/src/services/worker/src/worker/job_runners/split/first_rows.py", line 88, in compute_first_rows_from_parquet_response
rows_index = indexer.get_rows_index(
File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 631, in get_rows_index
return RowsIndex(
File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 512, in __init__
self.parquet_index = self._init_parquet_index(
File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 529, in _init_parquet_index
response = get_previous_step_or_raise(
File "/src/libs/libcommon/src/libcommon/simple_cache.py", line 566, in get_previous_step_or_raise
raise CachedArtifactError(
libcommon.simple_cache.CachedArtifactError: The previous step failed.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/services/worker/src/worker/utils.py", line 92, in get_rows_or_raise
return get_rows(
File "/src/libs/libcommon/src/libcommon/utils.py", line 183, in decorator
return func(*args, **kwargs)
File "/src/services/worker/src/worker/utils.py", line 69, in get_rows
rows_plus_one = list(itertools.islice(ds, rows_max_number + 1))
File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 1389, in __iter__
for key, example in ex_iterable:
File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 282, in __iter__
for key, pa_table in self.generate_tables_fn(**self.kwargs):
File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/parquet/parquet.py", line 86, in _generate_tables
parquet_file = pq.ParquetFile(f)
File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 341, in __init__
self.reader.open(
File "pyarrow/_parquet.pyx", line 1262, in pyarrow._parquet.ParquetReader.open
File "pyarrow/types.pxi", line 88, in pyarrow.lib._datatype_to_pep3118
File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow_hotfix/__init__.py", line 47, in __arrow_ext_deserialize__
raise RuntimeError(
RuntimeError: Disallowed deserialization of 'arrow.py_extension_type':
Dataset Card for "fill100k"
Similar to https://huggingface.co/datasets/offchan/fill50k, but with more images, lower resolution, a uniform circle radius, and uniform color randomization covering the full spectrum of possible colors.
Colors are a 2x3 array containing the circle color followed by the background color, in RGB format. Color names are derived from the closest color in a list of predefined common colors. The distance between two colors is calculated as the sum of weighted squared per-channel differences, with weights 0.299, 0.587, and 0.114 for R, G, and B respectively.
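The color-naming rule above can be sketched in Python. Only the weights (0.299, 0.587, 0.114) come from the card; the palette contents and the helper names below are hypothetical:

```python
# Luma-style channel weights from the dataset card (R, G, B).
WEIGHTS = (0.299, 0.587, 0.114)

def color_distance(c1, c2):
    """Sum of weighted squared per-channel differences between two RGB tuples."""
    return sum(w * (a - b) ** 2 for w, a, b in zip(WEIGHTS, c1, c2))

def closest_color_name(color, palette):
    """Name of the palette entry nearest to `color`.

    `palette` maps name -> (r, g, b); a hypothetical stand-in for the
    card's list of predefined common colors.
    """
    return min(palette, key=lambda name: color_distance(color, palette[name]))

palette = {"red": (255, 0, 0), "green": (0, 255, 0), "blue": (0, 0, 255)}
closest_color_name((250, 10, 10), palette)  # -> "red"
```

Note that the green channel dominates the distance, so two colors differing mainly in green read as "further apart" than the same difference in blue, matching how perceived brightness weights RGB channels.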
Downloads last month: 22