
Adam Pro SPIDER Human Motion Retargeting Dataset

This dataset contains Adam Pro 31-DOF humanoid motion data after physics-based retargeting and optimization with SPIDER. The trajectory_mjwp.npz files are the optimized MuJoCo-Warp/SPIDER outputs, while trajectory_kinematic.npz stores the corresponding kinematic reference used for comparison and replay.

Repository: RRGGZZ/Adam-pro-spider

Source directory in the SPIDER workspace:

example_datasets/processed/human_motion/adam_pro_31dof/humanoid

Archive prefix inside shards:

adam_pro_31dof/humanoid

Contents

  • Files: 7511
  • Payload size: 3.45 GiB
  • Shards: 4

Each tar shard preserves the processed SPIDER directory layout. Typical motion entries include:

  • scene.xml: MuJoCo scene used for replay.
  • scene_eq.xml: equality-constraint scene variant.
  • task_info.json: task metadata and reference timing.
  • source_info.json: source motion metadata.
  • 0/trajectory_mjwp.npz: SPIDER-optimized physics trajectory.
  • 0/trajectory_kinematic.npz: kinematic reference trajectory.
  • 0/config.yaml: optimization config saved with the run.
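The two trajectory files are standard NumPy .npz archives. A minimal inspection sketch follows; the key names and array layout inside the archives are not documented here, so treat them as unknown, and note that allow_pickle=True may be required if an archive stores object arrays:

```python
import numpy as np

def inspect_npz(path: str) -> dict:
    """Map each array key in an .npz archive to its (shape, dtype).

    allow_pickle=True is needed when an archive stores object arrays;
    only use it on files you trust, since unpickling can execute code.
    """
    with np.load(path, allow_pickle=True) as data:
        return {k: (data[k].shape, str(data[k].dtype)) for k in data.files}

# Illustrative path; point it at any extracted motion directory:
# inspect_npz("example_datasets/processed/human_motion/adam_pro_31dof/"
#             "humanoid/<motion>/0/trajectory_mjwp.npz")
```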

Download and Restore

From your SPIDER workspace, download the dataset repository and extract the shards:

hf download RRGGZZ/Adam-pro-spider \
  --repo-type dataset \
  --local-dir hf_download/Adam-pro-spider
mkdir -p example_datasets/processed/human_motion
for shard in hf_download/Adam-pro-spider/data/*.tar; do
  tar -xf "$shard" -C example_datasets/processed/human_motion
done
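The same restore can also be done without a shell, using Python's tarfile module (a sketch mirroring the loop above; the shards already carry the adam_pro_31dof/humanoid prefix):

```python
import tarfile
from pathlib import Path

def restore_shards(shard_dir: str, dest: str) -> int:
    """Extract every .tar shard in shard_dir into dest; return shard count.

    dest should be the example_datasets/processed/human_motion directory,
    since member paths inside the shards include adam_pro_31dof/humanoid.
    """
    dest_path = Path(dest)
    dest_path.mkdir(parents=True, exist_ok=True)
    shards = sorted(Path(shard_dir).glob("*.tar"))
    for shard in shards:
        with tarfile.open(shard) as tf:
            tf.extractall(dest_path)
    return len(shards)

# restore_shards("hf_download/Adam-pro-spider/data",
#                "example_datasets/processed/human_motion")
```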

The extracted tree will include:

example_datasets/processed/human_motion/adam_pro_31dof/humanoid

Raw Adam PKL Split Motions

This repository also includes the split Adam PKL source motions that serve as input to the SPIDER/MJWP optimization. These files are stored separately from the optimized trajectories:

raw/adam_pkl_split

  • Source workspace path: human_motion/adam_pkl_split
  • Archive prefix inside shards: human_motion/adam_pkl_split
  • Files: 4660
  • Payload size: 5.60 GiB
  • Shards: 6

To download only this raw PKL split subset:

hf download RRGGZZ/Adam-pro-spider \
  --repo-type dataset \
  --include "raw/adam_pkl_split/*" \
  --local-dir hf_download/Adam-pro-spider

Restore it from the SPIDER workspace root:

mkdir -p human_motion
for shard in hf_download/Adam-pro-spider/raw/adam_pkl_split/data/*.tar; do
  tar -xf "$shard" -C .
done

The extracted tree will include:

human_motion/adam_pkl_split
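The split motions are Python pickle files. A minimal sketch for peeking at one follows; the internal structure of these pickles is not documented here, so the dict handling is only a guess, and unpickling runs arbitrary code, so only load files you trust:

```python
import pickle

def describe_pkl(path: str):
    """Load one split Adam PKL motion and summarize its top level.

    The payload structure is an assumption: many motion pickles are
    dicts of arrays, but adjust this once you see the real layout.
    """
    with open(path, "rb") as f:
        obj = pickle.load(f)
    if isinstance(obj, dict):
        return {k: type(v).__name__ for k, v in obj.items()}
    return type(obj).__name__

# Illustrative path:
# describe_pkl("human_motion/adam_pkl_split/<motion>.pkl")
```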

Replay

After extracting the shards, replay one optimized motion with the aligned viewer:

./replay_aligned.sh adam_0-walk_forward_relax_impro_002__A001

Or replay directly with headless_replay.sh:

./headless_replay.sh \
  --task adam_0-walk_forward_relax_impro_002__A001 \
  --data-id 0 \
  --robot-type adam_pro_31dof \
  --config-yaml spider/assets/robots/adam_pro_31dof/config.yaml \
  --replay-only \
  --no-process \
  --replay-mode mujoco \
  --show-reference

Manifest

See manifest.json for shard checksums and file counts.
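If the manifest maps shard filenames to SHA-256 digests (an assumption; the exact schema is not shown here), downloaded shards can be verified with a short script:

```python
import hashlib
import json
from pathlib import Path

def verify_shards(manifest_path: str, shard_dir: str) -> bool:
    """Check SHA-256 digests of downloaded shards against manifest.json.

    Assumes the manifest is a flat JSON object of
    {shard_filename: hex_sha256}; adapt the lookup if the real
    manifest.json nests checksums differently.
    """
    manifest = json.loads(Path(manifest_path).read_text())
    ok = True
    for name, expected in manifest.items():
        digest = hashlib.sha256(Path(shard_dir, name).read_bytes()).hexdigest()
        if digest != expected:
            print(f"checksum mismatch: {name}")
            ok = False
    return ok
```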

Maintainer Upload

To rebuild and upload this dataset after adding new processed motions, run from the SPIDER workspace:

.venv/bin/python scripts/package_hf_dataset.py \
  --repo-id RRGGZZ/Adam-pro-spider \
  --upload \
  --commit-message "Update Adam Pro 31DOF humanoid SPIDER optimized dataset"

If the shards are already built under hf_upload/adam_pro_31dof_humanoid, upload only the existing staging folder:

.venv/bin/python scripts/package_hf_dataset.py \
  --repo-id RRGGZZ/Adam-pro-spider \
  --upload \
  --upload-only \
  --commit-message "Update Adam Pro 31DOF humanoid SPIDER optimized dataset"

To package and upload the raw Adam PKL split subset:

.venv/bin/python scripts/package_hf_dataset.py \
  --source-dir human_motion/adam_pkl_split \
  --out-dir hf_upload/adam_pkl_split \
  --archive-prefix human_motion/adam_pkl_split \
  --shard-prefix adam_pkl_split \
  --path-in-repo raw/adam_pkl_split \
  --no-readme \
  --upload \
  --commit-message "Upload Adam PKL split source motions"