# Cloth-splatters/folding-meshes
Synthetic cloth-folding trajectories for training graph-based dynamics and state-estimation models. Generated in MuJoCo with variable-topology cloth meshes (circles, with/without holes, at varying resolutions).
Used by:

- Cloth-splatters/folding-dynamics-gns
- Cloth-splatters/folding-dynamics-edge-gnn
- Cloth-splatters/folding-state-est-gps
## Files

| File | Size | Train / Val cloths | Cameras |
|---|---|---|---|
| fold_seed_1397.h5 | 1.95 GB | 105 / 45 | cam_0 |
| fold_seed_5305.h5 | 784 MB | — | cam_0 |
| fold_multicam_seed_5305.h5 | 769 MB | 10 / 10 | cam_0, cam_16, cam_28 |
fold_multicam_seed_5305.h5 is the canonical evaluation set (multi-view).
Different seeds correspond to independent generator runs and are not
deterministic re-renders of each other.
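
If you prefer to enumerate the files programmatically rather than rely on the table, a short sketch using the standard Hub API (not specific to this dataset):

```python
from huggingface_hub import list_repo_files

# List every file in the dataset repo, then keep the HDF5 files.
files = list_repo_files("Cloth-splatters/folding-meshes", repo_type="dataset")
print([f for f in files if f.endswith(".h5")])
```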
## Schema

Each HDF5 file has three top-level groups:

```
metadata/
  rendering_parameters/              # Blender-style camera intrinsics + per-frame transforms
training/
  <cloth_name>/                      # e.g. circle_with_0_holes_1
    edges              (E, 2) int32
    faces              (F, 3) int32
    rest_positions     (V, 3) float64
    trajectory_<i>/
      actuated_vertices (V, 1) float64   # one-hot picker assignment
      goal_vertices     (V, 1) float64
      step_0000/
        positions      (V, 3) float64
        velocities     (V, 3) float64
        gripper_pos    (1, 3) float64
        pointclouds/cam_*    # (N, 3) per camera
        images/cam_*         # (H, W, 3) RGB
        depth/cam_*          # (H, W) float
        segmentation/cam_*   # (H, W) int
      step_0001/ ...
validation/
  <cloth_name>/ ...
```
The vertex count V varies per cloth (mesh resolutions range from roughly 44 to 71
vertices in the current files). Edge and face indices are 0-based indices into `positions`.
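
As a quick orientation for the schema above, a minimal sketch of reading one trajectory step with `h5py` (assuming the file has already been downloaded locally, as described under Loading below; group names follow the listing above but are best confirmed with `.keys()` on a given file):

```python
import h5py

with h5py.File("fold_seed_1397.h5", "r") as f:
    # Pick an arbitrary training cloth and one of its trajectories.
    cloth_name = next(iter(f["training"]))
    cloth = f["training"][cloth_name]

    edges = cloth["edges"][:]                    # (E, 2) int32
    rest_positions = cloth["rest_positions"][:]  # (V, 3) float64

    traj = cloth[next(k for k in cloth if k.startswith("trajectory_"))]
    step = traj["step_0000"]

    positions = step["positions"][:]             # (V, 3) float64
    cloud = step["pointclouds"]["cam_0"][:]      # (N, 3)
    print(cloth_name, positions.shape, cloud.shape)
```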
## Loading

The recommended path is via the project's `src/hub.py`:

```python
from src.hub import resolve_dataset

path = resolve_dataset("folding-meshes", filename="fold_seed_1397.h5")
```
Or directly:

```python
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    "Cloth-splatters/folding-meshes",
    filename="fold_seed_1397.h5",
    repo_type="dataset",
)
```
## Generation

Trajectories were produced by a MuJoCo-based folding-action sampler over
procedurally generated 2D cloth meshes (with optional holes). Per-frame
camera transforms in `metadata/rendering_parameters` follow the Blender/NeRF
convention.
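
The exact key layout under `metadata/rendering_parameters` is not spelled out here, so the safest approach is to inspect it on a downloaded file; a small sketch:

```python
import h5py

with h5py.File("fold_multicam_seed_5305.h5", "r") as f:
    params = f["metadata"]["rendering_parameters"]
    # Print every entry under the group with its shape (if any); key names
    # are not documented above, so discover them rather than hard-coding.
    params.visititems(
        lambda name, obj: print(name, getattr(obj, "shape", ""))
    )
```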
## License
MIT.