
GR00T-N1.7 UR5 Real-World Finetuned Checkpoints

Two task-specific models finetuned from nvidia/GR00T-N1.7-3B on real-world UR5 datasets. The training state has been stripped; only the files needed for inference / deployment remain.

Repository contents

| Subdirectory | Task | Source checkpoint | Training loss |
|---|---|---|---|
| ur5_lab_bread_6d_finetune/ | Bread grasping (bread) | step 9000 | ~0.0055 |
| ur5_lab_cube_6d_finetune/ | Cube grasping (cube) | step 10000 | ~0.0098 |

Each task subdirectory is flattened into a layout that from_pretrained can load directly:

<task>/
├── config.json                       # Gr00tN1d7 model config
├── model-00001-of-00002.safetensors  # ~5.0 GB
├── model-00002-of-00002.safetensors  # ~1.9 GB
├── model.safetensors.index.json
├── processor_config.json             # input preprocessing
├── statistics.json                   # action / state normalization
├── embodiment_id.json                # embodiment tag → id mapping
└── experiment_cfg/                   # full training-time config (used to build the modality config at inference)
    ├── conf.yaml
    ├── config.yaml
    ├── dataset_statistics.json
    ├── final_model_config.json
    └── final_processor_config.json

Training-only files have been stripped: the DeepSpeed optimizer states, trainer_state.json, scheduler.pt, rng_state_*.pth, and so on.

Key model / data configuration

  • Base model: nvidia/GR00T-N1.7-3B (Cosmos-Reason2-2B backbone)
  • Embodiment tag: new_embodiment (id = 10)
  • Vision inputs: video.cam0 + video.cam1; raw frames are first cropped to 230×230, then resized to 256×256
  • State inputs: state.eef + state.gripper (single frame, state_history_length=1)
  • Action outputs:
    • action.eef: relative pose, format xyz+rot6d (9-dim)
    • action.gripper: absolute value, format default (1-dim)
    • action_horizon = 40, using delta indices 0..15
  • Inference dtype: bfloat16
  • See experiment_cfg/conf.yaml in each subdirectory for the full parameters
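The 9-dim action.eef above is a 3-dim xyz translation plus the standard continuous 6D rotation representation (the first two columns of a rotation matrix). GR00T's processor may lay the values out differently; this is a minimal numpy sketch of the round trip, with hypothetical helper names:

```python
import numpy as np

def rotmat_to_rot6d(R):
    """Take the first two columns of a 3x3 rotation matrix, flattened to 6 values."""
    return R[:, :2].T.reshape(-1)

def rot6d_to_rotmat(d6):
    """Gram-Schmidt the two 3-vectors back into an orthonormal rotation matrix."""
    a1, a2 = d6[:3], d6[3:]
    b1 = a1 / np.linalg.norm(a1)
    b2 = a2 - np.dot(b1, a2) * b1
    b2 = b2 / np.linalg.norm(b2)
    b3 = np.cross(b1, b2)          # third column is determined by the first two
    return np.stack([b1, b2, b3], axis=1)

# One EEF action step: 3 (xyz) + 6 (rot6d) = 9 dims
identity_6d = np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0])
assert np.allclose(rot6d_to_rotmat(identity_6d), np.eye(3))
```

Because the two columns of a valid rotation matrix are already orthonormal, the round trip is exact; the Gram-Schmidt step only matters when a network predicts an unconstrained 6-vector.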

Installing Isaac-GR00T

git clone https://github.com/NVIDIA/Isaac-GR00T.git
cd Isaac-GR00T
pip install -e .
pip install flash-attn --no-build-isolation

A GPU with CUDA is required. Inference needs roughly 14 GB of VRAM (bf16).
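The ~14 GB figure is consistent with a back-of-envelope estimate; a quick sketch of the arithmetic:

```python
# Rough VRAM estimate for a ~3B-parameter model in bf16, weights only.
# Activations, the KV cache, and CUDA overhead account for the gap to ~14 GB observed.
n_params = 3.0e9
weights_gb = n_params * 2 / 1e9   # 2 bytes per bf16 parameter
print(f"weights alone: ~{weights_gb:.0f} GB")
```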

Downloading a checkpoint

from huggingface_hub import snapshot_download

# Download only the bread checkpoint (~7 GB)
bread_dir = snapshot_download(
    repo_id="yqi19/gr00t_real_checkpoint",
    repo_type="dataset",
    allow_patterns="ur5_lab_bread_6d_finetune/*",
)
model_path = f"{bread_dir}/ur5_lab_bread_6d_finetune"

Or download everything at once (~14 GB):

local_dir = snapshot_download(
    repo_id="yqi19/gr00t_real_checkpoint",
    repo_type="dataset",
)

Loading / inference

The simplest route is to reuse the scripts that ship with the Isaac-GR00T repo, pointing --model_path at the directory extracted above. Common entry points:

  • scripts/eval_policy.py: open-loop evaluation on a single trajectory
  • scripts/inference_service.py: start a policy service that receives observations and returns actions

For example:

cd Isaac-GR00T
python scripts/eval_policy.py \
    --model_path /path/to/ur5_lab_bread_6d_finetune \
    --embodiment_tag new_embodiment \
    --data_config new_embodiment              # register this tag yourself if it is not in DATA_CONFIG_MAP

If your data_config does not include new_embodiment, you can register one by copying the training-time modality config (under the data.modality_configs.new_embodiment key in experiment_cfg/conf.yaml). The key fields:

video:    { modality_keys: [cam0, cam1] }
state:    { modality_keys: [eef, gripper] }
action:
  modality_keys: [eef, gripper]
  delta_indices: [0..15]
  action_configs:
    - { state_key: eef,     type: eef,     rep: relative, format: xyz+rot6d }
    - { state_key: gripper, type: non_eef, rep: absolute, format: default }

Quick smoke test

Confirm that the weights and config are self-consistent (loads the model only; no data pipeline):

from gr00t.model.gr00t_n1d7 import Gr00tN1d7   # check the class name against the Isaac-GR00T repo
m = Gr00tN1d7.from_pretrained(model_path, torch_dtype="bfloat16")
print("loaded", sum(p.numel() for p in m.parameters()) / 1e9, "B params")

Training background

| Setting | Value |
|---|---|
| Base checkpoint | nvidia/GR00T-N1.7-3B |
| Data | bread_gr00t_6d / cube_gr00t_6d (real UR5, 6D EEF + gripper) |
| max_steps | 10000 |
| optimizer | adamw_torch, lr 1e-4, cosine schedule, weight_decay 1e-5, warmup 5% |
| batch | global 128 across 4 GPUs, DeepSpeed ZeRO-2 |
| precision | bf16 |

bread uses step 9000, the lowest point on the loss curve (~0.0055); by step 10000 the loss rises slightly to ~0.0083. cube uses step 10000.
