|
|
--- |
|
|
license: apache-2.0 |
|
|
--- |
|
|
|
|
|
# FineGrasp: Towards Robust Grasping for Delicate Objects |
|
|
|
|
|
<div align="center" class="authors"> |
|
|
<a href="https://scholar.google.com/citations?user=Ke5SamYAAAAJ&hl=en" target="_blank">Yun Du*</a>, |
|
|
<a href="https://scholar.google.com/citations?hl=en&user=aB02FDEAAAAJ" target="_blank">Mengao Zhao*</a>, |
|
|
<a href="https://wzmsltw.github.io/" target="_blank">Tianwei Lin</a>, |
|
|
<a href="" target="_blank">Yiwei Jin</a>, |
|
|
<a href="" target="_blank">Chaodong Huang</a>, |
|
|
<a href="https://scholar.google.com/citations?user=HQfc8TEAAAAJ&hl=en" target="_blank">Zhizhong Su</a> |
|
|
</div> |
|
|
|
|
|
<div align="center" style="line-height: 3;"> |
|
|
|
|
<a href="https://horizonrobotics.github.io/robot_lab/finegrasp/index.html" target="_blank" style="margin: 2px;"> |
|
|
<img alt="Project" src="https://img.shields.io/badge/Project-Website-blue" style="display: inline-block; vertical-align: middle;"/> |
|
|
</a> |
|
|
<a href="https://github.com/HorizonRobotics/robo_orchard_lab/tree/master/projects/finegrasp_graspnet1b" target="_blank" style="margin: 2px;"> |
|
|
<img alt="FineGrasp Code" src="https://img.shields.io/badge/FineGrasp_Code-Github-blue" style="display: inline-block; vertical-align: middle;"/>
|
|
</a> |
|
|
|
|
<a href="https://github.com/HorizonRobotics/robo_orchard_grasp_label" target="_blank" style="margin: 2px;"> |
|
|
<img alt="Grasp Label Code" src="https://img.shields.io/badge/Grasp_Label_Code-Github-blue" style="display: inline-block; vertical-align: middle;"/>
|
|
</a> |
|
|
<a href="https://arxiv.org/abs/2507.05978" target="_blank" style="margin: 2px;"> |
|
|
<img alt="Paper" src="https://img.shields.io/badge/Paper-arXiv-red" style="display: inline-block; vertical-align: middle;"/> |
|
|
</a> |
|
|
<a href="https://huggingface.co/HorizonRobotics/Finegrasp" target="_blank" style="margin: 2px;"> |
|
|
<img alt="Model" src="https://img.shields.io/badge/Model-HuggingFace-red" style="display: inline-block; vertical-align: middle;"/> |
|
|
</a> |
|
|
|
|
|
</div> |
|
|
|
|
|
|
|
|
## 1. Grasp Performance on the GraspNet-1Billion Dataset
|
|
|
|
|
| Method | ckpt | Camera | Seen (AP) | Similar (AP) | Novel (AP) | Average (AP) | |
|
|
| --------- | ---- | --------- | --------- | ------------ | ---------- | ---------- | |
|
|
| FineGrasp | finegrasp_pipeline/model.safetensors | Realsense | 71.67 | 62.83 | 27.40 | 53.97 | |
|
|
| FineGrasp + CD | finegrasp_pipeline/model.safetensors | Realsense | 73.71 | 64.56 | 28.14 | 55.47 | |
|
|
| FineGrasp + Simulation Data | finegrasp_pipeline_sim/model.safetensors | Realsense | 70.21 | 61.98 | 26.18 | 52.79 | |
|
|
|
|
|
|
|
|
We also provide a grasp-based baseline for the [Challenge Cup](https://developer.d-robotics.cc/tiaozhanbei-2025); please refer to this [code](https://github.com/HorizonRobotics/robo_orchard_lab/tree/master/projects/pick_place_agent).
|
|
> Notice: `finegrasp_pipeline_sim/model.safetensors` is trained for the [Challenge Cup](https://developer.d-robotics.cc/tiaozhanbei-2025) RoboTwin simulation benchmark.
|
|
|
|
|
|
|
|
## 2. Example Inference Code
|
|
```python
|
|
import os |
|
|
import numpy as np |
|
|
import scipy.io as scio |
|
|
from PIL import Image |
|
|
from robo_orchard_lab.models.finegrasp.processor import GraspInput |
|
|
from huggingface_hub import snapshot_download |
|
|
from robo_orchard_lab.inference import InferencePipelineMixin |
|
|
|
|
|
file_path = snapshot_download( |
|
|
repo_id="HorizonRobotics/FineGrasp", |
|
|
allow_patterns=[ |
|
|
"finegrasp_pipeline/**", |
|
|
"data_example/**" |
|
|
], |
|
|
) |
|
|
|
|
|
loaded_pipeline = InferencePipelineMixin.load( |
|
|
os.path.join(file_path, "finegrasp_pipeline") |
|
|
) |
|
|
|
|
|
rgb_image_path = os.path.join(file_path, "data_example/0000_rgb.png") |
|
|
depth_image_path = os.path.join(file_path, "data_example/0000_depth.png") |
|
|
intrinsic_file = os.path.join(file_path, "data_example/0000.mat") |
|
|
|
|
|
depth_image = np.array(Image.open(depth_image_path), dtype=np.float32) |
|
|
rgb_image = np.array(Image.open(rgb_image_path), dtype=np.float32) / 255.0 |
|
|
intrinsic_matrix = scio.loadmat(intrinsic_file)["intrinsic_matrix"] |
|
|
workspace = [-1, 1, -1, 1, 0.0, 2.0]  # [xmin, xmax, ymin, ymax, zmin, zmax] in meters
|
|
depth_scale = 1000.0  # raw depth values are in millimeters
|
|
|
|
|
input_data = GraspInput( |
|
|
rgb_image=rgb_image, |
|
|
depth_image=depth_image, |
|
|
depth_scale=depth_scale, |
|
|
intrinsic_matrix=intrinsic_matrix, |
|
|
workspace=workspace, |
|
|
) |
|
|
|
|
|
loaded_pipeline.to("cuda") |
|
|
loaded_pipeline.model.eval() |
|
|
output = loaded_pipeline(input_data) |
|
|
print(f"Best grasp pose: {output.grasp_poses[0]}") |
|
|
|
|
|
``` |
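To make the inputs above concrete: `depth_scale` converts raw depth values (millimeters here) to meters, `intrinsic_matrix` back-projects pixels to camera-frame 3D points, and `workspace` crops the resulting point cloud to the region of interest. Below is a minimal, self-contained NumPy sketch of that preprocessing step. It is illustrative only, not the pipeline's internal API, and the box layout `[xmin, xmax, ymin, ymax, zmin, zmax]` is our assumption:

```python
import numpy as np

def depth_to_points(depth_image, intrinsic_matrix, depth_scale, workspace):
    """Back-project a depth map to camera-frame 3D points and crop to a workspace box.

    depth_image: HxW array of raw depth values.
    depth_scale: divisor converting raw depth to meters (1000.0 for millimeter depth).
    workspace: [xmin, xmax, ymin, ymax, zmin, zmax] in meters (assumed layout).
    """
    h, w = depth_image.shape
    fx, fy = intrinsic_matrix[0, 0], intrinsic_matrix[1, 1]
    cx, cy = intrinsic_matrix[0, 2], intrinsic_matrix[1, 2]
    us, vs = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_image / depth_scale                     # depth in meters
    x = (us - cx) * z / fx                            # pinhole back-projection
    y = (vs - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    xmin, xmax, ymin, ymax, zmin, zmax = workspace
    mask = (
        (points[:, 0] >= xmin) & (points[:, 0] <= xmax)
        & (points[:, 1] >= ymin) & (points[:, 1] <= ymax)
        & (points[:, 2] >= zmin) & (points[:, 2] <= zmax)
    )
    return points[mask]

# Tiny synthetic example: a 2x2 depth map at 1 m with pinhole intrinsics.
K = np.array([[500.0, 0.0, 1.0], [0.0, 500.0, 1.0], [0.0, 0.0, 1.0]])
depth = np.full((2, 2), 1000.0)  # raw depth in millimeters
pts = depth_to_points(depth, K, 1000.0, [-1, 1, -1, 1, 0.0, 2.0])
```

All four synthetic pixels land inside the default workspace at z = 1 m; points outside the box (e.g. beyond 2 m depth) would be discarded before grasp prediction.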
|
|
|
|
|
## Citation |
|
|
```bibtex
|
|
@misc{du2025finegrasp, |
|
|
title={FineGrasp: Towards Robust Grasping for Delicate Objects}, |
|
|
author={Yun Du and Mengao Zhao and Tianwei Lin and Yiwei Jin and Chaodong Huang and Zhizhong Su}, |
|
|
year={2025}, |
|
|
eprint={2507.05978}, |
|
|
archivePrefix={arXiv}, |
|
|
primaryClass={cs.RO}, |
|
|
url={https://arxiv.org/abs/2507.05978}, |
|
|
} |
|
|
``` |