nielsr (HF Staff) committed on
Commit f25dfd3 · verified · 1 Parent(s): 827ee88

Improve model card: add `transformers` library_name, paper abstract, and update links


This PR enhances the model card by:
- Adding `library_name: transformers` to the metadata, enabling the automated "how to use" code snippet on the Hugging Face Hub.
- Updating the primary paper link to the Hugging Face Papers page ([GraspMolmo: Generalizable Task-Oriented Grasping via Large-Scale Synthetic Data Generation](https://huggingface.co/papers/2505.13441)) for better discoverability.
- Including a direct link to the project's GitHub repository: [[Code]](https://github.com/abhaybd/GraspMolmo).
- Adding the paper's abstract to provide a comprehensive overview of the model.
- Adding the BibTeX citation from the GitHub README for proper academic referencing.
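
Taken together, the metadata changes leave the card's YAML front matter looking roughly like this (a sketch assembled from the diff in this commit; key order is not significant to the Hub):

```yaml
base_model:
- allenai/Molmo-7B-D-0924
datasets:
- allenai/PRISM
language:
- en
license: mit
pipeline_tag: robotics
tags:
- robotics
- grasping
- task-oriented-grasping
- manipulation
library_name: transformers
```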

Files changed (1)
  1. README.md +22 -4
README.md CHANGED
@@ -1,25 +1,29 @@
 ---
-license: mit
+base_model:
+- allenai/Molmo-7B-D-0924
 datasets:
 - allenai/PRISM
 language:
 - en
-base_model:
-- allenai/Molmo-7B-D-0924
+license: mit
 pipeline_tag: robotics
 tags:
 - robotics
 - grasping
 - task-oriented-grasping
 - manipulation
+library_name: transformers
 ---
 
 # GraspMolmo
 
-[[Paper]](https://arxiv.org/pdf/2505.13441) [[arXiv]](https://arxiv.org/abs/2505.13441) [[Project Website]](https://abhaybd.github.io/GraspMolmo/) [[Data]](https://huggingface.co/datasets/allenai/PRISM)
+[[Paper]](https://huggingface.co/papers/2505.13441) [[arXiv]](https://arxiv.org/abs/2505.13441) [[Project Website]](https://abhaybd.github.io/GraspMolmo/) [[Data]](https://huggingface.co/datasets/allenai/PRISM) [[Code]](https://github.com/abhaybd/GraspMolmo)
 
 GraspMolmo is a generalizable open-vocabulary task-oriented grasping (TOG) model for robotic manipulation. Given an image and a task to complete (e.g. "Pour me some tea"), GraspMolmo will point to the most appropriate grasp location, which can then be matched to the closest stable grasp.
 
+## Paper Abstract
+We present GraspMolmo, a generalizable open-vocabulary task-oriented grasping (TOG) model. GraspMolmo predicts semantically appropriate, stable grasps conditioned on a natural language instruction and a single RGB-D frame. For instance, given "pour me some tea", GraspMolmo selects a grasp on a teapot handle rather than its body. Unlike prior TOG methods, which are limited by small datasets, simplistic language, and uncluttered scenes, GraspMolmo learns from PRISM, a novel large-scale synthetic dataset of 379k samples featuring cluttered environments and diverse, realistic task descriptions. We fine-tune the Molmo visual-language model on this data, enabling GraspMolmo to generalize to novel open-vocabulary instructions and objects. In challenging real-world evaluations, GraspMolmo achieves state-of-the-art results, with a 70% prediction success on complex tasks, compared to the 35% achieved by the next best alternative. GraspMolmo also successfully demonstrates the ability to predict semantically correct bimanual grasps zero-shot. We release our synthetic dataset, code, model, and benchmarks to accelerate research in task-semantic robotic manipulation, which, along with videos, are available at the project website.
+
 ## Code Sample
 
 ```python
@@ -75,4 +79,18 @@ gm = GraspMolmo()
 idx = gm.pred_grasp(rgb, point_cloud, task, grasps)
 
 print(f"Predicted grasp: {grasps[idx]}")
+```
+
+## Citation
+
+```
+@misc{deshpande2025graspmolmo,
+    title={GraspMolmo: Generalizable Task-Oriented Grasping via Large-Scale Synthetic Data Generation},
+    author={Abhay Deshpande and Yuquan Deng and Arijit Ray and Jordi Salvador and Winson Han and Jiafei Duan and Kuo-Hao Zeng and Yuke Zhu and Ranjay Krishna and Rose Hendrix},
+    year={2025},
+    eprint={2505.13441},
+    archivePrefix={arXiv},
+    primaryClass={cs.RO},
+    url={https://arxiv.org/abs/2505.13441},
+}
 ```
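
The card describes GraspMolmo as pointing to a grasp location "which can then be matched to the closest stable grasp." A minimal sketch of that matching step, assuming the predicted point and the candidate grasp positions have already been projected into a common 2D frame (the function name and data layout are hypothetical, not the GraspMolmo repository's API — its `pred_grasp` handles this internally):

```python
import math

def match_grasp(pred_point, grasp_points):
    """Return the index of the candidate grasp closest to the predicted
    point under Euclidean distance. Points are (x, y) tuples. Illustrative
    helper only; not part of the GraspMolmo package."""
    return min(
        range(len(grasp_points)),
        key=lambda i: math.dist(pred_point, grasp_points[i]),
    )

# Hypothetical candidates: the predicted point lands nearest the second one.
candidates = [(10.0, 10.0), (52.0, 48.0), (90.0, 20.0)]
idx = match_grasp((50.0, 50.0), candidates)
print(idx)  # -> 1
```

In practice the repository's `pred_grasp` works over a 3D point cloud and full grasp poses rather than 2D pixels, but the nearest-candidate selection shown here is the same idea.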