Add pipeline tag, GitHub link, installation, and sample usage

#1
by nielsr (HF Staff), opened
Files changed (1)
  1. README.md +46 -5
README.md CHANGED
@@ -1,20 +1,61 @@
  ---
  license: cc-by-4.0
+ pipeline_tag: robotics
  ---

  <h1 align="center">Particulate: Feed-Forward 3D Object Articulation</h1>
  <p align="center">
  <a href="https://arxiv.org/abs/2512.11798"><img src="https://img.shields.io/badge/arXiv-2512.11798-b31b1b" alt="arXiv"></a>
  <a href="https://ruiningli.com/particulate"><img src="https://img.shields.io/badge/Project_Page-green" alt="Project Page"></a>
+ <a href='https://github.com/ruiningli/particulate'><img src='https://img.shields.io/badge/GitHub-Code-black.svg?logo=github' alt='GitHub Code'></a>
  <a href='https://huggingface.co/spaces/rayli/particulate'><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Demo-blue'></a>
  </p>

- <!-- Overview -->
- ## 🌟 Overview
- - A feed-forward model that, given a single static 3D mesh of an everyday object, directly infers all attributes of the underlying articulated structure, including its 3D parts, kinemaic structure, and motion constraints.
- - The model is a 150M-parameter transformer model trained end-to-end on public datasets of articulated objects.
+ This model's weights are licensed under [CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/). The accompanying code is licensed under [Apache-2.0](https://github.com/ruiningli/particulate/blob/main/LICENSE).

  ## 🌟 Overview

- ![i_bS4dh6DqcsLEcppQblG](https://cdn-uploads.huggingface.co/production/uploads/6467cc17e7a6a374fd1a41e5/nHQsN8TTSIjZlJ365e4aM.png)
+ Particulate is a feed-forward approach that, given a single static 3D mesh of an everyday object, directly infers **all** attributes of the underlying articulated structure, including its 3D parts, kinematic structure, and motion constraints.
+
+ ### Key Features
+ - **Ultra-fast Inference**: Our model recovers a fully articulated 3D object with a single forward pass in ~10 seconds.
+ - **SOTA Performance**: Our model significantly outperforms prior methods on the task of 3D articulation estimation.
+ - **GenAI Compatible**: Our model can also accurately infer the articulated structure of AI-generated 3D assets, enabling full-fledged generation of articulated assets from images or text when combined with an off-the-shelf 3D generator.
+
+ ![Particulate Overview](https://cdn-uploads.huggingface.co/production/uploads/6467cc17e7a6a374fd1a41e5/nHQsN8TTSIjZlJ365e4aM.png)
+
+ ## 💻 Code
+ The official code repository for Particulate can be found on GitHub: [https://github.com/ruiningli/particulate](https://github.com/ruiningli/particulate)
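+
+ To run the steps below locally, first clone the repository (a minimal sketch; the directory name simply follows the repository URL):
+ ```bash
+ git clone https://github.com/ruiningli/particulate
+ cd particulate
+ ```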
+
+ ## 🔧 Installation
+ Our implementation is tested with pytorch==2.4.0 and CUDA 12.4 on Ubuntu 22.04.
+ ```bash
+ conda create -n particulate python=3.10
+ conda activate particulate
+ pip install -r requirements.txt
+ ```
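+
+ As a quick sanity check before running inference, you can confirm that the expected PyTorch and CUDA versions are visible:
+ ```bash
+ # Prints the torch version, the CUDA version it was built with, and GPU availability.
+ python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
+ ```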
+
+ ## 🚀 Sample Usage
+ To use our model to predict the articulated structure of a custom 3D model (alternatively, you can try our [demo](https://huggingface.co/spaces/rayli/particulate) on Hugging Face without any local setup):
+
+ ```bash
+ python infer.py --input_mesh ./hunyuan3d-examples/foldingchair.glb
+ ```
+
+ The script will automatically download the pre-trained checkpoint from Hugging Face.
+
+ ### Extra Arguments for Inference
+ The following flags tune inference; a combined invocation is sketched after this list.
+ - `up_dir`: The up direction of the input mesh. Our model is trained on 3D models whose up direction is +Z; for optimal results, make sure the input mesh follows the same convention. Given this argument, the script automatically rotates the input model to be +Z up. You can use the visualization in the [demo](https://huggingface.co/spaces/rayli/particulate) to determine the up direction.
+ - `num_points`: The number of points sampled as input to the network. Note that we sample 50% of the points uniformly and the remaining 50% from *sharp* edges. Please make sure the number of uniform points is larger than the number of faces in the input mesh.
+ - `min_part_confidence`: Increasing this value merges parts with low confidence scores into other parts. Consider increasing it if the prediction is over-segmented.
+ - `no_strict`: By default, the prediction is post-processed to ensure that each articulated part is a union of connected components in the original mesh (i.e., no connected component is split across parts). If the input mesh does **not** have clean connected components, please specify `--no_strict`.
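+
+ A hypothetical combined invocation (the flag spellings follow the list above, but the values, the `+Y` axis notation, and the mesh path are illustrative assumptions rather than tested defaults):
+ ```bash
+ # Illustrative values only; adjust for your mesh.
+ python infer.py --input_mesh ./my_object.glb \
+     --up_dir +Y \
+     --num_points 20000 \
+     --min_part_confidence 0.5 \
+     --no_strict
+ ```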

+ ## 📝 Citation

+ ```bibtex
+ @article{li2025particulate,
+   title   = {Particulate: Feed-Forward 3D Object Articulation},
+   author  = {Ruining Li and Yuxin Yao and Chuanxia Zheng and Christian Rupprecht and Joan Lasenby and Shangzhe Wu and Andrea Vedaldi},
+   journal = {arXiv preprint arXiv:2512.11798},
+   year    = {2025}
+ }
+ ```