---
license: apache-2.0
base_model: Qwen/Qwen2.5-7B-Instruct
tags:
  - text-to-cad
  - code-generation
  - cadquery
  - 3d-modeling
  - reinforcement-learning
language:
  - en
pipeline_tag: text-generation
library_name: transformers
---

# CAD-Coder

**CAD-Coder: Text-to-CAD Generation with Chain-of-Thought and Geometric Reward**

**Accepted at NeurIPS 2025 (Poster)**

This is the reinforcement learning (GRPO) fine-tuned model for generating CadQuery code from natural language descriptions.

## Model Description

CAD-Coder reformulates text-to-CAD as the generation of CadQuery scripts, a Python-based parametric CAD language. The model is trained with a two-stage pipeline:

1. **Supervised Fine-Tuning (SFT)**: learns CadQuery syntax and the text-to-code mapping
2. **Reinforcement Learning (GRPO)**: optimizes geometric accuracy with CAD-specific rewards (Chamfer Distance + Format Reward; see the sketch below)
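
The exact reward formulation is given in the paper; the snippet below is only a minimal sketch of the idea, combining a geometry term (Chamfer Distance between point clouds sampled from the predicted and ground-truth models) with a format term that penalizes non-executable code. The function names, penalty value, and weight `alpha` are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def chamfer_distance(p: np.ndarray, q: np.ndarray) -> float:
    """Symmetric Chamfer distance between point clouds p (N, 3) and q (M, 3)."""
    d2 = np.sum((p[:, None, :] - q[None, :, :]) ** 2, axis=-1)  # (N, M) pairwise squared distances
    return float(d2.min(axis=1).mean() + d2.min(axis=0).mean())

def grpo_reward(pred_points, gt_points, executed_ok: bool, alpha: float = 1.0) -> float:
    """Illustrative reward: the format term gates the geometric term."""
    if not executed_ok:
        return -1.0  # format reward: penalize scripts that fail to execute
    return -alpha * chamfer_distance(pred_points, gt_points)  # lower CD -> higher reward
```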

### Key Features

- Generates executable CadQuery Python code from natural language (see the example after this list)
- Chain-of-Thought (CoT) reasoning for complex CAD structures
- Geometric reward optimization for accurate 3D model generation
- Supports diverse CAD operations beyond simple sketch-extrusion
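
For illustration, here is a hand-written CadQuery script of the kind the model is trained to produce (not actual model output):

```python
import cadquery as cq

# A cylinder of radius 10 mm and height 20 mm with a central 5 mm-radius hole
result = (
    cq.Workplane("XY")
    .circle(10)      # base profile: 10 mm radius
    .extrude(20)     # 20 mm tall cylinder
    .faces(">Z")     # select the top face
    .workplane()     # sketch on that face
    .hole(10)        # through hole, 10 mm diameter (5 mm radius)
)
```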

## Usage

For complete inference scripts, please visit our [GitHub repository](https://github.com/gudo7208/CAD-Coder).

### Installation

```bash
pip install transformers accelerate  # accelerate is required for device_map="auto"
pip install "numpy<2.0" cadquery==2.3.1  # Optional: for executing generated code
```

### Quick Start

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gudo7208/CAD-Coder"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto", device_map="auto")

prompt = "Create a cylinder with radius 10mm and height 20mm, with a central hole of radius 5mm."

text = tokenizer.apply_chat_template(
    [{"role": "user", "content": prompt}],
    tokenize=False,
    add_generation_prompt=True
)
inputs = tokenizer([text], return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=2048)

# Decode only the newly generated tokens (reasoning plus the CadQuery script)
response = tokenizer.decode(outputs[0][inputs.input_ids.shape[1]:], skip_special_tokens=True)
print(response)
```
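
### Executing the Generated Code

The decoded `response` typically contains reasoning followed by a CadQuery script. Below is a minimal sketch of extracting and running it, assuming the model wraps code in a `python` code fence; the regex, the `result` variable name, and the output path are illustrative assumptions, and model-generated code should be treated as untrusted and run in a sandbox.

```python
import re
import cadquery as cq

# Pull the CadQuery script out of the model response (the fence format is an assumption)
match = re.search(r"```python\s*(.*?)```", response, re.DOTALL)
code = match.group(1) if match else response

scope = {"cq": cq, "cadquery": cq}
exec(code, scope)  # caution: execute model-generated code only in a sandboxed environment

# Illustrative assumption: generated scripts bind the final solid to `result`
if "result" in scope:
    cq.exporters.export(scope["result"], "model.stl")
```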

## Performance

| Method | Mean CD | Median CD | IR (%) |
|--------|---------|-----------|--------|
| Text2CAD | 29.29 | 0.37 | 3.75 |
| **CAD-Coder (Ours)** | **6.54** | **0.17** | **1.45** |

*CD values are ×10³; IR is the invalid ratio (the percentage of generations that fail to produce a valid model). Lower is better for all metrics.*

## Training Details

- **Base Model**: Qwen2.5-7B-Instruct
- **Training Data**: 110K text-CadQuery-3D model triplets + 1.5K CoT samples
- **Hardware**: 8× NVIDIA A800 80GB GPUs
- **Framework**: Hugging Face Transformers, DeepSpeed, Verl (GRPO)

## Citation

```bibtex
@article{guan2025cadcoder,
  title={CAD-Coder: Text-to-CAD Generation with Chain-of-Thought and Geometric Reward},
  author={Guan, Yandong and Wang, Xilin and Xing, Ximing and Zhang, Jing and Xu, Dong and Yu, Qian},
  journal={arXiv preprint arXiv:2505.19713},
  year={2025}
}
```

## License

This model is released under the Apache 2.0 License, consistent with the license terms of the base model (Qwen2.5-7B-Instruct).

## Acknowledgements

- Base model: [Qwen2.5-7B-Instruct](https://huggingface.co/Qwen/Qwen2.5-7B-Instruct)
- Training data derived from the [Text2CAD](https://github.com/sadilkhan/Text2CAD) dataset