Improve model card: Add metadata, links, description, and citation
#1 by nielsr (HF Staff) - opened

README.md CHANGED

@@ -1,4 +1,29 @@
-
+---
+pipeline_tag: text-generation
+library_name: transformers
+license: apache-2.0
+tags:
+- code-generation
+- verilog
+---
+
+# QiMeng-SALV: Signal-Aware Learning for Verilog Code Generation
+
+This repository contains the `SALV-Qwen2.5-Coder-7B-Instruct` model, a model for Verilog code generation presented in the paper [QiMeng-SALV: Signal-Aware Learning for Verilog Code Generation](https://huggingface.co/papers/2510.19296).
+
+QiMeng-SALV introduces a framework for Verilog code generation that shifts reinforcement learning optimization from module-level to signal-level rewards. By combining Abstract Syntax Tree (AST) analysis with signal-aware verification, it extracts functionally correct code segments from partially incorrect modules, yielding denser and more effective RL training signals. This addresses the problem of insufficient functional rewards and achieves state-of-the-art performance on the VerilogEval and RTLLM benchmarks.
+
+## Resources
+
+* **Paper**: [https://huggingface.co/papers/2510.19296](https://huggingface.co/papers/2510.19296)
+* **Project Page**: [https://zy1xxx.github.io/SALV](https://zy1xxx.github.io/SALV)
+* **Code**: [https://github.com/zy1xxx/SALV](https://github.com/zy1xxx/SALV)
+* **Dataset**: [https://huggingface.co/datasets/TabCanNotTab/SALV-dataset](https://huggingface.co/datasets/TabCanNotTab/SALV-dataset)
+
+## Usage
+
+You can use this model with the `transformers` library:
+
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
 import torch

@@ -63,4 +88,20 @@ if matches:
     print(code)
 else:
     print("No Verilog code found in the response!")
+```
+
+## Citation
+
+If you find QiMeng-SALV useful, please cite our paper:
+
+```bibtex
+@misc{zhang2025qimengsalvsignalawarelearningverilog,
+  title={QiMeng-SALV: Signal-Aware Learning for Verilog Code Generation},
+  author={Yang Zhang and Rui Zhang and Jiaming Guo and Lei Huang and Di Huang and Yunpu Zhao and Shuyao Cheng and Pengwei Jin and Chongxiao Li and Zidong Du and Xing Hu and Qi Guo and Yunji Chen},
+  year={2025},
+  eprint={2510.19296},
+  archivePrefix={arXiv},
+  primaryClass={cs.LG},
+  url={https://arxiv.org/abs/2510.19296},
+}
 ```
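The diff elides the middle of the usage snippet, showing only the imports and the final `matches`/`code` check, which implies the generated response is post-processed by extracting a fenced Verilog block. The following is a rough, self-contained sketch of that extraction step only; the regex pattern, function name, and sample response are assumptions for illustration, not the repository's actual code (model loading and generation would proceed as in the README snippet above).

```python
import re
from typing import Optional


def extract_verilog(response: str) -> Optional[str]:
    """Return the first ```verilog fenced block in a model response, or None.

    The fence-based pattern is an assumption about the model's output
    format, mirroring the `matches` / `code` logic in the README snippet.
    """
    matches = re.findall(r"```verilog\s*(.*?)```", response, re.DOTALL)
    return matches[0].strip() if matches else None


# Hypothetical model response, for illustration only.
response = (
    "Here is the module:\n"
    "```verilog\n"
    "module and_gate(input a, input b, output y);\n"
    "  assign y = a & b;\n"
    "endmodule\n"
    "```\n"
)

code = extract_verilog(response)
if code:
    print(code)
else:
    print("No Verilog code found in the response!")
```

Extracting only the fenced block keeps instruction-tuned chatter out of downstream tools such as simulators or synthesis checks, which expect bare Verilog source.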