Improve model card: add pipeline tag, library name, and metadata

#1
opened by nielsr (HF Staff)
Files changed (1)
  1. README.md +9 -6
README.md CHANGED
@@ -1,10 +1,13 @@
 ---
-license: mit
-language:
-- en
 base_model:
 - Qwen/Qwen3-VL-4B-Instruct
+language:
+- en
+license: mit
+pipeline_tag: image-text-to-text
+library_name: transformers
 ---
+
 # HieroSA (Chinese)
 
 [Paper](https://arxiv.org/abs/2601.05508) | [GitHub](https://github.com/THUNLP-MT/HieroSA)
@@ -15,17 +18,17 @@ HieroSA supports both modern logographic scripts and ancient hieroglyphs 🌍, e
 
 ## More Details
 
-Please refer to our [GitHub Repository](https://github.com/THUNLP-MT/HieroSA) for more details about this model.
+Please refer to our [GitHub Repository](https://github.com/THUNLP-MT/HieroSA) for more details about this model, including environment setup and inference scripts.
 
 ## Citation
 
 If you find our work helpful for your research, please consider citing our work.
 
-```plain
+```bibtex
 @article{luo2026hierosa,
 title={Enabling Stroke-Level Structural Analysis of Hieroglyphic Scripts without Language-Specific Priors},
 author={Fuwen Luo and Zihao Wan and Ziyue Wang and Yaluo Liu and Pau Tong Lin Xu and Xuanjia Qiao and Xiaolong Wang and Peng Li and Yang Liu},
 journal={arXiv preprint arXiv:2601.05508},
 year={2026}
 }
-```
+```
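The added `pipeline_tag: image-text-to-text` controls how the model is surfaced and filtered on the Hub, and `library_name: transformers` signals that the checkpoint is expected to load with standard `transformers` tooling. A minimal sketch of what the new metadata enables, assuming the checkpoint works with the generic image-text-to-text pipeline; the repo id, image URL, and prompt below are placeholders, not taken from this PR (see the GitHub repo for the intended usage):

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual model repo this PR targets.
MODEL_ID = "THUNLP-MT/HieroSA-Chinese"

# "image-text-to-text" matches the pipeline_tag added in this PR.
pipe = pipeline("image-text-to-text", model=MODEL_ID)

# Chat-style input accepted by the image-text-to-text pipeline;
# the image URL and prompt are illustrative only.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/glyph.png"},
            {"type": "text", "text": "Describe the stroke-level structure of this character."},
        ],
    }
]

outputs = pipe(text=messages, max_new_tokens=128)
print(outputs[0]["generated_text"])
```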