Add `pipeline_tag` and `library_name` to model card

#1
by nielsr HF Staff - opened
Files changed (1)
  1. README.md +23 -4
README.md CHANGED
@@ -1,17 +1,19 @@
 ---
+datasets:
+- gsm8k
+- svamp
+- multi_arith
 language:
 - en
 license: mit
+pipeline_tag: text-generation
+library_name: transformers
 tags:
 - chain-of-thought
 - implicit-reasoning
 - multimodal
 - llama3
 - instruction-tuned
-datasets:
-- gsm8k
-- svamp
-- multi_arith
 model-index:
 - name: SIM_COT-LLaMA3-CODI-8B
   results:
@@ -171,5 +173,22 @@ Average accuracy over 1 sampling: xxx
 - average length of COT: average number of latent reasoning tokens.
 - average accuracy: aggregated accuracy across sampled runs.
 
+## ✒️ Citation
+
+If you find our work helpful for your research, please consider giving a star ⭐ and a citation 📝
+
+```bibtex
+@article{wei2025simcot,
+  title={{SIM-COT}: Supervised Implicit Chain-of-Thought},
+  author={Wei, Xilin and Liu, Xiaoran and Zang, Yuhang and Dong, Xiaoyi and Cao, Yuhang and Wang, Jiaqi and Qiu, Xipeng and Lin, Dahua},
+  journal={arXiv preprint arXiv:2509.20317},
+  year={2025}
+}
+```
 
+## ❤️ Acknowledgments
 
+- [Coconut](https://github.com/facebookresearch/coconut): The codebase we built upon. Thanks for their wonderful work.
+- [CODI](https://github.com/zhenyi4/codi): Our work is based on this codebase; we are grateful for their valuable contribution.
+- [LLaMA series](https://huggingface.co/meta-llama/collections): The amazing open-source large language models!
+- [GPT2](https://huggingface.co/openai-community/gpt2): An impressive open-source language model!
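For context on why this metadata matters: on the Hub, `pipeline_tag` determines which task the model is listed under (and which inference widget it gets), while `library_name` controls the auto-generated loading snippet. A minimal sanity check of the merged front matter, sketched with a naive line-based scan rather than a real YAML parser (the helper `top_level_fields` is purely illustrative):

```python
# Sanity-check the model-card front matter added in this PR.
# NOTE: this is a naive line-based scan for top-level `key: value` pairs,
# not a full YAML parser; it only needs to confirm the new fields exist.

FRONT_MATTER = """\
datasets:
- gsm8k
- svamp
- multi_arith
language:
- en
license: mit
pipeline_tag: text-generation
library_name: transformers
tags:
- chain-of-thought
- implicit-reasoning
- multimodal
- llama3
- instruction-tuned
"""

def top_level_fields(text: str) -> dict:
    """Collect top-level `key: value` pairs (value is empty for list keys)."""
    fields = {}
    for line in text.splitlines():
        # Skip list items ("- gsm8k") and indented continuation lines.
        if line and not line.startswith((" ", "-")) and ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
    return fields

fields = top_level_fields(FRONT_MATTER)
assert fields["pipeline_tag"] == "text-generation"
assert fields["library_name"] == "transformers"
```

With `library_name: transformers` set, the Hub can surface a `transformers`-style loading snippet for the repository instead of leaving the "Use this model" button blank.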