toolevalxm committed · Commit 1f675b8 (verified) · Parent: 0d0664a

Add BibTeX citation section for base model

Files changed (1): README.md (+38 -2)
README.md CHANGED
@@ -1,5 +1,41 @@
-# Annoy Qwen Coder Spec Model This model is part of the Annoy project for code reasoning and execution specification. ## Model Description This model has been fine-tuned for speculative execution reasoning tasks on code. It can predict input/output pairs and verify execution trajectories. ## Training The model was trained using the Annoy methodology on the PythonEdu-Rs dataset. Training was conducted in two stages: - Stage 1: Initial speculative reasoning training - Stage 2: Refinement with verified predictions ## Usage ```python from transformers import AutoModelForCausalLM, AutoTokenizer model = AutoModelForCausalLM.from_pretrained("toolevalxm/qwen2.5-7b-coder_spec") tokenizer = AutoTokenizer.from_pretrained("toolevalxm/qwen2.5-7b-coder_spec") ``` ## Citation If you use this model, please cite our paper.
+# Annoy Qwen Coder Spec Model
+
+This model is part of the Annoy project for code reasoning and execution specification.
+
+## Model Description
+
+This model has been fine-tuned for speculative execution reasoning tasks on code. It can predict input/output pairs and verify execution trajectories.
+
+## Training
+
+The model was trained using the Annoy methodology on the PythonEdu-Rs dataset. Training was conducted in two stages:
+- Stage 1: Initial speculative reasoning training
+- Stage 2: Refinement with verified predictions
+
+## Usage
+
+```python
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+model = AutoModelForCausalLM.from_pretrained("toolevalxm/qwen2.5-7b-coder_spec")
+tokenizer = AutoTokenizer.from_pretrained("toolevalxm/qwen2.5-7b-coder_spec")
+```
+
+## Citation
+
+If you use this model, please cite our paper.
 
 **Base Model**
 
-This model is fine-tuned from Qwen/Qwen2.5-Coder-7B.
+This model is fine-tuned from Qwen/Qwen2.5-Coder-7B.
+
+## BibTeX Citation
+
+```bibtex
+@article{hui2024qwen2,
+  title={Qwen2.5-Coder Technical Report},
+  author={Hui, Binyuan and Yang, Jian and Cui, Zeyu and Yang, Jiaxi and Liu, Dayiheng and Zhang, Lei and Liu, Tianyu and Zhang, Jiajun and Yu, Bowen and Dang, Kai and others},
+  journal={arXiv preprint arXiv:2409.12186},
+  year={2024}
+}
+```
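The README's Usage snippet only loads the model and tokenizer. A minimal inference sketch is given below; note that the prompt layout in `build_prompt` is purely illustrative, since the README does not document the prompt template used during Annoy fine-tuning:

```python
def build_prompt(code: str) -> str:
    # Hypothetical prompt layout: the model card does not specify the
    # template used during fine-tuning, so treat this as a placeholder.
    return f"Predict the output of the following Python code:\n\n{code}\n\nOutput:"

def predict_output(model, tokenizer, code: str, max_new_tokens: int = 64) -> str:
    # `model` and `tokenizer` are the objects loaded in the Usage snippet.
    inputs = tokenizer(build_prompt(code), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    generated = out[0][inputs["input_ids"].shape[1]:]  # drop the prompt tokens
    return tokenizer.decode(generated, skip_special_tokens=True)
```

Greedy decoding (`do_sample=False`) is used here because output prediction is a deterministic task; sampling settings may of course differ from what the authors intended.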