Houxing committed
Commit dcefea0 · 0 Parent(s)

initial commit

.gitattributes ADDED
@@ -0,0 +1,35 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
LICENSE ADDED
@@ -0,0 +1,201 @@
1
+ Apache License
2
+ Version 2.0, January 2004
3
+ http://www.apache.org/licenses/
4
+
5
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
6
+
7
+ 1. Definitions.
8
+
9
+ "License" shall mean the terms and conditions for use, reproduction,
10
+ and distribution as defined by Sections 1 through 9 of this document.
11
+
12
+ "Licensor" shall mean the copyright owner or entity authorized by
13
+ the copyright owner that is granting the License.
14
+
15
+ "Legal Entity" shall mean the union of the acting entity and all
16
+ other entities that control, are controlled by, or are under common
17
+ control with that entity. For the purposes of this definition,
18
+ "control" means (i) the power, direct or indirect, to cause the
19
+ direction or management of such entity, whether by contract or
20
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
21
+ outstanding shares, or (iii) beneficial ownership of such entity.
22
+
23
+ "You" (or "Your") shall mean an individual or Legal Entity
24
+ exercising permissions granted by this License.
25
+
26
+ "Source" form shall mean the preferred form for making modifications,
27
+ including but not limited to software source code, documentation
28
+ source, and configuration files.
29
+
30
+ "Object" form shall mean any form resulting from mechanical
31
+ transformation or translation of a Source form, including but
32
+ not limited to compiled object code, generated documentation,
33
+ and conversions to other media types.
34
+
35
+ "Work" shall mean the work of authorship, whether in Source or
36
+ Object form, made available under the License, as indicated by a
37
+ copyright notice that is included in or attached to the work
38
+ (an example is provided in the Appendix below).
39
+
40
+ "Derivative Works" shall mean any work, whether in Source or Object
41
+ form, that is based on (or derived from) the Work and for which the
42
+ editorial revisions, annotations, elaborations, or other modifications
43
+ represent, as a whole, an original work of authorship. For the purposes
44
+ of this License, Derivative Works shall not include works that remain
45
+ separable from, or merely link (or bind by name) to the interfaces of,
46
+ the Work and Derivative Works thereof.
47
+
48
+ "Contribution" shall mean any work of authorship, including
49
+ the original version of the Work and any modifications or additions
50
+ to that Work or Derivative Works thereof, that is intentionally
51
+ submitted to Licensor for inclusion in the Work by the copyright owner
52
+ or by an individual or Legal Entity authorized to submit on behalf of
53
+ the copyright owner. For the purposes of this definition, "submitted"
54
+ means any form of electronic, verbal, or written communication sent
55
+ to the Licensor or its representatives, including but not limited to
56
+ communication on electronic mailing lists, source code control systems,
57
+ and issue tracking systems that are managed by, or on behalf of, the
58
+ Licensor for the purpose of discussing and improving the Work, but
59
+ excluding communication that is conspicuously marked or otherwise
60
+ designated in writing by the copyright owner as "Not a Contribution."
61
+
62
+ "Contributor" shall mean Licensor and any individual or Legal Entity
63
+ on behalf of whom a Contribution has been received by Licensor and
64
+ subsequently incorporated within the Work.
65
+
66
+ 2. Grant of Copyright License. Subject to the terms and conditions of
67
+ this License, each Contributor hereby grants to You a perpetual,
68
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
69
+ copyright license to reproduce, prepare Derivative Works of,
70
+ publicly display, publicly perform, sublicense, and distribute the
71
+ Work and such Derivative Works in Source or Object form.
72
+
73
+ 3. Grant of Patent License. Subject to the terms and conditions of
74
+ this License, each Contributor hereby grants to You a perpetual,
75
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
76
+ (except as stated in this section) patent license to make, have made,
77
+ use, offer to sell, sell, import, and otherwise transfer the Work,
78
+ where such license applies only to those patent claims licensable
79
+ by such Contributor that are necessarily infringed by their
80
+ Contribution(s) alone or by combination of their Contribution(s)
81
+ with the Work to which such Contribution(s) was submitted. If You
82
+ institute patent litigation against any entity (including a
83
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
84
+ or a Contribution incorporated within the Work constitutes direct
85
+ or contributory patent infringement, then any patent licenses
86
+ granted to You under this License for that Work shall terminate
87
+ as of the date such litigation is filed.
88
+
89
+ 4. Redistribution. You may reproduce and distribute copies of the
90
+ Work or Derivative Works thereof in any medium, with or without
91
+ modifications, and in Source or Object form, provided that You
92
+ meet the following conditions:
93
+
94
+ (a) You must give any other recipients of the Work or
95
+ Derivative Works a copy of this License; and
96
+
97
+ (b) You must cause any modified files to carry prominent notices
98
+ stating that You changed the files; and
99
+
100
+ (c) You must retain, in the Source form of any Derivative Works
101
+ that You distribute, all copyright, patent, trademark, and
102
+ attribution notices from the Source form of the Work,
103
+ excluding those notices that do not pertain to any part of
104
+ the Derivative Works; and
105
+
106
+ (d) If the Work includes a "NOTICE" text file as part of its
107
+ distribution, then any Derivative Works that You distribute must
108
+ include a readable copy of the attribution notices contained
109
+ within such NOTICE file, excluding those notices that do not
110
+ pertain to any part of the Derivative Works, in at least one
111
+ of the following places: within a NOTICE text file distributed
112
+ as part of the Derivative Works; within the Source form or
113
+ documentation, if provided along with the Derivative Works; or,
114
+ within a display generated by the Derivative Works, if and
115
+ wherever such third-party notices normally appear. The contents
116
+ of the NOTICE file are for informational purposes only and
117
+ do not modify the License. You may add Your own attribution
118
+ notices within Derivative Works that You distribute, alongside
119
+ or as an addendum to the NOTICE text from the Work, provided
120
+ that such additional attribution notices cannot be construed
121
+ as modifying the License.
122
+
123
+ You may add Your own copyright statement to Your modifications and
124
+ may provide additional or different license terms and conditions
125
+ for use, reproduction, or distribution of Your modifications, or
126
+ for any such Derivative Works as a whole, provided Your use,
127
+ reproduction, and distribution of the Work otherwise complies with
128
+ the conditions stated in this License.
129
+
130
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
131
+ any Contribution intentionally submitted for inclusion in the Work
132
+ by You to the Licensor shall be under the terms and conditions of
133
+ this License, without any additional terms or conditions.
134
+ Notwithstanding the above, nothing herein shall supersede or modify
135
+ the terms of any separate license agreement you may have executed
136
+ with Licensor regarding such Contributions.
137
+
138
+ 6. Trademarks. This License does not grant permission to use the trade
139
+ names, trademarks, service marks, or product names of the Licensor,
140
+ except as required for reasonable and customary use in describing the
141
+ origin of the Work and reproducing the content of the NOTICE file.
142
+
143
+ 7. Disclaimer of Warranty. Unless required by applicable law or
144
+ agreed to in writing, Licensor provides the Work (and each
145
+ Contributor provides its Contributions) on an "AS IS" BASIS,
146
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
147
+ implied, including, without limitation, any warranties or conditions
148
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
149
+ PARTICULAR PURPOSE. You are solely responsible for determining the
150
+ appropriateness of using or redistributing the Work and assume any
151
+ risks associated with Your exercise of permissions under this License.
152
+
153
+ 8. Limitation of Liability. In no event and under no legal theory,
154
+ whether in tort (including negligence), contract, or otherwise,
155
+ unless required by applicable law (such as deliberate and grossly
156
+ negligent acts) or agreed to in writing, shall any Contributor be
157
+ liable to You for damages, including any direct, indirect, special,
158
+ incidental, or consequential damages of any character arising as a
159
+ result of this License or out of the use or inability to use the
160
+ Work (including but not limited to damages for loss of goodwill,
161
+ work stoppage, computer failure or malfunction, or any and all
162
+ other commercial damages or losses), even if such Contributor
163
+ has been advised of the possibility of such damages.
164
+
165
+ 9. Accepting Warranty or Additional Liability. While redistributing
166
+ the Work or Derivative Works thereof, You may choose to offer,
167
+ and charge a fee for, acceptance of support, warranty, indemnity,
168
+ or other liability obligations and/or rights consistent with this
169
+ License. However, in accepting such obligations, You may act only
170
+ on Your own behalf and on Your sole responsibility, not on behalf
171
+ of any other Contributor, and only if You agree to indemnify,
172
+ defend, and hold each Contributor harmless for any liability
173
+ incurred by, or claims asserted against, such Contributor by reason
174
+ of your accepting any such warranty or additional liability.
175
+
176
+ END OF TERMS AND CONDITIONS
177
+
178
+ APPENDIX: How to apply the Apache License to your work.
179
+
180
+ To apply the Apache License to your work, attach the following
181
+ boilerplate notice, with the fields enclosed by brackets "[]"
182
+ replaced with your own identifying information. (Don't include
183
+ the brackets!) The text should be enclosed in the appropriate
184
+ comment syntax for the file format. We also recommend that a
185
+ file or class name and description of purpose be included on the
186
+ same "printed page" as the copyright notice for easier
187
+ identification within third-party archives.
188
+
189
+ Copyright [yyyy] [name of copyright owner]
190
+
191
+ Licensed under the Apache License, Version 2.0 (the "License");
192
+ you may not use this file except in compliance with the License.
193
+ You may obtain a copy of the License at
194
+
195
+ http://www.apache.org/licenses/LICENSE-2.0
196
+
197
+ Unless required by applicable law or agreed to in writing, software
198
+ distributed under the License is distributed on an "AS IS" BASIS,
199
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
200
+ See the License for the specific language governing permissions and
201
+ limitations under the License.
README.md ADDED
@@ -0,0 +1,60 @@
+ ---
+ license: apache-2.0
+ language:
+ - en
+ ---
+ ## Empowering Character-level Text Infilling by Eliminating Sub-Tokens
+
+ <p align="center">
+ <a href="https://arxiv.org/abs/2405.17103">📄 Paper</a> •
+ <a href="https://github.com/SenseLLM/FIM-SE">🏠 Repo</a> •
+ <a href="https://huggingface.co/SenseLLM/FIM-SE-CL-13B">🤖 Models</a>
+ </p>
+
+ ## Introduction
+ FIM-SE stands for Fill-In-the-Middle with both Starting and Ending character constraints. The method recasts character-level infilling as a line-level task, so the model never has to predict a partial sub-token at inference time.
+
+ ![](method.png)
+
+ <hr>
+
+ ## Models
+
+ | Model | Checkpoint | Size | License |
+ |:------|:-----------|:-----|:--------|
+ | FIM-SE-CL-7B | 🤗 [HF Link](https://huggingface.co/SenseLLM/FIM-SE-CL-7B) | 7B | [Llama2](https://ai.meta.com/llama/license/) |
+ | FIM-SE-CL-13B | 🤗 [HF Link](https://huggingface.co/SenseLLM/FIM-SE-CL-13B) | 13B | [Llama2](https://ai.meta.com/llama/license/) |
+ | FIM-SE-SC-1B | 🤗 [HF Link](https://huggingface.co/SenseLLM/FIM-SE-SC-1B) | 1B | [StarCoder](https://github.com/bigcode-project/starcoder/blob/main/LICENSE) |
+ | FIM-SE-SC-15B | 🤗 [HF Link](https://huggingface.co/SenseLLM/FIM-SE-SC-15B) | 15B | [StarCoder](https://github.com/bigcode-project/starcoder/blob/main/LICENSE) |
+
+ ## How to Use
+
+ #### Prompt Format
+
+ As shown in the figure, the prompt is organized as:
+ ```text
+ <PRE>R-Prefix<SUF>R-Suffix<START>L-Prefix<END>F-Suffix<MID>
+ ```
+
+ #### Inference Code
+ Please refer to our [GitHub Repo](https://github.com/SenseLLM/FIM-SE) for more technical details; a minimal usage sketch is shown below.
+
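The following is a hedged, illustrative sketch of building the prompt above and generating with Hugging Face `transformers`. It assumes this repository is the FIM-SE-SC-15B checkpoint, that the generic `<PRE>`/`<SUF>`/`<MID>` markers correspond to StarCoder's `<fim_prefix>`/`<fim_suffix>`/`<fim_middle>` special tokens (with `<START>`/`<END>` being the two tokens from `added_tokens.json`), and that the `build_prompt` line-splitting helper is only an approximation of the figure; the prompt builder in the GitHub repo is authoritative.

```python
# Hedged sketch: character-level infilling with this checkpoint via transformers.
# Assumptions (not confirmed by this repo card): the repo id below, and that
# <PRE>/<SUF>/<MID> map onto <fim_prefix>/<fim_suffix>/<fim_middle>.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SenseLLM/FIM-SE-SC-15B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

def build_prompt(prefix: str, suffix: str) -> str:
    # R-Prefix: complete leading lines of the prefix; L-Prefix: its trailing partial line.
    # F-Suffix: the (partial) first line of the suffix; R-Suffix: the remaining lines.
    head, sep, l_prefix = prefix.rpartition("\n")
    r_prefix = head + sep
    f_suffix, _, r_suffix = suffix.partition("\n")
    return (
        "<fim_prefix>" + r_prefix
        + "<fim_suffix>" + r_suffix
        + "<START>" + l_prefix
        + "<END>" + f_suffix
        + "<fim_middle>"
    )

prompt = build_prompt(prefix="def add(a, b):\n    ret", suffix="rn a + b\n")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
middle = tokenizer.decode(outputs[0, inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(middle)  # raw generation; may need light post-processing (see note below)
```

Depending on the exact training format, the generated text may need light post-processing (for example, stripping an echoed L-Prefix or F-Suffix) before splicing it between the prefix and suffix; the official repo documents this step.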
+ ## Citation
+
+ If you find this repo useful for your research, please cite our paper:
+ ```
+ @misc{ren2024empowering,
+   title={Empowering Character-level Text Infilling by Eliminating Sub-Tokens},
+   author={Houxing Ren and Mingjie Zhan and Zhongyuan Wu and Hongsheng Li},
+   year={2024},
+   eprint={2405.17103},
+   archivePrefix={arXiv},
+   primaryClass={cs.CL}
+ }
+ ```
+
+ ## Acknowledgments
+
+ We thank the following amazing projects that truly inspired us:
+
+ - [FIM](https://arxiv.org/abs/2207.14255)
added_tokens.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "<END>": 49153,
+ "<START>": 49152
+ }
config.json ADDED
@@ -0,0 +1,39 @@
+ {
+ "_name_or_path": "/mnt/cache/code/models/starcoder-base",
+ "activation_function": "gelu",
+ "architectures": [
+ "GPTBigCodeForCausalLM"
+ ],
+ "attention_softmax_in_fp32": false,
+ "attn_pdrop": 0.1,
+ "bos_token_id": 0,
+ "embd_pdrop": 0.1,
+ "eos_token_id": 0,
+ "inference_runner": 0,
+ "initializer_range": 0.02,
+ "layer_norm_epsilon": 1e-05,
+ "max_batch_size": null,
+ "max_sequence_length": null,
+ "model_type": "gpt_bigcode",
+ "multi_query": true,
+ "n_embd": 6144,
+ "n_head": 48,
+ "n_inner": 24576,
+ "n_layer": 40,
+ "n_positions": 8192,
+ "pad_key_length": true,
+ "pre_allocate_kv_cache": false,
+ "resid_pdrop": 0.1,
+ "scale_attention_softmax_in_fp32": false,
+ "scale_attn_weights": true,
+ "summary_activation": null,
+ "summary_first_dropout": 0.1,
+ "summary_proj_to_labels": true,
+ "summary_type": "cls_index",
+ "summary_use_proj": true,
+ "torch_dtype": "bfloat16",
+ "transformers_version": "4.36.1",
+ "use_cache": false,
+ "validate_runner_input": true,
+ "vocab_size": 49280
+ }
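For quick orientation, the configuration above describes a 40-layer GPTBigCode (StarCoder-class) model with multi-query attention, hidden size 6144, and an 8192-token context. A small, illustrative snippet for inspecting these fields without downloading the weights (the repository id is assumed to be `SenseLLM/FIM-SE-SC-15B`):

```python
# Illustrative: load only the config of this checkpoint and print a few fields.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("SenseLLM/FIM-SE-SC-15B")  # assumed repo id
print(config.model_type, config.n_layer, config.n_embd, config.n_positions)
# -> gpt_bigcode 40 6144 8192
print(config.vocab_size)  # 49280 embedding rows, padded beyond the 49152 + 2 added tokens
```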
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
method.png ADDED
model-00001-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e10fc90636d54ca4541ec10be65ed0aa68f80b6da827ea881434063b20913435
+ size 4953758120
model-00002-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2dff58114323abcaccb62023729942c9b40d631ec4b9ce7837384b336acb64c8
+ size 4930227792
model-00003-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3c6de19f0115a9c1c2a92e0cd847317525324e2587f6d53850e2ea105b57e81a
+ size 4927118464
model-00004-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e8fcac9bb32ec76779c11fc31c897e14f3257f28c18f3d3d4e9f28035a96c2ce
+ size 4930227864
model-00005-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2fd7f03be36f869bada2523c298e21e2976f5eac5e80664f19ded3cdc829b245
+ size 4927118464
model-00006-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:65cfd7804a6ebe8a82af22841dade6b539c03c144957631fe8fccb6a6a6b1cce
+ size 4930227864
model-00007-of-00007.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7bb1d462f5e35da57e1896f9bbaa64fd0aaec305c6fbfd8c6ae97b2b4e4478db
+ size 2043411344
model.safetensors.index.json ADDED
@@ -0,0 +1,492 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 31642038272
4
+ },
5
+ "weight_map": {
6
+ "lm_head.weight": "model-00007-of-00007.safetensors",
7
+ "transformer.h.0.attn.c_attn.bias": "model-00001-of-00007.safetensors",
8
+ "transformer.h.0.attn.c_attn.weight": "model-00001-of-00007.safetensors",
9
+ "transformer.h.0.attn.c_proj.bias": "model-00001-of-00007.safetensors",
10
+ "transformer.h.0.attn.c_proj.weight": "model-00001-of-00007.safetensors",
11
+ "transformer.h.0.ln_1.bias": "model-00001-of-00007.safetensors",
12
+ "transformer.h.0.ln_1.weight": "model-00001-of-00007.safetensors",
13
+ "transformer.h.0.ln_2.bias": "model-00001-of-00007.safetensors",
14
+ "transformer.h.0.ln_2.weight": "model-00001-of-00007.safetensors",
15
+ "transformer.h.0.mlp.c_fc.bias": "model-00001-of-00007.safetensors",
16
+ "transformer.h.0.mlp.c_fc.weight": "model-00001-of-00007.safetensors",
17
+ "transformer.h.0.mlp.c_proj.bias": "model-00001-of-00007.safetensors",
18
+ "transformer.h.0.mlp.c_proj.weight": "model-00001-of-00007.safetensors",
19
+ "transformer.h.1.attn.c_attn.bias": "model-00001-of-00007.safetensors",
20
+ "transformer.h.1.attn.c_attn.weight": "model-00001-of-00007.safetensors",
21
+ "transformer.h.1.attn.c_proj.bias": "model-00001-of-00007.safetensors",
22
+ "transformer.h.1.attn.c_proj.weight": "model-00001-of-00007.safetensors",
23
+ "transformer.h.1.ln_1.bias": "model-00001-of-00007.safetensors",
24
+ "transformer.h.1.ln_1.weight": "model-00001-of-00007.safetensors",
25
+ "transformer.h.1.ln_2.bias": "model-00001-of-00007.safetensors",
26
+ "transformer.h.1.ln_2.weight": "model-00001-of-00007.safetensors",
27
+ "transformer.h.1.mlp.c_fc.bias": "model-00001-of-00007.safetensors",
28
+ "transformer.h.1.mlp.c_fc.weight": "model-00001-of-00007.safetensors",
29
+ "transformer.h.1.mlp.c_proj.bias": "model-00001-of-00007.safetensors",
30
+ "transformer.h.1.mlp.c_proj.weight": "model-00001-of-00007.safetensors",
31
+ "transformer.h.10.attn.c_attn.bias": "model-00002-of-00007.safetensors",
32
+ "transformer.h.10.attn.c_attn.weight": "model-00002-of-00007.safetensors",
33
+ "transformer.h.10.attn.c_proj.bias": "model-00002-of-00007.safetensors",
34
+ "transformer.h.10.attn.c_proj.weight": "model-00002-of-00007.safetensors",
35
+ "transformer.h.10.ln_1.bias": "model-00002-of-00007.safetensors",
36
+ "transformer.h.10.ln_1.weight": "model-00002-of-00007.safetensors",
37
+ "transformer.h.10.ln_2.bias": "model-00002-of-00007.safetensors",
38
+ "transformer.h.10.ln_2.weight": "model-00002-of-00007.safetensors",
39
+ "transformer.h.10.mlp.c_fc.bias": "model-00002-of-00007.safetensors",
40
+ "transformer.h.10.mlp.c_fc.weight": "model-00002-of-00007.safetensors",
41
+ "transformer.h.10.mlp.c_proj.bias": "model-00002-of-00007.safetensors",
42
+ "transformer.h.10.mlp.c_proj.weight": "model-00002-of-00007.safetensors",
43
+ "transformer.h.11.attn.c_attn.bias": "model-00002-of-00007.safetensors",
44
+ "transformer.h.11.attn.c_attn.weight": "model-00002-of-00007.safetensors",
45
+ "transformer.h.11.attn.c_proj.bias": "model-00002-of-00007.safetensors",
46
+ "transformer.h.11.attn.c_proj.weight": "model-00002-of-00007.safetensors",
47
+ "transformer.h.11.ln_1.bias": "model-00002-of-00007.safetensors",
48
+ "transformer.h.11.ln_1.weight": "model-00002-of-00007.safetensors",
49
+ "transformer.h.11.ln_2.bias": "model-00002-of-00007.safetensors",
50
+ "transformer.h.11.ln_2.weight": "model-00002-of-00007.safetensors",
51
+ "transformer.h.11.mlp.c_fc.bias": "model-00002-of-00007.safetensors",
52
+ "transformer.h.11.mlp.c_fc.weight": "model-00002-of-00007.safetensors",
53
+ "transformer.h.11.mlp.c_proj.bias": "model-00002-of-00007.safetensors",
54
+ "transformer.h.11.mlp.c_proj.weight": "model-00002-of-00007.safetensors",
55
+ "transformer.h.12.attn.c_attn.bias": "model-00002-of-00007.safetensors",
56
+ "transformer.h.12.attn.c_attn.weight": "model-00002-of-00007.safetensors",
57
+ "transformer.h.12.attn.c_proj.bias": "model-00003-of-00007.safetensors",
58
+ "transformer.h.12.attn.c_proj.weight": "model-00003-of-00007.safetensors",
59
+ "transformer.h.12.ln_1.bias": "model-00002-of-00007.safetensors",
60
+ "transformer.h.12.ln_1.weight": "model-00002-of-00007.safetensors",
61
+ "transformer.h.12.ln_2.bias": "model-00003-of-00007.safetensors",
62
+ "transformer.h.12.ln_2.weight": "model-00003-of-00007.safetensors",
63
+ "transformer.h.12.mlp.c_fc.bias": "model-00003-of-00007.safetensors",
64
+ "transformer.h.12.mlp.c_fc.weight": "model-00003-of-00007.safetensors",
65
+ "transformer.h.12.mlp.c_proj.bias": "model-00003-of-00007.safetensors",
66
+ "transformer.h.12.mlp.c_proj.weight": "model-00003-of-00007.safetensors",
67
+ "transformer.h.13.attn.c_attn.bias": "model-00003-of-00007.safetensors",
68
+ "transformer.h.13.attn.c_attn.weight": "model-00003-of-00007.safetensors",
69
+ "transformer.h.13.attn.c_proj.bias": "model-00003-of-00007.safetensors",
70
+ "transformer.h.13.attn.c_proj.weight": "model-00003-of-00007.safetensors",
71
+ "transformer.h.13.ln_1.bias": "model-00003-of-00007.safetensors",
72
+ "transformer.h.13.ln_1.weight": "model-00003-of-00007.safetensors",
73
+ "transformer.h.13.ln_2.bias": "model-00003-of-00007.safetensors",
74
+ "transformer.h.13.ln_2.weight": "model-00003-of-00007.safetensors",
75
+ "transformer.h.13.mlp.c_fc.bias": "model-00003-of-00007.safetensors",
76
+ "transformer.h.13.mlp.c_fc.weight": "model-00003-of-00007.safetensors",
77
+ "transformer.h.13.mlp.c_proj.bias": "model-00003-of-00007.safetensors",
78
+ "transformer.h.13.mlp.c_proj.weight": "model-00003-of-00007.safetensors",
79
+ "transformer.h.14.attn.c_attn.bias": "model-00003-of-00007.safetensors",
80
+ "transformer.h.14.attn.c_attn.weight": "model-00003-of-00007.safetensors",
81
+ "transformer.h.14.attn.c_proj.bias": "model-00003-of-00007.safetensors",
82
+ "transformer.h.14.attn.c_proj.weight": "model-00003-of-00007.safetensors",
83
+ "transformer.h.14.ln_1.bias": "model-00003-of-00007.safetensors",
84
+ "transformer.h.14.ln_1.weight": "model-00003-of-00007.safetensors",
85
+ "transformer.h.14.ln_2.bias": "model-00003-of-00007.safetensors",
86
+ "transformer.h.14.ln_2.weight": "model-00003-of-00007.safetensors",
87
+ "transformer.h.14.mlp.c_fc.bias": "model-00003-of-00007.safetensors",
88
+ "transformer.h.14.mlp.c_fc.weight": "model-00003-of-00007.safetensors",
89
+ "transformer.h.14.mlp.c_proj.bias": "model-00003-of-00007.safetensors",
90
+ "transformer.h.14.mlp.c_proj.weight": "model-00003-of-00007.safetensors",
91
+ "transformer.h.15.attn.c_attn.bias": "model-00003-of-00007.safetensors",
92
+ "transformer.h.15.attn.c_attn.weight": "model-00003-of-00007.safetensors",
93
+ "transformer.h.15.attn.c_proj.bias": "model-00003-of-00007.safetensors",
94
+ "transformer.h.15.attn.c_proj.weight": "model-00003-of-00007.safetensors",
95
+ "transformer.h.15.ln_1.bias": "model-00003-of-00007.safetensors",
96
+ "transformer.h.15.ln_1.weight": "model-00003-of-00007.safetensors",
97
+ "transformer.h.15.ln_2.bias": "model-00003-of-00007.safetensors",
98
+ "transformer.h.15.ln_2.weight": "model-00003-of-00007.safetensors",
99
+ "transformer.h.15.mlp.c_fc.bias": "model-00003-of-00007.safetensors",
100
+ "transformer.h.15.mlp.c_fc.weight": "model-00003-of-00007.safetensors",
101
+ "transformer.h.15.mlp.c_proj.bias": "model-00003-of-00007.safetensors",
102
+ "transformer.h.15.mlp.c_proj.weight": "model-00003-of-00007.safetensors",
103
+ "transformer.h.16.attn.c_attn.bias": "model-00003-of-00007.safetensors",
104
+ "transformer.h.16.attn.c_attn.weight": "model-00003-of-00007.safetensors",
105
+ "transformer.h.16.attn.c_proj.bias": "model-00003-of-00007.safetensors",
106
+ "transformer.h.16.attn.c_proj.weight": "model-00003-of-00007.safetensors",
107
+ "transformer.h.16.ln_1.bias": "model-00003-of-00007.safetensors",
108
+ "transformer.h.16.ln_1.weight": "model-00003-of-00007.safetensors",
109
+ "transformer.h.16.ln_2.bias": "model-00003-of-00007.safetensors",
110
+ "transformer.h.16.ln_2.weight": "model-00003-of-00007.safetensors",
111
+ "transformer.h.16.mlp.c_fc.bias": "model-00003-of-00007.safetensors",
112
+ "transformer.h.16.mlp.c_fc.weight": "model-00003-of-00007.safetensors",
113
+ "transformer.h.16.mlp.c_proj.bias": "model-00003-of-00007.safetensors",
114
+ "transformer.h.16.mlp.c_proj.weight": "model-00003-of-00007.safetensors",
115
+ "transformer.h.17.attn.c_attn.bias": "model-00003-of-00007.safetensors",
116
+ "transformer.h.17.attn.c_attn.weight": "model-00003-of-00007.safetensors",
117
+ "transformer.h.17.attn.c_proj.bias": "model-00003-of-00007.safetensors",
118
+ "transformer.h.17.attn.c_proj.weight": "model-00003-of-00007.safetensors",
119
+ "transformer.h.17.ln_1.bias": "model-00003-of-00007.safetensors",
120
+ "transformer.h.17.ln_1.weight": "model-00003-of-00007.safetensors",
121
+ "transformer.h.17.ln_2.bias": "model-00003-of-00007.safetensors",
122
+ "transformer.h.17.ln_2.weight": "model-00003-of-00007.safetensors",
123
+ "transformer.h.17.mlp.c_fc.bias": "model-00003-of-00007.safetensors",
124
+ "transformer.h.17.mlp.c_fc.weight": "model-00003-of-00007.safetensors",
125
+ "transformer.h.17.mlp.c_proj.bias": "model-00003-of-00007.safetensors",
126
+ "transformer.h.17.mlp.c_proj.weight": "model-00003-of-00007.safetensors",
127
+ "transformer.h.18.attn.c_attn.bias": "model-00003-of-00007.safetensors",
128
+ "transformer.h.18.attn.c_attn.weight": "model-00003-of-00007.safetensors",
129
+ "transformer.h.18.attn.c_proj.bias": "model-00003-of-00007.safetensors",
130
+ "transformer.h.18.attn.c_proj.weight": "model-00003-of-00007.safetensors",
131
+ "transformer.h.18.ln_1.bias": "model-00003-of-00007.safetensors",
132
+ "transformer.h.18.ln_1.weight": "model-00003-of-00007.safetensors",
133
+ "transformer.h.18.ln_2.bias": "model-00003-of-00007.safetensors",
134
+ "transformer.h.18.ln_2.weight": "model-00003-of-00007.safetensors",
135
+ "transformer.h.18.mlp.c_fc.bias": "model-00003-of-00007.safetensors",
136
+ "transformer.h.18.mlp.c_fc.weight": "model-00003-of-00007.safetensors",
137
+ "transformer.h.18.mlp.c_proj.bias": "model-00004-of-00007.safetensors",
138
+ "transformer.h.18.mlp.c_proj.weight": "model-00004-of-00007.safetensors",
139
+ "transformer.h.19.attn.c_attn.bias": "model-00004-of-00007.safetensors",
140
+ "transformer.h.19.attn.c_attn.weight": "model-00004-of-00007.safetensors",
141
+ "transformer.h.19.attn.c_proj.bias": "model-00004-of-00007.safetensors",
142
+ "transformer.h.19.attn.c_proj.weight": "model-00004-of-00007.safetensors",
143
+ "transformer.h.19.ln_1.bias": "model-00004-of-00007.safetensors",
144
+ "transformer.h.19.ln_1.weight": "model-00004-of-00007.safetensors",
145
+ "transformer.h.19.ln_2.bias": "model-00004-of-00007.safetensors",
146
+ "transformer.h.19.ln_2.weight": "model-00004-of-00007.safetensors",
147
+ "transformer.h.19.mlp.c_fc.bias": "model-00004-of-00007.safetensors",
148
+ "transformer.h.19.mlp.c_fc.weight": "model-00004-of-00007.safetensors",
149
+ "transformer.h.19.mlp.c_proj.bias": "model-00004-of-00007.safetensors",
150
+ "transformer.h.19.mlp.c_proj.weight": "model-00004-of-00007.safetensors",
151
+ "transformer.h.2.attn.c_attn.bias": "model-00001-of-00007.safetensors",
152
+ "transformer.h.2.attn.c_attn.weight": "model-00001-of-00007.safetensors",
153
+ "transformer.h.2.attn.c_proj.bias": "model-00001-of-00007.safetensors",
154
+ "transformer.h.2.attn.c_proj.weight": "model-00001-of-00007.safetensors",
155
+ "transformer.h.2.ln_1.bias": "model-00001-of-00007.safetensors",
156
+ "transformer.h.2.ln_1.weight": "model-00001-of-00007.safetensors",
157
+ "transformer.h.2.ln_2.bias": "model-00001-of-00007.safetensors",
158
+ "transformer.h.2.ln_2.weight": "model-00001-of-00007.safetensors",
159
+ "transformer.h.2.mlp.c_fc.bias": "model-00001-of-00007.safetensors",
160
+ "transformer.h.2.mlp.c_fc.weight": "model-00001-of-00007.safetensors",
161
+ "transformer.h.2.mlp.c_proj.bias": "model-00001-of-00007.safetensors",
162
+ "transformer.h.2.mlp.c_proj.weight": "model-00001-of-00007.safetensors",
163
+ "transformer.h.20.attn.c_attn.bias": "model-00004-of-00007.safetensors",
164
+ "transformer.h.20.attn.c_attn.weight": "model-00004-of-00007.safetensors",
165
+ "transformer.h.20.attn.c_proj.bias": "model-00004-of-00007.safetensors",
166
+ "transformer.h.20.attn.c_proj.weight": "model-00004-of-00007.safetensors",
167
+ "transformer.h.20.ln_1.bias": "model-00004-of-00007.safetensors",
168
+ "transformer.h.20.ln_1.weight": "model-00004-of-00007.safetensors",
169
+ "transformer.h.20.ln_2.bias": "model-00004-of-00007.safetensors",
170
+ "transformer.h.20.ln_2.weight": "model-00004-of-00007.safetensors",
171
+ "transformer.h.20.mlp.c_fc.bias": "model-00004-of-00007.safetensors",
172
+ "transformer.h.20.mlp.c_fc.weight": "model-00004-of-00007.safetensors",
173
+ "transformer.h.20.mlp.c_proj.bias": "model-00004-of-00007.safetensors",
174
+ "transformer.h.20.mlp.c_proj.weight": "model-00004-of-00007.safetensors",
175
+ "transformer.h.21.attn.c_attn.bias": "model-00004-of-00007.safetensors",
176
+ "transformer.h.21.attn.c_attn.weight": "model-00004-of-00007.safetensors",
177
+ "transformer.h.21.attn.c_proj.bias": "model-00004-of-00007.safetensors",
178
+ "transformer.h.21.attn.c_proj.weight": "model-00004-of-00007.safetensors",
179
+ "transformer.h.21.ln_1.bias": "model-00004-of-00007.safetensors",
180
+ "transformer.h.21.ln_1.weight": "model-00004-of-00007.safetensors",
181
+ "transformer.h.21.ln_2.bias": "model-00004-of-00007.safetensors",
182
+ "transformer.h.21.ln_2.weight": "model-00004-of-00007.safetensors",
183
+ "transformer.h.21.mlp.c_fc.bias": "model-00004-of-00007.safetensors",
184
+ "transformer.h.21.mlp.c_fc.weight": "model-00004-of-00007.safetensors",
185
+ "transformer.h.21.mlp.c_proj.bias": "model-00004-of-00007.safetensors",
186
+ "transformer.h.21.mlp.c_proj.weight": "model-00004-of-00007.safetensors",
187
+ "transformer.h.22.attn.c_attn.bias": "model-00004-of-00007.safetensors",
188
+ "transformer.h.22.attn.c_attn.weight": "model-00004-of-00007.safetensors",
189
+ "transformer.h.22.attn.c_proj.bias": "model-00004-of-00007.safetensors",
190
+ "transformer.h.22.attn.c_proj.weight": "model-00004-of-00007.safetensors",
191
+ "transformer.h.22.ln_1.bias": "model-00004-of-00007.safetensors",
192
+ "transformer.h.22.ln_1.weight": "model-00004-of-00007.safetensors",
193
+ "transformer.h.22.ln_2.bias": "model-00004-of-00007.safetensors",
194
+ "transformer.h.22.ln_2.weight": "model-00004-of-00007.safetensors",
195
+ "transformer.h.22.mlp.c_fc.bias": "model-00004-of-00007.safetensors",
196
+ "transformer.h.22.mlp.c_fc.weight": "model-00004-of-00007.safetensors",
197
+ "transformer.h.22.mlp.c_proj.bias": "model-00004-of-00007.safetensors",
198
+ "transformer.h.22.mlp.c_proj.weight": "model-00004-of-00007.safetensors",
199
+ "transformer.h.23.attn.c_attn.bias": "model-00004-of-00007.safetensors",
200
+ "transformer.h.23.attn.c_attn.weight": "model-00004-of-00007.safetensors",
201
+ "transformer.h.23.attn.c_proj.bias": "model-00004-of-00007.safetensors",
202
+ "transformer.h.23.attn.c_proj.weight": "model-00004-of-00007.safetensors",
203
+ "transformer.h.23.ln_1.bias": "model-00004-of-00007.safetensors",
204
+ "transformer.h.23.ln_1.weight": "model-00004-of-00007.safetensors",
205
+ "transformer.h.23.ln_2.bias": "model-00004-of-00007.safetensors",
206
+ "transformer.h.23.ln_2.weight": "model-00004-of-00007.safetensors",
207
+ "transformer.h.23.mlp.c_fc.bias": "model-00004-of-00007.safetensors",
208
+ "transformer.h.23.mlp.c_fc.weight": "model-00004-of-00007.safetensors",
209
+ "transformer.h.23.mlp.c_proj.bias": "model-00004-of-00007.safetensors",
210
+ "transformer.h.23.mlp.c_proj.weight": "model-00004-of-00007.safetensors",
211
+ "transformer.h.24.attn.c_attn.bias": "model-00004-of-00007.safetensors",
212
+ "transformer.h.24.attn.c_attn.weight": "model-00004-of-00007.safetensors",
213
+ "transformer.h.24.attn.c_proj.bias": "model-00004-of-00007.safetensors",
214
+ "transformer.h.24.attn.c_proj.weight": "model-00004-of-00007.safetensors",
215
+ "transformer.h.24.ln_1.bias": "model-00004-of-00007.safetensors",
216
+ "transformer.h.24.ln_1.weight": "model-00004-of-00007.safetensors",
217
+ "transformer.h.24.ln_2.bias": "model-00004-of-00007.safetensors",
218
+ "transformer.h.24.ln_2.weight": "model-00004-of-00007.safetensors",
219
+ "transformer.h.24.mlp.c_fc.bias": "model-00004-of-00007.safetensors",
220
+ "transformer.h.24.mlp.c_fc.weight": "model-00004-of-00007.safetensors",
221
+ "transformer.h.24.mlp.c_proj.bias": "model-00004-of-00007.safetensors",
222
+ "transformer.h.24.mlp.c_proj.weight": "model-00004-of-00007.safetensors",
223
+ "transformer.h.25.attn.c_attn.bias": "model-00004-of-00007.safetensors",
224
+ "transformer.h.25.attn.c_attn.weight": "model-00004-of-00007.safetensors",
225
+ "transformer.h.25.attn.c_proj.bias": "model-00005-of-00007.safetensors",
226
+ "transformer.h.25.attn.c_proj.weight": "model-00005-of-00007.safetensors",
227
+ "transformer.h.25.ln_1.bias": "model-00004-of-00007.safetensors",
228
+ "transformer.h.25.ln_1.weight": "model-00004-of-00007.safetensors",
229
+ "transformer.h.25.ln_2.bias": "model-00005-of-00007.safetensors",
230
+ "transformer.h.25.ln_2.weight": "model-00005-of-00007.safetensors",
231
+ "transformer.h.25.mlp.c_fc.bias": "model-00005-of-00007.safetensors",
232
+ "transformer.h.25.mlp.c_fc.weight": "model-00005-of-00007.safetensors",
233
+ "transformer.h.25.mlp.c_proj.bias": "model-00005-of-00007.safetensors",
234
+ "transformer.h.25.mlp.c_proj.weight": "model-00005-of-00007.safetensors",
235
+ "transformer.h.26.attn.c_attn.bias": "model-00005-of-00007.safetensors",
236
+ "transformer.h.26.attn.c_attn.weight": "model-00005-of-00007.safetensors",
237
+ "transformer.h.26.attn.c_proj.bias": "model-00005-of-00007.safetensors",
238
+ "transformer.h.26.attn.c_proj.weight": "model-00005-of-00007.safetensors",
239
+ "transformer.h.26.ln_1.bias": "model-00005-of-00007.safetensors",
240
+ "transformer.h.26.ln_1.weight": "model-00005-of-00007.safetensors",
241
+ "transformer.h.26.ln_2.bias": "model-00005-of-00007.safetensors",
242
+ "transformer.h.26.ln_2.weight": "model-00005-of-00007.safetensors",
243
+ "transformer.h.26.mlp.c_fc.bias": "model-00005-of-00007.safetensors",
244
+ "transformer.h.26.mlp.c_fc.weight": "model-00005-of-00007.safetensors",
245
+ "transformer.h.26.mlp.c_proj.bias": "model-00005-of-00007.safetensors",
246
+ "transformer.h.26.mlp.c_proj.weight": "model-00005-of-00007.safetensors",
247
+ "transformer.h.27.attn.c_attn.bias": "model-00005-of-00007.safetensors",
248
+ "transformer.h.27.attn.c_attn.weight": "model-00005-of-00007.safetensors",
249
+ "transformer.h.27.attn.c_proj.bias": "model-00005-of-00007.safetensors",
250
+ "transformer.h.27.attn.c_proj.weight": "model-00005-of-00007.safetensors",
251
+ "transformer.h.27.ln_1.bias": "model-00005-of-00007.safetensors",
252
+ "transformer.h.27.ln_1.weight": "model-00005-of-00007.safetensors",
253
+ "transformer.h.27.ln_2.bias": "model-00005-of-00007.safetensors",
254
+ "transformer.h.27.ln_2.weight": "model-00005-of-00007.safetensors",
255
+ "transformer.h.27.mlp.c_fc.bias": "model-00005-of-00007.safetensors",
256
+ "transformer.h.27.mlp.c_fc.weight": "model-00005-of-00007.safetensors",
257
+ "transformer.h.27.mlp.c_proj.bias": "model-00005-of-00007.safetensors",
258
+ "transformer.h.27.mlp.c_proj.weight": "model-00005-of-00007.safetensors",
259
+ "transformer.h.28.attn.c_attn.bias": "model-00005-of-00007.safetensors",
260
+ "transformer.h.28.attn.c_attn.weight": "model-00005-of-00007.safetensors",
261
+ "transformer.h.28.attn.c_proj.bias": "model-00005-of-00007.safetensors",
262
+ "transformer.h.28.attn.c_proj.weight": "model-00005-of-00007.safetensors",
263
+ "transformer.h.28.ln_1.bias": "model-00005-of-00007.safetensors",
264
+ "transformer.h.28.ln_1.weight": "model-00005-of-00007.safetensors",
265
+ "transformer.h.28.ln_2.bias": "model-00005-of-00007.safetensors",
266
+ "transformer.h.28.ln_2.weight": "model-00005-of-00007.safetensors",
267
+ "transformer.h.28.mlp.c_fc.bias": "model-00005-of-00007.safetensors",
268
+ "transformer.h.28.mlp.c_fc.weight": "model-00005-of-00007.safetensors",
269
+ "transformer.h.28.mlp.c_proj.bias": "model-00005-of-00007.safetensors",
270
+ "transformer.h.28.mlp.c_proj.weight": "model-00005-of-00007.safetensors",
271
+ "transformer.h.29.attn.c_attn.bias": "model-00005-of-00007.safetensors",
272
+ "transformer.h.29.attn.c_attn.weight": "model-00005-of-00007.safetensors",
273
+ "transformer.h.29.attn.c_proj.bias": "model-00005-of-00007.safetensors",
274
+ "transformer.h.29.attn.c_proj.weight": "model-00005-of-00007.safetensors",
275
+ "transformer.h.29.ln_1.bias": "model-00005-of-00007.safetensors",
276
+ "transformer.h.29.ln_1.weight": "model-00005-of-00007.safetensors",
277
+ "transformer.h.29.ln_2.bias": "model-00005-of-00007.safetensors",
278
+ "transformer.h.29.ln_2.weight": "model-00005-of-00007.safetensors",
279
+ "transformer.h.29.mlp.c_fc.bias": "model-00005-of-00007.safetensors",
280
+ "transformer.h.29.mlp.c_fc.weight": "model-00005-of-00007.safetensors",
281
+ "transformer.h.29.mlp.c_proj.bias": "model-00005-of-00007.safetensors",
282
+ "transformer.h.29.mlp.c_proj.weight": "model-00005-of-00007.safetensors",
283
+ "transformer.h.3.attn.c_attn.bias": "model-00001-of-00007.safetensors",
284
+ "transformer.h.3.attn.c_attn.weight": "model-00001-of-00007.safetensors",
285
+ "transformer.h.3.attn.c_proj.bias": "model-00001-of-00007.safetensors",
286
+ "transformer.h.3.attn.c_proj.weight": "model-00001-of-00007.safetensors",
287
+ "transformer.h.3.ln_1.bias": "model-00001-of-00007.safetensors",
288
+ "transformer.h.3.ln_1.weight": "model-00001-of-00007.safetensors",
289
+ "transformer.h.3.ln_2.bias": "model-00001-of-00007.safetensors",
290
+ "transformer.h.3.ln_2.weight": "model-00001-of-00007.safetensors",
291
+ "transformer.h.3.mlp.c_fc.bias": "model-00001-of-00007.safetensors",
292
+ "transformer.h.3.mlp.c_fc.weight": "model-00001-of-00007.safetensors",
293
+ "transformer.h.3.mlp.c_proj.bias": "model-00001-of-00007.safetensors",
294
+ "transformer.h.3.mlp.c_proj.weight": "model-00001-of-00007.safetensors",
295
+ "transformer.h.30.attn.c_attn.bias": "model-00005-of-00007.safetensors",
296
+ "transformer.h.30.attn.c_attn.weight": "model-00005-of-00007.safetensors",
297
+ "transformer.h.30.attn.c_proj.bias": "model-00005-of-00007.safetensors",
298
+ "transformer.h.30.attn.c_proj.weight": "model-00005-of-00007.safetensors",
299
+ "transformer.h.30.ln_1.bias": "model-00005-of-00007.safetensors",
300
+ "transformer.h.30.ln_1.weight": "model-00005-of-00007.safetensors",
301
+ "transformer.h.30.ln_2.bias": "model-00005-of-00007.safetensors",
302
+ "transformer.h.30.ln_2.weight": "model-00005-of-00007.safetensors",
303
+ "transformer.h.30.mlp.c_fc.bias": "model-00005-of-00007.safetensors",
304
+ "transformer.h.30.mlp.c_fc.weight": "model-00005-of-00007.safetensors",
305
+ "transformer.h.30.mlp.c_proj.bias": "model-00005-of-00007.safetensors",
306
+ "transformer.h.30.mlp.c_proj.weight": "model-00005-of-00007.safetensors",
307
+ "transformer.h.31.attn.c_attn.bias": "model-00005-of-00007.safetensors",
308
+ "transformer.h.31.attn.c_attn.weight": "model-00005-of-00007.safetensors",
309
+ "transformer.h.31.attn.c_proj.bias": "model-00005-of-00007.safetensors",
310
+ "transformer.h.31.attn.c_proj.weight": "model-00005-of-00007.safetensors",
311
+ "transformer.h.31.ln_1.bias": "model-00005-of-00007.safetensors",
312
+ "transformer.h.31.ln_1.weight": "model-00005-of-00007.safetensors",
313
+ "transformer.h.31.ln_2.bias": "model-00005-of-00007.safetensors",
314
+ "transformer.h.31.ln_2.weight": "model-00005-of-00007.safetensors",
315
+ "transformer.h.31.mlp.c_fc.bias": "model-00005-of-00007.safetensors",
316
+ "transformer.h.31.mlp.c_fc.weight": "model-00005-of-00007.safetensors",
317
+ "transformer.h.31.mlp.c_proj.bias": "model-00006-of-00007.safetensors",
318
+ "transformer.h.31.mlp.c_proj.weight": "model-00006-of-00007.safetensors",
319
+ "transformer.h.32.attn.c_attn.bias": "model-00006-of-00007.safetensors",
320
+ "transformer.h.32.attn.c_attn.weight": "model-00006-of-00007.safetensors",
321
+ "transformer.h.32.attn.c_proj.bias": "model-00006-of-00007.safetensors",
322
+ "transformer.h.32.attn.c_proj.weight": "model-00006-of-00007.safetensors",
323
+ "transformer.h.32.ln_1.bias": "model-00006-of-00007.safetensors",
324
+ "transformer.h.32.ln_1.weight": "model-00006-of-00007.safetensors",
325
+ "transformer.h.32.ln_2.bias": "model-00006-of-00007.safetensors",
326
+ "transformer.h.32.ln_2.weight": "model-00006-of-00007.safetensors",
327
+ "transformer.h.32.mlp.c_fc.bias": "model-00006-of-00007.safetensors",
328
+ "transformer.h.32.mlp.c_fc.weight": "model-00006-of-00007.safetensors",
329
+ "transformer.h.32.mlp.c_proj.bias": "model-00006-of-00007.safetensors",
330
+ "transformer.h.32.mlp.c_proj.weight": "model-00006-of-00007.safetensors",
331
+ "transformer.h.33.attn.c_attn.bias": "model-00006-of-00007.safetensors",
332
+ "transformer.h.33.attn.c_attn.weight": "model-00006-of-00007.safetensors",
333
+ "transformer.h.33.attn.c_proj.bias": "model-00006-of-00007.safetensors",
334
+ "transformer.h.33.attn.c_proj.weight": "model-00006-of-00007.safetensors",
335
+ "transformer.h.33.ln_1.bias": "model-00006-of-00007.safetensors",
336
+ "transformer.h.33.ln_1.weight": "model-00006-of-00007.safetensors",
337
+ "transformer.h.33.ln_2.bias": "model-00006-of-00007.safetensors",
338
+ "transformer.h.33.ln_2.weight": "model-00006-of-00007.safetensors",
339
+ "transformer.h.33.mlp.c_fc.bias": "model-00006-of-00007.safetensors",
340
+ "transformer.h.33.mlp.c_fc.weight": "model-00006-of-00007.safetensors",
341
+ "transformer.h.33.mlp.c_proj.bias": "model-00006-of-00007.safetensors",
342
+ "transformer.h.33.mlp.c_proj.weight": "model-00006-of-00007.safetensors",
343
+ "transformer.h.34.attn.c_attn.bias": "model-00006-of-00007.safetensors",
344
+ "transformer.h.34.attn.c_attn.weight": "model-00006-of-00007.safetensors",
345
+ "transformer.h.34.attn.c_proj.bias": "model-00006-of-00007.safetensors",
346
+ "transformer.h.34.attn.c_proj.weight": "model-00006-of-00007.safetensors",
347
+ "transformer.h.34.ln_1.bias": "model-00006-of-00007.safetensors",
348
+ "transformer.h.34.ln_1.weight": "model-00006-of-00007.safetensors",
349
+ "transformer.h.34.ln_2.bias": "model-00006-of-00007.safetensors",
350
+ "transformer.h.34.ln_2.weight": "model-00006-of-00007.safetensors",
351
+ "transformer.h.34.mlp.c_fc.bias": "model-00006-of-00007.safetensors",
352
+ "transformer.h.34.mlp.c_fc.weight": "model-00006-of-00007.safetensors",
353
+ "transformer.h.34.mlp.c_proj.bias": "model-00006-of-00007.safetensors",
354
+ "transformer.h.34.mlp.c_proj.weight": "model-00006-of-00007.safetensors",
355
+ "transformer.h.35.attn.c_attn.bias": "model-00006-of-00007.safetensors",
356
+ "transformer.h.35.attn.c_attn.weight": "model-00006-of-00007.safetensors",
357
+ "transformer.h.35.attn.c_proj.bias": "model-00006-of-00007.safetensors",
358
+ "transformer.h.35.attn.c_proj.weight": "model-00006-of-00007.safetensors",
359
+ "transformer.h.35.ln_1.bias": "model-00006-of-00007.safetensors",
360
+ "transformer.h.35.ln_1.weight": "model-00006-of-00007.safetensors",
361
+ "transformer.h.35.ln_2.bias": "model-00006-of-00007.safetensors",
362
+ "transformer.h.35.ln_2.weight": "model-00006-of-00007.safetensors",
363
+ "transformer.h.35.mlp.c_fc.bias": "model-00006-of-00007.safetensors",
364
+ "transformer.h.35.mlp.c_fc.weight": "model-00006-of-00007.safetensors",
365
+ "transformer.h.35.mlp.c_proj.bias": "model-00006-of-00007.safetensors",
366
+ "transformer.h.35.mlp.c_proj.weight": "model-00006-of-00007.safetensors",
367
+ "transformer.h.36.attn.c_attn.bias": "model-00006-of-00007.safetensors",
368
+ "transformer.h.36.attn.c_attn.weight": "model-00006-of-00007.safetensors",
369
+ "transformer.h.36.attn.c_proj.bias": "model-00006-of-00007.safetensors",
370
+ "transformer.h.36.attn.c_proj.weight": "model-00006-of-00007.safetensors",
371
+ "transformer.h.36.ln_1.bias": "model-00006-of-00007.safetensors",
372
+ "transformer.h.36.ln_1.weight": "model-00006-of-00007.safetensors",
373
+ "transformer.h.36.ln_2.bias": "model-00006-of-00007.safetensors",
374
+ "transformer.h.36.ln_2.weight": "model-00006-of-00007.safetensors",
375
+ "transformer.h.36.mlp.c_fc.bias": "model-00006-of-00007.safetensors",
376
+ "transformer.h.36.mlp.c_fc.weight": "model-00006-of-00007.safetensors",
377
+ "transformer.h.36.mlp.c_proj.bias": "model-00006-of-00007.safetensors",
378
+ "transformer.h.36.mlp.c_proj.weight": "model-00006-of-00007.safetensors",
379
+ "transformer.h.37.attn.c_attn.bias": "model-00006-of-00007.safetensors",
380
+ "transformer.h.37.attn.c_attn.weight": "model-00006-of-00007.safetensors",
381
+ "transformer.h.37.attn.c_proj.bias": "model-00006-of-00007.safetensors",
382
+ "transformer.h.37.attn.c_proj.weight": "model-00006-of-00007.safetensors",
383
+ "transformer.h.37.ln_1.bias": "model-00006-of-00007.safetensors",
384
+ "transformer.h.37.ln_1.weight": "model-00006-of-00007.safetensors",
385
+ "transformer.h.37.ln_2.bias": "model-00006-of-00007.safetensors",
386
+ "transformer.h.37.ln_2.weight": "model-00006-of-00007.safetensors",
387
+ "transformer.h.37.mlp.c_fc.bias": "model-00006-of-00007.safetensors",
388
+ "transformer.h.37.mlp.c_fc.weight": "model-00006-of-00007.safetensors",
389
+ "transformer.h.37.mlp.c_proj.bias": "model-00006-of-00007.safetensors",
390
+ "transformer.h.37.mlp.c_proj.weight": "model-00006-of-00007.safetensors",
391
+ "transformer.h.38.attn.c_attn.bias": "model-00006-of-00007.safetensors",
392
+ "transformer.h.38.attn.c_attn.weight": "model-00006-of-00007.safetensors",
393
+ "transformer.h.38.attn.c_proj.bias": "model-00007-of-00007.safetensors",
394
+ "transformer.h.38.attn.c_proj.weight": "model-00007-of-00007.safetensors",
395
+ "transformer.h.38.ln_1.bias": "model-00006-of-00007.safetensors",
396
+ "transformer.h.38.ln_1.weight": "model-00006-of-00007.safetensors",
397
+ "transformer.h.38.ln_2.bias": "model-00007-of-00007.safetensors",
398
+ "transformer.h.38.ln_2.weight": "model-00007-of-00007.safetensors",
399
+ "transformer.h.38.mlp.c_fc.bias": "model-00007-of-00007.safetensors",
400
+ "transformer.h.38.mlp.c_fc.weight": "model-00007-of-00007.safetensors",
401
+ "transformer.h.38.mlp.c_proj.bias": "model-00007-of-00007.safetensors",
402
+ "transformer.h.38.mlp.c_proj.weight": "model-00007-of-00007.safetensors",
403
+ "transformer.h.39.attn.c_attn.bias": "model-00007-of-00007.safetensors",
404
+ "transformer.h.39.attn.c_attn.weight": "model-00007-of-00007.safetensors",
405
+ "transformer.h.39.attn.c_proj.bias": "model-00007-of-00007.safetensors",
406
+ "transformer.h.39.attn.c_proj.weight": "model-00007-of-00007.safetensors",
407
+ "transformer.h.39.ln_1.bias": "model-00007-of-00007.safetensors",
408
+ "transformer.h.39.ln_1.weight": "model-00007-of-00007.safetensors",
409
+ "transformer.h.39.ln_2.bias": "model-00007-of-00007.safetensors",
410
+ "transformer.h.39.ln_2.weight": "model-00007-of-00007.safetensors",
411
+ "transformer.h.39.mlp.c_fc.bias": "model-00007-of-00007.safetensors",
412
+ "transformer.h.39.mlp.c_fc.weight": "model-00007-of-00007.safetensors",
413
+ "transformer.h.39.mlp.c_proj.bias": "model-00007-of-00007.safetensors",
414
+ "transformer.h.39.mlp.c_proj.weight": "model-00007-of-00007.safetensors",
415
+ "transformer.h.4.attn.c_attn.bias": "model-00001-of-00007.safetensors",
416
+ "transformer.h.4.attn.c_attn.weight": "model-00001-of-00007.safetensors",
417
+ "transformer.h.4.attn.c_proj.bias": "model-00001-of-00007.safetensors",
418
+ "transformer.h.4.attn.c_proj.weight": "model-00001-of-00007.safetensors",
419
+ "transformer.h.4.ln_1.bias": "model-00001-of-00007.safetensors",
420
+ "transformer.h.4.ln_1.weight": "model-00001-of-00007.safetensors",
421
+ "transformer.h.4.ln_2.bias": "model-00001-of-00007.safetensors",
422
+ "transformer.h.4.ln_2.weight": "model-00001-of-00007.safetensors",
423
+ "transformer.h.4.mlp.c_fc.bias": "model-00001-of-00007.safetensors",
424
+ "transformer.h.4.mlp.c_fc.weight": "model-00001-of-00007.safetensors",
425
+ "transformer.h.4.mlp.c_proj.bias": "model-00001-of-00007.safetensors",
426
+ "transformer.h.4.mlp.c_proj.weight": "model-00001-of-00007.safetensors",
427
+ "transformer.h.5.attn.c_attn.bias": "model-00001-of-00007.safetensors",
428
+ "transformer.h.5.attn.c_attn.weight": "model-00001-of-00007.safetensors",
429
+ "transformer.h.5.attn.c_proj.bias": "model-00001-of-00007.safetensors",
430
+ "transformer.h.5.attn.c_proj.weight": "model-00001-of-00007.safetensors",
431
+ "transformer.h.5.ln_1.bias": "model-00001-of-00007.safetensors",
432
+ "transformer.h.5.ln_1.weight": "model-00001-of-00007.safetensors",
433
+ "transformer.h.5.ln_2.bias": "model-00001-of-00007.safetensors",
434
+ "transformer.h.5.ln_2.weight": "model-00001-of-00007.safetensors",
435
+ "transformer.h.5.mlp.c_fc.bias": "model-00001-of-00007.safetensors",
436
+ "transformer.h.5.mlp.c_fc.weight": "model-00001-of-00007.safetensors",
437
+ "transformer.h.5.mlp.c_proj.bias": "model-00002-of-00007.safetensors",
438
+ "transformer.h.5.mlp.c_proj.weight": "model-00002-of-00007.safetensors",
439
+ "transformer.h.6.attn.c_attn.bias": "model-00002-of-00007.safetensors",
440
+ "transformer.h.6.attn.c_attn.weight": "model-00002-of-00007.safetensors",
441
+ "transformer.h.6.attn.c_proj.bias": "model-00002-of-00007.safetensors",
442
+ "transformer.h.6.attn.c_proj.weight": "model-00002-of-00007.safetensors",
443
+ "transformer.h.6.ln_1.bias": "model-00002-of-00007.safetensors",
444
+ "transformer.h.6.ln_1.weight": "model-00002-of-00007.safetensors",
445
+ "transformer.h.6.ln_2.bias": "model-00002-of-00007.safetensors",
446
+ "transformer.h.6.ln_2.weight": "model-00002-of-00007.safetensors",
447
+ "transformer.h.6.mlp.c_fc.bias": "model-00002-of-00007.safetensors",
448
+ "transformer.h.6.mlp.c_fc.weight": "model-00002-of-00007.safetensors",
449
+ "transformer.h.6.mlp.c_proj.bias": "model-00002-of-00007.safetensors",
450
+ "transformer.h.6.mlp.c_proj.weight": "model-00002-of-00007.safetensors",
451
+ "transformer.h.7.attn.c_attn.bias": "model-00002-of-00007.safetensors",
452
+ "transformer.h.7.attn.c_attn.weight": "model-00002-of-00007.safetensors",
453
+ "transformer.h.7.attn.c_proj.bias": "model-00002-of-00007.safetensors",
454
+ "transformer.h.7.attn.c_proj.weight": "model-00002-of-00007.safetensors",
455
+ "transformer.h.7.ln_1.bias": "model-00002-of-00007.safetensors",
456
+ "transformer.h.7.ln_1.weight": "model-00002-of-00007.safetensors",
457
+ "transformer.h.7.ln_2.bias": "model-00002-of-00007.safetensors",
458
+ "transformer.h.7.ln_2.weight": "model-00002-of-00007.safetensors",
459
+ "transformer.h.7.mlp.c_fc.bias": "model-00002-of-00007.safetensors",
460
+ "transformer.h.7.mlp.c_fc.weight": "model-00002-of-00007.safetensors",
461
+ "transformer.h.7.mlp.c_proj.bias": "model-00002-of-00007.safetensors",
462
+ "transformer.h.7.mlp.c_proj.weight": "model-00002-of-00007.safetensors",
463
+ "transformer.h.8.attn.c_attn.bias": "model-00002-of-00007.safetensors",
464
+ "transformer.h.8.attn.c_attn.weight": "model-00002-of-00007.safetensors",
465
+ "transformer.h.8.attn.c_proj.bias": "model-00002-of-00007.safetensors",
466
+ "transformer.h.8.attn.c_proj.weight": "model-00002-of-00007.safetensors",
467
+ "transformer.h.8.ln_1.bias": "model-00002-of-00007.safetensors",
468
+ "transformer.h.8.ln_1.weight": "model-00002-of-00007.safetensors",
469
+ "transformer.h.8.ln_2.bias": "model-00002-of-00007.safetensors",
470
+ "transformer.h.8.ln_2.weight": "model-00002-of-00007.safetensors",
471
+ "transformer.h.8.mlp.c_fc.bias": "model-00002-of-00007.safetensors",
472
+ "transformer.h.8.mlp.c_fc.weight": "model-00002-of-00007.safetensors",
473
+ "transformer.h.8.mlp.c_proj.bias": "model-00002-of-00007.safetensors",
474
+ "transformer.h.8.mlp.c_proj.weight": "model-00002-of-00007.safetensors",
475
+ "transformer.h.9.attn.c_attn.bias": "model-00002-of-00007.safetensors",
476
+ "transformer.h.9.attn.c_attn.weight": "model-00002-of-00007.safetensors",
477
+ "transformer.h.9.attn.c_proj.bias": "model-00002-of-00007.safetensors",
478
+ "transformer.h.9.attn.c_proj.weight": "model-00002-of-00007.safetensors",
479
+ "transformer.h.9.ln_1.bias": "model-00002-of-00007.safetensors",
480
+ "transformer.h.9.ln_1.weight": "model-00002-of-00007.safetensors",
481
+ "transformer.h.9.ln_2.bias": "model-00002-of-00007.safetensors",
482
+ "transformer.h.9.ln_2.weight": "model-00002-of-00007.safetensors",
483
+ "transformer.h.9.mlp.c_fc.bias": "model-00002-of-00007.safetensors",
484
+ "transformer.h.9.mlp.c_fc.weight": "model-00002-of-00007.safetensors",
485
+ "transformer.h.9.mlp.c_proj.bias": "model-00002-of-00007.safetensors",
486
+ "transformer.h.9.mlp.c_proj.weight": "model-00002-of-00007.safetensors",
487
+ "transformer.ln_f.bias": "model-00007-of-00007.safetensors",
488
+ "transformer.ln_f.weight": "model-00007-of-00007.safetensors",
489
+ "transformer.wpe.weight": "model-00001-of-00007.safetensors",
490
+ "transformer.wte.weight": "model-00001-of-00007.safetensors"
491
+ }
492
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,58 @@
+ {
+ "additional_special_tokens": [
+ "<|endoftext|>",
+ "<fim_prefix>",
+ "<fim_middle>",
+ "<fim_suffix>",
+ "<fim_pad>",
+ "<filename>",
+ "<gh_stars>",
+ "<issue_start>",
+ "<issue_comment>",
+ "<issue_closed>",
+ "<jupyter_start>",
+ "<jupyter_text>",
+ "<jupyter_code>",
+ "<jupyter_output>",
+ "<empty_output>",
+ "<commit_before>",
+ "<commit_msg>",
+ "<commit_after>",
+ "<reponame>",
+ {
+ "content": "<START>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ {
+ "content": "<END>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ ],
+ "bos_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,203 @@
1
+ {
2
+ "add_prefix_space": false,
3
+ "added_tokens_decoder": {
4
+ "0": {
5
+ "content": "<|endoftext|>",
6
+ "lstrip": false,
7
+ "normalized": false,
8
+ "rstrip": false,
9
+ "single_word": false,
10
+ "special": true
11
+ },
12
+ "1": {
13
+ "content": "<fim_prefix>",
14
+ "lstrip": false,
15
+ "normalized": false,
16
+ "rstrip": false,
17
+ "single_word": false,
18
+ "special": true
19
+ },
20
+ "2": {
21
+ "content": "<fim_middle>",
22
+ "lstrip": false,
23
+ "normalized": false,
24
+ "rstrip": false,
25
+ "single_word": false,
26
+ "special": true
27
+ },
28
+ "3": {
29
+ "content": "<fim_suffix>",
30
+ "lstrip": false,
31
+ "normalized": false,
32
+ "rstrip": false,
33
+ "single_word": false,
34
+ "special": true
35
+ },
36
+ "4": {
37
+ "content": "<fim_pad>",
38
+ "lstrip": false,
39
+ "normalized": false,
40
+ "rstrip": false,
41
+ "single_word": false,
42
+ "special": true
43
+ },
44
+ "5": {
45
+ "content": "<filename>",
46
+ "lstrip": false,
47
+ "normalized": false,
48
+ "rstrip": false,
49
+ "single_word": false,
50
+ "special": true
51
+ },
52
+ "6": {
53
+ "content": "<gh_stars>",
54
+ "lstrip": false,
55
+ "normalized": false,
56
+ "rstrip": false,
57
+ "single_word": false,
58
+ "special": true
59
+ },
60
+ "7": {
61
+ "content": "<issue_start>",
62
+ "lstrip": false,
63
+ "normalized": false,
64
+ "rstrip": false,
65
+ "single_word": false,
66
+ "special": true
67
+ },
68
+ "8": {
69
+ "content": "<issue_comment>",
70
+ "lstrip": false,
71
+ "normalized": false,
72
+ "rstrip": false,
73
+ "single_word": false,
74
+ "special": true
75
+ },
76
+ "9": {
77
+ "content": "<issue_closed>",
78
+ "lstrip": false,
79
+ "normalized": false,
80
+ "rstrip": false,
81
+ "single_word": false,
82
+ "special": true
83
+ },
84
+ "10": {
85
+ "content": "<jupyter_start>",
86
+ "lstrip": false,
87
+ "normalized": false,
88
+ "rstrip": false,
89
+ "single_word": false,
90
+ "special": true
91
+ },
92
+ "11": {
93
+ "content": "<jupyter_text>",
94
+ "lstrip": false,
95
+ "normalized": false,
96
+ "rstrip": false,
97
+ "single_word": false,
98
+ "special": true
99
+ },
100
+ "12": {
101
+ "content": "<jupyter_code>",
102
+ "lstrip": false,
103
+ "normalized": false,
104
+ "rstrip": false,
105
+ "single_word": false,
106
+ "special": true
107
+ },
108
+ "13": {
109
+ "content": "<jupyter_output>",
110
+ "lstrip": false,
111
+ "normalized": false,
112
+ "rstrip": false,
113
+ "single_word": false,
114
+ "special": true
115
+ },
116
+ "14": {
117
+ "content": "<empty_output>",
118
+ "lstrip": false,
119
+ "normalized": false,
120
+ "rstrip": false,
121
+ "single_word": false,
122
+ "special": true
123
+ },
124
+ "15": {
125
+ "content": "<commit_before>",
126
+ "lstrip": false,
127
+ "normalized": false,
128
+ "rstrip": false,
129
+ "single_word": false,
130
+ "special": true
131
+ },
132
+ "16": {
133
+ "content": "<commit_msg>",
134
+ "lstrip": false,
135
+ "normalized": false,
136
+ "rstrip": false,
137
+ "single_word": false,
138
+ "special": true
139
+ },
140
+ "17": {
141
+ "content": "<commit_after>",
142
+ "lstrip": false,
143
+ "normalized": false,
144
+ "rstrip": false,
145
+ "single_word": false,
146
+ "special": true
147
+ },
148
+ "18": {
149
+ "content": "<reponame>",
150
+ "lstrip": false,
151
+ "normalized": false,
152
+ "rstrip": false,
153
+ "single_word": false,
154
+ "special": true
155
+ },
156
+ "49152": {
157
+ "content": "<START>",
158
+ "lstrip": false,
159
+ "normalized": false,
160
+ "rstrip": false,
161
+ "single_word": false,
162
+ "special": true
163
+ },
164
+ "49153": {
165
+ "content": "<END>",
166
+ "lstrip": false,
167
+ "normalized": false,
168
+ "rstrip": false,
169
+ "single_word": false,
170
+ "special": true
171
+ }
172
+ },
173
+ "additional_special_tokens": [
174
+ "<|endoftext|>",
175
+ "<fim_prefix>",
176
+ "<fim_middle>",
177
+ "<fim_suffix>",
178
+ "<fim_pad>",
179
+ "<filename>",
180
+ "<gh_stars>",
181
+ "<issue_start>",
182
+ "<issue_comment>",
183
+ "<issue_closed>",
184
+ "<jupyter_start>",
185
+ "<jupyter_text>",
186
+ "<jupyter_code>",
187
+ "<jupyter_output>",
188
+ "<empty_output>",
189
+ "<commit_before>",
190
+ "<commit_msg>",
191
+ "<commit_after>",
192
+ "<reponame>",
193
+ "<START>",
194
+ "<END>"
195
+ ],
196
+ "bos_token": "<|endoftext|>",
197
+ "clean_up_tokenization_spaces": true,
198
+ "eos_token": "<|endoftext|>",
199
+ "model_max_length": 1000000000000000019884624838656,
200
+ "tokenizer_class": "GPT2Tokenizer",
201
+ "unk_token": "<|endoftext|>",
202
+ "vocab_size": 49152
203
+ }
vocab.json ADDED
The diff for this file is too large to render. See raw diff