Commit b0d0f3f (verified) · committed by Primeness · 1 parent: 1e61a5a

Upload tokenizer

Files changed (4):
  1. README.md +199 -0
  2. special_tokens_map.json +30 -0
  3. tokenizer.json +404 -0
  4. tokenizer_config.json +44 -0
README.md ADDED
@@ -0,0 +1,199 @@
+ ---
+ library_name: transformers
+ tags: []
+ ---
+
+ # Model Card for Model ID
+
+ <!-- Provide a quick summary of what the model is/does. -->
+
+
+
+ ## Model Details
+
+ ### Model Description
+
+ <!-- Provide a longer summary of what this model is. -->
+
+ This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
+
+ - **Developed by:** [More Information Needed]
+ - **Funded by [optional]:** [More Information Needed]
+ - **Shared by [optional]:** [More Information Needed]
+ - **Model type:** [More Information Needed]
+ - **Language(s) (NLP):** [More Information Needed]
+ - **License:** [More Information Needed]
+ - **Finetuned from model [optional]:** [More Information Needed]
+
+ ### Model Sources [optional]
+
+ <!-- Provide the basic links for the model. -->
+
+ - **Repository:** [More Information Needed]
+ - **Paper [optional]:** [More Information Needed]
+ - **Demo [optional]:** [More Information Needed]
+
+ ## Uses
+
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
+
+ ### Direct Use
+
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
+
+ [More Information Needed]
+
+ ### Downstream Use [optional]
+
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
+
+ [More Information Needed]
+
+ ### Out-of-Scope Use
+
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
+
+ [More Information Needed]
+
+ ## Bias, Risks, and Limitations
+
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
+
+ [More Information Needed]
+
+ ### Recommendations
+
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
+
+ Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
+
+ ## How to Get Started with the Model
+
+ Use the code below to get started with the model.
+
+ [More Information Needed]
+
+ ## Training Details
+
+ ### Training Data
+
+ <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
+
+ [More Information Needed]
+
+ ### Training Procedure
+
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
+
+ #### Preprocessing [optional]
+
+ [More Information Needed]
+
+
+ #### Training Hyperparameters
+
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
+
+ #### Speeds, Sizes, Times [optional]
+
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
+
+ [More Information Needed]
+
+ ## Evaluation
+
+ <!-- This section describes the evaluation protocols and provides the results. -->
+
+ ### Testing Data, Factors & Metrics
+
+ #### Testing Data
+
+ <!-- This should link to a Dataset Card if possible. -->
+
+ [More Information Needed]
+
+ #### Factors
+
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
+
+ [More Information Needed]
+
+ #### Metrics
+
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
+
+ [More Information Needed]
+
+ ### Results
+
+ [More Information Needed]
+
+ #### Summary
+
+
+
+ ## Model Examination [optional]
+
+ <!-- Relevant interpretability work for the model goes here -->
+
+ [More Information Needed]
+
+ ## Environmental Impact
+
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
+
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+ - **Hardware Type:** [More Information Needed]
+ - **Hours used:** [More Information Needed]
+ - **Cloud Provider:** [More Information Needed]
+ - **Compute Region:** [More Information Needed]
+ - **Carbon Emitted:** [More Information Needed]
+
+ ## Technical Specifications [optional]
+
+ ### Model Architecture and Objective
+
+ [More Information Needed]
+
+ ### Compute Infrastructure
+
+ [More Information Needed]
+
+ #### Hardware
+
+ [More Information Needed]
+
+ #### Software
+
+ [More Information Needed]
+
+ ## Citation [optional]
+
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
+
+ **BibTeX:**
+
+ [More Information Needed]
+
+ **APA:**
+
+ [More Information Needed]
+
+ ## Glossary [optional]
+
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
+
+ [More Information Needed]
+
+ ## More Information [optional]
+
+ [More Information Needed]
+
+ ## Model Card Authors [optional]
+
+ [More Information Needed]
+
+ ## Model Card Contact
+
+ [More Information Needed]
special_tokens_map.json ADDED
@@ -0,0 +1,30 @@
+ {
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": true,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": true,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
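The special-tokens map above can be checked with nothing more than the standard-library `json` module. The notable detail is that this tokenizer reuses `</s>` as both the EOS token and the pad token (a hypothetical condensed copy of the file is embedded below for illustration):

```python
import json

# Condensed copy of special_tokens_map.json from this commit (for illustration).
SPECIAL_TOKENS_MAP = """
{
  "bos_token": {"content": "<s>",   "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "eos_token": {"content": "</s>",  "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false},
  "pad_token": {"content": "</s>",  "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false},
  "unk_token": {"content": "<unk>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false}
}
"""

tokens = json.loads(SPECIAL_TOKENS_MAP)

# Pad and EOS share the same string, so padded positions are only
# distinguishable from end-of-sequence via the attention mask.
assert tokens["pad_token"]["content"] == tokens["eos_token"]["content"] == "</s>"

# rstrip is true only on </s>-based tokens: whitespace after them is stripped.
assert tokens["eos_token"]["rstrip"] is True
assert tokens["bos_token"]["rstrip"] is False
```

Since `rstrip` is set on `</s>`, any whitespace immediately following an EOS/pad occurrence is absorbed into the token during matching.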
tokenizer.json ADDED
@@ -0,0 +1,404 @@
+ {
+   "version": "1.0",
+   "truncation": null,
+   "padding": null,
+   "added_tokens": [
+     {
+       "id": 0,
+       "content": "<unk>",
+       "single_word": false,
+       "lstrip": false,
+       "rstrip": false,
+       "normalized": false,
+       "special": true
+     },
+     {
+       "id": 1,
+       "content": "<s>",
+       "single_word": false,
+       "lstrip": false,
+       "rstrip": false,
+       "normalized": false,
+       "special": true
+     },
+     {
+       "id": 2,
+       "content": "</s>",
+       "single_word": false,
+       "lstrip": false,
+       "rstrip": true,
+       "normalized": false,
+       "special": true
+     }
+   ],
+   "normalizer": null,
+   "pre_tokenizer": {
+     "type": "Metaspace",
+     "replacement": "▁",
+     "prepend_scheme": "never",
+     "split": false
+   },
+   "post_processor": {
+     "type": "TemplateProcessing",
+     "single": [
+       {
+         "Sequence": {
+           "id": "A",
+           "type_id": 0
+         }
+       }
+     ],
+     "pair": [
+       {
+         "Sequence": {
+           "id": "A",
+           "type_id": 0
+         }
+       },
+       {
+         "Sequence": {
+           "id": "B",
+           "type_id": 1
+         }
+       }
+     ],
+     "special_tokens": {}
+   },
+   "decoder": {
+     "type": "Sequence",
+     "decoders": [
+       {
+         "type": "Replace",
+         "pattern": {
+           "String": "▁"
+         },
+         "content": " "
+       },
+       {
+         "type": "ByteFallback"
+       },
+       {
+         "type": "Fuse"
+       }
+     ]
+   },
+   "model": {
+     "type": "BPE",
+     "dropout": null,
+     "unk_token": "<unk>",
+     "continuing_subword_prefix": null,
+     "end_of_word_suffix": null,
+     "fuse_unk": true,
+     "byte_fallback": true,
+     "ignore_merges": false,
+     "vocab": {
+       "<unk>": 0,
+       "<s>": 1,
+       "</s>": 2,
+       "<0x00>": 3,
+       "<0x01>": 4,
+       "<0x02>": 5,
+       "<0x03>": 6,
+       "<0x04>": 7,
+       "<0x05>": 8,
+       "<0x06>": 9,
+       "<0x07>": 10,
+       "<0x08>": 11,
+       "<0x09>": 12,
+       "<0x0A>": 13,
+       "<0x0B>": 14,
+       "<0x0C>": 15,
+       "<0x0D>": 16,
+       "<0x0E>": 17,
+       "<0x0F>": 18,
+       "<0x10>": 19,
+       "<0x11>": 20,
+       "<0x12>": 21,
+       "<0x13>": 22,
+       "<0x14>": 23,
+       "<0x15>": 24,
+       "<0x16>": 25,
+       "<0x17>": 26,
+       "<0x18>": 27,
+       "<0x19>": 28,
+       "<0x1A>": 29,
+       "<0x1B>": 30,
+       "<0x1C>": 31,
+       "<0x1D>": 32,
+       "<0x1E>": 33,
+       "<0x1F>": 34,
+       "<0x20>": 35,
+       "<0x21>": 36,
+       "<0x22>": 37,
+       "<0x23>": 38,
+       "<0x24>": 39,
+       "<0x25>": 40,
+       "<0x26>": 41,
+       "<0x27>": 42,
+       "<0x28>": 43,
+       "<0x29>": 44,
+       "<0x2A>": 45,
+       "<0x2B>": 46,
+       "<0x2C>": 47,
+       "<0x2D>": 48,
+       "<0x2E>": 49,
+       "<0x2F>": 50,
+       "<0x30>": 51,
+       "<0x31>": 52,
+       "<0x32>": 53,
+       "<0x33>": 54,
+       "<0x34>": 55,
+       "<0x35>": 56,
+       "<0x36>": 57,
+       "<0x37>": 58,
+       "<0x38>": 59,
+       "<0x39>": 60,
+       "<0x3A>": 61,
+       "<0x3B>": 62,
+       "<0x3C>": 63,
+       "<0x3D>": 64,
+       "<0x3E>": 65,
+       "<0x3F>": 66,
+       "<0x40>": 67,
+       "<0x41>": 68,
+       "<0x42>": 69,
+       "<0x43>": 70,
+       "<0x44>": 71,
+       "<0x45>": 72,
+       "<0x46>": 73,
+       "<0x47>": 74,
+       "<0x48>": 75,
+       "<0x49>": 76,
+       "<0x4A>": 77,
+       "<0x4B>": 78,
+       "<0x4C>": 79,
+       "<0x4D>": 80,
+       "<0x4E>": 81,
+       "<0x4F>": 82,
+       "<0x50>": 83,
+       "<0x51>": 84,
+       "<0x52>": 85,
+       "<0x53>": 86,
+       "<0x54>": 87,
+       "<0x55>": 88,
+       "<0x56>": 89,
+       "<0x57>": 90,
+       "<0x58>": 91,
+       "<0x59>": 92,
+       "<0x5A>": 93,
+       "<0x5B>": 94,
+       "<0x5C>": 95,
+       "<0x5D>": 96,
+       "<0x5E>": 97,
+       "<0x5F>": 98,
+       "<0x60>": 99,
+       "<0x61>": 100,
+       "<0x62>": 101,
+       "<0x63>": 102,
+       "<0x64>": 103,
+       "<0x65>": 104,
+       "<0x66>": 105,
+       "<0x67>": 106,
+       "<0x68>": 107,
+       "<0x69>": 108,
+       "<0x6A>": 109,
+       "<0x6B>": 110,
+       "<0x6C>": 111,
+       "<0x6D>": 112,
+       "<0x6E>": 113,
+       "<0x6F>": 114,
+       "<0x70>": 115,
+       "<0x71>": 116,
+       "<0x72>": 117,
+       "<0x73>": 118,
+       "<0x74>": 119,
+       "<0x75>": 120,
+       "<0x76>": 121,
+       "<0x77>": 122,
+       "<0x78>": 123,
+       "<0x79>": 124,
+       "<0x7A>": 125,
+       "<0x7B>": 126,
+       "<0x7C>": 127,
+       "<0x7D>": 128,
+       "<0x7E>": 129,
+       "<0x7F>": 130,
+       "<0x80>": 131,
+       "<0x81>": 132,
+       "<0x82>": 133,
+       "<0x83>": 134,
+       "<0x84>": 135,
+       "<0x85>": 136,
+       "<0x86>": 137,
+       "<0x87>": 138,
+       "<0x88>": 139,
+       "<0x89>": 140,
+       "<0x8A>": 141,
+       "<0x8B>": 142,
+       "<0x8C>": 143,
+       "<0x8D>": 144,
+       "<0x8E>": 145,
+       "<0x8F>": 146,
+       "<0x90>": 147,
+       "<0x91>": 148,
+       "<0x92>": 149,
+       "<0x93>": 150,
+       "<0x94>": 151,
+       "<0x95>": 152,
+       "<0x96>": 153,
+       "<0x97>": 154,
+       "<0x98>": 155,
+       "<0x99>": 156,
+       "<0x9A>": 157,
+       "<0x9B>": 158,
+       "<0x9C>": 159,
+       "<0x9D>": 160,
+       "<0x9E>": 161,
+       "<0x9F>": 162,
+       "<0xA0>": 163,
+       "<0xA1>": 164,
+       "<0xA2>": 165,
+       "<0xA3>": 166,
+       "<0xA4>": 167,
+       "<0xA5>": 168,
+       "<0xA6>": 169,
+       "<0xA7>": 170,
+       "<0xA8>": 171,
+       "<0xA9>": 172,
+       "<0xAA>": 173,
+       "<0xAB>": 174,
+       "<0xAC>": 175,
+       "<0xAD>": 176,
+       "<0xAE>": 177,
+       "<0xAF>": 178,
+       "<0xB0>": 179,
+       "<0xB1>": 180,
+       "<0xB2>": 181,
+       "<0xB3>": 182,
+       "<0xB4>": 183,
+       "<0xB5>": 184,
+       "<0xB6>": 185,
+       "<0xB7>": 186,
+       "<0xB8>": 187,
+       "<0xB9>": 188,
+       "<0xBA>": 189,
+       "<0xBB>": 190,
+       "<0xBC>": 191,
+       "<0xBD>": 192,
+       "<0xBE>": 193,
+       "<0xBF>": 194,
+       "<0xC0>": 195,
+       "<0xC1>": 196,
+       "<0xC2>": 197,
+       "<0xC3>": 198,
+       "<0xC4>": 199,
+       "<0xC5>": 200,
+       "<0xC6>": 201,
+       "<0xC7>": 202,
+       "<0xC8>": 203,
+       "<0xC9>": 204,
+       "<0xCA>": 205,
+       "<0xCB>": 206,
+       "<0xCC>": 207,
+       "<0xCD>": 208,
+       "<0xCE>": 209,
+       "<0xCF>": 210,
+       "<0xD0>": 211,
+       "<0xD1>": 212,
+       "<0xD2>": 213,
+       "<0xD3>": 214,
+       "<0xD4>": 215,
+       "<0xD5>": 216,
+       "<0xD6>": 217,
+       "<0xD7>": 218,
+       "<0xD8>": 219,
+       "<0xD9>": 220,
+       "<0xDA>": 221,
+       "<0xDB>": 222,
+       "<0xDC>": 223,
+       "<0xDD>": 224,
+       "<0xDE>": 225,
+       "<0xDF>": 226,
+       "<0xE0>": 227,
+       "<0xE1>": 228,
+       "<0xE2>": 229,
+       "<0xE3>": 230,
+       "<0xE4>": 231,
+       "<0xE5>": 232,
+       "<0xE6>": 233,
+       "<0xE7>": 234,
+       "<0xE8>": 235,
+       "<0xE9>": 236,
+       "<0xEA>": 237,
+       "<0xEB>": 238,
+       "<0xEC>": 239,
+       "<0xED>": 240,
+       "<0xEE>": 241,
+       "<0xEF>": 242,
+       "<0xF0>": 243,
+       "<0xF1>": 244,
+       "<0xF2>": 245,
+       "<0xF3>": 246,
+       "<0xF4>": 247,
+       "<0xF5>": 248,
+       "<0xF6>": 249,
+       "<0xF7>": 250,
+       "<0xF8>": 251,
+       "<0xF9>": 252,
+       "<0xFA>": 253,
+       "<0xFB>": 254,
+       "<0xFC>": 255,
+       "<0xFD>": 256,
+       "<0xFE>": 257,
+       "<0xFF>": 258,
+       "▁": 259,
+       "e": 260,
+       "t": 261,
+       "a": 262,
+       "o": 263,
+       "i": 264,
+       "n": 265,
+       "s": 266,
+       "r": 267,
+       "h": 268,
+       "l": 269,
+       "d": 270,
+       "c": 271,
+       "u": 272,
+       "m": 273,
+       "f": 274,
+       "p": 275,
+       "g": 276,
+       "y": 277,
+       "w": 278,
+       "b": 279,
+       ".": 280,
+       ",": 281,
+       "v": 282,
+       "k": 283,
+       "T": 284,
+       "I": 285,
+       "A": 286,
+       "S": 287,
+       "-": 288,
+       "1": 289,
+       "C": 290,
+       "0": 291,
+       "x": 292,
+       "’": 293,
+       "P": 294,
+       "M": 295,
+       "2": 296,
+       "B": 297,
+       "W": 298,
+       "E": 299,
+       "D": 300,
+       "H": 301,
+       ")": 302,
+       "(": 303,
+       "F": 304,
+       "O": 305
+     },
+     "merges": []
+   }
+ }
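The layout of the vocab above is worth noting: ids 0–2 are the special tokens, ids 3–258 cover every possible byte as `<0xNN>` tokens, and the learned entries start at id 259. With `"byte_fallback": true` and an empty `merges` list, any character absent from the small learned vocab decomposes into its UTF-8 bytes. A minimal sketch of that fallback rule (plain Python, not the `tokenizers` library):

```python
# Sketch of the byte-fallback id assignment implied by this vocab:
# byte <0xNN> sits at id NN + 3, because ids 0-2 are <unk>, <s>, </s>.

def byte_token_ids(ch: str) -> list[int]:
    """Ids a character falls back to when it is not in the learned vocab."""
    return [b + 3 for b in ch.encode("utf-8")]

# "é" is not in the 306-entry vocab, so it falls back to its two UTF-8
# bytes 0xC3 0xA9, i.e. the tokens <0xC3> (id 198) and <0xA9> (id 172).
assert byte_token_ids("é") == [198, 172]

# Even ASCII "q" is absent from the learned entries (only a subset of
# letters made it in), so it round-trips through <0x71> (id 116).
assert byte_token_ids("q") == [116]
```

On the decode side, the `Sequence` decoder above reverses this: `Replace` turns the Metaspace `▁` back into a space, `ByteFallback` reassembles `<0xNN>` tokens into bytes, and `Fuse` joins the pieces into one string.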
tokenizer_config.json ADDED
@@ -0,0 +1,44 @@
+ {
+   "add_bos_token": false,
+   "add_eos_token": false,
+   "add_prefix_space": null,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": true,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<s>",
+   "chat_template": "{% for message in messages %}{% if message['role'] == 'system' %}{{'<|system|>\n' + message['content'] + '<|end|>\n'}}{% elif message['role'] == 'user' %}{{'<|user|>\n' + message['content'] + '<|end|>\n'}}{% elif message['role'] == 'assistant' %}{{'<|assistant|>\n' + message['content'] + '<|end|>\n'}}{% endif %}{% endfor %}{% if add_generation_prompt %}{{ '<|assistant|>\n' }}{% else %}{{ eos_token }}{% endif %}",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "</s>",
+   "legacy": false,
+   "model_max_length": 4096,
+   "pad_token": "</s>",
+   "padding_side": "left",
+   "sp_model_kwargs": {},
+   "spaces_between_special_tokens": false,
+   "tokenizer_class": "LlamaTokenizer",
+   "unk_token": "<unk>",
+   "use_default_system_prompt": false
+ }
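The `chat_template` in this config is simple enough to trace by hand: each message becomes `<|role|>\n` + content + `<|end|>\n`, then either an `<|assistant|>\n` generation prompt or the EOS token (`</s>` per this config) is appended. Below is a plain-Python sketch of that logic; in practice it would be rendered via `transformers`' `apply_chat_template`, not this hand-rolled function:

```python
# Plain-Python re-implementation (no Jinja) of the chat_template above,
# for tracing what prompts this tokenizer produces.

def render_chat(messages: list[dict], add_generation_prompt: bool = True,
                eos_token: str = "</s>") -> str:
    out = []
    for m in messages:
        # The template only handles these three roles; others are dropped.
        if m["role"] in ("system", "user", "assistant"):
            out.append(f"<|{m['role']}|>\n{m['content']}<|end|>\n")
    # Trailing piece: generation prompt for inference, EOS for training text.
    out.append("<|assistant|>\n" if add_generation_prompt else eos_token)
    return "".join(out)

prompt = render_chat([{"role": "user", "content": "Hi"}])
assert prompt == "<|user|>\nHi<|end|>\n<|assistant|>\n"
```

One quirk worth flagging: the template emits `<|system|>`, `<|user|>`, `<|assistant|>`, and `<|end|>`, but none of those strings appear in the 306-entry vocab of `tokenizer.json` above, so they would be encoded via byte fallback rather than as single special tokens.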