radoslavralev committed on
Commit 58b1e16 · verified · 1 parent: 6513cd6

Training in progress, step 5000
1_Pooling/config.json CHANGED
@@ -1,5 +1,5 @@
1
  {
2
- "word_embedding_dimension": 768,
3
  "pooling_mode_cls_token": true,
4
  "pooling_mode_mean_tokens": false,
5
  "pooling_mode_max_tokens": false,
 
1
  {
2
+ "word_embedding_dimension": 512,
3
  "pooling_mode_cls_token": true,
4
  "pooling_mode_mean_tokens": false,
5
  "pooling_mode_max_tokens": false,
Information-Retrieval_evaluation_val_results.csv CHANGED
@@ -6,3 +6,4 @@ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@3,cosine-Accuracy@5,cosine-Precisi
6
  -1,-1,0.7614,0.82615,0.850775,0.7614,0.7614,0.2753833333333333,0.82615,0.170155,0.850775,0.7614,0.7960862499999959,0.8003843253968239,0.8201550154419872,0.8038332983359062
7
  -1,-1,0.7966,0.87425,0.900575,0.7966,0.7966,0.2914166666666666,0.87425,0.180115,0.900575,0.7966,0.8372962499999956,0.8416481150793601,0.8637140791780538,0.8444611118975183
8
  -1,-1,0.7467,0.81875,0.842275,0.7467,0.7467,0.27291666666666664,0.81875,0.16845500000000002,0.842275,0.7467,0.784354583333328,0.7884659325396792,0.8088581445720447,0.7917670616349511
 
 
6
  -1,-1,0.7614,0.82615,0.850775,0.7614,0.7614,0.2753833333333333,0.82615,0.170155,0.850775,0.7614,0.7960862499999959,0.8003843253968239,0.8201550154419872,0.8038332983359062
7
  -1,-1,0.7966,0.87425,0.900575,0.7966,0.7966,0.2914166666666666,0.87425,0.180115,0.900575,0.7966,0.8372962499999956,0.8416481150793601,0.8637140791780538,0.8444611118975183
8
  -1,-1,0.7467,0.81875,0.842275,0.7467,0.7467,0.27291666666666664,0.81875,0.16845500000000002,0.842275,0.7467,0.784354583333328,0.7884659325396792,0.8088581445720447,0.7917670616349511
9
+ -1,-1,0.83665,0.91045,0.9361,0.83665,0.83665,0.3034833333333333,0.91045,0.18722000000000003,0.9361,0.83665,0.8753945833333286,0.8793089583333286,0.9000254411118587,0.8812821493075779
README.md CHANGED
@@ -5,123 +5,51 @@ tags:
5
  - feature-extraction
6
  - dense
7
  - generated_from_trainer
8
- - dataset_size:713743
9
  - loss:MultipleNegativesRankingLoss
10
- base_model: Alibaba-NLP/gte-modernbert-base
11
  widget:
12
- - source_sentence: 'Abraham Lincoln: Why is the Gettysburg Address so memorable?'
13
  sentences:
14
- - 'Abraham Lincoln: Why is the Gettysburg Address so memorable?'
15
- - What does the Gettysburg Address really mean?
16
- - What is eatalo.com?
17
- - source_sentence: Has the influence of Ancient Carthage in science, math, and society
18
- been underestimated?
19
  sentences:
20
- - How does one earn money online without an investment from home?
21
- - Has the influence of Ancient Carthage in science, math, and society been underestimated?
22
- - Has the influence of the Ancient Etruscans in science and math been underestimated?
23
- - source_sentence: Is there any app that shares charging to others like share it how
24
- we transfer files?
25
  sentences:
26
- - How do you think of Chinese claims that the present Private Arbitration is illegal,
27
- its verdict violates the UNCLOS and is illegal?
28
- - Is there any app that shares charging to others like share it how we transfer
29
- files?
30
- - Are there any platforms that provides end-to-end encryption for file transfer/
31
- sharing?
32
- - source_sentence: Why AAP’s MLA Dinesh Mohaniya has been arrested?
33
  sentences:
34
- - What are your views on the latest sex scandal by AAP MLA Sandeep Kumar?
35
- - What is a dc current? What are some examples?
36
- - Why AAP’s MLA Dinesh Mohaniya has been arrested?
37
- - source_sentence: What is the difference between economic growth and economic development?
38
  sentences:
39
- - How cold can the Gobi Desert get, and how do its average temperatures compare
40
- to the ones in the Simpson Desert?
41
- - the difference between economic growth and economic development is What?
42
- - What is the difference between economic growth and economic development?
43
  pipeline_tag: sentence-similarity
44
  library_name: sentence-transformers
45
- metrics:
46
- - cosine_accuracy@1
47
- - cosine_accuracy@3
48
- - cosine_accuracy@5
49
- - cosine_precision@1
50
- - cosine_precision@3
51
- - cosine_precision@5
52
- - cosine_recall@1
53
- - cosine_recall@3
54
- - cosine_recall@5
55
- - cosine_ndcg@10
56
- - cosine_mrr@1
57
- - cosine_mrr@5
58
- - cosine_mrr@10
59
- - cosine_map@100
60
- model-index:
61
- - name: SentenceTransformer based on Alibaba-NLP/gte-modernbert-base
62
- results:
63
- - task:
64
- type: information-retrieval
65
- name: Information Retrieval
66
- dataset:
67
- name: val
68
- type: val
69
- metrics:
70
- - type: cosine_accuracy@1
71
- value: 0.83665
72
- name: Cosine Accuracy@1
73
- - type: cosine_accuracy@3
74
- value: 0.91045
75
- name: Cosine Accuracy@3
76
- - type: cosine_accuracy@5
77
- value: 0.9361
78
- name: Cosine Accuracy@5
79
- - type: cosine_precision@1
80
- value: 0.83665
81
- name: Cosine Precision@1
82
- - type: cosine_precision@3
83
- value: 0.3034833333333333
84
- name: Cosine Precision@3
85
- - type: cosine_precision@5
86
- value: 0.18722000000000003
87
- name: Cosine Precision@5
88
- - type: cosine_recall@1
89
- value: 0.83665
90
- name: Cosine Recall@1
91
- - type: cosine_recall@3
92
- value: 0.91045
93
- name: Cosine Recall@3
94
- - type: cosine_recall@5
95
- value: 0.9361
96
- name: Cosine Recall@5
97
- - type: cosine_ndcg@10
98
- value: 0.9000254411118587
99
- name: Cosine Ndcg@10
100
- - type: cosine_mrr@1
101
- value: 0.83665
102
- name: Cosine Mrr@1
103
- - type: cosine_mrr@5
104
- value: 0.8753945833333286
105
- name: Cosine Mrr@5
106
- - type: cosine_mrr@10
107
- value: 0.8793089583333286
108
- name: Cosine Mrr@10
109
- - type: cosine_map@100
110
- value: 0.8812821493075779
111
- name: Cosine Map@100
112
  ---
113
 
114
- # SentenceTransformer based on Alibaba-NLP/gte-modernbert-base
115
 
116
- This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Alibaba-NLP/gte-modernbert-base](https://huggingface.co/Alibaba-NLP/gte-modernbert-base). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
117
 
118
  ## Model Details
119
 
120
  ### Model Description
121
  - **Model Type:** Sentence Transformer
122
- - **Base model:** [Alibaba-NLP/gte-modernbert-base](https://huggingface.co/Alibaba-NLP/gte-modernbert-base) <!-- at revision e7f32e3c00f91d699e8c43b53106206bcc72bb22 -->
123
  - **Maximum Sequence Length:** 128 tokens
124
- - **Output Dimensionality:** 768 dimensions
125
  - **Similarity Function:** Cosine Similarity
126
  <!-- - **Training Dataset:** Unknown -->
127
  <!-- - **Language:** Unknown -->
@@ -137,8 +65,8 @@ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [A
137
 
138
  ```
139
  SentenceTransformer(
140
- (0): Transformer({'max_seq_length': 128, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
141
- (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
142
  )
143
  ```
144
 
@@ -157,23 +85,23 @@ Then you can load this model and run inference.
157
  from sentence_transformers import SentenceTransformer
158
 
159
  # Download from the 🤗 Hub
160
- model = SentenceTransformer("redis/model-b-structured")
161
  # Run inference
162
  sentences = [
163
- 'What is the difference between economic growth and economic development?',
164
- 'What is the difference between economic growth and economic development?',
165
- 'the difference between economic growth and economic development is What?',
166
  ]
167
  embeddings = model.encode(sentences)
168
  print(embeddings.shape)
169
- # [3, 768]
170
 
171
  # Get the similarity scores for the embeddings
172
  similarities = model.similarity(embeddings, embeddings)
173
  print(similarities)
174
- # tensor([[ 1.0000, 1.0000, -0.0640],
175
- # [ 1.0000, 1.0000, -0.0640],
176
- # [-0.0640, -0.0640, 1.0000]])
177
  ```
178
 
179
  <!--
@@ -200,32 +128,6 @@ You can finetune this model on your own dataset.
200
  *List how the model may foreseeably be misused and address what users ought not to do with the model.*
201
  -->
202
 
203
- ## Evaluation
204
-
205
- ### Metrics
206
-
207
- #### Information Retrieval
208
-
209
- * Dataset: `val`
210
- * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
211
-
212
- | Metric | Value |
213
- |:-------------------|:--------|
214
- | cosine_accuracy@1 | 0.8367 |
215
- | cosine_accuracy@3 | 0.9104 |
216
- | cosine_accuracy@5 | 0.9361 |
217
- | cosine_precision@1 | 0.8367 |
218
- | cosine_precision@3 | 0.3035 |
219
- | cosine_precision@5 | 0.1872 |
220
- | cosine_recall@1 | 0.8367 |
221
- | cosine_recall@3 | 0.9104 |
222
- | cosine_recall@5 | 0.9361 |
223
- | **cosine_ndcg@10** | **0.9** |
224
- | cosine_mrr@1 | 0.8367 |
225
- | cosine_mrr@5 | 0.8754 |
226
- | cosine_mrr@10 | 0.8793 |
227
- | cosine_map@100 | 0.8813 |
228
-
229
  <!--
230
  ## Bias, Risks and Limitations
231
 
@@ -244,49 +146,23 @@ You can finetune this model on your own dataset.
244
 
245
  #### Unnamed Dataset
246
 
247
- * Size: 713,743 training samples
248
- * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
249
- * Approximate statistics based on the first 1000 samples:
250
- | | anchor | positive | negative |
251
- |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
252
- | type | string | string | string |
253
- | details | <ul><li>min: 6 tokens</li><li>mean: 15.96 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.93 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 16.72 tokens</li><li>max: 59 tokens</li></ul> |
254
- * Samples:
255
- | anchor | positive | negative |
256
- |:-------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------|
257
- | <code>Which one is better Linux OS? Ubuntu or Mint?</code> | <code>Why do you use Linux Mint?</code> | <code>Which one is not better Linux OS ? Ubuntu or Mint ?</code> |
258
- | <code>What is flow?</code> | <code>What is flow?</code> | <code>What are flow lines?</code> |
259
- | <code>How is Trump planning to get Mexico to pay for his supposed wall?</code> | <code>How is it possible for Donald Trump to force Mexico to pay for the wall?</code> | <code>Why do we connect the positive terminal before the negative terminal to ground in a vehicle battery?</code> |
260
- * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
261
- ```json
262
- {
263
- "scale": 7.0,
264
- "similarity_fct": "cos_sim",
265
- "gather_across_devices": false
266
- }
267
- ```
268
-
269
- ### Evaluation Dataset
270
-
271
- #### Unnamed Dataset
272
-
273
- * Size: 40,000 evaluation samples
274
- * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
275
  * Approximate statistics based on the first 1000 samples:
276
- | | anchor | positive | negative |
277
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
278
  | type | string | string | string |
279
- | details | <ul><li>min: 7 tokens</li><li>mean: 15.47 tokens</li><li>max: 70 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.48 tokens</li><li>max: 70 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 16.76 tokens</li><li>max: 67 tokens</li></ul> |
280
  * Samples:
281
- | anchor | positive | negative |
282
- |:-------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------|
283
- | <code>Why are all my questions on Quora marked needing improvement?</code> | <code>Why are all my questions immediately being marked as needing improvement?</code> | <code>For a post-graduate student in IIT, is it allowed to take an external scholarship as a top-up to his/her MHRD assistantship?</code> |
284
- | <code>Can blue butter fly needle with vaccum tube be reused? Is it HIV risk? . Heard the needle is too small to be reused . Had blood draw at clinic?</code> | <code>Can blue butter fly needle with vaccum tube be reused? Is it HIV risk? . Heard the needle is too small to be reused . Had blood draw at clinic?</code> | <code>Can blue butter fly needle with vaccum tube be reused not ? Is it HIV risk ? . Heard the needle is too small to be reused . Had blood draw at clinic ?</code> |
285
- | <code>Why do people still believe the world is flat?</code> | <code>Why are there still people who believe the world is flat?</code> | <code>I'm not able to buy Udemy course .it is not accepting mine and my friends debit card.my card can be used for Flipkart .how to purchase now?</code> |
286
  * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
287
  ```json
288
  {
289
- "scale": 7.0,
290
  "similarity_fct": "cos_sim",
291
  "gather_across_devices": false
292
  }
@@ -295,49 +171,36 @@ You can finetune this model on your own dataset.
295
  ### Training Hyperparameters
296
  #### Non-Default Hyperparameters
297
 
298
- - `eval_strategy`: steps
299
- - `per_device_train_batch_size`: 128
300
- - `per_device_eval_batch_size`: 128
301
- - `learning_rate`: 2e-05
302
- - `weight_decay`: 0.0001
303
- - `max_steps`: 5000
304
- - `warmup_ratio`: 0.1
305
  - `fp16`: True
306
- - `dataloader_drop_last`: True
307
- - `dataloader_num_workers`: 1
308
- - `dataloader_prefetch_factor`: 1
309
- - `load_best_model_at_end`: True
310
- - `optim`: adamw_torch
311
- - `ddp_find_unused_parameters`: False
312
- - `push_to_hub`: True
313
- - `hub_model_id`: redis/model-b-structured
314
- - `eval_on_start`: True
315
 
316
  #### All Hyperparameters
317
  <details><summary>Click to expand</summary>
318
 
319
  - `overwrite_output_dir`: False
320
  - `do_predict`: False
321
- - `eval_strategy`: steps
322
  - `prediction_loss_only`: True
323
- - `per_device_train_batch_size`: 128
324
- - `per_device_eval_batch_size`: 128
325
  - `per_gpu_train_batch_size`: None
326
  - `per_gpu_eval_batch_size`: None
327
  - `gradient_accumulation_steps`: 1
328
  - `eval_accumulation_steps`: None
329
  - `torch_empty_cache_steps`: None
330
- - `learning_rate`: 2e-05
331
- - `weight_decay`: 0.0001
332
  - `adam_beta1`: 0.9
333
  - `adam_beta2`: 0.999
334
  - `adam_epsilon`: 1e-08
335
- - `max_grad_norm`: 1.0
336
- - `num_train_epochs`: 3.0
337
- - `max_steps`: 5000
338
  - `lr_scheduler_type`: linear
339
  - `lr_scheduler_kwargs`: {}
340
- - `warmup_ratio`: 0.1
341
  - `warmup_steps`: 0
342
  - `log_level`: passive
343
  - `log_level_replica`: warning
@@ -365,14 +228,14 @@ You can finetune this model on your own dataset.
365
  - `tpu_num_cores`: None
366
  - `tpu_metrics_debug`: False
367
  - `debug`: []
368
- - `dataloader_drop_last`: True
369
- - `dataloader_num_workers`: 1
370
- - `dataloader_prefetch_factor`: 1
371
  - `past_index`: -1
372
  - `disable_tqdm`: False
373
  - `remove_unused_columns`: True
374
  - `label_names`: None
375
- - `load_best_model_at_end`: True
376
  - `ignore_data_skip`: False
377
  - `fsdp`: []
378
  - `fsdp_min_num_params`: 0
@@ -382,23 +245,23 @@ You can finetune this model on your own dataset.
382
  - `parallelism_config`: None
383
  - `deepspeed`: None
384
  - `label_smoothing_factor`: 0.0
385
- - `optim`: adamw_torch
386
  - `optim_args`: None
387
  - `adafactor`: False
388
  - `group_by_length`: False
389
  - `length_column_name`: length
390
  - `project`: huggingface
391
  - `trackio_space_id`: trackio
392
- - `ddp_find_unused_parameters`: False
393
  - `ddp_bucket_cap_mb`: None
394
  - `ddp_broadcast_buffers`: False
395
  - `dataloader_pin_memory`: True
396
  - `dataloader_persistent_workers`: False
397
  - `skip_memory_metrics`: True
398
  - `use_legacy_prediction_loop`: False
399
- - `push_to_hub`: True
400
  - `resume_from_checkpoint`: None
401
- - `hub_model_id`: redis/model-b-structured
402
  - `hub_strategy`: every_save
403
  - `hub_private_repo`: None
404
  - `hub_always_push`: False
@@ -425,43 +288,31 @@ You can finetune this model on your own dataset.
425
  - `neftune_noise_alpha`: None
426
  - `optim_target_modules`: None
427
  - `batch_eval_metrics`: False
428
- - `eval_on_start`: True
429
  - `use_liger_kernel`: False
430
  - `liger_kernel_config`: None
431
  - `eval_use_gather_object`: False
432
  - `average_tokens_across_devices`: True
433
  - `prompts`: None
434
  - `batch_sampler`: batch_sampler
435
- - `multi_dataset_batch_sampler`: proportional
436
  - `router_mapping`: {}
437
  - `learning_rate_mapping`: {}
438
 
439
  </details>
440
 
441
  ### Training Logs
442
- | Epoch | Step | Training Loss | Validation Loss | val_cosine_ndcg@10 |
443
- |:------:|:----:|:-------------:|:---------------:|:------------------:|
444
- | 0 | 0 | - | 2.2389 | 0.8638 |
445
- | 0.0448 | 250 | 1.0018 | 0.4153 | 0.8910 |
446
- | 0.0897 | 500 | 0.3879 | 0.3664 | 0.8940 |
447
- | 0.1345 | 750 | 0.3583 | 0.3532 | 0.8937 |
448
- | 0.1793 | 1000 | 0.3453 | 0.3371 | 0.8962 |
449
- | 0.2242 | 1250 | 0.3371 | 0.3299 | 0.8956 |
450
- | 0.2690 | 1500 | 0.3283 | 0.3230 | 0.8967 |
451
- | 0.3138 | 1750 | 0.323 | 0.3185 | 0.8974 |
452
- | 0.3587 | 2000 | 0.3205 | 0.3139 | 0.8978 |
453
- | 0.4035 | 2250 | 0.315 | 0.3123 | 0.8985 |
454
- | 0.4484 | 2500 | 0.3132 | 0.3095 | 0.8987 |
455
- | 0.4932 | 2750 | 0.3082 | 0.3071 | 0.8991 |
456
- | 0.5380 | 3000 | 0.3065 | 0.3045 | 0.8985 |
457
- | 0.5829 | 3250 | 0.3041 | 0.3029 | 0.8988 |
458
- | 0.6277 | 3500 | 0.3046 | 0.3015 | 0.8996 |
459
- | 0.6725 | 3750 | 0.3023 | 0.3002 | 0.8995 |
460
- | 0.7174 | 4000 | 0.3017 | 0.2991 | 0.9000 |
461
- | 0.7622 | 4250 | 0.3001 | 0.2985 | 0.8996 |
462
- | 0.8070 | 4500 | 0.3006 | 0.2975 | 0.8999 |
463
- | 0.8519 | 4750 | 0.2983 | 0.2970 | 0.8998 |
464
- | 0.8967 | 5000 | 0.2991 | 0.2966 | 0.9000 |
465
 
466
 
467
  ### Framework Versions
 
5
  - feature-extraction
6
  - dense
7
  - generated_from_trainer
8
+ - dataset_size:100000
9
  - loss:MultipleNegativesRankingLoss
10
+ base_model: prajjwal1/bert-small
11
  widget:
12
+ - source_sentence: How do I calculate IQ?
13
  sentences:
14
+ - What is the easiest way to know my IQ?
15
+ - How do I calculate not IQ ?
16
+ - What are some creative and innovative business ideas with less investment in India?
17
+ - source_sentence: How can I learn martial arts in my home?
 
18
  sentences:
19
+ - How can I learn martial arts by myself?
20
+ - What are the advantages and disadvantages of investing in gold?
21
+ - Can people see that I have looked at their pictures on instagram if I am not following
22
+ them?
23
+ - source_sentence: When Enterprise picks you up do you have to take them back?
24
  sentences:
25
+ - Are there any software Training institute in Tuticorin?
26
+ - When Enterprise picks you up do you have to take them back?
27
+ - When Enterprise picks you up do them have to take youback?
28
+ - source_sentence: What are some non-capital goods?
 
 
 
29
  sentences:
30
+ - What are capital goods?
31
+ - How is the value of [math]\pi[/math] calculated?
32
+ - What are some non-capital goods?
33
+ - source_sentence: What is the QuickBooks technical support phone number in New York?
34
  sentences:
35
+ - What caused the Great Depression?
36
+ - Can I apply for PR in Canada?
37
+ - Which is the best QuickBooks Hosting Support Number in New York?
 
38
  pipeline_tag: sentence-similarity
39
  library_name: sentence-transformers
40
  ---
41
 
42
+ # SentenceTransformer based on prajjwal1/bert-small
43
 
44
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [prajjwal1/bert-small](https://huggingface.co/prajjwal1/bert-small). It maps sentences & paragraphs to a 512-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
45
 
46
  ## Model Details
47
 
48
  ### Model Description
49
  - **Model Type:** Sentence Transformer
50
+ - **Base model:** [prajjwal1/bert-small](https://huggingface.co/prajjwal1/bert-small) <!-- at revision 0ec5f86f27c1a77d704439db5e01c307ea11b9d4 -->
51
  - **Maximum Sequence Length:** 128 tokens
52
+ - **Output Dimensionality:** 512 dimensions
53
  - **Similarity Function:** Cosine Similarity
54
  <!-- - **Training Dataset:** Unknown -->
55
  <!-- - **Language:** Unknown -->
 
65
 
66
  ```
67
  SentenceTransformer(
68
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False, 'architecture': 'BertModel'})
69
+ (1): Pooling({'word_embedding_dimension': 512, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
70
  )
71
  ```
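
This module layout can also be assembled by hand. The following is a minimal sketch, assuming the `prajjwal1/bert-small` base model and the 128-token maximum sequence length listed above; it is illustrative rather than a byte-for-byte reconstruction of this checkpoint.

```python
# Minimal sketch: build the Transformer + CLS-pooling stack shown above.
# Assumes the base model and max sequence length stated in this card.
from sentence_transformers import SentenceTransformer
from sentence_transformers.models import Transformer, Pooling

word_embedding_model = Transformer("prajjwal1/bert-small", max_seq_length=128)
pooling_model = Pooling(
    word_embedding_model.get_word_embedding_dimension(),
    pooling_mode_cls_token=True,
    pooling_mode_mean_tokens=False,
    pooling_mode_max_tokens=False,
)
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
print(model)  # prints a Transformer module followed by a Pooling module
```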
72
 
 
85
  from sentence_transformers import SentenceTransformer
86
 
87
  # Download from the 🤗 Hub
88
+ model = SentenceTransformer("sentence_transformers_model_id")
89
  # Run inference
90
  sentences = [
91
+ 'What is the QuickBooks technical support phone number in New York?',
92
+ 'Which is the best QuickBooks Hosting Support Number in New York?',
93
+ 'Can I apply for PR in Canada?',
94
  ]
95
  embeddings = model.encode(sentences)
96
  print(embeddings.shape)
97
+ # [3, 512]
98
 
99
  # Get the similarity scores for the embeddings
100
  similarities = model.similarity(embeddings, embeddings)
101
  print(similarities)
102
+ # tensor([[1.0000, 0.8563, 0.0594],
103
+ # [0.8563, 1.0000, 0.1245],
104
+ # [0.0594, 0.1245, 1.0000]])
105
  ```
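
The embeddings from the example above can also drive a small semantic-search lookup. The sketch below reuses the same placeholder model ID and widget sentences from this card; `util.semantic_search` is the standard sentence-transformers helper for this.

```python
# Minimal sketch: semantic search over a tiny corpus with this model.
# The corpus reuses widget sentences from this card purely for illustration.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence_transformers_model_id")  # same placeholder as above
corpus = [
    "Which is the best QuickBooks Hosting Support Number in New York?",
    "Can I apply for PR in Canada?",
    "What caused the Great Depression?",
]
query = "What is the QuickBooks technical support phone number in New York?"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(corpus[hit["corpus_id"]], round(hit["score"], 4))
```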
106
 
107
  <!--
 
128
  *List how the model may foreseeably be misused and address what users ought not to do with the model.*
129
  -->
130
 
131
  <!--
132
  ## Bias, Risks and Limitations
133
 
 
146
 
147
  #### Unnamed Dataset
148
 
149
+ * Size: 100,000 training samples
150
+ * Columns: <code>sentence_0</code>, <code>sentence_1</code>, and <code>sentence_2</code>
151
  * Approximate statistics based on the first 1000 samples:
152
+ | | sentence_0 | sentence_1 | sentence_2 |
153
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
154
  | type | string | string | string |
155
+ | details | <ul><li>min: 6 tokens</li><li>mean: 15.79 tokens</li><li>max: 66 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.68 tokens</li><li>max: 66 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 16.37 tokens</li><li>max: 67 tokens</li></ul> |
156
  * Samples:
157
+ | sentence_0 | sentence_1 | sentence_2 |
158
+ |:-----------------------------------------------------------------|:-----------------------------------------------------------------|:----------------------------------------------------------------------------------|
159
+ | <code>Is masturbating bad for boys?</code> | <code>Is masturbating bad for boys?</code> | <code>How harmful or unhealthy is masturbation?</code> |
160
+ | <code>Does a train engine move in reverse?</code> | <code>Does a train engine move in reverse?</code> | <code>Time moves forward, not in reverse. Doesn't that make time a vector?</code> |
161
+ | <code>What is the most badass thing anyone has ever done?</code> | <code>What is the most badass thing anyone has ever done?</code> | <code>anyone is the most badass thing Whathas ever done?</code> |
162
  * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
163
  ```json
164
  {
165
+ "scale": 20.0,
166
  "similarity_fct": "cos_sim",
167
  "gather_across_devices": false
168
  }
 
171
  ### Training Hyperparameters
172
  #### Non-Default Hyperparameters
173
 
174
+ - `per_device_train_batch_size`: 64
175
+ - `per_device_eval_batch_size`: 64
176
  - `fp16`: True
177
+ - `multi_dataset_batch_sampler`: round_robin
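
These non-default settings map onto `SentenceTransformerTrainingArguments` roughly as sketched below; `output_dir` is a hypothetical path, and the epoch count is taken from the full hyperparameter list that follows.

```python
# Minimal sketch: training arguments mirroring the non-default settings above.
# "output_dir" is hypothetical; num_train_epochs=3 comes from the full list below.
from sentence_transformers.training_args import (
    SentenceTransformerTrainingArguments,
    MultiDatasetBatchSamplers,
)

args = SentenceTransformerTrainingArguments(
    output_dir="models/bert-small-mnrl",  # hypothetical
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=3,
    fp16=True,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)
```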
178
 
179
  #### All Hyperparameters
180
  <details><summary>Click to expand</summary>
181
 
182
  - `overwrite_output_dir`: False
183
  - `do_predict`: False
184
+ - `eval_strategy`: no
185
  - `prediction_loss_only`: True
186
+ - `per_device_train_batch_size`: 64
187
+ - `per_device_eval_batch_size`: 64
188
  - `per_gpu_train_batch_size`: None
189
  - `per_gpu_eval_batch_size`: None
190
  - `gradient_accumulation_steps`: 1
191
  - `eval_accumulation_steps`: None
192
  - `torch_empty_cache_steps`: None
193
+ - `learning_rate`: 5e-05
194
+ - `weight_decay`: 0.0
195
  - `adam_beta1`: 0.9
196
  - `adam_beta2`: 0.999
197
  - `adam_epsilon`: 1e-08
198
+ - `max_grad_norm`: 1
199
+ - `num_train_epochs`: 3
200
+ - `max_steps`: -1
201
  - `lr_scheduler_type`: linear
202
  - `lr_scheduler_kwargs`: {}
203
+ - `warmup_ratio`: 0.0
204
  - `warmup_steps`: 0
205
  - `log_level`: passive
206
  - `log_level_replica`: warning
 
228
  - `tpu_num_cores`: None
229
  - `tpu_metrics_debug`: False
230
  - `debug`: []
231
+ - `dataloader_drop_last`: False
232
+ - `dataloader_num_workers`: 0
233
+ - `dataloader_prefetch_factor`: None
234
  - `past_index`: -1
235
  - `disable_tqdm`: False
236
  - `remove_unused_columns`: True
237
  - `label_names`: None
238
+ - `load_best_model_at_end`: False
239
  - `ignore_data_skip`: False
240
  - `fsdp`: []
241
  - `fsdp_min_num_params`: 0
 
245
  - `parallelism_config`: None
246
  - `deepspeed`: None
247
  - `label_smoothing_factor`: 0.0
248
+ - `optim`: adamw_torch_fused
249
  - `optim_args`: None
250
  - `adafactor`: False
251
  - `group_by_length`: False
252
  - `length_column_name`: length
253
  - `project`: huggingface
254
  - `trackio_space_id`: trackio
255
+ - `ddp_find_unused_parameters`: None
256
  - `ddp_bucket_cap_mb`: None
257
  - `ddp_broadcast_buffers`: False
258
  - `dataloader_pin_memory`: True
259
  - `dataloader_persistent_workers`: False
260
  - `skip_memory_metrics`: True
261
  - `use_legacy_prediction_loop`: False
262
+ - `push_to_hub`: False
263
  - `resume_from_checkpoint`: None
264
+ - `hub_model_id`: None
265
  - `hub_strategy`: every_save
266
  - `hub_private_repo`: None
267
  - `hub_always_push`: False
 
288
  - `neftune_noise_alpha`: None
289
  - `optim_target_modules`: None
290
  - `batch_eval_metrics`: False
291
+ - `eval_on_start`: False
292
  - `use_liger_kernel`: False
293
  - `liger_kernel_config`: None
294
  - `eval_use_gather_object`: False
295
  - `average_tokens_across_devices`: True
296
  - `prompts`: None
297
  - `batch_sampler`: batch_sampler
298
+ - `multi_dataset_batch_sampler`: round_robin
299
  - `router_mapping`: {}
300
  - `learning_rate_mapping`: {}
301
 
302
  </details>
303
 
304
  ### Training Logs
305
+ | Epoch | Step | Training Loss |
306
+ |:------:|:----:|:-------------:|
307
+ | 0.3199 | 500 | 0.4294 |
308
+ | 0.6398 | 1000 | 0.1268 |
309
+ | 0.9597 | 1500 | 0.1 |
310
+ | 1.2796 | 2000 | 0.0792 |
311
+ | 1.5995 | 2500 | 0.0706 |
312
+ | 1.9194 | 3000 | 0.0687 |
313
+ | 2.2393 | 3500 | 0.0584 |
314
+ | 2.5592 | 4000 | 0.057 |
315
+ | 2.8791 | 4500 | 0.0581 |
316
 
317
 
318
  ### Framework Versions
config.json CHANGED
@@ -1,45 +1,24 @@
1
  {
2
  "architectures": [
3
- "ModernBertModel"
4
  ],
5
- "attention_bias": false,
6
- "attention_dropout": 0.0,
7
- "bos_token_id": 50281,
8
- "classifier_activation": "gelu",
9
- "classifier_bias": false,
10
- "classifier_dropout": 0.0,
11
- "classifier_pooling": "mean",
12
- "cls_token_id": 50281,
13
- "decoder_bias": true,
14
- "deterministic_flash_attn": false,
15
  "dtype": "float32",
16
- "embedding_dropout": 0.0,
17
- "eos_token_id": 50282,
18
- "global_attn_every_n_layers": 3,
19
- "global_rope_theta": 160000.0,
20
- "gradient_checkpointing": false,
21
- "hidden_activation": "gelu",
22
- "hidden_size": 768,
23
- "initializer_cutoff_factor": 2.0,
24
  "initializer_range": 0.02,
25
- "intermediate_size": 1152,
26
- "layer_norm_eps": 1e-05,
27
- "local_attention": 128,
28
- "local_rope_theta": 10000.0,
29
- "max_position_embeddings": 8192,
30
- "mlp_bias": false,
31
- "mlp_dropout": 0.0,
32
- "model_type": "modernbert",
33
- "norm_bias": false,
34
- "norm_eps": 1e-05,
35
  "num_attention_heads": 12,
36
- "num_hidden_layers": 22,
37
- "pad_token_id": 50283,
38
  "position_embedding_type": "absolute",
39
- "repad_logits_with_grad": false,
40
- "sep_token_id": 50282,
41
- "sparse_pred_ignore_index": -100,
42
- "sparse_prediction": false,
43
  "transformers_version": "4.57.3",
44
- "vocab_size": 50368
 
 
45
  }
 
1
  {
2
  "architectures": [
3
+ "BertModel"
4
  ],
5
+ "attention_probs_dropout_prob": 0.1,
6
+ "classifier_dropout": null,
7
  "dtype": "float32",
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.1,
10
+ "hidden_size": 384,
11
  "initializer_range": 0.02,
12
+ "intermediate_size": 1536,
13
+ "layer_norm_eps": 1e-12,
14
+ "max_position_embeddings": 512,
15
+ "model_type": "bert",
16
  "num_attention_heads": 12,
17
+ "num_hidden_layers": 12,
18
+ "pad_token_id": 0,
19
  "position_embedding_type": "absolute",
20
  "transformers_version": "4.57.3",
21
+ "type_vocab_size": 2,
22
+ "use_cache": true,
23
+ "vocab_size": 30522
24
  }
config_sentence_transformers.json CHANGED
@@ -1,4 +1,5 @@
1
  {
 
2
  "__version__": {
3
  "sentence_transformers": "5.2.0",
4
  "transformers": "4.57.3",
@@ -9,6 +10,5 @@
9
  "document": ""
10
  },
11
  "default_prompt_name": null,
12
- "similarity_fn_name": "cosine",
13
- "model_type": "SentenceTransformer"
14
  }
 
1
  {
2
+ "model_type": "SentenceTransformer",
3
  "__version__": {
4
  "sentence_transformers": "5.2.0",
5
  "transformers": "4.57.3",
 
10
  "document": ""
11
  },
12
  "default_prompt_name": null,
13
+ "similarity_fn_name": "cosine"
 
14
  }
eval/Information-Retrieval_evaluation_val_results.csv CHANGED
@@ -554,3 +554,24 @@ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@3,cosine-Accuracy@5,cosine-Precisi
554
  0.8070301291248206,4500,0.836575,0.91035,0.936225,0.836575,0.836575,0.3034499999999999,0.91035,0.18724500000000002,0.936225,0.836575,0.8753933333333296,0.8792632638888849,0.8999479246388732,0.8812381232525891
555
  0.8518651362984218,4750,0.8364,0.910025,0.93625,0.8364,0.8364,0.3033416666666667,0.910025,0.18725000000000003,0.93625,0.8364,0.875251249999996,0.879122916666663,0.8998371123346441,0.8811114899752792
556
  0.896700143472023,5000,0.83665,0.91045,0.9361,0.83665,0.83665,0.3034833333333333,0.91045,0.18722000000000003,0.9361,0.83665,0.8753945833333286,0.8793089583333286,0.9000254411118587,0.8812821493075779
554
  0.8070301291248206,4500,0.836575,0.91035,0.936225,0.836575,0.836575,0.3034499999999999,0.91035,0.18724500000000002,0.936225,0.836575,0.8753933333333296,0.8792632638888849,0.8999479246388732,0.8812381232525891
555
  0.8518651362984218,4750,0.8364,0.910025,0.93625,0.8364,0.8364,0.3033416666666667,0.910025,0.18725000000000003,0.93625,0.8364,0.875251249999996,0.879122916666663,0.8998371123346441,0.8811114899752792
556
  0.896700143472023,5000,0.83665,0.91045,0.9361,0.83665,0.83665,0.3034833333333333,0.91045,0.18722000000000003,0.9361,0.83665,0.8753945833333286,0.8793089583333286,0.9000254411118587,0.8812821493075779
557
+ 0,0,0.756925,0.886,0.915975,0.756925,0.756925,0.2953333333333333,0.886,0.183195,0.915975,0.756925,0.823237499999992,0.8273541269841215,0.8566105435301875,0.8297538154910199
558
+ 0.04483500717360115,250,0.797525,0.890075,0.916275,0.797525,0.797525,0.2966916666666667,0.890075,0.18325500000000003,0.916275,0.797525,0.8455487499999935,0.8495717063492002,0.8730393801633339,0.8519482052146723
559
+ 0.0896700143472023,500,0.82065,0.899625,0.92225,0.82065,0.82065,0.299875,0.899625,0.18445000000000006,0.92225,0.82065,0.8612837499999955,0.8653779265872968,0.8864626207459981,0.8676268538103573
560
+ 0.13450502152080343,750,0.82245,0.90005,0.925075,0.82245,0.82245,0.3000166666666666,0.90005,0.18501500000000007,0.925075,0.82245,0.8630183333333307,0.8668782837301547,0.8879172543461453,0.8691401302878512
561
+ 0.1793400286944046,1000,0.824,0.90095,0.926075,0.824,0.824,0.3003166666666666,0.90095,0.18521500000000002,0.926075,0.824,0.864193333333329,0.8680340873015829,0.8890159154358417,0.8702850493416893
562
+ 0.22417503586800575,1250,0.8251,0.900975,0.92625,0.8251,0.8251,0.30032499999999995,0.900975,0.18525000000000003,0.92625,0.8251,0.8648804166666639,0.8687764880952341,0.8897092729977382,0.8709977164602416
563
+ 0.26901004304160686,1500,0.825625,0.9012,0.9268,0.825625,0.825625,0.30039999999999994,0.9012,0.18536000000000002,0.9268,0.825625,0.865259166666663,0.8691541369047578,0.8900874550807586,0.8713935357378468
564
+ 0.31384505021520803,1750,0.826125,0.9013,0.927225,0.826125,0.826125,0.3004333333333333,0.9013,0.18544500000000003,0.927225,0.826125,0.865661249999996,0.8695618253968221,0.8905357737452925,0.8717678342233259
565
+ 0.3586800573888092,2000,0.826175,0.901025,0.926975,0.826175,0.826175,0.3003416666666666,0.901025,0.18539500000000003,0.926975,0.826175,0.8655199999999963,0.8694849007936466,0.8904921468586298,0.8716916370424884
566
+ 0.4035150645624103,2250,0.8267,0.901925,0.9278,0.8267,0.8267,0.3006416666666666,0.901925,0.18556000000000003,0.9278,0.8267,0.86619708333333,0.8700837400793613,0.8910357094734063,0.8722924784004461
567
+ 0.4483500717360115,2500,0.826325,0.9016,0.9274,0.826325,0.826325,0.3005333333333333,0.9016,0.18548,0.9274,0.826325,0.8658512499999957,0.8698226884920585,0.8908745638086586,0.8720233519109022
568
+ 0.4931850789096126,2750,0.8261,0.901675,0.9275,0.8261,0.8261,0.3005583333333333,0.901675,0.1855,0.9275,0.8261,0.8657958333333294,0.869753412698409,0.8908133360196127,0.871955705362073
569
+ 0.5380200860832137,3000,0.82665,0.901875,0.927675,0.82665,0.82665,0.300625,0.901875,0.185535,0.927675,0.82665,0.8662349999999955,0.8701861805555516,0.8911847325189749,0.8723919812776054
570
+ 0.582855093256815,3250,0.826975,0.9024,0.9278,0.826975,0.826975,0.30079999999999996,0.9024,0.18556000000000003,0.9278,0.826975,0.8663795833333295,0.8703611408730112,0.8914074480899707,0.8725399895706075
571
+ 0.6276901004304161,3500,0.8276,0.903025,0.928675,0.8276,0.8276,0.3010083333333333,0.903025,0.18573500000000004,0.928675,0.8276,0.8671770833333294,0.8711155456349156,0.8921604046034279,0.8732633623176634
572
+ 0.6725251076040172,3750,0.827375,0.90265,0.928025,0.827375,0.827375,0.3008833333333333,0.90265,0.18560500000000002,0.928025,0.827375,0.8668445833333284,0.8708662202380896,0.8919061132291579,0.8730258603104377
573
+ 0.7173601147776184,4000,0.827575,0.9031,0.9284,0.827575,0.827575,0.30103333333333326,0.9031,0.18567999999999998,0.9284,0.827575,0.8670654166666621,0.8710379265872962,0.892045030666255,0.8732157548668783
574
+ 0.7621951219512195,4250,0.8276,0.90295,0.92845,0.8276,0.8276,0.3009833333333333,0.90295,0.18569000000000002,0.92845,0.8276,0.8671320833333284,0.8711172519841215,0.8921526664242764,0.8732736127760576
575
+ 0.8070301291248206,4500,0.82765,0.902975,0.928525,0.82765,0.82765,0.3009916666666666,0.902975,0.185705,0.928525,0.82765,0.8670999999999953,0.871106845238089,0.8922032735564264,0.8732435780606005
576
+ 0.8518651362984218,4750,0.827625,0.902975,0.9284,0.827625,0.827625,0.3009916666666666,0.902975,0.18568,0.9284,0.827625,0.8671220833333285,0.8711407440476129,0.8921952489654028,0.8732955747232074
577
+ 0.896700143472023,5000,0.827675,0.903,0.928425,0.827675,0.827675,0.30099999999999993,0.903,0.18568500000000004,0.928425,0.827675,0.8671804166666619,0.8711970039682481,0.8922532953454642,0.87334664003711
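
The rows appended above can be inspected directly from the CSV; a minimal pandas sketch is shown below, selecting the NDCG column by pattern rather than hard-coding the evaluator's exact header names.

```python
# Minimal sketch: track NDCG@10 over training steps from this evaluation CSV.
# Column names are read from the file's own header rather than hard-coded.
import pandas as pd

df = pd.read_csv("eval/Information-Retrieval_evaluation_val_results.csv")
ndcg_cols = [c for c in df.columns if "ndcg" in c.lower()]
print(df[["epoch", "steps", *ndcg_cols]].tail(10))
```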
final_metrics.json CHANGED
@@ -1,16 +1,16 @@
1
  {
2
- "val_cosine_accuracy@1": 0.7467,
3
- "val_cosine_accuracy@3": 0.81875,
4
- "val_cosine_accuracy@5": 0.842275,
5
- "val_cosine_precision@1": 0.7467,
6
- "val_cosine_precision@3": 0.27291666666666664,
7
- "val_cosine_precision@5": 0.16845500000000002,
8
- "val_cosine_recall@1": 0.7467,
9
- "val_cosine_recall@3": 0.81875,
10
- "val_cosine_recall@5": 0.842275,
11
- "val_cosine_ndcg@10": 0.8088581445720447,
12
- "val_cosine_mrr@1": 0.7467,
13
- "val_cosine_mrr@5": 0.784354583333328,
14
- "val_cosine_mrr@10": 0.7884659325396792,
15
- "val_cosine_map@100": 0.7917670616349511
16
  }
 
1
  {
2
+ "val_cosine_accuracy@1": 0.83665,
3
+ "val_cosine_accuracy@3": 0.91045,
4
+ "val_cosine_accuracy@5": 0.9361,
5
+ "val_cosine_precision@1": 0.83665,
6
+ "val_cosine_precision@3": 0.3034833333333333,
7
+ "val_cosine_precision@5": 0.18722000000000003,
8
+ "val_cosine_recall@1": 0.83665,
9
+ "val_cosine_recall@3": 0.91045,
10
+ "val_cosine_recall@5": 0.9361,
11
+ "val_cosine_ndcg@10": 0.9000254411118587,
12
+ "val_cosine_mrr@1": 0.83665,
13
+ "val_cosine_mrr@5": 0.8753945833333286,
14
+ "val_cosine_mrr@10": 0.8793089583333286,
15
+ "val_cosine_map@100": 0.8812821493075779
16
  }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:618a711cb46a993dfbbd3a4d48e58bc1574588b606af9e32ce39fe451134f747
3
- size 596070136
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:325ea67b4c6fb434aa13d441e188dab3f657079050a07c5c0072112cfb0cb217
3
+ size 133462128
special_tokens_map.json CHANGED
@@ -8,7 +8,7 @@
8
  },
9
  "mask_token": {
10
  "content": "[MASK]",
11
- "lstrip": true,
12
  "normalized": false,
13
  "rstrip": false,
14
  "single_word": false
 
8
  },
9
  "mask_token": {
10
  "content": "[MASK]",
11
+ "lstrip": false,
12
  "normalized": false,
13
  "rstrip": false,
14
  "single_word": false
tokenizer.json CHANGED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json CHANGED
@@ -1,230 +1,14 @@
1
  {
2
  "added_tokens_decoder": {
3
  "0": {
4
- "content": "|||IP_ADDRESS|||",
5
- "lstrip": false,
6
- "normalized": true,
7
- "rstrip": false,
8
- "single_word": false,
9
- "special": false
10
- },
11
- "1": {
12
- "content": "<|padding|>",
13
- "lstrip": false,
14
- "normalized": false,
15
- "rstrip": false,
16
- "single_word": false,
17
- "special": true
18
- },
19
- "50254": {
20
- "content": " ",
21
- "lstrip": false,
22
- "normalized": true,
23
- "rstrip": false,
24
- "single_word": false,
25
- "special": false
26
- },
27
- "50255": {
28
- "content": " ",
29
- "lstrip": false,
30
- "normalized": true,
31
- "rstrip": false,
32
- "single_word": false,
33
- "special": false
34
- },
35
- "50256": {
36
- "content": " ",
37
- "lstrip": false,
38
- "normalized": true,
39
- "rstrip": false,
40
- "single_word": false,
41
- "special": false
42
- },
43
- "50257": {
44
- "content": " ",
45
- "lstrip": false,
46
- "normalized": true,
47
- "rstrip": false,
48
- "single_word": false,
49
- "special": false
50
- },
51
- "50258": {
52
- "content": " ",
53
- "lstrip": false,
54
- "normalized": true,
55
- "rstrip": false,
56
- "single_word": false,
57
- "special": false
58
- },
59
- "50259": {
60
- "content": " ",
61
- "lstrip": false,
62
- "normalized": true,
63
- "rstrip": false,
64
- "single_word": false,
65
- "special": false
66
- },
67
- "50260": {
68
- "content": " ",
69
- "lstrip": false,
70
- "normalized": true,
71
- "rstrip": false,
72
- "single_word": false,
73
- "special": false
74
- },
75
- "50261": {
76
- "content": " ",
77
- "lstrip": false,
78
- "normalized": true,
79
- "rstrip": false,
80
- "single_word": false,
81
- "special": false
82
- },
83
- "50262": {
84
- "content": " ",
85
- "lstrip": false,
86
- "normalized": true,
87
- "rstrip": false,
88
- "single_word": false,
89
- "special": false
90
- },
91
- "50263": {
92
- "content": " ",
93
- "lstrip": false,
94
- "normalized": true,
95
- "rstrip": false,
96
- "single_word": false,
97
- "special": false
98
- },
99
- "50264": {
100
- "content": " ",
101
- "lstrip": false,
102
- "normalized": true,
103
- "rstrip": false,
104
- "single_word": false,
105
- "special": false
106
- },
107
- "50265": {
108
- "content": " ",
109
- "lstrip": false,
110
- "normalized": true,
111
- "rstrip": false,
112
- "single_word": false,
113
- "special": false
114
- },
115
- "50266": {
116
- "content": " ",
117
- "lstrip": false,
118
- "normalized": true,
119
- "rstrip": false,
120
- "single_word": false,
121
- "special": false
122
- },
123
- "50267": {
124
- "content": " ",
125
- "lstrip": false,
126
- "normalized": true,
127
- "rstrip": false,
128
- "single_word": false,
129
- "special": false
130
- },
131
- "50268": {
132
- "content": " ",
133
- "lstrip": false,
134
- "normalized": true,
135
- "rstrip": false,
136
- "single_word": false,
137
- "special": false
138
- },
139
- "50269": {
140
- "content": " ",
141
- "lstrip": false,
142
- "normalized": true,
143
- "rstrip": false,
144
- "single_word": false,
145
- "special": false
146
- },
147
- "50270": {
148
- "content": " ",
149
- "lstrip": false,
150
- "normalized": true,
151
- "rstrip": false,
152
- "single_word": false,
153
- "special": false
154
- },
155
- "50271": {
156
- "content": " ",
157
- "lstrip": false,
158
- "normalized": true,
159
- "rstrip": false,
160
- "single_word": false,
161
- "special": false
162
- },
163
- "50272": {
164
- "content": " ",
165
- "lstrip": false,
166
- "normalized": true,
167
- "rstrip": false,
168
- "single_word": false,
169
- "special": false
170
- },
171
- "50273": {
172
- "content": " ",
173
- "lstrip": false,
174
- "normalized": true,
175
- "rstrip": false,
176
- "single_word": false,
177
- "special": false
178
- },
179
- "50274": {
180
- "content": " ",
181
- "lstrip": false,
182
- "normalized": true,
183
- "rstrip": false,
184
- "single_word": false,
185
- "special": false
186
- },
187
- "50275": {
188
- "content": " ",
189
- "lstrip": false,
190
- "normalized": true,
191
- "rstrip": false,
192
- "single_word": false,
193
- "special": false
194
- },
195
- "50276": {
196
- "content": " ",
197
- "lstrip": false,
198
- "normalized": true,
199
- "rstrip": false,
200
- "single_word": false,
201
- "special": false
202
- },
203
- "50277": {
204
- "content": "|||EMAIL_ADDRESS|||",
205
- "lstrip": false,
206
- "normalized": true,
207
- "rstrip": false,
208
- "single_word": false,
209
- "special": false
210
- },
211
- "50278": {
212
- "content": "|||PHONE_NUMBER|||",
213
- "lstrip": false,
214
- "normalized": true,
215
- "rstrip": false,
216
- "single_word": false,
217
- "special": false
218
- },
219
- "50279": {
220
- "content": "<|endoftext|>",
221
  "lstrip": false,
222
  "normalized": false,
223
  "rstrip": false,
224
  "single_word": false,
225
  "special": true
226
  },
227
- "50280": {
228
  "content": "[UNK]",
229
  "lstrip": false,
230
  "normalized": false,
@@ -232,7 +16,7 @@
232
  "single_word": false,
233
  "special": true
234
  },
235
- "50281": {
236
  "content": "[CLS]",
237
  "lstrip": false,
238
  "normalized": false,
@@ -240,7 +24,7 @@
240
  "single_word": false,
241
  "special": true
242
  },
243
- "50282": {
244
  "content": "[SEP]",
245
  "lstrip": false,
246
  "normalized": false,
@@ -248,698 +32,34 @@
248
  "single_word": false,
249
  "special": true
250
  },
251
- "50283": {
252
- "content": "[PAD]",
253
- "lstrip": false,
254
- "normalized": false,
255
- "rstrip": false,
256
- "single_word": false,
257
- "special": true
258
- },
259
- "50284": {
260
  "content": "[MASK]",
261
- "lstrip": true,
262
  "normalized": false,
263
  "rstrip": false,
264
  "single_word": false,
265
  "special": true
266
- },
267
- "50285": {
268
- "content": "[unused0]",
269
- "lstrip": false,
270
- "normalized": true,
271
- "rstrip": false,
272
- "single_word": false,
273
- "special": false
274
- },
275
- "50286": {
276
- "content": "[unused1]",
277
- "lstrip": false,
278
- "normalized": true,
279
- "rstrip": false,
280
- "single_word": false,
281
- "special": false
282
- },
283
- "50287": {
284
- "content": "[unused2]",
285
- "lstrip": false,
286
- "normalized": true,
287
- "rstrip": false,
288
- "single_word": false,
289
- "special": false
290
- },
291
- "50288": {
292
- "content": "[unused3]",
293
- "lstrip": false,
294
- "normalized": true,
295
- "rstrip": false,
296
- "single_word": false,
297
- "special": false
298
- },
299
- "50289": {
300
- "content": "[unused4]",
301
- "lstrip": false,
302
- "normalized": true,
303
- "rstrip": false,
304
- "single_word": false,
305
- "special": false
306
- },
307
- "50290": {
308
- "content": "[unused5]",
309
- "lstrip": false,
310
- "normalized": true,
311
- "rstrip": false,
312
- "single_word": false,
313
- "special": false
314
- },
315
- "50291": {
316
- "content": "[unused6]",
317
- "lstrip": false,
318
- "normalized": true,
319
- "rstrip": false,
320
- "single_word": false,
321
- "special": false
322
- },
323
- "50292": {
324
- "content": "[unused7]",
325
- "lstrip": false,
326
- "normalized": true,
327
- "rstrip": false,
328
- "single_word": false,
329
- "special": false
330
- },
331
- "50293": {
332
- "content": "[unused8]",
333
- "lstrip": false,
334
- "normalized": true,
335
- "rstrip": false,
336
- "single_word": false,
337
- "special": false
338
- },
339
- "50294": {
340
- "content": "[unused9]",
341
- "lstrip": false,
342
- "normalized": true,
343
- "rstrip": false,
344
- "single_word": false,
345
- "special": false
346
- },
347
- "50295": {
348
- "content": "[unused10]",
349
- "lstrip": false,
350
- "normalized": true,
351
- "rstrip": false,
352
- "single_word": false,
353
- "special": false
354
- },
355
- "50296": {
356
- "content": "[unused11]",
357
- "lstrip": false,
358
- "normalized": true,
359
- "rstrip": false,
360
- "single_word": false,
361
- "special": false
362
- },
363
- "50297": {
364
- "content": "[unused12]",
365
- "lstrip": false,
366
- "normalized": true,
367
- "rstrip": false,
368
- "single_word": false,
369
- "special": false
370
- },
371
- "50298": {
372
- "content": "[unused13]",
373
- "lstrip": false,
374
- "normalized": true,
375
- "rstrip": false,
376
- "single_word": false,
377
- "special": false
378
- },
379
- "50299": {
380
- "content": "[unused14]",
381
- "lstrip": false,
382
- "normalized": true,
383
- "rstrip": false,
384
- "single_word": false,
385
- "special": false
386
- },
387
- "50300": {
388
- "content": "[unused15]",
389
- "lstrip": false,
390
- "normalized": true,
391
- "rstrip": false,
392
- "single_word": false,
393
- "special": false
394
- },
395
- "50301": {
396
- "content": "[unused16]",
397
- "lstrip": false,
398
- "normalized": true,
399
- "rstrip": false,
400
- "single_word": false,
401
- "special": false
402
- },
403
- "50302": {
404
- "content": "[unused17]",
405
- "lstrip": false,
406
- "normalized": true,
407
- "rstrip": false,
408
- "single_word": false,
409
- "special": false
410
- },
411
- "50303": {
412
- "content": "[unused18]",
413
- "lstrip": false,
414
- "normalized": true,
415
- "rstrip": false,
416
- "single_word": false,
417
- "special": false
418
- },
419
- "50304": {
420
- "content": "[unused19]",
421
- "lstrip": false,
422
- "normalized": true,
423
- "rstrip": false,
424
- "single_word": false,
425
- "special": false
426
- },
427
- "50305": {
428
- "content": "[unused20]",
429
- "lstrip": false,
430
- "normalized": true,
431
- "rstrip": false,
432
- "single_word": false,
433
- "special": false
434
- },
435
- "50306": {
436
- "content": "[unused21]",
437
- "lstrip": false,
438
- "normalized": true,
439
- "rstrip": false,
440
- "single_word": false,
441
- "special": false
442
- },
443
- "50307": {
444
- "content": "[unused22]",
445
- "lstrip": false,
446
- "normalized": true,
447
- "rstrip": false,
448
- "single_word": false,
449
- "special": false
450
- },
451
- "50308": {
452
- "content": "[unused23]",
453
- "lstrip": false,
454
- "normalized": true,
455
- "rstrip": false,
456
- "single_word": false,
457
- "special": false
458
- },
459
- "50309": {
460
- "content": "[unused24]",
461
- "lstrip": false,
462
- "normalized": true,
463
- "rstrip": false,
464
- "single_word": false,
465
- "special": false
466
- },
467
- "50310": {
468
- "content": "[unused25]",
469
- "lstrip": false,
470
- "normalized": true,
471
- "rstrip": false,
472
- "single_word": false,
473
- "special": false
474
- },
475
- "50311": {
476
- "content": "[unused26]",
477
- "lstrip": false,
478
- "normalized": true,
479
- "rstrip": false,
480
- "single_word": false,
481
- "special": false
482
- },
483
- "50312": {
484
- "content": "[unused27]",
485
- "lstrip": false,
486
- "normalized": true,
487
- "rstrip": false,
488
- "single_word": false,
489
- "special": false
490
- },
491
- "50313": {
492
- "content": "[unused28]",
493
- "lstrip": false,
494
- "normalized": true,
495
- "rstrip": false,
496
- "single_word": false,
497
- "special": false
498
- },
499
- "50314": {
500
- "content": "[unused29]",
501
- "lstrip": false,
502
- "normalized": true,
503
- "rstrip": false,
504
- "single_word": false,
505
- "special": false
506
- },
507
- "50315": {
508
- "content": "[unused30]",
509
- "lstrip": false,
510
- "normalized": true,
511
- "rstrip": false,
512
- "single_word": false,
513
- "special": false
514
- },
515
- "50316": {
516
- "content": "[unused31]",
517
- "lstrip": false,
518
- "normalized": true,
519
- "rstrip": false,
520
- "single_word": false,
521
- "special": false
522
- },
523
- "50317": {
524
- "content": "[unused32]",
525
- "lstrip": false,
526
- "normalized": true,
527
- "rstrip": false,
528
- "single_word": false,
529
- "special": false
530
- },
531
- "50318": {
532
- "content": "[unused33]",
533
- "lstrip": false,
534
- "normalized": true,
535
- "rstrip": false,
536
- "single_word": false,
537
- "special": false
538
- },
539
- "50319": {
540
- "content": "[unused34]",
541
- "lstrip": false,
542
- "normalized": true,
543
- "rstrip": false,
544
- "single_word": false,
545
- "special": false
546
- },
547
- "50320": {
548
- "content": "[unused35]",
549
- "lstrip": false,
550
- "normalized": true,
551
- "rstrip": false,
552
- "single_word": false,
553
- "special": false
554
- },
555
- "50321": {
556
- "content": "[unused36]",
557
- "lstrip": false,
558
- "normalized": true,
559
- "rstrip": false,
560
- "single_word": false,
561
- "special": false
562
- },
563
- "50322": {
564
- "content": "[unused37]",
565
- "lstrip": false,
566
- "normalized": true,
567
- "rstrip": false,
568
- "single_word": false,
569
- "special": false
570
- },
571
- "50323": {
572
- "content": "[unused38]",
573
- "lstrip": false,
574
- "normalized": true,
575
- "rstrip": false,
576
- "single_word": false,
577
- "special": false
578
- },
579
- "50324": {
580
- "content": "[unused39]",
581
- "lstrip": false,
582
- "normalized": true,
583
- "rstrip": false,
584
- "single_word": false,
585
- "special": false
586
- },
587
- "50325": {
588
- "content": "[unused40]",
589
- "lstrip": false,
590
- "normalized": true,
591
- "rstrip": false,
592
- "single_word": false,
593
- "special": false
594
- },
595
- "50326": {
596
- "content": "[unused41]",
597
- "lstrip": false,
598
- "normalized": true,
599
- "rstrip": false,
600
- "single_word": false,
601
- "special": false
602
- },
603
- "50327": {
604
- "content": "[unused42]",
605
- "lstrip": false,
606
- "normalized": true,
607
- "rstrip": false,
608
- "single_word": false,
609
- "special": false
610
- },
611
- "50328": {
612
- "content": "[unused43]",
613
- "lstrip": false,
614
- "normalized": true,
615
- "rstrip": false,
616
- "single_word": false,
617
- "special": false
618
- },
619
- "50329": {
620
- "content": "[unused44]",
621
- "lstrip": false,
622
- "normalized": true,
623
- "rstrip": false,
624
- "single_word": false,
625
- "special": false
626
- },
627
- "50330": {
628
- "content": "[unused45]",
629
- "lstrip": false,
630
- "normalized": true,
631
- "rstrip": false,
632
- "single_word": false,
633
- "special": false
634
- },
635
- "50331": {
636
- "content": "[unused46]",
637
- "lstrip": false,
638
- "normalized": true,
639
- "rstrip": false,
640
- "single_word": false,
641
- "special": false
642
- },
643
- "50332": {
644
- "content": "[unused47]",
645
- "lstrip": false,
646
- "normalized": true,
647
- "rstrip": false,
648
- "single_word": false,
649
- "special": false
650
- },
651
- "50333": {
652
- "content": "[unused48]",
653
- "lstrip": false,
654
- "normalized": true,
655
- "rstrip": false,
656
- "single_word": false,
657
- "special": false
658
- },
659
- "50334": {
660
- "content": "[unused49]",
661
- "lstrip": false,
662
- "normalized": true,
663
- "rstrip": false,
664
- "single_word": false,
665
- "special": false
666
- },
667
- "50335": {
668
- "content": "[unused50]",
669
- "lstrip": false,
670
- "normalized": true,
671
- "rstrip": false,
672
- "single_word": false,
673
- "special": false
674
- },
675
- "50336": {
676
- "content": "[unused51]",
677
- "lstrip": false,
678
- "normalized": true,
679
- "rstrip": false,
680
- "single_word": false,
681
- "special": false
682
- },
683
- "50337": {
684
- "content": "[unused52]",
685
- "lstrip": false,
686
- "normalized": true,
687
- "rstrip": false,
688
- "single_word": false,
689
- "special": false
690
- },
691
- "50338": {
692
- "content": "[unused53]",
693
- "lstrip": false,
694
- "normalized": true,
695
- "rstrip": false,
696
- "single_word": false,
697
- "special": false
698
- },
699
- "50339": {
700
- "content": "[unused54]",
701
- "lstrip": false,
702
- "normalized": true,
703
- "rstrip": false,
704
- "single_word": false,
705
- "special": false
706
- },
707
- "50340": {
708
- "content": "[unused55]",
709
- "lstrip": false,
710
- "normalized": true,
711
- "rstrip": false,
712
- "single_word": false,
713
- "special": false
714
- },
715
- "50341": {
716
- "content": "[unused56]",
717
- "lstrip": false,
718
- "normalized": true,
719
- "rstrip": false,
720
- "single_word": false,
721
- "special": false
722
- },
723
- "50342": {
724
- "content": "[unused57]",
725
- "lstrip": false,
726
- "normalized": true,
727
- "rstrip": false,
728
- "single_word": false,
729
- "special": false
730
- },
731
- "50343": {
732
- "content": "[unused58]",
733
- "lstrip": false,
734
- "normalized": true,
735
- "rstrip": false,
736
- "single_word": false,
737
- "special": false
738
- },
739
- "50344": {
740
- "content": "[unused59]",
741
- "lstrip": false,
742
- "normalized": true,
743
- "rstrip": false,
744
- "single_word": false,
745
- "special": false
746
- },
747
- "50345": {
748
- "content": "[unused60]",
749
- "lstrip": false,
750
- "normalized": true,
751
- "rstrip": false,
752
- "single_word": false,
753
- "special": false
754
- },
755
- "50346": {
756
- "content": "[unused61]",
757
- "lstrip": false,
758
- "normalized": true,
759
- "rstrip": false,
760
- "single_word": false,
761
- "special": false
762
- },
763
- "50347": {
764
- "content": "[unused62]",
765
- "lstrip": false,
766
- "normalized": true,
767
- "rstrip": false,
768
- "single_word": false,
769
- "special": false
770
- },
771
- "50348": {
772
- "content": "[unused63]",
773
- "lstrip": false,
774
- "normalized": true,
775
- "rstrip": false,
776
- "single_word": false,
777
- "special": false
778
- },
779
- "50349": {
780
- "content": "[unused64]",
781
- "lstrip": false,
782
- "normalized": true,
783
- "rstrip": false,
784
- "single_word": false,
785
- "special": false
786
- },
787
- "50350": {
788
- "content": "[unused65]",
789
- "lstrip": false,
790
- "normalized": true,
791
- "rstrip": false,
792
- "single_word": false,
793
- "special": false
794
- },
795
- "50351": {
796
- "content": "[unused66]",
797
- "lstrip": false,
798
- "normalized": true,
799
- "rstrip": false,
800
- "single_word": false,
801
- "special": false
802
- },
803
- "50352": {
804
- "content": "[unused67]",
805
- "lstrip": false,
806
- "normalized": true,
807
- "rstrip": false,
808
- "single_word": false,
809
- "special": false
810
- },
811
- "50353": {
812
- "content": "[unused68]",
813
- "lstrip": false,
814
- "normalized": true,
815
- "rstrip": false,
816
- "single_word": false,
817
- "special": false
818
- },
819
- "50354": {
820
- "content": "[unused69]",
821
- "lstrip": false,
822
- "normalized": true,
823
- "rstrip": false,
824
- "single_word": false,
825
- "special": false
826
- },
827
- "50355": {
828
- "content": "[unused70]",
829
- "lstrip": false,
830
- "normalized": true,
831
- "rstrip": false,
832
- "single_word": false,
833
- "special": false
834
- },
835
- "50356": {
836
- "content": "[unused71]",
837
- "lstrip": false,
838
- "normalized": true,
839
- "rstrip": false,
840
- "single_word": false,
841
- "special": false
842
- },
843
- "50357": {
844
- "content": "[unused72]",
845
- "lstrip": false,
846
- "normalized": true,
847
- "rstrip": false,
848
- "single_word": false,
849
- "special": false
850
- },
851
- "50358": {
852
- "content": "[unused73]",
853
- "lstrip": false,
854
- "normalized": true,
855
- "rstrip": false,
856
- "single_word": false,
857
- "special": false
858
- },
859
- "50359": {
860
- "content": "[unused74]",
861
- "lstrip": false,
862
- "normalized": true,
863
- "rstrip": false,
864
- "single_word": false,
865
- "special": false
866
- },
867
- "50360": {
868
- "content": "[unused75]",
869
- "lstrip": false,
870
- "normalized": true,
871
- "rstrip": false,
872
- "single_word": false,
873
- "special": false
874
- },
875
- "50361": {
876
- "content": "[unused76]",
877
- "lstrip": false,
878
- "normalized": true,
879
- "rstrip": false,
880
- "single_word": false,
881
- "special": false
882
- },
883
- "50362": {
884
- "content": "[unused77]",
885
- "lstrip": false,
886
- "normalized": true,
887
- "rstrip": false,
888
- "single_word": false,
889
- "special": false
890
- },
891
- "50363": {
892
- "content": "[unused78]",
893
- "lstrip": false,
894
- "normalized": true,
895
- "rstrip": false,
896
- "single_word": false,
897
- "special": false
898
- },
899
- "50364": {
900
- "content": "[unused79]",
901
- "lstrip": false,
902
- "normalized": true,
903
- "rstrip": false,
904
- "single_word": false,
905
- "special": false
906
- },
907
- "50365": {
908
- "content": "[unused80]",
909
- "lstrip": false,
910
- "normalized": true,
911
- "rstrip": false,
912
- "single_word": false,
913
- "special": false
914
- },
915
- "50366": {
916
- "content": "[unused81]",
917
- "lstrip": false,
918
- "normalized": true,
919
- "rstrip": false,
920
- "single_word": false,
921
- "special": false
922
- },
923
- "50367": {
924
- "content": "[unused82]",
925
- "lstrip": false,
926
- "normalized": true,
927
- "rstrip": false,
928
- "single_word": false,
929
- "special": false
930
  }
931
  },
932
  "clean_up_tokenization_spaces": true,
933
  "cls_token": "[CLS]",
 
 
934
  "extra_special_tokens": {},
935
  "mask_token": "[MASK]",
936
- "model_input_names": [
937
- "input_ids",
938
- "attention_mask"
939
- ],
940
- "model_max_length": 1000000000000000019884624838656,
941
  "pad_token": "[PAD]",
 
 
942
  "sep_token": "[SEP]",
943
- "tokenizer_class": "PreTrainedTokenizerFast",
944
  "unk_token": "[UNK]"
945
  }
 
1
  {
2
  "added_tokens_decoder": {
3
  "0": {
4
+ "content": "[PAD]",
5
  "lstrip": false,
6
  "normalized": false,
7
  "rstrip": false,
8
  "single_word": false,
9
  "special": true
10
  },
11
+ "100": {
12
  "content": "[UNK]",
13
  "lstrip": false,
14
  "normalized": false,
 
16
  "single_word": false,
17
  "special": true
18
  },
19
+ "101": {
20
  "content": "[CLS]",
21
  "lstrip": false,
22
  "normalized": false,
 
24
  "single_word": false,
25
  "special": true
26
  },
27
+ "102": {
28
  "content": "[SEP]",
29
  "lstrip": false,
30
  "normalized": false,
 
32
  "single_word": false,
33
  "special": true
34
  },
35
+ "103": {
36
  "content": "[MASK]",
37
+ "lstrip": false,
38
  "normalized": false,
39
  "rstrip": false,
40
  "single_word": false,
41
  "special": true
42
  }
43
  },
44
  "clean_up_tokenization_spaces": true,
45
  "cls_token": "[CLS]",
46
+ "do_basic_tokenize": true,
47
+ "do_lower_case": true,
48
  "extra_special_tokens": {},
49
  "mask_token": "[MASK]",
50
+ "max_length": 128,
51
+ "model_max_length": 512,
52
+ "never_split": null,
53
+ "pad_to_multiple_of": null,
 
54
  "pad_token": "[PAD]",
55
+ "pad_token_type_id": 0,
56
+ "padding_side": "right",
57
  "sep_token": "[SEP]",
58
+ "stride": 0,
59
+ "strip_accents": null,
60
+ "tokenize_chinese_chars": true,
61
+ "tokenizer_class": "BertTokenizer",
62
+ "truncation_side": "right",
63
+ "truncation_strategy": "longest_first",
64
  "unk_token": "[UNK]"
65
  }