| index | modelId | label | readme |
|---|---|---|---|
436 | SetFit/distilbert-base-uncased__hate_speech_offensive__train-8-9 | [
"hate speech",
"neither",
"offensive language"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__hate_speech_offensive__train-8-9
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__hate_speech_offensive__train-8-9
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0959
- Accuracy: 0.093
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.1068 | 1.0 | 5 | 1.1545 | 0.0 |
| 1.0494 | 2.0 | 10 | 1.1971 | 0.0 |
| 1.0612 | 3.0 | 15 | 1.2164 | 0.0 |
| 0.9517 | 4.0 | 20 | 1.2545 | 0.0 |
| 0.8874 | 5.0 | 25 | 1.2699 | 0.0 |
| 0.8598 | 6.0 | 30 | 1.2835 | 0.0 |
| 0.7006 | 7.0 | 35 | 1.3139 | 0.0 |
| 0.5969 | 8.0 | 40 | 1.3116 | 0.2 |
| 0.4769 | 9.0 | 45 | 1.3124 | 0.4 |
| 0.4352 | 10.0 | 50 | 1.3541 | 0.4 |
| 0.3231 | 11.0 | 55 | 1.3919 | 0.4 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
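The hyperparameter sections in these cards all follow the same fixed `- key: value` bullet shape, so they can be read back programmatically. A minimal sketch of such a reader (this helper is illustrative only, not part of the cards; it assumes the exact bullet format shown above and keeps values as strings, since the cards mix numbers, tuples, and free text like the optimizer line):

```python
# Parse a Trainer-style "- key: value" hyperparameter bullet list into a dict.
# Illustrative helper; values are left as strings because the cards mix
# numbers, tuples, and free text (e.g. the optimizer description).
def parse_hyperparameters(card_text: str) -> dict:
    params = {}
    for line in card_text.splitlines():
        line = line.strip()
        if line.startswith("- ") and ": " in line:
            key, value = line[2:].split(": ", 1)
            params[key] = value
    return params

example = """\
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
"""

print(parse_hyperparameters(example)["learning_rate"])  # → 2e-05
```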
437 | SetFit/distilbert-base-uncased__sst2__all-train | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__all-train
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__all-train
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2496
- Accuracy: 0.8962
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3643 | 1.0 | 433 | 0.2496 | 0.8962 |
| 0.196 | 2.0 | 866 | 0.2548 | 0.9110 |
| 0.0915 | 3.0 | 1299 | 0.4483 | 0.8957 |
| 0.0505 | 4.0 | 1732 | 0.4968 | 0.9044 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
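In the card above, the headline metrics (Loss 0.2496, Accuracy 0.8962) match the epoch-1 row of the training-results table, which is consistent with reporting the checkpoint that minimized validation loss; the cards themselves do not state the selection rule, so the sketch below is an illustrative reading, not how the cards were produced:

```python
# Pick the row with the lowest validation loss from a training-results table.
# Rows mirror the sst2__all-train card above: (epoch, step, val_loss, accuracy).
# Illustrative only; the card does not document its checkpoint-selection rule.
rows = [
    (1, 433, 0.2496, 0.8962),
    (2, 866, 0.2548, 0.9110),
    (3, 1299, 0.4483, 0.8957),
    (4, 1732, 0.4968, 0.9044),
]

best_epoch, best_step, best_loss, best_acc = min(rows, key=lambda r: r[2])
print(best_epoch, best_loss, best_acc)  # → 1 0.2496 0.8962
```

Note that accuracy alone would have favored epoch 2 (0.9110); the headline numbers track the loss minimum instead.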
438 | SetFit/distilbert-base-uncased__sst2__train-16-0 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-16-0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-16-0
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6903
- Accuracy: 0.5091
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6934 | 1.0 | 7 | 0.7142 | 0.2857 |
| 0.6703 | 2.0 | 14 | 0.7379 | 0.2857 |
| 0.6282 | 3.0 | 21 | 0.7769 | 0.2857 |
| 0.5193 | 4.0 | 28 | 0.8799 | 0.2857 |
| 0.5104 | 5.0 | 35 | 0.8380 | 0.4286 |
| 0.2504 | 6.0 | 42 | 0.8622 | 0.4286 |
| 0.1794 | 7.0 | 49 | 0.9227 | 0.4286 |
| 0.1156 | 8.0 | 56 | 0.8479 | 0.4286 |
| 0.0709 | 9.0 | 63 | 1.0929 | 0.2857 |
| 0.0471 | 10.0 | 70 | 1.2189 | 0.2857 |
| 0.0288 | 11.0 | 77 | 1.2026 | 0.4286 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
439 | SetFit/distilbert-base-uncased__sst2__train-16-1 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-16-1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-16-1
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6012
- Accuracy: 0.6766
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6983 | 1.0 | 7 | 0.7036 | 0.2857 |
| 0.6836 | 2.0 | 14 | 0.7181 | 0.2857 |
| 0.645 | 3.0 | 21 | 0.7381 | 0.2857 |
| 0.5902 | 4.0 | 28 | 0.7746 | 0.2857 |
| 0.5799 | 5.0 | 35 | 0.7242 | 0.5714 |
| 0.3584 | 6.0 | 42 | 0.6935 | 0.5714 |
| 0.2596 | 7.0 | 49 | 0.7041 | 0.5714 |
| 0.1815 | 8.0 | 56 | 0.5930 | 0.7143 |
| 0.0827 | 9.0 | 63 | 0.6976 | 0.7143 |
| 0.0613 | 10.0 | 70 | 0.7346 | 0.7143 |
| 0.0356 | 11.0 | 77 | 0.6992 | 0.5714 |
| 0.0158 | 12.0 | 84 | 0.7328 | 0.5714 |
| 0.013 | 13.0 | 91 | 0.7819 | 0.5714 |
| 0.0103 | 14.0 | 98 | 0.8589 | 0.5714 |
| 0.0087 | 15.0 | 105 | 0.9177 | 0.5714 |
| 0.0076 | 16.0 | 112 | 0.9519 | 0.5714 |
| 0.0078 | 17.0 | 119 | 0.9556 | 0.5714 |
| 0.006 | 18.0 | 126 | 0.9542 | 0.5714 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
440 | SetFit/distilbert-base-uncased__sst2__train-16-2 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-16-2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-16-2
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6748
- Accuracy: 0.6315
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7043 | 1.0 | 7 | 0.7054 | 0.2857 |
| 0.6711 | 2.0 | 14 | 0.7208 | 0.2857 |
| 0.6311 | 3.0 | 21 | 0.7365 | 0.2857 |
| 0.551 | 4.0 | 28 | 0.7657 | 0.5714 |
| 0.5599 | 5.0 | 35 | 0.6915 | 0.5714 |
| 0.3167 | 6.0 | 42 | 0.7134 | 0.5714 |
| 0.2489 | 7.0 | 49 | 0.7892 | 0.5714 |
| 0.1985 | 8.0 | 56 | 0.6756 | 0.7143 |
| 0.0864 | 9.0 | 63 | 0.8059 | 0.5714 |
| 0.0903 | 10.0 | 70 | 0.8165 | 0.7143 |
| 0.0429 | 11.0 | 77 | 0.7947 | 0.7143 |
| 0.0186 | 12.0 | 84 | 0.8570 | 0.7143 |
| 0.0146 | 13.0 | 91 | 0.9346 | 0.7143 |
| 0.011 | 14.0 | 98 | 0.9804 | 0.7143 |
| 0.0098 | 15.0 | 105 | 1.0136 | 0.7143 |
| 0.0086 | 16.0 | 112 | 1.0424 | 0.7143 |
| 0.0089 | 17.0 | 119 | 1.0736 | 0.7143 |
| 0.0068 | 18.0 | 126 | 1.0808 | 0.7143 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
441 | SetFit/distilbert-base-uncased__sst2__train-16-3 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-16-3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-16-3
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7887
- Accuracy: 0.6458
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6928 | 1.0 | 7 | 0.6973 | 0.4286 |
| 0.675 | 2.0 | 14 | 0.7001 | 0.4286 |
| 0.6513 | 3.0 | 21 | 0.6959 | 0.4286 |
| 0.5702 | 4.0 | 28 | 0.6993 | 0.4286 |
| 0.5389 | 5.0 | 35 | 0.6020 | 0.7143 |
| 0.3386 | 6.0 | 42 | 0.5326 | 0.5714 |
| 0.2596 | 7.0 | 49 | 0.4943 | 0.7143 |
| 0.1633 | 8.0 | 56 | 0.3589 | 0.8571 |
| 0.1086 | 9.0 | 63 | 0.2924 | 0.8571 |
| 0.0641 | 10.0 | 70 | 0.2687 | 0.8571 |
| 0.0409 | 11.0 | 77 | 0.2202 | 0.8571 |
| 0.0181 | 12.0 | 84 | 0.2445 | 0.8571 |
| 0.0141 | 13.0 | 91 | 0.2885 | 0.8571 |
| 0.0108 | 14.0 | 98 | 0.3069 | 0.8571 |
| 0.009 | 15.0 | 105 | 0.3006 | 0.8571 |
| 0.0084 | 16.0 | 112 | 0.2834 | 0.8571 |
| 0.0088 | 17.0 | 119 | 0.2736 | 0.8571 |
| 0.0062 | 18.0 | 126 | 0.2579 | 0.8571 |
| 0.0058 | 19.0 | 133 | 0.2609 | 0.8571 |
| 0.0057 | 20.0 | 140 | 0.2563 | 0.8571 |
| 0.0049 | 21.0 | 147 | 0.2582 | 0.8571 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
442 | SetFit/distilbert-base-uncased__sst2__train-16-4 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-16-4
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-16-4
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1501
- Accuracy: 0.6387
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7043 | 1.0 | 7 | 0.7139 | 0.2857 |
| 0.68 | 2.0 | 14 | 0.7398 | 0.2857 |
| 0.641 | 3.0 | 21 | 0.7723 | 0.2857 |
| 0.5424 | 4.0 | 28 | 0.8391 | 0.2857 |
| 0.5988 | 5.0 | 35 | 0.7761 | 0.2857 |
| 0.3698 | 6.0 | 42 | 0.7707 | 0.4286 |
| 0.3204 | 7.0 | 49 | 0.8290 | 0.4286 |
| 0.2882 | 8.0 | 56 | 0.6551 | 0.5714 |
| 0.1512 | 9.0 | 63 | 0.5652 | 0.5714 |
| 0.1302 | 10.0 | 70 | 0.5278 | 0.5714 |
| 0.1043 | 11.0 | 77 | 0.4987 | 0.7143 |
| 0.0272 | 12.0 | 84 | 0.5278 | 0.5714 |
| 0.0201 | 13.0 | 91 | 0.5307 | 0.5714 |
| 0.0129 | 14.0 | 98 | 0.5382 | 0.5714 |
| 0.0117 | 15.0 | 105 | 0.5227 | 0.5714 |
| 0.0094 | 16.0 | 112 | 0.5066 | 0.7143 |
| 0.0104 | 17.0 | 119 | 0.4869 | 0.7143 |
| 0.0069 | 18.0 | 126 | 0.4786 | 0.7143 |
| 0.0062 | 19.0 | 133 | 0.4707 | 0.7143 |
| 0.0065 | 20.0 | 140 | 0.4669 | 0.7143 |
| 0.0051 | 21.0 | 147 | 0.4686 | 0.7143 |
| 0.0049 | 22.0 | 154 | 0.4784 | 0.7143 |
| 0.0046 | 23.0 | 161 | 0.4839 | 0.7143 |
| 0.0039 | 24.0 | 168 | 0.4823 | 0.7143 |
| 0.0044 | 25.0 | 175 | 0.4791 | 0.7143 |
| 0.0037 | 26.0 | 182 | 0.4778 | 0.7143 |
| 0.0038 | 27.0 | 189 | 0.4770 | 0.7143 |
| 0.0036 | 28.0 | 196 | 0.4750 | 0.7143 |
| 0.0031 | 29.0 | 203 | 0.4766 | 0.7143 |
| 0.0031 | 30.0 | 210 | 0.4754 | 0.7143 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
443 | SetFit/distilbert-base-uncased__sst2__train-16-5 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-16-5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-16-5
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6537
- Accuracy: 0.6332
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6925 | 1.0 | 7 | 0.6966 | 0.2857 |
| 0.6703 | 2.0 | 14 | 0.7045 | 0.2857 |
| 0.6404 | 3.0 | 21 | 0.7205 | 0.2857 |
| 0.555 | 4.0 | 28 | 0.7548 | 0.2857 |
| 0.5179 | 5.0 | 35 | 0.6745 | 0.5714 |
| 0.3038 | 6.0 | 42 | 0.7260 | 0.5714 |
| 0.2089 | 7.0 | 49 | 0.8016 | 0.5714 |
| 0.1303 | 8.0 | 56 | 0.8202 | 0.5714 |
| 0.0899 | 9.0 | 63 | 0.9966 | 0.5714 |
| 0.0552 | 10.0 | 70 | 1.1887 | 0.5714 |
| 0.0333 | 11.0 | 77 | 1.2163 | 0.5714 |
| 0.0169 | 12.0 | 84 | 1.2874 | 0.5714 |
| 0.0136 | 13.0 | 91 | 1.3598 | 0.5714 |
| 0.0103 | 14.0 | 98 | 1.4237 | 0.5714 |
| 0.0089 | 15.0 | 105 | 1.4758 | 0.5714 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
444 | SetFit/distilbert-base-uncased__sst2__train-16-6 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-16-6
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-16-6
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8356
- Accuracy: 0.6480
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6978 | 1.0 | 7 | 0.6807 | 0.4286 |
| 0.6482 | 2.0 | 14 | 0.6775 | 0.4286 |
| 0.6051 | 3.0 | 21 | 0.6623 | 0.5714 |
| 0.486 | 4.0 | 28 | 0.6710 | 0.5714 |
| 0.4612 | 5.0 | 35 | 0.5325 | 0.7143 |
| 0.2233 | 6.0 | 42 | 0.4992 | 0.7143 |
| 0.1328 | 7.0 | 49 | 0.4753 | 0.7143 |
| 0.0905 | 8.0 | 56 | 0.2416 | 1.0 |
| 0.0413 | 9.0 | 63 | 0.2079 | 1.0 |
| 0.0356 | 10.0 | 70 | 0.2234 | 0.8571 |
| 0.0217 | 11.0 | 77 | 0.2639 | 0.8571 |
| 0.0121 | 12.0 | 84 | 0.2977 | 0.8571 |
| 0.0105 | 13.0 | 91 | 0.3468 | 0.8571 |
| 0.0085 | 14.0 | 98 | 0.3912 | 0.8571 |
| 0.0077 | 15.0 | 105 | 0.4000 | 0.8571 |
| 0.0071 | 16.0 | 112 | 0.4015 | 0.8571 |
| 0.0078 | 17.0 | 119 | 0.3865 | 0.8571 |
| 0.0059 | 18.0 | 126 | 0.3603 | 0.8571 |
| 0.0051 | 19.0 | 133 | 0.3231 | 0.8571 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
445 | SetFit/distilbert-base-uncased__sst2__train-16-7 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-16-7
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-16-7
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6952
- Accuracy: 0.5025
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6949 | 1.0 | 7 | 0.7252 | 0.2857 |
| 0.6678 | 2.0 | 14 | 0.7550 | 0.2857 |
| 0.6299 | 3.0 | 21 | 0.8004 | 0.2857 |
| 0.5596 | 4.0 | 28 | 0.8508 | 0.2857 |
| 0.5667 | 5.0 | 35 | 0.8464 | 0.2857 |
| 0.367 | 6.0 | 42 | 0.8515 | 0.2857 |
| 0.2706 | 7.0 | 49 | 0.9574 | 0.2857 |
| 0.2163 | 8.0 | 56 | 0.9710 | 0.4286 |
| 0.1024 | 9.0 | 63 | 1.1607 | 0.1429 |
| 0.1046 | 10.0 | 70 | 1.3779 | 0.1429 |
| 0.0483 | 11.0 | 77 | 1.4876 | 0.1429 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
446 | SetFit/distilbert-base-uncased__sst2__train-16-8 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-16-8
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-16-8
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6895
- Accuracy: 0.5222
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6899 | 1.0 | 7 | 0.7055 | 0.2857 |
| 0.6793 | 2.0 | 14 | 0.7205 | 0.2857 |
| 0.6291 | 3.0 | 21 | 0.7460 | 0.2857 |
| 0.5659 | 4.0 | 28 | 0.8041 | 0.2857 |
| 0.5607 | 5.0 | 35 | 0.7785 | 0.4286 |
| 0.3349 | 6.0 | 42 | 0.8163 | 0.4286 |
| 0.2436 | 7.0 | 49 | 0.9101 | 0.2857 |
| 0.1734 | 8.0 | 56 | 0.8632 | 0.5714 |
| 0.1122 | 9.0 | 63 | 0.9851 | 0.5714 |
| 0.0661 | 10.0 | 70 | 1.0835 | 0.5714 |
| 0.0407 | 11.0 | 77 | 1.1656 | 0.5714 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
447 | SetFit/distilbert-base-uncased__sst2__train-16-9 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-16-9
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-16-9
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6915
- Accuracy: 0.5157
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6868 | 1.0 | 7 | 0.7121 | 0.1429 |
| 0.6755 | 2.0 | 14 | 0.7234 | 0.1429 |
| 0.6389 | 3.0 | 21 | 0.7384 | 0.2857 |
| 0.5575 | 4.0 | 28 | 0.7884 | 0.2857 |
| 0.4972 | 5.0 | 35 | 0.7767 | 0.4286 |
| 0.2821 | 6.0 | 42 | 0.8275 | 0.4286 |
| 0.1859 | 7.0 | 49 | 0.9283 | 0.2857 |
| 0.1388 | 8.0 | 56 | 0.9384 | 0.4286 |
| 0.078 | 9.0 | 63 | 1.1973 | 0.4286 |
| 0.0462 | 10.0 | 70 | 1.4016 | 0.4286 |
| 0.0319 | 11.0 | 77 | 1.4087 | 0.4286 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
448 | SetFit/distilbert-base-uncased__sst2__train-32-0 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-32-0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-32-0
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8558
- Accuracy: 0.7183
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7088 | 1.0 | 13 | 0.6819 | 0.6154 |
| 0.635 | 2.0 | 26 | 0.6318 | 0.7692 |
| 0.547 | 3.0 | 39 | 0.5356 | 0.7692 |
| 0.3497 | 4.0 | 52 | 0.4456 | 0.6923 |
| 0.1979 | 5.0 | 65 | 0.3993 | 0.7692 |
| 0.098 | 6.0 | 78 | 0.3613 | 0.7692 |
| 0.0268 | 7.0 | 91 | 0.3561 | 0.9231 |
| 0.0137 | 8.0 | 104 | 0.3755 | 0.9231 |
| 0.0083 | 9.0 | 117 | 0.4194 | 0.7692 |
| 0.0065 | 10.0 | 130 | 0.4446 | 0.7692 |
| 0.005 | 11.0 | 143 | 0.4527 | 0.7692 |
| 0.0038 | 12.0 | 156 | 0.4645 | 0.7692 |
| 0.0033 | 13.0 | 169 | 0.4735 | 0.7692 |
| 0.0033 | 14.0 | 182 | 0.4874 | 0.7692 |
| 0.0029 | 15.0 | 195 | 0.5041 | 0.7692 |
| 0.0025 | 16.0 | 208 | 0.5148 | 0.7692 |
| 0.0024 | 17.0 | 221 | 0.5228 | 0.7692 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
449 | SetFit/distilbert-base-uncased__sst2__train-32-1 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-32-1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-32-1
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6492
- Accuracy: 0.6551
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7106 | 1.0 | 13 | 0.6850 | 0.6154 |
| 0.631 | 2.0 | 26 | 0.6632 | 0.6923 |
| 0.5643 | 3.0 | 39 | 0.6247 | 0.7692 |
| 0.3992 | 4.0 | 52 | 0.5948 | 0.7692 |
| 0.1928 | 5.0 | 65 | 0.5803 | 0.7692 |
| 0.0821 | 6.0 | 78 | 0.6404 | 0.6923 |
| 0.0294 | 7.0 | 91 | 0.7387 | 0.6923 |
| 0.0141 | 8.0 | 104 | 0.8270 | 0.6923 |
| 0.0082 | 9.0 | 117 | 0.8496 | 0.6923 |
| 0.0064 | 10.0 | 130 | 0.8679 | 0.6923 |
| 0.005 | 11.0 | 143 | 0.8914 | 0.6923 |
| 0.0036 | 12.0 | 156 | 0.9278 | 0.6923 |
| 0.0031 | 13.0 | 169 | 0.9552 | 0.6923 |
| 0.0029 | 14.0 | 182 | 0.9745 | 0.6923 |
| 0.0028 | 15.0 | 195 | 0.9785 | 0.6923 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
450 | SetFit/distilbert-base-uncased__sst2__train-32-2 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-32-2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-32-2
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4805
- Accuracy: 0.7699
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7124 | 1.0 | 13 | 0.6882 | 0.5385 |
| 0.6502 | 2.0 | 26 | 0.6715 | 0.5385 |
| 0.6001 | 3.0 | 39 | 0.6342 | 0.6154 |
| 0.455 | 4.0 | 52 | 0.5713 | 0.7692 |
| 0.2605 | 5.0 | 65 | 0.5562 | 0.7692 |
| 0.1258 | 6.0 | 78 | 0.6799 | 0.7692 |
| 0.0444 | 7.0 | 91 | 0.8096 | 0.7692 |
| 0.0175 | 8.0 | 104 | 0.9281 | 0.6923 |
| 0.0106 | 9.0 | 117 | 0.9826 | 0.6923 |
| 0.0077 | 10.0 | 130 | 1.0254 | 0.7692 |
| 0.0056 | 11.0 | 143 | 1.0667 | 0.7692 |
| 0.0042 | 12.0 | 156 | 1.1003 | 0.7692 |
| 0.0036 | 13.0 | 169 | 1.1299 | 0.7692 |
| 0.0034 | 14.0 | 182 | 1.1623 | 0.6923 |
| 0.003 | 15.0 | 195 | 1.1938 | 0.6923 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
451 | SetFit/distilbert-base-uncased__sst2__train-32-3 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-32-3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-32-3
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5694
- Accuracy: 0.7073
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7118 | 1.0 | 13 | 0.6844 | 0.5385 |
| 0.6587 | 2.0 | 26 | 0.6707 | 0.6154 |
| 0.6067 | 3.0 | 39 | 0.6295 | 0.5385 |
| 0.4714 | 4.0 | 52 | 0.5811 | 0.6923 |
| 0.2444 | 5.0 | 65 | 0.5932 | 0.7692 |
| 0.1007 | 6.0 | 78 | 0.7386 | 0.6923 |
| 0.0332 | 7.0 | 91 | 0.6962 | 0.6154 |
| 0.0147 | 8.0 | 104 | 0.8200 | 0.7692 |
| 0.0083 | 9.0 | 117 | 0.9250 | 0.7692 |
| 0.0066 | 10.0 | 130 | 0.9345 | 0.7692 |
| 0.005 | 11.0 | 143 | 0.9313 | 0.7692 |
| 0.0036 | 12.0 | 156 | 0.9356 | 0.7692 |
| 0.0031 | 13.0 | 169 | 0.9395 | 0.7692 |
| 0.0029 | 14.0 | 182 | 0.9504 | 0.7692 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
452 | SetFit/distilbert-base-uncased__sst2__train-32-4 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-32-4
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-32-4
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5001
- Accuracy: 0.7650
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7175 | 1.0 | 13 | 0.6822 | 0.5385 |
| 0.6559 | 2.0 | 26 | 0.6533 | 0.6154 |
| 0.6052 | 3.0 | 39 | 0.5762 | 0.7692 |
| 0.4587 | 4.0 | 52 | 0.4477 | 0.8462 |
| 0.2459 | 5.0 | 65 | 0.4288 | 0.7692 |
| 0.1001 | 6.0 | 78 | 0.5219 | 0.7692 |
| 0.0308 | 7.0 | 91 | 0.8540 | 0.7692 |
| 0.014 | 8.0 | 104 | 0.7789 | 0.7692 |
| 0.0083 | 9.0 | 117 | 0.7996 | 0.7692 |
| 0.0064 | 10.0 | 130 | 0.8342 | 0.7692 |
| 0.0049 | 11.0 | 143 | 0.8612 | 0.7692 |
| 0.0036 | 12.0 | 156 | 0.8834 | 0.7692 |
| 0.0032 | 13.0 | 169 | 0.9067 | 0.7692 |
| 0.003 | 14.0 | 182 | 0.9332 | 0.7692 |
| 0.0028 | 15.0 | 195 | 0.9511 | 0.7692 |
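In the table above, validation loss bottoms out at epoch 5 and climbs thereafter while training loss keeps shrinking toward zero, a typical overfitting pattern on a 32-example training split. Picking the best checkpoint is then just an argmin over the logged history; a minimal sketch, with the `history` list transcribed from the first eight rows above:

```python
# (epoch, validation loss) pairs transcribed from the training-results table
history = [(1, 0.6822), (2, 0.6533), (3, 0.5762), (4, 0.4477),
           (5, 0.4288), (6, 0.5219), (7, 0.8540), (8, 0.7789)]

# Best checkpoint = epoch with the minimum validation loss
best_epoch, best_loss = min(history, key=lambda t: t[1])
print(best_epoch, best_loss)  # 5 0.4288
```

The `Trainer` can do this automatically via `load_best_model_at_end`, though these cards do not say whether that option was enabled.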
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
453 | SetFit/distilbert-base-uncased__sst2__train-32-5 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-32-5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-32-5
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6248
- Accuracy: 0.6826
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7136 | 1.0 | 13 | 0.6850 | 0.5385 |
| 0.6496 | 2.0 | 26 | 0.6670 | 0.6154 |
| 0.5895 | 3.0 | 39 | 0.6464 | 0.7692 |
| 0.4271 | 4.0 | 52 | 0.6478 | 0.7692 |
| 0.2182 | 5.0 | 65 | 0.6809 | 0.6923 |
| 0.103 | 6.0 | 78 | 0.9119 | 0.6923 |
| 0.0326 | 7.0 | 91 | 1.0718 | 0.6923 |
| 0.0154 | 8.0 | 104 | 1.0721 | 0.7692 |
| 0.0087 | 9.0 | 117 | 1.1416 | 0.7692 |
| 0.0067 | 10.0 | 130 | 1.2088 | 0.7692 |
| 0.005 | 11.0 | 143 | 1.2656 | 0.7692 |
| 0.0037 | 12.0 | 156 | 1.3104 | 0.7692 |
| 0.0032 | 13.0 | 169 | 1.3428 | 0.6923 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
454 | SetFit/distilbert-base-uncased__sst2__train-32-6 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-32-6
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-32-6
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5072
- Accuracy: 0.7650
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7057 | 1.0 | 13 | 0.6704 | 0.6923 |
| 0.6489 | 2.0 | 26 | 0.6228 | 0.8462 |
| 0.5475 | 3.0 | 39 | 0.5079 | 0.8462 |
| 0.4014 | 4.0 | 52 | 0.4203 | 0.8462 |
| 0.1923 | 5.0 | 65 | 0.3872 | 0.8462 |
| 0.1014 | 6.0 | 78 | 0.4909 | 0.8462 |
| 0.0349 | 7.0 | 91 | 0.5460 | 0.8462 |
| 0.0173 | 8.0 | 104 | 0.4867 | 0.8462 |
| 0.0098 | 9.0 | 117 | 0.5274 | 0.8462 |
| 0.0075 | 10.0 | 130 | 0.6086 | 0.8462 |
| 0.0057 | 11.0 | 143 | 0.6604 | 0.8462 |
| 0.0041 | 12.0 | 156 | 0.6904 | 0.8462 |
| 0.0037 | 13.0 | 169 | 0.7164 | 0.8462 |
| 0.0034 | 14.0 | 182 | 0.7368 | 0.8462 |
| 0.0031 | 15.0 | 195 | 0.7565 | 0.8462 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
455 | SetFit/distilbert-base-uncased__sst2__train-32-7 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-32-7
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-32-7
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6736
- Accuracy: 0.5931
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7094 | 1.0 | 13 | 0.6887 | 0.5385 |
| 0.651 | 2.0 | 26 | 0.6682 | 0.6923 |
| 0.6084 | 3.0 | 39 | 0.6412 | 0.6923 |
| 0.4547 | 4.0 | 52 | 0.6095 | 0.6923 |
| 0.2903 | 5.0 | 65 | 0.6621 | 0.6923 |
| 0.1407 | 6.0 | 78 | 0.7130 | 0.7692 |
| 0.0444 | 7.0 | 91 | 0.9007 | 0.6923 |
| 0.0176 | 8.0 | 104 | 0.9525 | 0.7692 |
| 0.0098 | 9.0 | 117 | 1.0289 | 0.7692 |
| 0.0071 | 10.0 | 130 | 1.0876 | 0.7692 |
| 0.0052 | 11.0 | 143 | 1.1431 | 0.6923 |
| 0.0038 | 12.0 | 156 | 1.1687 | 0.7692 |
| 0.0034 | 13.0 | 169 | 1.1792 | 0.7692 |
| 0.0031 | 14.0 | 182 | 1.2033 | 0.7692 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
456 | SetFit/distilbert-base-uncased__sst2__train-32-8 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-32-8
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-32-8
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6880
- Accuracy: 0.5014
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.712 | 1.0 | 13 | 0.6936 | 0.5385 |
| 0.665 | 2.0 | 26 | 0.6960 | 0.3846 |
| 0.6112 | 3.0 | 39 | 0.7138 | 0.3846 |
| 0.4521 | 4.0 | 52 | 0.8243 | 0.4615 |
| 0.2627 | 5.0 | 65 | 0.7723 | 0.6154 |
| 0.0928 | 6.0 | 78 | 1.2666 | 0.5385 |
| 0.0312 | 7.0 | 91 | 1.2306 | 0.6154 |
| 0.0132 | 8.0 | 104 | 1.3385 | 0.6154 |
| 0.0082 | 9.0 | 117 | 1.4584 | 0.6154 |
| 0.0063 | 10.0 | 130 | 1.5429 | 0.6154 |
| 0.0049 | 11.0 | 143 | 1.5913 | 0.6154 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
457 | SetFit/distilbert-base-uncased__sst2__train-32-9 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-32-9
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-32-9
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5625
- Accuracy: 0.7353
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7057 | 1.0 | 13 | 0.6805 | 0.5385 |
| 0.6642 | 2.0 | 26 | 0.6526 | 0.7692 |
| 0.5869 | 3.0 | 39 | 0.5773 | 0.8462 |
| 0.4085 | 4.0 | 52 | 0.4959 | 0.8462 |
| 0.2181 | 5.0 | 65 | 0.4902 | 0.6923 |
| 0.069 | 6.0 | 78 | 0.5065 | 0.8462 |
| 0.0522 | 7.0 | 91 | 0.6082 | 0.7692 |
| 0.0135 | 8.0 | 104 | 0.6924 | 0.7692 |
| 0.0084 | 9.0 | 117 | 0.5921 | 0.7692 |
| 0.0061 | 10.0 | 130 | 0.6477 | 0.7692 |
| 0.0047 | 11.0 | 143 | 0.6648 | 0.7692 |
| 0.0035 | 12.0 | 156 | 0.6640 | 0.7692 |
| 0.0031 | 13.0 | 169 | 0.6615 | 0.7692 |
| 0.0029 | 14.0 | 182 | 0.6605 | 0.7692 |
| 0.0026 | 15.0 | 195 | 0.6538 | 0.8462 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
458 | SetFit/distilbert-base-uncased__sst2__train-8-0 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-8-0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-8-0
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6920
- Accuracy: 0.5189
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6916 | 1.0 | 3 | 0.7035 | 0.25 |
| 0.6852 | 2.0 | 6 | 0.7139 | 0.25 |
| 0.6533 | 3.0 | 9 | 0.7192 | 0.25 |
| 0.6211 | 4.0 | 12 | 0.7322 | 0.25 |
| 0.5522 | 5.0 | 15 | 0.7561 | 0.25 |
| 0.488 | 6.0 | 18 | 0.7883 | 0.25 |
| 0.48 | 7.0 | 21 | 0.8224 | 0.25 |
| 0.3948 | 8.0 | 24 | 0.8605 | 0.25 |
| 0.3478 | 9.0 | 27 | 0.8726 | 0.25 |
| 0.2723 | 10.0 | 30 | 0.8885 | 0.25 |
| 0.2174 | 11.0 | 33 | 0.8984 | 0.5 |
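The accuracy column in the 8-example cards only takes values such as 0.25 and 0.5, which suggests (this is an inference, not stated in the cards) a validation split of just four examples: with `n` examples, accuracy is quantized to multiples of `1/n`.

```python
def possible_accuracies(n):
    """All accuracy values attainable on a validation set of n examples."""
    return [k / n for k in range(n + 1)]

print(possible_accuracies(4))  # [0.0, 0.25, 0.5, 0.75, 1.0]
```

At that granularity, single-example flips move accuracy by 25 points, so the epoch-to-epoch accuracy jumps in these tables carry very little signal.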
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
459 | SetFit/distilbert-base-uncased__sst2__train-8-1 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-8-1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-8-1
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6930
- Accuracy: 0.5047
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7082 | 1.0 | 3 | 0.7048 | 0.25 |
| 0.6761 | 2.0 | 6 | 0.7249 | 0.25 |
| 0.6653 | 3.0 | 9 | 0.7423 | 0.25 |
| 0.6212 | 4.0 | 12 | 0.7727 | 0.25 |
| 0.5932 | 5.0 | 15 | 0.8098 | 0.25 |
| 0.5427 | 6.0 | 18 | 0.8496 | 0.25 |
| 0.5146 | 7.0 | 21 | 0.8992 | 0.25 |
| 0.4356 | 8.0 | 24 | 0.9494 | 0.25 |
| 0.4275 | 9.0 | 27 | 0.9694 | 0.25 |
| 0.3351 | 10.0 | 30 | 0.9968 | 0.25 |
| 0.2812 | 11.0 | 33 | 1.0056 | 0.5 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
460 | SetFit/distilbert-base-uncased__sst2__train-8-2 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-8-2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-8-2
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6932
- Accuracy: 0.4931
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7081 | 1.0 | 3 | 0.7031 | 0.25 |
| 0.6853 | 2.0 | 6 | 0.7109 | 0.25 |
| 0.6696 | 3.0 | 9 | 0.7211 | 0.25 |
| 0.6174 | 4.0 | 12 | 0.7407 | 0.25 |
| 0.5717 | 5.0 | 15 | 0.7625 | 0.25 |
| 0.5096 | 6.0 | 18 | 0.7732 | 0.25 |
| 0.488 | 7.0 | 21 | 0.7798 | 0.25 |
| 0.4023 | 8.0 | 24 | 0.7981 | 0.25 |
| 0.3556 | 9.0 | 27 | 0.8110 | 0.25 |
| 0.2714 | 10.0 | 30 | 0.8269 | 0.25 |
| 0.2295 | 11.0 | 33 | 0.8276 | 0.25 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
461 | SetFit/distilbert-base-uncased__sst2__train-8-3 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-8-3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-8-3
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6914
- Accuracy: 0.5195
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6931 | 1.0 | 3 | 0.7039 | 0.25 |
| 0.6615 | 2.0 | 6 | 0.7186 | 0.25 |
| 0.653 | 3.0 | 9 | 0.7334 | 0.25 |
| 0.601 | 4.0 | 12 | 0.7592 | 0.25 |
| 0.5555 | 5.0 | 15 | 0.7922 | 0.25 |
| 0.4832 | 6.0 | 18 | 0.8179 | 0.25 |
| 0.4565 | 7.0 | 21 | 0.8285 | 0.25 |
| 0.3996 | 8.0 | 24 | 0.8559 | 0.25 |
| 0.3681 | 9.0 | 27 | 0.8586 | 0.5 |
| 0.2901 | 10.0 | 30 | 0.8646 | 0.5 |
| 0.241 | 11.0 | 33 | 0.8524 | 0.5 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
462 | SetFit/distilbert-base-uncased__sst2__train-8-4 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-8-4
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-8-4
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6921
- Accuracy: 0.5107
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7163 | 1.0 | 3 | 0.7100 | 0.25 |
| 0.6785 | 2.0 | 6 | 0.7209 | 0.25 |
| 0.6455 | 3.0 | 9 | 0.7321 | 0.25 |
| 0.6076 | 4.0 | 12 | 0.7517 | 0.25 |
| 0.5593 | 5.0 | 15 | 0.7780 | 0.25 |
| 0.5202 | 6.0 | 18 | 0.7990 | 0.25 |
| 0.4967 | 7.0 | 21 | 0.8203 | 0.25 |
| 0.4158 | 8.0 | 24 | 0.8497 | 0.25 |
| 0.3997 | 9.0 | 27 | 0.8638 | 0.25 |
| 0.3064 | 10.0 | 30 | 0.8732 | 0.25 |
| 0.2618 | 11.0 | 33 | 0.8669 | 0.25 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
463 | SetFit/distilbert-base-uncased__sst2__train-8-5 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-8-5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-8-5
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8419
- Accuracy: 0.6172
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7057 | 1.0 | 3 | 0.6848 | 0.75 |
| 0.6681 | 2.0 | 6 | 0.6875 | 0.5 |
| 0.6591 | 3.0 | 9 | 0.6868 | 0.25 |
| 0.6052 | 4.0 | 12 | 0.6943 | 0.25 |
| 0.557 | 5.0 | 15 | 0.7078 | 0.25 |
| 0.4954 | 6.0 | 18 | 0.7168 | 0.25 |
| 0.4593 | 7.0 | 21 | 0.7185 | 0.25 |
| 0.3936 | 8.0 | 24 | 0.7212 | 0.25 |
| 0.3699 | 9.0 | 27 | 0.6971 | 0.5 |
| 0.2916 | 10.0 | 30 | 0.6827 | 0.5 |
| 0.2511 | 11.0 | 33 | 0.6464 | 0.5 |
| 0.2109 | 12.0 | 36 | 0.6344 | 0.75 |
| 0.1655 | 13.0 | 39 | 0.6377 | 0.75 |
| 0.1412 | 14.0 | 42 | 0.6398 | 0.75 |
| 0.1157 | 15.0 | 45 | 0.6315 | 0.75 |
| 0.0895 | 16.0 | 48 | 0.6210 | 0.75 |
| 0.0783 | 17.0 | 51 | 0.5918 | 0.75 |
| 0.0606 | 18.0 | 54 | 0.5543 | 0.75 |
| 0.0486 | 19.0 | 57 | 0.5167 | 0.75 |
| 0.0405 | 20.0 | 60 | 0.4862 | 0.75 |
| 0.0376 | 21.0 | 63 | 0.4644 | 0.75 |
| 0.0294 | 22.0 | 66 | 0.4497 | 0.75 |
| 0.0261 | 23.0 | 69 | 0.4428 | 0.75 |
| 0.0238 | 24.0 | 72 | 0.4408 | 0.75 |
| 0.0217 | 25.0 | 75 | 0.4392 | 0.75 |
| 0.0187 | 26.0 | 78 | 0.4373 | 0.75 |
| 0.0177 | 27.0 | 81 | 0.4360 | 0.75 |
| 0.0136 | 28.0 | 84 | 0.4372 | 0.75 |
| 0.0144 | 29.0 | 87 | 0.4368 | 0.75 |
| 0.014 | 30.0 | 90 | 0.4380 | 0.75 |
| 0.0137 | 31.0 | 93 | 0.4383 | 0.75 |
| 0.0133 | 32.0 | 96 | 0.4409 | 0.75 |
| 0.013 | 33.0 | 99 | 0.4380 | 0.75 |
| 0.0096 | 34.0 | 102 | 0.4358 | 0.75 |
| 0.012 | 35.0 | 105 | 0.4339 | 0.75 |
| 0.0122 | 36.0 | 108 | 0.4305 | 0.75 |
| 0.0109 | 37.0 | 111 | 0.4267 | 0.75 |
| 0.0121 | 38.0 | 114 | 0.4231 | 0.75 |
| 0.0093 | 39.0 | 117 | 0.4209 | 0.75 |
| 0.0099 | 40.0 | 120 | 0.4199 | 0.75 |
| 0.0091 | 41.0 | 123 | 0.4184 | 0.75 |
| 0.0116 | 42.0 | 126 | 0.4173 | 0.75 |
| 0.01 | 43.0 | 129 | 0.4163 | 0.75 |
| 0.0098 | 44.0 | 132 | 0.4153 | 0.75 |
| 0.0101 | 45.0 | 135 | 0.4155 | 0.75 |
| 0.0088 | 46.0 | 138 | 0.4149 | 0.75 |
| 0.0087 | 47.0 | 141 | 0.4150 | 0.75 |
| 0.0093 | 48.0 | 144 | 0.4147 | 0.75 |
| 0.0081 | 49.0 | 147 | 0.4147 | 0.75 |
| 0.009 | 50.0 | 150 | 0.4150 | 0.75 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
464 | SetFit/distilbert-base-uncased__sst2__train-8-6 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-8-6
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-8-6
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5336
- Accuracy: 0.7523
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7161 | 1.0 | 3 | 0.6941 | 0.5 |
| 0.6786 | 2.0 | 6 | 0.7039 | 0.25 |
| 0.6586 | 3.0 | 9 | 0.7090 | 0.25 |
| 0.6121 | 4.0 | 12 | 0.7183 | 0.25 |
| 0.5696 | 5.0 | 15 | 0.7266 | 0.25 |
| 0.522 | 6.0 | 18 | 0.7305 | 0.25 |
| 0.4899 | 7.0 | 21 | 0.7339 | 0.25 |
| 0.3985 | 8.0 | 24 | 0.7429 | 0.25 |
| 0.3758 | 9.0 | 27 | 0.7224 | 0.25 |
| 0.2876 | 10.0 | 30 | 0.7068 | 0.5 |
| 0.2498 | 11.0 | 33 | 0.6751 | 0.75 |
| 0.1921 | 12.0 | 36 | 0.6487 | 0.75 |
| 0.1491 | 13.0 | 39 | 0.6261 | 0.75 |
| 0.1276 | 14.0 | 42 | 0.6102 | 0.75 |
| 0.0996 | 15.0 | 45 | 0.5964 | 0.75 |
| 0.073 | 16.0 | 48 | 0.6019 | 0.75 |
| 0.0627 | 17.0 | 51 | 0.5933 | 0.75 |
| 0.053 | 18.0 | 54 | 0.5768 | 0.75 |
| 0.0403 | 19.0 | 57 | 0.5698 | 0.75 |
| 0.0328 | 20.0 | 60 | 0.5656 | 0.75 |
| 0.03 | 21.0 | 63 | 0.5634 | 0.75 |
| 0.025 | 22.0 | 66 | 0.5620 | 0.75 |
| 0.0209 | 23.0 | 69 | 0.5623 | 0.75 |
| 0.0214 | 24.0 | 72 | 0.5606 | 0.75 |
| 0.0191 | 25.0 | 75 | 0.5565 | 0.75 |
| 0.0173 | 26.0 | 78 | 0.5485 | 0.75 |
| 0.0175 | 27.0 | 81 | 0.5397 | 0.75 |
| 0.0132 | 28.0 | 84 | 0.5322 | 0.75 |
| 0.0138 | 29.0 | 87 | 0.5241 | 0.75 |
| 0.0128 | 30.0 | 90 | 0.5235 | 0.75 |
| 0.0126 | 31.0 | 93 | 0.5253 | 0.75 |
| 0.012 | 32.0 | 96 | 0.5317 | 0.75 |
| 0.0118 | 33.0 | 99 | 0.5342 | 0.75 |
| 0.0092 | 34.0 | 102 | 0.5388 | 0.75 |
| 0.0117 | 35.0 | 105 | 0.5414 | 0.75 |
| 0.0124 | 36.0 | 108 | 0.5453 | 0.75 |
| 0.0109 | 37.0 | 111 | 0.5506 | 0.75 |
| 0.0112 | 38.0 | 114 | 0.5555 | 0.75 |
| 0.0087 | 39.0 | 117 | 0.5597 | 0.75 |
| 0.01 | 40.0 | 120 | 0.5640 | 0.75 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
465 | SetFit/distilbert-base-uncased__sst2__train-8-7 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-8-7
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-8-7
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6950
- Accuracy: 0.4618
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7156 | 1.0 | 3 | 0.6965 | 0.25 |
| 0.6645 | 2.0 | 6 | 0.7059 | 0.25 |
| 0.6368 | 3.0 | 9 | 0.7179 | 0.25 |
| 0.5944 | 4.0 | 12 | 0.7408 | 0.25 |
| 0.5369 | 5.0 | 15 | 0.7758 | 0.25 |
| 0.449 | 6.0 | 18 | 0.8009 | 0.25 |
| 0.4352 | 7.0 | 21 | 0.8209 | 0.5 |
| 0.3462 | 8.0 | 24 | 0.8470 | 0.5 |
| 0.3028 | 9.0 | 27 | 0.8579 | 0.5 |
| 0.2365 | 10.0 | 30 | 0.8704 | 0.5 |
| 0.2023 | 11.0 | 33 | 0.8770 | 0.5 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
466 | SetFit/distilbert-base-uncased__sst2__train-8-8 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-8-8
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-8-8
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6925
- Accuracy: 0.5200
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7061 | 1.0 | 3 | 0.6899 | 0.75 |
| 0.6627 | 2.0 | 6 | 0.7026 | 0.25 |
| 0.644 | 3.0 | 9 | 0.7158 | 0.25 |
| 0.6087 | 4.0 | 12 | 0.7325 | 0.25 |
| 0.5602 | 5.0 | 15 | 0.7555 | 0.25 |
| 0.5034 | 6.0 | 18 | 0.7725 | 0.25 |
| 0.4672 | 7.0 | 21 | 0.7983 | 0.25 |
| 0.403 | 8.0 | 24 | 0.8314 | 0.25 |
| 0.3571 | 9.0 | 27 | 0.8555 | 0.25 |
| 0.2792 | 10.0 | 30 | 0.9065 | 0.25 |
| 0.2373 | 11.0 | 33 | 0.9286 | 0.25 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
467 | SetFit/distilbert-base-uncased__sst2__train-8-9 | [
"negative",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst2__train-8-9
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst2__train-8-9
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6925
- Accuracy: 0.5140
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7204 | 1.0 | 3 | 0.7025 | 0.5 |
| 0.6885 | 2.0 | 6 | 0.7145 | 0.5 |
| 0.6662 | 3.0 | 9 | 0.7222 | 0.5 |
| 0.6182 | 4.0 | 12 | 0.7427 | 0.25 |
| 0.5707 | 5.0 | 15 | 0.7773 | 0.25 |
| 0.5247 | 6.0 | 18 | 0.8137 | 0.25 |
| 0.5003 | 7.0 | 21 | 0.8556 | 0.25 |
| 0.4195 | 8.0 | 24 | 0.9089 | 0.5 |
| 0.387 | 9.0 | 27 | 0.9316 | 0.25 |
| 0.2971 | 10.0 | 30 | 0.9558 | 0.25 |
| 0.2581 | 11.0 | 33 | 0.9420 | 0.25 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
468 | SetFit/distilbert-base-uncased__sst5__all-train | [
"negative",
"neutral",
"positive",
"very negative",
"very positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__sst5__all-train
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__sst5__all-train
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3757
- Accuracy: 0.5045
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
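The step counts in the results table below follow directly from the batch size and the size of the training split. As a minimal sketch (the SST-5 train split size of 8,544 sentences is an assumption based on the standard split; the card itself does not state it), 534 steps per epoch arise as:

```python
import math

def steps_per_epoch(num_examples: int, batch_size: int) -> int:
    """Optimizer steps per epoch, assuming no gradient accumulation."""
    return math.ceil(num_examples / batch_size)

# SST-5's standard train split has 8,544 sentences (assumption),
# and the card lists train_batch_size = 16.
print(steps_per_epoch(8544, 16))  # 534, matching the Step column below
```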
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.2492 | 1.0 | 534 | 1.1163 | 0.4991 |
| 0.9937 | 2.0 | 1068 | 1.1232 | 0.5122 |
| 0.7867 | 3.0 | 1602 | 1.2097 | 0.5045 |
| 0.595 | 4.0 | 2136 | 1.3757 | 0.5045 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
469 | SetFit/distilbert-base-uncased__subj__all-train | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__all-train
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__all-train
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3193
- Accuracy: 0.9485
## Model description
More information needed
## Intended uses & limitations
More information needed
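As an illustration of how this binary classifier's output maps to the label list accompanying this entry (the label order, "objective" = 0 and "subjective" = 1, is an assumption drawn from that list, not stated in the card):

```python
import math

# Hypothetical label order inferred from the entry's label list.
LABELS = ["objective", "subjective"]

def predict_label(logits: list) -> str:
    """Softmax over raw logits, then return the highest-probability label."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return LABELS[probs.index(max(probs))]

print(predict_label([-1.2, 2.3]))  # "subjective"
```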
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1992 | 1.0 | 500 | 0.1236 | 0.963 |
| 0.084 | 2.0 | 1000 | 0.1428 | 0.963 |
| 0.0333 | 3.0 | 1500 | 0.1906 | 0.965 |
| 0.0159 | 4.0 | 2000 | 0.3193 | 0.9485 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
470 | SetFit/distilbert-base-uncased__subj__train-8-0 | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__train-8-0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__train-8-0
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4440
- Accuracy: 0.789
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7163 | 1.0 | 3 | 0.6868 | 0.5 |
| 0.6683 | 2.0 | 6 | 0.6804 | 0.75 |
| 0.6375 | 3.0 | 9 | 0.6702 | 0.75 |
| 0.5997 | 4.0 | 12 | 0.6686 | 0.75 |
| 0.5345 | 5.0 | 15 | 0.6720 | 0.75 |
| 0.4673 | 6.0 | 18 | 0.6646 | 0.75 |
| 0.4214 | 7.0 | 21 | 0.6494 | 0.75 |
| 0.3439 | 8.0 | 24 | 0.6313 | 0.75 |
| 0.3157 | 9.0 | 27 | 0.6052 | 0.75 |
| 0.2329 | 10.0 | 30 | 0.5908 | 0.75 |
| 0.1989 | 11.0 | 33 | 0.5768 | 0.75 |
| 0.1581 | 12.0 | 36 | 0.5727 | 0.75 |
| 0.1257 | 13.0 | 39 | 0.5678 | 0.75 |
| 0.1005 | 14.0 | 42 | 0.5518 | 0.75 |
| 0.0836 | 15.0 | 45 | 0.5411 | 0.75 |
| 0.0611 | 16.0 | 48 | 0.5320 | 0.75 |
| 0.0503 | 17.0 | 51 | 0.5299 | 0.75 |
| 0.0407 | 18.0 | 54 | 0.5368 | 0.75 |
| 0.0332 | 19.0 | 57 | 0.5455 | 0.75 |
| 0.0293 | 20.0 | 60 | 0.5525 | 0.75 |
| 0.0254 | 21.0 | 63 | 0.5560 | 0.75 |
| 0.0231 | 22.0 | 66 | 0.5569 | 0.75 |
| 0.0201 | 23.0 | 69 | 0.5572 | 0.75 |
| 0.0179 | 24.0 | 72 | 0.5575 | 0.75 |
| 0.0184 | 25.0 | 75 | 0.5547 | 0.75 |
| 0.0148 | 26.0 | 78 | 0.5493 | 0.75 |
| 0.0149 | 27.0 | 81 | 0.5473 | 0.75 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
471 | SetFit/distilbert-base-uncased__subj__train-8-1 | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__train-8-1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__train-8-1
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5488
- Accuracy: 0.791
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.703 | 1.0 | 3 | 0.6906 | 0.5 |
| 0.666 | 2.0 | 6 | 0.6945 | 0.25 |
| 0.63 | 3.0 | 9 | 0.6885 | 0.5 |
| 0.588 | 4.0 | 12 | 0.6888 | 0.25 |
| 0.5181 | 5.0 | 15 | 0.6899 | 0.25 |
| 0.4508 | 6.0 | 18 | 0.6770 | 0.5 |
| 0.4025 | 7.0 | 21 | 0.6579 | 0.5 |
| 0.3361 | 8.0 | 24 | 0.6392 | 0.5 |
| 0.2919 | 9.0 | 27 | 0.6113 | 0.5 |
| 0.2151 | 10.0 | 30 | 0.5774 | 0.75 |
| 0.1728 | 11.0 | 33 | 0.5248 | 0.75 |
| 0.1313 | 12.0 | 36 | 0.4824 | 0.75 |
| 0.1046 | 13.0 | 39 | 0.4456 | 0.75 |
| 0.0858 | 14.0 | 42 | 0.4076 | 0.75 |
| 0.0679 | 15.0 | 45 | 0.3755 | 0.75 |
| 0.0485 | 16.0 | 48 | 0.3422 | 0.75 |
| 0.0416 | 17.0 | 51 | 0.3055 | 0.75 |
| 0.0358 | 18.0 | 54 | 0.2731 | 1.0 |
| 0.0277 | 19.0 | 57 | 0.2443 | 1.0 |
| 0.0234 | 20.0 | 60 | 0.2187 | 1.0 |
| 0.0223 | 21.0 | 63 | 0.1960 | 1.0 |
| 0.0187 | 22.0 | 66 | 0.1762 | 1.0 |
| 0.017 | 23.0 | 69 | 0.1629 | 1.0 |
| 0.0154 | 24.0 | 72 | 0.1543 | 1.0 |
| 0.0164 | 25.0 | 75 | 0.1476 | 1.0 |
| 0.0131 | 26.0 | 78 | 0.1423 | 1.0 |
| 0.0139 | 27.0 | 81 | 0.1387 | 1.0 |
| 0.0107 | 28.0 | 84 | 0.1360 | 1.0 |
| 0.0108 | 29.0 | 87 | 0.1331 | 1.0 |
| 0.0105 | 30.0 | 90 | 0.1308 | 1.0 |
| 0.0106 | 31.0 | 93 | 0.1276 | 1.0 |
| 0.0104 | 32.0 | 96 | 0.1267 | 1.0 |
| 0.0095 | 33.0 | 99 | 0.1255 | 1.0 |
| 0.0076 | 34.0 | 102 | 0.1243 | 1.0 |
| 0.0094 | 35.0 | 105 | 0.1235 | 1.0 |
| 0.0103 | 36.0 | 108 | 0.1228 | 1.0 |
| 0.0086 | 37.0 | 111 | 0.1231 | 1.0 |
| 0.0094 | 38.0 | 114 | 0.1236 | 1.0 |
| 0.0074 | 39.0 | 117 | 0.1240 | 1.0 |
| 0.0085 | 40.0 | 120 | 0.1246 | 1.0 |
| 0.0079 | 41.0 | 123 | 0.1253 | 1.0 |
| 0.0088 | 42.0 | 126 | 0.1248 | 1.0 |
| 0.0082 | 43.0 | 129 | 0.1244 | 1.0 |
| 0.0082 | 44.0 | 132 | 0.1234 | 1.0 |
| 0.0082 | 45.0 | 135 | 0.1223 | 1.0 |
| 0.0071 | 46.0 | 138 | 0.1212 | 1.0 |
| 0.0073 | 47.0 | 141 | 0.1208 | 1.0 |
| 0.0081 | 48.0 | 144 | 0.1205 | 1.0 |
| 0.0067 | 49.0 | 147 | 0.1202 | 1.0 |
| 0.0077 | 50.0 | 150 | 0.1202 | 1.0 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
472 | SetFit/distilbert-base-uncased__subj__train-8-2 | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__train-8-2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__train-8-2
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3081
- Accuracy: 0.8755
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7146 | 1.0 | 3 | 0.6798 | 0.75 |
| 0.6737 | 2.0 | 6 | 0.6847 | 0.75 |
| 0.6519 | 3.0 | 9 | 0.6783 | 0.75 |
| 0.6105 | 4.0 | 12 | 0.6812 | 0.25 |
| 0.5463 | 5.0 | 15 | 0.6869 | 0.25 |
| 0.4922 | 6.0 | 18 | 0.6837 | 0.5 |
| 0.4543 | 7.0 | 21 | 0.6716 | 0.5 |
| 0.3856 | 8.0 | 24 | 0.6613 | 0.75 |
| 0.3475 | 9.0 | 27 | 0.6282 | 0.75 |
| 0.2717 | 10.0 | 30 | 0.6045 | 0.75 |
| 0.2347 | 11.0 | 33 | 0.5620 | 0.75 |
| 0.1979 | 12.0 | 36 | 0.5234 | 1.0 |
| 0.1535 | 13.0 | 39 | 0.4771 | 1.0 |
| 0.1332 | 14.0 | 42 | 0.4277 | 1.0 |
| 0.1041 | 15.0 | 45 | 0.3785 | 1.0 |
| 0.082 | 16.0 | 48 | 0.3318 | 1.0 |
| 0.0672 | 17.0 | 51 | 0.2885 | 1.0 |
| 0.0538 | 18.0 | 54 | 0.2568 | 1.0 |
| 0.0412 | 19.0 | 57 | 0.2356 | 1.0 |
| 0.0361 | 20.0 | 60 | 0.2217 | 1.0 |
| 0.0303 | 21.0 | 63 | 0.2125 | 1.0 |
| 0.0268 | 22.0 | 66 | 0.2060 | 1.0 |
| 0.0229 | 23.0 | 69 | 0.2015 | 1.0 |
| 0.0215 | 24.0 | 72 | 0.1989 | 1.0 |
| 0.0211 | 25.0 | 75 | 0.1969 | 1.0 |
| 0.0172 | 26.0 | 78 | 0.1953 | 1.0 |
| 0.0165 | 27.0 | 81 | 0.1935 | 1.0 |
| 0.0132 | 28.0 | 84 | 0.1923 | 1.0 |
| 0.0146 | 29.0 | 87 | 0.1914 | 1.0 |
| 0.0125 | 30.0 | 90 | 0.1904 | 1.0 |
| 0.0119 | 31.0 | 93 | 0.1897 | 1.0 |
| 0.0122 | 32.0 | 96 | 0.1886 | 1.0 |
| 0.0118 | 33.0 | 99 | 0.1875 | 1.0 |
| 0.0097 | 34.0 | 102 | 0.1866 | 1.0 |
| 0.0111 | 35.0 | 105 | 0.1861 | 1.0 |
| 0.0111 | 36.0 | 108 | 0.1855 | 1.0 |
| 0.0102 | 37.0 | 111 | 0.1851 | 1.0 |
| 0.0109 | 38.0 | 114 | 0.1851 | 1.0 |
| 0.0085 | 39.0 | 117 | 0.1854 | 1.0 |
| 0.0089 | 40.0 | 120 | 0.1855 | 1.0 |
| 0.0092 | 41.0 | 123 | 0.1863 | 1.0 |
| 0.0105 | 42.0 | 126 | 0.1868 | 1.0 |
| 0.0089 | 43.0 | 129 | 0.1874 | 1.0 |
| 0.0091 | 44.0 | 132 | 0.1877 | 1.0 |
| 0.0096 | 45.0 | 135 | 0.1881 | 1.0 |
| 0.0081 | 46.0 | 138 | 0.1881 | 1.0 |
| 0.0086 | 47.0 | 141 | 0.1883 | 1.0 |
| 0.009 | 48.0 | 144 | 0.1884 | 1.0 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
473 | SetFit/distilbert-base-uncased__subj__train-8-3 | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__train-8-3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__train-8-3
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3496
- Accuracy: 0.859
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7136 | 1.0 | 3 | 0.6875 | 0.75 |
| 0.6702 | 2.0 | 6 | 0.6824 | 0.75 |
| 0.6456 | 3.0 | 9 | 0.6687 | 0.75 |
| 0.5934 | 4.0 | 12 | 0.6564 | 0.75 |
| 0.537 | 5.0 | 15 | 0.6428 | 0.75 |
| 0.4812 | 6.0 | 18 | 0.6180 | 0.75 |
| 0.4279 | 7.0 | 21 | 0.5864 | 0.75 |
| 0.3608 | 8.0 | 24 | 0.5540 | 0.75 |
| 0.3076 | 9.0 | 27 | 0.5012 | 1.0 |
| 0.2292 | 10.0 | 30 | 0.4497 | 1.0 |
| 0.1991 | 11.0 | 33 | 0.3945 | 1.0 |
| 0.1495 | 12.0 | 36 | 0.3483 | 1.0 |
| 0.1176 | 13.0 | 39 | 0.3061 | 1.0 |
| 0.0947 | 14.0 | 42 | 0.2683 | 1.0 |
| 0.0761 | 15.0 | 45 | 0.2295 | 1.0 |
| 0.0584 | 16.0 | 48 | 0.1996 | 1.0 |
| 0.0451 | 17.0 | 51 | 0.1739 | 1.0 |
| 0.0387 | 18.0 | 54 | 0.1521 | 1.0 |
| 0.0272 | 19.0 | 57 | 0.1333 | 1.0 |
| 0.0247 | 20.0 | 60 | 0.1171 | 1.0 |
| 0.0243 | 21.0 | 63 | 0.1044 | 1.0 |
| 0.0206 | 22.0 | 66 | 0.0943 | 1.0 |
| 0.0175 | 23.0 | 69 | 0.0859 | 1.0 |
| 0.0169 | 24.0 | 72 | 0.0799 | 1.0 |
| 0.0162 | 25.0 | 75 | 0.0746 | 1.0 |
| 0.0137 | 26.0 | 78 | 0.0705 | 1.0 |
| 0.0141 | 27.0 | 81 | 0.0674 | 1.0 |
| 0.0107 | 28.0 | 84 | 0.0654 | 1.0 |
| 0.0117 | 29.0 | 87 | 0.0634 | 1.0 |
| 0.0113 | 30.0 | 90 | 0.0617 | 1.0 |
| 0.0107 | 31.0 | 93 | 0.0599 | 1.0 |
| 0.0106 | 32.0 | 96 | 0.0585 | 1.0 |
| 0.0101 | 33.0 | 99 | 0.0568 | 1.0 |
| 0.0084 | 34.0 | 102 | 0.0553 | 1.0 |
| 0.0101 | 35.0 | 105 | 0.0539 | 1.0 |
| 0.0102 | 36.0 | 108 | 0.0529 | 1.0 |
| 0.009 | 37.0 | 111 | 0.0520 | 1.0 |
| 0.0092 | 38.0 | 114 | 0.0511 | 1.0 |
| 0.0073 | 39.0 | 117 | 0.0504 | 1.0 |
| 0.0081 | 40.0 | 120 | 0.0497 | 1.0 |
| 0.0079 | 41.0 | 123 | 0.0492 | 1.0 |
| 0.0092 | 42.0 | 126 | 0.0488 | 1.0 |
| 0.008 | 43.0 | 129 | 0.0483 | 1.0 |
| 0.0087 | 44.0 | 132 | 0.0479 | 1.0 |
| 0.009 | 45.0 | 135 | 0.0474 | 1.0 |
| 0.0076 | 46.0 | 138 | 0.0470 | 1.0 |
| 0.0075 | 47.0 | 141 | 0.0467 | 1.0 |
| 0.008 | 48.0 | 144 | 0.0465 | 1.0 |
| 0.0069 | 49.0 | 147 | 0.0464 | 1.0 |
| 0.0077 | 50.0 | 150 | 0.0464 | 1.0 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
474 | SetFit/distilbert-base-uncased__subj__train-8-4 | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__train-8-4
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__train-8-4
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3305
- Accuracy: 0.8565
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6991 | 1.0 | 3 | 0.6772 | 0.75 |
| 0.6707 | 2.0 | 6 | 0.6704 | 0.75 |
| 0.6402 | 3.0 | 9 | 0.6608 | 1.0 |
| 0.5789 | 4.0 | 12 | 0.6547 | 0.75 |
| 0.5211 | 5.0 | 15 | 0.6434 | 0.75 |
| 0.454 | 6.0 | 18 | 0.6102 | 1.0 |
| 0.4187 | 7.0 | 21 | 0.5701 | 1.0 |
| 0.3401 | 8.0 | 24 | 0.5289 | 1.0 |
| 0.3107 | 9.0 | 27 | 0.4737 | 1.0 |
| 0.2381 | 10.0 | 30 | 0.4255 | 1.0 |
| 0.1982 | 11.0 | 33 | 0.3685 | 1.0 |
| 0.1631 | 12.0 | 36 | 0.3200 | 1.0 |
| 0.1234 | 13.0 | 39 | 0.2798 | 1.0 |
| 0.0993 | 14.0 | 42 | 0.2455 | 1.0 |
| 0.0781 | 15.0 | 45 | 0.2135 | 1.0 |
| 0.0586 | 16.0 | 48 | 0.1891 | 1.0 |
| 0.0513 | 17.0 | 51 | 0.1671 | 1.0 |
| 0.043 | 18.0 | 54 | 0.1427 | 1.0 |
| 0.0307 | 19.0 | 57 | 0.1225 | 1.0 |
| 0.0273 | 20.0 | 60 | 0.1060 | 1.0 |
| 0.0266 | 21.0 | 63 | 0.0920 | 1.0 |
| 0.0233 | 22.0 | 66 | 0.0823 | 1.0 |
| 0.0185 | 23.0 | 69 | 0.0751 | 1.0 |
| 0.0173 | 24.0 | 72 | 0.0698 | 1.0 |
| 0.0172 | 25.0 | 75 | 0.0651 | 1.0 |
| 0.0142 | 26.0 | 78 | 0.0613 | 1.0 |
| 0.0151 | 27.0 | 81 | 0.0583 | 1.0 |
| 0.0117 | 28.0 | 84 | 0.0563 | 1.0 |
| 0.0123 | 29.0 | 87 | 0.0546 | 1.0 |
| 0.0121 | 30.0 | 90 | 0.0531 | 1.0 |
| 0.0123 | 31.0 | 93 | 0.0511 | 1.0 |
| 0.0112 | 32.0 | 96 | 0.0496 | 1.0 |
| 0.0103 | 33.0 | 99 | 0.0481 | 1.0 |
| 0.0086 | 34.0 | 102 | 0.0468 | 1.0 |
| 0.0096 | 35.0 | 105 | 0.0457 | 1.0 |
| 0.0107 | 36.0 | 108 | 0.0447 | 1.0 |
| 0.0095 | 37.0 | 111 | 0.0439 | 1.0 |
| 0.0102 | 38.0 | 114 | 0.0429 | 1.0 |
| 0.0077 | 39.0 | 117 | 0.0422 | 1.0 |
| 0.0092 | 40.0 | 120 | 0.0415 | 1.0 |
| 0.0083 | 41.0 | 123 | 0.0409 | 1.0 |
| 0.0094 | 42.0 | 126 | 0.0404 | 1.0 |
| 0.0084 | 43.0 | 129 | 0.0400 | 1.0 |
| 0.0085 | 44.0 | 132 | 0.0396 | 1.0 |
| 0.0092 | 45.0 | 135 | 0.0392 | 1.0 |
| 0.0076 | 46.0 | 138 | 0.0389 | 1.0 |
| 0.0073 | 47.0 | 141 | 0.0388 | 1.0 |
| 0.0085 | 48.0 | 144 | 0.0387 | 1.0 |
| 0.0071 | 49.0 | 147 | 0.0386 | 1.0 |
| 0.0079 | 50.0 | 150 | 0.0386 | 1.0 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
475 | SetFit/distilbert-base-uncased__subj__train-8-5 | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__train-8-5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__train-8-5
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6927
- Accuracy: 0.506
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
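With `lr_scheduler_type: linear`, the learning rate decays from 2e-05 toward zero over the full run. A minimal sketch of that schedule (assuming zero warmup steps, the Trainer default when none are specified):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linearly decay base_lr to 0 over total_steps, with no warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# This run: 50 epochs x 3 steps/epoch = 150 total optimizer steps.
print(linear_lr(75, 150))  # halfway through training: 1e-05
```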
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7102 | 1.0 | 3 | 0.6790 | 0.75 |
| 0.6693 | 2.0 | 6 | 0.6831 | 0.75 |
| 0.6438 | 3.0 | 9 | 0.6876 | 0.75 |
| 0.6047 | 4.0 | 12 | 0.6970 | 0.75 |
| 0.547 | 5.0 | 15 | 0.7065 | 0.75 |
| 0.4885 | 6.0 | 18 | 0.7114 | 0.75 |
| 0.4601 | 7.0 | 21 | 0.7147 | 0.5 |
| 0.4017 | 8.0 | 24 | 0.7178 | 0.5 |
| 0.3474 | 9.0 | 27 | 0.7145 | 0.5 |
| 0.2624 | 10.0 | 30 | 0.7153 | 0.5 |
| 0.2175 | 11.0 | 33 | 0.7158 | 0.5 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
476 | SetFit/distilbert-base-uncased__subj__train-8-6 | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__train-8-6
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__train-8-6
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6075
- Accuracy: 0.7485
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7163 | 1.0 | 3 | 0.6923 | 0.5 |
| 0.6648 | 2.0 | 6 | 0.6838 | 0.5 |
| 0.6329 | 3.0 | 9 | 0.6747 | 0.75 |
| 0.5836 | 4.0 | 12 | 0.6693 | 0.5 |
| 0.5287 | 5.0 | 15 | 0.6670 | 0.25 |
| 0.4585 | 6.0 | 18 | 0.6517 | 0.5 |
| 0.415 | 7.0 | 21 | 0.6290 | 0.5 |
| 0.3353 | 8.0 | 24 | 0.6019 | 0.5 |
| 0.2841 | 9.0 | 27 | 0.5613 | 0.75 |
| 0.2203 | 10.0 | 30 | 0.5222 | 1.0 |
| 0.1743 | 11.0 | 33 | 0.4769 | 1.0 |
| 0.1444 | 12.0 | 36 | 0.4597 | 1.0 |
| 0.1079 | 13.0 | 39 | 0.4462 | 1.0 |
| 0.0891 | 14.0 | 42 | 0.4216 | 1.0 |
| 0.0704 | 15.0 | 45 | 0.3880 | 1.0 |
| 0.0505 | 16.0 | 48 | 0.3663 | 1.0 |
| 0.0428 | 17.0 | 51 | 0.3536 | 1.0 |
| 0.0356 | 18.0 | 54 | 0.3490 | 1.0 |
| 0.0283 | 19.0 | 57 | 0.3531 | 1.0 |
| 0.025 | 20.0 | 60 | 0.3595 | 1.0 |
| 0.0239 | 21.0 | 63 | 0.3594 | 1.0 |
| 0.0202 | 22.0 | 66 | 0.3521 | 1.0 |
| 0.0168 | 23.0 | 69 | 0.3475 | 1.0 |
| 0.0159 | 24.0 | 72 | 0.3458 | 1.0 |
| 0.0164 | 25.0 | 75 | 0.3409 | 1.0 |
| 0.0132 | 26.0 | 78 | 0.3360 | 1.0 |
| 0.0137 | 27.0 | 81 | 0.3302 | 1.0 |
| 0.0112 | 28.0 | 84 | 0.3235 | 1.0 |
| 0.0113 | 29.0 | 87 | 0.3178 | 1.0 |
| 0.0111 | 30.0 | 90 | 0.3159 | 1.0 |
| 0.0113 | 31.0 | 93 | 0.3108 | 1.0 |
| 0.0107 | 32.0 | 96 | 0.3101 | 1.0 |
| 0.0101 | 33.0 | 99 | 0.3100 | 1.0 |
| 0.0083 | 34.0 | 102 | 0.3110 | 1.0 |
| 0.0092 | 35.0 | 105 | 0.3117 | 1.0 |
| 0.0102 | 36.0 | 108 | 0.3104 | 1.0 |
| 0.0086 | 37.0 | 111 | 0.3086 | 1.0 |
| 0.0092 | 38.0 | 114 | 0.3047 | 1.0 |
| 0.0072 | 39.0 | 117 | 0.3024 | 1.0 |
| 0.0079 | 40.0 | 120 | 0.3014 | 1.0 |
| 0.0079 | 41.0 | 123 | 0.2983 | 1.0 |
| 0.0091 | 42.0 | 126 | 0.2948 | 1.0 |
| 0.0077 | 43.0 | 129 | 0.2915 | 1.0 |
| 0.0085 | 44.0 | 132 | 0.2890 | 1.0 |
| 0.009 | 45.0 | 135 | 0.2870 | 1.0 |
| 0.0073 | 46.0 | 138 | 0.2856 | 1.0 |
| 0.0073 | 47.0 | 141 | 0.2844 | 1.0 |
| 0.0076 | 48.0 | 144 | 0.2841 | 1.0 |
| 0.0065 | 49.0 | 147 | 0.2836 | 1.0 |
| 0.0081 | 50.0 | 150 | 0.2835 | 1.0 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
477 | SetFit/distilbert-base-uncased__subj__train-8-7 | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__train-8-7
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__train-8-7
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2766
- Accuracy: 0.8845
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7044 | 1.0 | 3 | 0.6909 | 0.5 |
| 0.6678 | 2.0 | 6 | 0.6901 | 0.5 |
| 0.6336 | 3.0 | 9 | 0.6807 | 0.5 |
| 0.5926 | 4.0 | 12 | 0.6726 | 0.5 |
| 0.5221 | 5.0 | 15 | 0.6648 | 0.5 |
| 0.4573 | 6.0 | 18 | 0.6470 | 0.5 |
| 0.4177 | 7.0 | 21 | 0.6251 | 0.5 |
| 0.3252 | 8.0 | 24 | 0.5994 | 0.5 |
| 0.2831 | 9.0 | 27 | 0.5529 | 0.5 |
| 0.213 | 10.0 | 30 | 0.5078 | 0.75 |
| 0.1808 | 11.0 | 33 | 0.4521 | 1.0 |
| 0.1355 | 12.0 | 36 | 0.3996 | 1.0 |
| 0.1027 | 13.0 | 39 | 0.3557 | 1.0 |
| 0.0862 | 14.0 | 42 | 0.3121 | 1.0 |
| 0.0682 | 15.0 | 45 | 0.2828 | 1.0 |
| 0.0517 | 16.0 | 48 | 0.2603 | 1.0 |
| 0.0466 | 17.0 | 51 | 0.2412 | 1.0 |
| 0.038 | 18.0 | 54 | 0.2241 | 1.0 |
| 0.0276 | 19.0 | 57 | 0.2096 | 1.0 |
| 0.0246 | 20.0 | 60 | 0.1969 | 1.0 |
| 0.0249 | 21.0 | 63 | 0.1859 | 1.0 |
| 0.0201 | 22.0 | 66 | 0.1770 | 1.0 |
| 0.018 | 23.0 | 69 | 0.1703 | 1.0 |
| 0.0164 | 24.0 | 72 | 0.1670 | 1.0 |
| 0.0172 | 25.0 | 75 | 0.1639 | 1.0 |
| 0.0135 | 26.0 | 78 | 0.1604 | 1.0 |
| 0.014 | 27.0 | 81 | 0.1585 | 1.0 |
| 0.0108 | 28.0 | 84 | 0.1569 | 1.0 |
| 0.0116 | 29.0 | 87 | 0.1549 | 1.0 |
| 0.0111 | 30.0 | 90 | 0.1532 | 1.0 |
| 0.0113 | 31.0 | 93 | 0.1513 | 1.0 |
| 0.0104 | 32.0 | 96 | 0.1503 | 1.0 |
| 0.01 | 33.0 | 99 | 0.1490 | 1.0 |
| 0.0079 | 34.0 | 102 | 0.1479 | 1.0 |
| 0.0097 | 35.0 | 105 | 0.1466 | 1.0 |
| 0.0112 | 36.0 | 108 | 0.1458 | 1.0 |
| 0.0091 | 37.0 | 111 | 0.1457 | 1.0 |
| 0.0098 | 38.0 | 114 | 0.1454 | 1.0 |
| 0.0076 | 39.0 | 117 | 0.1451 | 1.0 |
| 0.0085 | 40.0 | 120 | 0.1448 | 1.0 |
| 0.0079 | 41.0 | 123 | 0.1445 | 1.0 |
| 0.0096 | 42.0 | 126 | 0.1440 | 1.0 |
| 0.0081 | 43.0 | 129 | 0.1430 | 1.0 |
| 0.0083 | 44.0 | 132 | 0.1424 | 1.0 |
| 0.0088 | 45.0 | 135 | 0.1418 | 1.0 |
| 0.0077 | 46.0 | 138 | 0.1414 | 1.0 |
| 0.0073 | 47.0 | 141 | 0.1413 | 1.0 |
| 0.0084 | 48.0 | 144 | 0.1412 | 1.0 |
| 0.0072 | 49.0 | 147 | 0.1411 | 1.0 |
| 0.0077 | 50.0 | 150 | 0.1411 | 1.0 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
478 | SetFit/distilbert-base-uncased__subj__train-8-8 | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__train-8-8
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__train-8-8
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3160
- Accuracy: 0.8735
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7187 | 1.0 | 3 | 0.6776 | 1.0 |
| 0.684 | 2.0 | 6 | 0.6608 | 1.0 |
| 0.6532 | 3.0 | 9 | 0.6364 | 1.0 |
| 0.5996 | 4.0 | 12 | 0.6119 | 1.0 |
| 0.5242 | 5.0 | 15 | 0.5806 | 1.0 |
| 0.4612 | 6.0 | 18 | 0.5320 | 1.0 |
| 0.4192 | 7.0 | 21 | 0.4714 | 1.0 |
| 0.3274 | 8.0 | 24 | 0.4071 | 1.0 |
| 0.2871 | 9.0 | 27 | 0.3378 | 1.0 |
| 0.2082 | 10.0 | 30 | 0.2822 | 1.0 |
| 0.1692 | 11.0 | 33 | 0.2271 | 1.0 |
| 0.1242 | 12.0 | 36 | 0.1793 | 1.0 |
| 0.0977 | 13.0 | 39 | 0.1417 | 1.0 |
| 0.0776 | 14.0 | 42 | 0.1117 | 1.0 |
| 0.0631 | 15.0 | 45 | 0.0894 | 1.0 |
| 0.0453 | 16.0 | 48 | 0.0733 | 1.0 |
| 0.0399 | 17.0 | 51 | 0.0617 | 1.0 |
| 0.0333 | 18.0 | 54 | 0.0528 | 1.0 |
| 0.0266 | 19.0 | 57 | 0.0454 | 1.0 |
| 0.0234 | 20.0 | 60 | 0.0393 | 1.0 |
| 0.0223 | 21.0 | 63 | 0.0345 | 1.0 |
| 0.0195 | 22.0 | 66 | 0.0309 | 1.0 |
| 0.0161 | 23.0 | 69 | 0.0281 | 1.0 |
| 0.0167 | 24.0 | 72 | 0.0260 | 1.0 |
| 0.0163 | 25.0 | 75 | 0.0242 | 1.0 |
| 0.0134 | 26.0 | 78 | 0.0227 | 1.0 |
| 0.0128 | 27.0 | 81 | 0.0214 | 1.0 |
| 0.0101 | 28.0 | 84 | 0.0204 | 1.0 |
| 0.0109 | 29.0 | 87 | 0.0194 | 1.0 |
| 0.0112 | 30.0 | 90 | 0.0186 | 1.0 |
| 0.0108 | 31.0 | 93 | 0.0179 | 1.0 |
| 0.011 | 32.0 | 96 | 0.0174 | 1.0 |
| 0.0099 | 33.0 | 99 | 0.0169 | 1.0 |
| 0.0083 | 34.0 | 102 | 0.0164 | 1.0 |
| 0.0096 | 35.0 | 105 | 0.0160 | 1.0 |
| 0.01 | 36.0 | 108 | 0.0156 | 1.0 |
| 0.0084 | 37.0 | 111 | 0.0152 | 1.0 |
| 0.0089 | 38.0 | 114 | 0.0149 | 1.0 |
| 0.0073 | 39.0 | 117 | 0.0146 | 1.0 |
| 0.0082 | 40.0 | 120 | 0.0143 | 1.0 |
| 0.008 | 41.0 | 123 | 0.0141 | 1.0 |
| 0.0093 | 42.0 | 126 | 0.0139 | 1.0 |
| 0.0078 | 43.0 | 129 | 0.0138 | 1.0 |
| 0.0086 | 44.0 | 132 | 0.0136 | 1.0 |
| 0.009 | 45.0 | 135 | 0.0135 | 1.0 |
| 0.0072 | 46.0 | 138 | 0.0134 | 1.0 |
| 0.0075 | 47.0 | 141 | 0.0133 | 1.0 |
| 0.0082 | 48.0 | 144 | 0.0133 | 1.0 |
| 0.0068 | 49.0 | 147 | 0.0132 | 1.0 |
| 0.0074 | 50.0 | 150 | 0.0132 | 1.0 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
479 | SetFit/distilbert-base-uncased__subj__train-8-9 | [
"objective",
"subjective"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased__subj__train-8-9
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased__subj__train-8-9
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4865
- Accuracy: 0.778
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7024 | 1.0 | 3 | 0.6843 | 0.75 |
| 0.67 | 2.0 | 6 | 0.6807 | 0.5 |
| 0.6371 | 3.0 | 9 | 0.6677 | 0.5 |
| 0.585 | 4.0 | 12 | 0.6649 | 0.5 |
| 0.5122 | 5.0 | 15 | 0.6707 | 0.5 |
| 0.4379 | 6.0 | 18 | 0.6660 | 0.5 |
| 0.4035 | 7.0 | 21 | 0.6666 | 0.5 |
| 0.323 | 8.0 | 24 | 0.6672 | 0.5 |
| 0.2841 | 9.0 | 27 | 0.6534 | 0.5 |
| 0.21 | 10.0 | 30 | 0.6456 | 0.5 |
| 0.1735 | 11.0 | 33 | 0.6325 | 0.5 |
| 0.133 | 12.0 | 36 | 0.6214 | 0.5 |
| 0.0986 | 13.0 | 39 | 0.6351 | 0.5 |
| 0.081 | 14.0 | 42 | 0.6495 | 0.5 |
| 0.0638 | 15.0 | 45 | 0.6671 | 0.5 |
| 0.0449 | 16.0 | 48 | 0.7156 | 0.5 |
| 0.0399 | 17.0 | 51 | 0.7608 | 0.5 |
| 0.0314 | 18.0 | 54 | 0.7796 | 0.5 |
| 0.0243 | 19.0 | 57 | 0.7789 | 0.5 |
| 0.0227 | 20.0 | 60 | 0.7684 | 0.5 |
| 0.0221 | 21.0 | 63 | 0.7628 | 0.5 |
| 0.0192 | 22.0 | 66 | 0.7728 | 0.5 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2
- Tokenizers 0.10.3
|
480 | SharanSMenon/22-languages-bert-base-cased | [
"Arabic",
"Chinese",
"Latin",
"Persian",
"Portugese",
"Pushto",
"Romanian",
"Russian",
"Spanish",
"Swedish",
"Tamil",
"Thai",
"Dutch",
"Turkish",
"Urdu",
"English",
"Estonian",
"French",
"Hindi",
"Indonesian",
"Japanese",
"Korean"
] | ---
metrics:
- accuracy
widget:
- text: "In war resolution, in defeat defiance, in victory magnanimity"
- text: "en la guerra resolución en la derrota desafío en la victoria magnanimidad"
---
[](https://colab.research.google.com/drive/1dqeUwS_DZ-urrmYzB29nTCBUltwJxhbh?usp=sharing)
# 22 Language Identifier - BERT
This model is trained to identify the following 22 different languages.
- Arabic
- Chinese
- Dutch
- English
- Estonian
- French
- Hindi
- Indonesian
- Japanese
- Korean
- Latin
- Persian
- Portugese
- Pushto
- Romanian
- Russian
- Spanish
- Swedish
- Tamil
- Thai
- Turkish
- Urdu
## Loading the model
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("SharanSMenon/22-languages-bert-base-cased")
model = AutoModelForSequenceClassification.from_pretrained("SharanSMenon/22-languages-bert-base-cased")
```
## Inference
```python
def predict(sentence):
    tokenized = tokenizer(sentence, return_tensors="pt")
    outputs = model(**tokenized)
    return model.config.id2label[outputs.logits.argmax(dim=1).item()]
```
### Examples
```python
sentence1 = "in war resolution, in defeat defiance, in victory magnanimity"
predict(sentence1) # English
sentence2 = "en la guerra resolución en la derrota desafío en la victoria magnanimidad"
predict(sentence2) # Spanish
sentence3 = "هذا هو أعظم إله على الإطلاق"
predict(sentence3) # Arabic
``` |
482 | Shuvam/autonlp-college_classification-164469 | [
"0",
"1"
] | ---
tags: autonlp
language: en
widget:
- text: "I love AutoNLP 🤗"
datasets:
- Shuvam/autonlp-data-college_classification
---
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 164469
## Validation Metrics
- Loss: 0.05527503043413162
- Accuracy: 0.9853049228508449
- Precision: 0.991044776119403
- Recall: 0.9793510324483776
- AUC: 0.9966895139869654
- F1: 0.9851632047477745
## Usage
You can use cURL to access this model:
```bash
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/Shuvam/autonlp-college_classification-164469
```
Or Python API:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("Shuvam/autonlp-college_classification-164469", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("Shuvam/autonlp-college_classification-164469", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
``` |
483 | s-nlp/roberta-base-formality-ranker | [
"formal",
"informal"
] | ---
language:
- en
tags:
- formality
datasets:
- GYAFC
- Pavlick-Tetreault-2016
---
The model has been trained to predict, for English sentences, whether they are formal or informal.
Base model: `roberta-base`
Datasets: [GYAFC](https://github.com/raosudha89/GYAFC-corpus) from [Rao and Tetreault, 2018](https://aclanthology.org/N18-1012) and [online formality corpus](http://www.seas.upenn.edu/~nlp/resources/formality-corpus.tgz) from [Pavlick and Tetreault, 2016](https://aclanthology.org/Q16-1005).
Data augmentation: changing texts to upper or lower case; removing all punctuation, adding dot at the end of a sentence. It was applied because otherwise the model is over-reliant on punctuation and capitalization and does not pay enough attention to other features.
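As an illustrative (unofficial) sketch, the case/punctuation augmentations described above might look like this; the function name and the three-way choice are assumptions, not the actual training code:

```python
import random
import string

def augment_case_and_punct(text: str, rng: random.Random) -> str:
    """Apply one of the augmentations described above: upper-casing,
    lower-casing, or stripping punctuation and ending with a dot."""
    choice = rng.randrange(3)
    if choice == 0:
        return text.upper()
    if choice == 1:
        return text.lower()
    # remove all punctuation, then add a dot at the end of the sentence
    stripped = text.translate(str.maketrans("", "", string.punctuation))
    return stripped.rstrip() + "."
```

Training on such variants pushes the model to rely on lexical and syntactic cues rather than on surface formatting.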
Loss: binary classification (on GYAFC), in-batch ranking (on PT data).
Performance metrics on the test data:
| dataset | ROC AUC | precision | recall | fscore | accuracy | Spearman |
|----------------------------------------------|---------|-----------|--------|--------|----------|------------|
| GYAFC | 0.9779 | 0.90 | 0.91 | 0.90 | 0.9087 | 0.8233 |
| GYAFC normalized (lowercase + remove punct.) | 0.9234 | 0.85 | 0.81 | 0.82 | 0.8218 | 0.7294 |
| P&T subset | Spearman R |
|------------|------------|
| news       | 0.4003     |
| answers    | 0.7500     |
| blog       | 0.7334     |
| email      | 0.7606     |
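The card does not include a usage snippet; the following is a minimal sketch, assuming the checkpoint exposes the standard `text-classification` pipeline interface with `formal`/`informal` labels (the example sentence and the `top_k=None` behaviour are assumptions):

```python
def top_label(predictions):
    """Pick the highest-scoring label from a pipeline output such as
    [{"label": "formal", "score": 0.98}, {"label": "informal", "score": 0.02}]."""
    return max(predictions, key=lambda p: p["score"])["label"]

if __name__ == "__main__":
    from transformers import pipeline
    classifier = pipeline("text-classification", model="s-nlp/roberta-base-formality-ranker")
    # top_k=None asks recent transformers versions to return scores for all labels
    preds = classifier("I would be most grateful for your reply.", top_k=None)
    print(top_label(preds))
```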
|
485 | s-nlp/roberta_toxicity_classifier | [
"neutral",
"toxic"
] | ---
language:
- en
tags:
- toxic comments classification
licenses:
- cc-by-nc-sa
---
## Toxicity Classification Model
This model is trained for toxicity classification task. The dataset used for training is the merge of the English parts of the three datasets by **Jigsaw** ([Jigsaw 2018](https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge), [Jigsaw 2019](https://www.kaggle.com/c/jigsaw-unintended-bias-in-toxicity-classification), [Jigsaw 2020](https://www.kaggle.com/c/jigsaw-multilingual-toxic-comment-classification)), containing around 2 million examples. We split it into two parts and fine-tune a RoBERTa model ([RoBERTa: A Robustly Optimized BERT Pretraining Approach](https://arxiv.org/abs/1907.11692)) on it. The classifiers perform closely on the test set of the first Jigsaw competition, reaching the **AUC-ROC** of 0.98 and **F1-score** of 0.76.
## How to use
```python
from transformers import RobertaTokenizer, RobertaForSequenceClassification
# load tokenizer and model weights
tokenizer = RobertaTokenizer.from_pretrained('SkolkovoInstitute/roberta_toxicity_classifier')
model = RobertaForSequenceClassification.from_pretrained('SkolkovoInstitute/roberta_toxicity_classifier')
# prepare the input
batch = tokenizer.encode('you are amazing', return_tensors='pt')
# inference
model(batch)
```
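The call above returns raw logits; a hedged follow-up sketch showing one way to turn them into label probabilities (the softmax helper is ours, not part of the card):

```python
import math

def logits_to_label_probs(logits, id2label):
    """Numerically stable softmax over raw logits, keyed by label name."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return {id2label[i]: e / total for i, e in enumerate(exps)}

if __name__ == "__main__":
    from transformers import RobertaTokenizer, RobertaForSequenceClassification
    tokenizer = RobertaTokenizer.from_pretrained('SkolkovoInstitute/roberta_toxicity_classifier')
    model = RobertaForSequenceClassification.from_pretrained('SkolkovoInstitute/roberta_toxicity_classifier')
    batch = tokenizer.encode('you are amazing', return_tensors='pt')
    logits = model(batch).logits[0].tolist()
    print(logits_to_label_probs(logits, model.config.id2label))
```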
## Licensing Information
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png |
487 | s-nlp/rubert-base-corruption-detector | [
"unnatural",
"natural"
] | ---
language:
- ru
tags:
- fluency
---
This is a model for evaluation of naturalness of short Russian texts. It has been trained to distinguish human-written texts from their corrupted versions.
Corruption sources: random replacement, deletion, addition, shuffling, and re-inflection of words and characters; random changes of capitalization; round-trip translation; filling random gaps with T5 and RoBERTa models. For each original text, we sampled three corrupted texts, so the model is uniformly biased towards the `unnatural` label.
Data sources: web-corpora from [the Leipzig collection](https://wortschatz.uni-leipzig.de/en/download) (`rus_news_2020_100K`, `rus_newscrawl-public_2018_100K`, `rus-ru_web-public_2019_100K`, `rus_wikipedia_2021_100K`), comments from [OK](https://www.kaggle.com/alexandersemiletov/toxic-russian-comments) and [Pikabu](https://www.kaggle.com/blackmoon/russian-language-toxic-comments).
On our private test dataset, the model achieved a 40% rank correlation with human judgements of naturalness, which is higher than that of GPT perplexity, another popular fluency metric.
488 | s-nlp/russian_toxicity_classifier | [
"neutral",
"toxic"
] | ---
language:
- ru
tags:
- toxic comments classification
licenses:
- cc-by-nc-sa
---
BERT-based classifier (fine-tuned from [Conversational Rubert](https://huggingface.co/DeepPavlov/rubert-base-cased-conversational)) trained on a merge of the Russian Language Toxic Comments [dataset](https://www.kaggle.com/blackmoon/russian-language-toxic-comments/metadata) collected from 2ch.hk and the Toxic Russian Comments [dataset](https://www.kaggle.com/alexandersemiletov/toxic-russian-comments) collected from ok.ru.
The datasets were merged, shuffled, and split into train, dev, test splits in 80-10-10 proportion.
The metrics obtained on the test dataset are as follows:
| | precision | recall | f1-score | support |
|:------------:|:---------:|:------:|:--------:|:-------:|
| 0 | 0.98 | 0.99 | 0.98 | 21384 |
| 1 | 0.94 | 0.92 | 0.93 | 4886 |
| accuracy | | | 0.97 | 26270|
| macro avg | 0.96 | 0.96 | 0.96 | 26270 |
| weighted avg | 0.97 | 0.97 | 0.97 | 26270 |
## How to use
```python
from transformers import BertTokenizer, BertForSequenceClassification
# load tokenizer and model weights
tokenizer = BertTokenizer.from_pretrained('SkolkovoInstitute/russian_toxicity_classifier')
model = BertForSequenceClassification.from_pretrained('SkolkovoInstitute/russian_toxicity_classifier')
# prepare the input
batch = tokenizer.encode('ты супер', return_tensors='pt')
# inference
model(batch)
```
## Licensing Information
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png |
489 | s-nlp/xlmr_formality_classifier | [
"formal",
"informal"
] | ---
language:
- en
- fr
- it
- pt
tags:
- formal or informal classification
licenses:
- cc-by-nc-sa
---
XLM-RoBERTa-based classifier trained on XFORMAL.
all
| | precision | recall | f1-score | support |
|--------------|-----------|----------|----------|---------|
| 0 | 0.744912 | 0.927790 | 0.826354 | 108019 |
| 1 | 0.889088 | 0.645630 | 0.748048 | 96845 |
| accuracy | | | 0.794405 | 204864 |
| macro avg | 0.817000 | 0.786710 | 0.787201 | 204864 |
| weighted avg | 0.813068 | 0.794405 | 0.789337 | 204864 |
en
| | precision | recall | f1-score | support |
|--------------|-----------|----------|----------|---------|
| 0 | 0.800053 | 0.962981 | 0.873988 | 22151 |
| 1 | 0.945106 | 0.725899 | 0.821124 | 19449 |
| accuracy | | | 0.852139 | 41600 |
| macro avg | 0.872579 | 0.844440 | 0.847556 | 41600 |
| weighted avg | 0.867869 | 0.852139 | 0.849273 | 41600 |
fr
| | precision | recall | f1-score | support |
|--------------|-----------|----------|----------|---------|
| 0 | 0.746709 | 0.925738 | 0.826641 | 21505 |
| 1 | 0.887305 | 0.650592 | 0.750731 | 19327 |
| accuracy | | | 0.795504 | 40832 |
| macro avg | 0.817007 | 0.788165 | 0.788686 | 40832 |
| weighted avg | 0.813257 | 0.795504 | 0.790711 | 40832 |
it
| | precision | recall | f1-score | support |
|--------------|-----------|----------|----------|---------|
| 0 | 0.721282 | 0.914669 | 0.806545 | 21528 |
| 1 | 0.864887 | 0.607135 | 0.713445 | 19368 |
| accuracy | | | 0.769024 | 40896 |
| macro avg | 0.793084 | 0.760902 | 0.759995 | 40896 |
| weighted avg | 0.789292 | 0.769024 | 0.762454 | 40896 |
pt
| | precision | recall | f1-score | support |
|--------------|-----------|----------|----------|---------|
| 0 | 0.717546 | 0.908167 | 0.801681 | 21637 |
| 1 | 0.853628 | 0.599700 | 0.704481 | 19323 |
| accuracy | | | 0.762646 | 40960 |
| macro avg | 0.785587 | 0.753933 | 0.753081 | 40960 |
| weighted avg | 0.781743 | 0.762646 | 0.755826 | 40960 |
## How to use
```python
from transformers import XLMRobertaTokenizerFast, XLMRobertaForSequenceClassification
# load tokenizer and model weights
tokenizer = XLMRobertaTokenizerFast.from_pretrained('SkolkovoInstitute/xlmr_formality_classifier')
model = XLMRobertaForSequenceClassification.from_pretrained('SkolkovoInstitute/xlmr_formality_classifier')
# prepare the input
batch = tokenizer.encode("Hey, what's up?", return_tensors='pt')  # an informal English example
# inference
model(batch)
```
## Licensing Information
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png |
491 | apanc/russian-sensitive-topics | [
"LABEL_0",
"LABEL_1",
"LABEL_10",
"LABEL_100",
"LABEL_101",
"LABEL_102",
"LABEL_103",
"LABEL_104",
"LABEL_105",
"LABEL_106",
"LABEL_107",
"LABEL_108",
"LABEL_109",
"LABEL_11",
"LABEL_110",
"LABEL_111",
"LABEL_112",
"LABEL_113",
"LABEL_114",
"LABEL_115",
"LABEL_116",
"LABEL_117",
"LABEL_118",
"LABEL_119",
"LABEL_12",
"LABEL_120",
"LABEL_121",
"LABEL_122",
"LABEL_123",
"LABEL_124",
"LABEL_125",
"LABEL_126",
"LABEL_127",
"LABEL_128",
"LABEL_129",
"LABEL_13",
"LABEL_130",
"LABEL_131",
"LABEL_132",
"LABEL_133",
"LABEL_134",
"LABEL_135",
"LABEL_136",
"LABEL_137",
"LABEL_138",
"LABEL_139",
"LABEL_14",
"LABEL_140",
"LABEL_141",
"LABEL_142",
"LABEL_143",
"LABEL_144",
"LABEL_145",
"LABEL_146",
"LABEL_147",
"LABEL_148",
"LABEL_149",
"LABEL_15",
"LABEL_150",
"LABEL_151",
"LABEL_152",
"LABEL_153",
"LABEL_154",
"LABEL_155",
"LABEL_156",
"LABEL_157",
"LABEL_158",
"LABEL_159",
"LABEL_16",
"LABEL_160",
"LABEL_161",
"LABEL_162",
"LABEL_163",
"LABEL_164",
"LABEL_165",
"LABEL_166",
"LABEL_167",
"LABEL_168",
"LABEL_169",
"LABEL_17",
"LABEL_170",
"LABEL_171",
"LABEL_172",
"LABEL_173",
"LABEL_174",
"LABEL_175",
"LABEL_176",
"LABEL_177",
"LABEL_178",
"LABEL_179",
"LABEL_18",
"LABEL_180",
"LABEL_181",
"LABEL_182",
"LABEL_183",
"LABEL_184",
"LABEL_185",
"LABEL_186",
"LABEL_187",
"LABEL_188",
"LABEL_189",
"LABEL_19",
"LABEL_190",
"LABEL_191",
"LABEL_192",
"LABEL_193",
"LABEL_194",
"LABEL_195",
"LABEL_196",
"LABEL_197",
"LABEL_198",
"LABEL_199",
"LABEL_2",
"LABEL_20",
"LABEL_200",
"LABEL_201",
"LABEL_202",
"LABEL_203",
"LABEL_204",
"LABEL_205",
"LABEL_206",
"LABEL_207",
"LABEL_208",
"LABEL_209",
"LABEL_21",
"LABEL_210",
"LABEL_211",
"LABEL_212",
"LABEL_213",
"LABEL_214",
"LABEL_215",
"LABEL_216",
"LABEL_217",
"LABEL_218",
"LABEL_219",
"LABEL_22",
"LABEL_220",
"LABEL_221",
"LABEL_222",
"LABEL_223",
"LABEL_224",
"LABEL_225",
"LABEL_226",
"LABEL_227",
"LABEL_228",
"LABEL_229",
"LABEL_23",
"LABEL_230",
"LABEL_231",
"LABEL_232",
"LABEL_233",
"LABEL_234",
"LABEL_235",
"LABEL_236",
"LABEL_237",
"LABEL_238",
"LABEL_239",
"LABEL_24",
"LABEL_240",
"LABEL_241",
"LABEL_242",
"LABEL_243",
"LABEL_244",
"LABEL_245",
"LABEL_246",
"LABEL_247",
"LABEL_248",
"LABEL_249",
"LABEL_25",
"LABEL_250",
"LABEL_251",
"LABEL_252",
"LABEL_253",
"LABEL_254",
"LABEL_255",
"LABEL_256",
"LABEL_257",
"LABEL_258",
"LABEL_259",
"LABEL_26",
"LABEL_260",
"LABEL_261",
"LABEL_262",
"LABEL_263",
"LABEL_264",
"LABEL_265",
"LABEL_266",
"LABEL_267",
"LABEL_268",
"LABEL_269",
"LABEL_27",
"LABEL_270",
"LABEL_271",
"LABEL_272",
"LABEL_273",
"LABEL_274",
"LABEL_275",
"LABEL_276",
"LABEL_277",
"LABEL_278",
"LABEL_279",
"LABEL_28",
"LABEL_280",
"LABEL_281",
"LABEL_282",
"LABEL_283",
"LABEL_284",
"LABEL_285",
"LABEL_286",
"LABEL_287",
"LABEL_288",
"LABEL_289",
"LABEL_29",
"LABEL_290",
"LABEL_291",
"LABEL_292",
"LABEL_293",
"LABEL_294",
"LABEL_295",
"LABEL_296",
"LABEL_297",
"LABEL_298",
"LABEL_299",
"LABEL_3",
"LABEL_30",
"LABEL_300",
"LABEL_301",
"LABEL_302",
"LABEL_303",
"LABEL_304",
"LABEL_305",
"LABEL_306",
"LABEL_307",
"LABEL_308",
"LABEL_309",
"LABEL_31",
"LABEL_310",
"LABEL_311",
"LABEL_312",
"LABEL_313",
"LABEL_314",
"LABEL_315",
"LABEL_316",
"LABEL_317",
"LABEL_318",
"LABEL_319",
"LABEL_32",
"LABEL_320",
"LABEL_321",
"LABEL_322",
"LABEL_323",
"LABEL_324",
"LABEL_325",
"LABEL_326",
"LABEL_327",
"LABEL_328",
"LABEL_329",
"LABEL_33",
"LABEL_330",
"LABEL_331",
"LABEL_332",
"LABEL_333",
"LABEL_334",
"LABEL_335",
"LABEL_336",
"LABEL_337",
"LABEL_338",
"LABEL_339",
"LABEL_34",
"LABEL_340",
"LABEL_341",
"LABEL_342",
"LABEL_343",
"LABEL_344",
"LABEL_345",
"LABEL_346",
"LABEL_347",
"LABEL_348",
"LABEL_349",
"LABEL_35",
"LABEL_350",
"LABEL_351",
"LABEL_352",
"LABEL_353",
"LABEL_354",
"LABEL_355",
"LABEL_356",
"LABEL_357",
"LABEL_358",
"LABEL_359",
"LABEL_36",
"LABEL_360",
"LABEL_361",
"LABEL_362",
"LABEL_363",
"LABEL_364",
"LABEL_365",
"LABEL_366",
"LABEL_367",
"LABEL_368",
"LABEL_369",
"LABEL_37",
"LABEL_370",
"LABEL_371",
"LABEL_372",
"LABEL_373",
"LABEL_374",
"LABEL_375",
"LABEL_376",
"LABEL_377",
"LABEL_378",
"LABEL_379",
"LABEL_38",
"LABEL_380",
"LABEL_381",
"LABEL_382",
"LABEL_383",
"LABEL_384",
"LABEL_385",
"LABEL_386",
"LABEL_387",
"LABEL_388",
"LABEL_389",
"LABEL_39",
"LABEL_390",
"LABEL_391",
"LABEL_392",
"LABEL_4",
"LABEL_40",
"LABEL_41",
"LABEL_42",
"LABEL_43",
"LABEL_44",
"LABEL_45",
"LABEL_46",
"LABEL_47",
"LABEL_48",
"LABEL_49",
"LABEL_5",
"LABEL_50",
"LABEL_51",
"LABEL_52",
"LABEL_53",
"LABEL_54",
"LABEL_55",
"LABEL_56",
"LABEL_57",
"LABEL_58",
"LABEL_59",
"LABEL_6",
"LABEL_60",
"LABEL_61",
"LABEL_62",
"LABEL_63",
"LABEL_64",
"LABEL_65",
"LABEL_66",
"LABEL_67",
"LABEL_68",
"LABEL_69",
"LABEL_7",
"LABEL_70",
"LABEL_71",
"LABEL_72",
"LABEL_73",
"LABEL_74",
"LABEL_75",
"LABEL_76",
"LABEL_77",
"LABEL_78",
"LABEL_79",
"LABEL_8",
"LABEL_80",
"LABEL_81",
"LABEL_82",
"LABEL_83",
"LABEL_84",
"LABEL_85",
"LABEL_86",
"LABEL_87",
"LABEL_88",
"LABEL_89",
"LABEL_9",
"LABEL_90",
"LABEL_91",
"LABEL_92",
"LABEL_93",
"LABEL_94",
"LABEL_95",
"LABEL_96",
"LABEL_97",
"LABEL_98",
"LABEL_99"
] | ---
language:
- ru
tags:
- toxic comments classification
licenses:
- cc-by-nc-sa
---
## General concept of the model
This model is trained on the dataset of sensitive topics of the Russian language. The concept of sensitive topics is described [in this article](https://www.aclweb.org/anthology/2021.bsnlp-1.4/) presented at the workshop for Balto-Slavic NLP at the EACL-2021 conference. Please note that this article describes the first version of the dataset, while the model is trained on the extended version of the dataset open-sourced on our [GitHub](https://github.com/skoltech-nlp/inappropriate-sensitive-topics/blob/main/Version2/sensitive_topics/sensitive_topics.csv) or on [Kaggle](https://www.kaggle.com/nigula/russian-sensitive-topics). The properties of the dataset are the same as those described in the article; the only difference is the size.
## Instructions
The model predicts combinations of 18 sensitive topics described in the [article](https://arxiv.org/abs/2103.05345). You can find step-by-step instructions for using the model [here](https://github.com/skoltech-nlp/inappropriate-sensitive-topics/blob/main/Version2/sensitive_topics/Inference.ipynb)
## Metrics
The dataset contains both manually labeled samples and semi-automatically labeled samples; learn more in our article. We tested the performance of the classifier only on the manually labeled part, which is why some topics are not well represented in the test set.
| | precision | recall | f1-score | support |
|-------------------|-----------|--------|----------|---------|
| offline_crime | 0.65 | 0.55 | 0.6 | 132 |
| online_crime | 0.5 | 0.46 | 0.48 | 37 |
| drugs | 0.87 | 0.9 | 0.88 | 87 |
| gambling | 0.5 | 0.67 | 0.57 | 6 |
| pornography | 0.73 | 0.59 | 0.65 | 204 |
| prostitution | 0.75 | 0.69 | 0.72 | 91 |
| slavery | 0.72 | 0.72 | 0.73 | 40 |
| suicide | 0.33 | 0.29 | 0.31 | 7 |
| terrorism | 0.68 | 0.57 | 0.62 | 47 |
| weapons | 0.89 | 0.83 | 0.86 | 138 |
| body_shaming | 0.9 | 0.67 | 0.77 | 109 |
| health_shaming | 0.84 | 0.55 | 0.66 | 108 |
| politics | 0.68 | 0.54 | 0.6 | 241 |
| racism | 0.81 | 0.59 | 0.68 | 204 |
| religion | 0.94 | 0.72 | 0.81 | 102 |
| sexual_minorities | 0.69 | 0.46 | 0.55 | 102 |
| sexism | 0.66 | 0.64 | 0.65 | 132 |
| social_injustice | 0.56 | 0.37 | 0.45 | 181 |
| none | 0.62 | 0.67 | 0.64 | 250 |
| micro avg | 0.72 | 0.61 | 0.66 | 2218 |
| macro avg | 0.7 | 0.6 | 0.64 | 2218 |
| weighted avg | 0.73 | 0.61 | 0.66 | 2218 |
| samples avg | 0.75 | 0.66 | 0.68 | 2218 |
## Licensing Information
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png
## Citation
If you find this repository helpful, feel free to cite our publication:
```
@inproceedings{babakov-etal-2021-detecting,
title = "Detecting Inappropriate Messages on Sensitive Topics that Could Harm a Company{'}s Reputation",
author = "Babakov, Nikolay and
Logacheva, Varvara and
Kozlova, Olga and
Semenov, Nikita and
Panchenko, Alexander",
booktitle = "Proceedings of the 8th Workshop on Balto-Slavic Natural Language Processing",
month = apr,
year = "2021",
address = "Kiyv, Ukraine",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2021.bsnlp-1.4",
pages = "26--36",
abstract = "Not all topics are equally {``}flammable{''} in terms of toxicity: a calm discussion of turtles or fishing less often fuels inappropriate toxic dialogues than a discussion of politics or sexual minorities. We define a set of sensitive topics that can yield inappropriate and toxic messages and describe the methodology of collecting and labelling a dataset for appropriateness. While toxicity in user-generated data is well-studied, we aim at defining a more fine-grained notion of inappropriateness. The core of inappropriateness is that it can harm the reputation of a speaker. This is different from toxicity in two respects: (i) inappropriateness is topic-related, and (ii) inappropriate message is not toxic but still unacceptable. We collect and release two datasets for Russian: a topic-labelled dataset and an appropriateness-labelled dataset. We also release pre-trained classification models trained on this data.",
}
``` |
492 | Smone55/autonlp-au_topics-452311620 | [
"-1",
"0",
"1",
"10",
"100",
"101",
"102",
"103",
"104",
"105",
"106",
"107",
"108",
"109",
"11",
"110",
"111",
"112",
"113",
"114",
"115",
"116",
"117",
"118",
"119",
"12",
"120",
"121",
"122",
"123",
"124",
"125",
"13",
"14",
"15",
"16",
"17",
"18",
"19",
"2",
"20",
"21",
"22",
"23",
"24",
"25",
"26",
"27",
"28",
"29",
"3",
"30",
"31",
"32",
"33",
"34",
"35",
"36",
"37",
"38",
"39",
"4",
"40",
"41",
"42",
"43",
"44",
"45",
"46",
"47",
"48",
"49",
"5",
"50",
"51",
"52",
"53",
"54",
"55",
"56",
"57",
"58",
"59",
"6",
"60",
"61",
"62",
"63",
"64",
"65",
"66",
"67",
"68",
"69",
"7",
"70",
"71",
"72",
"73",
"74",
"75",
"76",
"77",
"78",
"79",
"8",
"80",
"81",
"82",
"83",
"84",
"85",
"86",
"87",
"88",
"89",
"9",
"90",
"91",
"92",
"93",
"94",
"95",
"96",
"97",
"98",
"99"
] | ---
tags: autonlp
language: en
widget:
- text: "I love AutoNLP 🤗"
datasets:
- Smone55/autonlp-data-au_topics
co2_eq_emissions: 208.0823957145878
---
# Model Trained Using AutoNLP
- Problem type: Multi-class Classification
- Model ID: 452311620
- CO2 Emissions (in grams): 208.0823957145878
## Validation Metrics
- Loss: 0.5259971022605896
- Accuracy: 0.8767479025169796
- Macro F1: 0.8618813750734912
- Micro F1: 0.8767479025169796
- Weighted F1: 0.8742964006840133
- Macro Precision: 0.8627700506991158
- Micro Precision: 0.8767479025169796
- Weighted Precision: 0.8755603985289852
- Macro Recall: 0.8662183006750934
- Micro Recall: 0.8767479025169796
- Weighted Recall: 0.8767479025169796
## Usage
You can use cURL to access this model:
```bash
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/Smone55/autonlp-au_topics-452311620
```
Or Python API:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("Smone55/autonlp-au_topics-452311620", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("Smone55/autonlp-au_topics-452311620", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
``` |
494 | SparkBeyond/roberta-large-sts-b | [
"LABEL_0"
] |
# Roberta Large STS-B
This model is a RoBERTa-large model fine-tuned on STS-B.
It was trained with these parameters:
!python /content/transformers/examples/text-classification/run_glue.py \
--model_type roberta \
--model_name_or_path roberta-large \
--task_name STS-B \
--do_train \
--do_eval \
--do_lower_case \
--data_dir /content/glue_data/STS-B/ \
--max_seq_length 128 \
--per_gpu_eval_batch_size=8 \
--per_gpu_train_batch_size=8 \
--learning_rate 2e-5 \
--num_train_epochs 3.0 \
--output_dir /content/roberta-sts-b
## How to run
```python
import toolz
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# load the fine-tuned model and tokenizer (a CUDA device is assumed, as in the original snippet)
tokenizer = AutoTokenizer.from_pretrained("SparkBeyond/roberta-large-sts-b")
model = AutoModelForSequenceClassification.from_pretrained("SparkBeyond/roberta-large-sts-b").cuda()

batch_size = 6

def roberta_similarity_batches(to_predict):
    batches = toolz.partition(batch_size, to_predict)
    similarity_scores = []
    for batch in batches:
        sentences = [(sentence_similarity["sent1"], sentence_similarity["sent2"]) for sentence_similarity in batch]
        batch_scores = similarity_roberta(model, tokenizer, sentences)
        similarity_scores = similarity_scores + batch_scores[0].cpu().squeeze(1).tolist()
    return similarity_scores

def similarity_roberta(model, tokenizer, sent_pairs):
    batch_token = tokenizer(sent_pairs, padding='max_length', truncation=True, max_length=500)
    res = model(torch.tensor(batch_token['input_ids']).cuda(),
                attention_mask=torch.tensor(batch_token['attention_mask']).cuda())
    return res

similarity_roberta(model, tokenizer, [('NEW YORK--(BUSINESS WIRE)--Rosen Law Firm, a global investor rights law firm, announces it is investigating potential securities claims on behalf of shareholders of Vale S.A. ( VALE ) resulting from allegations that Vale may have issued materially misleading business information to the investing public',
                                      'EQUITY ALERT: Rosen Law Firm Announces Investigation of Securities Claims Against Vale S.A. – VALE')])
```
|
495 | StevenLimcorn/indo-roberta-indonli | [
"c",
"e",
"n"
] | ---
language: id
tags:
- roberta
license: mit
datasets:
- indonli
widget:
- text: "Amir Sjarifoeddin Harahap lahir di Kota Medan, Sumatera Utara, 27 April 1907. Ia meninggal di Surakarta, Jawa Tengah, pada 19 Desember 1948 dalam usia 41 tahun. </s></s> Amir Sjarifoeddin Harahap masih hidup."
---
## Indo-roberta-indonli
Indo-roberta-indonli is a natural language inference classifier based on the [Indo-roberta](https://huggingface.co/flax-community/indonesian-roberta-base) model, fine-tuned on the [IndoNLI](https://github.com/ir-nlp-csui/indonli/tree/main/data/indonli) dataset. The model was evaluated on the validation, test_lay, and test_expert splits provided in the GitHub repository. The results are shown below.
### Result
| Dataset | Accuracy | F1 | Precision | Recall |
|-------------|----------|---------|-----------|---------|
| Test Lay | 0.74329 | 0.74075 | 0.74283 | 0.74133 |
| Test Expert | 0.6115 | 0.60543 | 0.63924 | 0.61742 |
## Model
The model was trained for 5 epochs with batch size 16, learning rate 2e-5, and weight decay 0.01, achieving the metrics shown below.
| Epoch | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|-------|---------------|-----------------|----------|----------|-----------|----------|
| 1 | 0.942500 | 0.658559 | 0.737369 | 0.735552 | 0.735488 | 0.736679 |
| 2 | 0.649200 | 0.645290 | 0.761493 | 0.759593 | 0.762784 | 0.759642 |
| 3 | 0.437100 | 0.667163 | 0.766045 | 0.763979 | 0.765740 | 0.763792 |
| 4 | 0.282000 | 0.786683 | 0.764679 | 0.761802 | 0.762011 | 0.761684 |
| 5 | 0.193500 | 0.925717 | 0.765134 | 0.763127 | 0.763560 | 0.763489 |
## How to Use
### As NLI Classifier
```python
from transformers import pipeline
pretrained_name = "StevenLimcorn/indonesian-roberta-indonli"
nlp = pipeline(
    "zero-shot-classification",
    model=pretrained_name,
    tokenizer=pretrained_name
)
nlp("Amir Sjarifoeddin Harahap lahir di Kota Medan, Sumatera Utara, 27 April 1907. Ia meninggal di Surakarta, Jawa Tengah, pada 19 Desember 1948 dalam usia 41 tahun. </s></s> Amir Sjarifoeddin Harahap masih hidup.")
```
## Disclaimer
Do consider the biases which come from both the pre-trained RoBERTa model and the `INDONLI` dataset that may be carried over into the results of this model.
## Author
Indonesian RoBERTa Base IndoNLI was trained and evaluated by [Steven Limcorn](https://github.com/stevenlimcorn). All computation and development are done on Google Colaboratory using their free GPU access.
## Reference
The dataset we used is by IndoNLI.
```
@inproceedings{indonli,
title = "IndoNLI: A Natural Language Inference Dataset for Indonesian",
author = "Mahendra, Rahmad and Aji, Alham Fikri and Louvan, Samuel and Rahman, Fahrurrozi and Vania, Clara",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2021",
publisher = "Association for Computational Linguistics",
}
``` |
496 | StevenLimcorn/indonesian-roberta-base-emotion-classifier | [
"anger",
"fear",
"happy",
"love",
"sadness"
] | ---
language: id
tags:
- roberta
license: mit
datasets:
- indonlu
widget:
- text: "Hal-hal baik akan datang."
---
# Indo RoBERTa Emotion Classifier
Indo RoBERTa Emotion Classifier is an emotion classifier based on the [Indo-roberta](https://huggingface.co/flax-community/indonesian-roberta-base) model, fine-tuned on the [IndoNLU EmoT](https://huggingface.co/datasets/indonlu) dataset. Based on the [IndoNLU benchmark](https://www.indobenchmark.com/), the model achieves an F1-macro of 72.05%, accuracy of 71.81%, precision of 72.47%, and recall of 71.94%.
## Model
The model was trained for 7 epochs with a learning rate of 2e-5, achieving the metrics shown below.
| Epoch | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
|-------|---------------|-----------------|----------|----------|-----------|----------|
| 1 | 1.300700 | 1.005149 | 0.622727 | 0.601846 | 0.640845 | 0.611144 |
| 2 | 0.806300 | 0.841953 | 0.686364 | 0.694096 | 0.701984 | 0.696657 |
| 3 | 0.591900 | 0.796794 | 0.686364 | 0.696573 | 0.707520 | 0.691671 |
| 4 | 0.441200 | 0.782094 | 0.722727 | 0.724359 | 0.725985 | 0.730229 |
| 5 | 0.334700 | 0.809931 | 0.711364 | 0.720550 | 0.718318 | 0.724608 |
| 6 | 0.268400 | 0.812771 | 0.718182 | 0.724192 | 0.721222 | 0.729195 |
| 7 | 0.226000 | 0.828461 | 0.725000 | 0.733625 | 0.731709 | 0.735800 |
## How to Use
### As Text Classifier
```python
from transformers import pipeline
pretrained_name = "StevenLimcorn/indonesian-roberta-base-emotion-classifier"
nlp = pipeline(
"sentiment-analysis",
model=pretrained_name,
tokenizer=pretrained_name
)
nlp("Hal-hal baik akan datang.")
```
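The pipeline call above returns only the top label. The helper below is a hypothetical sketch for picking the winning emotion when all class scores are requested: the label set comes from this card, while the scores are mock values shaped like the pipeline's `{"label": ..., "score": ...}` output (the pipeline call itself is commented out because it downloads the model).

```python
# Label set from this model card.
EMOTION_LABELS = ["anger", "fear", "happy", "love", "sadness"]

def top_emotion(scores):
    """Return the label with the highest score from a list of
    {"label": ..., "score": ...} dicts."""
    return max(scores, key=lambda s: s["score"])["label"]

# Mock scores shaped like pipeline output with all class scores requested:
mock = [{"label": l, "score": s}
        for l, s in zip(EMOTION_LABELS, [0.05, 0.02, 0.88, 0.03, 0.02])]
print(top_emotion(mock))  # -> happy

# With a downloaded model, this would be (requires network access):
# nlp = pipeline("sentiment-analysis", model=pretrained_name, top_k=None)
# print(top_emotion(nlp("Hal-hal baik akan datang.")[0]))
```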
## Disclaimer
Do consider the biases which come from both the pre-trained RoBERTa model and the `EmoT` dataset that may be carried over into the results of this model.
## Author
Indonesian RoBERTa Base Emotion Classifier was trained and evaluated by [Steven Limcorn](https://github.com/stevenlimcorn). All computation and development are done on Google Colaboratory using their free GPU access.
If used, please cite
```bibtex
@misc {steven_limcorn_2023,
author = { {Steven Limcorn} },
title = { indonesian-roberta-base-emotion-classifier (Revision e8a9cb9) },
year = 2023,
url = { https://huggingface.co/StevenLimcorn/indonesian-roberta-base-emotion-classifier },
doi = { 10.57967/hf/0681 },
publisher = { Hugging Face }
}
``` |
497 | Tahsin/distilbert-base-uncased-finetuned-emotion | [
"anger",
"fear",
"joy",
"love",
"sadness",
"surprise"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.9285
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1561
- Accuracy: 0.9285
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 250 | 0.1635 | 0.9295 |
| 0.111 | 2.0 | 500 | 0.1515 | 0.936 |
| 0.111 | 3.0 | 750 | 0.1561 | 0.9285 |
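The step counts in the table are consistent with the standard `emotion` train split of 16,000 examples (an assumption; the card itself does not state the dataset size) at the batch size listed above:

```python
train_examples = 16_000   # assumed size of the emotion train split
batch_size = 64           # train_batch_size from the hyperparameters above

# With no gradient accumulation, each optimizer step consumes one batch:
steps_per_epoch = train_examples // batch_size
print(steps_per_epoch)      # -> 250, matching the table's per-epoch step count
print(steps_per_epoch * 3)  # -> 750 total steps over the 3 epochs
```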
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
498 | MonoHime/rubert-base-cased-sentiment-new | [
"NEGATIVE",
"NEUTRAL",
"POSITIVE"
] | ---
language:
- ru
tags:
- sentiment
- text-classification
datasets:
- Tatyana/ru_sentiment_dataset
---
# Model Card for RuBERT for Sentiment Analysis
# Model Details
## Model Description
Sentiment classification for Russian texts.
- **Developed by:** Tatyana Voloshina
- **Shared by [Optional]:** Tatyana Voloshina
- **Model type:** Text Classification
- **Language(s) (NLP):** More information needed
- **License:** More information needed
- **Parent Model:** BERT
- **Resources for more information:**
- [GitHub Repo](https://github.com/T-Sh/Sentiment-Analysis)
# Uses
## Direct Use
This model can be used for the task of text classification.
## Downstream Use [Optional]
More information needed.
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
# Training Details
## Training Data
Model trained on [Tatyana/ru_sentiment_dataset](https://huggingface.co/datasets/Tatyana/ru_sentiment_dataset)
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
More information needed
### Factors
More information needed
### Metrics
More information needed
## Results
More information needed
# Model Examination
## Labels meaning
0: NEUTRAL
1: POSITIVE
2: NEGATIVE
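Under the mapping above, decoding a prediction is a simple argmax over the three class outputs. A minimal sketch, using mock logits for illustration (the values are invented, not real model output):

```python
# Label mapping from the "Labels meaning" section above.
ID2LABEL = {0: "NEUTRAL", 1: "POSITIVE", 2: "NEGATIVE"}

def decode(logits):
    """Map a list of per-class scores to the label of the argmax index."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return ID2LABEL[best]

# Mock logits for a positive sentence:
print(decode([0.1, 2.3, -1.0]))  # -> POSITIVE
```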
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed.
# Citation
More information needed.
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
Tatyana Voloshina in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
The trained PyTorch model is available on [Google Drive](https://drive.google.com/drive/folders/1EnJBq0dGfpjPxbVjybqaS7PsMaPHLUIl?usp=sharing). Download it and place `model.pth.tar` in the folder alongside the model's other files.
```python
!pip install tensorflow-gpu
!pip install deeppavlov
!python -m deeppavlov install squad_bert
!pip install fasttext
!pip install transformers
!python -m deeppavlov install bert_sentence_embedder
from deeppavlov import build_model
model = build_model("path_to_model/rubert_sentiment.json")
model(["Сегодня хорошая погода", "Я счастлив проводить с тобою время", "Мне нравится эта музыкальная композиция"])
```
</details>
|
503 | Theivaprakasham/bert-base-cased-twitter_sentiment | [
"LABEL_0",
"LABEL_1",
"LABEL_2"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: bert-base-cased-twitter_sentiment
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-cased-twitter_sentiment
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6907
- Accuracy: 0.7132
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.8901 | 1.0 | 1387 | 0.8592 | 0.6249 |
| 0.8085 | 2.0 | 2774 | 0.7600 | 0.6822 |
| 0.7336 | 3.0 | 4161 | 0.7170 | 0.6915 |
| 0.6938 | 4.0 | 5548 | 0.7018 | 0.7016 |
| 0.6738 | 5.0 | 6935 | 0.6926 | 0.7067 |
| 0.6496 | 6.0 | 8322 | 0.6910 | 0.7088 |
| 0.6599 | 7.0 | 9709 | 0.6902 | 0.7088 |
| 0.631 | 8.0 | 11096 | 0.6910 | 0.7095 |
| 0.6327 | 9.0 | 12483 | 0.6925 | 0.7146 |
| 0.6305 | 10.0 | 13870 | 0.6907 | 0.7132 |
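The card does not name the training dataset, but its approximate size can be inferred from the table: 1387 optimizer steps per epoch at the batch size above, assuming no gradient accumulation:

```python
steps_per_epoch = 1387  # from the training results table
batch_size = 8          # train_batch_size from the hyperparameters above

# Each step consumes one batch, so the train split holds roughly:
train_examples = steps_per_epoch * batch_size
print(train_examples)  # -> 11096 (approximate; the last batch may be smaller)
```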
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
|
504 | Theivaprakasham/sentence-transformers-msmarco-distilbert-base-tas-b-twitter_sentiment | [
"LABEL_0",
"LABEL_1",
"LABEL_2"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: sentence-transformers-msmarco-distilbert-base-tas-b-twitter_sentiment
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# sentence-transformers-msmarco-distilbert-base-tas-b-twitter_sentiment
This model is a fine-tuned version of [sentence-transformers/msmarco-distilbert-base-tas-b](https://huggingface.co/sentence-transformers/msmarco-distilbert-base-tas-b) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6954
- Accuracy: 0.7146
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.8892 | 1.0 | 1387 | 0.8472 | 0.6180 |
| 0.7965 | 2.0 | 2774 | 0.7797 | 0.6609 |
| 0.7459 | 3.0 | 4161 | 0.7326 | 0.6872 |
| 0.7096 | 4.0 | 5548 | 0.7133 | 0.6995 |
| 0.6853 | 5.0 | 6935 | 0.6998 | 0.7002 |
| 0.6561 | 6.0 | 8322 | 0.6949 | 0.7059 |
| 0.663 | 7.0 | 9709 | 0.6956 | 0.7077 |
| 0.6352 | 8.0 | 11096 | 0.6890 | 0.7164 |
| 0.6205 | 9.0 | 12483 | 0.6888 | 0.7117 |
| 0.6203 | 10.0 | 13870 | 0.6871 | 0.7121 |
| 0.6005 | 11.0 | 15257 | 0.6879 | 0.7171 |
| 0.5985 | 12.0 | 16644 | 0.6870 | 0.7139 |
| 0.5839 | 13.0 | 18031 | 0.6882 | 0.7164 |
| 0.5861 | 14.0 | 19418 | 0.6910 | 0.7124 |
| 0.5732 | 15.0 | 20805 | 0.6916 | 0.7153 |
| 0.5797 | 16.0 | 22192 | 0.6947 | 0.7110 |
| 0.5565 | 17.0 | 23579 | 0.6930 | 0.7175 |
| 0.5636 | 18.0 | 24966 | 0.6959 | 0.7106 |
| 0.5642 | 19.0 | 26353 | 0.6952 | 0.7132 |
| 0.5717 | 20.0 | 27740 | 0.6954 | 0.7146 |
### Framework versions
- Transformers 4.13.0.dev0
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
|
505 | TomO/xlm-roberta-base-finetuned-marc-en | [
"good",
"great",
"ok",
"poor",
"terrible"
] | ---
license: mit
tags:
- generated_from_trainer
datasets:
- amazon_reviews_multi
model-index:
- name: xlm-roberta-base-finetuned-marc-en
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-marc-en
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the amazon_reviews_multi dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9237
- Mae: 0.5122
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mae |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.1089 | 1.0 | 235 | 0.9380 | 0.4878 |
| 0.9546 | 2.0 | 470 | 0.9237 | 0.5122 |
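The `Mae` metric reported above is the mean absolute error between predicted and true star ratings (an interpretation consistent with the `amazon_reviews_multi` review-score task; the card does not define it explicitly). A self-contained sketch of the computation:

```python
def mean_absolute_error(y_true, y_pred):
    """Average absolute difference between true and predicted star ratings."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy example: two off-by-one predictions out of four reviews.
true_stars = [1, 3, 4, 5]
pred_stars = [1, 2, 4, 4]
print(mean_absolute_error(true_stars, pred_stars))  # -> 0.5
```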
### Framework versions
- Transformers 4.14.1
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
|
506 | TomW/TOMFINSEN | [
"negative",
"neutral",
"positive"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- financial_phrasebank
metrics:
- recall
- accuracy
- precision
model-index:
- name: TOMFINSEN
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: financial_phrasebank
type: financial_phrasebank
args: sentences_50agree
metrics:
- name: Recall
type: recall
value: 0.8985861629736692
- name: Accuracy
type: accuracy
value: 0.8742268041237113
- name: Precision
type: precision
value: 0.8509995913451198
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# TOMFINSEN
This model is a fine-tuned version of [deepmind/language-perceiver](https://huggingface.co/deepmind/language-perceiver) on the financial_phrasebank dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3642
- Recall: 0.8986
- Accuracy: 0.8742
- Precision: 0.8510
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: tpu
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Recall | Accuracy | Precision |
|:-------------:|:-----:|:----:|:---------------:|:------:|:--------:|:---------:|
| 0.5403 | 1.0 | 273 | 0.4207 | 0.8358 | 0.8619 | 0.8534 |
| 0.3939 | 2.0 | 546 | 0.3750 | 0.8943 | 0.8577 | 0.8225 |
| 0.1993 | 3.0 | 819 | 0.3113 | 0.8882 | 0.8660 | 0.8367 |
| 0.301 | 4.0 | 1092 | 0.3642 | 0.8986 | 0.8742 | 0.8510 |
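The recall and precision figures above appear to be macro-averaged (an assumption; the card does not state the averaging mode). Macro recall, for instance, averages per-class recall computed from a confusion matrix:

```python
def macro_recall(confusion):
    """Per-class recall (diagonal over row sum), averaged over classes.
    confusion[i][j] counts examples of true class i predicted as class j."""
    recalls = [row[i] / sum(row) for i, row in enumerate(confusion)]
    return sum(recalls) / len(recalls)

# Toy 3-class confusion matrix (negative / neutral / positive):
cm = [
    [8, 1, 1],
    [2, 6, 2],
    [0, 1, 9],
]
# Per-class recalls are 0.8, 0.6 and 0.9, so the macro average is ~0.7667:
print(round(macro_recall(cm), 4))
```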
### Framework versions
- Transformers 4.15.0
- Pytorch 1.9.0+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
507 | Tommy930/distilbert-base-uncased-finetuned-emotion | [
"LABEL_0",
"LABEL_1",
"LABEL_2",
"LABEL_3",
"LABEL_4",
"LABEL_5"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.919
- name: F1
type: f1
value: 0.9193144250513821
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2220
- Accuracy: 0.919
- F1: 0.9193
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.7858 | 1.0 | 250 | 0.3034 | 0.9085 | 0.9073 |
| 0.243 | 2.0 | 500 | 0.2220 | 0.919 | 0.9193 |
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0+cu113
- Datasets 1.18.3
- Tokenizers 0.10.3
|
508 | TransQuest/monotransquest-da-any_en | [
"LABEL_0"
] | ---
language: multilingual-en
tags:
- Quality Estimation
- monotransquest
- DA
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for many language pairs is the missing piece in many commercial translation workflows, as QE systems have numerous potential uses. They can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as is in a given context, or whether it requires human post-editing before publishing or translation from scratch by a human. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task in [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods like DeepQuest and OpenKiwi in all the languages tested.
- Pre-trained quality estimation models for fifteen language pairs are available in [HuggingFace.](https://huggingface.co/TransQuest)
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-da-any_en", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures; MonoTransQuest and SiameseTransQuest to perform sentence level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word level quality estimation.
3. **Examples** - We have provided several examples on how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pretrained quality estimation models for fifteen language pairs covering both sentence-level and word-level
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest
## Citations
If you are using the word-level architecture, please consider citing this paper which is accepted to [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers which were presented in [COLING 2020](https://coling2020.org/) and in [WMT 2020](http://www.statmt.org/wmt20/) at EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
509 | TransQuest/monotransquest-da-en_any | [
"LABEL_0"
] | ---
language: en-multilingual
tags:
- Quality Estimation
- monotransquest
- DA
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for many language pairs is the missing piece in many commercial translation workflows, as QE systems have numerous potential uses. They can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as is in a given context, or whether it requires human post-editing before publishing or translation from scratch by a human. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task in [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods like DeepQuest and OpenKiwi in all the languages tested.
- Pre-trained quality estimation models for fifteen language pairs are available in [HuggingFace.](https://huggingface.co/TransQuest)
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-da-en_any", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures; MonoTransQuest and SiameseTransQuest to perform sentence level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word level quality estimation.
3. **Examples** - We have provided several examples on how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pretrained quality estimation models for fifteen language pairs covering both sentence-level and word-level
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest
## Citations
If you are using the word-level architecture, please consider citing this paper which is accepted to [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers which were presented in [COLING 2020](https://coling2020.org/) and in [WMT 2020](http://www.statmt.org/wmt20/) at EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
510 | TransQuest/monotransquest-da-en_de-wiki | [
"LABEL_0"
] | ---
language: en-de
tags:
- Quality Estimation
- monotransquest
- DA
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for many language pairs is the missing piece in many commercial translation workflows, as QE systems have numerous potential uses. They can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as is in a given context, or whether it requires human post-editing before publishing or translation from scratch by a human. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task in [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods like DeepQuest and OpenKiwi in all the languages tested.
- Pre-trained quality estimation models for fifteen language pairs are available in [HuggingFace.](https://huggingface.co/TransQuest)
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-da-en_de-wiki", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures; MonoTransQuest and SiameseTransQuest to perform sentence level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word level quality estimation.
3. **Examples** - We have provided several examples on how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pretrained quality estimation models for fifteen language pairs covering both sentence-level and word-level
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest
## Citations
If you are using the word-level architecture, please consider citing this paper which is accepted to [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) (co-located with EMNLP 2020).
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
511 | TransQuest/monotransquest-da-en_zh-wiki | [
"LABEL_0"
] | ---
language: en-zh
tags:
- Quality Estimation
- monotransquest
- DA
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: QE can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as is in a given context, or whether it requires human post-editing, or translation from scratch by a human, before publishing. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods such as DeepQuest and OpenKiwi in all the language pairs we experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [Hugging Face](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-da-en_zh-wiki", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence-level and word-level quality estimation.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) (co-located with EMNLP 2020).
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
512 | TransQuest/monotransquest-da-et_en-wiki | [
"LABEL_0"
] | ---
language: et-en
tags:
- Quality Estimation
- monotransquest
- DA
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: QE can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as is in a given context, or whether it requires human post-editing, or translation from scratch by a human, before publishing. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods such as DeepQuest and OpenKiwi in all the language pairs we experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [Hugging Face](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-da-et_en-wiki", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence-level and word-level quality estimation.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) (co-located with EMNLP 2020).
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
513 | TransQuest/monotransquest-da-multilingual | [
"LABEL_0"
] | ---
language: multilingual-multilingual
tags:
- Quality Estimation
- monotransquest
- DA
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: QE can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as is in a given context, or whether it requires human post-editing, or translation from scratch by a human, before publishing. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods such as DeepQuest and OpenKiwi in all the language pairs we experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [Hugging Face](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-da-multilingual", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence-level and word-level quality estimation.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) (co-located with EMNLP 2020).
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
514 | TransQuest/monotransquest-da-ne_en-wiki | [
"LABEL_0"
] | ---
language: ne-en
tags:
- Quality Estimation
- monotransquest
- DA
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: QE can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as is in a given context, or whether it requires human post-editing, or translation from scratch by a human, before publishing. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods such as DeepQuest and OpenKiwi in all the language pairs we experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [Hugging Face](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-da-ne_en-wiki", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence-level and word-level quality estimation.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) (co-located with EMNLP 2020).
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
515 | TransQuest/monotransquest-da-ro_en-wiki | [
"LABEL_0"
] | ---
language: ro-en
tags:
- Quality Estimation
- monotransquest
- DA
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: QE can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as is in a given context, or whether it requires human post-editing, or translation from scratch by a human, before publishing. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods such as DeepQuest and OpenKiwi in all the language pairs we experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [Hugging Face](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-da-ro_en-wiki", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence-level and word-level quality estimation.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) (co-located with EMNLP 2020).
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
516 | TransQuest/monotransquest-da-ru_en-reddit_wikiquotes | [
"LABEL_0"
] | ---
language: ru-en
tags:
- Quality Estimation
- monotransquest
- DA
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: QE can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as is in a given context, or whether it requires human post-editing, or translation from scratch by a human, before publishing. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods such as DeepQuest and OpenKiwi in all the language pairs we experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [Hugging Face](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-da-ru_en-reddit_wikiquotes", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence-level and word-level quality estimation.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/), co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
517 | TransQuest/monotransquest-da-si_en-wiki | [
"LABEL_0"
] | ---
language: si-en
tags:
- Quality Estimation
- monotransquest
- DA
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: it can be employed to select the best translation when several translation engines are available, or it can inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as it is in a given context, or whether it requires human post-editing before publishing, or translation from scratch by a human. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods like DeepQuest and OpenKiwi in all the languages experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [HuggingFace](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-da-si_en-wiki", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence level and word level.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/), co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
518 | TransQuest/monotransquest-hter-de_en-pharmaceutical | [
"LABEL_0"
] | ---
language: de-en
tags:
- Quality Estimation
- monotransquest
- hter
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: it can be employed to select the best translation when several translation engines are available, or it can inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as it is in a given context, or whether it requires human post-editing before publishing, or translation from scratch by a human. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods like DeepQuest and OpenKiwi in all the languages experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [HuggingFace](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-hter-de_en-pharmaceutical", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
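This model predicts HTER (human-targeted translation edit rate): roughly, the fraction of edits a human post-editor would need to make to fix the MT output. As a simplified, self-contained sketch of how an HTER-style score relates an MT output to its post-edit (plain Levenshtein distance over tokens; the real TER metric additionally allows block shifts, and the sentence pair below is invented for illustration):

```python
def hter(mt_tokens, pe_tokens):
    """HTER-style score: word-level edit distance from the MT output to
    its human post-edit, normalised by the post-edit length."""
    m, n = len(mt_tokens), len(pe_tokens)
    # Standard dynamic-programming edit-distance table.
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if mt_tokens[i - 1] == pe_tokens[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n] / n

# Hypothetical MT output and its human post-edit.
mt = "reducing these conflicts is not important".split()
pe = "reducing these conflicts is important".split()
print(hter(mt, pe))  # one deletion over five post-edit tokens -> 0.2
```

A score near 0 means the translation needed almost no post-editing; higher scores mean heavier editing effort.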
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence level and word level.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/), co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
519 | TransQuest/monotransquest-hter-en_any | [
"LABEL_0"
] | ---
language: en-multilingual
tags:
- Quality Estimation
- monotransquest
- HTER
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: it can be employed to select the best translation when several translation engines are available, or it can inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as it is in a given context, or whether it requires human post-editing before publishing, or translation from scratch by a human. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods like DeepQuest and OpenKiwi in all the languages experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [HuggingFace](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-hter-en_any", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence level and word level.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/), co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
``` |
520 | TransQuest/monotransquest-hter-en_cs-pharmaceutical | [
"LABEL_0"
] | ---
language: en-cs
tags:
- Quality Estimation
- monotransquest
- hter
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: it can be employed to select the best translation when several translation engines are available, or it can inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as it is in a given context, or whether it requires human post-editing before publishing, or translation from scratch by a human. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods like DeepQuest and OpenKiwi in all the languages experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [HuggingFace](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-hter-en_cs-pharmaceutical", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence level and word level.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/), co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
521 | TransQuest/monotransquest-hter-en_de-it-nmt | [
"LABEL_0"
] | ---
language: en-de
tags:
- Quality Estimation
- monotransquest
- hter
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: it can be employed to select the best translation when several translation engines are available, or it can inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as it is in a given context, or whether it requires human post-editing before publishing, or translation from scratch by a human. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods like DeepQuest and OpenKiwi in all the languages experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [HuggingFace](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-hter-en_de-it-nmt", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence level and word level.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/), co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
522 | TransQuest/monotransquest-hter-en_de-it-smt | [
"LABEL_0"
] | ---
language: en-de
tags:
- Quality Estimation
- monotransquest
- hter
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as it has numerous potential uses: it can be employed to select the best translation when several translation engines are available, or it can inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as it is in a given context, or whether it requires human post-editing before publishing, or translation from scratch by a human. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation on both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods like DeepQuest and OpenKiwi in all the languages experimented with.
- Pre-trained quality estimation models for fifteen language pairs are available on [HuggingFace](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-hter-en_de-it-smt", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence level and word level.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
523 | TransQuest/monotransquest-hter-en_de-wiki | [
"LABEL_0"
] | ---
language: en-de
tags:
- Quality Estimation
- monotransquest
- hter
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as QE systems have numerous potential uses. They can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as-is in a given context, or whether it requires human post-editing or retranslation from scratch. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation for both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods such as DeepQuest and OpenKiwi on all the language pairs evaluated.
- Pre-trained quality estimation models for fifteen language pairs are available on [Hugging Face](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-hter-en_de-wiki", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence level and word level.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
524 | TransQuest/monotransquest-hter-en_lv-it-nmt | [
"LABEL_0"
] | ---
language: en-lv
tags:
- Quality Estimation
- monotransquest
- hter
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as QE systems have numerous potential uses. They can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as-is in a given context, or whether it requires human post-editing or retranslation from scratch. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation for both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods such as DeepQuest and OpenKiwi on all the language pairs evaluated.
- Pre-trained quality estimation models for fifteen language pairs are available on [Hugging Face](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-hter-en_lv-it-nmt", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence level and word level.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
525 | TransQuest/monotransquest-hter-en_lv-it-smt | [
"LABEL_0"
] | ---
language: en-lv
tags:
- Quality Estimation
- monotransquest
- hter
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as QE systems have numerous potential uses. They can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as-is in a given context, or whether it requires human post-editing or retranslation from scratch. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation for both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods such as DeepQuest and OpenKiwi on all the language pairs evaluated.
- Pre-trained quality estimation models for fifteen language pairs are available on [Hugging Face](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-hter-en_lv-it-smt", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence level and word level.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
526 | TransQuest/monotransquest-hter-en_zh-wiki | [
"LABEL_0"
] | ---
language: en-zh
tags:
- Quality Estimation
- monotransquest
- hter
license: apache-2.0
---
# TransQuest: Translation Quality Estimation with Cross-lingual Transformers
The goal of quality estimation (QE) is to evaluate the quality of a translation without access to a reference translation. High-accuracy QE that can be easily deployed for a number of language pairs is the missing piece in many commercial translation workflows, as QE systems have numerous potential uses. They can be employed to select the best translation when several translation engines are available, or to inform the end user about the reliability of automatically translated content. In addition, QE systems can be used to decide whether a translation can be published as-is in a given context, or whether it requires human post-editing or retranslation from scratch. Quality estimation can be done at different levels: document level, sentence level and word level.
With TransQuest, we have open-sourced our research in translation quality estimation, which also won the sentence-level direct assessment quality estimation shared task at [WMT 2020](http://www.statmt.org/wmt20/quality-estimation-task.html). TransQuest outperforms current open-source quality estimation frameworks such as [OpenKiwi](https://github.com/Unbabel/OpenKiwi) and [DeepQuest](https://github.com/sheffieldnlp/deepQuest).
## Features
- Sentence-level translation quality estimation for both aspects: predicting post-editing effort and direct assessment.
- Word-level translation quality estimation capable of predicting the quality of source words, target words and target gaps.
- Outperforms current state-of-the-art quality estimation methods such as DeepQuest and OpenKiwi on all the language pairs evaluated.
- Pre-trained quality estimation models for fifteen language pairs are available on [Hugging Face](https://huggingface.co/TransQuest).
## Installation
### From pip
```bash
pip install transquest
```
### From Source
```bash
git clone https://github.com/TharinduDR/TransQuest.git
cd TransQuest
pip install -r requirements.txt
```
## Using Pre-trained Models
```python
import torch
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel
model = MonoTransQuestModel("xlmroberta", "TransQuest/monotransquest-hter-en_zh-wiki", num_labels=1, use_cuda=torch.cuda.is_available())
predictions, raw_outputs = model.predict([["Reducerea acestor conflicte este importantă pentru conservare.", "Reducing these conflicts is not important for preservation."]])
print(predictions)
```
## Documentation
For more details, follow the documentation.
1. **[Installation](https://tharindudr.github.io/TransQuest/install/)** - Install TransQuest locally using pip.
2. **Architectures** - Check out the architectures implemented in TransQuest.
1. [Sentence-level Architectures](https://tharindudr.github.io/TransQuest/architectures/sentence_level_architectures/) - We have released two architectures, MonoTransQuest and SiameseTransQuest, to perform sentence-level quality estimation.
2. [Word-level Architecture](https://tharindudr.github.io/TransQuest/architectures/word_level_architecture/) - We have released MicroTransQuest to perform word-level quality estimation.
3. **Examples** - We have provided several examples of how to use TransQuest in recent WMT quality estimation shared tasks.
1. [Sentence-level Examples](https://tharindudr.github.io/TransQuest/examples/sentence_level_examples/)
2. [Word-level Examples](https://tharindudr.github.io/TransQuest/examples/word_level_examples/)
4. **Pre-trained Models** - We have provided pre-trained quality estimation models for fifteen language pairs, covering both sentence level and word level.
1. [Sentence-level Models](https://tharindudr.github.io/TransQuest/models/sentence_level_pretrained/)
2. [Word-level Models](https://tharindudr.github.io/TransQuest/models/word_level_pretrained/)
5. **[Contact](https://tharindudr.github.io/TransQuest/contact/)** - Contact us for any issues with TransQuest.
## Citations
If you are using the word-level architecture, please consider citing this paper, which was accepted at [ACL 2021](https://2021.aclweb.org/).
```bibtex
@InProceedings{ranasinghe2021,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {An Exploratory Analysis of Multilingual Word Level Quality Estimation with Cross-Lingual Transformers},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
year = {2021}
}
```
If you are using the sentence-level architectures, please consider citing these papers, which were presented at [COLING 2020](https://coling2020.org/) and at [WMT 2020](http://www.statmt.org/wmt20/) co-located with EMNLP 2020.
```bibtex
@InProceedings{transquest:2020a,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest: Translation Quality Estimation with Cross-lingual Transformers},
booktitle = {Proceedings of the 28th International Conference on Computational Linguistics},
year = {2020}
}
```
```bibtex
@InProceedings{transquest:2020b,
author = {Ranasinghe, Tharindu and Orasan, Constantin and Mitkov, Ruslan},
title = {TransQuest at WMT2020: Sentence-Level Direct Assessment},
booktitle = {Proceedings of the Fifth Conference on Machine Translation},
year = {2020}
}
```
|
528 | Vasanth/tamil-sentiment-distilbert | [
"LABEL_0",
"LABEL_1",
"LABEL_2",
"LABEL_3",
"LABEL_4"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- tamilmixsentiment
metrics:
- accuracy
model_index:
- name: tamil-sentiment-distilbert
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: tamilmixsentiment
type: tamilmixsentiment
args: default
metric:
name: Accuracy
type: accuracy
value: 0.665
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# tamil-sentiment-distilbert
This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the tamilmixsentiment dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0230
- Accuracy: 0.665
## Dataset Information
- text: Tamil-English code-mixed comment.
- label: list of the possible sentiments
- LABEL_0: "Positive",
- LABEL_1: "Negative",
- LABEL_2: "Mixed_feelings",
- LABEL_3: "unknown_state",
- LABEL_4: "not-Tamil"
## Intended uses & limitations
This model was created to perform text classification on the tamilmixsentiment dataset; it has not been evaluated for other uses.
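Because the checkpoint exposes generic `LABEL_0`…`LABEL_4` names, callers need to translate them using the mapping listed above. A minimal sketch of that step, assuming output in the usual `transformers` text-classification pipeline shape; the `readable` helper is illustrative, not part of the model:

```python
# Map the generic classifier labels to the sentiment names listed in this card.
ID2SENTIMENT = {
    "LABEL_0": "Positive",
    "LABEL_1": "Negative",
    "LABEL_2": "Mixed_feelings",
    "LABEL_3": "unknown_state",
    "LABEL_4": "not-Tamil",
}

def readable(prediction: dict) -> dict:
    # `prediction` is one entry of a transformers pipeline output,
    # e.g. {"label": "LABEL_0", "score": 0.91}
    return {"sentiment": ID2SENTIMENT[prediction["label"]],
            "score": prediction["score"]}

print(readable({"label": "LABEL_0", "score": 0.91}))
```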
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 0
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.0442 | 1.0 | 250 | 0.9883 | 0.674 |
| 0.9227 | 2.0 | 500 | 0.9782 | 0.673 |
| 0.7591 | 3.0 | 750 | 1.0230 | 0.665 |
### Framework versions
- Transformers 4.9.2
- Pytorch 1.9.0+cu102
- Datasets 1.11.0
- Tokenizers 0.10.3
|
529 | Vassilis/distilbert-base-uncased-finetuned-emotion | [
"LABEL_0",
"LABEL_1",
"LABEL_2",
"LABEL_3",
"LABEL_4",
"LABEL_5"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1628
- Accuracy: 0.9345
- F1: 0.9348
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.1674 | 1.0 | 250 | 0.1718 | 0.9265 | 0.9266 |
| 0.1091 | 2.0 | 500 | 0.1628 | 0.9345 | 0.9348 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0
- Tokenizers 0.10.3
|
532 | Wellcome/WellcomeBertMesh | [
"LABEL_0",
"LABEL_1",
"LABEL_10",
"LABEL_100",
"LABEL_1000",
"LABEL_10000",
"LABEL_10001",
"LABEL_10002",
"LABEL_10003",
"LABEL_10004",
"LABEL_10005",
"LABEL_10006",
"LABEL_10007",
"LABEL_10008",
"LABEL_10009",
"LABEL_1001",
"LABEL_10010",
"LABEL_10011",
"LABEL_10012",
"LABEL_10013",
"LABEL_10014",
"LABEL_10015",
"LABEL_10016",
"LABEL_10017",
"LABEL_10018",
"LABEL_10019",
"LABEL_1002",
"LABEL_10020",
"LABEL_10021",
"LABEL_10022",
"LABEL_10023",
"LABEL_10024",
"LABEL_10025",
"LABEL_10026",
"LABEL_10027",
"LABEL_10028",
"LABEL_10029",
"LABEL_1003",
"LABEL_10030",
"LABEL_10031",
"LABEL_10032",
"LABEL_10033",
"LABEL_10034",
"LABEL_10035",
"LABEL_10036",
"LABEL_10037",
"LABEL_10038",
"LABEL_10039",
"LABEL_1004",
"LABEL_10040",
"LABEL_10041",
"LABEL_10042",
"LABEL_10043",
"LABEL_10044",
"LABEL_10045",
"LABEL_10046",
"LABEL_10047",
"LABEL_10048",
"LABEL_10049",
"LABEL_1005",
"LABEL_10050",
"LABEL_10051",
"LABEL_10052",
"LABEL_10053",
"LABEL_10054",
"LABEL_10055",
"LABEL_10056",
"LABEL_10057",
"LABEL_10058",
"LABEL_10059",
"LABEL_1006",
"LABEL_10060",
"LABEL_10061",
"LABEL_10062",
"LABEL_10063",
"LABEL_10064",
"LABEL_10065",
"LABEL_10066",
"LABEL_10067",
"LABEL_10068",
"LABEL_10069",
"LABEL_1007",
"LABEL_10070",
"LABEL_10071",
"LABEL_10072",
"LABEL_10073",
"LABEL_10074",
"LABEL_10075",
"LABEL_10076",
"LABEL_10077",
"LABEL_10078",
"LABEL_10079",
"LABEL_1008",
"LABEL_10080",
"LABEL_10081",
"LABEL_10082",
"LABEL_10083",
"LABEL_10084",
"LABEL_10085",
"LABEL_10086",
"LABEL_10087",
"LABEL_10088",
"LABEL_10089",
"LABEL_1009",
"LABEL_10090",
"LABEL_10091",
"LABEL_10092",
"LABEL_10093",
"LABEL_10094",
"LABEL_10095",
"LABEL_10096",
"LABEL_10097",
"LABEL_10098",
"LABEL_10099",
"LABEL_101",
"LABEL_1010",
"LABEL_10100",
"LABEL_10101",
"LABEL_10102",
"LABEL_10103",
"LABEL_10104",
"LABEL_10105",
"LABEL_10106",
"LABEL_10107",
"LABEL_10108",
"LABEL_10109",
"LABEL_1011",
"LABEL_10110",
"LABEL_10111",
"LABEL_10112",
"LABEL_10113",
"LABEL_10114",
"LABEL_10115",
"LABEL_10116",
"LABEL_10117",
"LABEL_10118",
"LABEL_10119",
"LABEL_1012",
"LABEL_10120",
"LABEL_10121",
"LABEL_10122",
"LABEL_10123",
"LABEL_10124",
"LABEL_10125",
"LABEL_10126",
"LABEL_10127",
"LABEL_10128",
"LABEL_10129",
"LABEL_1013",
"LABEL_10130",
"LABEL_10131",
"LABEL_10132",
"LABEL_10133",
"LABEL_10134",
"LABEL_10135",
"LABEL_10136",
"LABEL_10137",
"LABEL_10138",
"LABEL_10139",
"LABEL_1014",
"LABEL_10140",
"LABEL_10141",
"LABEL_10142",
"LABEL_10143",
"LABEL_10144",
"LABEL_10145",
"LABEL_10146",
"LABEL_10147",
"LABEL_10148",
"LABEL_10149",
"LABEL_1015",
"LABEL_10150",
"LABEL_10151",
"LABEL_10152",
"LABEL_10153",
"LABEL_10154",
"LABEL_10155",
"LABEL_10156",
"LABEL_10157",
"LABEL_10158",
"LABEL_10159",
"LABEL_1016",
"LABEL_10160",
"LABEL_10161",
"LABEL_10162",
"LABEL_10163",
"LABEL_10164",
"LABEL_10165",
"LABEL_10166",
"LABEL_10167",
"LABEL_10168",
"LABEL_10169",
"LABEL_1017",
"LABEL_10170",
"LABEL_10171",
"LABEL_10172",
"LABEL_10173",
"LABEL_10174",
"LABEL_10175",
"LABEL_10176",
"LABEL_10177",
"LABEL_10178",
"LABEL_10179",
"LABEL_1018",
"LABEL_10180",
"LABEL_10181",
"LABEL_10182",
"LABEL_10183",
"LABEL_10184",
"LABEL_10185",
"LABEL_10186",
"LABEL_10187",
"LABEL_10188",
"LABEL_10189",
"LABEL_1019",
"LABEL_10190",
"LABEL_10191",
"LABEL_10192",
"LABEL_10193",
"LABEL_10194",
"LABEL_10195",
"LABEL_10196",
"LABEL_10197",
"LABEL_10198",
"LABEL_10199",
"LABEL_102",
"LABEL_1020",
"LABEL_10200",
"LABEL_10201",
"LABEL_10202",
"LABEL_10203",
"LABEL_10204",
"LABEL_10205",
"LABEL_10206",
"LABEL_10207",
"LABEL_10208",
"LABEL_10209",
"LABEL_1021",
"LABEL_10210",
"LABEL_10211",
"LABEL_10212",
"LABEL_10213",
"LABEL_10214",
"LABEL_10215",
"LABEL_10216",
"LABEL_10217",
"LABEL_10218",
"LABEL_10219",
"LABEL_1022",
"LABEL_10220",
"LABEL_10221",
"LABEL_10222",
"LABEL_10223",
"LABEL_10224",
"LABEL_10225",
"LABEL_10226",
"LABEL_10227",
"LABEL_10228",
"LABEL_10229",
"LABEL_1023",
"LABEL_10230",
"LABEL_10231",
"LABEL_10232",
"LABEL_10233",
"LABEL_10234",
"LABEL_10235",
"LABEL_10236",
"LABEL_10237",
"LABEL_10238",
"LABEL_10239",
"LABEL_1024",
"LABEL_10240",
"LABEL_10241",
"LABEL_10242",
"LABEL_10243",
"LABEL_10244",
"LABEL_10245",
"LABEL_10246",
"LABEL_10247",
"LABEL_10248",
"LABEL_10249",
"LABEL_1025",
"LABEL_10250",
"LABEL_10251",
"LABEL_10252",
"LABEL_10253",
"LABEL_10254",
"LABEL_10255",
"LABEL_10256",
"LABEL_10257",
"LABEL_10258",
"LABEL_10259",
"LABEL_1026",
"LABEL_10260",
"LABEL_10261",
"LABEL_10262",
"LABEL_10263",
"LABEL_10264",
"LABEL_10265",
"LABEL_10266",
"LABEL_10267",
"LABEL_10268",
"LABEL_10269",
"LABEL_1027",
"LABEL_10270",
"LABEL_10271",
"LABEL_10272",
"LABEL_10273",
"LABEL_10274",
"LABEL_10275",
"LABEL_10276",
"LABEL_10277",
"LABEL_10278",
"LABEL_10279",
"LABEL_1028",
"LABEL_10280",
"LABEL_10281",
"LABEL_10282",
"LABEL_10283",
"LABEL_10284",
"LABEL_10285",
"LABEL_10286",
"LABEL_10287",
"LABEL_10288",
"LABEL_10289",
"LABEL_1029",
"LABEL_10290",
"LABEL_10291",
"LABEL_10292",
"LABEL_10293",
"LABEL_10294",
"LABEL_10295",
"LABEL_10296",
"LABEL_10297",
"LABEL_10298",
"LABEL_10299",
"LABEL_103",
"LABEL_1030",
"LABEL_10300",
"LABEL_10301",
"LABEL_10302",
"LABEL_10303",
"LABEL_10304",
"LABEL_10305",
"LABEL_10306",
"LABEL_10307",
"LABEL_10308",
"LABEL_10309",
"LABEL_1031",
"LABEL_10310",
"LABEL_10311",
"LABEL_10312",
"LABEL_10313",
"LABEL_10314",
"LABEL_10315",
"LABEL_10316",
"LABEL_10317",
"LABEL_10318",
"LABEL_10319",
"LABEL_1032",
"LABEL_10320",
"LABEL_10321",
"LABEL_10322",
"LABEL_10323",
"LABEL_10324",
"LABEL_10325",
"LABEL_10326",
"LABEL_10327",
"LABEL_10328",
"LABEL_10329",
"LABEL_1033",
"LABEL_10330",
"LABEL_10331",
"LABEL_10332",
"LABEL_10333",
"LABEL_10334",
"LABEL_10335",
"LABEL_10336",
"LABEL_10337",
"LABEL_10338",
"LABEL_10339",
"LABEL_1034",
"LABEL_10340",
"LABEL_10341",
"LABEL_10342",
"LABEL_10343",
"LABEL_10344",
"LABEL_10345",
"LABEL_10346",
"LABEL_10347",
"LABEL_10348",
"LABEL_10349",
"LABEL_1035",
"LABEL_10350",
"LABEL_10351",
"LABEL_10352",
"LABEL_10353",
"LABEL_10354",
"LABEL_10355",
"LABEL_10356",
"LABEL_10357",
"LABEL_10358",
"LABEL_10359",
"LABEL_1036",
"LABEL_10360",
"LABEL_10361",
"LABEL_10362",
"LABEL_10363",
"LABEL_10364",
"LABEL_10365",
"LABEL_10366",
"LABEL_10367",
"LABEL_10368",
"LABEL_10369",
"LABEL_1037",
"LABEL_10370",
"LABEL_10371",
"LABEL_10372",
"LABEL_10373",
"LABEL_10374",
"LABEL_10375",
"LABEL_10376",
"LABEL_10377",
"LABEL_10378",
"LABEL_10379",
"LABEL_1038",
"LABEL_10380",
"LABEL_10381",
"LABEL_10382",
"LABEL_10383",
"LABEL_10384",
"LABEL_10385",
"LABEL_10386",
"LABEL_10387",
"LABEL_10388",
"LABEL_10389",
"LABEL_1039",
"LABEL_10390",
"LABEL_10391",
"LABEL_10392",
"LABEL_10393",
"LABEL_10394",
"LABEL_10395",
"LABEL_10396",
"LABEL_10397",
"LABEL_10398",
"LABEL_10399",
"LABEL_104",
"LABEL_1040",
"LABEL_10400",
"LABEL_10401",
"LABEL_10402",
"LABEL_10403",
"LABEL_10404",
"LABEL_10405",
"LABEL_10406",
"LABEL_10407",
"LABEL_10408",
"LABEL_10409",
"LABEL_1041",
"LABEL_10410",
"LABEL_10411",
"LABEL_10412",
"LABEL_10413",
"LABEL_10414",
"LABEL_10415",
"LABEL_10416",
"LABEL_10417",
"LABEL_10418",
"LABEL_10419",
"LABEL_1042",
"LABEL_10420",
"LABEL_10421",
"LABEL_10422",
"LABEL_10423",
"LABEL_10424",
"LABEL_10425",
"LABEL_10426",
"LABEL_10427",
"LABEL_10428",
"LABEL_10429",
"LABEL_1043",
"LABEL_10430",
"LABEL_10431",
"LABEL_10432",
"LABEL_10433",
"LABEL_10434",
"LABEL_10435",
"LABEL_10436",
"LABEL_10437",
"LABEL_10438",
"LABEL_10439",
"LABEL_1044",
"LABEL_10440",
"LABEL_10441",
"LABEL_10442",
"LABEL_10443",
"LABEL_10444",
"LABEL_10445",
"LABEL_10446",
"LABEL_10447",
"LABEL_10448",
"LABEL_10449",
"LABEL_1045",
"LABEL_10450",
"LABEL_10451",
"LABEL_10452",
"LABEL_10453",
"LABEL_10454",
"LABEL_10455",
"LABEL_10456",
"LABEL_10457",
"LABEL_10458",
"LABEL_10459",
"LABEL_1046",
"LABEL_10460",
"LABEL_10461",
"LABEL_10462",
"LABEL_10463",
"LABEL_10464",
"LABEL_10465",
"LABEL_10466",
"LABEL_10467",
"LABEL_10468",
"LABEL_10469",
"LABEL_1047",
"LABEL_10470",
"LABEL_10471",
"LABEL_10472",
"LABEL_10473",
"LABEL_10474",
"LABEL_10475",
"LABEL_10476",
"LABEL_10477",
"LABEL_10478",
"LABEL_10479",
"LABEL_1048",
"LABEL_10480",
"LABEL_10481",
"LABEL_10482",
"LABEL_10483",
"LABEL_10484",
"LABEL_10485",
"LABEL_10486",
"LABEL_10487",
"LABEL_10488",
"LABEL_10489",
"LABEL_1049",
"LABEL_10490",
"LABEL_10491",
"LABEL_10492",
"LABEL_10493",
"LABEL_10494",
"LABEL_10495",
"LABEL_10496",
"LABEL_10497",
"LABEL_10498",
"LABEL_10499",
"LABEL_105",
"LABEL_1050",
"LABEL_10500",
"LABEL_10501",
"LABEL_10502",
"LABEL_10503",
"LABEL_10504",
"LABEL_10505",
"LABEL_10506",
"LABEL_10507",
"LABEL_10508",
"LABEL_10509",
"LABEL_1051",
"LABEL_10510",
"LABEL_10511",
"LABEL_10512",
"LABEL_10513",
"LABEL_10514",
"LABEL_10515",
"LABEL_10516",
"LABEL_10517",
"LABEL_10518",
"LABEL_10519",
"LABEL_1052",
"LABEL_10520",
"LABEL_10521",
"LABEL_10522",
"LABEL_10523",
"LABEL_10524",
"LABEL_10525",
"LABEL_10526",
"LABEL_10527",
"LABEL_10528",
"LABEL_10529",
"LABEL_1053",
"LABEL_10530",
"LABEL_10531",
"LABEL_10532",
"LABEL_10533",
"LABEL_10534",
"LABEL_10535",
"LABEL_10536",
"LABEL_10537",
"LABEL_10538",
"LABEL_10539",
"LABEL_1054",
"LABEL_10540",
"LABEL_10541",
"LABEL_10542",
"LABEL_10543",
"LABEL_10544",
"LABEL_10545",
"LABEL_10546",
"LABEL_10547",
"LABEL_10548",
"LABEL_10549",
"LABEL_1055",
"LABEL_10550",
"LABEL_10551",
"LABEL_10552",
"LABEL_10553",
"LABEL_10554",
"LABEL_10555",
"LABEL_10556",
"LABEL_10557",
"LABEL_10558",
"LABEL_10559",
"LABEL_1056",
"LABEL_10560",
"LABEL_10561",
"LABEL_10562",
"LABEL_10563",
"LABEL_10564",
"LABEL_10565",
"LABEL_10566",
"LABEL_10567",
"LABEL_10568",
"LABEL_10569",
"LABEL_1057",
"LABEL_10570",
"LABEL_10571",
"LABEL_10572",
"LABEL_10573",
"LABEL_10574",
"LABEL_10575",
"LABEL_10576",
"LABEL_10577",
"LABEL_10578",
"LABEL_10579",
"LABEL_1058",
"LABEL_10580",
"LABEL_10581",
"LABEL_10582",
"LABEL_10583",
"LABEL_10584",
"LABEL_10585",
"LABEL_10586",
"LABEL_10587",
"LABEL_10588",
"LABEL_10589",
"LABEL_1059",
"LABEL_10590",
"LABEL_10591",
"LABEL_10592",
"LABEL_10593",
"LABEL_10594",
"LABEL_10595",
"LABEL_10596",
"LABEL_10597",
"LABEL_10598",
"LABEL_10599",
"LABEL_106",
"LABEL_1060",
"LABEL_10600",
"LABEL_10601",
"LABEL_10602",
"LABEL_10603",
"LABEL_10604",
"LABEL_10605",
"LABEL_10606",
"LABEL_10607",
"LABEL_10608",
"LABEL_10609",
"LABEL_1061",
"LABEL_10610",
"LABEL_10611",
"LABEL_10612",
"LABEL_10613",
"LABEL_10614",
"LABEL_10615",
"LABEL_10616",
"LABEL_10617",
"LABEL_10618",
"LABEL_10619",
"LABEL_1062",
"LABEL_10620",
"LABEL_10621",
"LABEL_10622",
"LABEL_10623",
"LABEL_10624",
"LABEL_10625",
"LABEL_10626",
"LABEL_10627",
"LABEL_10628",
"LABEL_10629",
"LABEL_1063",
"LABEL_10630",
"LABEL_10631",
"LABEL_10632",
"LABEL_10633",
"LABEL_10634",
"LABEL_10635",
"LABEL_10636",
"LABEL_10637",
"LABEL_10638",
"LABEL_10639",
"LABEL_1064",
"LABEL_10640",
"LABEL_10641",
"LABEL_10642",
"LABEL_10643",
"LABEL_10644",
"LABEL_10645",
"LABEL_10646",
"LABEL_10647",
"LABEL_10648",
"LABEL_10649",
"LABEL_1065",
"LABEL_10650",
"LABEL_10651",
"LABEL_10652",
"LABEL_10653",
"LABEL_10654",
"LABEL_10655",
"LABEL_10656",
"LABEL_10657",
"LABEL_10658",
"LABEL_10659",
"LABEL_1066",
"LABEL_10660",
"LABEL_10661",
"LABEL_10662",
"LABEL_10663",
"LABEL_10664",
"LABEL_10665",
"LABEL_10666",
"LABEL_10667",
"LABEL_10668",
"LABEL_10669",
"LABEL_1067",
"LABEL_10670",
"LABEL_10671",
"LABEL_10672",
"LABEL_10673",
"LABEL_10674",
"LABEL_10675",
"LABEL_10676",
"LABEL_10677",
"LABEL_10678",
"LABEL_10679",
"LABEL_1068",
"LABEL_10680",
"LABEL_10681",
"LABEL_10682",
"LABEL_10683",
"LABEL_10684",
"LABEL_10685",
"LABEL_10686",
"LABEL_10687",
"LABEL_10688",
"LABEL_10689",
"LABEL_1069",
"LABEL_10690",
"LABEL_10691",
"LABEL_10692",
"LABEL_10693",
"LABEL_10694",
"LABEL_10695",
"LABEL_10696",
"LABEL_10697",
"LABEL_10698",
"LABEL_10699",
"LABEL_107",
"LABEL_1070",
"LABEL_10700",
"LABEL_10701",
"LABEL_10702",
"LABEL_10703",
"LABEL_10704",
"LABEL_10705",
"LABEL_10706",
"LABEL_10707",
"LABEL_10708",
"LABEL_10709",
"LABEL_1071",
"LABEL_10710",
"LABEL_10711",
"LABEL_10712",
"LABEL_10713",
"LABEL_10714",
"LABEL_10715",
"LABEL_10716",
"LABEL_10717",
"LABEL_10718",
"LABEL_10719",
"LABEL_1072",
"LABEL_10720",
"LABEL_10721",
"LABEL_10722",
"LABEL_10723",
"LABEL_10724",
"LABEL_10725",
"LABEL_10726",
"LABEL_10727",
"LABEL_10728",
"LABEL_10729",
"LABEL_1073",
"LABEL_10730",
"LABEL_10731",
"LABEL_10732",
"LABEL_10733",
"LABEL_10734",
"LABEL_10735",
"LABEL_10736",
"LABEL_10737",
"LABEL_10738",
"LABEL_10739",
"LABEL_1074",
"LABEL_10740",
"LABEL_10741",
"LABEL_10742",
"LABEL_10743",
"LABEL_10744",
"LABEL_10745",
"LABEL_10746",
"LABEL_10747",
"LABEL_10748",
"LABEL_10749",
"LABEL_1075",
"LABEL_10750",
"LABEL_10751",
"LABEL_10752",
"LABEL_10753",
"LABEL_10754",
"LABEL_10755",
"LABEL_10756",
"LABEL_10757",
"LABEL_10758",
"LABEL_10759",
"LABEL_1076",
"LABEL_10760",
"LABEL_10761",
"LABEL_10762",
"LABEL_10763",
"LABEL_10764",
"LABEL_10765",
"LABEL_10766",
"LABEL_10767",
"LABEL_10768",
"LABEL_10769",
"LABEL_1077",
"LABEL_10770",
"LABEL_10771",
"LABEL_10772",
"LABEL_10773",
"LABEL_10774",
"LABEL_10775",
"LABEL_10776",
"LABEL_10777",
"LABEL_10778",
"LABEL_10779",
"LABEL_1078",
"LABEL_10780",
"LABEL_10781",
"LABEL_10782",
"LABEL_10783",
"LABEL_10784",
"LABEL_10785",
"LABEL_10786",
"LABEL_10787",
"LABEL_10788",
"LABEL_10789",
"LABEL_1079",
"LABEL_10790",
"LABEL_10791",
"LABEL_10792",
"LABEL_10793",
"LABEL_10794",
"LABEL_10795",
"LABEL_10796",
"LABEL_10797",
"LABEL_10798",
"LABEL_10799",
"LABEL_108",
"LABEL_1080",
"LABEL_10800",
"LABEL_10801",
"LABEL_10802",
"LABEL_10803",
"LABEL_10804",
"LABEL_10805",
"LABEL_10806",
"LABEL_10807",
"LABEL_10808",
"LABEL_10809",
"LABEL_1081",
"LABEL_10810",
"LABEL_10811",
"LABEL_10812",
"LABEL_10813",
"LABEL_10814",
"LABEL_10815",
"LABEL_10816",
"LABEL_10817",
"LABEL_10818",
"LABEL_10819",
"LABEL_1082",
"LABEL_10820",
"LABEL_10821",
"LABEL_10822",
"LABEL_10823",
"LABEL_10824",
"LABEL_10825",
"LABEL_10826",
"LABEL_10827",
"LABEL_10828",
"LABEL_10829",
"LABEL_1083",
"LABEL_10830",
"LABEL_10831",
"LABEL_10832",
"LABEL_10833",
"LABEL_10834",
"LABEL_10835",
"LABEL_10836",
"LABEL_10837",
"LABEL_10838",
"LABEL_10839",
"LABEL_1084",
"LABEL_10840",
"LABEL_10841",
"LABEL_10842",
"LABEL_10843",
"LABEL_10844",
"LABEL_10845",
"LABEL_10846",
"LABEL_10847",
"LABEL_10848",
"LABEL_10849",
"LABEL_1085",
"LABEL_10850",
"LABEL_10851",
"LABEL_10852",
"LABEL_10853",
"LABEL_10854",
"LABEL_10855",
"LABEL_10856",
"LABEL_10857",
"LABEL_10858",
"LABEL_10859",
"LABEL_1086",
"LABEL_10860",
"LABEL_10861",
"LABEL_10862",
"LABEL_10863",
"LABEL_10864",
"LABEL_10865",
"LABEL_10866",
"LABEL_10867",
"LABEL_10868",
"LABEL_10869",
"LABEL_1087",
"LABEL_10870",
"LABEL_10871",
"LABEL_10872",
"LABEL_10873",
"LABEL_10874",
"LABEL_10875",
"LABEL_10876",
"LABEL_10877",
"LABEL_10878",
"LABEL_10879",
"LABEL_1088",
"LABEL_10880",
"LABEL_10881",
"LABEL_10882",
"LABEL_10883",
"LABEL_10884",
"LABEL_10885",
"LABEL_10886",
"LABEL_10887",
"LABEL_10888",
"LABEL_10889",
"LABEL_1089",
"LABEL_10890",
"LABEL_10891",
"LABEL_10892",
"LABEL_10893",
"LABEL_10894",
"LABEL_10895",
"LABEL_10896",
"LABEL_10897",
"LABEL_10898",
"LABEL_10899",
"LABEL_109",
"LABEL_1090",
"LABEL_10900",
"LABEL_10901",
"LABEL_10902",
"LABEL_10903",
"LABEL_10904",
"LABEL_10905",
"LABEL_10906",
"LABEL_10907",
"LABEL_10908",
"LABEL_10909",
"LABEL_1091",
"LABEL_10910",
"LABEL_10911",
"LABEL_10912",
"LABEL_10913",
"LABEL_10914",
"LABEL_10915",
"LABEL_10916",
"LABEL_10917",
"LABEL_10918",
"LABEL_10919",
"LABEL_1092",
"LABEL_10920",
"LABEL_10921",
"LABEL_10922",
"LABEL_10923",
"LABEL_10924",
"LABEL_10925",
"LABEL_10926",
"LABEL_10927",
"LABEL_10928",
"LABEL_10929",
"LABEL_1093",
"LABEL_10930",
"LABEL_10931",
"LABEL_10932",
"LABEL_10933",
"LABEL_10934",
"LABEL_10935",
"LABEL_10936",
"LABEL_10937",
"LABEL_10938",
"LABEL_10939",
"LABEL_1094",
"LABEL_10940",
"LABEL_10941",
"LABEL_10942",
"LABEL_10943",
"LABEL_10944",
"LABEL_10945",
"LABEL_10946",
"LABEL_10947",
"LABEL_10948",
"LABEL_10949",
"LABEL_1095",
"LABEL_10950",
"LABEL_10951",
"LABEL_10952",
"LABEL_10953",
"LABEL_10954",
"LABEL_10955",
"LABEL_10956",
"LABEL_10957",
"LABEL_10958",
"LABEL_10959",
"LABEL_1096",
"LABEL_10960",
"LABEL_10961",
"LABEL_10962",
"LABEL_10963",
"LABEL_10964",
"LABEL_10965",
"LABEL_10966",
"LABEL_10967",
"LABEL_10968",
"LABEL_10969",
"LABEL_1097",
"LABEL_10970",
"LABEL_10971",
"LABEL_10972",
"LABEL_10973",
"LABEL_10974",
"LABEL_10975",
"LABEL_10976",
"LABEL_10977",
"LABEL_10978",
"LABEL_10979",
"LABEL_1098",
"LABEL_10980",
"LABEL_10981",
"LABEL_10982",
"LABEL_10983",
"LABEL_10984",
"LABEL_10985",
"LABEL_10986",
"LABEL_10987",
"LABEL_10988",
"LABEL_10989",
"LABEL_1099",
"LABEL_10990",
"LABEL_10991",
"LABEL_10992",
"LABEL_10993",
"LABEL_10994",
"LABEL_10995",
"LABEL_10996",
"LABEL_10997",
"LABEL_10998",
"LABEL_10999",
"LABEL_11",
"LABEL_110",
"LABEL_1100",
"LABEL_11000",
"LABEL_11001",
"LABEL_11002",
"LABEL_11003",
"LABEL_11004",
"LABEL_11005",
"LABEL_11006",
"LABEL_11007",
"LABEL_11008",
"LABEL_11009",
"LABEL_1101",
"LABEL_11010",
"LABEL_11011",
"LABEL_11012",
"LABEL_11013",
"LABEL_11014",
"LABEL_11015",
"LABEL_11016",
"LABEL_11017",
"LABEL_11018",
"LABEL_11019",
"LABEL_1102",
"LABEL_11020",
"LABEL_11021",
"LABEL_11022",
"LABEL_11023",
"LABEL_11024",
"LABEL_11025",
"LABEL_11026",
"LABEL_11027",
"LABEL_11028",
"LABEL_11029",
"LABEL_1103",
"LABEL_11030",
"LABEL_11031",
"LABEL_11032",
"LABEL_11033",
"LABEL_11034",
"LABEL_11035",
"LABEL_11036",
"LABEL_11037",
"LABEL_11038",
"LABEL_11039",
"LABEL_1104",
"LABEL_11040",
"LABEL_11041",
"LABEL_11042",
"LABEL_11043",
"LABEL_11044",
"LABEL_11045",
"LABEL_11046",
"LABEL_11047",
"LABEL_11048",
"LABEL_11049",
"LABEL_1105",
"LABEL_11050",
"LABEL_11051",
"LABEL_11052",
"LABEL_11053",
"LABEL_11054",
"LABEL_11055",
"LABEL_11056",
"LABEL_11057",
"LABEL_11058",
"LABEL_11059",
"LABEL_1106",
"LABEL_11060",
"LABEL_11061",
"LABEL_11062",
"LABEL_11063",
"LABEL_11064",
"LABEL_11065",
"LABEL_11066",
"LABEL_11067",
"LABEL_11068",
"LABEL_11069",
"LABEL_1107",
"LABEL_11070",
"LABEL_11071",
"LABEL_11072",
"LABEL_11073",
"LABEL_11074",
"LABEL_11075",
"LABEL_11076",
"LABEL_11077",
"LABEL_11078",
"LABEL_11079",
"LABEL_1108",
"LABEL_11080",
"LABEL_11081",
"LABEL_11082",
"LABEL_11083",
"LABEL_11084",
"LABEL_11085",
"LABEL_11086",
"LABEL_11087",
"LABEL_11088",
"LABEL_11089",
"LABEL_1109",
"LABEL_11090",
"LABEL_11091",
"LABEL_11092",
"LABEL_11093",
"LABEL_11094",
"LABEL_11095",
"LABEL_11096",
"LABEL_11097",
"LABEL_11098",
"LABEL_11099",
"LABEL_111",
"LABEL_1110",
"LABEL_11100",
"LABEL_11101",
"LABEL_11102",
"LABEL_11103",
"LABEL_11104",
"LABEL_11105",
"LABEL_11106",
"LABEL_11107",
"LABEL_11108",
"LABEL_11109",
"LABEL_1111",
"LABEL_11110",
"LABEL_11111",
"LABEL_11112",
"LABEL_11113",
"LABEL_11114",
"LABEL_11115",
"LABEL_11116",
"LABEL_11117",
"LABEL_11118",
"LABEL_11119",
"LABEL_1112",
"LABEL_11120",
"LABEL_11121",
"LABEL_11122",
"LABEL_11123",
"LABEL_11124",
"LABEL_11125",
"LABEL_11126",
"LABEL_11127",
"LABEL_11128",
"LABEL_11129",
"LABEL_1113",
"LABEL_11130",
"LABEL_11131",
"LABEL_11132",
"LABEL_11133",
"LABEL_11134",
"LABEL_11135",
"LABEL_11136",
"LABEL_11137",
"LABEL_11138",
"LABEL_11139",
"LABEL_1114",
"LABEL_11140",
"LABEL_11141",
"LABEL_11142",
"LABEL_11143",
"LABEL_11144",
"LABEL_11145",
"LABEL_11146",
"LABEL_11147",
"LABEL_11148",
"LABEL_11149",
"LABEL_1115",
"LABEL_11150",
"LABEL_11151",
"LABEL_11152",
"LABEL_11153",
"LABEL_11154",
"LABEL_11155",
"LABEL_11156",
"LABEL_11157",
"LABEL_11158",
"LABEL_11159",
"LABEL_1116",
"LABEL_11160",
"LABEL_11161",
"LABEL_11162",
"LABEL_11163",
"LABEL_11164",
"LABEL_11165",
"LABEL_11166",
"LABEL_11167",
"LABEL_11168",
"LABEL_11169",
"LABEL_1117",
"LABEL_11170",
"LABEL_11171",
"LABEL_11172",
"LABEL_11173",
"LABEL_11174",
"LABEL_11175",
"LABEL_11176",
"LABEL_11177",
"LABEL_11178",
"LABEL_11179",
"LABEL_1118",
"LABEL_11180",
"LABEL_11181",
"LABEL_11182",
"LABEL_11183",
"LABEL_11184",
"LABEL_11185",
"LABEL_11186",
"LABEL_11187",
"LABEL_11188",
"LABEL_11189",
"LABEL_1119",
"LABEL_11190",
"LABEL_11191",
"LABEL_11192",
"LABEL_11193",
"LABEL_11194",
"LABEL_11195",
"LABEL_11196",
"LABEL_11197",
"LABEL_11198",
"LABEL_11199",
"LABEL_112",
"LABEL_1120",
"LABEL_11200",
"LABEL_11201",
"LABEL_11202",
"LABEL_11203",
"LABEL_11204",
"LABEL_11205",
"LABEL_11206",
"LABEL_11207",
"LABEL_11208",
"LABEL_11209",
"LABEL_1121",
"LABEL_11210",
"LABEL_11211",
"LABEL_11212",
"LABEL_11213",
"LABEL_11214",
"LABEL_11215",
"LABEL_11216",
"LABEL_11217",
"LABEL_11218",
"LABEL_11219",
"LABEL_1122",
"LABEL_11220",
"LABEL_11221",
"LABEL_11222",
"LABEL_11223",
"LABEL_11224",
"LABEL_11225",
"LABEL_11226",
"LABEL_11227",
"LABEL_11228",
"LABEL_11229",
"LABEL_1123",
"LABEL_11230",
"LABEL_11231",
"LABEL_11232",
"LABEL_11233",
"LABEL_11234",
"LABEL_11235",
"LABEL_11236",
"LABEL_11237",
"LABEL_11238",
"LABEL_11239",
"LABEL_1124",
"LABEL_11240",
"LABEL_11241",
"LABEL_11242",
"LABEL_11243",
"LABEL_11244",
"LABEL_11245",
"LABEL_11246",
"LABEL_11247",
"LABEL_11248",
"LABEL_11249",
"LABEL_1125",
"LABEL_11250",
"LABEL_11251",
"LABEL_11252",
"LABEL_11253",
"LABEL_11254",
"LABEL_11255",
"LABEL_11256",
"LABEL_11257",
"LABEL_11258",
"LABEL_11259",
"LABEL_1126",
"LABEL_11260",
"LABEL_11261",
"LABEL_11262",
"LABEL_11263",
"LABEL_11264",
"LABEL_11265",
"LABEL_11266",
"LABEL_11267",
"LABEL_11268",
"LABEL_11269",
"LABEL_1127",
"LABEL_11270",
"LABEL_11271",
"LABEL_11272",
"LABEL_11273",
"LABEL_11274",
"LABEL_11275",
"LABEL_11276",
"LABEL_11277",
"LABEL_11278",
"LABEL_11279",
"LABEL_1128",
"LABEL_11280",
"LABEL_11281",
"LABEL_11282",
"LABEL_11283",
"LABEL_11284",
"LABEL_11285",
"LABEL_11286",
"LABEL_11287",
"LABEL_11288",
"LABEL_11289",
"LABEL_1129",
"LABEL_11290",
"LABEL_11291",
"LABEL_11292",
"LABEL_11293",
"LABEL_11294",
"LABEL_11295",
"LABEL_11296",
"LABEL_11297",
"LABEL_11298",
"LABEL_11299",
"LABEL_113",
"LABEL_1130",
"LABEL_11300",
"LABEL_11301",
"LABEL_11302",
"LABEL_11303",
"LABEL_11304",
"LABEL_11305",
"LABEL_11306",
"LABEL_11307",
"LABEL_11308",
"LABEL_11309",
"LABEL_1131",
"LABEL_11310",
"LABEL_11311",
"LABEL_11312",
"LABEL_11313",
"LABEL_11314",
"LABEL_11315",
"LABEL_11316",
"LABEL_11317",
"LABEL_11318",
"LABEL_11319",
"LABEL_1132",
"LABEL_11320",
"LABEL_11321",
"LABEL_11322",
"LABEL_11323",
"LABEL_11324",
"LABEL_11325",
"LABEL_11326",
"LABEL_11327",
"LABEL_11328",
"LABEL_11329",
"LABEL_1133",
"LABEL_11330",
"LABEL_11331",
"LABEL_11332",
"LABEL_11333",
"LABEL_11334",
"LABEL_11335",
"LABEL_11336",
"LABEL_11337",
"LABEL_11338",
"LABEL_11339",
"LABEL_1134",
"LABEL_11340",
"LABEL_11341",
"LABEL_11342",
"LABEL_11343",
"LABEL_11344",
"LABEL_11345",
"LABEL_11346",
"LABEL_11347",
"LABEL_11348",
"LABEL_11349",
"LABEL_1135",
"LABEL_11350",
"LABEL_11351",
"LABEL_11352",
"LABEL_11353",
"LABEL_11354",
"LABEL_11355",
"LABEL_11356",
"LABEL_11357",
"LABEL_11358",
"LABEL_11359",
"LABEL_1136",
"LABEL_11360",
"LABEL_11361",
"LABEL_11362",
"LABEL_11363",
"LABEL_11364",
"LABEL_11365",
"LABEL_11366",
"LABEL_11367",
"LABEL_11368",
"LABEL_11369",
"LABEL_1137",
"LABEL_11370",
"LABEL_11371",
"LABEL_11372",
"LABEL_11373",
"LABEL_11374",
"LABEL_11375",
"LABEL_11376",
"LABEL_11377",
"LABEL_11378",
"LABEL_11379",
"LABEL_1138",
"LABEL_11380",
"LABEL_11381",
"LABEL_11382",
"LABEL_11383",
"LABEL_11384",
"LABEL_11385",
"LABEL_11386",
"LABEL_11387",
"LABEL_11388",
"LABEL_11389",
"LABEL_1139",
"LABEL_11390",
"LABEL_11391",
"LABEL_11392",
"LABEL_11393",
"LABEL_11394",
"LABEL_11395",
"LABEL_11396",
"LABEL_11397",
"LABEL_11398",
"LABEL_11399",
"LABEL_114",
"LABEL_1140",
"LABEL_11400",
"LABEL_11401",
"LABEL_11402",
"LABEL_11403",
"LABEL_11404",
"LABEL_11405",
"LABEL_11406",
"LABEL_11407",
"LABEL_11408",
"LABEL_11409",
"LABEL_1141",
"LABEL_11410",
"LABEL_11411",
"LABEL_11412",
"LABEL_11413",
"LABEL_11414",
"LABEL_11415",
"LABEL_11416",
"LABEL_11417",
"LABEL_11418",
"LABEL_11419",
"LABEL_1142",
"LABEL_11420",
"LABEL_11421",
"LABEL_11422",
"LABEL_11423",
"LABEL_11424",
"LABEL_11425",
"LABEL_11426",
"LABEL_11427",
"LABEL_11428",
"LABEL_11429",
"LABEL_1143",
"LABEL_11430",
"LABEL_11431",
"LABEL_11432",
"LABEL_11433",
"LABEL_11434",
"LABEL_11435",
"LABEL_11436",
"LABEL_11437",
"LABEL_11438",
"LABEL_11439",
"LABEL_1144",
"LABEL_11440",
"LABEL_11441",
"LABEL_11442",
"LABEL_11443",
"LABEL_11444",
"LABEL_11445",
"LABEL_11446",
"LABEL_11447",
"LABEL_11448",
"LABEL_11449",
"LABEL_1145",
"LABEL_11450",
"LABEL_11451",
"LABEL_11452",
"LABEL_11453",
"LABEL_11454",
"LABEL_11455",
"LABEL_11456",
"LABEL_11457",
"LABEL_11458",
"LABEL_11459",
"LABEL_1146",
"LABEL_11460",
"LABEL_11461",
"LABEL_11462",
"LABEL_11463",
"LABEL_11464",
"LABEL_11465",
"LABEL_11466",
"LABEL_11467",
"LABEL_11468",
"LABEL_11469",
"LABEL_1147",
"LABEL_11470",
"LABEL_11471",
"LABEL_11472",
"LABEL_11473",
"LABEL_11474",
"LABEL_11475",
"LABEL_11476",
"LABEL_11477",
"LABEL_11478",
"LABEL_11479",
"LABEL_1148",
"LABEL_11480",
"LABEL_11481",
"LABEL_11482",
"LABEL_11483",
"LABEL_11484",
"LABEL_11485",
"LABEL_11486",
"LABEL_11487",
"LABEL_11488",
"LABEL_11489",
"LABEL_1149",
"LABEL_11490",
"LABEL_11491",
"LABEL_11492",
"LABEL_11493",
"LABEL_11494",
"LABEL_11495",
"LABEL_11496",
"LABEL_11497",
"LABEL_11498",
"LABEL_11499",
"LABEL_115",
"LABEL_1150",
"LABEL_11500",
"LABEL_11501",
"LABEL_11502",
"LABEL_11503",
"LABEL_11504",
"LABEL_11505",
"LABEL_11506",
"LABEL_11507",
"LABEL_11508",
"LABEL_11509",
"LABEL_1151",
"LABEL_11510",
"LABEL_11511",
"LABEL_11512",
"LABEL_11513",
"LABEL_11514",
"LABEL_11515",
"LABEL_11516",
"LABEL_11517",
"LABEL_11518",
"LABEL_11519",
"LABEL_1152",
"LABEL_11520",
"LABEL_11521",
"LABEL_11522",
"LABEL_11523",
"LABEL_11524",
"LABEL_11525",
"LABEL_11526",
"LABEL_11527",
"LABEL_11528",
"LABEL_11529",
"LABEL_1153",
"LABEL_11530",
"LABEL_11531",
"LABEL_11532",
"LABEL_11533",
"LABEL_11534",
"LABEL_11535",
"LABEL_11536",
"LABEL_11537",
"LABEL_11538",
"LABEL_11539",
"LABEL_1154",
"LABEL_11540",
"LABEL_11541",
"LABEL_11542",
"LABEL_11543",
"LABEL_11544",
"LABEL_11545",
"LABEL_11546",
"LABEL_11547",
"LABEL_11548",
"LABEL_11549",
"LABEL_1155",
"LABEL_11550",
"LABEL_11551",
"LABEL_11552",
"LABEL_11553",
"LABEL_11554",
"LABEL_11555",
"LABEL_11556",
"LABEL_11557",
"LABEL_11558",
"LABEL_11559",
"LABEL_1156",
"LABEL_11560",
"LABEL_11561",
"LABEL_11562",
"LABEL_11563",
"LABEL_11564",
"LABEL_11565",
"LABEL_11566",
"LABEL_11567",
"LABEL_11568",
"LABEL_11569",
"LABEL_1157",
"LABEL_11570",
"LABEL_11571",
"LABEL_11572",
"LABEL_11573",
"LABEL_11574",
"LABEL_11575",
"LABEL_11576",
"LABEL_11577",
"LABEL_11578",
"LABEL_11579",
"LABEL_1158",
"LABEL_11580",
"LABEL_11581",
"LABEL_11582",
"LABEL_11583",
"LABEL_11584",
"LABEL_11585",
"LABEL_11586",
"LABEL_11587",
"LABEL_11588",
"LABEL_11589",
"LABEL_1159",
"LABEL_11590",
"LABEL_11591",
"LABEL_11592",
"LABEL_11593",
"LABEL_11594",
"LABEL_11595",
"LABEL_11596",
"LABEL_11597",
"LABEL_11598",
"LABEL_11599",
"LABEL_116",
"LABEL_1160",
"LABEL_11600",
"LABEL_11601",
"LABEL_11602",
"LABEL_11603",
"LABEL_11604",
"LABEL_11605",
"LABEL_11606",
"LABEL_11607",
"LABEL_11608",
"LABEL_11609",
"LABEL_1161",
"LABEL_11610",
"LABEL_11611",
"LABEL_11612",
"LABEL_11613",
"LABEL_11614",
"LABEL_11615",
"LABEL_11616",
"LABEL_11617",
"LABEL_11618",
"LABEL_11619",
"LABEL_1162",
"LABEL_11620",
"LABEL_11621",
"LABEL_11622",
"LABEL_11623",
"LABEL_11624",
"LABEL_11625",
"LABEL_11626",
"LABEL_11627",
"LABEL_11628",
"LABEL_11629",
"LABEL_1163",
"LABEL_11630",
"LABEL_11631",
"LABEL_11632",
"LABEL_11633",
"LABEL_11634",
"LABEL_11635",
"LABEL_11636",
"LABEL_11637",
"LABEL_11638",
"LABEL_11639",
"LABEL_1164",
"LABEL_11640",
"LABEL_11641",
"LABEL_11642",
"LABEL_11643",
"LABEL_11644",
"LABEL_11645",
"LABEL_11646",
"LABEL_11647",
"LABEL_11648",
"LABEL_11649",
"LABEL_1165",
"LABEL_11650",
"LABEL_11651",
"LABEL_11652",
"LABEL_11653",
"LABEL_11654",
"LABEL_11655",
"LABEL_11656",
"LABEL_11657",
"LABEL_11658",
"LABEL_11659",
"LABEL_1166",
"LABEL_11660",
"LABEL_11661",
"LABEL_11662",
"LABEL_11663",
"LABEL_11664",
"LABEL_11665",
"LABEL_11666",
"LABEL_11667",
"LABEL_11668",
"LABEL_11669",
"LABEL_1167",
"LABEL_11670",
"LABEL_11671",
"LABEL_11672",
"LABEL_11673",
"LABEL_11674",
"LABEL_11675",
"LABEL_11676",
"LABEL_11677",
"LABEL_11678",
"LABEL_11679",
"LABEL_1168",
"LABEL_11680",
"LABEL_11681",
"LABEL_11682",
"LABEL_11683",
"LABEL_11684",
"LABEL_11685",
"LABEL_11686",
"LABEL_11687",
"LABEL_11688",
"LABEL_11689",
"LABEL_1169",
"LABEL_11690",
"LABEL_11691",
"LABEL_11692",
"LABEL_11693",
"LABEL_11694",
"LABEL_11695",
"LABEL_11696",
"LABEL_11697",
"LABEL_11698",
"LABEL_11699",
"LABEL_117",
"LABEL_1170",
"LABEL_11700",
"LABEL_11701",
"LABEL_11702",
"LABEL_11703",
"LABEL_11704",
"LABEL_11705",
"LABEL_11706",
"LABEL_11707",
"LABEL_11708",
"LABEL_11709",
"LABEL_1171",
"LABEL_11710",
"LABEL_11711",
"LABEL_11712",
"LABEL_11713",
"LABEL_11714",
"LABEL_11715",
"LABEL_11716",
"LABEL_11717",
"LABEL_11718",
"LABEL_11719",
"LABEL_1172",
"LABEL_11720",
"LABEL_11721",
"LABEL_11722",
"LABEL_11723",
"LABEL_11724",
"LABEL_11725",
"LABEL_11726",
"LABEL_11727",
"LABEL_11728",
"LABEL_11729",
"LABEL_1173",
"LABEL_11730",
"LABEL_11731",
"LABEL_11732",
"LABEL_11733",
"LABEL_11734",
"LABEL_11735",
"LABEL_11736",
"LABEL_11737",
"LABEL_11738",
"LABEL_11739",
"LABEL_1174",
"LABEL_11740",
"LABEL_11741",
"LABEL_11742",
"LABEL_11743",
"LABEL_11744",
"LABEL_11745",
"LABEL_11746",
"LABEL_11747",
"LABEL_11748",
"LABEL_11749",
"LABEL_1175",
"LABEL_11750",
"LABEL_11751",
"LABEL_11752",
"LABEL_11753",
"LABEL_11754",
"LABEL_11755",
"LABEL_11756",
"LABEL_11757",
"LABEL_11758",
"LABEL_11759",
"LABEL_1176",
"LABEL_11760",
"LABEL_11761",
"LABEL_11762",
"LABEL_11763",
"LABEL_11764",
"LABEL_11765",
"LABEL_11766",
"LABEL_11767",
"LABEL_11768",
"LABEL_11769",
"LABEL_1177",
"LABEL_11770",
"LABEL_11771",
"LABEL_11772",
"LABEL_11773",
"LABEL_11774",
"LABEL_11775",
"LABEL_11776",
"LABEL_11777",
"LABEL_11778",
"LABEL_11779",
"LABEL_1178",
"LABEL_11780",
"LABEL_11781",
"LABEL_11782",
"LABEL_11783",
"LABEL_11784",
"LABEL_11785",
"LABEL_11786",
"LABEL_11787",
"LABEL_11788",
"LABEL_11789",
"LABEL_1179",
"LABEL_11790",
"LABEL_11791",
"LABEL_11792",
"LABEL_11793",
"LABEL_11794",
"LABEL_11795",
"LABEL_11796",
"LABEL_11797",
"LABEL_11798",
"LABEL_11799",
"LABEL_118",
"LABEL_1180",
"LABEL_11800",
"LABEL_11801",
"LABEL_11802",
"LABEL_11803",
"LABEL_11804",
"LABEL_11805",
"LABEL_11806",
"LABEL_11807",
"LABEL_11808",
"LABEL_11809",
"LABEL_1181",
"LABEL_11810",
"LABEL_11811",
"LABEL_11812",
"LABEL_11813",
"LABEL_11814",
"LABEL_11815",
"LABEL_11816",
"LABEL_11817",
"LABEL_11818",
"LABEL_11819",
"LABEL_1182",
"LABEL_11820",
"LABEL_11821",
"LABEL_11822",
"LABEL_11823",
"LABEL_11824",
"LABEL_11825",
"LABEL_11826",
"LABEL_11827",
"LABEL_11828",
"LABEL_11829",
"LABEL_1183",
"LABEL_11830",
"LABEL_11831",
"LABEL_11832",
"LABEL_11833",
"LABEL_11834",
"LABEL_11835",
"LABEL_11836",
"LABEL_11837",
"LABEL_11838",
"LABEL_11839",
"LABEL_1184",
"LABEL_11840",
"LABEL_11841",
"LABEL_11842",
"LABEL_11843",
"LABEL_11844",
"LABEL_11845",
"LABEL_11846",
"LABEL_11847",
"LABEL_11848",
"LABEL_11849",
"LABEL_1185",
"LABEL_11850",
"LABEL_11851",
"LABEL_11852",
"LABEL_11853",
"LABEL_11854",
"LABEL_11855",
"LABEL_11856",
"LABEL_11857",
"LABEL_11858",
"LABEL_11859",
"LABEL_1186",
"LABEL_11860",
"LABEL_11861",
"LABEL_11862",
"LABEL_11863",
"LABEL_11864",
"LABEL_11865",
"LABEL_11866",
"LABEL_11867",
"LABEL_11868",
"LABEL_11869",
"LABEL_1187",
"LABEL_11870",
"LABEL_11871",
"LABEL_11872",
"LABEL_11873",
"LABEL_11874",
"LABEL_11875",
"LABEL_11876",
"LABEL_11877",
"LABEL_11878",
"LABEL_11879",
"LABEL_1188",
"LABEL_11880",
"LABEL_11881",
"LABEL_11882",
"LABEL_11883",
"LABEL_11884",
"LABEL_11885",
"LABEL_11886",
"LABEL_11887",
"LABEL_11888",
"LABEL_11889",
"LABEL_1189",
"LABEL_11890",
"LABEL_11891",
"LABEL_11892",
"LABEL_11893",
"LABEL_11894",
"LABEL_11895",
"LABEL_11896",
"LABEL_11897",
"LABEL_11898",
"LABEL_11899",
"LABEL_119",
"LABEL_1190",
"LABEL_11900",
"LABEL_11901",
"LABEL_11902",
"LABEL_11903",
"LABEL_11904",
"LABEL_11905",
"LABEL_11906",
"LABEL_11907",
"LABEL_11908",
"LABEL_11909",
"LABEL_1191",
"LABEL_11910",
"LABEL_11911",
"LABEL_11912",
"LABEL_11913",
"LABEL_11914",
"LABEL_11915",
"LABEL_11916",
"LABEL_11917",
"LABEL_11918",
"LABEL_11919",
"LABEL_1192",
"LABEL_11920",
"LABEL_11921",
"LABEL_11922",
"LABEL_11923",
"LABEL_11924",
"LABEL_11925",
"LABEL_11926",
"LABEL_11927",
"LABEL_11928",
"LABEL_11929",
"LABEL_1193",
"LABEL_11930",
"LABEL_11931",
"LABEL_11932",
"LABEL_11933",
"LABEL_11934",
"LABEL_11935",
"LABEL_11936",
"LABEL_11937",
"LABEL_11938",
"LABEL_11939",
"LABEL_1194",
"LABEL_11940",
"LABEL_11941",
"LABEL_11942",
"LABEL_11943",
"LABEL_11944",
"LABEL_11945",
"LABEL_11946",
"LABEL_11947",
"LABEL_11948",
"LABEL_11949",
"LABEL_1195",
"LABEL_11950",
"LABEL_11951",
"LABEL_11952",
"LABEL_11953",
"LABEL_11954",
"LABEL_11955",
"LABEL_11956",
"LABEL_11957",
"LABEL_11958",
"LABEL_11959",
"LABEL_1196",
"LABEL_11960",
"LABEL_11961",
"LABEL_11962",
"LABEL_11963",
"LABEL_11964",
"LABEL_11965",
"LABEL_11966",
"LABEL_11967",
"LABEL_11968",
"LABEL_11969",
"LABEL_1197",
"LABEL_11970",
"LABEL_11971",
"LABEL_11972",
"LABEL_11973",
"LABEL_11974",
"LABEL_11975",
"LABEL_11976",
"LABEL_11977",
"LABEL_11978",
"LABEL_11979",
"LABEL_1198",
"LABEL_11980",
"LABEL_11981",
"LABEL_11982",
"LABEL_11983",
"LABEL_11984",
"LABEL_11985",
"LABEL_11986",
"LABEL_11987",
"LABEL_11988",
"LABEL_11989",
"LABEL_1199",
"LABEL_11990",
"LABEL_11991",
"LABEL_11992",
"LABEL_11993",
"LABEL_11994",
"LABEL_11995",
"LABEL_11996",
"LABEL_11997",
"LABEL_11998",
"LABEL_11999",
"LABEL_12",
"LABEL_120",
"LABEL_1200",
"LABEL_12000",
"LABEL_12001",
"LABEL_12002",
"LABEL_12003",
"LABEL_12004",
"LABEL_12005",
"LABEL_12006",
"LABEL_12007",
"LABEL_12008",
"LABEL_12009",
"LABEL_1201",
"LABEL_12010",
"LABEL_12011",
"LABEL_12012",
"LABEL_12013",
"LABEL_12014",
"LABEL_12015",
"LABEL_12016",
"LABEL_12017",
"LABEL_12018",
"LABEL_12019",
"LABEL_1202",
"LABEL_12020",
"LABEL_12021",
"LABEL_12022",
"LABEL_12023",
"LABEL_12024",
"LABEL_12025",
"LABEL_12026",
"LABEL_12027",
"LABEL_12028",
"LABEL_12029",
"LABEL_1203",
"LABEL_12030",
"LABEL_12031",
"LABEL_12032",
"LABEL_12033",
"LABEL_12034",
"LABEL_12035",
"LABEL_12036",
"LABEL_12037",
"LABEL_12038",
"LABEL_12039",
"LABEL_1204",
"LABEL_12040",
"LABEL_12041",
"LABEL_12042",
"LABEL_12043",
"LABEL_12044",
"LABEL_12045",
"LABEL_12046",
"LABEL_12047",
"LABEL_12048",
"LABEL_12049",
"LABEL_1205",
"LABEL_12050",
"LABEL_12051",
"LABEL_12052",
"LABEL_12053",
"LABEL_12054",
"LABEL_12055",
"LABEL_12056",
"LABEL_12057",
"LABEL_12058",
"LABEL_12059",
"LABEL_1206",
"LABEL_12060",
"LABEL_12061",
"LABEL_12062",
"LABEL_12063",
"LABEL_12064",
"LABEL_12065",
"LABEL_12066",
"LABEL_12067",
"LABEL_12068",
"LABEL_12069",
"LABEL_1207",
"LABEL_12070",
"LABEL_12071",
"LABEL_12072",
"LABEL_12073",
"LABEL_12074",
"LABEL_12075",
"LABEL_12076",
"LABEL_12077",
"LABEL_12078",
"LABEL_12079",
"LABEL_1208",
"LABEL_12080",
"LABEL_12081",
"LABEL_12082",
"LABEL_12083",
"LABEL_12084",
"LABEL_12085",
"LABEL_12086",
"LABEL_12087",
"LABEL_12088",
"LABEL_12089",
"LABEL_1209",
"LABEL_12090",
"LABEL_12091",
"LABEL_12092",
"LABEL_12093",
"LABEL_12094",
"LABEL_12095",
"LABEL_12096",
"LABEL_12097",
"LABEL_12098",
"LABEL_12099",
"LABEL_121",
"LABEL_1210",
"LABEL_12100",
"LABEL_12101",
"LABEL_12102",
"LABEL_12103",
"LABEL_12104",
"LABEL_12105",
"LABEL_12106",
"LABEL_12107",
"LABEL_12108",
"LABEL_12109",
"LABEL_1211",
"LABEL_12110",
"LABEL_12111",
"LABEL_12112",
"LABEL_12113",
"LABEL_12114",
"LABEL_12115",
"LABEL_12116",
"LABEL_12117",
"LABEL_12118",
"LABEL_12119",
"LABEL_1212",
"LABEL_12120",
"LABEL_12121",
"LABEL_12122",
"LABEL_12123",
"LABEL_12124",
"LABEL_12125",
"LABEL_12126",
"LABEL_12127",
"LABEL_12128",
"LABEL_12129",
"LABEL_1213",
"LABEL_12130",
"LABEL_12131",
"LABEL_12132",
"LABEL_12133",
"LABEL_12134",
"LABEL_12135",
"LABEL_12136",
"LABEL_12137",
"LABEL_12138",
"LABEL_12139",
"LABEL_1214",
"LABEL_12140",
"LABEL_12141",
"LABEL_12142",
"LABEL_12143",
"LABEL_12144",
"LABEL_12145",
"LABEL_12146",
"LABEL_12147",
"LABEL_12148",
"LABEL_12149",
"LABEL_1215",
"LABEL_12150",
"LABEL_12151",
"LABEL_12152",
"LABEL_12153",
"LABEL_12154",
"LABEL_12155",
"LABEL_12156",
"LABEL_12157",
"LABEL_12158",
"LABEL_12159",
"LABEL_1216",
"LABEL_12160",
"LABEL_12161",
"LABEL_12162",
"LABEL_12163",
"LABEL_12164",
"LABEL_12165",
"LABEL_12166",
"LABEL_12167",
"LABEL_12168",
"LABEL_12169",
"LABEL_1217",
"LABEL_12170",
"LABEL_12171",
"LABEL_12172",
"LABEL_12173",
"LABEL_12174",
"LABEL_12175",
"LABEL_12176",
"LABEL_12177",
"LABEL_12178",
"LABEL_12179",
"LABEL_1218",
"LABEL_12180",
"LABEL_12181",
"LABEL_12182",
"LABEL_12183",
"LABEL_12184",
"LABEL_12185",
"LABEL_12186",
"LABEL_12187",
"LABEL_12188",
"LABEL_12189",
"LABEL_1219",
"LABEL_12190",
"LABEL_12191",
"LABEL_12192",
"LABEL_12193",
"LABEL_12194",
"LABEL_12195",
"LABEL_12196",
"LABEL_12197",
"LABEL_12198",
"LABEL_12199",
"LABEL_122",
"LABEL_1220",
"LABEL_12200",
"LABEL_12201",
"LABEL_12202",
"LABEL_12203",
"LABEL_12204",
"LABEL_12205",
"LABEL_12206",
"LABEL_12207",
"LABEL_12208",
"LABEL_12209",
"LABEL_1221",
"LABEL_12210",
"LABEL_12211",
"LABEL_12212",
"LABEL_12213",
"LABEL_12214",
"LABEL_12215",
"LABEL_12216",
"LABEL_12217",
"LABEL_12218",
"LABEL_12219",
"LABEL_1222",
"LABEL_12220",
"LABEL_12221",
"LABEL_12222",
"LABEL_12223",
"LABEL_12224",
"LABEL_12225",
"LABEL_12226",
"LABEL_12227",
"LABEL_12228",
"LABEL_12229",
"LABEL_1223",
"LABEL_12230",
"LABEL_12231",
"LABEL_12232",
"LABEL_12233",
"LABEL_12234",
"LABEL_12235",
"LABEL_12236",
"LABEL_12237",
"LABEL_12238",
"LABEL_12239",
"LABEL_1224",
"LABEL_12240",
"LABEL_12241",
"LABEL_12242",
"LABEL_12243",
"LABEL_12244",
"LABEL_12245",
"LABEL_12246",
"LABEL_12247",
"LABEL_12248",
"LABEL_12249",
"LABEL_1225",
"LABEL_12250",
"LABEL_12251",
"LABEL_12252",
"LABEL_12253",
"LABEL_12254",
"LABEL_12255",
"LABEL_12256",
"LABEL_12257",
"LABEL_12258",
"LABEL_12259",
"LABEL_1226",
"LABEL_12260",
"LABEL_12261",
"LABEL_12262",
"LABEL_12263",
"LABEL_12264",
"LABEL_12265",
"LABEL_12266",
"LABEL_12267",
"LABEL_12268",
"LABEL_12269",
"LABEL_1227",
"LABEL_12270",
"LABEL_12271",
"LABEL_12272",
"LABEL_12273",
"LABEL_12274",
"LABEL_12275",
"LABEL_12276",
"LABEL_12277",
"LABEL_12278",
"LABEL_12279",
"LABEL_1228",
"LABEL_12280",
"LABEL_12281",
"LABEL_12282",
"LABEL_12283",
"LABEL_12284",
"LABEL_12285",
"LABEL_12286",
"LABEL_12287",
"LABEL_12288",
"LABEL_12289",
"LABEL_1229",
"LABEL_12290",
"LABEL_12291",
"LABEL_12292",
"LABEL_12293",
"LABEL_12294",
"LABEL_12295",
"LABEL_12296",
"LABEL_12297",
"LABEL_12298",
"LABEL_12299",
"LABEL_123",
"LABEL_1230",
"LABEL_12300",
"LABEL_12301",
"LABEL_12302",
"LABEL_12303",
"LABEL_12304",
"LABEL_12305",
"LABEL_12306",
"LABEL_12307",
"LABEL_12308",
"LABEL_12309",
"LABEL_1231",
"LABEL_12310",
"LABEL_12311",
"LABEL_12312",
"LABEL_12313",
"LABEL_12314",
"LABEL_12315",
"LABEL_12316",
"LABEL_12317",
"LABEL_12318",
"LABEL_12319",
"LABEL_1232",
"LABEL_12320",
"LABEL_12321",
"LABEL_12322",
"LABEL_12323",
"LABEL_12324",
"LABEL_12325",
"LABEL_12326",
"LABEL_12327",
"LABEL_12328",
"LABEL_12329",
"LABEL_1233",
"LABEL_12330",
"LABEL_12331",
"LABEL_12332",
"LABEL_12333",
"LABEL_12334",
"LABEL_12335",
"LABEL_12336",
"LABEL_12337",
"LABEL_12338",
"LABEL_12339",
"LABEL_1234",
"LABEL_12340",
"LABEL_12341",
"LABEL_12342",
"LABEL_12343",
"LABEL_12344",
"LABEL_12345",
"LABEL_12346",
"LABEL_12347",
"LABEL_12348",
"LABEL_12349",
"LABEL_1235",
"LABEL_12350",
"LABEL_12351",
"LABEL_12352",
"LABEL_12353",
"LABEL_12354",
"LABEL_12355",
"LABEL_12356",
"LABEL_12357",
"LABEL_12358",
"LABEL_12359",
"LABEL_1236",
"LABEL_12360",
"LABEL_12361",
"LABEL_12362",
"LABEL_12363",
"LABEL_12364",
"LABEL_12365",
"LABEL_12366",
"LABEL_12367",
"LABEL_12368",
"LABEL_12369",
"LABEL_1237",
"LABEL_12370",
"LABEL_12371",
"LABEL_12372",
"LABEL_12373",
"LABEL_12374",
"LABEL_12375",
"LABEL_12376",
"LABEL_12377",
"LABEL_12378",
"LABEL_12379",
"LABEL_1238",
"LABEL_12380",
"LABEL_12381",
"LABEL_12382",
"LABEL_12383",
"LABEL_12384",
"LABEL_12385",
"LABEL_12386",
"LABEL_12387",
"LABEL_12388",
"LABEL_12389",
"LABEL_1239",
"LABEL_12390",
"LABEL_12391",
"LABEL_12392",
"LABEL_12393",
"LABEL_12394",
"LABEL_12395",
"LABEL_12396",
"LABEL_12397",
"LABEL_12398",
"LABEL_12399",
"LABEL_124",
"LABEL_1240",
"LABEL_12400",
"LABEL_12401",
"LABEL_12402",
"LABEL_12403",
"LABEL_12404",
"LABEL_12405",
"LABEL_12406",
"LABEL_12407",
"LABEL_12408",
"LABEL_12409",
"LABEL_1241",
"LABEL_12410",
"LABEL_12411",
"LABEL_12412",
"LABEL_12413",
"LABEL_12414",
"LABEL_12415",
"LABEL_12416",
"LABEL_12417",
"LABEL_12418",
"LABEL_12419",
"LABEL_1242",
"LABEL_12420",
"LABEL_12421",
"LABEL_12422",
"LABEL_12423",
"LABEL_12424",
"LABEL_12425",
"LABEL_12426",
"LABEL_12427",
"LABEL_12428",
"LABEL_12429",
"LABEL_1243",
"LABEL_12430",
"LABEL_12431",
"LABEL_12432",
"LABEL_12433",
"LABEL_12434",
"LABEL_12435",
"LABEL_12436",
"LABEL_12437",
"LABEL_12438",
"LABEL_12439",
"LABEL_1244",
"LABEL_12440",
"LABEL_12441",
"LABEL_12442",
"LABEL_12443",
"LABEL_12444",
"LABEL_12445",
"LABEL_12446",
"LABEL_12447",
"LABEL_12448",
"LABEL_12449",
"LABEL_1245",
"LABEL_12450",
"LABEL_12451",
"LABEL_12452",
"LABEL_12453",
"LABEL_12454",
"LABEL_12455",
"LABEL_12456",
"LABEL_12457",
"LABEL_12458",
"LABEL_12459",
"LABEL_1246",
"LABEL_12460",
"LABEL_12461",
"LABEL_12462",
"LABEL_12463",
"LABEL_12464",
"LABEL_12465",
"LABEL_12466",
"LABEL_12467",
"LABEL_12468",
"LABEL_12469",
"LABEL_1247",
"LABEL_12470",
"LABEL_12471",
"LABEL_12472",
"LABEL_12473",
"LABEL_12474",
"LABEL_12475",
"LABEL_12476",
"LABEL_12477",
"LABEL_12478",
"LABEL_12479",
"LABEL_1248",
"LABEL_12480",
"LABEL_12481",
"LABEL_12482",
"LABEL_12483",
"LABEL_12484",
"LABEL_12485",
"LABEL_12486",
"LABEL_12487",
"LABEL_12488",
"LABEL_12489",
"LABEL_1249",
"LABEL_12490",
"LABEL_12491",
"LABEL_12492",
"LABEL_12493",
"LABEL_12494",
"LABEL_12495",
"LABEL_12496",
"LABEL_12497",
"LABEL_12498",
"LABEL_12499",
"LABEL_125",
"LABEL_1250",
"LABEL_12500",
"LABEL_12501",
"LABEL_12502",
"LABEL_12503",
"LABEL_12504",
"LABEL_12505",
"LABEL_12506",
"LABEL_12507",
"LABEL_12508",
"LABEL_12509",
"LABEL_1251",
"LABEL_12510",
"LABEL_12511",
"LABEL_12512",
"LABEL_12513",
"LABEL_12514",
"LABEL_12515",
"LABEL_12516",
"LABEL_12517",
"LABEL_12518",
"LABEL_12519",
"LABEL_1252",
"LABEL_12520",
"LABEL_12521",
"LABEL_12522",
"LABEL_12523",
"LABEL_12524",
"LABEL_12525",
"LABEL_12526",
"LABEL_12527",
"LABEL_12528",
"LABEL_12529",
"LABEL_1253",
"LABEL_12530",
"LABEL_12531",
"LABEL_12532",
"LABEL_12533",
"LABEL_12534",
"LABEL_12535",
"LABEL_12536",
"LABEL_12537",
"LABEL_12538",
"LABEL_12539",
"LABEL_1254",
"LABEL_12540",
"LABEL_12541",
"LABEL_12542",
"LABEL_12543",
"LABEL_12544",
"LABEL_12545",
"LABEL_12546",
"LABEL_12547",
"LABEL_12548",
"LABEL_12549",
"LABEL_1255",
"LABEL_12550",
"LABEL_12551",
"LABEL_12552",
"LABEL_12553",
"LABEL_12554",
"LABEL_12555",
"LABEL_12556",
"LABEL_12557",
"LABEL_12558",
"LABEL_12559",
"LABEL_1256",
"LABEL_12560",
"LABEL_12561",
"LABEL_12562",
"LABEL_12563",
"LABEL_12564",
"LABEL_12565",
"LABEL_12566",
"LABEL_12567",
"LABEL_12568",
"LABEL_12569",
"LABEL_1257",
"LABEL_12570",
"LABEL_12571",
"LABEL_12572",
"LABEL_12573",
"LABEL_12574",
"LABEL_12575",
"LABEL_12576",
"LABEL_12577",
"LABEL_12578",
"LABEL_12579",
"LABEL_1258",
"LABEL_12580",
"LABEL_12581",
"LABEL_12582",
"LABEL_12583",
"LABEL_12584",
"LABEL_12585",
"LABEL_12586",
"LABEL_12587",
"LABEL_12588",
"LABEL_12589",
"LABEL_1259",
"LABEL_12590",
"LABEL_12591",
"LABEL_12592",
"LABEL_12593",
"LABEL_12594",
"LABEL_12595",
"LABEL_12596",
"LABEL_12597",
"LABEL_12598",
"LABEL_12599",
"LABEL_126",
"LABEL_1260",
"LABEL_12600",
"LABEL_12601",
"LABEL_12602",
"LABEL_12603",
"LABEL_12604",
"LABEL_12605",
"LABEL_12606",
"LABEL_12607",
"LABEL_12608",
"LABEL_12609",
"LABEL_1261",
"LABEL_12610",
"LABEL_12611",
"LABEL_12612",
"LABEL_12613",
"LABEL_12614",
"LABEL_12615",
"LABEL_12616",
"LABEL_12617",
"LABEL_12618",
"LABEL_12619",
"LABEL_1262",
"LABEL_12620",
"LABEL_12621",
"LABEL_12622",
"LABEL_12623",
"LABEL_12624",
"LABEL_12625",
"LABEL_12626",
"LABEL_12627",
"LABEL_12628",
"LABEL_12629",
"LABEL_1263",
"LABEL_12630",
"LABEL_12631",
"LABEL_12632",
"LABEL_12633",
"LABEL_12634",
"LABEL_12635",
"LABEL_12636",
"LABEL_12637",
"LABEL_12638",
"LABEL_12639",
"LABEL_1264",
"LABEL_12640",
"LABEL_12641",
"LABEL_12642",
"LABEL_12643",
"LABEL_12644",
"LABEL_12645",
"LABEL_12646",
"LABEL_12647",
"LABEL_12648",
"LABEL_12649",
"LABEL_1265",
"LABEL_12650",
"LABEL_12651",
"LABEL_12652",
"LABEL_12653",
"LABEL_12654",
"LABEL_12655",
"LABEL_12656",
"LABEL_12657",
"LABEL_12658",
"LABEL_12659",
"LABEL_1266",
"LABEL_12660",
"LABEL_12661",
"LABEL_12662",
"LABEL_12663",
"LABEL_12664",
"LABEL_12665",
"LABEL_12666",
"LABEL_12667",
"LABEL_12668",
"LABEL_12669",
"LABEL_1267",
"LABEL_12670",
"LABEL_12671",
"LABEL_12672",
"LABEL_12673",
"LABEL_12674",
"LABEL_12675",
"LABEL_12676",
"LABEL_12677",
"LABEL_12678",
"LABEL_12679",
"LABEL_1268",
"LABEL_12680",
"LABEL_12681",
"LABEL_12682",
"LABEL_12683",
"LABEL_12684",
"LABEL_12685",
"LABEL_12686",
"LABEL_12687",
"LABEL_12688",
"LABEL_12689",
"LABEL_1269",
"LABEL_12690",
"LABEL_12691",
"LABEL_12692",
"LABEL_12693",
"LABEL_12694",
"LABEL_12695",
"LABEL_12696",
"LABEL_12697",
"LABEL_12698",
"LABEL_12699",
"LABEL_127",
"LABEL_1270",
"LABEL_12700",
"LABEL_12701",
"LABEL_12702",
"LABEL_12703",
"LABEL_12704",
"LABEL_12705",
"LABEL_12706",
"LABEL_12707",
"LABEL_12708",
"LABEL_12709",
"LABEL_1271",
"LABEL_12710",
"LABEL_12711",
"LABEL_12712",
"LABEL_12713",
"LABEL_12714",
"LABEL_12715",
"LABEL_12716",
"LABEL_12717",
"LABEL_12718",
"LABEL_12719",
"LABEL_1272",
"LABEL_12720",
"LABEL_12721",
"LABEL_12722",
"LABEL_12723",
"LABEL_12724",
"LABEL_12725",
"LABEL_12726",
"LABEL_12727",
"LABEL_12728",
"LABEL_12729",
"LABEL_1273",
"LABEL_12730",
"LABEL_12731",
"LABEL_12732",
"LABEL_12733",
"LABEL_12734",
"LABEL_12735",
"LABEL_12736",
"LABEL_12737",
"LABEL_12738",
"LABEL_12739",
"LABEL_1274",
"LABEL_12740",
"LABEL_12741",
"LABEL_12742",
"LABEL_12743",
"LABEL_12744",
"LABEL_12745",
"LABEL_12746",
"LABEL_12747",
"LABEL_12748",
"LABEL_12749",
"LABEL_1275",
"LABEL_12750",
"LABEL_12751",
"LABEL_12752",
"LABEL_12753",
"LABEL_12754",
"LABEL_12755",
"LABEL_12756",
"LABEL_12757",
"LABEL_12758",
"LABEL_12759",
"LABEL_1276",
"LABEL_12760",
"LABEL_12761",
"LABEL_12762",
"LABEL_12763",
"LABEL_12764",
"LABEL_12765",
"LABEL_12766",
"LABEL_12767",
"LABEL_12768",
"LABEL_12769",
"LABEL_1277",
"LABEL_12770",
"LABEL_12771",
"LABEL_12772",
"LABEL_12773",
"LABEL_12774",
"LABEL_12775",
"LABEL_12776",
"LABEL_12777",
"LABEL_12778",
"LABEL_12779",
"LABEL_1278",
"LABEL_12780",
"LABEL_12781",
"LABEL_12782",
"LABEL_12783",
"LABEL_12784",
"LABEL_12785",
"LABEL_12786",
"LABEL_12787",
"LABEL_12788",
"LABEL_12789",
"LABEL_1279",
"LABEL_12790",
"LABEL_12791",
"LABEL_12792",
"LABEL_12793",
"LABEL_12794",
"LABEL_12795",
"LABEL_12796",
"LABEL_12797",
"LABEL_12798",
"LABEL_12799",
"LABEL_128",
"LABEL_1280",
"LABEL_12800",
"LABEL_12801",
"LABEL_12802",
"LABEL_12803",
"LABEL_12804",
"LABEL_12805",
"LABEL_12806",
"LABEL_12807",
"LABEL_12808",
"LABEL_12809",
"LABEL_1281",
"LABEL_12810",
"LABEL_12811",
"LABEL_12812",
"LABEL_12813",
"LABEL_12814",
"LABEL_12815",
"LABEL_12816",
"LABEL_12817",
"LABEL_12818",
"LABEL_12819",
"LABEL_1282",
"LABEL_12820",
"LABEL_12821",
"LABEL_12822",
"LABEL_12823",
"LABEL_12824",
"LABEL_12825",
"LABEL_12826",
"LABEL_12827",
"LABEL_12828",
"LABEL_12829",
"LABEL_1283",
"LABEL_12830",
"LABEL_12831",
"LABEL_12832",
"LABEL_12833",
"LABEL_12834",
"LABEL_12835",
"LABEL_12836",
"LABEL_12837",
"LABEL_12838",
"LABEL_12839",
"LABEL_1284",
"LABEL_12840",
"LABEL_12841",
"LABEL_12842",
"LABEL_12843",
"LABEL_12844",
"LABEL_12845",
"LABEL_12846",
"LABEL_12847",
"LABEL_12848",
"LABEL_12849",
"LABEL_1285",
"LABEL_12850",
"LABEL_12851",
"LABEL_12852",
"LABEL_12853",
"LABEL_12854",
"LABEL_12855",
"LABEL_12856",
"LABEL_12857",
"LABEL_12858",
"LABEL_12859",
"LABEL_1286",
"LABEL_12860",
"LABEL_12861",
"LABEL_12862",
"LABEL_12863",
"LABEL_12864",
"LABEL_12865",
"LABEL_12866",
"LABEL_12867",
"LABEL_12868",
"LABEL_12869",
"LABEL_1287",
"LABEL_12870",
"LABEL_12871",
"LABEL_12872",
"LABEL_12873",
"LABEL_12874",
"LABEL_12875",
"LABEL_12876",
"LABEL_12877",
"LABEL_12878",
"LABEL_12879",
"LABEL_1288",
"LABEL_12880",
"LABEL_12881",
"LABEL_12882",
"LABEL_12883",
"LABEL_12884",
"LABEL_12885",
"LABEL_12886",
"LABEL_12887",
"LABEL_12888",
"LABEL_12889",
"LABEL_1289",
"LABEL_12890",
"LABEL_12891",
"LABEL_12892",
"LABEL_12893",
"LABEL_12894",
"LABEL_12895",
"LABEL_12896",
"LABEL_12897",
"LABEL_12898",
"LABEL_12899",
"LABEL_129",
"LABEL_1290",
"LABEL_12900",
"LABEL_12901",
"LABEL_12902",
"LABEL_12903",
"LABEL_12904",
"LABEL_12905",
"LABEL_12906",
"LABEL_12907",
"LABEL_12908",
"LABEL_12909",
"LABEL_1291",
"LABEL_12910",
"LABEL_12911",
"LABEL_12912",
"LABEL_12913",
"LABEL_12914",
"LABEL_12915",
"LABEL_12916",
"LABEL_12917",
"LABEL_12918",
"LABEL_12919",
"LABEL_1292",
"LABEL_12920",
"LABEL_12921",
"LABEL_12922",
"LABEL_12923",
"LABEL_12924",
"LABEL_12925",
"LABEL_12926",
"LABEL_12927",
"LABEL_12928",
"LABEL_12929",
"LABEL_1293",
"LABEL_12930",
"LABEL_12931",
"LABEL_12932",
"LABEL_12933",
"LABEL_12934",
"LABEL_12935",
"LABEL_12936",
"LABEL_12937",
"LABEL_12938",
"LABEL_12939",
"LABEL_1294",
"LABEL_12940",
"LABEL_12941",
"LABEL_12942",
"LABEL_12943",
"LABEL_12944",
"LABEL_12945",
"LABEL_12946",
"LABEL_12947",
"LABEL_12948",
"LABEL_12949",
"LABEL_1295",
"LABEL_12950",
"LABEL_12951",
"LABEL_12952",
"LABEL_12953",
"LABEL_12954",
"LABEL_12955",
"LABEL_12956",
"LABEL_12957",
"LABEL_12958",
"LABEL_12959",
"LABEL_1296",
"LABEL_12960",
"LABEL_12961",
"LABEL_12962",
"LABEL_12963",
"LABEL_12964",
"LABEL_12965",
"LABEL_12966",
"LABEL_12967",
"LABEL_12968",
"LABEL_12969",
"LABEL_1297",
"LABEL_12970",
"LABEL_12971",
"LABEL_12972",
"LABEL_12973",
"LABEL_12974",
"LABEL_12975",
"LABEL_12976",
"LABEL_12977",
"LABEL_12978",
"LABEL_12979",
"LABEL_1298",
"LABEL_12980",
"LABEL_12981",
"LABEL_12982",
"LABEL_12983",
"LABEL_12984",
"LABEL_12985",
"LABEL_12986",
"LABEL_12987",
"LABEL_12988",
"LABEL_12989",
"LABEL_1299",
"LABEL_12990",
"LABEL_12991",
"LABEL_12992",
"LABEL_12993",
"LABEL_12994",
"LABEL_12995",
"LABEL_12996",
"LABEL_12997",
"LABEL_12998",
"LABEL_12999",
"LABEL_13",
"LABEL_130",
"LABEL_1300",
"LABEL_13000",
"LABEL_13001",
"LABEL_13002",
"LABEL_13003",
"LABEL_13004",
"LABEL_13005",
"LABEL_13006",
"LABEL_13007",
"LABEL_13008",
"LABEL_13009",
"LABEL_1301",
"LABEL_13010",
"LABEL_13011",
"LABEL_13012",
"LABEL_13013",
"LABEL_13014",
"LABEL_13015",
"LABEL_13016",
"LABEL_13017",
"LABEL_13018",
"LABEL_13019",
"LABEL_1302",
"LABEL_13020",
"LABEL_13021",
"LABEL_13022",
"LABEL_13023",
"LABEL_13024",
"LABEL_13025",
"LABEL_13026",
"LABEL_13027",
"LABEL_13028",
"LABEL_13029",
"LABEL_1303",
"LABEL_13030",
"LABEL_13031",
"LABEL_13032",
"LABEL_13033",
"LABEL_13034",
"LABEL_13035",
"LABEL_13036",
"LABEL_13037",
"LABEL_13038",
"LABEL_13039",
"LABEL_1304",
"LABEL_13040",
"LABEL_13041",
"LABEL_13042",
"LABEL_13043",
"LABEL_13044",
"LABEL_13045",
"LABEL_13046",
"LABEL_13047",
"LABEL_13048",
"LABEL_13049",
"LABEL_1305",
"LABEL_13050",
"LABEL_13051",
"LABEL_13052",
"LABEL_13053",
"LABEL_13054",
"LABEL_13055",
"LABEL_13056",
"LABEL_13057",
"LABEL_13058",
"LABEL_13059",
"LABEL_1306",
"LABEL_13060",
"LABEL_13061",
"LABEL_13062",
"LABEL_13063",
"LABEL_13064",
"LABEL_13065",
"LABEL_13066",
"LABEL_13067",
"LABEL_13068",
"LABEL_13069",
"LABEL_1307",
"LABEL_13070",
"LABEL_13071",
"LABEL_13072",
"LABEL_13073",
"LABEL_13074",
"LABEL_13075",
"LABEL_13076",
"LABEL_13077",
"LABEL_13078",
"LABEL_13079",
"LABEL_1308",
"LABEL_13080",
"LABEL_13081",
"LABEL_13082",
"LABEL_13083",
"LABEL_13084",
"LABEL_13085",
"LABEL_13086",
"LABEL_13087",
"LABEL_13088",
"LABEL_13089",
"LABEL_1309",
"LABEL_13090",
"LABEL_13091",
"LABEL_13092",
"LABEL_13093",
"LABEL_13094",
"LABEL_13095",
"LABEL_13096",
"LABEL_13097",
"LABEL_13098",
"LABEL_13099",
"LABEL_131",
"LABEL_1310",
"LABEL_13100",
"LABEL_13101",
"LABEL_13102",
"LABEL_13103",
"LABEL_13104",
"LABEL_13105",
"LABEL_13106",
"LABEL_13107",
"LABEL_13108",
"LABEL_13109",
"LABEL_1311",
"LABEL_13110",
"LABEL_13111",
"LABEL_13112",
"LABEL_13113",
"LABEL_13114",
"LABEL_13115",
"LABEL_13116",
"LABEL_13117",
"LABEL_13118",
"LABEL_13119",
"LABEL_1312",
"LABEL_13120",
"LABEL_13121",
"LABEL_13122",
"LABEL_13123",
"LABEL_13124",
"LABEL_13125",
"LABEL_13126",
"LABEL_13127",
"LABEL_13128",
"LABEL_13129",
"LABEL_1313",
"LABEL_13130",
"LABEL_13131",
"LABEL_13132",
"LABEL_13133",
"LABEL_13134",
"LABEL_13135",
"LABEL_13136",
"LABEL_13137",
"LABEL_13138",
"LABEL_13139",
"LABEL_1314",
"LABEL_13140",
"LABEL_13141",
"LABEL_13142",
"LABEL_13143",
"LABEL_13144",
"LABEL_13145",
"LABEL_13146",
"LABEL_13147",
"LABEL_13148",
"LABEL_13149",
"LABEL_1315",
"LABEL_13150",
"LABEL_13151",
"LABEL_13152",
"LABEL_13153",
"LABEL_13154",
"LABEL_13155",
"LABEL_13156",
"LABEL_13157",
"LABEL_13158",
"LABEL_13159",
"LABEL_1316",
"LABEL_13160",
"LABEL_13161",
"LABEL_13162",
"LABEL_13163",
"LABEL_13164",
"LABEL_13165",
"LABEL_13166",
"LABEL_13167",
"LABEL_13168",
"LABEL_13169",
"LABEL_1317",
"LABEL_13170",
"LABEL_13171",
"LABEL_13172",
"LABEL_13173",
"LABEL_13174",
"LABEL_13175",
"LABEL_13176",
"LABEL_13177",
"LABEL_13178",
"LABEL_13179",
"LABEL_1318",
"LABEL_13180",
"LABEL_13181",
"LABEL_13182",
"LABEL_13183",
"LABEL_13184",
"LABEL_13185",
"LABEL_13186",
"LABEL_13187",
"LABEL_13188",
"LABEL_13189",
"LABEL_1319",
"LABEL_13190",
"LABEL_13191",
"LABEL_13192",
"LABEL_13193",
"LABEL_13194",
"LABEL_13195",
"LABEL_13196",
"LABEL_13197",
"LABEL_13198",
"LABEL_13199",
"LABEL_132",
"LABEL_1320",
"LABEL_13200",
"LABEL_13201",
"LABEL_13202",
"LABEL_13203",
"LABEL_13204",
"LABEL_13205",
"LABEL_13206",
"LABEL_13207",
"LABEL_13208",
"LABEL_13209",
"LABEL_1321",
"LABEL_13210",
"LABEL_13211",
"LABEL_13212",
"LABEL_13213",
"LABEL_13214",
"LABEL_13215",
"LABEL_13216",
"LABEL_13217",
"LABEL_13218",
"LABEL_13219",
"LABEL_1322",
"LABEL_13220",
"LABEL_13221",
"LABEL_13222",
"LABEL_13223",
"LABEL_13224",
"LABEL_13225",
"LABEL_13226",
"LABEL_13227",
"LABEL_13228",
"LABEL_13229",
"LABEL_1323",
"LABEL_13230",
"LABEL_13231",
"LABEL_13232",
"LABEL_13233",
"LABEL_13234",
"LABEL_13235",
"LABEL_13236",
"LABEL_13237",
"LABEL_13238",
"LABEL_13239",
"LABEL_1324",
"LABEL_13240",
"LABEL_13241",
"LABEL_13242",
"LABEL_13243",
"LABEL_13244",
"LABEL_13245",
"LABEL_13246",
"LABEL_13247",
"LABEL_13248",
"LABEL_13249",
"LABEL_1325",
"LABEL_13250",
"LABEL_13251",
"LABEL_13252",
"LABEL_13253",
"LABEL_13254",
"LABEL_13255",
"LABEL_13256",
"LABEL_13257",
"LABEL_13258",
"LABEL_13259",
"LABEL_1326",
"LABEL_13260",
"LABEL_13261",
"LABEL_13262",
"LABEL_13263",
"LABEL_13264",
"LABEL_13265",
"LABEL_13266",
"LABEL_13267",
"LABEL_13268",
"LABEL_13269",
"LABEL_1327",
"LABEL_13270",
"LABEL_13271",
"LABEL_13272",
"LABEL_13273",
"LABEL_13274",
"LABEL_13275",
"LABEL_13276",
"LABEL_13277",
"LABEL_13278",
"LABEL_13279",
"LABEL_1328",
"LABEL_13280",
"LABEL_13281",
"LABEL_13282",
"LABEL_13283",
"LABEL_13284",
"LABEL_13285",
"LABEL_13286",
"LABEL_13287",
"LABEL_13288",
"LABEL_13289",
"LABEL_1329",
"LABEL_13290",
"LABEL_13291",
"LABEL_13292",
"LABEL_13293",
"LABEL_13294",
"LABEL_13295",
"LABEL_13296",
"LABEL_13297",
"LABEL_13298",
"LABEL_13299",
"LABEL_133",
"LABEL_1330",
"LABEL_13300",
"LABEL_13301",
"LABEL_13302",
"LABEL_13303",
"LABEL_13304",
"LABEL_13305",
"LABEL_13306",
"LABEL_13307",
"LABEL_13308",
"LABEL_13309",
"LABEL_1331",
"LABEL_13310",
"LABEL_13311",
"LABEL_13312",
"LABEL_13313",
"LABEL_13314",
"LABEL_13315",
"LABEL_13316",
"LABEL_13317",
"LABEL_13318",
"LABEL_13319",
"LABEL_1332",
"LABEL_13320",
"LABEL_13321",
"LABEL_13322",
"LABEL_13323",
"LABEL_13324",
"LABEL_13325",
"LABEL_13326",
"LABEL_13327",
"LABEL_13328",
"LABEL_13329",
"LABEL_1333",
"LABEL_13330",
"LABEL_13331",
"LABEL_13332",
"LABEL_13333",
"LABEL_13334",
"LABEL_13335",
"LABEL_13336",
"LABEL_13337",
"LABEL_13338",
"LABEL_13339",
"LABEL_1334",
"LABEL_13340",
"LABEL_13341",
"LABEL_13342",
"LABEL_13343",
"LABEL_13344",
"LABEL_13345",
"LABEL_13346",
"LABEL_13347",
"LABEL_13348",
"LABEL_13349",
"LABEL_1335",
"LABEL_13350",
"LABEL_13351",
"LABEL_13352",
"LABEL_13353",
"LABEL_13354",
"LABEL_13355",
"LABEL_13356",
"LABEL_13357",
"LABEL_13358",
"LABEL_13359",
"LABEL_1336",
"LABEL_13360",
"LABEL_13361",
"LABEL_13362",
"LABEL_13363",
"LABEL_13364",
"LABEL_13365",
"LABEL_13366",
"LABEL_13367",
"LABEL_13368",
"LABEL_13369",
"LABEL_1337",
"LABEL_13370",
"LABEL_13371",
"LABEL_13372",
"LABEL_13373",
"LABEL_13374",
"LABEL_13375",
"LABEL_13376",
"LABEL_13377",
"LABEL_13378",
"LABEL_13379",
"LABEL_1338",
"LABEL_13380",
"LABEL_13381",
"LABEL_13382",
"LABEL_13383",
"LABEL_13384",
"LABEL_13385",
"LABEL_13386",
"LABEL_13387",
"LABEL_13388",
"LABEL_13389",
"LABEL_1339",
"LABEL_13390",
"LABEL_13391",
"LABEL_13392",
"LABEL_13393",
"LABEL_13394",
"LABEL_13395",
"LABEL_13396",
"LABEL_13397",
"LABEL_13398",
"LABEL_13399",
"LABEL_134",
"LABEL_1340",
"LABEL_13400",
"LABEL_13401",
"LABEL_13402",
"LABEL_13403",
"LABEL_13404",
"LABEL_13405",
"LABEL_13406",
"LABEL_13407",
"LABEL_13408",
"LABEL_13409",
"LABEL_1341",
"LABEL_13410",
"LABEL_13411",
"LABEL_13412",
"LABEL_13413",
"LABEL_13414",
"LABEL_13415",
"LABEL_13416",
"LABEL_13417",
"LABEL_13418",
"LABEL_13419",
"LABEL_1342",
"LABEL_13420",
"LABEL_13421",
"LABEL_13422",
"LABEL_13423",
"LABEL_13424",
"LABEL_13425",
"LABEL_13426",
"LABEL_13427",
"LABEL_13428",
"LABEL_13429",
"LABEL_1343",
"LABEL_13430",
"LABEL_13431",
"LABEL_13432",
"LABEL_13433",
"LABEL_13434",
"LABEL_13435",
"LABEL_13436",
"LABEL_13437",
"LABEL_13438",
"LABEL_13439",
"LABEL_1344",
"LABEL_13440",
"LABEL_13441",
"LABEL_13442",
"LABEL_13443",
"LABEL_13444",
"LABEL_13445",
"LABEL_13446",
"LABEL_13447",
"LABEL_13448",
"LABEL_13449",
"LABEL_1345",
"LABEL_13450",
"LABEL_13451",
"LABEL_13452",
"LABEL_13453",
"LABEL_13454",
"LABEL_13455",
"LABEL_13456",
"LABEL_13457",
"LABEL_13458",
"LABEL_13459",
"LABEL_1346",
"LABEL_13460",
"LABEL_13461",
"LABEL_13462",
"LABEL_13463",
"LABEL_13464",
"LABEL_13465",
"LABEL_13466",
"LABEL_13467",
"LABEL_13468",
"LABEL_13469",
"LABEL_1347",
"LABEL_13470",
"LABEL_13471",
"LABEL_13472",
"LABEL_13473",
"LABEL_13474",
"LABEL_13475",
"LABEL_13476",
"LABEL_13477",
"LABEL_13478",
"LABEL_13479",
"LABEL_1348",
"LABEL_13480",
"LABEL_13481",
"LABEL_13482",
"LABEL_13483",
"LABEL_13484",
"LABEL_13485",
"LABEL_13486",
"LABEL_13487",
"LABEL_13488",
"LABEL_13489",
"LABEL_1349",
"LABEL_13490",
"LABEL_13491",
"LABEL_13492",
"LABEL_13493",
"LABEL_13494",
"LABEL_13495",
"LABEL_13496",
"LABEL_13497",
"LABEL_13498",
"LABEL_13499",
"LABEL_135",
"LABEL_1350",
"LABEL_13500",
"LABEL_13501",
"LABEL_13502",
"LABEL_13503",
"LABEL_13504",
"LABEL_13505",
"LABEL_13506",
"LABEL_13507",
"LABEL_13508",
"LABEL_13509",
"LABEL_1351",
"LABEL_13510",
"LABEL_13511",
"LABEL_13512",
"LABEL_13513",
"LABEL_13514",
"LABEL_13515",
"LABEL_13516",
"LABEL_13517",
"LABEL_13518",
"LABEL_13519",
"LABEL_1352",
"LABEL_13520",
"LABEL_13521",
"LABEL_13522",
"LABEL_13523",
"LABEL_13524",
"LABEL_13525",
"LABEL_13526",
"LABEL_13527",
"LABEL_13528",
"LABEL_13529",
"LABEL_1353",
"LABEL_13530",
"LABEL_13531",
"LABEL_13532",
"LABEL_13533",
"LABEL_13534",
"LABEL_13535",
"LABEL_13536",
"LABEL_13537",
"LABEL_13538",
"LABEL_13539",
"LABEL_1354",
"LABEL_13540",
"LABEL_13541",
"LABEL_13542",
"LABEL_13543",
"LABEL_13544",
"LABEL_13545",
"LABEL_13546",
"LABEL_13547",
"LABEL_13548",
"LABEL_13549",
"LABEL_1355",
"LABEL_13550",
"LABEL_13551",
"LABEL_13552",
"LABEL_13553",
"LABEL_13554",
"LABEL_13555",
"LABEL_13556",
"LABEL_13557",
"LABEL_13558",
"LABEL_13559",
"LABEL_1356",
"LABEL_13560",
"LABEL_13561",
"LABEL_13562",
"LABEL_13563",
"LABEL_13564",
"LABEL_13565",
"LABEL_13566",
"LABEL_13567",
"LABEL_13568",
"LABEL_13569",
"LABEL_1357",
"LABEL_13570",
"LABEL_13571",
"LABEL_13572",
"LABEL_13573",
"LABEL_13574",
"LABEL_13575",
"LABEL_13576",
"LABEL_13577",
"LABEL_13578",
"LABEL_13579",
"LABEL_1358",
"LABEL_13580",
"LABEL_13581",
"LABEL_13582",
"LABEL_13583",
"LABEL_13584",
"LABEL_13585",
"LABEL_13586",
"LABEL_13587",
"LABEL_13588",
"LABEL_13589",
"LABEL_1359",
"LABEL_13590",
"LABEL_13591",
"LABEL_13592",
"LABEL_13593",
"LABEL_13594",
"LABEL_13595",
"LABEL_13596",
"LABEL_13597",
"LABEL_13598",
"LABEL_13599",
"LABEL_136",
"LABEL_1360",
"LABEL_13600",
"LABEL_13601",
"LABEL_13602",
"LABEL_13603",
"LABEL_13604",
"LABEL_13605",
"LABEL_13606",
"LABEL_13607",
"LABEL_13608",
"LABEL_13609",
"LABEL_1361",
"LABEL_13610",
"LABEL_13611",
"LABEL_13612",
"LABEL_13613",
"LABEL_13614",
"LABEL_13615",
"LABEL_13616",
"LABEL_13617",
"LABEL_13618",
"LABEL_13619",
"LABEL_1362",
"LABEL_13620",
"LABEL_13621",
"LABEL_13622",
"LABEL_13623",
"LABEL_13624",
"LABEL_13625",
"LABEL_13626",
"LABEL_13627",
"LABEL_13628",
"LABEL_13629",
"LABEL_1363",
"LABEL_13630",
"LABEL_13631",
"LABEL_13632",
"LABEL_13633",
"LABEL_13634",
"LABEL_13635",
"LABEL_13636",
"LABEL_13637",
"LABEL_13638",
"LABEL_13639",
"LABEL_1364",
"LABEL_13640",
"LABEL_13641",
"LABEL_13642",
"LABEL_13643",
"LABEL_13644",
"LABEL_13645",
"LABEL_13646",
"LABEL_13647",
"LABEL_13648",
"LABEL_13649",
"LABEL_1365",
"LABEL_13650",
"LABEL_13651",
"LABEL_13652",
"LABEL_13653",
"LABEL_13654",
"LABEL_13655",
"LABEL_13656",
"LABEL_13657",
"LABEL_13658",
"LABEL_13659",
"LABEL_1366",
"LABEL_13660",
"LABEL_13661",
"LABEL_13662",
"LABEL_13663",
"LABEL_13664",
"LABEL_13665",
"LABEL_13666",
"LABEL_13667",
"LABEL_13668",
"LABEL_13669",
"LABEL_1367",
"LABEL_13670",
"LABEL_13671",
"LABEL_13672",
"LABEL_13673",
"LABEL_13674",
"LABEL_13675",
"LABEL_13676",
"LABEL_13677",
"LABEL_13678",
"LABEL_13679",
"LABEL_1368",
"LABEL_13680",
"LABEL_13681",
"LABEL_13682",
"LABEL_13683",
"LABEL_13684",
"LABEL_13685",
"LABEL_13686",
"LABEL_13687",
"LABEL_13688",
"LABEL_13689",
"LABEL_1369",
"LABEL_13690",
"LABEL_13691",
"LABEL_13692",
"LABEL_13693",
"LABEL_13694",
"LABEL_13695",
"LABEL_13696",
"LABEL_13697",
"LABEL_13698",
"LABEL_13699",
"LABEL_137",
"LABEL_1370",
"LABEL_13700",
"LABEL_13701",
"LABEL_13702",
"LABEL_13703",
"LABEL_13704",
"LABEL_13705",
"LABEL_13706",
"LABEL_13707",
"LABEL_13708",
"LABEL_13709",
"LABEL_1371",
"LABEL_13710",
"LABEL_13711",
"LABEL_13712",
"LABEL_13713",
"LABEL_13714",
"LABEL_13715",
"LABEL_13716",
"LABEL_13717",
"LABEL_13718",
"LABEL_13719",
"LABEL_1372",
"LABEL_13720",
"LABEL_13721",
"LABEL_13722",
"LABEL_13723",
"LABEL_13724",
"LABEL_13725",
"LABEL_13726",
"LABEL_13727",
"LABEL_13728",
"LABEL_13729",
"LABEL_1373",
"LABEL_13730",
"LABEL_13731",
"LABEL_13732",
"LABEL_13733",
"LABEL_13734",
"LABEL_13735",
"LABEL_13736",
"LABEL_13737",
"LABEL_13738",
"LABEL_13739",
"LABEL_1374",
"LABEL_13740",
"LABEL_13741",
"LABEL_13742",
"LABEL_13743",
"LABEL_13744",
"LABEL_13745",
"LABEL_13746",
"LABEL_13747",
"LABEL_13748",
"LABEL_13749",
"LABEL_1375",
"LABEL_13750",
"LABEL_13751",
"LABEL_13752",
"LABEL_13753",
"LABEL_13754",
"LABEL_13755",
"LABEL_13756",
"LABEL_13757",
"LABEL_13758",
"LABEL_13759",
"LABEL_1376",
"LABEL_13760",
"LABEL_13761",
"LABEL_13762",
"LABEL_13763",
"LABEL_13764",
"LABEL_13765",
"LABEL_13766",
"LABEL_13767",
"LABEL_13768",
"LABEL_13769",
"LABEL_1377",
"LABEL_13770",
"LABEL_13771",
"LABEL_13772",
"LABEL_13773",
"LABEL_13774",
"LABEL_13775",
"LABEL_13776",
"LABEL_13777",
"LABEL_13778",
"LABEL_13779",
"LABEL_1378",
"LABEL_13780",
"LABEL_13781",
"LABEL_13782",
"LABEL_13783",
"LABEL_13784",
"LABEL_13785",
"LABEL_13786",
"LABEL_13787",
"LABEL_13788",
"LABEL_13789",
"LABEL_1379",
"LABEL_13790",
"LABEL_13791",
"LABEL_13792",
"LABEL_13793",
"LABEL_13794",
"LABEL_13795",
"LABEL_13796",
"LABEL_13797",
"LABEL_13798",
"LABEL_13799",
"LABEL_138",
"LABEL_1380",
"LABEL_13800",
"LABEL_13801",
"LABEL_13802",
"LABEL_13803",
"LABEL_13804",
"LABEL_13805",
"LABEL_13806",
"LABEL_13807",
"LABEL_13808",
"LABEL_13809",
"LABEL_1381",
"LABEL_13810",
"LABEL_13811",
"LABEL_13812",
"LABEL_13813",
"LABEL_13814",
"LABEL_13815",
"LABEL_13816",
"LABEL_13817",
"LABEL_13818",
"LABEL_13819",
"LABEL_1382",
"LABEL_13820",
"LABEL_13821",
"LABEL_13822",
"LABEL_13823",
"LABEL_13824",
"LABEL_13825",
"LABEL_13826",
"LABEL_13827",
"LABEL_13828",
"LABEL_13829",
"LABEL_1383",
"LABEL_13830",
"LABEL_13831",
"LABEL_13832",
"LABEL_13833",
"LABEL_13834",
"LABEL_13835",
"LABEL_13836",
"LABEL_13837",
"LABEL_13838",
"LABEL_13839",
"LABEL_1384",
"LABEL_13840",
"LABEL_13841",
"LABEL_13842",
"LABEL_13843",
"LABEL_13844",
"LABEL_13845",
"LABEL_13846",
"LABEL_13847",
"LABEL_13848",
"LABEL_13849",
"LABEL_1385",
"LABEL_13850",
"LABEL_13851",
"LABEL_13852",
"LABEL_13853",
"LABEL_13854",
"LABEL_13855",
"LABEL_13856",
"LABEL_13857",
"LABEL_13858",
"LABEL_13859",
"LABEL_1386",
"LABEL_13860",
"LABEL_13861",
"LABEL_13862",
"LABEL_13863",
"LABEL_13864",
"LABEL_13865",
"LABEL_13866",
"LABEL_13867",
"LABEL_13868",
"LABEL_13869",
"LABEL_1387",
"LABEL_13870",
"LABEL_13871",
"LABEL_13872",
"LABEL_13873",
"LABEL_13874",
"LABEL_13875",
"LABEL_13876",
"LABEL_13877",
"LABEL_13878",
"LABEL_13879",
"LABEL_1388",
"LABEL_13880",
"LABEL_13881",
"LABEL_13882",
"LABEL_13883",
"LABEL_13884",
"LABEL_13885",
"LABEL_13886",
"LABEL_13887",
"LABEL_13888",
"LABEL_13889",
"LABEL_1389",
"LABEL_13890",
"LABEL_13891",
"LABEL_13892",
"LABEL_13893",
"LABEL_13894",
"LABEL_13895",
"LABEL_13896",
"LABEL_13897",
"LABEL_13898",
"LABEL_13899",
"LABEL_139",
"LABEL_1390",
"LABEL_13900",
"LABEL_13901",
"LABEL_13902",
"LABEL_13903",
"LABEL_13904",
"LABEL_13905",
"LABEL_13906",
"LABEL_13907",
"LABEL_13908",
"LABEL_13909",
"LABEL_1391",
"LABEL_13910",
"LABEL_13911",
"LABEL_13912",
"LABEL_13913",
"LABEL_13914",
"LABEL_13915",
"LABEL_13916",
"LABEL_13917",
"LABEL_13918",
"LABEL_13919",
"LABEL_1392",
"LABEL_13920",
"LABEL_13921",
"LABEL_13922",
"LABEL_13923",
"LABEL_13924",
"LABEL_13925",
"LABEL_13926",
"LABEL_13927",
"LABEL_13928",
"LABEL_13929",
"LABEL_1393",
"LABEL_13930",
"LABEL_13931",
"LABEL_13932",
"LABEL_13933",
"LABEL_13934",
"LABEL_13935",
"LABEL_13936",
"LABEL_13937",
"LABEL_13938",
"LABEL_13939",
"LABEL_1394",
"LABEL_13940",
"LABEL_13941",
"LABEL_13942",
"LABEL_13943",
"LABEL_13944",
"LABEL_13945",
"LABEL_13946",
"LABEL_13947",
"LABEL_13948",
"LABEL_13949",
"LABEL_1395",
"LABEL_13950",
"LABEL_13951",
"LABEL_13952",
"LABEL_13953",
"LABEL_13954",
"LABEL_13955",
"LABEL_13956",
"LABEL_13957",
"LABEL_13958",
"LABEL_13959",
"LABEL_1396",
"LABEL_13960",
"LABEL_13961",
"LABEL_13962",
"LABEL_13963",
"LABEL_13964",
"LABEL_13965",
"LABEL_13966",
"LABEL_13967",
"LABEL_13968",
"LABEL_13969",
"LABEL_1397",
"LABEL_13970",
"LABEL_13971",
"LABEL_13972",
"LABEL_13973",
"LABEL_13974",
"LABEL_13975",
"LABEL_13976",
"LABEL_13977",
"LABEL_13978",
"LABEL_13979",
"LABEL_1398",
"LABEL_13980",
"LABEL_13981",
"LABEL_13982",
"LABEL_13983",
"LABEL_13984",
"LABEL_13985",
"LABEL_13986",
"LABEL_13987",
"LABEL_13988",
"LABEL_13989",
"LABEL_1399",
"LABEL_13990",
"LABEL_13991",
"LABEL_13992",
"LABEL_13993",
"LABEL_13994",
"LABEL_13995",
"LABEL_13996",
"LABEL_13997",
"LABEL_13998",
"LABEL_13999",
"LABEL_14",
"LABEL_140",
"LABEL_1400",
"LABEL_14000",
"LABEL_14001",
"LABEL_14002",
"LABEL_14003",
"LABEL_14004",
"LABEL_14005",
"LABEL_14006",
"LABEL_14007",
"LABEL_14008",
"LABEL_14009",
"LABEL_1401",
"LABEL_14010",
"LABEL_14011",
"LABEL_14012",
"LABEL_14013",
"LABEL_14014",
"LABEL_14015",
"LABEL_14016",
"LABEL_14017",
"LABEL_14018",
"LABEL_14019",
"LABEL_1402",
"LABEL_14020",
"LABEL_14021",
"LABEL_14022",
"LABEL_14023",
"LABEL_14024",
"LABEL_14025",
"LABEL_14026",
"LABEL_14027",
"LABEL_14028",
"LABEL_14029",
"LABEL_1403",
"LABEL_14030",
"LABEL_14031",
"LABEL_14032",
"LABEL_14033",
"LABEL_14034",
"LABEL_14035",
"LABEL_14036",
"LABEL_14037",
"LABEL_14038",
"LABEL_14039",
"LABEL_1404",
"LABEL_14040",
"LABEL_14041",
"LABEL_14042",
"LABEL_14043",
"LABEL_14044",
"LABEL_14045",
"LABEL_14046",
"LABEL_14047",
"LABEL_14048",
"LABEL_14049",
"LABEL_1405",
"LABEL_14050",
"LABEL_14051",
"LABEL_14052",
"LABEL_14053",
"LABEL_14054",
"LABEL_14055",
"LABEL_14056",
"LABEL_14057",
"LABEL_14058",
"LABEL_14059",
"LABEL_1406",
"LABEL_14060",
"LABEL_14061",
"LABEL_14062",
"LABEL_14063",
"LABEL_14064",
"LABEL_14065",
"LABEL_14066",
"LABEL_14067",
"LABEL_14068",
"LABEL_14069",
"LABEL_1407",
"LABEL_14070",
"LABEL_14071",
"LABEL_14072",
"LABEL_14073",
"LABEL_14074",
"LABEL_14075",
"LABEL_14076",
"LABEL_14077",
"LABEL_14078",
"LABEL_14079",
"LABEL_1408",
"LABEL_14080",
"LABEL_14081",
"LABEL_14082",
"LABEL_14083",
"LABEL_14084",
"LABEL_14085",
"LABEL_14086",
"LABEL_14087",
"LABEL_14088",
"LABEL_14089",
"LABEL_1409",
"LABEL_14090",
"LABEL_14091",
"LABEL_14092",
"LABEL_14093",
"LABEL_14094",
"LABEL_14095",
"LABEL_14096",
"LABEL_14097",
"LABEL_14098",
"LABEL_14099",
"LABEL_141",
"LABEL_1410",
"LABEL_14100",
"LABEL_14101",
"LABEL_14102",
"LABEL_14103",
"LABEL_14104",
"LABEL_14105",
"LABEL_14106",
"LABEL_14107",
"LABEL_14108",
"LABEL_14109",
"LABEL_1411",
"LABEL_14110",
"LABEL_14111",
"LABEL_14112",
"LABEL_14113",
"LABEL_14114",
"LABEL_14115",
"LABEL_14116",
"LABEL_14117",
"LABEL_14118",
"LABEL_14119",
"LABEL_1412",
"LABEL_14120",
"LABEL_14121",
"LABEL_14122",
"LABEL_14123",
"LABEL_14124",
"LABEL_14125",
"LABEL_14126",
"LABEL_14127",
"LABEL_14128",
"LABEL_14129",
"LABEL_1413",
"LABEL_14130",
"LABEL_14131",
"LABEL_14132",
"LABEL_14133",
"LABEL_14134",
"LABEL_14135",
"LABEL_14136",
"LABEL_14137",
"LABEL_14138",
"LABEL_14139",
"LABEL_1414",
"LABEL_14140",
"LABEL_14141",
"LABEL_14142",
"LABEL_14143",
"LABEL_14144",
"LABEL_14145",
"LABEL_14146",
"LABEL_14147",
"LABEL_14148",
"LABEL_14149",
"LABEL_1415",
"LABEL_14150",
"LABEL_14151",
"LABEL_14152",
"LABEL_14153",
"LABEL_14154",
"LABEL_14155",
"LABEL_14156",
"LABEL_14157",
"LABEL_14158",
"LABEL_14159",
"LABEL_1416",
"LABEL_14160",
"LABEL_14161",
"LABEL_14162",
"LABEL_14163",
"LABEL_14164",
"LABEL_14165",
"LABEL_14166",
"LABEL_14167",
"LABEL_14168",
"LABEL_14169",
"LABEL_1417",
"LABEL_14170",
"LABEL_14171",
"LABEL_14172",
"LABEL_14173",
"LABEL_14174",
"LABEL_14175",
"LABEL_14176",
"LABEL_14177",
"LABEL_14178",
"LABEL_14179",
"LABEL_1418",
"LABEL_14180",
"LABEL_14181",
"LABEL_14182",
"LABEL_14183",
"LABEL_14184",
"LABEL_14185",
"LABEL_14186",
"LABEL_14187",
"LABEL_14188",
"LABEL_14189",
"LABEL_1419",
"LABEL_14190",
"LABEL_14191",
"LABEL_14192",
"LABEL_14193",
"LABEL_14194",
"LABEL_14195",
"LABEL_14196",
"LABEL_14197",
"LABEL_14198",
"LABEL_14199",
"LABEL_142",
"LABEL_1420",
"LABEL_14200",
"LABEL_14201",
"LABEL_14202",
"LABEL_14203",
"LABEL_14204",
"LABEL_14205",
"LABEL_14206",
"LABEL_14207",
"LABEL_14208",
"LABEL_14209",
"LABEL_1421",
"LABEL_14210",
"LABEL_14211",
"LABEL_14212",
"LABEL_14213",
"LABEL_14214",
"LABEL_14215",
"LABEL_14216",
"LABEL_14217",
"LABEL_14218",
"LABEL_14219",
"LABEL_1422",
"LABEL_14220",
"LABEL_14221",
"LABEL_14222",
"LABEL_14223",
"LABEL_14224",
"LABEL_14225",
"LABEL_14226",
"LABEL_14227",
"LABEL_14228",
"LABEL_14229",
"LABEL_1423",
"LABEL_14230",
"LABEL_14231",
"LABEL_14232",
"LABEL_14233",
"LABEL_14234",
"LABEL_14235",
"LABEL_14236",
"LABEL_14237",
"LABEL_14238",
"LABEL_14239",
"LABEL_1424",
"LABEL_14240",
"LABEL_14241",
"LABEL_14242",
"LABEL_14243",
"LABEL_14244",
"LABEL_14245",
"LABEL_14246",
"LABEL_14247",
"LABEL_14248",
"LABEL_14249",
"LABEL_1425",
"LABEL_14250",
"LABEL_14251",
"LABEL_14252",
"LABEL_14253",
"LABEL_14254",
"LABEL_14255",
"LABEL_14256",
"LABEL_14257",
"LABEL_14258",
"LABEL_14259",
"LABEL_1426",
"LABEL_14260",
"LABEL_14261",
"LABEL_14262",
"LABEL_14263",
"LABEL_14264",
"LABEL_14265",
"LABEL_14266",
"LABEL_14267",
"LABEL_14268",
"LABEL_14269",
"LABEL_1427",
"LABEL_14270",
"LABEL_14271",
"LABEL_14272",
"LABEL_14273",
"LABEL_14274",
"LABEL_14275",
"LABEL_14276",
"LABEL_14277",
"LABEL_14278",
"LABEL_14279",
"LABEL_1428",
"LABEL_14280",
"LABEL_14281",
"LABEL_14282",
"LABEL_14283",
"LABEL_14284",
"LABEL_14285",
"LABEL_14286",
"LABEL_14287",
"LABEL_14288",
"LABEL_14289",
"LABEL_1429",
"LABEL_14290",
"LABEL_14291",
"LABEL_14292",
"LABEL_14293",
"LABEL_14294",
"LABEL_14295",
"LABEL_14296",
"LABEL_14297",
"LABEL_14298",
"LABEL_14299",
"LABEL_143",
"LABEL_1430",
"LABEL_14300",
"LABEL_14301",
"LABEL_14302",
"LABEL_14303",
"LABEL_14304",
"LABEL_14305",
"LABEL_14306",
"LABEL_14307",
"LABEL_14308",
"LABEL_14309",
"LABEL_1431",
"LABEL_14310",
"LABEL_14311",
"LABEL_14312",
"LABEL_14313",
"LABEL_14314",
"LABEL_14315",
"LABEL_14316",
"LABEL_14317",
"LABEL_14318",
"LABEL_14319",
"LABEL_1432",
"LABEL_14320",
"LABEL_14321",
"LABEL_14322",
"LABEL_14323",
"LABEL_14324",
"LABEL_14325",
"LABEL_14326",
"LABEL_14327",
"LABEL_14328",
"LABEL_14329",
"LABEL_1433",
"LABEL_14330",
"LABEL_14331",
"LABEL_14332",
"LABEL_14333",
"LABEL_14334",
"LABEL_14335",
"LABEL_14336",
"LABEL_14337",
"LABEL_14338",
"LABEL_14339",
"LABEL_1434",
"LABEL_14340",
"LABEL_14341",
"LABEL_14342",
"LABEL_14343",
"LABEL_14344",
"LABEL_14345",
"LABEL_14346",
"LABEL_14347",
"LABEL_14348",
"LABEL_14349",
"LABEL_1435",
"LABEL_14350",
"LABEL_14351",
"LABEL_14352",
"LABEL_14353",
"LABEL_14354",
"LABEL_14355",
"LABEL_14356",
"LABEL_14357",
"LABEL_14358",
"LABEL_14359",
"LABEL_1436",
"LABEL_14360",
"LABEL_14361",
"LABEL_14362",
"LABEL_14363",
"LABEL_14364",
"LABEL_14365",
"LABEL_14366",
"LABEL_14367",
"LABEL_14368",
"LABEL_14369",
"LABEL_1437",
"LABEL_14370",
"LABEL_14371",
"LABEL_14372",
"LABEL_14373",
"LABEL_14374",
"LABEL_14375",
"LABEL_14376",
"LABEL_14377",
"LABEL_14378",
"LABEL_14379",
"LABEL_1438",
"LABEL_14380",
"LABEL_14381",
"LABEL_14382",
"LABEL_14383",
"LABEL_14384",
"LABEL_14385",
"LABEL_14386",
"LABEL_14387",
"LABEL_14388",
"LABEL_14389",
"LABEL_1439",
"LABEL_14390",
"LABEL_14391",
"LABEL_14392",
"LABEL_14393",
"LABEL_14394",
"LABEL_14395",
"LABEL_14396",
"LABEL_14397",
"LABEL_14398",
"LABEL_14399",
"LABEL_144",
"LABEL_1440",
"LABEL_14400",
"LABEL_14401",
"LABEL_14402",
"LABEL_14403",
"LABEL_14404",
"LABEL_14405",
"LABEL_14406",
"LABEL_14407",
"LABEL_14408",
"LABEL_14409",
"LABEL_1441",
"LABEL_14410",
"LABEL_14411",
"LABEL_14412",
"LABEL_14413",
"LABEL_14414",
"LABEL_14415",
"LABEL_14416",
"LABEL_14417",
"LABEL_14418",
"LABEL_14419",
"LABEL_1442",
"LABEL_14420",
"LABEL_14421",
"LABEL_14422",
"LABEL_14423",
"LABEL_14424",
"LABEL_14425",
"LABEL_14426",
"LABEL_14427",
"LABEL_14428",
"LABEL_14429",
"LABEL_1443",
"LABEL_14430",
"LABEL_14431",
"LABEL_14432",
"LABEL_14433",
"LABEL_14434",
"LABEL_14435",
"LABEL_14436",
"LABEL_14437",
"LABEL_14438",
"LABEL_14439",
"LABEL_1444",
"LABEL_14440",
"LABEL_14441",
"LABEL_14442",
"LABEL_14443",
"LABEL_14444",
"LABEL_14445",
"LABEL_14446",
"LABEL_14447",
"LABEL_14448",
"LABEL_14449",
"LABEL_1445",
"LABEL_14450",
"LABEL_14451",
"LABEL_14452",
"LABEL_14453",
"LABEL_14454",
"LABEL_14455",
"LABEL_14456",
"LABEL_14457",
"LABEL_14458",
"LABEL_14459",
"LABEL_1446",
"LABEL_14460",
"LABEL_14461",
"LABEL_14462",
"LABEL_14463",
"LABEL_14464",
"LABEL_14465",
"LABEL_14466",
"LABEL_14467",
"LABEL_14468",
"LABEL_14469",
"LABEL_1447",
"LABEL_14470",
"LABEL_14471",
"LABEL_14472",
"LABEL_14473",
"LABEL_14474",
"LABEL_14475",
"LABEL_14476",
"LABEL_14477",
"LABEL_14478",
"LABEL_14479",
"LABEL_1448",
"LABEL_14480",
"LABEL_14481",
"LABEL_14482",
"LABEL_14483",
"LABEL_14484",
"LABEL_14485",
"LABEL_14486",
"LABEL_14487",
"LABEL_14488",
"LABEL_14489",
"LABEL_1449",
"LABEL_14490",
"LABEL_14491",
"LABEL_14492",
"LABEL_14493",
"LABEL_14494",
"LABEL_14495",
"LABEL_14496",
"LABEL_14497",
"LABEL_14498",
"LABEL_14499",
"LABEL_145",
"LABEL_1450",
"LABEL_14500",
"LABEL_14501",
"LABEL_14502",
"LABEL_14503",
"LABEL_14504",
"LABEL_14505",
"LABEL_14506",
"LABEL_14507",
"LABEL_14508",
"LABEL_14509",
"LABEL_1451",
"LABEL_14510",
"LABEL_14511",
"LABEL_14512",
"LABEL_14513",
"LABEL_14514",
"LABEL_14515",
"LABEL_14516",
"LABEL_14517",
"LABEL_14518",
"LABEL_14519",
"LABEL_1452",
"LABEL_14520",
"LABEL_14521",
"LABEL_14522",
"LABEL_14523",
"LABEL_14524",
"LABEL_14525",
"LABEL_14526",
"LABEL_14527",
"LABEL_14528",
"LABEL_14529",
"LABEL_1453",
"LABEL_14530",
"LABEL_14531",
"LABEL_14532",
"LABEL_14533",
"LABEL_14534",
"LABEL_14535",
"LABEL_14536",
"LABEL_14537",
"LABEL_14538",
"LABEL_14539",
"LABEL_1454",
"LABEL_14540",
"LABEL_14541",
"LABEL_14542",
"LABEL_14543",
"LABEL_14544",
"LABEL_14545",
"LABEL_14546",
"LABEL_14547",
"LABEL_14548",
"LABEL_14549",
"LABEL_1455",
"LABEL_14550",
"LABEL_14551",
"LABEL_14552",
"LABEL_14553",
"LABEL_14554",
"LABEL_14555",
"LABEL_14556",
"LABEL_14557",
"LABEL_14558",
"LABEL_14559",
"LABEL_1456",
"LABEL_14560",
"LABEL_14561",
"LABEL_14562",
"LABEL_14563",
"LABEL_14564",
"LABEL_14565",
"LABEL_14566",
"LABEL_14567",
"LABEL_14568",
"LABEL_14569",
"LABEL_1457",
"LABEL_14570",
"LABEL_14571",
"LABEL_14572",
"LABEL_14573",
"LABEL_14574",
"LABEL_14575",
"LABEL_14576",
"LABEL_14577",
"LABEL_14578",
"LABEL_14579",
"LABEL_1458",
"LABEL_14580",
"LABEL_14581",
"LABEL_14582",
"LABEL_14583",
"LABEL_14584",
"LABEL_14585",
"LABEL_14586",
"LABEL_14587",
"LABEL_14588",
"LABEL_14589",
"LABEL_1459",
"LABEL_14590",
"LABEL_14591",
"LABEL_14592",
"LABEL_14593",
"LABEL_14594",
"LABEL_14595",
"LABEL_14596",
"LABEL_14597",
"LABEL_14598",
"LABEL_14599",
"LABEL_146",
"LABEL_1460",
"LABEL_14600",
"LABEL_14601",
"LABEL_14602",
"LABEL_14603",
"LABEL_14604",
"LABEL_14605",
"LABEL_14606",
"LABEL_14607",
"LABEL_14608",
"LABEL_14609",
"LABEL_1461",
"LABEL_14610",
"LABEL_14611",
"LABEL_14612",
"LABEL_14613",
"LABEL_14614",
"LABEL_14615",
"LABEL_14616",
"LABEL_14617",
"LABEL_14618",
"LABEL_14619",
"LABEL_1462",
"LABEL_14620",
"LABEL_14621",
"LABEL_14622",
"LABEL_14623",
"LABEL_14624",
"LABEL_14625",
"LABEL_14626",
"LABEL_14627",
"LABEL_14628",
"LABEL_14629",
"LABEL_1463",
"LABEL_14630",
"LABEL_14631",
"LABEL_14632",
"LABEL_14633",
"LABEL_14634",
"LABEL_14635",
"LABEL_14636",
"LABEL_14637",
"LABEL_14638",
"LABEL_14639",
"LABEL_1464",
"LABEL_14640",
"LABEL_14641",
"LABEL_14642",
"LABEL_14643",
"LABEL_14644",
"LABEL_14645",
"LABEL_14646",
"LABEL_14647",
"LABEL_14648",
"LABEL_14649",
"LABEL_1465",
"LABEL_14650",
"LABEL_14651",
"LABEL_14652",
"LABEL_14653",
"LABEL_14654",
"LABEL_14655",
"LABEL_14656",
"LABEL_14657",
"LABEL_14658",
"LABEL_14659",
"LABEL_1466",
"LABEL_14660",
"LABEL_14661",
"LABEL_14662",
"LABEL_14663",
"LABEL_14664",
"LABEL_14665",
"LABEL_14666",
"LABEL_14667",
"LABEL_14668",
"LABEL_14669",
"LABEL_1467",
"LABEL_14670",
"LABEL_14671",
"LABEL_14672",
"LABEL_14673",
"LABEL_14674",
"LABEL_14675",
"LABEL_14676",
"LABEL_14677",
"LABEL_14678",
"LABEL_14679",
"LABEL_1468",
"LABEL_14680",
"LABEL_14681",
"LABEL_14682",
"LABEL_14683",
"LABEL_14684",
"LABEL_14685",
"LABEL_14686",
"LABEL_14687",
"LABEL_14688",
"LABEL_14689",
"LABEL_1469",
"LABEL_14690",
"LABEL_14691",
"LABEL_14692",
"LABEL_14693",
"LABEL_14694",
"LABEL_14695",
"LABEL_14696",
"LABEL_14697",
"LABEL_14698",
"LABEL_14699",
"LABEL_147",
"LABEL_1470",
"LABEL_14700",
"LABEL_14701",
"LABEL_14702",
"LABEL_14703",
"LABEL_14704",
"LABEL_14705",
"LABEL_14706",
"LABEL_14707",
"LABEL_14708",
"LABEL_14709",
"LABEL_1471",
"LABEL_14710",
"LABEL_14711",
"LABEL_14712",
"LABEL_14713",
"LABEL_14714",
"LABEL_14715",
"LABEL_14716",
"LABEL_14717",
"LABEL_14718",
"LABEL_14719",
"LABEL_1472",
"LABEL_14720",
"LABEL_14721",
"LABEL_14722",
"LABEL_14723",
"LABEL_14724",
"LABEL_14725",
"LABEL_14726",
"LABEL_14727",
"LABEL_14728",
"LABEL_14729",
"LABEL_1473",
"LABEL_14730",
"LABEL_14731",
"LABEL_14732",
"LABEL_14733",
"LABEL_14734",
"LABEL_14735",
"LABEL_14736",
"LABEL_14737",
"LABEL_14738",
"LABEL_14739",
"LABEL_1474",
"LABEL_14740",
"LABEL_14741",
"LABEL_14742",
"LABEL_14743",
"LABEL_14744",
"LABEL_14745",
"LABEL_14746",
"LABEL_14747",
"LABEL_14748",
"LABEL_14749",
"LABEL_1475",
"LABEL_14750",
"LABEL_14751",
"LABEL_14752",
"LABEL_14753",
"LABEL_14754",
"LABEL_14755",
"LABEL_14756",
"LABEL_14757",
"LABEL_14758",
"LABEL_14759",
"LABEL_1476",
"LABEL_14760",
"LABEL_14761",
"LABEL_14762",
"LABEL_14763",
"LABEL_14764",
"LABEL_14765",
"LABEL_14766",
"LABEL_14767",
"LABEL_14768",
"LABEL_14769",
"LABEL_1477",
"LABEL_14770",
"LABEL_14771",
"LABEL_14772",
"LABEL_14773",
"LABEL_14774",
"LABEL_14775",
"LABEL_14776",
"LABEL_14777",
"LABEL_14778",
"LABEL_14779",
"LABEL_1478",
"LABEL_14780",
"LABEL_14781",
"LABEL_14782",
"LABEL_14783",
"LABEL_14784",
"LABEL_14785",
"LABEL_14786",
"LABEL_14787",
"LABEL_14788",
"LABEL_14789",
"LABEL_1479",
"LABEL_14790",
"LABEL_14791",
"LABEL_14792",
"LABEL_14793",
"LABEL_14794",
"LABEL_14795",
"LABEL_14796",
"LABEL_14797",
"LABEL_14798",
"LABEL_14799",
"LABEL_148",
"LABEL_1480",
"LABEL_14800",
"LABEL_14801",
"LABEL_14802",
"LABEL_14803",
"LABEL_14804",
"LABEL_14805",
"LABEL_14806",
"LABEL_14807",
"LABEL_14808",
"LABEL_14809",
"LABEL_1481",
"LABEL_14810",
"LABEL_14811",
"LABEL_14812",
"LABEL_14813",
"LABEL_14814",
"LABEL_14815",
"LABEL_14816",
"LABEL_14817",
"LABEL_14818",
"LABEL_14819",
"LABEL_1482",
"LABEL_14820",
"LABEL_14821",
"LABEL_14822",
"LABEL_14823",
"LABEL_14824",
"LABEL_14825",
"LABEL_14826",
"LABEL_14827",
"LABEL_14828",
"LABEL_14829",
"LABEL_1483",
"LABEL_14830",
"LABEL_14831",
"LABEL_14832",
"LABEL_14833",
"LABEL_14834",
"LABEL_14835",
"LABEL_14836",
"LABEL_14837",
"LABEL_14838",
"LABEL_14839",
"LABEL_1484",
"LABEL_14840",
"LABEL_14841",
"LABEL_14842",
"LABEL_14843",
"LABEL_14844",
"LABEL_14845",
"LABEL_14846",
"LABEL_14847",
"LABEL_14848",
"LABEL_14849",
"LABEL_1485",
"LABEL_14850",
"LABEL_14851",
"LABEL_14852",
"LABEL_14853",
"LABEL_14854",
"LABEL_14855",
"LABEL_14856",
"LABEL_14857",
"LABEL_14858",
"LABEL_14859",
"LABEL_1486",
"LABEL_14860",
"LABEL_14861",
"LABEL_14862",
"LABEL_14863",
"LABEL_14864",
"LABEL_14865",
"LABEL_14866",
"LABEL_14867",
"LABEL_14868",
"LABEL_14869",
"LABEL_1487",
"LABEL_14870",
"LABEL_14871",
"LABEL_14872",
"LABEL_14873",
"LABEL_14874",
"LABEL_14875",
"LABEL_14876",
"LABEL_14877",
"LABEL_14878",
"LABEL_14879",
"LABEL_1488",
"LABEL_14880",
"LABEL_14881",
"LABEL_14882",
"LABEL_14883",
"LABEL_14884",
"LABEL_14885",
"LABEL_14886",
"LABEL_14887",
"LABEL_14888",
"LABEL_14889",
"LABEL_1489",
"LABEL_14890",
"LABEL_14891",
"LABEL_14892",
"LABEL_14893",
"LABEL_14894",
"LABEL_14895",
"LABEL_14896",
"LABEL_14897",
"LABEL_14898",
"LABEL_14899",
"LABEL_149",
"LABEL_1490",
"LABEL_14900",
"LABEL_14901",
"LABEL_14902",
"LABEL_14903",
"LABEL_14904",
"LABEL_14905",
"LABEL_14906",
"LABEL_14907",
"LABEL_14908",
"LABEL_14909",
"LABEL_1491",
"LABEL_14910",
"LABEL_14911",
"LABEL_14912",
"LABEL_14913",
"LABEL_14914",
"LABEL_14915",
"LABEL_14916",
"LABEL_14917",
"LABEL_14918",
"LABEL_14919",
"LABEL_1492",
"LABEL_14920",
"LABEL_14921",
"LABEL_14922",
"LABEL_14923",
"LABEL_14924",
"LABEL_14925",
"LABEL_14926",
"LABEL_14927",
"LABEL_14928",
"LABEL_14929",
"LABEL_1493",
"LABEL_14930",
"LABEL_14931",
"LABEL_14932",
"LABEL_14933",
"LABEL_14934",
"LABEL_14935",
"LABEL_14936",
"LABEL_14937",
"LABEL_14938",
"LABEL_14939",
"LABEL_1494",
"LABEL_14940",
"LABEL_14941",
"LABEL_14942",
"LABEL_14943",
"LABEL_14944",
"LABEL_14945",
"LABEL_14946",
"LABEL_14947",
"LABEL_14948",
"LABEL_14949",
"LABEL_1495",
"LABEL_14950",
"LABEL_14951",
"LABEL_14952",
"LABEL_14953",
"LABEL_14954",
"LABEL_14955",
"LABEL_14956",
"LABEL_14957",
"LABEL_14958",
"LABEL_14959",
"LABEL_1496",
"LABEL_14960",
"LABEL_14961",
"LABEL_14962",
"LABEL_14963",
"LABEL_14964",
"LABEL_14965",
"LABEL_14966",
"LABEL_14967",
"LABEL_14968",
"LABEL_14969",
"LABEL_1497",
"LABEL_14970",
"LABEL_14971",
"LABEL_14972",
"LABEL_14973",
"LABEL_14974",
"LABEL_14975",
"LABEL_14976",
"LABEL_14977",
"LABEL_14978",
"LABEL_14979",
"LABEL_1498",
"LABEL_14980",
"LABEL_14981",
"LABEL_14982",
"LABEL_14983",
"LABEL_14984",
"LABEL_14985",
"LABEL_14986",
"LABEL_14987",
"LABEL_14988",
"LABEL_14989",
"LABEL_1499",
"LABEL_14990",
"LABEL_14991",
"LABEL_14992",
"LABEL_14993",
"LABEL_14994",
"LABEL_14995",
"LABEL_14996",
"LABEL_14997",
"LABEL_14998",
"LABEL_14999",
"LABEL_15",
"LABEL_150",
"LABEL_1500",
"LABEL_15000",
"LABEL_15001",
"LABEL_15002",
"LABEL_15003",
"LABEL_15004",
"LABEL_15005",
"LABEL_15006",
"LABEL_15007",
"LABEL_15008",
"LABEL_15009",
"LABEL_1501",
"LABEL_15010",
"LABEL_15011",
"LABEL_15012",
"LABEL_15013",
"LABEL_15014",
"LABEL_15015",
"LABEL_15016",
"LABEL_15017",
"LABEL_15018",
"LABEL_15019",
"LABEL_1502",
"LABEL_15020",
"LABEL_15021",
"LABEL_15022",
"LABEL_15023",
"LABEL_15024",
"LABEL_15025",
"LABEL_15026",
"LABEL_15027",
"LABEL_15028",
"LABEL_15029",
"LABEL_1503",
"LABEL_15030",
"LABEL_15031",
"LABEL_15032",
"LABEL_15033",
"LABEL_15034",
"LABEL_15035",
"LABEL_15036",
"LABEL_15037",
"LABEL_15038",
"LABEL_15039",
"LABEL_1504",
"LABEL_15040",
"LABEL_15041",
"LABEL_15042",
"LABEL_15043",
"LABEL_15044",
"LABEL_15045",
"LABEL_15046",
"LABEL_15047",
"LABEL_15048",
"LABEL_15049",
"LABEL_1505",
"LABEL_15050",
"LABEL_15051",
"LABEL_15052",
"LABEL_15053",
"LABEL_15054",
"LABEL_15055",
"LABEL_15056",
"LABEL_15057",
"LABEL_15058",
"LABEL_15059",
"LABEL_1506",
"LABEL_15060",
"LABEL_15061",
"LABEL_15062",
"LABEL_15063",
"LABEL_15064",
"LABEL_15065",
"LABEL_15066",
"LABEL_15067",
"LABEL_15068",
"LABEL_15069",
"LABEL_1507",
"LABEL_15070",
"LABEL_15071",
"LABEL_15072",
"LABEL_15073",
"LABEL_15074",
"LABEL_15075",
"LABEL_15076",
"LABEL_15077",
"LABEL_15078",
"LABEL_15079",
"LABEL_1508",
"LABEL_15080",
"LABEL_15081",
"LABEL_15082",
"LABEL_15083",
"LABEL_15084",
"LABEL_15085",
"LABEL_15086",
"LABEL_15087",
"LABEL_15088",
"LABEL_15089",
"LABEL_1509",
"LABEL_15090",
"LABEL_15091",
"LABEL_15092",
"LABEL_15093",
"LABEL_15094",
"LABEL_15095",
"LABEL_15096",
"LABEL_15097",
"LABEL_15098",
"LABEL_15099",
"LABEL_151",
"LABEL_1510",
"LABEL_15100",
"LABEL_15101",
"LABEL_15102",
"LABEL_15103",
"LABEL_15104",
"LABEL_15105",
"LABEL_15106",
"LABEL_15107",
"LABEL_15108",
"LABEL_15109",
"LABEL_1511",
"LABEL_15110",
"LABEL_15111",
"LABEL_15112",
"LABEL_15113",
"LABEL_15114",
"LABEL_15115",
"LABEL_15116",
"LABEL_15117",
"LABEL_15118",
"LABEL_15119",
"LABEL_1512",
"LABEL_15120",
"LABEL_15121",
"LABEL_15122",
"LABEL_15123",
"LABEL_15124",
"LABEL_15125",
"LABEL_15126",
"LABEL_15127",
"LABEL_15128",
"LABEL_15129",
"LABEL_1513",
"LABEL_15130",
"LABEL_15131",
"LABEL_15132",
"LABEL_15133",
"LABEL_15134",
"LABEL_15135",
"LABEL_15136",
"LABEL_15137",
"LABEL_15138",
"LABEL_15139",
"LABEL_1514",
"LABEL_15140",
"LABEL_15141",
"LABEL_15142",
"LABEL_15143",
"LABEL_15144",
"LABEL_15145",
"LABEL_15146",
"LABEL_15147",
"LABEL_15148",
"LABEL_15149",
"LABEL_1515",
"LABEL_15150",
"LABEL_15151",
"LABEL_15152",
"LABEL_15153",
"LABEL_15154",
"LABEL_15155",
"LABEL_15156",
"LABEL_15157",
"LABEL_15158",
"LABEL_15159",
"LABEL_1516",
"LABEL_15160",
"LABEL_15161",
"LABEL_15162",
"LABEL_15163",
"LABEL_15164",
"LABEL_15165",
"LABEL_15166",
"LABEL_15167",
"LABEL_15168",
"LABEL_15169",
"LABEL_1517",
"LABEL_15170",
"LABEL_15171",
"LABEL_15172",
"LABEL_15173",
"LABEL_15174",
"LABEL_15175",
"LABEL_15176",
"LABEL_15177",
"LABEL_15178",
"LABEL_15179",
"LABEL_1518",
"LABEL_15180",
"LABEL_15181",
"LABEL_15182",
"LABEL_15183",
"LABEL_15184",
"LABEL_15185",
"LABEL_15186",
"LABEL_15187",
"LABEL_15188",
"LABEL_15189",
"LABEL_1519",
"LABEL_15190",
"LABEL_15191",
"LABEL_15192",
"LABEL_15193",
"LABEL_15194",
"LABEL_15195",
"LABEL_15196",
"LABEL_15197",
"LABEL_15198",
"LABEL_15199",
"LABEL_152",
"LABEL_1520",
"LABEL_15200",
"LABEL_15201",
"LABEL_15202",
"LABEL_15203",
"LABEL_15204",
"LABEL_15205",
"LABEL_15206",
"LABEL_15207",
"LABEL_15208",
"LABEL_15209",
"LABEL_1521",
"LABEL_15210",
"LABEL_15211",
"LABEL_15212",
"LABEL_15213",
"LABEL_15214",
"LABEL_15215",
"LABEL_15216",
"LABEL_15217",
"LABEL_15218",
"LABEL_15219",
"LABEL_1522",
"LABEL_15220",
"LABEL_15221",
"LABEL_15222",
"LABEL_15223",
"LABEL_15224",
"LABEL_15225",
"LABEL_15226",
"LABEL_15227",
"LABEL_15228",
"LABEL_15229",
"LABEL_1523",
"LABEL_15230",
"LABEL_15231",
"LABEL_15232",
"LABEL_15233",
"LABEL_15234",
"LABEL_15235",
"LABEL_15236",
"LABEL_15237",
"LABEL_15238",
"LABEL_15239",
"LABEL_1524",
"LABEL_15240",
"LABEL_15241",
"LABEL_15242",
"LABEL_15243",
"LABEL_15244",
"LABEL_15245",
"LABEL_15246",
"LABEL_15247",
"LABEL_15248",
"LABEL_15249",
"LABEL_1525",
"LABEL_15250",
"LABEL_15251",
"LABEL_15252",
"LABEL_15253",
"LABEL_15254",
"LABEL_15255",
"LABEL_15256",
"LABEL_15257",
"LABEL_15258",
"LABEL_15259",
"LABEL_1526",
"LABEL_15260",
"LABEL_15261",
"LABEL_15262",
"LABEL_15263",
"LABEL_15264",
"LABEL_15265",
"LABEL_15266",
"LABEL_15267",
"LABEL_15268",
"LABEL_15269",
"LABEL_1527",
"LABEL_15270",
"LABEL_15271",
"LABEL_15272",
"LABEL_15273",
"LABEL_15274",
"LABEL_15275",
"LABEL_15276",
"LABEL_15277",
"LABEL_15278",
"LABEL_15279",
"LABEL_1528",
"LABEL_15280",
"LABEL_15281",
"LABEL_15282",
"LABEL_15283",
"LABEL_15284",
"LABEL_15285",
"LABEL_15286",
"LABEL_15287",
"LABEL_15288",
"LABEL_15289",
"LABEL_1529",
"LABEL_15290",
"LABEL_15291",
"LABEL_15292",
"LABEL_15293",
"LABEL_15294",
"LABEL_15295",
"LABEL_15296",
"LABEL_15297",
"LABEL_15298",
"LABEL_15299",
"LABEL_153",
"LABEL_1530",
"LABEL_15300",
"LABEL_15301",
"LABEL_15302",
"LABEL_15303",
"LABEL_15304",
"LABEL_15305",
"LABEL_15306",
"LABEL_15307",
"LABEL_15308",
"LABEL_15309",
"LABEL_1531",
"LABEL_15310",
"LABEL_15311",
"LABEL_15312",
"LABEL_15313",
"LABEL_15314",
"LABEL_15315",
"LABEL_15316",
"LABEL_15317",
"LABEL_15318",
"LABEL_15319",
"LABEL_1532",
"LABEL_15320",
"LABEL_15321",
"LABEL_15322",
"LABEL_15323",
"LABEL_15324",
"LABEL_15325",
"LABEL_15326",
"LABEL_15327",
"LABEL_15328",
"LABEL_15329",
"LABEL_1533",
"LABEL_15330",
"LABEL_15331",
"LABEL_15332",
"LABEL_15333",
"LABEL_15334",
"LABEL_15335",
"LABEL_15336",
"LABEL_15337",
"LABEL_15338",
"LABEL_15339",
"LABEL_1534",
"LABEL_15340",
"LABEL_15341",
"LABEL_15342",
"LABEL_15343",
"LABEL_15344",
"LABEL_15345",
"LABEL_15346",
"LABEL_15347",
"LABEL_15348",
"LABEL_15349",
"LABEL_1535",
"LABEL_15350",
"LABEL_15351",
"LABEL_15352",
"LABEL_15353",
"LABEL_15354",
"LABEL_15355",
"LABEL_15356",
"LABEL_15357",
"LABEL_15358",
"LABEL_15359",
"LABEL_1536",
"LABEL_15360",
"LABEL_15361",
"LABEL_15362",
"LABEL_15363",
"LABEL_15364",
"LABEL_15365",
"LABEL_15366",
"LABEL_15367",
"LABEL_15368",
"LABEL_15369",
"LABEL_1537",
"LABEL_15370",
"LABEL_15371",
"LABEL_15372",
"LABEL_15373",
"LABEL_15374",
"LABEL_15375",
"LABEL_15376",
"LABEL_15377",
"LABEL_15378",
"LABEL_15379",
"LABEL_1538",
"LABEL_15380",
"LABEL_15381",
"LABEL_15382",
"LABEL_15383",
"LABEL_15384",
"LABEL_15385",
"LABEL_15386",
"LABEL_15387",
"LABEL_15388",
"LABEL_15389",
"LABEL_1539",
"LABEL_15390",
"LABEL_15391",
"LABEL_15392",
"LABEL_15393",
"LABEL_15394",
"LABEL_15395",
"LABEL_15396",
"LABEL_15397",
"LABEL_15398",
"LABEL_15399",
"LABEL_154",
"LABEL_1540",
"LABEL_15400",
"LABEL_15401",
"LABEL_15402",
"LABEL_15403",
"LABEL_15404",
"LABEL_15405",
"LABEL_15406",
"LABEL_15407",
"LABEL_15408",
"LABEL_15409",
"LABEL_1541",
"LABEL_15410",
"LABEL_15411",
"LABEL_15412",
"LABEL_15413",
"LABEL_15414",
"LABEL_15415",
"LABEL_15416",
"LABEL_15417",
"LABEL_15418",
"LABEL_15419",
"LABEL_1542",
"LABEL_15420",
"LABEL_15421",
"LABEL_15422",
"LABEL_15423",
"LABEL_15424",
"LABEL_15425",
"LABEL_15426",
"LABEL_15427",
"LABEL_15428",
"LABEL_15429",
"LABEL_1543",
"LABEL_15430",
"LABEL_15431",
"LABEL_15432",
"LABEL_15433",
"LABEL_15434",
"LABEL_15435",
"LABEL_15436",
"LABEL_15437",
"LABEL_15438",
"LABEL_15439",
"LABEL_1544",
"LABEL_15440",
"LABEL_15441",
"LABEL_15442",
"LABEL_15443",
"LABEL_15444",
"LABEL_15445",
"LABEL_15446",
"LABEL_15447",
"LABEL_15448",
"LABEL_15449",
"LABEL_1545",
"LABEL_15450",
"LABEL_15451",
"LABEL_15452",
"LABEL_15453",
"LABEL_15454",
"LABEL_15455",
"LABEL_15456",
"LABEL_15457",
"LABEL_15458",
"LABEL_15459",
"LABEL_1546",
"LABEL_15460",
"LABEL_15461",
"LABEL_15462",
"LABEL_15463",
"LABEL_15464",
"LABEL_15465",
"LABEL_15466",
"LABEL_15467",
"LABEL_15468",
"LABEL_15469",
"LABEL_1547",
"LABEL_15470",
"LABEL_15471",
"LABEL_15472",
"LABEL_15473",
"LABEL_15474",
"LABEL_15475",
"LABEL_15476",
"LABEL_15477",
"LABEL_15478",
"LABEL_15479",
"LABEL_1548",
"LABEL_15480",
"LABEL_15481",
"LABEL_15482",
"LABEL_15483",
"LABEL_15484",
"LABEL_15485",
"LABEL_15486",
"LABEL_15487",
"LABEL_15488",
"LABEL_15489",
"LABEL_1549",
"LABEL_15490",
"LABEL_15491",
"LABEL_15492",
"LABEL_15493",
"LABEL_15494",
"LABEL_15495",
"LABEL_15496",
"LABEL_15497",
"LABEL_15498",
"LABEL_15499",
"LABEL_155",
"LABEL_1550",
"LABEL_15500",
"LABEL_15501",
"LABEL_15502",
"LABEL_15503",
"LABEL_15504",
"LABEL_15505",
"LABEL_15506",
"LABEL_15507",
"LABEL_15508",
"LABEL_15509",
"LABEL_1551",
"LABEL_15510",
"LABEL_15511",
"LABEL_15512",
"LABEL_15513",
"LABEL_15514",
"LABEL_15515",
"LABEL_15516",
"LABEL_15517",
"LABEL_15518",
"LABEL_15519",
"LABEL_1552",
"LABEL_15520",
"LABEL_15521",
"LABEL_15522",
"LABEL_15523",
"LABEL_15524",
"LABEL_15525",
"LABEL_15526",
"LABEL_15527",
"LABEL_15528",
"LABEL_15529",
"LABEL_1553",
"LABEL_15530",
"LABEL_15531",
"LABEL_15532",
"LABEL_15533",
"LABEL_15534",
"LABEL_15535",
"LABEL_15536",
"LABEL_15537",
"LABEL_15538",
"LABEL_15539",
"LABEL_1554",
"LABEL_15540",
"LABEL_15541",
"LABEL_15542",
"LABEL_15543",
"LABEL_15544",
"LABEL_15545",
"LABEL_15546",
"LABEL_15547",
"LABEL_15548",
"LABEL_15549",
"LABEL_1555",
"LABEL_15550",
"LABEL_15551",
"LABEL_15552",
"LABEL_15553",
"LABEL_15554",
"LABEL_15555",
"LABEL_15556",
"LABEL_15557",
"LABEL_15558",
"LABEL_15559",
"LABEL_1556",
"LABEL_15560",
"LABEL_15561",
"LABEL_15562",
"LABEL_15563",
"LABEL_15564",
"LABEL_15565",
"LABEL_15566",
"LABEL_15567",
"LABEL_15568",
"LABEL_15569",
"LABEL_1557",
"LABEL_15570",
"LABEL_15571",
"LABEL_15572",
"LABEL_15573",
"LABEL_15574",
"LABEL_15575",
"LABEL_15576",
"LABEL_15577",
"LABEL_15578",
"LABEL_15579",
"LABEL_1558",
"LABEL_15580",
"LABEL_15581",
"LABEL_15582",
"LABEL_15583",
"LABEL_15584",
"LABEL_15585",
"LABEL_15586",
"LABEL_15587",
"LABEL_15588",
"LABEL_15589",
"LABEL_1559",
"LABEL_15590",
"LABEL_15591",
"LABEL_15592",
"LABEL_15593",
"LABEL_15594",
"LABEL_15595",
"LABEL_15596",
"LABEL_15597",
"LABEL_15598",
"LABEL_15599",
"LABEL_156",
"LABEL_1560",
"LABEL_15600",
"LABEL_15601",
"LABEL_15602",
"LABEL_15603",
"LABEL_15604",
"LABEL_15605",
"LABEL_15606",
"LABEL_15607",
"LABEL_15608",
"LABEL_15609",
"LABEL_1561",
"LABEL_15610",
"LABEL_15611",
"LABEL_15612",
"LABEL_15613",
"LABEL_15614",
"LABEL_15615",
"LABEL_15616",
"LABEL_15617",
"LABEL_15618",
"LABEL_15619",
"LABEL_1562",
"LABEL_15620",
"LABEL_15621",
"LABEL_15622",
"LABEL_15623",
"LABEL_15624",
"LABEL_15625",
"LABEL_15626",
"LABEL_15627",
"LABEL_15628",
"LABEL_15629",
"LABEL_1563",
"LABEL_15630",
"LABEL_15631",
"LABEL_15632",
"LABEL_15633",
"LABEL_15634",
"LABEL_15635",
"LABEL_15636",
"LABEL_15637",
"LABEL_15638",
"LABEL_15639",
"LABEL_1564",
"LABEL_15640",
"LABEL_15641",
"LABEL_15642",
"LABEL_15643",
"LABEL_15644",
"LABEL_15645",
"LABEL_15646",
"LABEL_15647",
"LABEL_15648",
"LABEL_15649",
"LABEL_1565",
"LABEL_15650",
"LABEL_15651",
"LABEL_15652",
"LABEL_15653",
"LABEL_15654",
"LABEL_15655",
"LABEL_15656",
"LABEL_15657",
"LABEL_15658",
"LABEL_15659",
"LABEL_1566",
"LABEL_15660",
"LABEL_15661",
"LABEL_15662",
"LABEL_15663",
"LABEL_15664",
"LABEL_15665",
"LABEL_15666",
"LABEL_15667",
"LABEL_15668",
"LABEL_15669",
"LABEL_1567",
"LABEL_15670",
"LABEL_15671",
"LABEL_15672",
"LABEL_15673",
"LABEL_15674",
"LABEL_15675",
"LABEL_15676",
"LABEL_15677",
"LABEL_15678",
"LABEL_15679",
"LABEL_1568",
"LABEL_15680",
"LABEL_15681",
"LABEL_15682",
"LABEL_15683",
"LABEL_15684",
"LABEL_15685",
"LABEL_15686",
"LABEL_15687",
"LABEL_15688",
"LABEL_15689",
"LABEL_1569",
"LABEL_15690",
"LABEL_15691",
"LABEL_15692",
"LABEL_15693",
"LABEL_15694",
"LABEL_15695",
"LABEL_15696",
"LABEL_15697",
"LABEL_15698",
"LABEL_15699",
"LABEL_157",
"LABEL_1570",
"LABEL_15700",
"LABEL_15701",
"LABEL_15702",
"LABEL_15703",
"LABEL_15704",
"LABEL_15705",
"LABEL_15706",
"LABEL_15707",
"LABEL_15708",
"LABEL_15709",
"LABEL_1571",
"LABEL_15710",
"LABEL_15711",
"LABEL_15712",
"LABEL_15713",
"LABEL_15714",
"LABEL_15715",
"LABEL_15716",
"LABEL_15717",
"LABEL_15718",
"LABEL_15719",
"LABEL_1572",
"LABEL_15720",
"LABEL_15721",
"LABEL_15722",
"LABEL_15723",
"LABEL_15724",
"LABEL_15725",
"LABEL_15726",
"LABEL_15727",
"LABEL_15728",
"LABEL_15729",
"LABEL_1573",
"LABEL_15730",
"LABEL_15731",
"LABEL_15732",
"LABEL_15733",
"LABEL_15734",
"LABEL_15735",
"LABEL_15736",
"LABEL_15737",
"LABEL_15738",
"LABEL_15739",
"LABEL_1574",
"LABEL_15740",
"LABEL_15741",
"LABEL_15742",
"LABEL_15743",
"LABEL_15744",
"LABEL_15745",
"LABEL_15746",
"LABEL_15747",
"LABEL_15748",
"LABEL_15749",
"LABEL_1575",
"LABEL_15750",
"LABEL_15751",
"LABEL_15752",
"LABEL_15753",
"LABEL_15754",
"LABEL_15755",
"LABEL_15756",
"LABEL_15757",
"LABEL_15758",
"LABEL_15759",
"LABEL_1576",
"LABEL_15760",
"LABEL_15761",
"LABEL_15762",
"LABEL_15763",
"LABEL_15764",
"LABEL_15765",
"LABEL_15766",
"LABEL_15767",
"LABEL_15768",
"LABEL_15769",
"LABEL_1577",
"LABEL_15770",
"LABEL_15771",
"LABEL_15772",
"LABEL_15773",
"LABEL_15774",
"LABEL_15775",
"LABEL_15776",
"LABEL_15777",
"LABEL_15778",
"LABEL_15779",
"LABEL_1578",
"LABEL_15780",
"LABEL_15781",
"LABEL_15782",
"LABEL_15783",
"LABEL_15784",
"LABEL_15785",
"LABEL_15786",
"LABEL_15787",
"LABEL_15788",
"LABEL_15789",
"LABEL_1579",
"LABEL_15790",
"LABEL_15791",
"LABEL_15792",
"LABEL_15793",
"LABEL_15794",
"LABEL_15795",
"LABEL_15796",
"LABEL_15797",
"LABEL_15798",
"LABEL_15799",
"LABEL_158",
"LABEL_1580",
"LABEL_15800",
"LABEL_15801",
"LABEL_15802",
"LABEL_15803",
"LABEL_15804",
"LABEL_15805",
"LABEL_15806",
"LABEL_15807",
"LABEL_15808",
"LABEL_15809",
"LABEL_1581",
"LABEL_15810",
"LABEL_15811",
"LABEL_15812",
"LABEL_15813",
"LABEL_15814",
"LABEL_15815",
"LABEL_15816",
"LABEL_15817",
"LABEL_15818",
"LABEL_15819",
"LABEL_1582",
"LABEL_15820",
"LABEL_15821",
"LABEL_15822",
"LABEL_15823",
"LABEL_15824",
"LABEL_15825",
"LABEL_15826",
"LABEL_15827",
"LABEL_15828",
"LABEL_15829",
"LABEL_1583",
"LABEL_15830",
"LABEL_15831",
"LABEL_15832",
"LABEL_15833",
"LABEL_15834",
"LABEL_15835",
"LABEL_15836",
"LABEL_15837",
"LABEL_15838",
"LABEL_15839",
"LABEL_1584",
"LABEL_15840",
"LABEL_15841",
"LABEL_15842",
"LABEL_15843",
"LABEL_15844",
"LABEL_15845",
"LABEL_15846",
"LABEL_15847",
"LABEL_15848",
"LABEL_15849",
"LABEL_1585",
"LABEL_15850",
"LABEL_15851",
"LABEL_15852",
"LABEL_15853",
"LABEL_15854",
"LABEL_15855",
"LABEL_15856",
"LABEL_15857",
"LABEL_15858",
"LABEL_15859",
"LABEL_1586",
"LABEL_15860",
"LABEL_15861",
"LABEL_15862",
"LABEL_15863",
"LABEL_15864",
"LABEL_15865",
"LABEL_15866",
"LABEL_15867",
"LABEL_15868",
"LABEL_15869",
"LABEL_1587",
"LABEL_15870",
"LABEL_15871",
"LABEL_15872",
"LABEL_15873",
"LABEL_15874",
"LABEL_15875",
"LABEL_15876",
"LABEL_15877",
"LABEL_15878",
"LABEL_15879",
"LABEL_1588",
"LABEL_15880",
"LABEL_15881",
"LABEL_15882",
"LABEL_15883",
"LABEL_15884",
"LABEL_15885",
"LABEL_15886",
"LABEL_15887",
"LABEL_15888",
"LABEL_15889",
"LABEL_1589",
"LABEL_15890",
"LABEL_15891",
"LABEL_15892",
"LABEL_15893",
"LABEL_15894",
"LABEL_15895",
"LABEL_15896",
"LABEL_15897",
"LABEL_15898",
"LABEL_15899",
"LABEL_159",
"LABEL_1590",
"LABEL_15900",
"LABEL_15901",
"LABEL_15902",
"LABEL_15903",
"LABEL_15904",
"LABEL_15905",
"LABEL_15906",
"LABEL_15907",
"LABEL_15908",
"LABEL_15909",
"LABEL_1591",
"LABEL_15910",
"LABEL_15911",
"LABEL_15912",
"LABEL_15913",
"LABEL_15914",
"LABEL_15915",
"LABEL_15916",
"LABEL_15917",
"LABEL_15918",
"LABEL_15919",
"LABEL_1592",
"LABEL_15920",
"LABEL_15921",
"LABEL_15922",
"LABEL_15923",
"LABEL_15924",
"LABEL_15925",
"LABEL_15926",
"LABEL_15927",
"LABEL_15928",
"LABEL_15929",
"LABEL_1593",
"LABEL_15930",
"LABEL_15931",
"LABEL_15932",
"LABEL_15933",
"LABEL_15934",
"LABEL_15935",
"LABEL_15936",
"LABEL_15937",
"LABEL_15938",
"LABEL_15939",
"LABEL_1594",
"LABEL_15940",
"LABEL_15941",
"LABEL_15942",
"LABEL_15943",
"LABEL_15944",
"LABEL_15945",
"LABEL_15946",
"LABEL_15947",
"LABEL_15948",
"LABEL_15949",
"LABEL_1595",
"LABEL_15950",
"LABEL_15951",
"LABEL_15952",
"LABEL_15953",
"LABEL_15954",
"LABEL_15955",
"LABEL_15956",
"LABEL_15957",
"LABEL_15958",
"LABEL_15959",
"LABEL_1596",
"LABEL_15960",
"LABEL_15961",
"LABEL_15962",
"LABEL_15963",
"LABEL_15964",
"LABEL_15965",
"LABEL_15966",
"LABEL_15967",
"LABEL_15968",
"LABEL_15969",
"LABEL_1597",
"LABEL_15970",
"LABEL_15971",
"LABEL_15972",
"LABEL_15973",
"LABEL_15974",
"LABEL_15975",
"LABEL_15976",
"LABEL_15977",
"LABEL_15978",
"LABEL_15979",
"LABEL_1598",
"LABEL_15980",
"LABEL_15981",
"LABEL_15982",
"LABEL_15983",
"LABEL_15984",
"LABEL_15985",
"LABEL_15986",
"LABEL_15987",
"LABEL_15988",
"LABEL_15989",
"LABEL_1599",
"LABEL_15990",
"LABEL_15991",
"LABEL_15992",
"LABEL_15993",
"LABEL_15994",
"LABEL_15995",
"LABEL_15996",
"LABEL_15997",
"LABEL_15998",
"LABEL_15999",
"LABEL_16",
"LABEL_160",
"LABEL_1600",
"LABEL_16000",
"LABEL_16001",
"LABEL_16002",
"LABEL_16003",
"LABEL_16004",
"LABEL_16005",
"LABEL_16006",
"LABEL_16007",
"LABEL_16008",
"LABEL_16009",
"LABEL_1601",
"LABEL_16010",
"LABEL_16011",
"LABEL_16012",
"LABEL_16013",
"LABEL_16014",
"LABEL_16015",
"LABEL_16016",
"LABEL_16017",
"LABEL_16018",
"LABEL_16019",
"LABEL_1602",
"LABEL_16020",
"LABEL_16021",
"LABEL_16022",
"LABEL_16023",
"LABEL_16024",
"LABEL_16025",
"LABEL_16026",
"LABEL_16027",
"LABEL_16028",
"LABEL_16029",
"LABEL_1603",
"LABEL_16030",
"LABEL_16031",
"LABEL_16032",
"LABEL_16033",
"LABEL_16034",
"LABEL_16035",
"LABEL_16036",
"LABEL_16037",
"LABEL_16038",
"LABEL_16039",
"LABEL_1604",
"LABEL_16040",
"LABEL_16041",
"LABEL_16042",
"LABEL_16043",
"LABEL_16044",
"LABEL_16045",
"LABEL_16046",
"LABEL_16047",
"LABEL_16048",
"LABEL_16049",
"LABEL_1605",
"LABEL_16050",
"LABEL_16051",
"LABEL_16052",
"LABEL_16053",
"LABEL_16054",
"LABEL_16055",
"LABEL_16056",
"LABEL_16057",
"LABEL_16058",
"LABEL_16059",
"LABEL_1606",
"LABEL_16060",
"LABEL_16061",
"LABEL_16062",
"LABEL_16063",
"LABEL_16064",
"LABEL_16065",
"LABEL_16066",
"LABEL_16067",
"LABEL_16068",
"LABEL_16069",
"LABEL_1607",
"LABEL_16070",
"LABEL_16071",
"LABEL_16072",
"LABEL_16073",
"LABEL_16074",
"LABEL_16075",
"LABEL_16076",
"LABEL_16077",
"LABEL_16078",
"LABEL_16079",
"LABEL_1608",
"LABEL_16080",
"LABEL_16081",
"LABEL_16082",
"LABEL_16083",
"LABEL_16084",
"LABEL_16085",
"LABEL_16086",
"LABEL_16087",
"LABEL_16088",
"LABEL_16089",
"LABEL_1609",
"LABEL_16090",
"LABEL_16091",
"LABEL_16092",
"LABEL_16093",
"LABEL_16094",
"LABEL_16095",
"LABEL_16096",
"LABEL_16097",
"LABEL_16098",
"LABEL_16099",
"LABEL_161",
"LABEL_1610",
"LABEL_16100",
"LABEL_16101",
"LABEL_16102",
"LABEL_16103",
"LABEL_16104",
"LABEL_16105",
"LABEL_16106",
"LABEL_16107",
"LABEL_16108",
"LABEL_16109",
"LABEL_1611",
"LABEL_16110",
"LABEL_16111",
"LABEL_16112",
"LABEL_16113",
"LABEL_16114",
"LABEL_16115",
"LABEL_16116",
"LABEL_16117",
"LABEL_16118",
"LABEL_16119",
"LABEL_1612",
"LABEL_16120",
"LABEL_16121",
"LABEL_16122",
"LABEL_16123",
"LABEL_16124",
"LABEL_16125",
"LABEL_16126",
"LABEL_16127",
"LABEL_16128",
"LABEL_16129",
"LABEL_1613",
"LABEL_16130",
"LABEL_16131",
"LABEL_16132",
"LABEL_16133",
"LABEL_16134",
"LABEL_16135",
"LABEL_16136",
"LABEL_16137",
"LABEL_16138",
"LABEL_16139",
"LABEL_1614",
"LABEL_16140",
"LABEL_16141",
"LABEL_16142",
"LABEL_16143",
"LABEL_16144",
"LABEL_16145",
"LABEL_16146",
"LABEL_16147",
"LABEL_16148",
"LABEL_16149",
"LABEL_1615",
"LABEL_16150",
"LABEL_16151",
"LABEL_16152",
"LABEL_16153",
"LABEL_16154",
"LABEL_16155",
"LABEL_16156",
"LABEL_16157",
"LABEL_16158",
"LABEL_16159",
"LABEL_1616",
"LABEL_16160",
"LABEL_16161",
"LABEL_16162",
"LABEL_16163",
"LABEL_16164",
"LABEL_16165",
"LABEL_16166",
"LABEL_16167",
"LABEL_16168",
"LABEL_16169",
"LABEL_1617",
"LABEL_16170",
"LABEL_16171",
"LABEL_16172",
"LABEL_16173",
"LABEL_16174",
"LABEL_16175",
"LABEL_16176",
"LABEL_16177",
"LABEL_16178",
"LABEL_16179",
"LABEL_1618",
"LABEL_16180",
"LABEL_16181",
"LABEL_16182",
"LABEL_16183",
"LABEL_16184",
"LABEL_16185",
"LABEL_16186",
"LABEL_16187",
"LABEL_16188",
"LABEL_16189",
"LABEL_1619",
"LABEL_16190",
"LABEL_16191",
"LABEL_16192",
"LABEL_16193",
"LABEL_16194",
"LABEL_16195",
"LABEL_16196",
"LABEL_16197",
"LABEL_16198",
"LABEL_16199",
"LABEL_162",
"LABEL_1620",
"LABEL_16200",
"LABEL_16201",
"LABEL_16202",
"LABEL_16203",
"LABEL_16204",
"LABEL_16205",
"LABEL_16206",
"LABEL_16207",
"LABEL_16208",
"LABEL_16209",
"LABEL_1621",
"LABEL_16210",
"LABEL_16211",
"LABEL_16212",
"LABEL_16213",
"LABEL_16214",
"LABEL_16215",
"LABEL_16216",
"LABEL_16217",
"LABEL_16218",
"LABEL_16219",
"LABEL_1622",
"LABEL_16220",
"LABEL_16221",
"LABEL_16222",
"LABEL_16223",
"LABEL_16224",
"LABEL_16225",
"LABEL_16226",
"LABEL_16227",
"LABEL_16228",
"LABEL_16229",
"LABEL_1623",
"LABEL_16230",
"LABEL_16231",
"LABEL_16232",
"LABEL_16233",
"LABEL_16234",
"LABEL_16235",
"LABEL_16236",
"LABEL_16237",
"LABEL_16238",
"LABEL_16239",
"LABEL_1624",
"LABEL_16240",
"LABEL_16241",
"LABEL_16242",
"LABEL_16243",
"LABEL_16244",
"LABEL_16245",
"LABEL_16246",
"LABEL_16247",
"LABEL_16248",
"LABEL_16249",
"LABEL_1625",
"LABEL_16250",
"LABEL_16251",
"LABEL_16252",
"LABEL_16253",
"LABEL_16254",
"LABEL_16255",
"LABEL_16256",
"LABEL_16257",
"LABEL_16258",
"LABEL_16259",
"LABEL_1626",
"LABEL_16260",
"LABEL_16261",
"LABEL_16262",
"LABEL_16263",
"LABEL_16264",
"LABEL_16265",
"LABEL_16266",
"LABEL_16267",
"LABEL_16268",
"LABEL_16269",
"LABEL_1627",
"LABEL_16270",
"LABEL_16271",
"LABEL_16272",
"LABEL_16273",
"LABEL_16274",
"LABEL_16275",
"LABEL_16276",
"LABEL_16277",
"LABEL_16278",
"LABEL_16279",
"LABEL_1628",
"LABEL_16280",
"LABEL_16281",
"LABEL_16282",
"LABEL_16283",
"LABEL_16284",
"LABEL_16285",
"LABEL_16286",
"LABEL_16287",
"LABEL_16288",
"LABEL_16289",
"LABEL_1629",
"LABEL_16290",
"LABEL_16291",
"LABEL_16292",
"LABEL_16293",
"LABEL_16294",
"LABEL_16295",
"LABEL_16296",
"LABEL_16297",
"LABEL_16298",
"LABEL_16299",
"LABEL_163",
"LABEL_1630",
"LABEL_16300",
"LABEL_16301",
"LABEL_16302",
"LABEL_16303",
"LABEL_16304",
"LABEL_16305",
"LABEL_16306",
"LABEL_16307",
"LABEL_16308",
"LABEL_16309",
"LABEL_1631",
"LABEL_16310",
"LABEL_16311",
"LABEL_16312",
"LABEL_16313",
"LABEL_16314",
"LABEL_16315",
"LABEL_16316",
"LABEL_16317",
"LABEL_16318",
"LABEL_16319",
"LABEL_1632",
"LABEL_16320",
"LABEL_16321",
"LABEL_16322",
"LABEL_16323",
"LABEL_16324",
"LABEL_16325",
"LABEL_16326",
"LABEL_16327",
"LABEL_16328",
"LABEL_16329",
"LABEL_1633",
"LABEL_16330",
"LABEL_16331",
"LABEL_16332",
"LABEL_16333",
"LABEL_16334",
"LABEL_16335",
"LABEL_16336",
"LABEL_16337",
"LABEL_16338",
"LABEL_16339",
"LABEL_1634",
"LABEL_16340",
"LABEL_16341",
"LABEL_16342",
"LABEL_16343",
"LABEL_16344",
"LABEL_16345",
"LABEL_16346",
"LABEL_16347",
"LABEL_16348",
"LABEL_16349",
"LABEL_1635",
"LABEL_16350",
"LABEL_16351",
"LABEL_16352",
"LABEL_16353",
"LABEL_16354",
"LABEL_16355",
"LABEL_16356",
"LABEL_16357",
"LABEL_16358",
"LABEL_16359",
"LABEL_1636",
"LABEL_16360",
"LABEL_16361",
"LABEL_16362",
"LABEL_16363",
"LABEL_16364",
"LABEL_16365",
"LABEL_16366",
"LABEL_16367",
"LABEL_16368",
"LABEL_16369",
"LABEL_1637",
"LABEL_16370",
"LABEL_16371",
"LABEL_16372",
"LABEL_16373",
"LABEL_16374",
"LABEL_16375",
"LABEL_16376",
"LABEL_16377",
"LABEL_16378",
"LABEL_16379",
"LABEL_1638",
"LABEL_16380",
"LABEL_16381",
"LABEL_16382",
"LABEL_16383",
"LABEL_16384",
"LABEL_16385",
"LABEL_16386",
"LABEL_16387",
"LABEL_16388",
"LABEL_16389",
"LABEL_1639",
"LABEL_16390",
"LABEL_16391",
"LABEL_16392",
"LABEL_16393",
"LABEL_16394",
"LABEL_16395",
"LABEL_16396",
"LABEL_16397",
"LABEL_16398",
"LABEL_16399",
"LABEL_164",
"LABEL_1640",
"LABEL_16400",
"LABEL_16401",
"LABEL_16402",
"LABEL_16403",
"LABEL_16404",
"LABEL_16405",
"LABEL_16406",
"LABEL_16407",
"LABEL_16408",
"LABEL_16409",
"LABEL_1641",
"LABEL_16410",
"LABEL_16411",
"LABEL_16412",
"LABEL_16413",
"LABEL_16414",
"LABEL_16415",
"LABEL_16416",
"LABEL_16417",
"LABEL_16418",
"LABEL_16419",
"LABEL_1642",
"LABEL_16420",
"LABEL_16421",
"LABEL_16422",
"LABEL_16423",
"LABEL_16424",
"LABEL_16425",
"LABEL_16426",
"LABEL_16427",
"LABEL_16428",
"LABEL_16429",
"LABEL_1643",
"LABEL_16430",
"LABEL_16431",
"LABEL_16432",
"LABEL_16433",
"LABEL_16434",
"LABEL_16435",
"LABEL_16436",
"LABEL_16437",
"LABEL_16438",
"LABEL_16439",
"LABEL_1644",
"LABEL_16440",
"LABEL_16441",
"LABEL_16442",
"LABEL_16443",
"LABEL_16444",
"LABEL_16445",
"LABEL_16446",
"LABEL_16447",
"LABEL_16448",
"LABEL_16449",
"LABEL_1645",
"LABEL_16450",
"LABEL_16451",
"LABEL_16452",
"LABEL_16453",
"LABEL_16454",
"LABEL_16455",
"LABEL_16456",
"LABEL_16457",
"LABEL_16458",
"LABEL_16459",
"LABEL_1646",
"LABEL_16460",
"LABEL_16461",
"LABEL_16462",
"LABEL_16463",
"LABEL_16464",
"LABEL_16465",
"LABEL_16466",
"LABEL_16467",
"LABEL_16468",
"LABEL_16469",
"LABEL_1647",
"LABEL_16470",
"LABEL_16471",
"LABEL_16472",
"LABEL_16473",
"LABEL_16474",
"LABEL_16475",
"LABEL_16476",
"LABEL_16477",
"LABEL_16478",
"LABEL_16479",
"LABEL_1648",
"LABEL_16480",
"LABEL_16481",
"LABEL_16482",
"LABEL_16483",
"LABEL_16484",
"LABEL_16485",
"LABEL_16486",
"LABEL_16487",
"LABEL_16488",
"LABEL_16489",
"LABEL_1649",
"LABEL_16490",
"LABEL_16491",
"LABEL_16492",
"LABEL_16493",
"LABEL_16494",
"LABEL_16495",
"LABEL_16496",
"LABEL_16497",
"LABEL_16498",
"LABEL_16499",
"LABEL_165",
"LABEL_1650",
"LABEL_16500",
"LABEL_16501",
"LABEL_16502",
"LABEL_16503",
"LABEL_16504",
"LABEL_16505",
"LABEL_16506",
"LABEL_16507",
"LABEL_16508",
"LABEL_16509",
"LABEL_1651",
"LABEL_16510",
"LABEL_16511",
"LABEL_16512",
"LABEL_16513",
"LABEL_16514",
"LABEL_16515",
"LABEL_16516",
"LABEL_16517",
"LABEL_16518",
"LABEL_16519",
"LABEL_1652",
"LABEL_16520",
"LABEL_16521",
"LABEL_16522",
"LABEL_16523",
"LABEL_16524",
"LABEL_16525",
"LABEL_16526",
"LABEL_16527",
"LABEL_16528",
"LABEL_16529",
"LABEL_1653",
"LABEL_16530",
"LABEL_16531",
"LABEL_16532",
"LABEL_16533",
"LABEL_16534",
"LABEL_16535",
"LABEL_16536",
"LABEL_16537",
"LABEL_16538",
"LABEL_16539",
"LABEL_1654",
"LABEL_16540",
"LABEL_16541",
"LABEL_16542",
"LABEL_16543",
"LABEL_16544",
"LABEL_16545",
"LABEL_16546",
"LABEL_16547",
"LABEL_16548",
"LABEL_16549",
"LABEL_1655",
"LABEL_16550",
"LABEL_16551",
"LABEL_16552",
"LABEL_16553",
"LABEL_16554",
"LABEL_16555",
"LABEL_16556",
"LABEL_16557",
"LABEL_16558",
"LABEL_16559",
"LABEL_1656",
"LABEL_16560",
"LABEL_16561",
"LABEL_16562",
"LABEL_16563",
"LABEL_16564",
"LABEL_16565",
"LABEL_16566",
"LABEL_16567",
"LABEL_16568",
"LABEL_16569",
"LABEL_1657",
"LABEL_16570",
"LABEL_16571",
"LABEL_16572",
"LABEL_16573",
"LABEL_16574",
"LABEL_16575",
"LABEL_16576",
"LABEL_16577",
"LABEL_16578",
"LABEL_16579",
"LABEL_1658",
"LABEL_16580",
"LABEL_16581",
"LABEL_16582",
"LABEL_16583",
"LABEL_16584",
"LABEL_16585",
"LABEL_16586",
"LABEL_16587",
"LABEL_16588",
"LABEL_16589",
"LABEL_1659",
"LABEL_16590",
"LABEL_16591",
"LABEL_16592",
"LABEL_16593",
"LABEL_16594",
"LABEL_16595",
"LABEL_16596",
"LABEL_16597",
"LABEL_16598",
"LABEL_16599",
"LABEL_166",
"LABEL_1660",
"LABEL_16600",
"LABEL_16601",
"LABEL_16602",
"LABEL_16603",
"LABEL_16604",
"LABEL_16605",
"LABEL_16606",
"LABEL_16607",
"LABEL_16608",
"LABEL_16609",
"LABEL_1661",
"LABEL_16610",
"LABEL_16611",
"LABEL_16612",
"LABEL_16613",
"LABEL_16614",
"LABEL_16615",
"LABEL_16616",
"LABEL_16617",
"LABEL_16618",
"LABEL_16619",
"LABEL_1662",
"LABEL_16620",
"LABEL_16621",
"LABEL_16622",
"LABEL_16623",
"LABEL_16624",
"LABEL_16625",
"LABEL_16626",
"LABEL_16627",
"LABEL_16628",
"LABEL_16629",
"LABEL_1663",
"LABEL_16630",
"LABEL_16631",
"LABEL_16632",
"LABEL_16633",
"LABEL_16634",
"LABEL_16635",
"LABEL_16636",
"LABEL_16637",
"LABEL_16638",
"LABEL_16639",
"LABEL_1664",
"LABEL_16640",
"LABEL_16641",
"LABEL_16642",
"LABEL_16643",
"LABEL_16644",
"LABEL_16645",
"LABEL_16646",
"LABEL_16647",
"LABEL_16648",
"LABEL_16649",
"LABEL_1665",
"LABEL_16650",
"LABEL_16651",
"LABEL_16652",
"LABEL_16653",
"LABEL_16654",
"LABEL_16655",
"LABEL_16656",
"LABEL_16657",
"LABEL_16658",
"LABEL_16659",
"LABEL_1666",
"LABEL_16660",
"LABEL_16661",
"LABEL_16662",
"LABEL_16663",
"LABEL_16664",
"LABEL_16665",
"LABEL_16666",
"LABEL_16667",
"LABEL_16668",
"LABEL_16669",
"LABEL_1667",
"LABEL_16670",
"LABEL_16671",
"LABEL_16672",
"LABEL_16673",
"LABEL_16674",
"LABEL_16675",
"LABEL_16676",
"LABEL_16677",
"LABEL_16678",
"LABEL_16679",
"LABEL_1668",
"LABEL_16680",
"LABEL_16681",
"LABEL_16682",
"LABEL_16683",
"LABEL_16684",
"LABEL_16685",
"LABEL_16686",
"LABEL_16687",
"LABEL_16688",
"LABEL_16689",
"LABEL_1669",
"LABEL_16690",
"LABEL_16691",
"LABEL_16692",
"LABEL_16693",
"LABEL_16694",
"LABEL_16695",
"LABEL_16696",
"LABEL_16697",
"LABEL_16698",
"LABEL_16699",
"LABEL_167",
"LABEL_1670",
"LABEL_16700",
"LABEL_16701",
"LABEL_16702",
"LABEL_16703",
"LABEL_16704",
"LABEL_16705",
"LABEL_16706",
"LABEL_16707",
"LABEL_16708",
"LABEL_16709",
"LABEL_1671",
"LABEL_16710",
"LABEL_16711",
"LABEL_16712",
"LABEL_16713",
"LABEL_16714",
"LABEL_16715",
"LABEL_16716",
"LABEL_16717",
"LABEL_16718",
"LABEL_16719",
"LABEL_1672",
"LABEL_16720",
"LABEL_16721",
"LABEL_16722",
"LABEL_16723",
"LABEL_16724",
"LABEL_16725",
"LABEL_16726",
"LABEL_16727",
"LABEL_16728",
"LABEL_16729",
"LABEL_1673",
"LABEL_16730",
"LABEL_16731",
"LABEL_16732",
"LABEL_16733",
"LABEL_16734",
"LABEL_16735",
"LABEL_16736",
"LABEL_16737",
"LABEL_16738",
"LABEL_16739",
"LABEL_1674",
"LABEL_16740",
"LABEL_16741",
"LABEL_16742",
"LABEL_16743",
"LABEL_16744",
"LABEL_16745",
"LABEL_16746",
"LABEL_16747",
"LABEL_16748",
"LABEL_16749",
"LABEL_1675",
"LABEL_16750",
"LABEL_16751",
"LABEL_16752",
"LABEL_16753",
"LABEL_16754",
"LABEL_16755",
"LABEL_16756",
"LABEL_16757",
"LABEL_16758",
"LABEL_16759",
"LABEL_1676",
"LABEL_16760",
"LABEL_16761",
"LABEL_16762",
"LABEL_16763",
"LABEL_16764",
"LABEL_16765",
"LABEL_16766",
"LABEL_16767",
"LABEL_16768",
"LABEL_16769",
"LABEL_1677",
"LABEL_16770",
"LABEL_16771",
"LABEL_16772",
"LABEL_16773",
"LABEL_16774",
"LABEL_16775",
"LABEL_16776",
"LABEL_16777",
"LABEL_16778",
"LABEL_16779",
"LABEL_1678",
"LABEL_16780",
"LABEL_16781",
"LABEL_16782",
"LABEL_16783",
"LABEL_16784",
"LABEL_16785",
"LABEL_16786",
"LABEL_16787",
"LABEL_16788",
"LABEL_16789",
"LABEL_1679",
"LABEL_16790",
"LABEL_16791",
"LABEL_16792",
"LABEL_16793",
"LABEL_16794",
"LABEL_16795",
"LABEL_16796",
"LABEL_16797",
"LABEL_16798",
"LABEL_16799",
"LABEL_168",
"LABEL_1680",
"LABEL_16800",
"LABEL_16801",
"LABEL_16802",
"LABEL_16803",
"LABEL_16804",
"LABEL_16805",
"LABEL_16806",
"LABEL_16807",
"LABEL_16808",
"LABEL_16809",
"LABEL_1681",
"LABEL_16810",
"LABEL_16811",
"LABEL_16812",
"LABEL_16813",
"LABEL_16814",
"LABEL_16815",
"LABEL_16816",
"LABEL_16817",
"LABEL_16818",
"LABEL_16819",
"LABEL_1682",
"LABEL_16820",
"LABEL_16821",
"LABEL_16822",
"LABEL_16823",
"LABEL_16824",
"LABEL_16825",
"LABEL_16826",
"LABEL_16827",
"LABEL_16828",
"LABEL_16829",
"LABEL_1683",
"LABEL_16830",
"LABEL_16831",
"LABEL_16832",
"LABEL_16833",
"LABEL_16834",
"LABEL_16835",
"LABEL_16836",
"LABEL_16837",
"LABEL_16838",
"LABEL_16839",
"LABEL_1684",
"LABEL_16840",
"LABEL_16841",
"LABEL_16842",
"LABEL_16843",
"LABEL_16844",
"LABEL_16845",
"LABEL_16846",
"LABEL_16847",
"LABEL_16848",
"LABEL_16849",
"LABEL_1685",
"LABEL_16850",
"LABEL_16851",
"LABEL_16852",
"LABEL_16853",
"LABEL_16854",
"LABEL_16855",
"LABEL_16856",
"LABEL_16857",
"LABEL_16858",
"LABEL_16859",
"LABEL_1686",
"LABEL_16860",
"LABEL_16861",
"LABEL_16862",
"LABEL_16863",
"LABEL_16864",
"LABEL_16865",
"LABEL_16866",
"LABEL_16867",
"LABEL_16868",
"LABEL_16869",
"LABEL_1687",
"LABEL_16870",
"LABEL_16871",
"LABEL_16872",
"LABEL_16873",
"LABEL_16874",
"LABEL_16875",
"LABEL_16876",
"LABEL_16877",
"LABEL_16878",
"LABEL_16879",
"LABEL_1688",
"LABEL_16880",
"LABEL_16881",
"LABEL_16882",
"LABEL_16883",
"LABEL_16884",
"LABEL_16885",
"LABEL_16886",
"LABEL_16887",
"LABEL_16888",
"LABEL_16889",
"LABEL_1689",
"LABEL_16890",
"LABEL_16891",
"LABEL_16892",
"LABEL_16893",
"LABEL_16894",
"LABEL_16895",
"LABEL_16896",
"LABEL_16897",
"LABEL_16898",
"LABEL_16899",
"LABEL_169",
"LABEL_1690",
"LABEL_16900",
"LABEL_16901",
"LABEL_16902",
"LABEL_16903",
"LABEL_16904",
"LABEL_16905",
"LABEL_16906",
"LABEL_16907",
"LABEL_16908",
"LABEL_16909",
"LABEL_1691",
"LABEL_16910",
"LABEL_16911",
"LABEL_16912",
"LABEL_16913",
"LABEL_16914",
"LABEL_16915",
"LABEL_16916",
"LABEL_16917",
"LABEL_16918",
"LABEL_16919",
"LABEL_1692",
"LABEL_16920",
"LABEL_16921",
"LABEL_16922",
"LABEL_16923",
"LABEL_16924",
"LABEL_16925",
"LABEL_16926",
"LABEL_16927",
"LABEL_16928",
"LABEL_16929",
"LABEL_1693",
"LABEL_16930",
"LABEL_16931",
"LABEL_16932",
"LABEL_16933",
"LABEL_16934",
"LABEL_16935",
"LABEL_16936",
"LABEL_16937",
"LABEL_16938",
"LABEL_16939",
"LABEL_1694",
"LABEL_16940",
"LABEL_16941",
"LABEL_16942",
"LABEL_16943",
"LABEL_16944",
"LABEL_16945",
"LABEL_16946",
"LABEL_16947",
"LABEL_16948",
"LABEL_16949",
"LABEL_1695",
"LABEL_16950",
"LABEL_16951",
"LABEL_16952",
"LABEL_16953",
"LABEL_16954",
"LABEL_16955",
"LABEL_16956",
"LABEL_16957",
"LABEL_16958",
"LABEL_16959",
"LABEL_1696",
"LABEL_16960",
"LABEL_16961",
"LABEL_16962",
"LABEL_16963",
"LABEL_16964",
"LABEL_16965",
"LABEL_16966",
"LABEL_16967",
"LABEL_16968",
"LABEL_16969",
"LABEL_1697",
"LABEL_16970",
"LABEL_16971",
"LABEL_16972",
"LABEL_16973",
"LABEL_16974",
"LABEL_16975",
"LABEL_16976",
"LABEL_16977",
"LABEL_16978",
"LABEL_16979",
"LABEL_1698",
"LABEL_16980",
"LABEL_16981",
"LABEL_16982",
"LABEL_16983",
"LABEL_16984",
"LABEL_16985",
"LABEL_16986",
"LABEL_16987",
"LABEL_16988",
"LABEL_16989",
"LABEL_1699",
"LABEL_16990",
"LABEL_16991",
"LABEL_16992",
"LABEL_16993",
"LABEL_16994",
"LABEL_16995",
"LABEL_16996",
"LABEL_16997",
"LABEL_16998",
"LABEL_16999",
"LABEL_17",
"LABEL_170",
"LABEL_1700",
"LABEL_17000",
"LABEL_17001",
"LABEL_17002",
"LABEL_17003",
"LABEL_17004",
"LABEL_17005",
"LABEL_17006",
"LABEL_17007",
"LABEL_17008",
"LABEL_17009",
"LABEL_1701",
"LABEL_17010",
"LABEL_17011",
"LABEL_17012",
"LABEL_17013",
"LABEL_17014",
"LABEL_17015",
"LABEL_17016",
"LABEL_17017",
"LABEL_17018",
"LABEL_17019",
"LABEL_1702",
"LABEL_17020",
"LABEL_17021",
"LABEL_17022",
"LABEL_17023",
"LABEL_17024",
"LABEL_17025",
"LABEL_17026",
"LABEL_17027",
"LABEL_17028",
"LABEL_17029",
"LABEL_1703",
"LABEL_17030",
"LABEL_17031",
"LABEL_17032",
"LABEL_17033",
"LABEL_17034",
"LABEL_17035",
"LABEL_17036",
"LABEL_17037",
"LABEL_17038",
"LABEL_17039",
"LABEL_1704",
"LABEL_17040",
"LABEL_17041",
"LABEL_17042",
"LABEL_17043",
"LABEL_17044",
"LABEL_17045",
"LABEL_17046",
"LABEL_17047",
"LABEL_17048",
"LABEL_17049",
"LABEL_1705",
"LABEL_17050",
"LABEL_17051",
"LABEL_17052",
"LABEL_17053",
"LABEL_17054",
"LABEL_17055",
"LABEL_17056",
"LABEL_17057",
"LABEL_17058",
"LABEL_17059",
"LABEL_1706",
"LABEL_17060",
"LABEL_17061",
"LABEL_17062",
"LABEL_17063",
"LABEL_17064",
"LABEL_17065",
"LABEL_17066",
"LABEL_17067",
"LABEL_17068",
"LABEL_17069",
"LABEL_1707",
"LABEL_17070",
"LABEL_17071",
"LABEL_17072",
"LABEL_17073",
"LABEL_17074",
"LABEL_17075",
"LABEL_17076",
"LABEL_17077",
"LABEL_17078",
"LABEL_17079",
"LABEL_1708",
"LABEL_17080",
"LABEL_17081",
"LABEL_17082",
"LABEL_17083",
"LABEL_17084",
"LABEL_17085",
"LABEL_17086",
"LABEL_17087",
"LABEL_17088",
"LABEL_17089",
"LABEL_1709",
"LABEL_17090",
"LABEL_17091",
"LABEL_17092",
"LABEL_17093",
"LABEL_17094",
"LABEL_17095",
"LABEL_17096",
"LABEL_17097",
"LABEL_17098",
"LABEL_17099",
"LABEL_171",
"LABEL_1710",
"LABEL_17100",
"LABEL_17101",
"LABEL_17102",
"LABEL_17103",
"LABEL_17104",
"LABEL_17105",
"LABEL_17106",
"LABEL_17107",
"LABEL_17108",
"LABEL_17109",
"LABEL_1711",
"LABEL_17110",
"LABEL_17111",
"LABEL_17112",
"LABEL_17113",
"LABEL_17114",
"LABEL_17115",
"LABEL_17116",
"LABEL_17117",
"LABEL_17118",
"LABEL_17119",
"LABEL_1712",
"LABEL_17120",
"LABEL_17121",
"LABEL_17122",
"LABEL_17123",
"LABEL_17124",
"LABEL_17125",
"LABEL_17126",
"LABEL_17127",
"LABEL_17128",
"LABEL_17129",
"LABEL_1713",
"LABEL_17130",
"LABEL_17131",
"LABEL_17132",
"LABEL_17133",
"LABEL_17134",
"LABEL_17135",
"LABEL_17136",
"LABEL_17137",
"LABEL_17138",
"LABEL_17139",
"LABEL_1714",
"LABEL_17140",
"LABEL_17141",
"LABEL_17142",
"LABEL_17143",
"LABEL_17144",
"LABEL_17145",
"LABEL_17146",
"LABEL_17147",
"LABEL_17148",
"LABEL_17149",
"LABEL_1715",
"LABEL_17150",
"LABEL_17151",
"LABEL_17152",
"LABEL_17153",
"LABEL_17154",
"LABEL_17155",
"LABEL_17156",
"LABEL_17157",
"LABEL_17158",
"LABEL_17159",
"LABEL_1716",
"LABEL_17160",
"LABEL_17161",
"LABEL_17162",
"LABEL_17163",
"LABEL_17164",
"LABEL_17165",
"LABEL_17166",
"LABEL_17167",
"LABEL_17168",
"LABEL_17169",
"LABEL_1717",
"LABEL_17170",
"LABEL_17171",
"LABEL_17172",
"LABEL_17173",
"LABEL_17174",
"LABEL_17175",
"LABEL_17176",
"LABEL_17177",
"LABEL_17178",
"LABEL_17179",
"LABEL_1718",
"LABEL_17180",
"LABEL_17181",
"LABEL_17182",
"LABEL_17183",
"LABEL_17184",
"LABEL_17185",
"LABEL_17186",
"LABEL_17187",
"LABEL_17188",
"LABEL_17189",
"LABEL_1719",
"LABEL_17190",
"LABEL_17191",
"LABEL_17192",
"LABEL_17193",
"LABEL_17194",
"LABEL_17195",
"LABEL_17196",
"LABEL_17197",
"LABEL_17198",
"LABEL_17199",
"LABEL_172",
"LABEL_1720",
"LABEL_17200",
"LABEL_17201",
"LABEL_17202",
"LABEL_17203",
"LABEL_17204",
"LABEL_17205",
"LABEL_17206",
"LABEL_17207",
"LABEL_17208",
"LABEL_17209",
"LABEL_1721",
"LABEL_17210",
"LABEL_17211",
"LABEL_17212",
"LABEL_17213",
"LABEL_17214",
"LABEL_17215",
"LABEL_17216",
"LABEL_17217",
"LABEL_17218",
"LABEL_17219",
"LABEL_1722",
"LABEL_17220",
"LABEL_17221",
"LABEL_17222",
"LABEL_17223",
"LABEL_17224",
"LABEL_17225",
"LABEL_17226",
"LABEL_17227",
"LABEL_17228",
"LABEL_17229",
"LABEL_1723",
"LABEL_17230",
"LABEL_17231",
"LABEL_17232",
"LABEL_17233",
"LABEL_17234",
"LABEL_17235",
"LABEL_17236",
"LABEL_17237",
"LABEL_17238",
"LABEL_17239",
"LABEL_1724",
"LABEL_17240",
"LABEL_17241",
"LABEL_17242",
"LABEL_17243",
"LABEL_17244",
"LABEL_17245",
"LABEL_17246",
"LABEL_17247",
"LABEL_17248",
"LABEL_17249",
"LABEL_1725",
"LABEL_17250",
"LABEL_17251",
"LABEL_17252",
"LABEL_17253",
"LABEL_17254",
"LABEL_17255",
"LABEL_17256",
"LABEL_17257",
"LABEL_17258",
"LABEL_17259",
"LABEL_1726",
"LABEL_17260",
"LABEL_17261",
"LABEL_17262",
"LABEL_17263",
"LABEL_17264",
"LABEL_17265",
"LABEL_17266",
"LABEL_17267",
"LABEL_17268",
"LABEL_17269",
"LABEL_1727",
"LABEL_17270",
"LABEL_17271",
"LABEL_17272",
"LABEL_17273",
"LABEL_17274",
"LABEL_17275",
"LABEL_17276",
"LABEL_17277",
"LABEL_17278",
"LABEL_17279",
"LABEL_1728",
"LABEL_17280",
"LABEL_17281",
"LABEL_17282",
"LABEL_17283",
"LABEL_17284",
"LABEL_17285",
"LABEL_17286",
"LABEL_17287",
"LABEL_17288",
"LABEL_17289",
"LABEL_1729",
"LABEL_17290",
"LABEL_17291",
"LABEL_17292",
"LABEL_17293",
"LABEL_17294",
"LABEL_17295",
"LABEL_17296",
"LABEL_17297",
"LABEL_17298",
"LABEL_17299",
"LABEL_173",
"LABEL_1730",
"LABEL_17300",
"LABEL_17301",
"LABEL_17302",
"LABEL_17303",
"LABEL_17304",
"LABEL_17305",
"LABEL_17306",
"LABEL_17307",
"LABEL_17308",
"LABEL_17309",
"LABEL_1731",
"LABEL_17310",
"LABEL_17311",
"LABEL_17312",
"LABEL_17313",
"LABEL_17314",
"LABEL_17315",
"LABEL_17316",
"LABEL_17317",
"LABEL_17318",
"LABEL_17319",
"LABEL_1732",
"LABEL_17320",
"LABEL_17321",
"LABEL_17322",
"LABEL_17323",
"LABEL_17324",
"LABEL_17325",
"LABEL_17326",
"LABEL_17327",
"LABEL_17328",
"LABEL_17329",
"LABEL_1733",
"LABEL_17330",
"LABEL_17331",
"LABEL_17332",
"LABEL_17333",
"LABEL_17334",
"LABEL_17335",
"LABEL_17336",
"LABEL_17337",
"LABEL_17338",
"LABEL_17339",
"LABEL_1734",
"LABEL_17340",
"LABEL_17341",
"LABEL_17342",
"LABEL_17343",
"LABEL_17344",
"LABEL_17345",
"LABEL_17346",
"LABEL_17347",
"LABEL_17348",
"LABEL_17349",
"LABEL_1735",
"LABEL_17350",
"LABEL_17351",
"LABEL_17352",
"LABEL_17353",
"LABEL_17354",
"LABEL_17355",
"LABEL_17356",
"LABEL_17357",
"LABEL_17358",
"LABEL_17359",
"LABEL_1736",
"LABEL_17360",
"LABEL_17361",
"LABEL_17362",
"LABEL_17363",
"LABEL_17364",
"LABEL_17365",
"LABEL_17366",
"LABEL_17367",
"LABEL_17368",
"LABEL_17369",
"LABEL_1737",
"LABEL_17370",
"LABEL_17371",
"LABEL_17372",
"LABEL_17373",
"LABEL_17374",
"LABEL_17375",
"LABEL_17376",
"LABEL_17377",
"LABEL_17378",
"LABEL_17379",
"LABEL_1738",
"LABEL_17380",
"LABEL_17381",
"LABEL_17382",
"LABEL_17383",
"LABEL_17384",
"LABEL_17385",
"LABEL_17386",
"LABEL_17387",
"LABEL_17388",
"LABEL_17389",
"LABEL_1739",
"LABEL_17390",
"LABEL_17391",
"LABEL_17392",
"LABEL_17393",
"LABEL_17394",
"LABEL_17395",
"LABEL_17396",
"LABEL_17397",
"LABEL_17398",
"LABEL_17399",
"LABEL_174",
"LABEL_1740",
"LABEL_17400",
"LABEL_17401",
"LABEL_17402",
"LABEL_17403",
"LABEL_17404",
"LABEL_17405",
"LABEL_17406",
"LABEL_17407",
"LABEL_17408",
"LABEL_17409",
"LABEL_1741",
"LABEL_17410",
"LABEL_17411",
"LABEL_17412",
"LABEL_17413",
"LABEL_17414",
"LABEL_17415",
"LABEL_17416",
"LABEL_17417",
"LABEL_17418",
"LABEL_17419",
"LABEL_1742",
"LABEL_17420",
"LABEL_17421",
"LABEL_17422",
"LABEL_17423",
"LABEL_17424",
"LABEL_17425",
"LABEL_17426",
"LABEL_17427",
"LABEL_17428",
"LABEL_17429",
"LABEL_1743",
"LABEL_17430",
"LABEL_17431",
"LABEL_17432",
"LABEL_17433",
"LABEL_17434",
"LABEL_17435",
"LABEL_17436",
"LABEL_17437",
"LABEL_17438",
"LABEL_17439",
"LABEL_1744",
"LABEL_17440",
"LABEL_17441",
"LABEL_17442",
"LABEL_17443",
"LABEL_17444",
"LABEL_17445",
"LABEL_17446",
"LABEL_17447",
"LABEL_17448",
"LABEL_17449",
"LABEL_1745",
"LABEL_17450",
"LABEL_17451",
"LABEL_17452",
"LABEL_17453",
"LABEL_17454",
"LABEL_17455",
"LABEL_17456",
"LABEL_17457",
"LABEL_17458",
"LABEL_17459",
"LABEL_1746",
"LABEL_17460",
"LABEL_17461",
"LABEL_17462",
"LABEL_17463",
"LABEL_17464",
"LABEL_17465",
"LABEL_17466",
"LABEL_17467",
"LABEL_17468",
"LABEL_17469",
"LABEL_1747",
"LABEL_17470",
"LABEL_17471",
"LABEL_17472",
"LABEL_17473",
"LABEL_17474",
"LABEL_17475",
"LABEL_17476",
"LABEL_17477",
"LABEL_17478",
"LABEL_17479",
"LABEL_1748",
"LABEL_17480",
"LABEL_17481",
"LABEL_17482",
"LABEL_17483",
"LABEL_17484",
"LABEL_17485",
"LABEL_17486",
"LABEL_17487",
"LABEL_17488",
"LABEL_17489",
"LABEL_1749",
"LABEL_17490",
"LABEL_17491",
"LABEL_17492",
"LABEL_17493",
"LABEL_17494",
"LABEL_17495",
"LABEL_17496",
"LABEL_17497",
"LABEL_17498",
"LABEL_17499",
"LABEL_175",
"LABEL_1750",
"LABEL_17500",
"LABEL_17501",
"LABEL_17502",
"LABEL_17503",
"LABEL_17504",
"LABEL_17505",
"LABEL_17506",
"LABEL_17507",
"LABEL_17508",
"LABEL_17509",
"LABEL_1751",
"LABEL_17510",
"LABEL_17511",
"LABEL_17512",
"LABEL_17513",
"LABEL_17514",
"LABEL_17515",
"LABEL_17516",
"LABEL_17517",
"LABEL_17518",
"LABEL_17519",
"LABEL_1752",
"LABEL_17520",
"LABEL_17521",
"LABEL_17522",
"LABEL_17523",
"LABEL_17524",
"LABEL_17525",
"LABEL_17526",
"LABEL_17527",
"LABEL_17528",
"LABEL_17529",
"LABEL_1753",
"LABEL_17530",
"LABEL_17531",
"LABEL_17532",
"LABEL_17533",
"LABEL_17534",
"LABEL_17535",
"LABEL_17536",
"LABEL_17537",
"LABEL_17538",
"LABEL_17539",
"LABEL_1754",
"LABEL_17540",
"LABEL_17541",
"LABEL_17542",
"LABEL_17543",
"LABEL_17544",
"LABEL_17545",
"LABEL_17546",
"LABEL_17547",
"LABEL_17548",
"LABEL_17549",
"LABEL_1755",
"LABEL_17550",
"LABEL_17551",
"LABEL_17552",
"LABEL_17553",
"LABEL_17554",
"LABEL_17555",
"LABEL_17556",
"LABEL_17557",
"LABEL_17558",
"LABEL_17559",
"LABEL_1756",
"LABEL_17560",
"LABEL_17561",
"LABEL_17562",
"LABEL_17563",
"LABEL_17564",
"LABEL_17565",
"LABEL_17566",
"LABEL_17567",
"LABEL_17568",
"LABEL_17569",
"LABEL_1757",
"LABEL_17570",
"LABEL_17571",
"LABEL_17572",
"LABEL_17573",
"LABEL_17574",
"LABEL_17575",
"LABEL_17576",
"LABEL_17577",
"LABEL_17578",
"LABEL_17579",
"LABEL_1758",
"LABEL_17580",
"LABEL_17581",
"LABEL_17582",
"LABEL_17583",
"LABEL_17584",
"LABEL_17585",
"LABEL_17586",
"LABEL_17587",
"LABEL_17588",
"LABEL_17589",
"LABEL_1759",
"LABEL_17590",
"LABEL_17591",
"LABEL_17592",
"LABEL_17593",
"LABEL_17594",
"LABEL_17595",
"LABEL_17596",
"LABEL_17597",
"LABEL_17598",
"LABEL_17599",
"LABEL_176",
"LABEL_1760",
"LABEL_17600",
"LABEL_17601",
"LABEL_17602",
"LABEL_17603",
"LABEL_17604",
"LABEL_17605",
"LABEL_17606",
"LABEL_17607",
"LABEL_17608",
"LABEL_17609",
"LABEL_1761",
"LABEL_17610",
"LABEL_17611",
"LABEL_17612",
"LABEL_17613",
"LABEL_17614",
"LABEL_17615",
"LABEL_17616",
"LABEL_17617",
"LABEL_17618",
"LABEL_17619",
"LABEL_1762",
"LABEL_17620",
"LABEL_17621",
"LABEL_17622",
"LABEL_17623",
"LABEL_17624",
"LABEL_17625",
"LABEL_17626",
"LABEL_17627",
"LABEL_17628",
"LABEL_17629",
"LABEL_1763",
"LABEL_17630",
"LABEL_17631",
"LABEL_17632",
"LABEL_17633",
"LABEL_17634",
"LABEL_17635",
"LABEL_17636",
"LABEL_17637",
"LABEL_17638",
"LABEL_17639",
"LABEL_1764",
"LABEL_17640",
"LABEL_17641",
"LABEL_17642",
"LABEL_17643",
"LABEL_17644",
"LABEL_17645",
"LABEL_17646",
"LABEL_17647",
"LABEL_17648",
"LABEL_17649",
"LABEL_1765",
"LABEL_17650",
"LABEL_17651",
"LABEL_17652",
"LABEL_17653",
"LABEL_17654",
"LABEL_17655",
"LABEL_17656",
"LABEL_17657",
"LABEL_17658",
"LABEL_17659",
"LABEL_1766",
"LABEL_17660",
"LABEL_17661",
"LABEL_17662",
"LABEL_17663",
"LABEL_17664",
"LABEL_17665",
"LABEL_17666",
"LABEL_17667",
"LABEL_17668",
"LABEL_17669",
"LABEL_1767",
"LABEL_17670",
"LABEL_17671",
"LABEL_17672",
"LABEL_17673",
"LABEL_17674",
"LABEL_17675",
"LABEL_17676",
"LABEL_17677",
"LABEL_17678",
"LABEL_17679",
"LABEL_1768",
"LABEL_17680",
"LABEL_17681",
"LABEL_17682",
"LABEL_17683",
"LABEL_17684",
"LABEL_17685",
"LABEL_17686",
"LABEL_17687",
"LABEL_17688",
"LABEL_17689",
"LABEL_1769",
"LABEL_17690",
"LABEL_17691",
"LABEL_17692",
"LABEL_17693",
"LABEL_17694",
"LABEL_17695",
"LABEL_17696",
"LABEL_17697",
"LABEL_17698",
"LABEL_17699",
"LABEL_177",
"LABEL_1770",
"LABEL_17700",
"LABEL_17701",
"LABEL_17702",
"LABEL_17703",
"LABEL_17704",
"LABEL_17705",
"LABEL_17706",
"LABEL_17707",
"LABEL_17708",
"LABEL_17709",
"LABEL_1771",
"LABEL_17710",
"LABEL_17711",
"LABEL_17712",
"LABEL_17713",
"LABEL_17714",
"LABEL_17715",
"LABEL_17716",
"LABEL_17717",
"LABEL_17718",
"LABEL_17719",
"LABEL_1772",
"LABEL_17720",
"LABEL_17721",
"LABEL_17722",
"LABEL_17723",
"LABEL_17724",
"LABEL_17725",
"LABEL_17726",
"LABEL_17727",
"LABEL_17728",
"LABEL_17729",
"LABEL_1773",
"LABEL_17730",
"LABEL_17731",
"LABEL_17732",
"LABEL_17733",
"LABEL_17734",
"LABEL_17735",
"LABEL_17736",
"LABEL_17737",
"LABEL_17738",
"LABEL_17739",
"LABEL_1774",
"LABEL_17740",
"LABEL_17741",
"LABEL_17742",
"LABEL_17743",
"LABEL_17744",
"LABEL_17745",
"LABEL_17746",
"LABEL_17747",
"LABEL_17748",
"LABEL_17749",
"LABEL_1775",
"LABEL_17750",
"LABEL_17751",
"LABEL_17752",
"LABEL_17753",
"LABEL_17754",
"LABEL_17755",
"LABEL_17756",
"LABEL_17757",
"LABEL_17758",
"LABEL_17759",
"LABEL_1776",
"LABEL_17760",
"LABEL_17761",
"LABEL_17762",
"LABEL_17763",
"LABEL_17764",
"LABEL_17765",
"LABEL_17766",
"LABEL_17767",
"LABEL_17768",
"LABEL_17769",
"LABEL_1777",
"LABEL_17770",
"LABEL_17771",
"LABEL_17772",
"LABEL_17773",
"LABEL_17774",
"LABEL_17775",
"LABEL_17776",
"LABEL_17777",
"LABEL_17778",
"LABEL_17779",
"LABEL_1778",
"LABEL_17780",
"LABEL_17781",
"LABEL_17782",
"LABEL_17783",
"LABEL_17784",
"LABEL_17785",
"LABEL_17786",
"LABEL_17787",
"LABEL_17788",
"LABEL_17789",
"LABEL_1779",
"LABEL_17790",
"LABEL_17791",
"LABEL_17792",
"LABEL_17793",
"LABEL_17794",
"LABEL_17795",
"LABEL_17796",
"LABEL_17797",
"LABEL_17798",
"LABEL_17799",
"LABEL_178",
"LABEL_1780",
"LABEL_17800",
"LABEL_17801",
"LABEL_17802",
"LABEL_17803",
"LABEL_17804",
"LABEL_17805",
"LABEL_17806",
"LABEL_17807",
"LABEL_17808",
"LABEL_17809",
"LABEL_1781",
"LABEL_17810",
"LABEL_17811",
"LABEL_17812",
"LABEL_17813",
"LABEL_17814",
"LABEL_17815",
"LABEL_17816",
"LABEL_17817",
"LABEL_17818",
"LABEL_17819",
"LABEL_1782",
"LABEL_17820",
"LABEL_17821",
"LABEL_17822",
"LABEL_17823",
"LABEL_17824",
"LABEL_17825",
"LABEL_17826",
"LABEL_17827",
"LABEL_17828",
"LABEL_17829",
"LABEL_1783",
"LABEL_17830",
"LABEL_17831",
"LABEL_17832",
"LABEL_17833",
"LABEL_17834",
"LABEL_17835",
"LABEL_17836",
"LABEL_17837",
"LABEL_17838",
"LABEL_17839",
"LABEL_1784",
"LABEL_17840",
"LABEL_17841",
"LABEL_17842",
"LABEL_17843",
"LABEL_17844",
"LABEL_17845",
"LABEL_17846",
"LABEL_17847",
"LABEL_17848",
"LABEL_17849",
"LABEL_1785",
"LABEL_17850",
"LABEL_17851",
"LABEL_17852",
"LABEL_17853",
"LABEL_17854",
"LABEL_17855",
"LABEL_17856",
"LABEL_17857",
"LABEL_17858",
"LABEL_17859",
"LABEL_1786",
"LABEL_17860",
"LABEL_17861",
"LABEL_17862",
"LABEL_17863",
"LABEL_17864",
"LABEL_17865",
"LABEL_17866",
"LABEL_17867",
"LABEL_17868",
"LABEL_17869",
"LABEL_1787",
"LABEL_17870",
"LABEL_17871",
"LABEL_17872",
"LABEL_17873",
"LABEL_17874",
"LABEL_17875",
"LABEL_17876",
"LABEL_17877",
"LABEL_17878",
"LABEL_17879",
"LABEL_1788",
"LABEL_17880",
"LABEL_17881",
"LABEL_17882",
"LABEL_17883",
"LABEL_17884",
"LABEL_17885",
"LABEL_17886",
"LABEL_17887",
"LABEL_17888",
"LABEL_17889",
"LABEL_1789",
"LABEL_17890",
"LABEL_17891",
"LABEL_17892",
"LABEL_17893",
"LABEL_17894",
"LABEL_17895",
"LABEL_17896",
"LABEL_17897",
"LABEL_17898",
"LABEL_17899",
"LABEL_179",
"LABEL_1790",
"LABEL_17900",
"LABEL_17901",
"LABEL_17902",
"LABEL_17903",
"LABEL_17904",
"LABEL_17905",
"LABEL_17906",
"LABEL_17907",
"LABEL_17908",
"LABEL_17909",
"LABEL_1791",
"LABEL_17910",
"LABEL_17911",
"LABEL_17912",
"LABEL_17913",
"LABEL_17914",
"LABEL_17915",
"LABEL_17916",
"LABEL_17917",
"LABEL_17918",
"LABEL_17919",
"LABEL_1792",
"LABEL_17920",
"LABEL_17921",
"LABEL_17922",
"LABEL_17923",
"LABEL_17924",
"LABEL_17925",
"LABEL_17926",
"LABEL_17927",
"LABEL_17928",
"LABEL_17929",
"LABEL_1793",
"LABEL_17930",
"LABEL_17931",
"LABEL_17932",
"LABEL_17933",
"LABEL_17934",
"LABEL_17935",
"LABEL_17936",
"LABEL_17937",
"LABEL_17938",
"LABEL_17939",
"LABEL_1794",
"LABEL_17940",
"LABEL_17941",
"LABEL_17942",
"LABEL_17943",
"LABEL_17944",
"LABEL_17945",
"LABEL_17946",
"LABEL_17947",
"LABEL_17948",
"LABEL_17949",
"LABEL_1795",
"LABEL_17950",
"LABEL_17951",
"LABEL_17952",
"LABEL_17953",
"LABEL_17954",
"LABEL_17955",
"LABEL_17956",
"LABEL_17957",
"LABEL_17958",
"LABEL_17959",
"LABEL_1796",
"LABEL_17960",
"LABEL_17961",
"LABEL_17962",
"LABEL_17963",
"LABEL_17964",
"LABEL_17965",
"LABEL_17966",
"LABEL_17967",
"LABEL_17968",
"LABEL_17969",
"LABEL_1797",
"LABEL_17970",
"LABEL_17971",
"LABEL_17972",
"LABEL_17973",
"LABEL_17974",
"LABEL_17975",
"LABEL_17976",
"LABEL_17977",
"LABEL_17978",
"LABEL_17979",
"LABEL_1798",
"LABEL_17980",
"LABEL_17981",
"LABEL_17982",
"LABEL_17983",
"LABEL_17984",
"LABEL_17985",
"LABEL_17986",
"LABEL_17987",
"LABEL_17988",
"LABEL_17989",
"LABEL_1799",
"LABEL_17990",
"LABEL_17991",
"LABEL_17992",
"LABEL_17993",
"LABEL_17994",
"LABEL_17995",
"LABEL_17996",
"LABEL_17997",
"LABEL_17998",
"LABEL_17999",
"LABEL_18",
"LABEL_180",
"LABEL_1800",
"LABEL_18000",
"LABEL_18001",
"LABEL_18002",
"LABEL_18003",
"LABEL_18004",
"LABEL_18005",
"LABEL_18006",
"LABEL_18007",
"LABEL_18008",
"LABEL_18009",
"LABEL_1801",
"LABEL_18010",
"LABEL_18011",
"LABEL_18012",
"LABEL_18013",
"LABEL_18014",
"LABEL_18015",
"LABEL_18016",
"LABEL_18017",
"LABEL_18018",
"LABEL_18019",
"LABEL_1802",
"LABEL_18020",
"LABEL_18021",
"LABEL_18022",
"LABEL_18023",
"LABEL_18024",
"LABEL_18025",
"LABEL_18026",
"LABEL_18027",
"LABEL_18028",
"LABEL_18029",
"LABEL_1803",
"LABEL_18030",
"LABEL_18031",
"LABEL_18032",
"LABEL_18033",
"LABEL_18034",
"LABEL_18035",
"LABEL_18036",
"LABEL_18037",
"LABEL_18038",
"LABEL_18039",
"LABEL_1804",
"LABEL_18040",
"LABEL_18041",
"LABEL_18042",
"LABEL_18043",
"LABEL_18044",
"LABEL_18045",
"LABEL_18046",
"LABEL_18047",
"LABEL_18048",
"LABEL_18049",
"LABEL_1805",
"LABEL_18050",
"LABEL_18051",
"LABEL_18052",
"LABEL_18053",
"LABEL_18054",
"LABEL_18055",
"LABEL_18056",
"LABEL_18057",
"LABEL_18058",
"LABEL_18059",
"LABEL_1806",
"LABEL_18060",
"LABEL_18061",
"LABEL_18062",
"LABEL_18063",
"LABEL_18064",
"LABEL_18065",
"LABEL_18066",
"LABEL_18067",
"LABEL_18068",
"LABEL_18069",
"LABEL_1807",
"LABEL_18070",
"LABEL_18071",
"LABEL_18072",
"LABEL_18073",
"LABEL_18074",
"LABEL_18075",
"LABEL_18076",
"LABEL_18077",
"LABEL_18078",
"LABEL_18079",
"LABEL_1808",
"LABEL_18080",
"LABEL_18081",
"LABEL_18082",
"LABEL_18083",
"LABEL_18084",
"LABEL_18085",
"LABEL_18086",
"LABEL_18087",
"LABEL_18088",
"LABEL_18089",
"LABEL_1809",
"LABEL_18090",
"LABEL_18091",
"LABEL_18092",
"LABEL_18093",
"LABEL_18094",
"LABEL_18095",
"LABEL_18096",
"LABEL_18097",
"LABEL_18098",
"LABEL_18099",
"LABEL_181",
"LABEL_1810",
"LABEL_18100",
"LABEL_18101",
"LABEL_18102",
"LABEL_18103",
"LABEL_18104",
"LABEL_18105",
"LABEL_18106",
"LABEL_18107",
"LABEL_18108",
"LABEL_18109",
"LABEL_1811",
"LABEL_18110",
"LABEL_18111",
"LABEL_18112",
"LABEL_18113",
"LABEL_18114",
"LABEL_18115",
"LABEL_18116",
"LABEL_18117",
"LABEL_18118",
"LABEL_18119",
"LABEL_1812",
"LABEL_18120",
"LABEL_18121",
"LABEL_18122",
"LABEL_18123",
"LABEL_18124",
"LABEL_18125",
"LABEL_18126",
"LABEL_18127",
"LABEL_18128",
"LABEL_18129",
"LABEL_1813",
"LABEL_18130",
"LABEL_18131",
"LABEL_18132",
"LABEL_18133",
"LABEL_18134",
"LABEL_18135",
"LABEL_18136",
"LABEL_18137",
"LABEL_18138",
"LABEL_18139",
"LABEL_1814",
"LABEL_18140",
"LABEL_18141",
"LABEL_18142",
"LABEL_18143",
"LABEL_18144",
"LABEL_18145",
"LABEL_18146",
"LABEL_18147",
"LABEL_18148",
"LABEL_18149",
"LABEL_1815",
"LABEL_18150",
"LABEL_18151",
"LABEL_18152",
"LABEL_18153",
"LABEL_18154",
"LABEL_18155",
"LABEL_18156",
"LABEL_18157",
"LABEL_18158",
"LABEL_18159",
"LABEL_1816",
"LABEL_18160",
"LABEL_18161",
"LABEL_18162",
"LABEL_18163",
"LABEL_18164",
"LABEL_18165",
"LABEL_18166",
"LABEL_18167",
"LABEL_18168",
"LABEL_18169",
"LABEL_1817",
"LABEL_18170",
"LABEL_18171",
"LABEL_18172",
"LABEL_18173",
"LABEL_18174",
"LABEL_18175",
"LABEL_18176",
"LABEL_18177",
"LABEL_18178",
"LABEL_18179",
"LABEL_1818",
"LABEL_18180",
"LABEL_18181",
"LABEL_18182",
"LABEL_18183",
"LABEL_18184",
"LABEL_18185",
"LABEL_18186",
"LABEL_18187",
"LABEL_18188",
"LABEL_18189",
"LABEL_1819",
"LABEL_18190",
"LABEL_18191",
"LABEL_18192",
"LABEL_18193",
"LABEL_18194",
"LABEL_18195",
"LABEL_18196",
"LABEL_18197",
"LABEL_18198",
"LABEL_18199",
"LABEL_182",
"LABEL_1820",
"LABEL_18200",
"LABEL_18201",
"LABEL_18202",
"LABEL_18203",
"LABEL_18204",
"LABEL_18205",
"LABEL_18206",
"LABEL_18207",
"LABEL_18208",
"LABEL_18209",
"LABEL_1821",
"LABEL_18210",
"LABEL_18211",
"LABEL_18212",
"LABEL_18213",
"LABEL_18214",
"LABEL_18215",
"LABEL_18216",
"LABEL_18217",
"LABEL_18218",
"LABEL_18219",
"LABEL_1822",
"LABEL_18220",
"LABEL_18221",
"LABEL_18222",
"LABEL_18223",
"LABEL_18224",
"LABEL_18225",
"LABEL_18226",
"LABEL_18227",
"LABEL_18228",
"LABEL_18229",
"LABEL_1823",
"LABEL_18230",
"LABEL_18231",
"LABEL_18232",
"LABEL_18233",
"LABEL_18234",
"LABEL_18235",
"LABEL_18236",
"LABEL_18237",
"LABEL_18238",
"LABEL_18239",
"LABEL_1824",
"LABEL_18240",
"LABEL_18241",
"LABEL_18242",
"LABEL_18243",
"LABEL_18244",
"LABEL_18245",
"LABEL_18246",
"LABEL_18247",
"LABEL_18248",
"LABEL_18249",
"LABEL_1825",
"LABEL_18250",
"LABEL_18251",
"LABEL_18252",
"LABEL_18253",
"LABEL_18254",
"LABEL_18255",
"LABEL_18256",
"LABEL_18257",
"LABEL_18258",
"LABEL_18259",
"LABEL_1826",
"LABEL_18260",
"LABEL_18261",
"LABEL_18262",
"LABEL_18263",
"LABEL_18264",
"LABEL_18265",
"LABEL_18266",
"LABEL_18267",
"LABEL_18268",
"LABEL_18269",
"LABEL_1827",
"LABEL_18270",
"LABEL_18271",
"LABEL_18272",
"LABEL_18273",
"LABEL_18274",
"LABEL_18275",
"LABEL_18276",
"LABEL_18277",
"LABEL_18278",
"LABEL_18279",
"LABEL_1828",
"LABEL_18280",
"LABEL_18281",
"LABEL_18282",
"LABEL_18283",
"LABEL_18284",
"LABEL_18285",
"LABEL_18286",
"LABEL_18287",
"LABEL_18288",
"LABEL_18289",
"LABEL_1829",
"LABEL_18290",
"LABEL_18291",
"LABEL_18292",
"LABEL_18293",
"LABEL_18294",
"LABEL_18295",
"LABEL_18296",
"LABEL_18297",
"LABEL_18298",
"LABEL_18299",
"LABEL_183",
"LABEL_1830",
"LABEL_18300",
"LABEL_18301",
"LABEL_18302",
"LABEL_18303",
"LABEL_18304",
"LABEL_18305",
"LABEL_18306",
"LABEL_18307",
"LABEL_18308",
"LABEL_18309",
"LABEL_1831",
"LABEL_18310",
"LABEL_18311",
"LABEL_18312",
"LABEL_18313",
"LABEL_18314",
"LABEL_18315",
"LABEL_18316",
"LABEL_18317",
"LABEL_18318",
"LABEL_18319",
"LABEL_1832",
"LABEL_18320",
"LABEL_18321",
"LABEL_18322",
"LABEL_18323",
"LABEL_18324",
"LABEL_18325",
"LABEL_18326",
"LABEL_18327",
"LABEL_18328",
"LABEL_18329",
"LABEL_1833",
"LABEL_18330",
"LABEL_18331",
"LABEL_18332",
"LABEL_18333",
"LABEL_18334",
"LABEL_18335",
"LABEL_18336",
"LABEL_18337",
"LABEL_18338",
"LABEL_18339",
"LABEL_1834",
"LABEL_18340",
"LABEL_18341",
"LABEL_18342",
"LABEL_18343",
"LABEL_18344",
"LABEL_18345",
"LABEL_18346",
"LABEL_18347",
"LABEL_18348",
"LABEL_18349",
"LABEL_1835",
"LABEL_18350",
"LABEL_18351",
"LABEL_18352",
"LABEL_18353",
"LABEL_18354",
"LABEL_18355",
"LABEL_18356",
"LABEL_18357",
"LABEL_18358",
"LABEL_18359",
"LABEL_1836",
"LABEL_18360",
"LABEL_18361",
"LABEL_18362",
"LABEL_18363",
"LABEL_18364",
"LABEL_18365",
"LABEL_18366",
"LABEL_18367",
"LABEL_18368",
"LABEL_18369",
"LABEL_1837",
"LABEL_18370",
"LABEL_18371",
"LABEL_18372",
"LABEL_18373",
"LABEL_18374",
"LABEL_18375",
"LABEL_18376",
"LABEL_18377",
"LABEL_18378",
"LABEL_18379",
"LABEL_1838",
"LABEL_18380",
"LABEL_18381",
"LABEL_18382",
"LABEL_18383",
"LABEL_18384",
"LABEL_18385",
"LABEL_18386",
"LABEL_18387",
"LABEL_18388",
"LABEL_18389",
"LABEL_1839",
"LABEL_18390",
"LABEL_18391",
"LABEL_18392",
"LABEL_18393",
"LABEL_18394",
"LABEL_18395",
"LABEL_18396",
"LABEL_18397",
"LABEL_18398",
"LABEL_18399",
"LABEL_184",
"LABEL_1840",
"LABEL_18400",
"LABEL_18401",
"LABEL_18402",
"LABEL_18403",
"LABEL_18404",
"LABEL_18405",
"LABEL_18406",
"LABEL_18407",
"LABEL_18408",
"LABEL_18409",
"LABEL_1841",
"LABEL_18410",
"LABEL_18411",
"LABEL_18412",
"LABEL_18413",
"LABEL_18414",
"LABEL_18415",
"LABEL_18416",
"LABEL_18417",
"LABEL_18418",
"LABEL_18419",
"LABEL_1842",
"LABEL_18420",
"LABEL_18421",
"LABEL_18422",
"LABEL_18423",
"LABEL_18424",
"LABEL_18425",
"LABEL_18426",
"LABEL_18427",
"LABEL_18428",
"LABEL_18429",
"LABEL_1843",
"LABEL_18430",
"LABEL_18431",
"LABEL_18432",
"LABEL_18433",
"LABEL_18434",
"LABEL_18435",
"LABEL_18436",
"LABEL_18437",
"LABEL_18438",
"LABEL_18439",
"LABEL_1844",
"LABEL_18440",
"LABEL_18441",
"LABEL_18442",
"LABEL_18443",
"LABEL_18444",
"LABEL_18445",
"LABEL_18446",
"LABEL_18447",
"LABEL_18448",
"LABEL_18449",
"LABEL_1845",
"LABEL_18450",
"LABEL_18451",
"LABEL_18452",
"LABEL_18453",
"LABEL_18454",
"LABEL_18455",
"LABEL_18456",
"LABEL_18457",
"LABEL_18458",
"LABEL_18459",
"LABEL_1846",
"LABEL_18460",
"LABEL_18461",
"LABEL_18462",
"LABEL_18463",
"LABEL_18464",
"LABEL_18465",
"LABEL_18466",
"LABEL_18467",
"LABEL_18468",
"LABEL_18469",
"LABEL_1847",
"LABEL_18470",
"LABEL_18471",
"LABEL_18472",
"LABEL_18473",
"LABEL_18474",
"LABEL_18475",
"LABEL_18476",
"LABEL_18477",
"LABEL_18478",
"LABEL_18479",
"LABEL_1848",
"LABEL_18480",
"LABEL_18481",
"LABEL_18482",
"LABEL_18483",
"LABEL_18484",
"LABEL_18485",
"LABEL_18486",
"LABEL_18487",
"LABEL_18488",
"LABEL_18489",
"LABEL_1849",
"LABEL_18490",
"LABEL_18491",
"LABEL_18492",
"LABEL_18493",
"LABEL_18494",
"LABEL_18495",
"LABEL_18496",
"LABEL_18497",
"LABEL_18498",
"LABEL_18499",
"LABEL_185",
"LABEL_1850",
"LABEL_18500",
"LABEL_18501",
"LABEL_18502",
"LABEL_18503",
"LABEL_18504",
"LABEL_18505",
"LABEL_18506",
"LABEL_18507",
"LABEL_18508",
"LABEL_18509",
"LABEL_1851",
"LABEL_18510",
"LABEL_18511",
"LABEL_18512",
"LABEL_18513",
"LABEL_18514",
"LABEL_18515",
"LABEL_18516",
"LABEL_18517",
"LABEL_18518",
"LABEL_18519",
"LABEL_1852",
"LABEL_18520",
"LABEL_18521",
"LABEL_18522",
"LABEL_18523",
"LABEL_18524",
"LABEL_18525",
"LABEL_18526",
"LABEL_18527",
"LABEL_18528",
"LABEL_18529",
"LABEL_1853",
"LABEL_18530",
"LABEL_18531",
"LABEL_18532",
"LABEL_18533",
"LABEL_18534",
"LABEL_18535",
"LABEL_18536",
"LABEL_18537",
"LABEL_18538",
"LABEL_18539",
"LABEL_1854",
"LABEL_18540",
"LABEL_18541",
"LABEL_18542",
"LABEL_18543",
"LABEL_18544",
"LABEL_18545",
"LABEL_18546",
"LABEL_18547",
"LABEL_18548",
"LABEL_18549",
"LABEL_1855",
"LABEL_18550",
"LABEL_18551",
"LABEL_18552",
"LABEL_18553",
"LABEL_18554",
"LABEL_18555",
"LABEL_18556",
"LABEL_18557",
"LABEL_18558",
"LABEL_18559",
"LABEL_1856",
"LABEL_18560",
"LABEL_18561",
"LABEL_18562",
"LABEL_18563",
"LABEL_18564",
"LABEL_18565",
"LABEL_18566",
"LABEL_18567",
"LABEL_18568",
"LABEL_18569",
"LABEL_1857",
"LABEL_18570",
"LABEL_18571",
"LABEL_18572",
"LABEL_18573",
"LABEL_18574",
"LABEL_18575",
"LABEL_18576",
"LABEL_18577",
"LABEL_18578",
"LABEL_18579",
"LABEL_1858",
"LABEL_18580",
"LABEL_18581",
"LABEL_18582",
"LABEL_18583",
"LABEL_18584",
"LABEL_18585",
"LABEL_18586",
"LABEL_18587",
"LABEL_18588",
"LABEL_18589",
"LABEL_1859",
"LABEL_18590",
"LABEL_18591",
"LABEL_18592",
"LABEL_18593",
"LABEL_18594",
"LABEL_18595",
"LABEL_18596",
"LABEL_18597",
"LABEL_18598",
"LABEL_18599",
"LABEL_186",
"LABEL_1860",
"LABEL_18600",
"LABEL_18601",
"LABEL_18602",
"LABEL_18603",
"LABEL_18604",
"LABEL_18605",
"LABEL_18606",
"LABEL_18607",
"LABEL_18608",
"LABEL_18609",
"LABEL_1861",
"LABEL_18610",
"LABEL_18611",
"LABEL_18612",
"LABEL_18613",
"LABEL_18614",
"LABEL_18615",
"LABEL_18616",
"LABEL_18617",
"LABEL_18618",
"LABEL_18619",
"LABEL_1862",
"LABEL_18620",
"LABEL_18621",
"LABEL_18622",
"LABEL_18623",
"LABEL_18624",
"LABEL_18625",
"LABEL_18626",
"LABEL_18627",
"LABEL_18628",
"LABEL_18629",
"LABEL_1863",
"LABEL_18630",
"LABEL_18631",
"LABEL_18632",
"LABEL_18633",
"LABEL_18634",
"LABEL_18635",
"LABEL_18636",
"LABEL_18637",
"LABEL_18638",
"LABEL_18639",
"LABEL_1864",
"LABEL_18640",
"LABEL_18641",
"LABEL_18642",
"LABEL_18643",
"LABEL_18644",
"LABEL_18645",
"LABEL_18646",
"LABEL_18647",
"LABEL_18648",
"LABEL_18649",
"LABEL_1865",
"LABEL_18650",
"LABEL_18651",
"LABEL_18652",
"LABEL_18653",
"LABEL_18654",
"LABEL_18655",
"LABEL_18656",
"LABEL_18657",
"LABEL_18658",
"LABEL_18659",
"LABEL_1866",
"LABEL_18660",
"LABEL_18661",
"LABEL_18662",
"LABEL_18663",
"LABEL_18664",
"LABEL_18665",
"LABEL_18666",
"LABEL_18667",
"LABEL_18668",
"LABEL_18669",
"LABEL_1867",
"LABEL_18670",
"LABEL_18671",
"LABEL_18672",
"LABEL_18673",
"LABEL_18674",
"LABEL_18675",
"LABEL_18676",
"LABEL_18677",
"LABEL_18678",
"LABEL_18679",
"LABEL_1868",
"LABEL_18680",
"LABEL_18681",
"LABEL_18682",
"LABEL_18683",
"LABEL_18684",
"LABEL_18685",
"LABEL_18686",
"LABEL_18687",
"LABEL_18688",
"LABEL_18689",
"LABEL_1869",
"LABEL_18690",
"LABEL_18691",
"LABEL_18692",
"LABEL_18693",
"LABEL_18694",
"LABEL_18695",
"LABEL_18696",
"LABEL_18697",
"LABEL_18698",
"LABEL_18699",
"LABEL_187",
"LABEL_1870",
"LABEL_18700",
"LABEL_18701",
"LABEL_18702",
"LABEL_18703",
"LABEL_18704",
"LABEL_18705",
"LABEL_18706",
"LABEL_18707",
"LABEL_18708",
"LABEL_18709",
"LABEL_1871",
"LABEL_18710",
"LABEL_18711",
"LABEL_18712",
"LABEL_18713",
"LABEL_18714",
"LABEL_18715",
"LABEL_18716",
"LABEL_18717",
"LABEL_18718",
"LABEL_18719",
"LABEL_1872",
"LABEL_18720",
"LABEL_18721",
"LABEL_18722",
"LABEL_18723",
"LABEL_18724",
"LABEL_18725",
"LABEL_18726",
"LABEL_18727",
"LABEL_18728",
"LABEL_18729",
"LABEL_1873",
"LABEL_18730",
"LABEL_18731",
"LABEL_18732",
"LABEL_18733",
"LABEL_18734",
"LABEL_18735",
"LABEL_18736",
"LABEL_18737",
"LABEL_18738",
"LABEL_18739",
"LABEL_1874",
"LABEL_18740",
"LABEL_18741",
"LABEL_18742",
"LABEL_18743",
"LABEL_18744",
"LABEL_18745",
"LABEL_18746",
"LABEL_18747",
"LABEL_18748",
"LABEL_18749",
"LABEL_1875",
"LABEL_18750",
"LABEL_18751",
"LABEL_18752",
"LABEL_18753",
"LABEL_18754",
"LABEL_18755",
"LABEL_18756",
"LABEL_18757",
"LABEL_18758",
"LABEL_18759",
"LABEL_1876",
"LABEL_18760",
"LABEL_18761",
"LABEL_18762",
"LABEL_18763",
"LABEL_18764",
"LABEL_18765",
"LABEL_18766",
"LABEL_18767",
"LABEL_18768",
"LABEL_18769",
"LABEL_1877",
"LABEL_18770",
"LABEL_18771",
"LABEL_18772",
"LABEL_18773",
"LABEL_18774",
"LABEL_18775",
"LABEL_18776",
"LABEL_18777",
"LABEL_18778",
"LABEL_18779",
"LABEL_1878",
"LABEL_18780",
"LABEL_18781",
"LABEL_18782",
"LABEL_18783",
"LABEL_18784",
"LABEL_18785",
"LABEL_18786",
"LABEL_18787",
"LABEL_18788",
"LABEL_18789",
"LABEL_1879",
"LABEL_18790",
"LABEL_18791",
"LABEL_18792",
"LABEL_18793",
"LABEL_18794",
"LABEL_18795",
"LABEL_18796",
"LABEL_18797",
"LABEL_18798",
"LABEL_18799",
"LABEL_188",
"LABEL_1880",
"LABEL_18800",
"LABEL_18801",
"LABEL_18802",
"LABEL_18803",
"LABEL_18804",
"LABEL_18805",
"LABEL_18806",
"LABEL_18807",
"LABEL_18808",
"LABEL_18809",
"LABEL_1881",
"LABEL_18810",
"LABEL_18811",
"LABEL_18812",
"LABEL_18813",
"LABEL_18814",
"LABEL_18815",
"LABEL_18816",
"LABEL_18817",
"LABEL_18818",
"LABEL_18819",
"LABEL_1882",
"LABEL_18820",
"LABEL_18821",
"LABEL_18822",
"LABEL_18823",
"LABEL_18824",
"LABEL_18825",
"LABEL_18826",
"LABEL_18827",
"LABEL_18828",
"LABEL_18829",
"LABEL_1883",
"LABEL_18830",
"LABEL_18831",
"LABEL_18832",
"LABEL_18833",
"LABEL_18834",
"LABEL_18835",
"LABEL_18836",
"LABEL_18837",
"LABEL_18838",
"LABEL_18839",
"LABEL_1884",
"LABEL_18840",
"LABEL_18841",
"LABEL_18842",
"LABEL_18843",
"LABEL_18844",
"LABEL_18845",
"LABEL_18846",
"LABEL_18847",
"LABEL_18848",
"LABEL_18849",
"LABEL_1885",
"LABEL_18850",
"LABEL_18851",
"LABEL_18852",
"LABEL_18853",
"LABEL_18854",
"LABEL_18855",
"LABEL_18856",
"LABEL_18857",
"LABEL_18858",
"LABEL_18859",
"LABEL_1886",
"LABEL_18860",
"LABEL_18861",
"LABEL_18862",
"LABEL_18863",
"LABEL_18864",
"LABEL_18865",
"LABEL_18866",
"LABEL_18867",
"LABEL_18868",
"LABEL_18869",
"LABEL_1887",
"LABEL_18870",
"LABEL_18871",
"LABEL_18872",
"LABEL_18873",
"LABEL_18874",
"LABEL_18875",
"LABEL_18876",
"LABEL_18877",
"LABEL_18878",
"LABEL_18879",
"LABEL_1888",
"LABEL_18880",
"LABEL_18881",
"LABEL_18882",
"LABEL_18883",
"LABEL_18884",
"LABEL_18885",
"LABEL_18886",
"LABEL_18887",
"LABEL_18888",
"LABEL_18889",
"LABEL_1889",
"LABEL_18890",
"LABEL_18891",
"LABEL_18892",
"LABEL_18893",
"LABEL_18894",
"LABEL_18895",
"LABEL_18896",
"LABEL_18897",
"LABEL_18898",
"LABEL_18899",
"LABEL_189",
"LABEL_1890",
"LABEL_18900",
"LABEL_18901",
"LABEL_18902",
"LABEL_18903",
"LABEL_18904",
"LABEL_18905",
"LABEL_18906",
"LABEL_18907",
"LABEL_18908",
"LABEL_18909",
"LABEL_1891",
"LABEL_18910",
"LABEL_18911",
"LABEL_18912",
"LABEL_18913",
"LABEL_18914",
"LABEL_18915",
"LABEL_18916",
"LABEL_18917",
"LABEL_18918",
"LABEL_18919",
"LABEL_1892",
"LABEL_18920",
"LABEL_18921",
"LABEL_18922",
"LABEL_18923",
"LABEL_18924",
"LABEL_18925",
"LABEL_18926",
"LABEL_18927",
"LABEL_18928",
"LABEL_18929",
"LABEL_1893",
"LABEL_18930",
"LABEL_18931",
"LABEL_18932",
"LABEL_18933",
"LABEL_18934",
"LABEL_18935",
"LABEL_18936",
"LABEL_18937",
"LABEL_18938",
"LABEL_18939",
"LABEL_1894",
"LABEL_18940",
"LABEL_18941",
"LABEL_18942",
"LABEL_18943",
"LABEL_18944",
"LABEL_18945",
"LABEL_18946",
"LABEL_18947",
"LABEL_18948",
"LABEL_18949",
"LABEL_1895",
"LABEL_18950",
"LABEL_18951",
"LABEL_18952",
"LABEL_18953",
"LABEL_18954",
"LABEL_18955",
"LABEL_18956",
"LABEL_18957",
"LABEL_18958",
"LABEL_18959",
"LABEL_1896",
"LABEL_18960",
"LABEL_18961",
"LABEL_18962",
"LABEL_18963",
"LABEL_18964",
"LABEL_18965",
"LABEL_18966",
"LABEL_18967",
"LABEL_18968",
"LABEL_18969",
"LABEL_1897",
"LABEL_18970",
"LABEL_18971",
"LABEL_18972",
"LABEL_18973",
"LABEL_18974",
"LABEL_18975",
"LABEL_18976",
"LABEL_18977",
"LABEL_18978",
"LABEL_18979",
"LABEL_1898",
"LABEL_18980",
"LABEL_18981",
"LABEL_18982",
"LABEL_18983",
"LABEL_18984",
"LABEL_18985",
"LABEL_18986",
"LABEL_18987",
"LABEL_18988",
"LABEL_18989",
"LABEL_1899",
"LABEL_18990",
"LABEL_18991",
"LABEL_18992",
"LABEL_18993",
"LABEL_18994",
"LABEL_18995",
"LABEL_18996",
"LABEL_18997",
"LABEL_18998",
"LABEL_18999",
"LABEL_19",
"LABEL_190",
"LABEL_1900",
"LABEL_19000",
"LABEL_19001",
"LABEL_19002",
"LABEL_19003",
"LABEL_19004",
"LABEL_19005",
"LABEL_19006",
"LABEL_19007",
"LABEL_19008",
"LABEL_19009",
"LABEL_1901",
"LABEL_19010",
"LABEL_19011",
"LABEL_19012",
"LABEL_19013",
"LABEL_19014",
"LABEL_19015",
"LABEL_19016",
"LABEL_19017",
"LABEL_19018",
"LABEL_19019",
"LABEL_1902",
"LABEL_19020",
"LABEL_19021",
"LABEL_19022",
"LABEL_19023",
"LABEL_19024",
"LABEL_19025",
"LABEL_19026",
"LABEL_19027",
"LABEL_19028",
"LABEL_19029",
"LABEL_1903",
"LABEL_19030",
"LABEL_19031",
"LABEL_19032",
"LABEL_19033",
"LABEL_19034",
"LABEL_19035",
"LABEL_19036",
"LABEL_19037",
"LABEL_19038",
"LABEL_19039",
"LABEL_1904",
"LABEL_19040",
"LABEL_19041",
"LABEL_19042",
"LABEL_19043",
"LABEL_19044",
"LABEL_19045",
"LABEL_19046",
"LABEL_19047",
"LABEL_19048",
"LABEL_19049",
"LABEL_1905",
"LABEL_19050",
"LABEL_19051",
"LABEL_19052",
"LABEL_19053",
"LABEL_19054",
"LABEL_19055",
"LABEL_19056",
"LABEL_19057",
"LABEL_19058",
"LABEL_19059",
"LABEL_1906",
"LABEL_19060",
"LABEL_19061",
"LABEL_19062",
"LABEL_19063",
"LABEL_19064",
"LABEL_19065",
"LABEL_19066",
"LABEL_19067",
"LABEL_19068",
"LABEL_19069",
"LABEL_1907",
"LABEL_19070",
"LABEL_19071",
"LABEL_19072",
"LABEL_19073",
"LABEL_19074",
"LABEL_19075",
"LABEL_19076",
"LABEL_19077",
"LABEL_19078",
"LABEL_19079",
"LABEL_1908",
"LABEL_19080",
"LABEL_19081",
"LABEL_19082",
"LABEL_19083",
"LABEL_19084",
"LABEL_19085",
"LABEL_19086",
"LABEL_19087",
"LABEL_19088",
"LABEL_19089",
"LABEL_1909",
"LABEL_19090",
"LABEL_19091",
"LABEL_19092",
"LABEL_19093",
"LABEL_19094",
"LABEL_19095",
"LABEL_19096",
"LABEL_19097",
"LABEL_19098",
"LABEL_19099",
"LABEL_191",
"LABEL_1910",
"LABEL_19100",
"LABEL_19101",
"LABEL_19102",
"LABEL_19103",
"LABEL_19104",
"LABEL_19105",
"LABEL_19106",
"LABEL_19107",
"LABEL_19108",
"LABEL_19109",
"LABEL_1911",
"LABEL_19110",
"LABEL_19111",
"LABEL_19112",
"LABEL_19113",
"LABEL_19114",
"LABEL_19115",
"LABEL_19116",
"LABEL_19117",
"LABEL_19118",
"LABEL_19119",
"LABEL_1912",
"LABEL_19120",
"LABEL_19121",
"LABEL_19122",
"LABEL_19123",
"LABEL_19124",
"LABEL_19125",
"LABEL_19126",
"LABEL_19127",
"LABEL_19128",
"LABEL_19129",
"LABEL_1913",
"LABEL_19130",
"LABEL_19131",
"LABEL_19132",
"LABEL_19133",
"LABEL_19134",
"LABEL_19135",
"LABEL_19136",
"LABEL_19137",
"LABEL_19138",
"LABEL_19139",
"LABEL_1914",
"LABEL_19140",
"LABEL_19141",
"LABEL_19142",
"LABEL_19143",
"LABEL_19144",
"LABEL_19145",
"LABEL_19146",
"LABEL_19147",
"LABEL_19148",
"LABEL_19149",
"LABEL_1915",
"LABEL_19150",
"LABEL_19151",
"LABEL_19152",
"LABEL_19153",
"LABEL_19154",
"LABEL_19155",
"LABEL_19156",
"LABEL_19157",
"LABEL_19158",
"LABEL_19159",
"LABEL_1916",
"LABEL_19160",
"LABEL_19161",
"LABEL_19162",
"LABEL_19163",
"LABEL_19164",
"LABEL_19165",
"LABEL_19166",
"LABEL_19167",
"LABEL_19168",
"LABEL_19169",
"LABEL_1917",
"LABEL_19170",
"LABEL_19171",
"LABEL_19172",
"LABEL_19173",
"LABEL_19174",
"LABEL_19175",
"LABEL_19176",
"LABEL_19177",
"LABEL_19178",
"LABEL_19179",
"LABEL_1918",
"LABEL_19180",
"LABEL_19181",
"LABEL_19182",
"LABEL_19183",
"LABEL_19184",
"LABEL_19185",
"LABEL_19186",
"LABEL_19187",
"LABEL_19188",
"LABEL_19189",
"LABEL_1919",
"LABEL_19190",
"LABEL_19191",
"LABEL_19192",
"LABEL_19193",
"LABEL_19194",
"LABEL_19195",
"LABEL_19196",
"LABEL_19197",
"LABEL_19198",
"LABEL_19199",
"LABEL_192",
"LABEL_1920",
"LABEL_19200",
"LABEL_19201",
"LABEL_19202",
"LABEL_19203",
"LABEL_19204",
"LABEL_19205",
"LABEL_19206",
"LABEL_19207",
"LABEL_19208",
"LABEL_19209",
"LABEL_1921",
"LABEL_19210",
"LABEL_19211",
"LABEL_19212",
"LABEL_19213",
"LABEL_19214",
"LABEL_19215",
"LABEL_19216",
"LABEL_19217",
"LABEL_19218",
"LABEL_19219",
"LABEL_1922",
"LABEL_19220",
"LABEL_19221",
"LABEL_19222",
"LABEL_19223",
"LABEL_19224",
"LABEL_19225",
"LABEL_19226",
"LABEL_19227",
"LABEL_19228",
"LABEL_19229",
"LABEL_1923",
"LABEL_19230",
"LABEL_19231",
"LABEL_19232",
"LABEL_19233",
"LABEL_19234",
"LABEL_19235",
"LABEL_19236",
"LABEL_19237",
"LABEL_19238",
"LABEL_19239",
"LABEL_1924",
"LABEL_19240",
"LABEL_19241",
"LABEL_19242",
"LABEL_19243",
"LABEL_19244",
"LABEL_19245",
"LABEL_19246",
"LABEL_19247",
"LABEL_19248",
"LABEL_19249",
"LABEL_1925",
"LABEL_19250",
"LABEL_19251",
"LABEL_19252",
"LABEL_19253",
"LABEL_19254",
"LABEL_19255",
"LABEL_19256",
"LABEL_19257",
"LABEL_19258",
"LABEL_19259",
"LABEL_1926",
"LABEL_19260",
"LABEL_19261",
"LABEL_19262",
"LABEL_19263",
"LABEL_19264",
"LABEL_19265",
"LABEL_19266",
"LABEL_19267",
"LABEL_19268",
"LABEL_19269",
"LABEL_1927",
"LABEL_19270",
"LABEL_19271",
"LABEL_19272",
"LABEL_19273",
"LABEL_19274",
"LABEL_19275",
"LABEL_19276",
"LABEL_19277",
"LABEL_19278",
"LABEL_19279",
"LABEL_1928",
"LABEL_19280",
"LABEL_19281",
"LABEL_19282",
"LABEL_19283",
"LABEL_19284",
"LABEL_19285",
"LABEL_19286",
"LABEL_19287",
"LABEL_19288",
"LABEL_19289",
"LABEL_1929",
"LABEL_19290",
"LABEL_19291",
"LABEL_19292",
"LABEL_19293",
"LABEL_19294",
"LABEL_19295",
"LABEL_19296",
"LABEL_19297",
"LABEL_19298",
"LABEL_19299",
"LABEL_193",
"LABEL_1930",
"LABEL_19300",
"LABEL_19301",
"LABEL_19302",
"LABEL_19303",
"LABEL_19304",
"LABEL_19305",
"LABEL_19306",
"LABEL_19307",
"LABEL_19308",
"LABEL_19309",
"LABEL_1931",
"LABEL_19310",
"LABEL_19311",
"LABEL_19312",
"LABEL_19313",
"LABEL_19314",
"LABEL_19315",
"LABEL_19316",
"LABEL_19317",
"LABEL_19318",
"LABEL_19319",
"LABEL_1932",
"LABEL_19320",
"LABEL_19321",
"LABEL_19322",
"LABEL_19323",
"LABEL_19324",
"LABEL_19325",
"LABEL_19326",
"LABEL_19327",
"LABEL_19328",
"LABEL_19329",
"LABEL_1933",
"LABEL_19330",
"LABEL_19331",
"LABEL_19332",
"LABEL_19333",
"LABEL_19334",
"LABEL_19335",
"LABEL_19336",
"LABEL_19337",
"LABEL_19338",
"LABEL_19339",
"LABEL_1934",
"LABEL_19340",
"LABEL_19341",
"LABEL_19342",
"LABEL_19343",
"LABEL_19344",
"LABEL_19345",
"LABEL_19346",
"LABEL_19347",
"LABEL_19348",
"LABEL_19349",
"LABEL_1935",
"LABEL_19350",
"LABEL_19351",
"LABEL_19352",
"LABEL_19353",
"LABEL_19354",
"LABEL_19355",
"LABEL_19356",
"LABEL_19357",
"LABEL_19358",
"LABEL_19359",
"LABEL_1936",
"LABEL_19360",
"LABEL_19361",
"LABEL_19362",
"LABEL_19363",
"LABEL_19364",
"LABEL_19365",
"LABEL_19366",
"LABEL_19367",
"LABEL_19368",
"LABEL_19369",
"LABEL_1937",
"LABEL_19370",
"LABEL_19371",
"LABEL_19372",
"LABEL_19373",
"LABEL_19374",
"LABEL_19375",
"LABEL_19376",
"LABEL_19377",
"LABEL_19378",
"LABEL_19379",
"LABEL_1938",
"LABEL_19380",
"LABEL_19381",
"LABEL_19382",
"LABEL_19383",
"LABEL_19384",
"LABEL_19385",
"LABEL_19386",
"LABEL_19387",
"LABEL_19388",
"LABEL_19389",
"LABEL_1939",
"LABEL_19390",
"LABEL_19391",
"LABEL_19392",
"LABEL_19393",
"LABEL_19394",
"LABEL_19395",
"LABEL_19396",
"LABEL_19397",
"LABEL_19398",
"LABEL_19399",
"LABEL_194",
"LABEL_1940",
"LABEL_19400",
"LABEL_19401",
"LABEL_19402",
"LABEL_19403",
"LABEL_19404",
"LABEL_19405",
"LABEL_19406",
"LABEL_19407",
"LABEL_19408",
"LABEL_19409",
"LABEL_1941",
"LABEL_19410",
"LABEL_19411",
"LABEL_19412",
"LABEL_19413",
"LABEL_19414",
"LABEL_19415",
"LABEL_19416",
"LABEL_19417",
"LABEL_19418",
"LABEL_19419",
"LABEL_1942",
"LABEL_19420",
"LABEL_19421",
"LABEL_19422",
"LABEL_19423",
"LABEL_19424",
"LABEL_19425",
"LABEL_19426",
"LABEL_19427",
"LABEL_19428",
"LABEL_19429",
"LABEL_1943",
"LABEL_19430",
"LABEL_19431",
"LABEL_19432",
"LABEL_19433",
"LABEL_19434",
"LABEL_19435",
"LABEL_19436",
"LABEL_19437",
"LABEL_19438",
"LABEL_19439",
"LABEL_1944",
"LABEL_19440",
"LABEL_19441",
"LABEL_19442",
"LABEL_19443",
"LABEL_19444",
"LABEL_19445",
"LABEL_19446",
"LABEL_19447",
"LABEL_19448",
"LABEL_19449",
"LABEL_1945",
"LABEL_19450",
"LABEL_19451",
"LABEL_19452",
"LABEL_19453",
"LABEL_19454",
"LABEL_19455",
"LABEL_19456",
"LABEL_19457",
"LABEL_19458",
"LABEL_19459",
"LABEL_1946",
"LABEL_19460",
"LABEL_19461",
"LABEL_19462",
"LABEL_19463",
"LABEL_19464",
"LABEL_19465",
"LABEL_19466",
"LABEL_19467",
"LABEL_19468",
"LABEL_19469",
"LABEL_1947",
"LABEL_19470",
"LABEL_19471",
"LABEL_19472",
"LABEL_19473",
"LABEL_19474",
"LABEL_19475",
"LABEL_19476",
"LABEL_19477",
"LABEL_19478",
"LABEL_19479",
"LABEL_1948",
"LABEL_19480",
"LABEL_19481",
"LABEL_19482",
"LABEL_19483",
"LABEL_19484",
"LABEL_19485",
"LABEL_19486",
"LABEL_19487",
"LABEL_19488",
"LABEL_19489",
"LABEL_1949",
"LABEL_19490",
"LABEL_19491",
"LABEL_19492",
"LABEL_19493",
"LABEL_19494",
"LABEL_19495",
"LABEL_19496",
"LABEL_19497",
"LABEL_19498",
"LABEL_19499",
"LABEL_195",
"LABEL_1950",
"LABEL_19500",
"LABEL_19501",
"LABEL_19502",
"LABEL_19503",
"LABEL_19504",
"LABEL_19505",
"LABEL_19506",
"LABEL_19507",
"LABEL_19508",
"LABEL_19509",
"LABEL_1951",
"LABEL_19510",
"LABEL_19511",
"LABEL_19512",
"LABEL_19513",
"LABEL_19514",
"LABEL_19515",
"LABEL_19516",
"LABEL_19517",
"LABEL_19518",
"LABEL_19519",
"LABEL_1952",
"LABEL_19520",
"LABEL_19521",
"LABEL_19522",
"LABEL_19523",
"LABEL_19524",
"LABEL_19525",
"LABEL_19526",
"LABEL_19527",
"LABEL_19528",
"LABEL_19529",
"LABEL_1953",
"LABEL_19530",
"LABEL_19531",
"LABEL_19532",
"LABEL_19533",
"LABEL_19534",
"LABEL_19535",
"LABEL_19536",
"LABEL_19537",
"LABEL_19538",
"LABEL_19539",
"LABEL_1954",
"LABEL_19540",
"LABEL_19541",
"LABEL_19542",
"LABEL_19543",
"LABEL_19544",
"LABEL_19545",
"LABEL_19546",
"LABEL_19547",
"LABEL_19548",
"LABEL_19549",
"LABEL_1955",
"LABEL_19550",
"LABEL_19551",
"LABEL_19552",
"LABEL_19553",
"LABEL_19554",
"LABEL_19555",
"LABEL_19556",
"LABEL_19557",
"LABEL_19558",
"LABEL_19559",
"LABEL_1956",
"LABEL_19560",
"LABEL_19561",
"LABEL_19562",
"LABEL_19563",
"LABEL_19564",
"LABEL_19565",
"LABEL_19566",
"LABEL_19567",
"LABEL_19568",
"LABEL_19569",
"LABEL_1957",
"LABEL_19570",
"LABEL_19571",
"LABEL_19572",
"LABEL_19573",
"LABEL_19574",
"LABEL_19575",
"LABEL_19576",
"LABEL_19577",
"LABEL_19578",
"LABEL_19579",
"LABEL_1958",
"LABEL_19580",
"LABEL_19581",
"LABEL_19582",
"LABEL_19583",
"LABEL_19584",
"LABEL_19585",
"LABEL_19586",
"LABEL_19587",
"LABEL_19588",
"LABEL_19589",
"LABEL_1959",
"LABEL_19590",
"LABEL_19591",
"LABEL_19592",
"LABEL_19593",
"LABEL_19594",
"LABEL_19595",
"LABEL_19596",
"LABEL_19597",
"LABEL_19598",
"LABEL_19599",
"LABEL_196",
"LABEL_1960",
"LABEL_19600",
"LABEL_19601",
"LABEL_19602",
"LABEL_19603",
"LABEL_19604",
"LABEL_19605",
"LABEL_19606",
"LABEL_19607",
"LABEL_19608",
"LABEL_19609",
"LABEL_1961",
"LABEL_19610",
"LABEL_19611",
"LABEL_19612",
"LABEL_19613",
"LABEL_19614",
"LABEL_19615",
"LABEL_19616",
"LABEL_19617",
"LABEL_19618",
"LABEL_19619",
"LABEL_1962",
"LABEL_19620",
"LABEL_19621",
"LABEL_19622",
"LABEL_19623",
"LABEL_19624",
"LABEL_19625",
"LABEL_19626",
"LABEL_19627",
"LABEL_19628",
"LABEL_19629",
"LABEL_1963",
"LABEL_19630",
"LABEL_19631",
"LABEL_19632",
"LABEL_19633",
"LABEL_19634",
"LABEL_19635",
"LABEL_19636",
"LABEL_19637",
"LABEL_19638",
"LABEL_19639",
"LABEL_1964",
"LABEL_19640",
"LABEL_19641",
"LABEL_19642",
"LABEL_19643",
"LABEL_19644",
"LABEL_19645",
"LABEL_19646",
"LABEL_19647",
"LABEL_19648",
"LABEL_19649",
"LABEL_1965",
"LABEL_19650",
"LABEL_19651",
"LABEL_19652",
"LABEL_19653",
"LABEL_19654",
"LABEL_19655",
"LABEL_19656",
"LABEL_19657",
"LABEL_19658",
"LABEL_19659",
"LABEL_1966",
"LABEL_19660",
"LABEL_19661",
"LABEL_19662",
"LABEL_19663",
"LABEL_19664",
"LABEL_19665",
"LABEL_19666",
"LABEL_19667",
"LABEL_19668",
"LABEL_19669",
"LABEL_1967",
"LABEL_19670",
"LABEL_19671",
"LABEL_19672",
"LABEL_19673",
"LABEL_19674",
"LABEL_19675",
"LABEL_19676",
"LABEL_19677",
"LABEL_19678",
"LABEL_19679",
"LABEL_1968",
"LABEL_19680",
"LABEL_19681",
"LABEL_19682",
"LABEL_19683",
"LABEL_19684",
"LABEL_19685",
"LABEL_19686",
"LABEL_19687",
"LABEL_19688",
"LABEL_19689",
"LABEL_1969",
"LABEL_19690",
"LABEL_19691",
"LABEL_19692",
"LABEL_19693",
"LABEL_19694",
"LABEL_19695",
"LABEL_19696",
"LABEL_19697",
"LABEL_19698",
"LABEL_19699",
"LABEL_197",
"LABEL_1970",
"LABEL_19700",
"LABEL_19701",
"LABEL_19702",
"LABEL_19703",
"LABEL_19704",
"LABEL_19705",
"LABEL_19706",
"LABEL_19707",
"LABEL_19708",
"LABEL_19709",
"LABEL_1971",
"LABEL_19710",
"LABEL_19711",
"LABEL_19712",
"LABEL_19713",
"LABEL_19714",
"LABEL_19715",
"LABEL_19716",
"LABEL_19717",
"LABEL_19718",
"LABEL_19719",
"LABEL_1972",
"LABEL_19720",
"LABEL_19721",
"LABEL_19722",
"LABEL_19723",
"LABEL_19724",
"LABEL_19725",
"LABEL_19726",
"LABEL_19727",
"LABEL_19728",
"LABEL_19729",
"LABEL_1973",
"LABEL_19730",
"LABEL_19731",
"LABEL_19732",
"LABEL_19733",
"LABEL_19734",
"LABEL_19735",
"LABEL_19736",
"LABEL_19737",
"LABEL_19738",
"LABEL_19739",
"LABEL_1974",
"LABEL_19740",
"LABEL_19741",
"LABEL_19742",
"LABEL_19743",
"LABEL_19744",
"LABEL_19745",
"LABEL_19746",
"LABEL_19747",
"LABEL_19748",
"LABEL_19749",
"LABEL_1975",
"LABEL_19750",
"LABEL_19751",
"LABEL_19752",
"LABEL_19753",
"LABEL_19754",
"LABEL_19755",
"LABEL_19756",
"LABEL_19757",
"LABEL_19758",
"LABEL_19759",
"LABEL_1976",
"LABEL_19760",
"LABEL_19761",
"LABEL_19762",
"LABEL_19763",
"LABEL_19764",
"LABEL_19765",
"LABEL_19766",
"LABEL_19767",
"LABEL_19768",
"LABEL_19769",
"LABEL_1977",
"LABEL_19770",
"LABEL_19771",
"LABEL_19772",
"LABEL_19773",
"LABEL_19774",
"LABEL_19775",
"LABEL_19776",
"LABEL_19777",
"LABEL_19778",
"LABEL_19779",
"LABEL_1978",
"LABEL_19780",
"LABEL_19781",
"LABEL_19782",
"LABEL_19783",
"LABEL_19784",
"LABEL_19785",
"LABEL_19786",
"LABEL_19787",
"LABEL_19788",
"LABEL_19789",
"LABEL_1979",
"LABEL_19790",
"LABEL_19791",
"LABEL_19792",
"LABEL_19793",
"LABEL_19794",
"LABEL_19795",
"LABEL_19796",
"LABEL_19797",
"LABEL_19798",
"LABEL_19799",
"LABEL_198",
"LABEL_1980",
"LABEL_19800",
"LABEL_19801",
"LABEL_19802",
"LABEL_19803",
"LABEL_19804",
"LABEL_19805",
"LABEL_19806",
"LABEL_19807",
"LABEL_19808",
"LABEL_19809",
"LABEL_1981",
"LABEL_19810",
"LABEL_19811",
"LABEL_19812",
"LABEL_19813",
"LABEL_19814",
"LABEL_19815",
"LABEL_19816",
"LABEL_19817",
"LABEL_19818",
"LABEL_19819",
"LABEL_1982",
"LABEL_19820",
"LABEL_19821",
"LABEL_19822",
"LABEL_19823",
"LABEL_19824",
"LABEL_19825",
"LABEL_19826",
"LABEL_19827",
"LABEL_19828",
"LABEL_19829",
"LABEL_1983",
"LABEL_19830",
"LABEL_19831",
"LABEL_19832",
"LABEL_19833",
"LABEL_19834",
"LABEL_19835",
"LABEL_19836",
"LABEL_19837",
"LABEL_19838",
"LABEL_19839",
"LABEL_1984",
"LABEL_19840",
"LABEL_19841",
"LABEL_19842",
"LABEL_19843",
"LABEL_19844",
"LABEL_19845",
"LABEL_19846",
"LABEL_19847",
"LABEL_19848",
"LABEL_19849",
"LABEL_1985",
"LABEL_19850",
"LABEL_19851",
"LABEL_19852",
"LABEL_19853",
"LABEL_19854",
"LABEL_19855",
"LABEL_19856",
"LABEL_19857",
"LABEL_19858",
"LABEL_19859",
"LABEL_1986",
"LABEL_19860",
"LABEL_19861",
"LABEL_19862",
"LABEL_19863",
"LABEL_19864",
"LABEL_19865",
"LABEL_19866",
"LABEL_19867",
"LABEL_19868",
"LABEL_19869",
"LABEL_1987",
"LABEL_19870",
"LABEL_19871",
"LABEL_19872",
"LABEL_19873",
"LABEL_19874",
"LABEL_19875",
"LABEL_19876",
"LABEL_19877",
"LABEL_19878",
"LABEL_19879",
"LABEL_1988",
"LABEL_19880",
"LABEL_19881",
"LABEL_19882",
"LABEL_19883",
"LABEL_19884",
"LABEL_19885",
"LABEL_19886",
"LABEL_19887",
"LABEL_19888",
"LABEL_19889",
"LABEL_1989",
"LABEL_19890",
"LABEL_19891",
"LABEL_19892",
"LABEL_19893",
"LABEL_19894",
"LABEL_19895",
"LABEL_19896",
"LABEL_19897",
"LABEL_19898",
"LABEL_19899",
"LABEL_199",
"LABEL_1990",
"LABEL_19900",
"LABEL_19901",
"LABEL_19902",
"LABEL_19903",
"LABEL_19904",
"LABEL_19905",
"LABEL_19906",
"LABEL_19907",
"LABEL_19908",
"LABEL_19909",
"LABEL_1991",
"LABEL_19910",
"LABEL_19911",
"LABEL_19912",
"LABEL_19913",
"LABEL_19914",
"LABEL_19915",
"LABEL_19916",
"LABEL_19917",
"LABEL_19918",
"LABEL_19919",
"LABEL_1992",
"LABEL_19920",
"LABEL_19921",
"LABEL_19922",
"LABEL_19923",
"LABEL_19924",
"LABEL_19925",
"LABEL_19926",
"LABEL_19927",
"LABEL_19928",
"LABEL_19929",
"LABEL_1993",
"LABEL_19930",
"LABEL_19931",
"LABEL_19932",
"LABEL_19933",
"LABEL_19934",
"LABEL_19935",
"LABEL_19936",
"LABEL_19937",
"LABEL_19938",
"LABEL_19939",
"LABEL_1994",
"LABEL_19940",
"LABEL_19941",
"LABEL_19942",
"LABEL_19943",
"LABEL_19944",
"LABEL_19945",
"LABEL_19946",
"LABEL_19947",
"LABEL_19948",
"LABEL_19949",
"LABEL_1995",
"LABEL_19950",
"LABEL_19951",
"LABEL_19952",
"LABEL_19953",
"LABEL_19954",
"LABEL_19955",
"LABEL_19956",
"LABEL_19957",
"LABEL_19958",
"LABEL_19959",
"LABEL_1996",
"LABEL_19960",
"LABEL_19961",
"LABEL_19962",
"LABEL_19963",
"LABEL_19964",
"LABEL_19965",
"LABEL_19966",
"LABEL_19967",
"LABEL_19968",
"LABEL_19969",
"LABEL_1997",
"LABEL_19970",
"LABEL_19971",
"LABEL_19972",
"LABEL_19973",
"LABEL_19974",
"LABEL_19975",
"LABEL_19976",
"LABEL_19977",
"LABEL_19978",
"LABEL_19979",
"LABEL_1998",
"LABEL_19980",
"LABEL_19981",
"LABEL_19982",
"LABEL_19983",
"LABEL_19984",
"LABEL_19985",
"LABEL_19986",
"LABEL_19987",
"LABEL_19988",
"LABEL_19989",
"LABEL_1999",
"LABEL_19990",
"LABEL_19991",
"LABEL_19992",
"LABEL_19993",
"LABEL_19994",
"LABEL_19995",
"LABEL_19996",
"LABEL_19997",
"LABEL_19998",
"LABEL_19999",
"LABEL_2",
"LABEL_20",
"LABEL_200",
"LABEL_2000",
"LABEL_20000",
"LABEL_20001",
"LABEL_20002",
"LABEL_20003",
"LABEL_20004",
"LABEL_20005",
"LABEL_20006",
"LABEL_20007",
"LABEL_20008",
"LABEL_20009",
"LABEL_2001",
"LABEL_20010",
"LABEL_20011",
"LABEL_20012",
"LABEL_20013",
"LABEL_20014",
"LABEL_20015",
"LABEL_20016",
"LABEL_20017",
"LABEL_20018",
"LABEL_20019",
"LABEL_2002",
"LABEL_20020",
"LABEL_20021",
"LABEL_20022",
"LABEL_20023",
"LABEL_20024",
"LABEL_20025",
"LABEL_20026",
"LABEL_20027",
"LABEL_20028",
"LABEL_20029",
"LABEL_2003",
"LABEL_20030",
"LABEL_20031",
"LABEL_20032",
"LABEL_20033",
"LABEL_20034",
"LABEL_20035",
"LABEL_20036",
"LABEL_20037",
"LABEL_20038",
"LABEL_20039",
"LABEL_2004",
"LABEL_20040",
"LABEL_20041",
"LABEL_20042",
"LABEL_20043",
"LABEL_20044",
"LABEL_20045",
"LABEL_20046",
"LABEL_20047",
"LABEL_20048",
"LABEL_20049",
"LABEL_2005",
"LABEL_20050",
"LABEL_20051",
"LABEL_20052",
"LABEL_20053",
"LABEL_20054",
"LABEL_20055",
"LABEL_20056",
"LABEL_20057",
"LABEL_20058",
"LABEL_20059",
"LABEL_2006",
"LABEL_20060",
"LABEL_20061",
"LABEL_20062",
"LABEL_20063",
"LABEL_20064",
"LABEL_20065",
"LABEL_20066",
"LABEL_20067",
"LABEL_20068",
"LABEL_20069",
"LABEL_2007",
"LABEL_20070",
"LABEL_20071",
"LABEL_20072",
"LABEL_20073",
"LABEL_20074",
"LABEL_20075",
"LABEL_20076",
"LABEL_20077",
"LABEL_20078",
"LABEL_20079",
"LABEL_2008",
"LABEL_20080",
"LABEL_20081",
"LABEL_20082",
"LABEL_20083",
"LABEL_20084",
"LABEL_20085",
"LABEL_20086",
"LABEL_20087",
"LABEL_20088",
"LABEL_20089",
"LABEL_2009",
"LABEL_20090",
"LABEL_20091",
"LABEL_20092",
"LABEL_20093",
"LABEL_20094",
"LABEL_20095",
"LABEL_20096",
"LABEL_20097",
"LABEL_20098",
"LABEL_20099",
"LABEL_201",
"LABEL_2010",
"LABEL_20100",
"LABEL_20101",
"LABEL_20102",
"LABEL_20103",
"LABEL_20104",
"LABEL_20105",
"LABEL_20106",
"LABEL_20107",
"LABEL_20108",
"LABEL_20109",
"LABEL_2011",
"LABEL_20110",
"LABEL_20111",
"LABEL_20112",
"LABEL_20113",
"LABEL_20114",
"LABEL_20115",
"LABEL_20116",
"LABEL_20117",
"LABEL_20118",
"LABEL_20119",
"LABEL_2012",
"LABEL_20120",
"LABEL_20121",
"LABEL_20122",
"LABEL_20123",
"LABEL_20124",
"LABEL_20125",
"LABEL_20126",
"LABEL_20127",
"LABEL_20128",
"LABEL_20129",
"LABEL_2013",
"LABEL_20130",
"LABEL_20131",
"LABEL_20132",
"LABEL_20133",
"LABEL_20134",
"LABEL_20135",
"LABEL_20136",
"LABEL_20137",
"LABEL_20138",
"LABEL_20139",
"LABEL_2014",
"LABEL_20140",
"LABEL_20141",
"LABEL_20142",
"LABEL_20143",
"LABEL_20144",
"LABEL_20145",
"LABEL_20146",
"LABEL_20147",
"LABEL_20148",
"LABEL_20149",
"LABEL_2015",
"LABEL_20150",
"LABEL_20151",
"LABEL_20152",
"LABEL_20153",
"LABEL_20154",
"LABEL_20155",
"LABEL_20156",
"LABEL_20157",
"LABEL_20158",
"LABEL_20159",
"LABEL_2016",
"LABEL_20160",
"LABEL_20161",
"LABEL_20162",
"LABEL_20163",
"LABEL_20164",
"LABEL_20165",
"LABEL_20166",
"LABEL_20167",
"LABEL_20168",
"LABEL_20169",
"LABEL_2017",
"LABEL_20170",
"LABEL_20171",
"LABEL_20172",
"LABEL_20173",
"LABEL_20174",
"LABEL_20175",
"LABEL_20176",
"LABEL_20177",
"LABEL_20178",
"LABEL_20179",
"LABEL_2018",
"LABEL_20180",
"LABEL_20181",
"LABEL_20182",
"LABEL_20183",
"LABEL_20184",
"LABEL_20185",
"LABEL_20186",
"LABEL_20187",
"LABEL_20188",
"LABEL_20189",
"LABEL_2019",
"LABEL_20190",
"LABEL_20191",
"LABEL_20192",
"LABEL_20193",
"LABEL_20194",
"LABEL_20195",
"LABEL_20196",
"LABEL_20197",
"LABEL_20198",
"LABEL_20199",
"LABEL_202",
"LABEL_2020",
"LABEL_20200",
"LABEL_20201",
"LABEL_20202",
"LABEL_20203",
"LABEL_20204",
"LABEL_20205",
"LABEL_20206",
"LABEL_20207",
"LABEL_20208",
"LABEL_20209",
"LABEL_2021",
"LABEL_20210",
"LABEL_20211",
"LABEL_20212",
"LABEL_20213",
"LABEL_20214",
"LABEL_20215",
"LABEL_20216",
"LABEL_20217",
"LABEL_20218",
"LABEL_20219",
"LABEL_2022",
"LABEL_20220",
"LABEL_20221",
"LABEL_20222",
"LABEL_20223",
"LABEL_20224",
"LABEL_20225",
"LABEL_20226",
"LABEL_20227",
"LABEL_20228",
"LABEL_20229",
"LABEL_2023",
"LABEL_20230",
"LABEL_20231",
"LABEL_20232",
"LABEL_20233",
"LABEL_20234",
"LABEL_20235",
"LABEL_20236",
"LABEL_20237",
"LABEL_20238",
"LABEL_20239",
"LABEL_2024",
"LABEL_20240",
"LABEL_20241",
"LABEL_20242",
"LABEL_20243",
"LABEL_20244",
"LABEL_20245",
"LABEL_20246",
"LABEL_20247",
"LABEL_20248",
"LABEL_20249",
"LABEL_2025",
"LABEL_20250",
"LABEL_20251",
"LABEL_20252",
"LABEL_20253",
"LABEL_20254",
"LABEL_20255",
"LABEL_20256",
"LABEL_20257",
"LABEL_20258",
"LABEL_20259",
"LABEL_2026",
"LABEL_20260",
"LABEL_20261",
"LABEL_20262",
"LABEL_20263",
"LABEL_20264",
"LABEL_20265",
"LABEL_20266",
"LABEL_20267",
"LABEL_20268",
"LABEL_20269",
"LABEL_2027",
"LABEL_20270",
"LABEL_20271",
"LABEL_20272",
"LABEL_20273",
"LABEL_20274",
"LABEL_20275",
"LABEL_20276",
"LABEL_20277",
"LABEL_20278",
"LABEL_20279",
"LABEL_2028",
"LABEL_20280",
"LABEL_20281",
"LABEL_20282",
"LABEL_20283",
"LABEL_20284",
"LABEL_20285",
"LABEL_20286",
"LABEL_20287",
"LABEL_20288",
"LABEL_20289",
"LABEL_2029",
"LABEL_20290",
"LABEL_20291",
"LABEL_20292",
"LABEL_20293",
"LABEL_20294",
"LABEL_20295",
"LABEL_20296",
"LABEL_20297",
"LABEL_20298",
"LABEL_20299",
"LABEL_203",
"LABEL_2030",
"LABEL_20300",
"LABEL_20301",
"LABEL_20302",
"LABEL_20303",
"LABEL_20304",
"LABEL_20305",
"LABEL_20306",
"LABEL_20307",
"LABEL_20308",
"LABEL_20309",
"LABEL_2031",
"LABEL_20310",
"LABEL_20311",
"LABEL_20312",
"LABEL_20313",
"LABEL_20314",
"LABEL_20315",
"LABEL_20316",
"LABEL_20317",
"LABEL_20318",
"LABEL_20319",
"LABEL_2032",
"LABEL_20320",
"LABEL_20321",
"LABEL_20322",
"LABEL_20323",
"LABEL_20324",
"LABEL_20325",
"LABEL_20326",
"LABEL_20327",
"LABEL_20328",
"LABEL_20329",
"LABEL_2033",
"LABEL_20330",
"LABEL_20331",
"LABEL_20332",
"LABEL_20333",
"LABEL_20334",
"LABEL_20335",
"LABEL_20336",
"LABEL_20337",
"LABEL_20338",
"LABEL_20339",
"LABEL_2034",
"LABEL_20340",
"LABEL_20341",
"LABEL_20342",
"LABEL_20343",
"LABEL_20344",
"LABEL_20345",
"LABEL_20346",
"LABEL_20347",
"LABEL_20348",
"LABEL_20349",
"LABEL_2035",
"LABEL_20350",
"LABEL_20351",
"LABEL_20352",
"LABEL_20353",
"LABEL_20354",
"LABEL_20355",
"LABEL_20356",
"LABEL_20357",
"LABEL_20358",
"LABEL_20359",
"LABEL_2036",
"LABEL_20360",
"LABEL_20361",
"LABEL_20362",
"LABEL_20363",
"LABEL_20364",
"LABEL_20365",
"LABEL_20366",
"LABEL_20367",
"LABEL_20368",
"LABEL_20369",
"LABEL_2037",
"LABEL_20370",
"LABEL_20371",
"LABEL_20372",
"LABEL_20373",
"LABEL_20374",
"LABEL_20375",
"LABEL_20376",
"LABEL_20377",
"LABEL_20378",
"LABEL_20379",
"LABEL_2038",
"LABEL_20380",
"LABEL_20381",
"LABEL_20382",
"LABEL_20383",
"LABEL_20384",
"LABEL_20385",
"LABEL_20386",
"LABEL_20387",
"LABEL_20388",
"LABEL_20389",
"LABEL_2039",
"LABEL_20390",
"LABEL_20391",
"LABEL_20392",
"LABEL_20393",
"LABEL_20394",
"LABEL_20395",
"LABEL_20396",
"LABEL_20397",
"LABEL_20398",
"LABEL_20399",
"LABEL_204",
"LABEL_2040",
"LABEL_20400",
"LABEL_20401",
"LABEL_20402",
"LABEL_20403",
"LABEL_20404",
"LABEL_20405",
"LABEL_20406",
"LABEL_20407",
"LABEL_20408",
"LABEL_20409",
"LABEL_2041",
"LABEL_20410",
"LABEL_20411",
"LABEL_20412",
"LABEL_20413",
"LABEL_20414",
"LABEL_20415",
"LABEL_20416",
"LABEL_20417",
"LABEL_20418",
"LABEL_20419",
"LABEL_2042",
"LABEL_20420",
"LABEL_20421",
"LABEL_20422",
"LABEL_20423",
"LABEL_20424",
"LABEL_20425",
"LABEL_20426",
"LABEL_20427",
"LABEL_20428",
"LABEL_20429",
"LABEL_2043",
"LABEL_20430",
"LABEL_20431",
"LABEL_20432",
"LABEL_20433",
"LABEL_20434",
"LABEL_20435",
"LABEL_20436",
"LABEL_20437",
"LABEL_20438",
"LABEL_20439",
"LABEL_2044",
"LABEL_20440",
"LABEL_20441",
"LABEL_20442",
"LABEL_20443",
"LABEL_20444",
"LABEL_20445",
"LABEL_20446",
"LABEL_20447",
"LABEL_20448",
"LABEL_20449",
"LABEL_2045",
"LABEL_20450",
"LABEL_20451",
"LABEL_20452",
"LABEL_20453",
"LABEL_20454",
"LABEL_20455",
"LABEL_20456",
"LABEL_20457",
"LABEL_20458",
"LABEL_20459",
"LABEL_2046",
"LABEL_20460",
"LABEL_20461",
"LABEL_20462",
"LABEL_20463",
"LABEL_20464",
"LABEL_20465",
"LABEL_20466",
"LABEL_20467",
"LABEL_20468",
"LABEL_20469",
"LABEL_2047",
"LABEL_20470",
"LABEL_20471",
"LABEL_20472",
"LABEL_20473",
"LABEL_20474",
"LABEL_20475",
"LABEL_20476",
"LABEL_20477",
"LABEL_20478",
"LABEL_20479",
"LABEL_2048",
"LABEL_20480",
"LABEL_20481",
"LABEL_20482",
"LABEL_20483",
"LABEL_20484",
"LABEL_20485",
"LABEL_20486",
"LABEL_20487",
"LABEL_20488",
"LABEL_20489",
"LABEL_2049",
"LABEL_20490",
"LABEL_20491",
"LABEL_20492",
"LABEL_20493",
"LABEL_20494",
"LABEL_20495",
"LABEL_20496",
"LABEL_20497",
"LABEL_20498",
"LABEL_20499",
"LABEL_205",
"LABEL_2050",
"LABEL_20500",
"LABEL_20501",
"LABEL_20502",
"LABEL_20503",
"LABEL_20504",
"LABEL_20505",
"LABEL_20506",
"LABEL_20507",
"LABEL_20508",
"LABEL_20509",
"LABEL_2051",
"LABEL_20510",
"LABEL_20511",
"LABEL_20512",
"LABEL_20513",
"LABEL_20514",
"LABEL_20515",
"LABEL_20516",
"LABEL_20517",
"LABEL_20518",
"LABEL_20519",
"LABEL_2052",
"LABEL_20520",
"LABEL_20521",
"LABEL_20522",
"LABEL_20523",
"LABEL_20524",
"LABEL_20525",
"LABEL_20526",
"LABEL_20527",
"LABEL_20528",
"LABEL_20529",
"LABEL_2053",
"LABEL_20530",
"LABEL_20531",
"LABEL_20532",
"LABEL_20533",
"LABEL_20534",
"LABEL_20535",
"LABEL_20536",
"LABEL_20537",
"LABEL_20538",
"LABEL_20539",
"LABEL_2054",
"LABEL_20540",
"LABEL_20541",
"LABEL_20542",
"LABEL_20543",
"LABEL_20544",
"LABEL_20545",
"LABEL_20546",
"LABEL_20547",
"LABEL_20548",
"LABEL_20549",
"LABEL_2055",
"LABEL_20550",
"LABEL_20551",
"LABEL_20552",
"LABEL_20553",
"LABEL_20554",
"LABEL_20555",
"LABEL_20556",
"LABEL_20557",
"LABEL_20558",
"LABEL_20559",
"LABEL_2056",
"LABEL_20560",
"LABEL_20561",
"LABEL_20562",
"LABEL_20563",
"LABEL_20564",
"LABEL_20565",
"LABEL_20566",
"LABEL_20567",
"LABEL_20568",
"LABEL_20569",
"LABEL_2057",
"LABEL_20570",
"LABEL_20571",
"LABEL_20572",
"LABEL_20573",
"LABEL_20574",
"LABEL_20575",
"LABEL_20576",
"LABEL_20577",
"LABEL_20578",
"LABEL_20579",
"LABEL_2058",
"LABEL_20580",
"LABEL_20581",
"LABEL_20582",
"LABEL_20583",
"LABEL_20584",
"LABEL_20585",
"LABEL_20586",
"LABEL_20587",
"LABEL_20588",
"LABEL_20589",
"LABEL_2059",
"LABEL_20590",
"LABEL_20591",
"LABEL_20592",
"LABEL_20593",
"LABEL_20594",
"LABEL_20595",
"LABEL_20596",
"LABEL_20597",
"LABEL_20598",
"LABEL_20599",
"LABEL_206",
"LABEL_2060",
"LABEL_20600",
"LABEL_20601",
"LABEL_20602",
"LABEL_20603",
"LABEL_20604",
"LABEL_20605",
"LABEL_20606",
"LABEL_20607",
"LABEL_20608",
"LABEL_20609",
"LABEL_2061",
"LABEL_20610",
"LABEL_20611",
"LABEL_20612",
"LABEL_20613",
"LABEL_20614",
"LABEL_20615",
"LABEL_20616",
"LABEL_20617",
"LABEL_20618",
"LABEL_20619",
"LABEL_2062",
"LABEL_20620",
"LABEL_20621",
"LABEL_20622",
"LABEL_20623",
"LABEL_20624",
"LABEL_20625",
"LABEL_20626",
"LABEL_20627",
"LABEL_20628",
"LABEL_20629",
"LABEL_2063",
"LABEL_20630",
"LABEL_20631",
"LABEL_20632",
"LABEL_20633",
"LABEL_20634",
"LABEL_20635",
"LABEL_20636",
"LABEL_20637",
"LABEL_20638",
"LABEL_20639",
"LABEL_2064",
"LABEL_20640",
"LABEL_20641",
"LABEL_20642",
"LABEL_20643",
"LABEL_20644",
"LABEL_20645",
"LABEL_20646",
"LABEL_20647",
"LABEL_20648",
"LABEL_20649",
"LABEL_2065",
"LABEL_20650",
"LABEL_20651",
"LABEL_20652",
"LABEL_20653",
"LABEL_20654",
"LABEL_20655",
"LABEL_20656",
"LABEL_20657",
"LABEL_20658",
"LABEL_20659",
"LABEL_2066",
"LABEL_20660",
"LABEL_20661",
"LABEL_20662",
"LABEL_20663",
"LABEL_20664",
"LABEL_20665",
"LABEL_20666",
"LABEL_20667",
"LABEL_20668",
"LABEL_20669",
"LABEL_2067",
"LABEL_20670",
"LABEL_20671",
"LABEL_20672",
"LABEL_20673",
"LABEL_20674",
"LABEL_20675",
"LABEL_20676",
"LABEL_20677",
"LABEL_20678",
"LABEL_20679",
"LABEL_2068",
"LABEL_20680",
"LABEL_20681",
"LABEL_20682",
"LABEL_20683",
"LABEL_20684",
"LABEL_20685",
"LABEL_20686",
"LABEL_20687",
"LABEL_20688",
"LABEL_20689",
"LABEL_2069",
"LABEL_20690",
"LABEL_20691",
"LABEL_20692",
"LABEL_20693",
"LABEL_20694",
"LABEL_20695",
"LABEL_20696",
"LABEL_20697",
"LABEL_20698",
"LABEL_20699",
"LABEL_207",
"LABEL_2070",
"LABEL_20700",
"LABEL_20701",
"LABEL_20702",
"LABEL_20703",
"LABEL_20704",
"LABEL_20705",
"LABEL_20706",
"LABEL_20707",
"LABEL_20708",
"LABEL_20709",
"LABEL_2071",
"LABEL_20710",
"LABEL_20711",
"LABEL_20712",
"LABEL_20713",
"LABEL_20714",
"LABEL_20715",
"LABEL_20716",
"LABEL_20717",
"LABEL_20718",
"LABEL_20719",
"LABEL_2072",
"LABEL_20720",
"LABEL_20721",
"LABEL_20722",
"LABEL_20723",
"LABEL_20724",
"LABEL_20725",
"LABEL_20726",
"LABEL_20727",
"LABEL_20728",
"LABEL_20729",
"LABEL_2073",
"LABEL_20730",
"LABEL_20731",
"LABEL_20732",
"LABEL_20733",
"LABEL_20734",
"LABEL_20735",
"LABEL_20736",
"LABEL_20737",
"LABEL_20738",
"LABEL_20739",
"LABEL_2074",
"LABEL_20740",
"LABEL_20741",
"LABEL_20742",
"LABEL_20743",
"LABEL_20744",
"LABEL_20745",
"LABEL_20746",
"LABEL_20747",
"LABEL_20748",
"LABEL_20749",
"LABEL_2075",
"LABEL_20750",
"LABEL_20751",
"LABEL_20752",
"LABEL_20753",
"LABEL_20754",
"LABEL_20755",
"LABEL_20756",
"LABEL_20757",
"LABEL_20758",
"LABEL_20759",
"LABEL_2076",
"LABEL_20760",
"LABEL_20761",
"LABEL_20762",
"LABEL_20763",
"LABEL_20764",
"LABEL_20765",
"LABEL_20766",
"LABEL_20767",
"LABEL_20768",
"LABEL_20769",
"LABEL_2077",
"LABEL_20770",
"LABEL_20771",
"LABEL_20772",
"LABEL_20773",
"LABEL_20774",
"LABEL_20775",
"LABEL_20776",
"LABEL_20777",
"LABEL_20778",
"LABEL_20779",
"LABEL_2078",
"LABEL_20780",
"LABEL_20781",
"LABEL_20782",
"LABEL_20783",
"LABEL_20784",
"LABEL_20785",
"LABEL_20786",
"LABEL_20787",
"LABEL_20788",
"LABEL_20789",
"LABEL_2079",
"LABEL_20790",
"LABEL_20791",
"LABEL_20792",
"LABEL_20793",
"LABEL_20794",
"LABEL_20795",
"LABEL_20796",
"LABEL_20797",
"LABEL_20798",
"LABEL_20799",
"LABEL_208",
"LABEL_2080",
"LABEL_20800",
"LABEL_20801",
"LABEL_20802",
"LABEL_20803",
"LABEL_20804",
"LABEL_20805",
"LABEL_20806",
"LABEL_20807",
"LABEL_20808",
"LABEL_20809",
"LABEL_2081",
"LABEL_20810",
"LABEL_20811",
"LABEL_20812",
"LABEL_20813",
"LABEL_20814",
"LABEL_20815",
"LABEL_20816",
"LABEL_20817",
"LABEL_20818",
"LABEL_20819",
"LABEL_2082",
"LABEL_20820",
"LABEL_20821",
"LABEL_20822",
"LABEL_20823",
"LABEL_20824",
"LABEL_20825",
"LABEL_20826",
"LABEL_20827",
"LABEL_20828",
"LABEL_20829",
"LABEL_2083",
"LABEL_20830",
"LABEL_20831",
"LABEL_20832",
"LABEL_20833",
"LABEL_20834",
"LABEL_20835",
"LABEL_20836",
"LABEL_20837",
"LABEL_20838",
"LABEL_20839",
"LABEL_2084",
"LABEL_20840",
"LABEL_20841",
"LABEL_20842",
"LABEL_20843",
"LABEL_20844",
"LABEL_20845",
"LABEL_20846",
"LABEL_20847",
"LABEL_20848",
"LABEL_20849",
"LABEL_2085",
"LABEL_20850",
"LABEL_20851",
"LABEL_20852",
"LABEL_20853",
"LABEL_20854",
"LABEL_20855",
"LABEL_20856",
"LABEL_20857",
"LABEL_20858",
"LABEL_20859",
"LABEL_2086",
"LABEL_20860",
"LABEL_20861",
"LABEL_20862",
"LABEL_20863",
"LABEL_20864",
"LABEL_20865",
"LABEL_20866",
"LABEL_20867",
"LABEL_20868",
"LABEL_20869",
"LABEL_2087",
"LABEL_20870",
"LABEL_20871",
"LABEL_20872",
"LABEL_20873",
"LABEL_20874",
"LABEL_20875",
"LABEL_20876",
"LABEL_20877",
"LABEL_20878",
"LABEL_20879",
"LABEL_2088",
"LABEL_20880",
"LABEL_20881",
"LABEL_20882",
"LABEL_20883",
"LABEL_20884",
"LABEL_20885",
"LABEL_20886",
"LABEL_20887",
"LABEL_20888",
"LABEL_20889",
"LABEL_2089",
"LABEL_20890",
"LABEL_20891",
"LABEL_20892",
"LABEL_20893",
"LABEL_20894",
"LABEL_20895",
"LABEL_20896",
"LABEL_20897",
"LABEL_20898",
"LABEL_20899",
"LABEL_209",
"LABEL_2090",
"LABEL_20900",
"LABEL_20901",
"LABEL_20902",
"LABEL_20903",
"LABEL_20904",
"LABEL_20905",
"LABEL_20906",
"LABEL_20907",
"LABEL_20908",
"LABEL_20909",
"LABEL_2091",
"LABEL_20910",
"LABEL_20911",
"LABEL_20912",
"LABEL_20913",
"LABEL_20914",
"LABEL_20915",
"LABEL_20916",
"LABEL_20917",
"LABEL_20918",
"LABEL_20919",
"LABEL_2092",
"LABEL_20920",
"LABEL_20921",
"LABEL_20922",
"LABEL_20923",
"LABEL_20924",
"LABEL_20925",
"LABEL_20926",
"LABEL_20927",
"LABEL_20928",
"LABEL_20929",
"LABEL_2093",
"LABEL_20930",
"LABEL_20931",
"LABEL_20932",
"LABEL_20933",
"LABEL_20934",
"LABEL_20935",
"LABEL_20936",
"LABEL_20937",
"LABEL_20938",
"LABEL_20939",
"LABEL_2094",
"LABEL_20940",
"LABEL_20941",
"LABEL_20942",
"LABEL_20943",
"LABEL_20944",
"LABEL_20945",
"LABEL_20946",
"LABEL_20947",
"LABEL_20948",
"LABEL_20949",
"LABEL_2095",
"LABEL_20950",
"LABEL_20951",
"LABEL_20952",
"LABEL_20953",
"LABEL_20954",
"LABEL_20955",
"LABEL_20956",
"LABEL_20957",
"LABEL_20958",
"LABEL_20959",
"LABEL_2096",
"LABEL_20960",
"LABEL_20961",
"LABEL_20962",
"LABEL_20963",
"LABEL_20964",
"LABEL_20965",
"LABEL_20966",
"LABEL_20967",
"LABEL_20968",
"LABEL_20969",
"LABEL_2097",
"LABEL_20970",
"LABEL_20971",
"LABEL_20972",
"LABEL_20973",
"LABEL_20974",
"LABEL_20975",
"LABEL_20976",
"LABEL_20977",
"LABEL_20978",
"LABEL_20979",
"LABEL_2098",
"LABEL_20980",
"LABEL_20981",
"LABEL_20982",
"LABEL_20983",
"LABEL_20984",
"LABEL_20985",
"LABEL_20986",
"LABEL_20987",
"LABEL_20988",
"LABEL_20989",
"LABEL_2099",
"LABEL_20990",
"LABEL_20991",
"LABEL_20992",
"LABEL_20993",
"LABEL_20994",
"LABEL_20995",
"LABEL_20996",
"LABEL_20997",
"LABEL_20998",
"LABEL_20999",
"LABEL_21",
"LABEL_210",
"LABEL_2100",
"LABEL_21000",
"LABEL_21001",
"LABEL_21002",
"LABEL_21003",
"LABEL_21004",
"LABEL_21005",
"LABEL_21006",
"LABEL_21007",
"LABEL_21008",
"LABEL_21009",
"LABEL_2101",
"LABEL_21010",
"LABEL_21011",
"LABEL_21012",
"LABEL_21013",
"LABEL_21014",
"LABEL_21015",
"LABEL_21016",
"LABEL_21017",
"LABEL_21018",
"LABEL_21019",
"LABEL_2102",
"LABEL_21020",
"LABEL_21021",
"LABEL_21022",
"LABEL_21023",
"LABEL_21024",
"LABEL_21025",
"LABEL_21026",
"LABEL_21027",
"LABEL_21028",
"LABEL_21029",
"LABEL_2103",
"LABEL_21030",
"LABEL_21031",
"LABEL_21032",
"LABEL_21033",
"LABEL_21034",
"LABEL_21035",
"LABEL_21036",
"LABEL_21037",
"LABEL_21038",
"LABEL_21039",
"LABEL_2104",
"LABEL_21040",
"LABEL_21041",
"LABEL_21042",
"LABEL_21043",
"LABEL_21044",
"LABEL_21045",
"LABEL_21046",
"LABEL_21047",
"LABEL_21048",
"LABEL_21049",
"LABEL_2105",
"LABEL_21050",
"LABEL_21051",
"LABEL_21052",
"LABEL_21053",
"LABEL_21054",
"LABEL_21055",
"LABEL_21056",
"LABEL_21057",
"LABEL_21058",
"LABEL_21059",
"LABEL_2106",
"LABEL_21060",
"LABEL_21061",
"LABEL_21062",
"LABEL_21063",
"LABEL_21064",
"LABEL_21065",
"LABEL_21066",
"LABEL_21067",
"LABEL_21068",
"LABEL_21069",
"LABEL_2107",
"LABEL_21070",
"LABEL_21071",
"LABEL_21072",
"LABEL_21073",
"LABEL_21074",
"LABEL_21075",
"LABEL_21076",
"LABEL_21077",
"LABEL_21078",
"LABEL_21079",
"LABEL_2108",
"LABEL_21080",
"LABEL_21081",
"LABEL_21082",
"LABEL_21083",
"LABEL_21084",
"LABEL_21085",
"LABEL_21086",
"LABEL_21087",
"LABEL_21088",
"LABEL_21089",
"LABEL_2109",
"LABEL_21090",
"LABEL_21091",
"LABEL_21092",
"LABEL_21093",
"LABEL_21094",
"LABEL_21095",
"LABEL_21096",
"LABEL_21097",
"LABEL_21098",
"LABEL_21099",
"LABEL_211",
"LABEL_2110",
"LABEL_21100",
"LABEL_21101",
"LABEL_21102",
"LABEL_21103",
"LABEL_21104",
"LABEL_21105",
"LABEL_21106",
"LABEL_21107",
"LABEL_21108",
"LABEL_21109",
"LABEL_2111",
"LABEL_21110",
"LABEL_21111",
"LABEL_21112",
"LABEL_21113",
"LABEL_21114",
"LABEL_21115",
"LABEL_21116",
"LABEL_21117",
"LABEL_21118",
"LABEL_21119",
"LABEL_2112",
"LABEL_21120",
"LABEL_21121",
"LABEL_21122",
"LABEL_21123",
"LABEL_21124",
"LABEL_21125",
"LABEL_21126",
"LABEL_21127",
"LABEL_21128",
"LABEL_21129",
"LABEL_2113",
"LABEL_21130",
"LABEL_21131",
"LABEL_21132",
"LABEL_21133",
"LABEL_21134",
"LABEL_21135",
"LABEL_21136",
"LABEL_21137",
"LABEL_21138",
"LABEL_21139",
"LABEL_2114",
"LABEL_21140",
"LABEL_21141",
"LABEL_21142",
"LABEL_21143",
"LABEL_21144",
"LABEL_21145",
"LABEL_21146",
"LABEL_21147",
"LABEL_21148",
"LABEL_21149",
"LABEL_2115",
"LABEL_21150",
"LABEL_21151",
"LABEL_21152",
"LABEL_21153",
"LABEL_21154",
"LABEL_21155",
"LABEL_21156",
"LABEL_21157",
"LABEL_21158",
"LABEL_21159",
"LABEL_2116",
"LABEL_21160",
"LABEL_21161",
"LABEL_21162",
"LABEL_21163",
"LABEL_21164",
"LABEL_21165",
"LABEL_21166",
"LABEL_21167",
"LABEL_21168",
"LABEL_21169",
"LABEL_2117",
"LABEL_21170",
"LABEL_21171",
"LABEL_21172",
"LABEL_21173",
"LABEL_21174",
"LABEL_21175",
"LABEL_21176",
"LABEL_21177",
"LABEL_21178",
"LABEL_21179",
"LABEL_2118",
"LABEL_21180",
"LABEL_21181",
"LABEL_21182",
"LABEL_21183",
"LABEL_21184",
"LABEL_21185",
"LABEL_21186",
"LABEL_21187",
"LABEL_21188",
"LABEL_21189",
"LABEL_2119",
"LABEL_21190",
"LABEL_21191",
"LABEL_21192",
"LABEL_21193",
"LABEL_21194",
"LABEL_21195",
"LABEL_21196",
"LABEL_21197",
"LABEL_21198",
"LABEL_21199",
"LABEL_212",
"LABEL_2120",
"LABEL_21200",
"LABEL_21201",
"LABEL_21202",
"LABEL_21203",
"LABEL_21204",
"LABEL_21205",
"LABEL_21206",
"LABEL_21207",
"LABEL_21208",
"LABEL_21209",
"LABEL_2121",
"LABEL_21210",
"LABEL_21211",
"LABEL_21212",
"LABEL_21213",
"LABEL_21214",
"LABEL_21215",
"LABEL_21216",
"LABEL_21217",
"LABEL_21218",
"LABEL_21219",
"LABEL_2122",
"LABEL_21220",
"LABEL_21221",
"LABEL_21222",
"LABEL_21223",
"LABEL_21224",
"LABEL_21225",
"LABEL_21226",
"LABEL_21227",
"LABEL_21228",
"LABEL_21229",
"LABEL_2123",
"LABEL_21230",
"LABEL_21231",
"LABEL_21232",
"LABEL_21233",
"LABEL_21234",
"LABEL_21235",
"LABEL_21236",
"LABEL_21237",
"LABEL_21238",
"LABEL_21239",
"LABEL_2124",
"LABEL_21240",
"LABEL_21241",
"LABEL_21242",
"LABEL_21243",
"LABEL_21244",
"LABEL_21245",
"LABEL_21246",
"LABEL_21247",
"LABEL_21248",
"LABEL_21249",
"LABEL_2125",
"LABEL_21250",
"LABEL_21251",
"LABEL_21252",
"LABEL_21253",
"LABEL_21254",
"LABEL_21255",
"LABEL_21256",
"LABEL_21257",
"LABEL_21258",
"LABEL_21259",
"LABEL_2126",
"LABEL_21260",
"LABEL_21261",
"LABEL_21262",
"LABEL_21263",
"LABEL_21264",
"LABEL_21265",
"LABEL_21266",
"LABEL_21267",
"LABEL_21268",
"LABEL_21269",
"LABEL_2127",
"LABEL_21270",
"LABEL_21271",
"LABEL_21272",
"LABEL_21273",
"LABEL_21274",
"LABEL_21275",
"LABEL_21276",
"LABEL_21277",
"LABEL_21278",
"LABEL_21279",
"LABEL_2128",
"LABEL_21280",
"LABEL_21281",
"LABEL_21282",
"LABEL_21283",
"LABEL_21284",
"LABEL_21285",
"LABEL_21286",
"LABEL_21287",
"LABEL_21288",
"LABEL_21289",
"LABEL_2129",
"LABEL_21290",
"LABEL_21291",
"LABEL_21292",
"LABEL_21293",
"LABEL_21294",
"LABEL_21295",
"LABEL_21296",
"LABEL_21297",
"LABEL_21298",
"LABEL_21299",
"LABEL_213",
"LABEL_2130",
"LABEL_21300",
"LABEL_21301",
"LABEL_21302",
"LABEL_21303",
"LABEL_21304",
"LABEL_21305",
"LABEL_21306",
"LABEL_21307",
"LABEL_21308",
"LABEL_21309",
"LABEL_2131",
"LABEL_21310",
"LABEL_21311",
"LABEL_21312",
"LABEL_21313",
"LABEL_21314",
"LABEL_21315",
"LABEL_21316",
"LABEL_21317",
"LABEL_21318",
"LABEL_21319",
"LABEL_2132",
"LABEL_21320",
"LABEL_21321",
"LABEL_21322",
"LABEL_21323",
"LABEL_21324",
"LABEL_21325",
"LABEL_21326",
"LABEL_21327",
"LABEL_21328",
"LABEL_21329",
"LABEL_2133",
"LABEL_21330",
"LABEL_21331",
"LABEL_21332",
"LABEL_21333",
"LABEL_21334",
"LABEL_21335",
"LABEL_21336",
"LABEL_21337",
"LABEL_21338",
"LABEL_21339",
"LABEL_2134",
"LABEL_21340",
"LABEL_21341",
"LABEL_21342",
"LABEL_21343",
"LABEL_21344",
"LABEL_21345",
"LABEL_21346",
"LABEL_21347",
"LABEL_21348",
"LABEL_21349",
"LABEL_2135",
"LABEL_21350",
"LABEL_21351",
"LABEL_21352",
"LABEL_21353",
"LABEL_21354",
"LABEL_21355",
"LABEL_21356",
"LABEL_21357",
"LABEL_21358",
"LABEL_21359",
"LABEL_2136",
"LABEL_21360",
"LABEL_21361",
"LABEL_21362",
"LABEL_21363",
"LABEL_21364",
"LABEL_21365",
"LABEL_21366",
"LABEL_21367",
"LABEL_21368",
"LABEL_21369",
"LABEL_2137",
"LABEL_21370",
"LABEL_21371",
"LABEL_21372",
"LABEL_21373",
"LABEL_21374",
"LABEL_21375",
"LABEL_21376",
"LABEL_21377",
"LABEL_21378",
"LABEL_21379",
"LABEL_2138",
"LABEL_21380",
"LABEL_21381",
"LABEL_21382",
"LABEL_21383",
"LABEL_21384",
"LABEL_21385",
"LABEL_21386",
"LABEL_21387",
"LABEL_21388",
"LABEL_21389",
"LABEL_2139",
"LABEL_21390",
"LABEL_21391",
"LABEL_21392",
"LABEL_21393",
"LABEL_21394",
"LABEL_21395",
"LABEL_21396",
"LABEL_21397",
"LABEL_21398",
"LABEL_21399",
"LABEL_214",
"LABEL_2140",
"LABEL_21400",
"LABEL_21401",
"LABEL_21402",
"LABEL_21403",
"LABEL_21404",
"LABEL_21405",
"LABEL_21406",
"LABEL_21407",
"LABEL_21408",
"LABEL_21409",
"LABEL_2141",
"LABEL_21410",
"LABEL_21411",
"LABEL_21412",
"LABEL_21413",
"LABEL_21414",
"LABEL_21415",
"LABEL_21416",
"LABEL_21417",
"LABEL_21418",
"LABEL_21419",
"LABEL_2142",
"LABEL_21420",
"LABEL_21421",
"LABEL_21422",
"LABEL_21423",
"LABEL_21424",
"LABEL_21425",
"LABEL_21426",
"LABEL_21427",
"LABEL_21428",
"LABEL_21429",
"LABEL_2143",
"LABEL_21430",
"LABEL_21431",
"LABEL_21432",
"LABEL_21433",
"LABEL_21434",
"LABEL_21435",
"LABEL_21436",
"LABEL_21437",
"LABEL_21438",
"LABEL_21439",
"LABEL_2144",
"LABEL_21440",
"LABEL_21441",
"LABEL_21442",
"LABEL_21443",
"LABEL_21444",
"LABEL_21445",
"LABEL_21446",
"LABEL_21447",
"LABEL_21448",
"LABEL_21449",
"LABEL_2145",
"LABEL_21450",
"LABEL_21451",
"LABEL_21452",
"LABEL_21453",
"LABEL_21454",
"LABEL_21455",
"LABEL_21456",
"LABEL_21457",
"LABEL_21458",
"LABEL_21459",
"LABEL_2146",
"LABEL_21460",
"LABEL_21461",
"LABEL_21462",
"LABEL_21463",
"LABEL_21464",
"LABEL_21465",
"LABEL_21466",
"LABEL_21467",
"LABEL_21468",
"LABEL_21469",
"LABEL_2147",
"LABEL_21470",
"LABEL_21471",
"LABEL_21472",
"LABEL_21473",
"LABEL_21474",
"LABEL_21475",
"LABEL_21476",
"LABEL_21477",
"LABEL_21478",
"LABEL_21479",
"LABEL_2148",
"LABEL_21480",
"LABEL_21481",
"LABEL_21482",
"LABEL_21483",
"LABEL_21484",
"LABEL_21485",
"LABEL_21486",
"LABEL_21487",
"LABEL_21488",
"LABEL_21489",
"LABEL_2149",
"LABEL_21490",
"LABEL_21491",
"LABEL_21492",
"LABEL_21493",
"LABEL_21494",
"LABEL_21495",
"LABEL_21496",
"LABEL_21497",
"LABEL_21498",
"LABEL_21499",
"LABEL_215",
"LABEL_2150",
"LABEL_21500",
"LABEL_21501",
"LABEL_21502",
"LABEL_21503",
"LABEL_21504",
"LABEL_21505",
"LABEL_21506",
"LABEL_21507",
"LABEL_21508",
"LABEL_21509",
"LABEL_2151",
"LABEL_21510",
"LABEL_21511",
"LABEL_21512",
"LABEL_21513",
"LABEL_21514",
"LABEL_21515",
"LABEL_21516",
"LABEL_21517",
"LABEL_21518",
"LABEL_21519",
"LABEL_2152",
"LABEL_21520",
"LABEL_21521",
"LABEL_21522",
"LABEL_21523",
"LABEL_21524",
"LABEL_21525",
"LABEL_21526",
"LABEL_21527",
"LABEL_21528",
"LABEL_21529",
"LABEL_2153",
"LABEL_21530",
"LABEL_21531",
"LABEL_21532",
"LABEL_21533",
"LABEL_21534",
"LABEL_21535",
"LABEL_21536",
"LABEL_21537",
"LABEL_21538",
"LABEL_21539",
"LABEL_2154",
"LABEL_21540",
"LABEL_21541",
"LABEL_21542",
"LABEL_21543",
"LABEL_21544",
"LABEL_21545",
"LABEL_21546",
"LABEL_21547",
"LABEL_21548",
"LABEL_21549",
"LABEL_2155",
"LABEL_21550",
"LABEL_21551",
"LABEL_21552",
"LABEL_21553",
"LABEL_21554",
"LABEL_21555",
"LABEL_21556",
"LABEL_21557",
"LABEL_21558",
"LABEL_21559",
"LABEL_2156",
"LABEL_21560",
"LABEL_21561",
"LABEL_21562",
"LABEL_21563",
"LABEL_21564",
"LABEL_21565",
"LABEL_21566",
"LABEL_21567",
"LABEL_21568",
"LABEL_21569",
"LABEL_2157",
"LABEL_21570",
"LABEL_21571",
"LABEL_21572",
"LABEL_21573",
"LABEL_21574",
"LABEL_21575",
"LABEL_21576",
"LABEL_21577",
"LABEL_21578",
"LABEL_21579",
"LABEL_2158",
"LABEL_21580",
"LABEL_21581",
"LABEL_21582",
"LABEL_21583",
"LABEL_21584",
"LABEL_21585",
"LABEL_21586",
"LABEL_21587",
"LABEL_21588",
"LABEL_21589",
"LABEL_2159",
"LABEL_21590",
"LABEL_21591",
"LABEL_21592",
"LABEL_21593",
"LABEL_21594",
"LABEL_21595",
"LABEL_21596",
"LABEL_21597",
"LABEL_21598",
"LABEL_21599",
"LABEL_216",
"LABEL_2160",
"LABEL_21600",
"LABEL_21601",
"LABEL_21602",
"LABEL_21603",
"LABEL_21604",
"LABEL_21605",
"LABEL_21606",
"LABEL_21607",
"LABEL_21608",
"LABEL_21609",
"LABEL_2161",
"LABEL_21610",
"LABEL_21611",
"LABEL_21612",
"LABEL_21613",
"LABEL_21614",
"LABEL_21615",
"LABEL_21616",
"LABEL_21617",
"LABEL_21618",
"LABEL_21619",
"LABEL_2162",
"LABEL_21620",
"LABEL_21621",
"LABEL_21622",
"LABEL_21623",
"LABEL_21624",
"LABEL_21625",
"LABEL_21626",
"LABEL_21627",
"LABEL_21628",
"LABEL_21629",
"LABEL_2163",
"LABEL_21630",
"LABEL_21631",
"LABEL_21632",
"LABEL_21633",
"LABEL_21634",
"LABEL_21635",
"LABEL_21636",
"LABEL_21637",
"LABEL_21638",
"LABEL_21639",
"LABEL_2164",
"LABEL_21640",
"LABEL_21641",
"LABEL_21642",
"LABEL_21643",
"LABEL_21644",
"LABEL_21645",
"LABEL_21646",
"LABEL_21647",
"LABEL_21648",
"LABEL_21649",
"LABEL_2165",
"LABEL_21650",
"LABEL_21651",
"LABEL_21652",
"LABEL_21653",
"LABEL_21654",
"LABEL_21655",
"LABEL_21656",
"LABEL_21657",
"LABEL_21658",
"LABEL_21659",
"LABEL_2166",
"LABEL_21660",
"LABEL_21661",
"LABEL_21662",
"LABEL_21663",
"LABEL_21664",
"LABEL_21665",
"LABEL_21666",
"LABEL_21667",
"LABEL_21668",
"LABEL_21669",
"LABEL_2167",
"LABEL_21670",
"LABEL_21671",
"LABEL_21672",
"LABEL_21673",
"LABEL_21674",
"LABEL_21675",
"LABEL_21676",
"LABEL_21677",
"LABEL_21678",
"LABEL_21679",
"LABEL_2168",
"LABEL_21680",
"LABEL_21681",
"LABEL_21682",
"LABEL_21683",
"LABEL_21684",
"LABEL_21685",
"LABEL_21686",
"LABEL_21687",
"LABEL_21688",
"LABEL_21689",
"LABEL_2169",
"LABEL_21690",
"LABEL_21691",
"LABEL_21692",
"LABEL_21693",
"LABEL_21694",
"LABEL_21695",
"LABEL_21696",
"LABEL_21697",
"LABEL_21698",
"LABEL_21699",
"LABEL_217",
"LABEL_2170",
"LABEL_21700",
"LABEL_21701",
"LABEL_21702",
"LABEL_21703",
"LABEL_21704",
"LABEL_21705",
"LABEL_21706",
"LABEL_21707",
"LABEL_21708",
"LABEL_21709",
"LABEL_2171",
"LABEL_21710",
"LABEL_21711",
"LABEL_21712",
"LABEL_21713",
"LABEL_21714",
"LABEL_21715",
"LABEL_21716",
"LABEL_21717",
"LABEL_21718",
"LABEL_21719",
"LABEL_2172",
"LABEL_21720",
"LABEL_21721",
"LABEL_21722",
"LABEL_21723",
"LABEL_21724",
"LABEL_21725",
"LABEL_21726",
"LABEL_21727",
"LABEL_21728",
"LABEL_21729",
"LABEL_2173",
"LABEL_21730",
"LABEL_21731",
"LABEL_21732",
"LABEL_21733",
"LABEL_21734",
"LABEL_21735",
"LABEL_21736",
"LABEL_21737",
"LABEL_21738",
"LABEL_21739",
"LABEL_2174",
"LABEL_21740",
"LABEL_21741",
"LABEL_21742",
"LABEL_21743",
"LABEL_21744",
"LABEL_21745",
"LABEL_21746",
"LABEL_21747",
"LABEL_21748",
"LABEL_21749",
"LABEL_2175",
"LABEL_21750",
"LABEL_21751",
"LABEL_21752",
"LABEL_21753",
"LABEL_21754",
"LABEL_21755",
"LABEL_21756",
"LABEL_21757",
"LABEL_21758",
"LABEL_21759",
"LABEL_2176",
"LABEL_21760",
"LABEL_21761",
"LABEL_21762",
"LABEL_21763",
"LABEL_21764",
"LABEL_21765",
"LABEL_21766",
"LABEL_21767",
"LABEL_21768",
"LABEL_21769",
"LABEL_2177",
"LABEL_21770",
"LABEL_21771",
"LABEL_21772",
"LABEL_21773",
"LABEL_21774",
"LABEL_21775",
"LABEL_21776",
"LABEL_21777",
"LABEL_21778",
"LABEL_21779",
"LABEL_2178",
"LABEL_21780",
"LABEL_21781",
"LABEL_21782",
"LABEL_21783",
"LABEL_21784",
"LABEL_21785",
"LABEL_21786",
"LABEL_21787",
"LABEL_21788",
"LABEL_21789",
"LABEL_2179",
"LABEL_21790",
"LABEL_21791",
"LABEL_21792",
"LABEL_21793",
"LABEL_21794",
"LABEL_21795",
"LABEL_21796",
"LABEL_21797",
"LABEL_21798",
"LABEL_21799",
"LABEL_218",
"LABEL_2180",
"LABEL_21800",
"LABEL_21801",
"LABEL_21802",
"LABEL_21803",
"LABEL_21804",
"LABEL_21805",
"LABEL_21806",
"LABEL_21807",
"LABEL_21808",
"LABEL_21809",
"LABEL_2181",
"LABEL_21810",
"LABEL_21811",
"LABEL_21812",
"LABEL_21813",
"LABEL_21814",
"LABEL_21815",
"LABEL_21816",
"LABEL_21817",
"LABEL_21818",
"LABEL_21819",
"LABEL_2182",
"LABEL_21820",
"LABEL_21821",
"LABEL_21822",
"LABEL_21823",
"LABEL_21824",
"LABEL_21825",
"LABEL_21826",
"LABEL_21827",
"LABEL_21828",
"LABEL_21829",
"LABEL_2183",
"LABEL_21830",
"LABEL_21831",
"LABEL_21832",
"LABEL_21833",
"LABEL_21834",
"LABEL_21835",
"LABEL_21836",
"LABEL_21837",
"LABEL_21838",
"LABEL_21839",
"LABEL_2184",
"LABEL_21840",
"LABEL_21841",
"LABEL_21842",
"LABEL_21843",
"LABEL_21844",
"LABEL_21845",
"LABEL_21846",
"LABEL_21847",
"LABEL_21848",
"LABEL_21849",
"LABEL_2185",
"LABEL_21850",
"LABEL_21851",
"LABEL_21852",
"LABEL_21853",
"LABEL_21854",
"LABEL_21855",
"LABEL_21856",
"LABEL_21857",
"LABEL_21858",
"LABEL_21859",
"LABEL_2186",
"LABEL_21860",
"LABEL_21861",
"LABEL_21862",
"LABEL_21863",
"LABEL_21864",
"LABEL_21865",
"LABEL_21866",
"LABEL_21867",
"LABEL_21868",
"LABEL_21869",
"LABEL_2187",
"LABEL_21870",
"LABEL_21871",
"LABEL_21872",
"LABEL_21873",
"LABEL_21874",
"LABEL_21875",
"LABEL_21876",
"LABEL_21877",
"LABEL_21878",
"LABEL_21879",
"LABEL_2188",
"LABEL_21880",
"LABEL_21881",
"LABEL_21882",
"LABEL_21883",
"LABEL_21884",
"LABEL_21885",
"LABEL_21886",
"LABEL_21887",
"LABEL_21888",
"LABEL_21889",
"LABEL_2189",
"LABEL_21890",
"LABEL_21891",
"LABEL_21892",
"LABEL_21893",
"LABEL_21894",
"LABEL_21895",
"LABEL_21896",
"LABEL_21897",
"LABEL_21898",
"LABEL_21899",
"LABEL_219",
"LABEL_2190",
"LABEL_21900",
"LABEL_21901",
"LABEL_21902",
"LABEL_21903",
"LABEL_21904",
"LABEL_21905",
"LABEL_21906",
"LABEL_21907",
"LABEL_21908",
"LABEL_21909",
"LABEL_2191",
"LABEL_21910",
"LABEL_21911",
"LABEL_21912",
"LABEL_21913",
"LABEL_21914",
"LABEL_21915",
"LABEL_21916",
"LABEL_21917",
"LABEL_21918",
"LABEL_21919",
"LABEL_2192",
"LABEL_21920",
"LABEL_21921",
"LABEL_21922",
"LABEL_21923",
"LABEL_21924",
"LABEL_21925",
"LABEL_21926",
"LABEL_21927",
"LABEL_21928",
"LABEL_21929",
"LABEL_2193",
"LABEL_21930",
"LABEL_21931",
"LABEL_21932",
"LABEL_21933",
"LABEL_21934",
"LABEL_21935",
"LABEL_21936",
"LABEL_21937",
"LABEL_21938",
"LABEL_21939",
"LABEL_2194",
"LABEL_21940",
"LABEL_21941",
"LABEL_21942",
"LABEL_21943",
"LABEL_21944",
"LABEL_21945",
"LABEL_21946",
"LABEL_21947",
"LABEL_21948",
"LABEL_21949",
"LABEL_2195",
"LABEL_21950",
"LABEL_21951",
"LABEL_21952",
"LABEL_21953",
"LABEL_21954",
"LABEL_21955",
"LABEL_21956",
"LABEL_21957",
"LABEL_21958",
"LABEL_21959",
"LABEL_2196",
"LABEL_21960",
"LABEL_21961",
"LABEL_21962",
"LABEL_21963",
"LABEL_21964",
"LABEL_21965",
"LABEL_21966",
"LABEL_21967",
"LABEL_21968",
"LABEL_21969",
"LABEL_2197",
"LABEL_21970",
"LABEL_21971",
"LABEL_21972",
"LABEL_21973",
"LABEL_21974",
"LABEL_21975",
"LABEL_21976",
"LABEL_21977",
"LABEL_21978",
"LABEL_21979",
"LABEL_2198",
"LABEL_21980",
"LABEL_21981",
"LABEL_21982",
"LABEL_21983",
"LABEL_21984",
"LABEL_21985",
"LABEL_21986",
"LABEL_21987",
"LABEL_21988",
"LABEL_21989",
"LABEL_2199",
"LABEL_21990",
"LABEL_21991",
"LABEL_21992",
"LABEL_21993",
"LABEL_21994",
"LABEL_21995",
"LABEL_21996",
"LABEL_21997",
"LABEL_21998",
"LABEL_21999",
"LABEL_22",
"LABEL_220",
"LABEL_2200",
"LABEL_22000",
"LABEL_22001",
"LABEL_22002",
"LABEL_22003",
"LABEL_22004",
"LABEL_22005",
"LABEL_22006",
"LABEL_22007",
"LABEL_22008",
"LABEL_22009",
"LABEL_2201",
"LABEL_22010",
"LABEL_22011",
"LABEL_22012",
"LABEL_22013",
"LABEL_22014",
"LABEL_22015",
"LABEL_22016",
"LABEL_22017",
"LABEL_22018",
"LABEL_22019",
"LABEL_2202",
"LABEL_22020",
"LABEL_22021",
"LABEL_22022",
"LABEL_22023",
"LABEL_22024",
"LABEL_22025",
"LABEL_22026",
"LABEL_22027",
"LABEL_22028",
"LABEL_22029",
"LABEL_2203",
"LABEL_22030",
"LABEL_22031",
"LABEL_22032",
"LABEL_22033",
"LABEL_22034",
"LABEL_22035",
"LABEL_22036",
"LABEL_22037",
"LABEL_22038",
"LABEL_22039",
"LABEL_2204",
"LABEL_22040",
"LABEL_22041",
"LABEL_22042",
"LABEL_22043",
"LABEL_22044",
"LABEL_22045",
"LABEL_22046",
"LABEL_22047",
"LABEL_22048",
"LABEL_22049",
"LABEL_2205",
"LABEL_22050",
"LABEL_22051",
"LABEL_22052",
"LABEL_22053",
"LABEL_22054",
"LABEL_22055",
"LABEL_22056",
"LABEL_22057",
"LABEL_22058",
"LABEL_22059",
"LABEL_2206",
"LABEL_22060",
"LABEL_22061",
"LABEL_22062",
"LABEL_22063",
"LABEL_22064",
"LABEL_22065",
"LABEL_22066",
"LABEL_22067",
"LABEL_22068",
"LABEL_22069",
"LABEL_2207",
"LABEL_22070",
"LABEL_22071",
"LABEL_22072",
"LABEL_22073",
"LABEL_22074",
"LABEL_22075",
"LABEL_22076",
"LABEL_22077",
"LABEL_22078",
"LABEL_22079",
"LABEL_2208",
"LABEL_22080",
"LABEL_22081",
"LABEL_22082",
"LABEL_22083",
"LABEL_22084",
"LABEL_22085",
"LABEL_22086",
"LABEL_22087",
"LABEL_22088",
"LABEL_22089",
"LABEL_2209",
"LABEL_22090",
"LABEL_22091",
"LABEL_22092",
"LABEL_22093",
"LABEL_22094",
"LABEL_22095",
"LABEL_22096",
"LABEL_22097",
"LABEL_22098",
"LABEL_22099",
"LABEL_221",
"LABEL_2210",
"LABEL_22100",
"LABEL_22101",
"LABEL_22102",
"LABEL_22103",
"LABEL_22104",
"LABEL_22105",
"LABEL_22106",
"LABEL_22107",
"LABEL_22108",
"LABEL_22109",
"LABEL_2211",
"LABEL_22110",
"LABEL_22111",
"LABEL_22112",
"LABEL_22113",
"LABEL_22114",
"LABEL_22115",
"LABEL_22116",
"LABEL_22117",
"LABEL_22118",
"LABEL_22119",
"LABEL_2212",
"LABEL_22120",
"LABEL_22121",
"LABEL_22122",
"LABEL_22123",
"LABEL_22124",
"LABEL_22125",
"LABEL_22126",
"LABEL_22127",
"LABEL_22128",
"LABEL_22129",
"LABEL_2213",
"LABEL_22130",
"LABEL_22131",
"LABEL_22132",
"LABEL_22133",
"LABEL_22134",
"LABEL_22135",
"LABEL_22136",
"LABEL_22137",
"LABEL_22138",
"LABEL_22139",
"LABEL_2214",
"LABEL_22140",
"LABEL_22141",
"LABEL_22142",
"LABEL_22143",
"LABEL_22144",
"LABEL_22145",
"LABEL_22146",
"LABEL_22147",
"LABEL_22148",
"LABEL_22149",
"LABEL_2215",
"LABEL_22150",
"LABEL_22151",
"LABEL_22152",
"LABEL_22153",
"LABEL_22154",
"LABEL_22155",
"LABEL_22156",
"LABEL_22157",
"LABEL_22158",
"LABEL_22159",
"LABEL_2216",
"LABEL_22160",
"LABEL_22161",
"LABEL_22162",
"LABEL_22163",
"LABEL_22164",
"LABEL_22165",
"LABEL_22166",
"LABEL_22167",
"LABEL_22168",
"LABEL_22169",
"LABEL_2217",
"LABEL_22170",
"LABEL_22171",
"LABEL_22172",
"LABEL_22173",
"LABEL_22174",
"LABEL_22175",
"LABEL_22176",
"LABEL_22177",
"LABEL_22178",
"LABEL_22179",
"LABEL_2218",
"LABEL_22180",
"LABEL_22181",
"LABEL_22182",
"LABEL_22183",
"LABEL_22184",
"LABEL_22185",
"LABEL_22186",
"LABEL_22187",
"LABEL_22188",
"LABEL_22189",
"LABEL_2219",
"LABEL_22190",
"LABEL_22191",
"LABEL_22192",
"LABEL_22193",
"LABEL_22194",
"LABEL_22195",
"LABEL_22196",
"LABEL_22197",
"LABEL_22198",
"LABEL_22199",
"LABEL_222",
"LABEL_2220",
"LABEL_22200",
"LABEL_22201",
"LABEL_22202",
"LABEL_22203",
"LABEL_22204",
"LABEL_22205",
"LABEL_22206",
"LABEL_22207",
"LABEL_22208",
"LABEL_22209",
"LABEL_2221",
"LABEL_22210",
"LABEL_22211",
"LABEL_22212",
"LABEL_22213",
"LABEL_22214",
"LABEL_22215",
"LABEL_22216",
"LABEL_22217",
"LABEL_22218",
"LABEL_22219",
"LABEL_2222",
"LABEL_22220",
"LABEL_22221",
"LABEL_22222",
"LABEL_22223",
"LABEL_22224",
"LABEL_22225",
"LABEL_22226",
"LABEL_22227",
"LABEL_22228",
"LABEL_22229",
"LABEL_2223",
"LABEL_22230",
"LABEL_22231",
"LABEL_22232",
"LABEL_22233",
"LABEL_22234",
"LABEL_22235",
"LABEL_22236",
"LABEL_22237",
"LABEL_22238",
"LABEL_22239",
"LABEL_2224",
"LABEL_22240",
"LABEL_22241",
"LABEL_22242",
"LABEL_22243",
"LABEL_22244",
"LABEL_22245",
"LABEL_22246",
"LABEL_22247",
"LABEL_22248",
"LABEL_22249",
"LABEL_2225",
"LABEL_22250",
"LABEL_22251",
"LABEL_22252",
"LABEL_22253",
"LABEL_22254",
"LABEL_22255",
"LABEL_22256",
"LABEL_22257",
"LABEL_22258",
"LABEL_22259",
"LABEL_2226",
"LABEL_22260",
"LABEL_22261",
"LABEL_22262",
"LABEL_22263",
"LABEL_22264",
"LABEL_22265",
"LABEL_22266",
"LABEL_22267",
"LABEL_22268",
"LABEL_22269",
"LABEL_2227",
"LABEL_22270",
"LABEL_22271",
"LABEL_22272",
"LABEL_22273",
"LABEL_22274",
"LABEL_22275",
"LABEL_22276",
"LABEL_22277",
"LABEL_22278",
"LABEL_22279",
"LABEL_2228",
"LABEL_22280",
"LABEL_22281",
"LABEL_22282",
"LABEL_22283",
"LABEL_22284",
"LABEL_22285",
"LABEL_22286",
"LABEL_22287",
"LABEL_22288",
"LABEL_22289",
"LABEL_2229",
"LABEL_22290",
"LABEL_22291",
"LABEL_22292",
"LABEL_22293",
"LABEL_22294",
"LABEL_22295",
"LABEL_22296",
"LABEL_22297",
"LABEL_22298",
"LABEL_22299",
"LABEL_223",
"LABEL_2230",
"LABEL_22300",
"LABEL_22301",
"LABEL_22302",
"LABEL_22303",
"LABEL_22304",
"LABEL_22305",
"LABEL_22306",
"LABEL_22307",
"LABEL_22308",
"LABEL_22309",
"LABEL_2231",
"LABEL_22310",
"LABEL_22311",
"LABEL_22312",
"LABEL_22313",
"LABEL_22314",
"LABEL_22315",
"LABEL_22316",
"LABEL_22317",
"LABEL_22318",
"LABEL_22319",
"LABEL_2232",
"LABEL_22320",
"LABEL_22321",
"LABEL_22322",
"LABEL_22323",
"LABEL_22324",
"LABEL_22325",
"LABEL_22326",
"LABEL_22327",
"LABEL_22328",
"LABEL_22329",
"LABEL_2233",
"LABEL_22330",
"LABEL_22331",
"LABEL_22332",
"LABEL_22333",
"LABEL_22334",
"LABEL_22335",
"LABEL_22336",
"LABEL_22337",
"LABEL_22338",
"LABEL_22339",
"LABEL_2234",
"LABEL_22340",
"LABEL_22341",
"LABEL_22342",
"LABEL_22343",
"LABEL_22344",
"LABEL_22345",
"LABEL_22346",
"LABEL_22347",
"LABEL_22348",
"LABEL_22349",
"LABEL_2235",
"LABEL_22350",
"LABEL_22351",
"LABEL_22352",
"LABEL_22353",
"LABEL_22354",
"LABEL_22355",
"LABEL_22356",
"LABEL_22357",
"LABEL_22358",
"LABEL_22359",
"LABEL_2236",
"LABEL_22360",
"LABEL_22361",
"LABEL_22362",
"LABEL_22363",
"LABEL_22364",
"LABEL_22365",
"LABEL_22366",
"LABEL_22367",
"LABEL_22368",
"LABEL_22369",
"LABEL_2237",
"LABEL_22370",
"LABEL_22371",
"LABEL_22372",
"LABEL_22373",
"LABEL_22374",
"LABEL_22375",
"LABEL_22376",
"LABEL_22377",
"LABEL_22378",
"LABEL_22379",
"LABEL_2238",
"LABEL_22380",
"LABEL_22381",
"LABEL_22382",
"LABEL_22383",
"LABEL_22384",
"LABEL_22385",
"LABEL_22386",
"LABEL_22387",
"LABEL_22388",
"LABEL_22389",
"LABEL_2239",
"LABEL_22390",
"LABEL_22391",
"LABEL_22392",
"LABEL_22393",
"LABEL_22394",
"LABEL_22395",
"LABEL_22396",
"LABEL_22397",
"LABEL_22398",
"LABEL_22399",
"LABEL_224",
"LABEL_2240",
"LABEL_22400",
"LABEL_22401",
"LABEL_22402",
"LABEL_22403",
"LABEL_22404",
"LABEL_22405",
"LABEL_22406",
"LABEL_22407",
"LABEL_22408",
"LABEL_22409",
"LABEL_2241",
"LABEL_22410",
"LABEL_22411",
"LABEL_22412",
"LABEL_22413",
"LABEL_22414",
"LABEL_22415",
"LABEL_22416",
"LABEL_22417",
"LABEL_22418",
"LABEL_22419",
"LABEL_2242",
"LABEL_22420",
"LABEL_22421",
"LABEL_22422",
"LABEL_22423",
"LABEL_22424",
"LABEL_22425",
"LABEL_22426",
"LABEL_22427",
"LABEL_22428",
"LABEL_22429",
"LABEL_2243",
"LABEL_22430",
"LABEL_22431",
"LABEL_22432",
"LABEL_22433",
"LABEL_22434",
"LABEL_22435",
"LABEL_22436",
"LABEL_22437",
"LABEL_22438",
"LABEL_22439",
"LABEL_2244",
"LABEL_22440",
"LABEL_22441",
"LABEL_22442",
"LABEL_22443",
"LABEL_22444",
"LABEL_22445",
"LABEL_22446",
"LABEL_22447",
"LABEL_22448",
"LABEL_22449",
"LABEL_2245",
"LABEL_22450",
"LABEL_22451",
"LABEL_22452",
"LABEL_22453",
"LABEL_22454",
"LABEL_22455",
"LABEL_22456",
"LABEL_22457",
"LABEL_22458",
"LABEL_22459",
"LABEL_2246",
"LABEL_22460",
"LABEL_22461",
"LABEL_22462",
"LABEL_22463",
"LABEL_22464",
"LABEL_22465",
"LABEL_22466",
"LABEL_22467",
"LABEL_22468",
"LABEL_22469",
"LABEL_2247",
"LABEL_22470",
"LABEL_22471",
"LABEL_22472",
"LABEL_22473",
"LABEL_22474",
"LABEL_22475",
"LABEL_22476",
"LABEL_22477",
"LABEL_22478",
"LABEL_22479",
"LABEL_2248",
"LABEL_22480",
"LABEL_22481",
"LABEL_22482",
"LABEL_22483",
"LABEL_22484",
"LABEL_22485",
"LABEL_22486",
"LABEL_22487",
"LABEL_22488",
"LABEL_22489",
"LABEL_2249",
"LABEL_22490",
"LABEL_22491",
"LABEL_22492",
"LABEL_22493",
"LABEL_22494",
"LABEL_22495",
"LABEL_22496",
"LABEL_22497",
"LABEL_22498",
"LABEL_22499",
"LABEL_225",
"LABEL_2250",
"LABEL_22500",
"LABEL_22501",
"LABEL_22502",
"LABEL_22503",
"LABEL_22504",
"LABEL_22505",
"LABEL_22506",
"LABEL_22507",
"LABEL_22508",
"LABEL_22509",
"LABEL_2251",
"LABEL_22510",
"LABEL_22511",
"LABEL_22512",
"LABEL_22513",
"LABEL_22514",
"LABEL_22515",
"LABEL_22516",
"LABEL_22517",
"LABEL_22518",
"LABEL_22519",
"LABEL_2252",
"LABEL_22520",
"LABEL_22521",
"LABEL_22522",
"LABEL_22523",
"LABEL_22524",
"LABEL_22525",
"LABEL_22526",
"LABEL_22527",
"LABEL_22528",
"LABEL_22529",
"LABEL_2253",
"LABEL_22530",
"LABEL_22531",
"LABEL_22532",
"LABEL_22533",
"LABEL_22534",
"LABEL_22535",
"LABEL_22536",
"LABEL_22537",
"LABEL_22538",
"LABEL_22539",
"LABEL_2254",
"LABEL_22540",
"LABEL_22541",
"LABEL_22542",
"LABEL_22543",
"LABEL_22544",
"LABEL_22545",
"LABEL_22546",
"LABEL_22547",
"LABEL_22548",
"LABEL_22549",
"LABEL_2255",
"LABEL_22550",
"LABEL_22551",
"LABEL_22552",
"LABEL_22553",
"LABEL_22554",
"LABEL_22555",
"LABEL_22556",
"LABEL_22557",
"LABEL_22558",
"LABEL_22559",
"LABEL_2256",
"LABEL_22560",
"LABEL_22561",
"LABEL_22562",
"LABEL_22563",
"LABEL_22564",
"LABEL_22565",
"LABEL_22566",
"LABEL_22567",
"LABEL_22568",
"LABEL_22569",
"LABEL_2257",
"LABEL_22570",
"LABEL_22571",
"LABEL_22572",
"LABEL_22573",
"LABEL_22574",
"LABEL_22575",
"LABEL_22576",
"LABEL_22577",
"LABEL_22578",
"LABEL_22579",
"LABEL_2258",
"LABEL_22580",
"LABEL_22581",
"LABEL_22582",
"LABEL_22583",
"LABEL_22584",
"LABEL_22585",
"LABEL_22586",
"LABEL_22587",
"LABEL_22588",
"LABEL_22589",
"LABEL_2259",
"LABEL_22590",
"LABEL_22591",
"LABEL_22592",
"LABEL_22593",
"LABEL_22594",
"LABEL_22595",
"LABEL_22596",
"LABEL_22597",
"LABEL_22598",
"LABEL_22599",
"LABEL_226",
"LABEL_2260",
"LABEL_22600",
"LABEL_22601",
"LABEL_22602",
"LABEL_22603",
"LABEL_22604",
"LABEL_22605",
"LABEL_22606",
"LABEL_22607",
"LABEL_22608",
"LABEL_22609",
"LABEL_2261",
"LABEL_22610",
"LABEL_22611",
"LABEL_22612",
"LABEL_22613",
"LABEL_22614",
"LABEL_22615",
"LABEL_22616",
"LABEL_22617",
"LABEL_22618",
"LABEL_22619",
"LABEL_2262",
"LABEL_22620",
"LABEL_22621",
"LABEL_22622",
"LABEL_22623",
"LABEL_22624",
"LABEL_22625",
"LABEL_22626",
"LABEL_22627",
"LABEL_22628",
"LABEL_22629",
"LABEL_2263",
"LABEL_22630",
"LABEL_22631",
"LABEL_22632",
"LABEL_22633",
"LABEL_22634",
"LABEL_22635",
"LABEL_22636",
"LABEL_22637",
"LABEL_22638",
"LABEL_22639",
"LABEL_2264",
"LABEL_22640",
"LABEL_22641",
"LABEL_22642",
"LABEL_22643",
"LABEL_22644",
"LABEL_22645",
"LABEL_22646",
"LABEL_22647",
"LABEL_22648",
"LABEL_22649",
"LABEL_2265",
"LABEL_22650",
"LABEL_22651",
"LABEL_22652",
"LABEL_22653",
"LABEL_22654",
"LABEL_22655",
"LABEL_22656",
"LABEL_22657",
"LABEL_22658",
"LABEL_22659",
"LABEL_2266",
"LABEL_22660",
"LABEL_22661",
"LABEL_22662",
"LABEL_22663",
"LABEL_22664",
"LABEL_22665",
"LABEL_22666",
"LABEL_22667",
"LABEL_22668",
"LABEL_22669",
"LABEL_2267",
"LABEL_22670",
"LABEL_22671",
"LABEL_22672",
"LABEL_22673",
"LABEL_22674",
"LABEL_22675",
"LABEL_22676",
"LABEL_22677",
"LABEL_22678",
"LABEL_22679",
"LABEL_2268",
"LABEL_22680",
"LABEL_22681",
"LABEL_22682",
"LABEL_22683",
"LABEL_22684",
"LABEL_22685",
"LABEL_22686",
"LABEL_22687",
"LABEL_22688",
"LABEL_22689",
"LABEL_2269",
"LABEL_22690",
"LABEL_22691",
"LABEL_22692",
"LABEL_22693",
"LABEL_22694",
"LABEL_22695",
"LABEL_22696",
"LABEL_22697",
"LABEL_22698",
"LABEL_22699",
"LABEL_227",
"LABEL_2270",
"LABEL_22700",
"LABEL_22701",
"LABEL_22702",
"LABEL_22703",
"LABEL_22704",
"LABEL_22705",
"LABEL_22706",
"LABEL_22707",
"LABEL_22708",
"LABEL_22709",
"LABEL_2271",
"LABEL_22710",
"LABEL_22711",
"LABEL_22712",
"LABEL_22713",
"LABEL_22714",
"LABEL_22715",
"LABEL_22716",
"LABEL_22717",
"LABEL_22718",
"LABEL_22719",
"LABEL_2272",
"LABEL_22720",
"LABEL_22721",
"LABEL_22722",
"LABEL_22723",
"LABEL_22724",
"LABEL_22725",
"LABEL_22726",
"LABEL_22727",
"LABEL_22728",
"LABEL_22729",
"LABEL_2273",
"LABEL_22730",
"LABEL_22731",
"LABEL_22732",
"LABEL_22733",
"LABEL_22734",
"LABEL_22735",
"LABEL_22736",
"LABEL_22737",
"LABEL_22738",
"LABEL_22739",
"LABEL_2274",
"LABEL_22740",
"LABEL_22741",
"LABEL_22742",
"LABEL_22743",
"LABEL_22744",
"LABEL_22745",
"LABEL_22746",
"LABEL_22747",
"LABEL_22748",
"LABEL_22749",
"LABEL_2275",
"LABEL_22750",
"LABEL_22751",
"LABEL_22752",
"LABEL_22753",
"LABEL_22754",
"LABEL_22755",
"LABEL_22756",
"LABEL_22757",
"LABEL_22758",
"LABEL_22759",
"LABEL_2276",
"LABEL_22760",
"LABEL_22761",
"LABEL_22762",
"LABEL_22763",
"LABEL_22764",
"LABEL_22765",
"LABEL_22766",
"LABEL_22767",
"LABEL_22768",
"LABEL_22769",
"LABEL_2277",
"LABEL_22770",
"LABEL_22771",
"LABEL_22772",
"LABEL_22773",
"LABEL_22774",
"LABEL_22775",
"LABEL_22776",
"LABEL_22777",
"LABEL_22778",
"LABEL_22779",
"LABEL_2278",
"LABEL_22780",
"LABEL_22781",
"LABEL_22782",
"LABEL_22783",
"LABEL_22784",
"LABEL_22785",
"LABEL_22786",
"LABEL_22787",
"LABEL_22788",
"LABEL_22789",
"LABEL_2279",
"LABEL_22790",
"LABEL_22791",
"LABEL_22792",
"LABEL_22793",
"LABEL_22794",
"LABEL_22795",
"LABEL_22796",
"LABEL_22797",
"LABEL_22798",
"LABEL_22799",
"LABEL_228",
"LABEL_2280",
"LABEL_22800",
"LABEL_22801",
"LABEL_22802",
"LABEL_22803",
"LABEL_22804",
"LABEL_22805",
"LABEL_22806",
"LABEL_22807",
"LABEL_22808",
"LABEL_22809",
"LABEL_2281",
"LABEL_22810",
"LABEL_22811",
"LABEL_22812",
"LABEL_22813",
"LABEL_22814",
"LABEL_22815",
"LABEL_22816",
"LABEL_22817",
"LABEL_22818",
"LABEL_22819",
"LABEL_2282",
"LABEL_22820",
"LABEL_22821",
"LABEL_22822",
"LABEL_22823",
"LABEL_22824",
"LABEL_22825",
"LABEL_22826",
"LABEL_22827",
"LABEL_22828",
"LABEL_22829",
"LABEL_2283",
"LABEL_22830",
"LABEL_22831",
"LABEL_22832",
"LABEL_22833",
"LABEL_22834",
"LABEL_22835",
"LABEL_22836",
"LABEL_22837",
"LABEL_22838",
"LABEL_22839",
"LABEL_2284",
"LABEL_22840",
"LABEL_22841",
"LABEL_22842",
"LABEL_22843",
"LABEL_22844",
"LABEL_22845",
"LABEL_22846",
"LABEL_22847",
"LABEL_22848",
"LABEL_22849",
"LABEL_2285",
"LABEL_22850",
"LABEL_22851",
"LABEL_22852",
"LABEL_22853",
"LABEL_22854",
"LABEL_22855",
"LABEL_22856",
"LABEL_22857",
"LABEL_22858",
"LABEL_22859",
"LABEL_2286",
"LABEL_22860",
"LABEL_22861",
"LABEL_22862",
"LABEL_22863",
"LABEL_22864",
"LABEL_22865",
"LABEL_22866",
"LABEL_22867",
"LABEL_22868",
"LABEL_22869",
"LABEL_2287",
"LABEL_22870",
"LABEL_22871",
"LABEL_22872",
"LABEL_22873",
"LABEL_22874",
"LABEL_22875",
"LABEL_22876",
"LABEL_22877",
"LABEL_22878",
"LABEL_22879",
"LABEL_2288",
"LABEL_22880",
"LABEL_22881",
"LABEL_22882",
"LABEL_22883",
"LABEL_22884",
"LABEL_22885",
"LABEL_22886",
"LABEL_22887",
"LABEL_22888",
"LABEL_22889",
"LABEL_2289",
"LABEL_22890",
"LABEL_22891",
"LABEL_22892",
"LABEL_22893",
"LABEL_22894",
"LABEL_22895",
"LABEL_22896",
"LABEL_22897",
"LABEL_22898",
"LABEL_22899",
"LABEL_229",
"LABEL_2290",
"LABEL_22900",
"LABEL_22901",
"LABEL_22902",
"LABEL_22903",
"LABEL_22904",
"LABEL_22905",
"LABEL_22906",
"LABEL_22907",
"LABEL_22908",
"LABEL_22909",
"LABEL_2291",
"LABEL_22910",
"LABEL_22911",
"LABEL_22912",
"LABEL_22913",
"LABEL_22914",
"LABEL_22915",
"LABEL_22916",
"LABEL_22917",
"LABEL_22918",
"LABEL_22919",
"LABEL_2292",
"LABEL_22920",
"LABEL_22921",
"LABEL_22922",
"LABEL_22923",
"LABEL_22924",
"LABEL_22925",
"LABEL_22926",
"LABEL_22927",
"LABEL_22928",
"LABEL_22929",
"LABEL_2293",
"LABEL_22930",
"LABEL_22931",
"LABEL_22932",
"LABEL_22933",
"LABEL_22934",
"LABEL_22935",
"LABEL_22936",
"LABEL_22937",
"LABEL_22938",
"LABEL_22939",
"LABEL_2294",
"LABEL_22940",
"LABEL_22941",
"LABEL_22942",
"LABEL_22943",
"LABEL_22944",
"LABEL_22945",
"LABEL_22946",
"LABEL_22947",
"LABEL_22948",
"LABEL_22949",
"LABEL_2295",
"LABEL_22950",
"LABEL_22951",
"LABEL_22952",
"LABEL_22953",
"LABEL_22954",
"LABEL_22955",
"LABEL_22956",
"LABEL_22957",
"LABEL_22958",
"LABEL_22959",
"LABEL_2296",
"LABEL_22960",
"LABEL_22961",
"LABEL_22962",
"LABEL_22963",
"LABEL_22964",
"LABEL_22965",
"LABEL_22966",
"LABEL_22967",
"LABEL_22968",
"LABEL_22969",
"LABEL_2297",
"LABEL_22970",
"LABEL_22971",
"LABEL_22972",
"LABEL_22973",
"LABEL_22974",
"LABEL_22975",
"LABEL_22976",
"LABEL_22977",
"LABEL_22978",
"LABEL_22979",
"LABEL_2298",
"LABEL_22980",
"LABEL_22981",
"LABEL_22982",
"LABEL_22983",
"LABEL_22984",
"LABEL_22985",
"LABEL_22986",
"LABEL_22987",
"LABEL_22988",
"LABEL_22989",
"LABEL_2299",
"LABEL_22990",
"LABEL_22991",
"LABEL_22992",
"LABEL_22993",
"LABEL_22994",
"LABEL_22995",
"LABEL_22996",
"LABEL_22997",
"LABEL_22998",
"LABEL_22999",
"LABEL_23",
"LABEL_230",
"LABEL_2300",
"LABEL_23000",
"LABEL_23001",
"LABEL_23002",
"LABEL_23003",
"LABEL_23004",
"LABEL_23005",
"LABEL_23006",
"LABEL_23007",
"LABEL_23008",
"LABEL_23009",
"LABEL_2301",
"LABEL_23010",
"LABEL_23011",
"LABEL_23012",
"LABEL_23013",
"LABEL_23014",
"LABEL_23015",
"LABEL_23016",
"LABEL_23017",
"LABEL_23018",
"LABEL_23019",
"LABEL_2302",
"LABEL_23020",
"LABEL_23021",
"LABEL_23022",
"LABEL_23023",
"LABEL_23024",
"LABEL_23025",
"LABEL_23026",
"LABEL_23027",
"LABEL_23028",
"LABEL_23029",
"LABEL_2303",
"LABEL_23030",
"LABEL_23031",
"LABEL_23032",
"LABEL_23033",
"LABEL_23034",
"LABEL_23035",
"LABEL_23036",
"LABEL_23037",
"LABEL_23038",
"LABEL_23039",
"LABEL_2304",
"LABEL_23040",
"LABEL_23041",
"LABEL_23042",
"LABEL_23043",
"LABEL_23044",
"LABEL_23045",
"LABEL_23046",
"LABEL_23047",
"LABEL_23048",
"LABEL_23049",
"LABEL_2305",
"LABEL_23050",
"LABEL_23051",
"LABEL_23052",
"LABEL_23053",
"LABEL_23054",
"LABEL_23055",
"LABEL_23056",
"LABEL_23057",
"LABEL_23058",
"LABEL_23059",
"LABEL_2306",
"LABEL_23060",
"LABEL_23061",
"LABEL_23062",
"LABEL_23063",
"LABEL_23064",
"LABEL_23065",
"LABEL_23066",
"LABEL_23067",
"LABEL_23068",
"LABEL_23069",
"LABEL_2307",
"LABEL_23070",
"LABEL_23071",
"LABEL_23072",
"LABEL_23073",
"LABEL_23074",
"LABEL_23075",
"LABEL_23076",
"LABEL_23077",
"LABEL_23078",
"LABEL_23079",
"LABEL_2308",
"LABEL_23080",
"LABEL_23081",
"LABEL_23082",
"LABEL_23083",
"LABEL_23084",
"LABEL_23085",
"LABEL_23086",
"LABEL_23087",
"LABEL_23088",
"LABEL_23089",
"LABEL_2309",
"LABEL_23090",
"LABEL_23091",
"LABEL_23092",
"LABEL_23093",
"LABEL_23094",
"LABEL_23095",
"LABEL_23096",
"LABEL_23097",
"LABEL_23098",
"LABEL_23099",
"LABEL_231",
"LABEL_2310",
"LABEL_23100",
"LABEL_23101",
"LABEL_23102",
"LABEL_23103",
"LABEL_23104",
"LABEL_23105",
"LABEL_23106",
"LABEL_23107",
"LABEL_23108",
"LABEL_23109",
"LABEL_2311",
"LABEL_23110",
"LABEL_23111",
"LABEL_23112",
"LABEL_23113",
"LABEL_23114",
"LABEL_23115",
"LABEL_23116",
"LABEL_23117",
"LABEL_23118",
"LABEL_23119",
"LABEL_2312",
"LABEL_23120",
"LABEL_23121",
"LABEL_23122",
"LABEL_23123",
"LABEL_23124",
"LABEL_23125",
"LABEL_23126",
"LABEL_23127",
"LABEL_23128",
"LABEL_23129",
"LABEL_2313",
"LABEL_23130",
"LABEL_23131",
"LABEL_23132",
"LABEL_23133",
"LABEL_23134",
"LABEL_23135",
"LABEL_23136",
"LABEL_23137",
"LABEL_23138",
"LABEL_23139",
"LABEL_2314",
"LABEL_23140",
"LABEL_23141",
"LABEL_23142",
"LABEL_23143",
"LABEL_23144",
"LABEL_23145",
"LABEL_23146",
"LABEL_23147",
"LABEL_23148",
"LABEL_23149",
"LABEL_2315",
"LABEL_23150",
"LABEL_23151",
"LABEL_23152",
"LABEL_23153",
"LABEL_23154",
"LABEL_23155",
"LABEL_23156",
"LABEL_23157",
"LABEL_23158",
"LABEL_23159",
"LABEL_2316",
"LABEL_23160",
"LABEL_23161",
"LABEL_23162",
"LABEL_23163",
"LABEL_23164",
"LABEL_23165",
"LABEL_23166",
"LABEL_23167",
"LABEL_23168",
"LABEL_23169",
"LABEL_2317",
"LABEL_23170",
"LABEL_23171",
"LABEL_23172",
"LABEL_23173",
"LABEL_23174",
"LABEL_23175",
"LABEL_23176",
"LABEL_23177",
"LABEL_23178",
"LABEL_23179",
"LABEL_2318",
"LABEL_23180",
"LABEL_23181",
"LABEL_23182",
"LABEL_23183",
"LABEL_23184",
"LABEL_23185",
"LABEL_23186",
"LABEL_23187",
"LABEL_23188",
"LABEL_23189",
"LABEL_2319",
"LABEL_23190",
"LABEL_23191",
"LABEL_23192",
"LABEL_23193",
"LABEL_23194",
"LABEL_23195",
"LABEL_23196",
"LABEL_23197",
"LABEL_23198",
"LABEL_23199",
"LABEL_232",
"LABEL_2320",
"LABEL_23200",
"LABEL_23201",
"LABEL_23202",
"LABEL_23203",
"LABEL_23204",
"LABEL_23205",
"LABEL_23206",
"LABEL_23207",
"LABEL_23208",
"LABEL_23209",
"LABEL_2321",
"LABEL_23210",
"LABEL_23211",
"LABEL_23212",
"LABEL_23213",
"LABEL_23214",
"LABEL_23215",
"LABEL_23216",
"LABEL_23217",
"LABEL_23218",
"LABEL_23219",
"LABEL_2322",
"LABEL_23220",
"LABEL_23221",
"LABEL_23222",
"LABEL_23223",
"LABEL_23224",
"LABEL_23225",
"LABEL_23226",
"LABEL_23227",
"LABEL_23228",
"LABEL_23229",
"LABEL_2323",
"LABEL_23230",
"LABEL_23231",
"LABEL_23232",
"LABEL_23233",
"LABEL_23234",
"LABEL_23235",
"LABEL_23236",
"LABEL_23237",
"LABEL_23238",
"LABEL_23239",
"LABEL_2324",
"LABEL_23240",
"LABEL_23241",
"LABEL_23242",
"LABEL_23243",
"LABEL_23244",
"LABEL_23245",
"LABEL_23246",
"LABEL_23247",
"LABEL_23248",
"LABEL_23249",
"LABEL_2325",
"LABEL_23250",
"LABEL_23251",
"LABEL_23252",
"LABEL_23253",
"LABEL_23254",
"LABEL_23255",
"LABEL_23256",
"LABEL_23257",
"LABEL_23258",
"LABEL_23259",
"LABEL_2326",
"LABEL_23260",
"LABEL_23261",
"LABEL_23262",
"LABEL_23263",
"LABEL_23264",
"LABEL_23265",
"LABEL_23266",
"LABEL_23267",
"LABEL_23268",
"LABEL_23269",
"LABEL_2327",
"LABEL_23270",
"LABEL_23271",
"LABEL_23272",
"LABEL_23273",
"LABEL_23274",
"LABEL_23275",
"LABEL_23276",
"LABEL_23277",
"LABEL_23278",
"LABEL_23279",
"LABEL_2328",
"LABEL_23280",
"LABEL_23281",
"LABEL_23282",
"LABEL_23283",
"LABEL_23284",
"LABEL_23285",
"LABEL_23286",
"LABEL_23287",
"LABEL_23288",
"LABEL_23289",
"LABEL_2329",
"LABEL_23290",
"LABEL_23291",
"LABEL_23292",
"LABEL_23293",
"LABEL_23294",
"LABEL_23295",
"LABEL_23296",
"LABEL_23297",
"LABEL_23298",
"LABEL_23299",
"LABEL_233",
"LABEL_2330",
"LABEL_23300",
"LABEL_23301",
"LABEL_23302",
"LABEL_23303",
"LABEL_23304",
"LABEL_23305",
"LABEL_23306",
"LABEL_23307",
"LABEL_23308",
"LABEL_23309",
"LABEL_2331",
"LABEL_23310",
"LABEL_23311",
"LABEL_23312",
"LABEL_23313",
"LABEL_23314",
"LABEL_23315",
"LABEL_23316",
"LABEL_23317",
"LABEL_23318",
"LABEL_23319",
"LABEL_2332",
"LABEL_23320",
"LABEL_23321",
"LABEL_23322",
"LABEL_23323",
"LABEL_23324",
"LABEL_23325",
"LABEL_23326",
"LABEL_23327",
"LABEL_23328",
"LABEL_23329",
"LABEL_2333",
"LABEL_23330",
"LABEL_23331",
"LABEL_23332",
"LABEL_23333",
"LABEL_23334",
"LABEL_23335",
"LABEL_23336",
"LABEL_23337",
"LABEL_23338",
"LABEL_23339",
"LABEL_2334",
"LABEL_23340",
"LABEL_23341",
"LABEL_23342",
"LABEL_23343",
"LABEL_23344",
"LABEL_23345",
"LABEL_23346",
"LABEL_23347",
"LABEL_23348",
"LABEL_23349",
"LABEL_2335",
"LABEL_23350",
"LABEL_23351",
"LABEL_23352",
"LABEL_23353",
"LABEL_23354",
"LABEL_23355",
"LABEL_23356",
"LABEL_23357",
"LABEL_23358",
"LABEL_23359",
"LABEL_2336",
"LABEL_23360",
"LABEL_23361",
"LABEL_23362",
"LABEL_23363",
"LABEL_23364",
"LABEL_23365",
"LABEL_23366",
"LABEL_23367",
"LABEL_23368",
"LABEL_23369",
"LABEL_2337",
"LABEL_23370",
"LABEL_23371",
"LABEL_23372",
"LABEL_23373",
"LABEL_23374",
"LABEL_23375",
"LABEL_23376",
"LABEL_23377",
"LABEL_23378",
"LABEL_23379",
"LABEL_2338",
"LABEL_23380",
"LABEL_23381",
"LABEL_23382",
"LABEL_23383",
"LABEL_23384",
"LABEL_23385",
"LABEL_23386",
"LABEL_23387",
"LABEL_23388",
"LABEL_23389",
"LABEL_2339",
"LABEL_23390",
"LABEL_23391",
"LABEL_23392",
"LABEL_23393",
"LABEL_23394",
"LABEL_23395",
"LABEL_23396",
"LABEL_23397",
"LABEL_23398",
"LABEL_23399",
"LABEL_234",
"LABEL_2340",
"LABEL_23400",
"LABEL_23401",
"LABEL_23402",
"LABEL_23403",
"LABEL_23404",
"LABEL_23405",
"LABEL_23406",
"LABEL_23407",
"LABEL_23408",
"LABEL_23409",
"LABEL_2341",
"LABEL_23410",
"LABEL_23411",
"LABEL_23412",
"LABEL_23413",
"LABEL_23414",
"LABEL_23415",
"LABEL_23416",
"LABEL_23417",
"LABEL_23418",
"LABEL_23419",
"LABEL_2342",
"LABEL_23420",
"LABEL_23421",
"LABEL_23422",
"LABEL_23423",
"LABEL_23424",
"LABEL_23425",
"LABEL_23426",
"LABEL_23427",
"LABEL_23428",
"LABEL_23429",
"LABEL_2343",
"LABEL_23430",
"LABEL_23431",
"LABEL_23432",
"LABEL_23433",
"LABEL_23434",
"LABEL_23435",
"LABEL_23436",
"LABEL_23437",
"LABEL_23438",
"LABEL_23439",
"LABEL_2344",
"LABEL_23440",
"LABEL_23441",
"LABEL_23442",
"LABEL_23443",
"LABEL_23444",
"LABEL_23445",
"LABEL_23446",
"LABEL_23447",
"LABEL_23448",
"LABEL_23449",
"LABEL_2345",
"LABEL_23450",
"LABEL_23451",
"LABEL_23452",
"LABEL_23453",
"LABEL_23454",
"LABEL_23455",
"LABEL_23456",
"LABEL_23457",
"LABEL_23458",
"LABEL_23459",
"LABEL_2346",
"LABEL_23460",
"LABEL_23461",
"LABEL_23462",
"LABEL_23463",
"LABEL_23464",
"LABEL_23465",
"LABEL_23466",
"LABEL_23467",
"LABEL_23468",
"LABEL_23469",
"LABEL_2347",
"LABEL_23470",
"LABEL_23471",
"LABEL_23472",
"LABEL_23473",
"LABEL_23474",
"LABEL_23475",
"LABEL_23476",
"LABEL_23477",
"LABEL_23478",
"LABEL_23479",
"LABEL_2348",
"LABEL_23480",
"LABEL_23481",
"LABEL_23482",
"LABEL_23483",
"LABEL_23484",
"LABEL_23485",
"LABEL_23486",
"LABEL_23487",
"LABEL_23488",
"LABEL_23489",
"LABEL_2349",
"LABEL_23490",
"LABEL_23491",
"LABEL_23492",
"LABEL_23493",
"LABEL_23494",
"LABEL_23495",
"LABEL_23496",
"LABEL_23497",
"LABEL_23498",
"LABEL_23499",
"LABEL_235",
"LABEL_2350",
"LABEL_23500",
"LABEL_23501",
"LABEL_23502",
"LABEL_23503",
"LABEL_23504",
"LABEL_23505",
"LABEL_23506",
"LABEL_23507",
"LABEL_23508",
"LABEL_23509",
"LABEL_2351",
"LABEL_23510",
"LABEL_23511",
"LABEL_23512",
"LABEL_23513",
"LABEL_23514",
"LABEL_23515",
"LABEL_23516",
"LABEL_23517",
"LABEL_23518",
"LABEL_23519",
"LABEL_2352",
"LABEL_23520",
"LABEL_23521",
"LABEL_23522",
"LABEL_23523",
"LABEL_23524",
"LABEL_23525",
"LABEL_23526",
"LABEL_23527",
"LABEL_23528",
"LABEL_23529",
"LABEL_2353",
"LABEL_23530",
"LABEL_23531",
"LABEL_23532",
"LABEL_23533",
"LABEL_23534",
"LABEL_23535",
"LABEL_23536",
"LABEL_23537",
"LABEL_23538",
"LABEL_23539",
"LABEL_2354",
"LABEL_23540",
"LABEL_23541",
"LABEL_23542",
"LABEL_23543",
"LABEL_23544",
"LABEL_23545",
"LABEL_23546",
"LABEL_23547",
"LABEL_23548",
"LABEL_23549",
"LABEL_2355",
"LABEL_23550",
"LABEL_23551",
"LABEL_23552",
"LABEL_23553",
"LABEL_23554",
"LABEL_23555",
"LABEL_23556",
"LABEL_23557",
"LABEL_23558",
"LABEL_23559",
"LABEL_2356",
"LABEL_23560",
"LABEL_23561",
"LABEL_23562",
"LABEL_23563",
"LABEL_23564",
"LABEL_23565",
"LABEL_23566",
"LABEL_23567",
"LABEL_23568",
"LABEL_23569",
"LABEL_2357",
"LABEL_23570",
"LABEL_23571",
"LABEL_23572",
"LABEL_23573",
"LABEL_23574",
"LABEL_23575",
"LABEL_23576",
"LABEL_23577",
"LABEL_23578",
"LABEL_23579",
"LABEL_2358",
"LABEL_23580",
"LABEL_23581",
"LABEL_23582",
"LABEL_23583",
"LABEL_23584",
"LABEL_23585",
"LABEL_23586",
"LABEL_23587",
"LABEL_23588",
"LABEL_23589",
"LABEL_2359",
"LABEL_23590",
"LABEL_23591",
"LABEL_23592",
"LABEL_23593",
"LABEL_23594",
"LABEL_23595",
"LABEL_23596",
"LABEL_23597",
"LABEL_23598",
"LABEL_23599",
"LABEL_236",
"LABEL_2360",
"LABEL_23600",
"LABEL_23601",
"LABEL_23602",
"LABEL_23603",
"LABEL_23604",
"LABEL_23605",
"LABEL_23606",
"LABEL_23607",
"LABEL_23608",
"LABEL_23609",
"LABEL_2361",
"LABEL_23610",
"LABEL_23611",
"LABEL_23612",
"LABEL_23613",
"LABEL_23614",
"LABEL_23615",
"LABEL_23616",
"LABEL_23617",
"LABEL_23618",
"LABEL_23619",
"LABEL_2362",
"LABEL_23620",
"LABEL_23621",
"LABEL_23622",
"LABEL_23623",
"LABEL_23624",
"LABEL_23625",
"LABEL_23626",
"LABEL_23627",
"LABEL_23628",
"LABEL_23629",
"LABEL_2363",
"LABEL_23630",
"LABEL_23631",
"LABEL_23632",
"LABEL_23633",
"LABEL_23634",
"LABEL_23635",
"LABEL_23636",
"LABEL_23637",
"LABEL_23638",
"LABEL_23639",
"LABEL_2364",
"LABEL_23640",
"LABEL_23641",
"LABEL_23642",
"LABEL_23643",
"LABEL_23644",
"LABEL_23645",
"LABEL_23646",
"LABEL_23647",
"LABEL_23648",
"LABEL_23649",
"LABEL_2365",
"LABEL_23650",
"LABEL_23651",
"LABEL_23652",
"LABEL_23653",
"LABEL_23654",
"LABEL_23655",
"LABEL_23656",
"LABEL_23657",
"LABEL_23658",
"LABEL_23659",
"LABEL_2366",
"LABEL_23660",
"LABEL_23661",
"LABEL_23662",
"LABEL_23663",
"LABEL_23664",
"LABEL_23665",
"LABEL_23666",
"LABEL_23667",
"LABEL_23668",
"LABEL_23669",
"LABEL_2367",
"LABEL_23670",
"LABEL_23671",
"LABEL_23672",
"LABEL_23673",
"LABEL_23674",
"LABEL_23675",
"LABEL_23676",
"LABEL_23677",
"LABEL_23678",
"LABEL_23679",
"LABEL_2368",
"LABEL_23680",
"LABEL_23681",
"LABEL_23682",
"LABEL_23683",
"LABEL_23684",
"LABEL_23685",
"LABEL_23686",
"LABEL_23687",
"LABEL_23688",
"LABEL_23689",
"LABEL_2369",
"LABEL_23690",
"LABEL_23691",
"LABEL_23692",
"LABEL_23693",
"LABEL_23694",
"LABEL_23695",
"LABEL_23696",
"LABEL_23697",
"LABEL_23698",
"LABEL_23699",
"LABEL_237",
"LABEL_2370",
"LABEL_23700",
"LABEL_23701",
"LABEL_23702",
"LABEL_23703",
"LABEL_23704",
"LABEL_23705",
"LABEL_23706",
"LABEL_23707",
"LABEL_23708",
"LABEL_23709",
"LABEL_2371",
"LABEL_23710",
"LABEL_23711",
"LABEL_23712",
"LABEL_23713",
"LABEL_23714",
"LABEL_23715",
"LABEL_23716",
"LABEL_23717",
"LABEL_23718",
"LABEL_23719",
"LABEL_2372",
"LABEL_23720",
"LABEL_23721",
"LABEL_23722",
"LABEL_23723",
"LABEL_23724",
"LABEL_23725",
"LABEL_23726",
"LABEL_23727",
"LABEL_23728",
"LABEL_23729",
"LABEL_2373",
"LABEL_23730",
"LABEL_23731",
"LABEL_23732",
"LABEL_23733",
"LABEL_23734",
"LABEL_23735",
"LABEL_23736",
"LABEL_23737",
"LABEL_23738",
"LABEL_23739",
"LABEL_2374",
"LABEL_23740",
"LABEL_23741",
"LABEL_23742",
"LABEL_23743",
"LABEL_23744",
"LABEL_23745",
"LABEL_23746",
"LABEL_23747",
"LABEL_23748",
"LABEL_23749",
"LABEL_2375",
"LABEL_23750",
"LABEL_23751",
"LABEL_23752",
"LABEL_23753",
"LABEL_23754",
"LABEL_23755",
"LABEL_23756",
"LABEL_23757",
"LABEL_23758",
"LABEL_23759",
"LABEL_2376",
"LABEL_23760",
"LABEL_23761",
"LABEL_23762",
"LABEL_23763",
"LABEL_23764",
"LABEL_23765",
"LABEL_23766",
"LABEL_23767",
"LABEL_23768",
"LABEL_23769",
"LABEL_2377",
"LABEL_23770",
"LABEL_23771",
"LABEL_23772",
"LABEL_23773",
"LABEL_23774",
"LABEL_23775",
"LABEL_23776",
"LABEL_23777",
"LABEL_23778",
"LABEL_23779",
"LABEL_2378",
"LABEL_23780",
"LABEL_23781",
"LABEL_23782",
"LABEL_23783",
"LABEL_23784",
"LABEL_23785",
"LABEL_23786",
"LABEL_23787",
"LABEL_23788",
"LABEL_23789",
"LABEL_2379",
"LABEL_23790",
"LABEL_23791",
"LABEL_23792",
"LABEL_23793",
"LABEL_23794",
"LABEL_23795",
"LABEL_23796",
"LABEL_23797",
"LABEL_23798",
"LABEL_23799",
"LABEL_238",
"LABEL_2380",
"LABEL_23800",
"LABEL_23801",
"LABEL_23802",
"LABEL_23803",
"LABEL_23804",
"LABEL_23805",
"LABEL_23806",
"LABEL_23807",
"LABEL_23808",
"LABEL_23809",
"LABEL_2381",
"LABEL_23810",
"LABEL_23811",
"LABEL_23812",
"LABEL_23813",
"LABEL_23814",
"LABEL_23815",
"LABEL_23816",
"LABEL_23817",
"LABEL_23818",
"LABEL_23819",
"LABEL_2382",
"LABEL_23820",
"LABEL_23821",
"LABEL_23822",
"LABEL_23823",
"LABEL_23824",
"LABEL_23825",
"LABEL_23826",
"LABEL_23827",
"LABEL_23828",
"LABEL_23829",
"LABEL_2383",
"LABEL_23830",
"LABEL_23831",
"LABEL_23832",
"LABEL_23833",
"LABEL_23834",
"LABEL_23835",
"LABEL_23836",
"LABEL_23837",
"LABEL_23838",
"LABEL_23839",
"LABEL_2384",
"LABEL_23840",
"LABEL_23841",
"LABEL_23842",
"LABEL_23843",
"LABEL_23844",
"LABEL_23845",
"LABEL_23846",
"LABEL_23847",
"LABEL_23848",
"LABEL_23849",
"LABEL_2385",
"LABEL_23850",
"LABEL_23851",
"LABEL_23852",
"LABEL_23853",
"LABEL_23854",
"LABEL_23855",
"LABEL_23856",
"LABEL_23857",
"LABEL_23858",
"LABEL_23859",
"LABEL_2386",
"LABEL_23860",
"LABEL_23861",
"LABEL_23862",
"LABEL_23863",
"LABEL_23864",
"LABEL_23865",
"LABEL_23866",
"LABEL_23867",
"LABEL_23868",
"LABEL_23869",
"LABEL_2387",
"LABEL_23870",
"LABEL_23871",
"LABEL_23872",
"LABEL_23873",
"LABEL_23874",
"LABEL_23875",
"LABEL_23876",
"LABEL_23877",
"LABEL_23878",
"LABEL_23879",
"LABEL_2388",
"LABEL_23880",
"LABEL_23881",
"LABEL_23882",
"LABEL_23883",
"LABEL_23884",
"LABEL_23885",
"LABEL_23886",
"LABEL_23887",
"LABEL_23888",
"LABEL_23889",
"LABEL_2389",
"LABEL_23890",
"LABEL_23891",
"LABEL_23892",
"LABEL_23893",
"LABEL_23894",
"LABEL_23895",
"LABEL_23896",
"LABEL_23897",
"LABEL_23898",
"LABEL_23899",
"LABEL_239",
"LABEL_2390",
"LABEL_23900",
"LABEL_23901",
"LABEL_23902",
"LABEL_23903",
"LABEL_23904",
"LABEL_23905",
"LABEL_23906",
"LABEL_23907",
"LABEL_23908",
"LABEL_23909",
"LABEL_2391",
"LABEL_23910",
"LABEL_23911",
"LABEL_23912",
"LABEL_23913",
"LABEL_23914",
"LABEL_23915",
"LABEL_23916",
"LABEL_23917",
"LABEL_23918",
"LABEL_23919",
"LABEL_2392",
"LABEL_23920",
"LABEL_23921",
"LABEL_23922",
"LABEL_23923",
"LABEL_23924",
"LABEL_23925",
"LABEL_23926",
"LABEL_23927",
"LABEL_23928",
"LABEL_23929",
"LABEL_2393",
"LABEL_23930",
"LABEL_23931",
"LABEL_23932",
"LABEL_23933",
"LABEL_23934",
"LABEL_23935",
"LABEL_23936",
"LABEL_23937",
"LABEL_23938",
"LABEL_23939",
"LABEL_2394",
"LABEL_23940",
"LABEL_23941",
"LABEL_23942",
"LABEL_23943",
"LABEL_23944",
"LABEL_23945",
"LABEL_23946",
"LABEL_23947",
"LABEL_23948",
"LABEL_23949",
"LABEL_2395",
"LABEL_23950",
"LABEL_23951",
"LABEL_23952",
"LABEL_23953",
"LABEL_23954",
"LABEL_23955",
"LABEL_23956",
"LABEL_23957",
"LABEL_23958",
"LABEL_23959",
"LABEL_2396",
"LABEL_23960",
"LABEL_23961",
"LABEL_23962",
"LABEL_23963",
"LABEL_23964",
"LABEL_23965",
"LABEL_23966",
"LABEL_23967",
"LABEL_23968",
"LABEL_23969",
"LABEL_2397",
"LABEL_23970",
"LABEL_23971",
"LABEL_23972",
"LABEL_23973",
"LABEL_23974",
"LABEL_23975",
"LABEL_23976",
"LABEL_23977",
"LABEL_23978",
"LABEL_23979",
"LABEL_2398",
"LABEL_23980",
"LABEL_23981",
"LABEL_23982",
"LABEL_23983",
"LABEL_23984",
"LABEL_23985",
"LABEL_23986",
"LABEL_23987",
"LABEL_23988",
"LABEL_23989",
"LABEL_2399",
"LABEL_23990",
"LABEL_23991",
"LABEL_23992",
"LABEL_23993",
"LABEL_23994",
"LABEL_23995",
"LABEL_23996",
"LABEL_23997",
"LABEL_23998",
"LABEL_23999",
"LABEL_24",
"LABEL_240",
"LABEL_2400",
"LABEL_24000",
"LABEL_24001",
"LABEL_24002",
"LABEL_24003",
"LABEL_24004",
"LABEL_24005",
"LABEL_24006",
"LABEL_24007",
"LABEL_24008",
"LABEL_24009",
"LABEL_2401",
"LABEL_24010",
"LABEL_24011",
"LABEL_24012",
"LABEL_24013",
"LABEL_24014",
"LABEL_24015",
"LABEL_24016",
"LABEL_24017",
"LABEL_24018",
"LABEL_24019",
"LABEL_2402",
"LABEL_24020",
"LABEL_24021",
"LABEL_24022",
"LABEL_24023",
"LABEL_24024",
"LABEL_24025",
"LABEL_24026",
"LABEL_24027",
"LABEL_24028",
"LABEL_24029",
"LABEL_2403",
"LABEL_24030",
"LABEL_24031",
"LABEL_24032",
"LABEL_24033",
"LABEL_24034",
"LABEL_24035",
"LABEL_24036",
"LABEL_24037",
"LABEL_24038",
"LABEL_24039",
"LABEL_2404",
"LABEL_24040",
"LABEL_24041",
"LABEL_24042",
"LABEL_24043",
"LABEL_24044",
"LABEL_24045",
"LABEL_24046",
"LABEL_24047",
"LABEL_24048",
"LABEL_24049",
"LABEL_2405",
"LABEL_24050",
"LABEL_24051",
"LABEL_24052",
"LABEL_24053",
"LABEL_24054",
"LABEL_24055",
"LABEL_24056",
"LABEL_24057",
"LABEL_24058",
"LABEL_24059",
"LABEL_2406",
"LABEL_24060",
"LABEL_24061",
"LABEL_24062",
"LABEL_24063",
"LABEL_24064",
"LABEL_24065",
"LABEL_24066",
"LABEL_24067",
"LABEL_24068",
"LABEL_24069",
"LABEL_2407",
"LABEL_24070",
"LABEL_24071",
"LABEL_24072",
"LABEL_24073",
"LABEL_24074",
"LABEL_24075",
"LABEL_24076",
"LABEL_24077",
"LABEL_24078",
"LABEL_24079",
"LABEL_2408",
"LABEL_24080",
"LABEL_24081",
"LABEL_24082",
"LABEL_24083",
"LABEL_24084",
"LABEL_24085",
"LABEL_24086",
"LABEL_24087",
"LABEL_24088",
"LABEL_24089",
"LABEL_2409",
"LABEL_24090",
"LABEL_24091",
"LABEL_24092",
"LABEL_24093",
"LABEL_24094",
"LABEL_24095",
"LABEL_24096",
"LABEL_24097",
"LABEL_24098",
"LABEL_24099",
"LABEL_241",
"LABEL_2410",
"LABEL_24100",
"LABEL_24101",
"LABEL_24102",
"LABEL_24103",
"LABEL_24104",
"LABEL_24105",
"LABEL_24106",
"LABEL_24107",
"LABEL_24108",
"LABEL_24109",
"LABEL_2411",
"LABEL_24110",
"LABEL_24111",
"LABEL_24112",
"LABEL_24113",
"LABEL_24114",
"LABEL_24115",
"LABEL_24116",
"LABEL_24117",
"LABEL_24118",
"LABEL_24119",
"LABEL_2412",
"LABEL_24120",
"LABEL_24121",
"LABEL_24122",
"LABEL_24123",
"LABEL_24124",
"LABEL_24125",
"LABEL_24126",
"LABEL_24127",
"LABEL_24128",
"LABEL_24129",
"LABEL_2413",
"LABEL_24130",
"LABEL_24131",
"LABEL_24132",
"LABEL_24133",
"LABEL_24134",
"LABEL_24135",
"LABEL_24136",
"LABEL_24137",
"LABEL_24138",
"LABEL_24139",
"LABEL_2414",
"LABEL_24140",
"LABEL_24141",
"LABEL_24142",
"LABEL_24143",
"LABEL_24144",
"LABEL_24145",
"LABEL_24146",
"LABEL_24147",
"LABEL_24148",
"LABEL_24149",
"LABEL_2415",
"LABEL_24150",
"LABEL_24151",
"LABEL_24152",
"LABEL_24153",
"LABEL_24154",
"LABEL_24155",
"LABEL_24156",
"LABEL_24157",
"LABEL_24158",
"LABEL_24159",
"LABEL_2416",
"LABEL_24160",
"LABEL_24161",
"LABEL_24162",
"LABEL_24163",
"LABEL_24164",
"LABEL_24165",
"LABEL_24166",
"LABEL_24167",
"LABEL_24168",
"LABEL_24169",
"LABEL_2417",
"LABEL_24170",
"LABEL_24171",
"LABEL_24172",
"LABEL_24173",
"LABEL_24174",
"LABEL_24175",
"LABEL_24176",
"LABEL_24177",
"LABEL_24178",
"LABEL_24179",
"LABEL_2418",
"LABEL_24180",
"LABEL_24181",
"LABEL_24182",
"LABEL_24183",
"LABEL_24184",
"LABEL_24185",
"LABEL_24186",
"LABEL_24187",
"LABEL_24188",
"LABEL_24189",
"LABEL_2419",
"LABEL_24190",
"LABEL_24191",
"LABEL_24192",
"LABEL_24193",
"LABEL_24194",
"LABEL_24195",
"LABEL_24196",
"LABEL_24197",
"LABEL_24198",
"LABEL_24199",
"LABEL_242",
"LABEL_2420",
"LABEL_24200",
"LABEL_24201",
"LABEL_24202",
"LABEL_24203",
"LABEL_24204",
"LABEL_24205",
"LABEL_24206",
"LABEL_24207",
"LABEL_24208",
"LABEL_24209",
"LABEL_2421",
"LABEL_24210",
"LABEL_24211",
"LABEL_24212",
"LABEL_24213",
"LABEL_24214",
"LABEL_24215",
"LABEL_24216",
"LABEL_24217",
"LABEL_24218",
"LABEL_24219",
"LABEL_2422",
"LABEL_24220",
"LABEL_24221",
"LABEL_24222",
"LABEL_24223",
"LABEL_24224",
"LABEL_24225",
"LABEL_24226",
"LABEL_24227",
"LABEL_24228",
"LABEL_24229",
"LABEL_2423",
"LABEL_24230",
"LABEL_24231",
"LABEL_24232",
"LABEL_24233",
"LABEL_24234",
"LABEL_24235",
"LABEL_24236",
"LABEL_24237",
"LABEL_24238",
"LABEL_24239",
"LABEL_2424",
"LABEL_24240",
"LABEL_24241",
"LABEL_24242",
"LABEL_24243",
"LABEL_24244",
"LABEL_24245",
"LABEL_24246",
"LABEL_24247",
"LABEL_24248",
"LABEL_24249",
"LABEL_2425",
"LABEL_24250",
"LABEL_24251",
"LABEL_24252",
"LABEL_24253",
"LABEL_24254",
"LABEL_24255",
"LABEL_24256",
"LABEL_24257",
"LABEL_24258",
"LABEL_24259",
"LABEL_2426",
"LABEL_24260",
"LABEL_24261",
"LABEL_24262",
"LABEL_24263",
"LABEL_24264",
"LABEL_24265",
"LABEL_24266",
"LABEL_24267",
"LABEL_24268",
"LABEL_24269",
"LABEL_2427",
"LABEL_24270",
"LABEL_24271",
"LABEL_24272",
"LABEL_24273",
"LABEL_24274",
"LABEL_24275",
"LABEL_24276",
"LABEL_24277",
"LABEL_24278",
"LABEL_24279",
"LABEL_2428",
"LABEL_24280",
"LABEL_24281",
"LABEL_24282",
"LABEL_24283",
"LABEL_24284",
"LABEL_24285",
"LABEL_24286",
"LABEL_24287",
"LABEL_24288",
"LABEL_24289",
"LABEL_2429",
"LABEL_24290",
"LABEL_24291",
"LABEL_24292",
"LABEL_24293",
"LABEL_24294",
"LABEL_24295",
"LABEL_24296",
"LABEL_24297",
"LABEL_24298",
"LABEL_24299",
"LABEL_243",
"LABEL_2430",
"LABEL_24300",
"LABEL_24301",
"LABEL_24302",
"LABEL_24303",
"LABEL_24304",
"LABEL_24305",
"LABEL_24306",
"LABEL_24307",
"LABEL_24308",
"LABEL_24309",
"LABEL_2431",
"LABEL_24310",
"LABEL_24311",
"LABEL_24312",
"LABEL_24313",
"LABEL_24314",
"LABEL_24315",
"LABEL_24316",
"LABEL_24317",
"LABEL_24318",
"LABEL_24319",
"LABEL_2432",
"LABEL_24320",
"LABEL_24321",
"LABEL_24322",
"LABEL_24323",
"LABEL_24324",
"LABEL_24325",
"LABEL_24326",
"LABEL_24327",
"LABEL_24328",
"LABEL_24329",
"LABEL_2433",
"LABEL_24330",
"LABEL_24331",
"LABEL_24332",
"LABEL_24333",
"LABEL_24334",
"LABEL_24335",
"LABEL_24336",
"LABEL_24337",
"LABEL_24338",
"LABEL_24339",
"LABEL_2434",
"LABEL_24340",
"LABEL_24341",
"LABEL_24342",
"LABEL_24343",
"LABEL_24344",
"LABEL_24345",
"LABEL_24346",
"LABEL_24347",
"LABEL_24348",
"LABEL_24349",
"LABEL_2435",
"LABEL_24350",
"LABEL_24351",
"LABEL_24352",
"LABEL_24353",
"LABEL_24354",
"LABEL_24355",
"LABEL_24356",
"LABEL_24357",
"LABEL_24358",
"LABEL_24359",
"LABEL_2436",
"LABEL_24360",
"LABEL_24361",
"LABEL_24362",
"LABEL_24363",
"LABEL_24364",
"LABEL_24365",
"LABEL_24366",
"LABEL_24367",
"LABEL_24368",
"LABEL_24369",
"LABEL_2437",
"LABEL_24370",
"LABEL_24371",
"LABEL_24372",
"LABEL_24373",
"LABEL_24374",
"LABEL_24375",
"LABEL_24376",
"LABEL_24377",
"LABEL_24378",
"LABEL_24379",
"LABEL_2438",
"LABEL_24380",
"LABEL_24381",
"LABEL_24382",
"LABEL_24383",
"LABEL_24384",
"LABEL_24385",
"LABEL_24386",
"LABEL_24387",
"LABEL_24388",
"LABEL_24389",
"LABEL_2439",
"LABEL_24390",
"LABEL_24391",
"LABEL_24392",
"LABEL_24393",
"LABEL_24394",
"LABEL_24395",
"LABEL_24396",
"LABEL_24397",
"LABEL_24398",
"LABEL_24399",
"LABEL_244",
"LABEL_2440",
"LABEL_24400",
"LABEL_24401",
"LABEL_24402",
"LABEL_24403",
"LABEL_24404",
"LABEL_24405",
"LABEL_24406",
"LABEL_24407",
"LABEL_24408",
"LABEL_24409",
"LABEL_2441",
"LABEL_24410",
"LABEL_24411",
"LABEL_24412",
"LABEL_24413",
"LABEL_24414",
"LABEL_24415",
"LABEL_24416",
"LABEL_24417",
"LABEL_24418",
"LABEL_24419",
"LABEL_2442",
"LABEL_24420",
"LABEL_24421",
"LABEL_24422",
"LABEL_24423",
"LABEL_24424",
"LABEL_24425",
"LABEL_24426",
"LABEL_24427",
"LABEL_24428",
"LABEL_24429",
"LABEL_2443",
"LABEL_24430",
"LABEL_24431",
"LABEL_24432",
"LABEL_24433",
"LABEL_24434",
"LABEL_24435",
"LABEL_24436",
"LABEL_24437",
"LABEL_24438",
"LABEL_24439",
"LABEL_2444",
"LABEL_24440",
"LABEL_24441",
"LABEL_24442",
"LABEL_24443",
"LABEL_24444",
"LABEL_24445",
"LABEL_24446",
"LABEL_24447",
"LABEL_24448",
"LABEL_24449",
"LABEL_2445",
"LABEL_24450",
"LABEL_24451",
"LABEL_24452",
"LABEL_24453",
"LABEL_24454",
"LABEL_24455",
"LABEL_24456",
"LABEL_24457",
"LABEL_24458",
"LABEL_24459",
"LABEL_2446",
"LABEL_24460",
"LABEL_24461",
"LABEL_24462",
"LABEL_24463",
"LABEL_24464",
"LABEL_24465",
"LABEL_24466",
"LABEL_24467",
"LABEL_24468",
"LABEL_24469",
"LABEL_2447",
"LABEL_24470",
"LABEL_24471",
"LABEL_24472",
"LABEL_24473",
"LABEL_24474",
"LABEL_24475",
"LABEL_24476",
"LABEL_24477",
"LABEL_24478",
"LABEL_24479",
"LABEL_2448",
"LABEL_24480",
"LABEL_24481",
"LABEL_24482",
"LABEL_24483",
"LABEL_24484",
"LABEL_24485",
"LABEL_24486",
"LABEL_24487",
"LABEL_24488",
"LABEL_24489",
"LABEL_2449",
"LABEL_24490",
"LABEL_24491",
"LABEL_24492",
"LABEL_24493",
"LABEL_24494",
"LABEL_24495",
"LABEL_24496",
"LABEL_24497",
"LABEL_24498",
"LABEL_24499",
"LABEL_245",
"LABEL_2450",
"LABEL_24500",
"LABEL_24501",
"LABEL_24502",
"LABEL_24503",
"LABEL_24504",
"LABEL_24505",
"LABEL_24506",
"LABEL_24507",
"LABEL_24508",
"LABEL_24509",
"LABEL_2451",
"LABEL_24510",
"LABEL_24511",
"LABEL_24512",
"LABEL_24513",
"LABEL_24514",
"LABEL_24515",
"LABEL_24516",
"LABEL_24517",
"LABEL_24518",
"LABEL_24519",
"LABEL_2452",
"LABEL_24520",
"LABEL_24521",
"LABEL_24522",
"LABEL_24523",
"LABEL_24524",
"LABEL_24525",
"LABEL_24526",
"LABEL_24527",
"LABEL_24528",
"LABEL_24529",
"LABEL_2453",
"LABEL_24530",
"LABEL_24531",
"LABEL_24532",
"LABEL_24533",
"LABEL_24534",
"LABEL_24535",
"LABEL_24536",
"LABEL_24537",
"LABEL_24538",
"LABEL_24539",
"LABEL_2454",
"LABEL_24540",
"LABEL_24541",
"LABEL_24542",
"LABEL_24543",
"LABEL_24544",
"LABEL_24545",
"LABEL_24546",
"LABEL_24547",
"LABEL_24548",
"LABEL_24549",
"LABEL_2455",
"LABEL_24550",
"LABEL_24551",
"LABEL_24552",
"LABEL_24553",
"LABEL_24554",
"LABEL_24555",
"LABEL_24556",
"LABEL_24557",
"LABEL_24558",
"LABEL_24559",
"LABEL_2456",
"LABEL_24560",
"LABEL_24561",
"LABEL_24562",
"LABEL_24563",
"LABEL_24564",
"LABEL_24565",
"LABEL_24566",
"LABEL_24567",
"LABEL_24568",
"LABEL_24569",
"LABEL_2457",
"LABEL_24570",
"LABEL_24571",
"LABEL_24572",
"LABEL_24573",
"LABEL_24574",
"LABEL_24575",
"LABEL_24576",
"LABEL_24577",
"LABEL_24578",
"LABEL_24579",
"LABEL_2458",
"LABEL_24580",
"LABEL_24581",
"LABEL_24582",
"LABEL_24583",
"LABEL_24584",
"LABEL_24585",
"LABEL_24586",
"LABEL_24587",
"LABEL_24588",
"LABEL_24589",
"LABEL_2459",
"LABEL_24590",
"LABEL_24591",
"LABEL_24592",
"LABEL_24593",
"LABEL_24594",
"LABEL_24595",
"LABEL_24596",
"LABEL_24597",
"LABEL_24598",
"LABEL_24599",
"LABEL_246",
"LABEL_2460",
"LABEL_24600",
"LABEL_24601",
"LABEL_24602",
"LABEL_24603",
"LABEL_24604",
"LABEL_24605",
"LABEL_24606",
"LABEL_24607",
"LABEL_24608",
"LABEL_24609",
"LABEL_2461",
"LABEL_24610",
"LABEL_24611",
"LABEL_24612",
"LABEL_24613",
"LABEL_24614",
"LABEL_24615",
"LABEL_24616",
"LABEL_24617",
"LABEL_24618",
"LABEL_24619",
"LABEL_2462",
"LABEL_24620",
"LABEL_24621",
"LABEL_24622",
"LABEL_24623",
"LABEL_24624",
"LABEL_24625",
"LABEL_24626",
"LABEL_24627",
"LABEL_24628",
"LABEL_24629",
"LABEL_2463",
"LABEL_24630",
"LABEL_24631",
"LABEL_24632",
"LABEL_24633",
"LABEL_24634",
"LABEL_24635",
"LABEL_24636",
"LABEL_24637",
"LABEL_24638",
"LABEL_24639",
"LABEL_2464",
"LABEL_24640",
"LABEL_24641",
"LABEL_24642",
"LABEL_24643",
"LABEL_24644",
"LABEL_24645",
"LABEL_24646",
"LABEL_24647",
"LABEL_24648",
"LABEL_24649",
"LABEL_2465",
"LABEL_24650",
"LABEL_24651",
"LABEL_24652",
"LABEL_24653",
"LABEL_24654",
"LABEL_24655",
"LABEL_24656",
"LABEL_24657",
"LABEL_24658",
"LABEL_24659",
"LABEL_2466",
"LABEL_24660",
"LABEL_24661",
"LABEL_24662",
"LABEL_24663",
"LABEL_24664",
"LABEL_24665",
"LABEL_24666",
"LABEL_24667",
"LABEL_24668",
"LABEL_24669",
"LABEL_2467",
"LABEL_24670",
"LABEL_24671",
"LABEL_24672",
"LABEL_24673",
"LABEL_24674",
"LABEL_24675",
"LABEL_24676",
"LABEL_24677",
"LABEL_24678",
"LABEL_24679",
"LABEL_2468",
"LABEL_24680",
"LABEL_24681",
"LABEL_24682",
"LABEL_24683",
"LABEL_24684",
"LABEL_24685",
"LABEL_24686",
"LABEL_24687",
"LABEL_24688",
"LABEL_24689",
"LABEL_2469",
"LABEL_24690",
"LABEL_24691",
"LABEL_24692",
"LABEL_24693",
"LABEL_24694",
"LABEL_24695",
"LABEL_24696",
"LABEL_24697",
"LABEL_24698",
"LABEL_24699",
"LABEL_247",
"LABEL_2470",
"LABEL_24700",
"LABEL_24701",
"LABEL_24702",
"LABEL_24703",
"LABEL_24704",
"LABEL_24705",
"LABEL_24706",
"LABEL_24707",
"LABEL_24708",
"LABEL_24709",
"LABEL_2471",
"LABEL_24710",
"LABEL_24711",
"LABEL_24712",
"LABEL_24713",
"LABEL_24714",
"LABEL_24715",
"LABEL_24716",
"LABEL_24717",
"LABEL_24718",
"LABEL_24719",
"LABEL_2472",
"LABEL_24720",
"LABEL_24721",
"LABEL_24722",
"LABEL_24723",
"LABEL_24724",
"LABEL_24725",
"LABEL_24726",
"LABEL_24727",
"LABEL_24728",
"LABEL_24729",
"LABEL_2473",
"LABEL_24730",
"LABEL_24731",
"LABEL_24732",
"LABEL_24733",
"LABEL_24734",
"LABEL_24735",
"LABEL_24736",
"LABEL_24737",
"LABEL_24738",
"LABEL_24739",
"LABEL_2474",
"LABEL_24740",
"LABEL_24741",
"LABEL_24742",
"LABEL_24743",
"LABEL_24744",
"LABEL_24745",
"LABEL_24746",
"LABEL_24747",
"LABEL_24748",
"LABEL_24749",
"LABEL_2475",
"LABEL_24750",
"LABEL_24751",
"LABEL_24752",
"LABEL_24753",
"LABEL_24754",
"LABEL_24755",
"LABEL_24756",
"LABEL_24757",
"LABEL_24758",
"LABEL_24759",
"LABEL_2476",
"LABEL_24760",
"LABEL_24761",
"LABEL_24762",
"LABEL_24763",
"LABEL_24764",
"LABEL_24765",
"LABEL_24766",
"LABEL_24767",
"LABEL_24768",
"LABEL_24769",
"LABEL_2477",
"LABEL_24770",
"LABEL_24771",
"LABEL_24772",
"LABEL_24773",
"LABEL_24774",
"LABEL_24775",
"LABEL_24776",
"LABEL_24777",
"LABEL_24778",
"LABEL_24779",
"LABEL_2478",
"LABEL_24780",
"LABEL_24781",
"LABEL_24782",
"LABEL_24783",
"LABEL_24784",
"LABEL_24785",
"LABEL_24786",
"LABEL_24787",
"LABEL_24788",
"LABEL_24789",
"LABEL_2479",
"LABEL_24790",
"LABEL_24791",
"LABEL_24792",
"LABEL_24793",
"LABEL_24794",
"LABEL_24795",
"LABEL_24796",
"LABEL_24797",
"LABEL_24798",
"LABEL_24799",
"LABEL_248",
"LABEL_2480",
"LABEL_24800",
"LABEL_24801",
"LABEL_24802",
"LABEL_24803",
"LABEL_24804",
"LABEL_24805",
"LABEL_24806",
"LABEL_24807",
"LABEL_24808",
"LABEL_24809",
"LABEL_2481",
"LABEL_24810",
"LABEL_24811",
"LABEL_24812",
"LABEL_24813",
"LABEL_24814",
"LABEL_24815",
"LABEL_24816",
"LABEL_24817",
"LABEL_24818",
"LABEL_24819",
"LABEL_2482",
"LABEL_24820",
"LABEL_24821",
"LABEL_24822",
"LABEL_24823",
"LABEL_24824",
"LABEL_24825",
"LABEL_24826",
"LABEL_24827",
"LABEL_24828",
"LABEL_24829",
"LABEL_2483",
"LABEL_24830",
"LABEL_24831",
"LABEL_24832",
"LABEL_24833",
"LABEL_24834",
"LABEL_24835",
"LABEL_24836",
"LABEL_24837",
"LABEL_24838",
"LABEL_24839",
"LABEL_2484",
"LABEL_24840",
"LABEL_24841",
"LABEL_24842",
"LABEL_24843",
"LABEL_24844",
"LABEL_24845",
"LABEL_24846",
"LABEL_24847",
"LABEL_24848",
"LABEL_24849",
"LABEL_2485",
"LABEL_24850",
"LABEL_24851",
"LABEL_24852",
"LABEL_24853",
"LABEL_24854",
"LABEL_24855",
"LABEL_24856",
"LABEL_24857",
"LABEL_24858",
"LABEL_24859",
"LABEL_2486",
"LABEL_24860",
"LABEL_24861",
"LABEL_24862",
"LABEL_24863",
"LABEL_24864",
"LABEL_24865",
"LABEL_24866",
"LABEL_24867",
"LABEL_24868",
"LABEL_24869",
"LABEL_2487",
"LABEL_24870",
"LABEL_24871",
"LABEL_24872",
"LABEL_24873",
"LABEL_24874",
"LABEL_24875",
"LABEL_24876",
"LABEL_24877",
"LABEL_24878",
"LABEL_24879",
"LABEL_2488",
"LABEL_24880",
"LABEL_24881",
"LABEL_24882",
"LABEL_24883",
"LABEL_24884",
"LABEL_24885",
"LABEL_24886",
"LABEL_24887",
"LABEL_24888",
"LABEL_24889",
"LABEL_2489",
"LABEL_24890",
"LABEL_24891",
"LABEL_24892",
"LABEL_24893",
"LABEL_24894",
"LABEL_24895",
"LABEL_24896",
"LABEL_24897",
"LABEL_24898",
"LABEL_24899",
"LABEL_249",
"LABEL_2490",
"LABEL_24900",
"LABEL_24901",
"LABEL_24902",
"LABEL_24903",
"LABEL_24904",
"LABEL_24905",
"LABEL_24906",
"LABEL_24907",
"LABEL_24908",
"LABEL_24909",
"LABEL_2491",
"LABEL_24910",
"LABEL_24911",
"LABEL_24912",
"LABEL_24913",
"LABEL_24914",
"LABEL_24915",
"LABEL_24916",
"LABEL_24917",
"LABEL_24918",
"LABEL_24919",
"LABEL_2492",
"LABEL_24920",
"LABEL_24921",
"LABEL_24922",
"LABEL_24923",
"LABEL_24924",
"LABEL_24925",
"LABEL_24926",
"LABEL_24927",
"LABEL_24928",
"LABEL_24929",
"LABEL_2493",
"LABEL_24930",
"LABEL_24931",
"LABEL_24932",
"LABEL_24933",
"LABEL_24934",
"LABEL_24935",
"LABEL_24936",
"LABEL_24937",
"LABEL_24938",
"LABEL_24939",
"LABEL_2494",
"LABEL_24940",
"LABEL_24941",
"LABEL_24942",
"LABEL_24943",
"LABEL_24944",
"LABEL_24945",
"LABEL_24946",
"LABEL_24947",
"LABEL_24948",
"LABEL_24949",
"LABEL_2495",
"LABEL_24950",
"LABEL_24951",
"LABEL_24952",
"LABEL_24953",
"LABEL_24954",
"LABEL_24955",
"LABEL_24956",
"LABEL_24957",
"LABEL_24958",
"LABEL_24959",
"LABEL_2496",
"LABEL_24960",
"LABEL_24961",
"LABEL_24962",
"LABEL_24963",
"LABEL_24964",
"LABEL_24965",
"LABEL_24966",
"LABEL_24967",
"LABEL_24968",
"LABEL_24969",
"LABEL_2497",
"LABEL_24970",
"LABEL_24971",
"LABEL_24972",
"LABEL_24973",
"LABEL_24974",
"LABEL_24975",
"LABEL_24976",
"LABEL_24977",
"LABEL_24978",
"LABEL_24979",
"LABEL_2498",
"LABEL_24980",
"LABEL_24981",
"LABEL_24982",
"LABEL_24983",
"LABEL_24984",
"LABEL_24985",
"LABEL_24986",
"LABEL_24987",
"LABEL_24988",
"LABEL_24989",
"LABEL_2499",
"LABEL_24990",
"LABEL_24991",
"LABEL_24992",
"LABEL_24993",
"LABEL_24994",
"LABEL_24995",
"LABEL_24996",
"LABEL_24997",
"LABEL_24998",
"LABEL_24999",
"LABEL_25",
"LABEL_250",
"LABEL_2500",
"LABEL_25000",
"LABEL_25001",
"LABEL_25002",
"LABEL_25003",
"LABEL_25004",
"LABEL_25005",
"LABEL_25006",
"LABEL_25007",
"LABEL_25008",
"LABEL_25009",
"LABEL_2501",
"LABEL_25010",
"LABEL_25011",
"LABEL_25012",
"LABEL_25013",
"LABEL_25014",
"LABEL_25015",
"LABEL_25016",
"LABEL_25017",
"LABEL_25018",
"LABEL_25019",
"LABEL_2502",
"LABEL_25020",
"LABEL_25021",
"LABEL_25022",
"LABEL_25023",
"LABEL_25024",
"LABEL_25025",
"LABEL_25026",
"LABEL_25027",
"LABEL_25028",
"LABEL_25029",
"LABEL_2503",
"LABEL_25030",
"LABEL_25031",
"LABEL_25032",
"LABEL_25033",
"LABEL_25034",
"LABEL_25035",
"LABEL_25036",
"LABEL_25037",
"LABEL_25038",
"LABEL_25039",
"LABEL_2504",
"LABEL_25040",
"LABEL_25041",
"LABEL_25042",
"LABEL_25043",
"LABEL_25044",
"LABEL_25045",
"LABEL_25046",
"LABEL_25047",
"LABEL_25048",
"LABEL_25049",
"LABEL_2505",
"LABEL_25050",
"LABEL_25051",
"LABEL_25052",
"LABEL_25053",
"LABEL_25054",
"LABEL_25055",
"LABEL_25056",
"LABEL_25057",
"LABEL_25058",
"LABEL_25059",
"LABEL_2506",
"LABEL_25060",
"LABEL_25061",
"LABEL_25062",
"LABEL_25063",
"LABEL_25064",
"LABEL_25065",
"LABEL_25066",
"LABEL_25067",
"LABEL_25068",
"LABEL_25069",
"LABEL_2507",
"LABEL_25070",
"LABEL_25071",
"LABEL_25072",
"LABEL_25073",
"LABEL_25074",
"LABEL_25075",
"LABEL_25076",
"LABEL_25077",
"LABEL_25078",
"LABEL_25079",
"LABEL_2508",
"LABEL_25080",
"LABEL_25081",
"LABEL_25082",
"LABEL_25083",
"LABEL_25084",
"LABEL_25085",
"LABEL_25086",
"LABEL_25087",
"LABEL_25088",
"LABEL_25089",
"LABEL_2509",
"LABEL_25090",
"LABEL_25091",
"LABEL_25092",
"LABEL_25093",
"LABEL_25094",
"LABEL_25095",
"LABEL_25096",
"LABEL_25097",
"LABEL_25098",
"LABEL_25099",
"LABEL_251",
"LABEL_2510",
"LABEL_25100",
"LABEL_25101",
"LABEL_25102",
"LABEL_25103",
"LABEL_25104",
"LABEL_25105",
"LABEL_25106",
"LABEL_25107",
"LABEL_25108",
"LABEL_25109",
"LABEL_2511",
"LABEL_25110",
"LABEL_25111",
"LABEL_25112",
"LABEL_25113",
"LABEL_25114",
"LABEL_25115",
"LABEL_25116",
"LABEL_25117",
"LABEL_25118",
"LABEL_25119",
"LABEL_2512",
"LABEL_25120",
"LABEL_25121",
"LABEL_25122",
"LABEL_25123",
"LABEL_25124",
"LABEL_25125",
"LABEL_25126",
"LABEL_25127",
"LABEL_25128",
"LABEL_25129",
"LABEL_2513",
"LABEL_25130",
"LABEL_25131",
"LABEL_25132",
"LABEL_25133",
"LABEL_25134",
"LABEL_25135",
"LABEL_25136",
"LABEL_25137",
"LABEL_25138",
"LABEL_25139",
"LABEL_2514",
"LABEL_25140",
"LABEL_25141",
"LABEL_25142",
"LABEL_25143",
"LABEL_25144",
"LABEL_25145",
"LABEL_25146",
"LABEL_25147",
"LABEL_25148",
"LABEL_25149",
"LABEL_2515",
"LABEL_25150",
"LABEL_25151",
"LABEL_25152",
"LABEL_25153",
"LABEL_25154",
"LABEL_25155",
"LABEL_25156",
"LABEL_25157",
"LABEL_25158",
"LABEL_25159",
"LABEL_2516",
"LABEL_25160",
"LABEL_25161",
"LABEL_25162",
"LABEL_25163",
"LABEL_25164",
"LABEL_25165",
"LABEL_25166",
"LABEL_25167",
"LABEL_25168",
"LABEL_25169",
"LABEL_2517",
"LABEL_25170",
"LABEL_25171",
"LABEL_25172",
"LABEL_25173",
"LABEL_25174",
"LABEL_25175",
"LABEL_25176",
"LABEL_25177",
"LABEL_25178",
"LABEL_25179",
"LABEL_2518",
"LABEL_25180",
"LABEL_25181",
"LABEL_25182",
"LABEL_25183",
"LABEL_25184",
"LABEL_25185",
"LABEL_25186",
"LABEL_25187",
"LABEL_25188",
"LABEL_25189",
"LABEL_2519",
"LABEL_25190",
"LABEL_25191",
"LABEL_25192",
"LABEL_25193",
"LABEL_25194",
"LABEL_25195",
"LABEL_25196",
"LABEL_25197",
"LABEL_25198",
"LABEL_25199",
"LABEL_252",
"LABEL_2520",
"LABEL_25200",
"LABEL_25201",
"LABEL_25202",
"LABEL_25203",
"LABEL_25204",
"LABEL_25205",
"LABEL_25206",
"LABEL_25207",
"LABEL_25208",
"LABEL_25209",
"LABEL_2521",
"LABEL_25210",
"LABEL_25211",
"LABEL_25212",
"LABEL_25213",
"LABEL_25214",
"LABEL_25215",
"LABEL_25216",
"LABEL_25217",
"LABEL_25218",
"LABEL_25219",
"LABEL_2522",
"LABEL_25220",
"LABEL_25221",
"LABEL_25222",
"LABEL_25223",
"LABEL_25224",
"LABEL_25225",
"LABEL_25226",
"LABEL_25227",
"LABEL_25228",
"LABEL_25229",
"LABEL_2523",
"LABEL_25230",
"LABEL_25231",
"LABEL_25232",
"LABEL_25233",
"LABEL_25234",
"LABEL_25235",
"LABEL_25236",
"LABEL_25237",
"LABEL_25238",
"LABEL_25239",
"LABEL_2524",
"LABEL_25240",
"LABEL_25241",
"LABEL_25242",
"LABEL_25243",
"LABEL_25244",
"LABEL_25245",
"LABEL_25246",
"LABEL_25247",
"LABEL_25248",
"LABEL_25249",
"LABEL_2525",
"LABEL_25250",
"LABEL_25251",
"LABEL_25252",
"LABEL_25253",
"LABEL_25254",
"LABEL_25255",
"LABEL_25256",
"LABEL_25257",
"LABEL_25258",
"LABEL_25259",
"LABEL_2526",
"LABEL_25260",
"LABEL_25261",
"LABEL_25262",
"LABEL_25263",
"LABEL_25264",
"LABEL_25265",
"LABEL_25266",
"LABEL_25267",
"LABEL_25268",
"LABEL_25269",
"LABEL_2527",
"LABEL_25270",
"LABEL_25271",
"LABEL_25272",
"LABEL_25273",
"LABEL_25274",
"LABEL_25275",
"LABEL_25276",
"LABEL_25277",
"LABEL_25278",
"LABEL_25279",
"LABEL_2528",
"LABEL_25280",
"LABEL_25281",
"LABEL_25282",
"LABEL_25283",
"LABEL_25284",
"LABEL_25285",
"LABEL_25286",
"LABEL_25287",
"LABEL_25288",
"LABEL_25289",
"LABEL_2529",
"LABEL_25290",
"LABEL_25291",
"LABEL_25292",
"LABEL_25293",
"LABEL_25294",
"LABEL_25295",
"LABEL_25296",
"LABEL_25297",
"LABEL_25298",
"LABEL_25299",
"LABEL_253",
"LABEL_2530",
"LABEL_25300",
"LABEL_25301",
"LABEL_25302",
"LABEL_25303",
"LABEL_25304",
"LABEL_25305",
"LABEL_25306",
"LABEL_25307",
"LABEL_25308",
"LABEL_25309",
"LABEL_2531",
"LABEL_25310",
"LABEL_25311",
"LABEL_25312",
"LABEL_25313",
"LABEL_25314",
"LABEL_25315",
"LABEL_25316",
"LABEL_25317",
"LABEL_25318",
"LABEL_25319",
"LABEL_2532",
"LABEL_25320",
"LABEL_25321",
"LABEL_25322",
"LABEL_25323",
"LABEL_25324",
"LABEL_25325",
"LABEL_25326",
"LABEL_25327",
"LABEL_25328",
"LABEL_25329",
"LABEL_2533",
"LABEL_25330",
"LABEL_25331",
"LABEL_25332",
"LABEL_25333",
"LABEL_25334",
"LABEL_25335",
"LABEL_25336",
"LABEL_25337",
"LABEL_25338",
"LABEL_25339",
"LABEL_2534",
"LABEL_25340",
"LABEL_25341",
"LABEL_25342",
"LABEL_25343",
"LABEL_25344",
"LABEL_25345",
"LABEL_25346",
"LABEL_25347",
"LABEL_25348",
"LABEL_25349",
"LABEL_2535",
"LABEL_25350",
"LABEL_25351",
"LABEL_25352",
"LABEL_25353",
"LABEL_25354",
"LABEL_25355",
"LABEL_25356",
"LABEL_25357",
"LABEL_25358",
"LABEL_25359",
"LABEL_2536",
"LABEL_25360",
"LABEL_25361",
"LABEL_25362",
"LABEL_25363",
"LABEL_25364",
"LABEL_25365",
"LABEL_25366",
"LABEL_25367",
"LABEL_25368",
"LABEL_25369",
"LABEL_2537",
"LABEL_25370",
"LABEL_25371",
"LABEL_25372",
"LABEL_25373",
"LABEL_25374",
"LABEL_25375",
"LABEL_25376",
"LABEL_25377",
"LABEL_25378",
"LABEL_25379",
"LABEL_2538",
"LABEL_25380",
"LABEL_25381",
"LABEL_25382",
"LABEL_25383",
"LABEL_25384",
"LABEL_25385",
"LABEL_25386",
"LABEL_25387",
"LABEL_25388",
"LABEL_25389",
"LABEL_2539",
"LABEL_25390",
"LABEL_25391",
"LABEL_25392",
"LABEL_25393",
"LABEL_25394",
"LABEL_25395",
"LABEL_25396",
"LABEL_25397",
"LABEL_25398",
"LABEL_25399",
"LABEL_254",
"LABEL_2540",
"LABEL_25400",
"LABEL_25401",
"LABEL_25402",
"LABEL_25403",
"LABEL_25404",
"LABEL_25405",
"LABEL_25406",
"LABEL_25407",
"LABEL_25408",
"LABEL_25409",
"LABEL_2541",
"LABEL_25410",
"LABEL_25411",
"LABEL_25412",
"LABEL_25413",
"LABEL_25414",
"LABEL_25415",
"LABEL_25416",
"LABEL_25417",
"LABEL_25418",
"LABEL_25419",
"LABEL_2542",
"LABEL_25420",
"LABEL_25421",
"LABEL_25422",
"LABEL_25423",
"LABEL_25424",
"LABEL_25425",
"LABEL_25426",
"LABEL_25427",
"LABEL_25428",
"LABEL_25429",
"LABEL_2543",
"LABEL_25430",
"LABEL_25431",
"LABEL_25432",
"LABEL_25433",
"LABEL_25434",
"LABEL_25435",
"LABEL_25436",
"LABEL_25437",
"LABEL_25438",
"LABEL_25439",
"LABEL_2544",
"LABEL_25440",
"LABEL_25441",
"LABEL_25442",
"LABEL_25443",
"LABEL_25444",
"LABEL_25445",
"LABEL_25446",
"LABEL_25447",
"LABEL_25448",
"LABEL_25449",
"LABEL_2545",
"LABEL_25450",
"LABEL_25451",
"LABEL_25452",
"LABEL_25453",
"LABEL_25454",
"LABEL_25455",
"LABEL_25456",
"LABEL_25457",
"LABEL_25458",
"LABEL_25459",
"LABEL_2546",
"LABEL_25460",
"LABEL_25461",
"LABEL_25462",
"LABEL_25463",
"LABEL_25464",
"LABEL_25465",
"LABEL_25466",
"LABEL_25467",
"LABEL_25468",
"LABEL_25469",
"LABEL_2547",
"LABEL_25470",
"LABEL_25471",
"LABEL_25472",
"LABEL_25473",
"LABEL_25474",
"LABEL_25475",
"LABEL_25476",
"LABEL_25477",
"LABEL_25478",
"LABEL_25479",
"LABEL_2548",
"LABEL_25480",
"LABEL_25481",
"LABEL_25482",
"LABEL_25483",
"LABEL_25484",
"LABEL_25485",
"LABEL_25486",
"LABEL_25487",
"LABEL_25488",
"LABEL_25489",
"LABEL_2549",
"LABEL_25490",
"LABEL_25491",
"LABEL_25492",
"LABEL_25493",
"LABEL_25494",
"LABEL_25495",
"LABEL_25496",
"LABEL_25497",
"LABEL_25498",
"LABEL_25499",
"LABEL_255",
"LABEL_2550",
"LABEL_25500",
"LABEL_25501",
"LABEL_25502",
"LABEL_25503",
"LABEL_25504",
"LABEL_25505",
"LABEL_25506",
"LABEL_25507",
"LABEL_25508",
"LABEL_25509",
"LABEL_2551",
"LABEL_25510",
"LABEL_25511",
"LABEL_25512",
"LABEL_25513",
"LABEL_25514",
"LABEL_25515",
"LABEL_25516",
"LABEL_25517",
"LABEL_25518",
"LABEL_25519",
"LABEL_2552",
"LABEL_25520",
"LABEL_25521",
"LABEL_25522",
"LABEL_25523",
"LABEL_25524",
"LABEL_25525",
"LABEL_25526",
"LABEL_25527",
"LABEL_25528",
"LABEL_25529",
"LABEL_2553",
"LABEL_25530",
"LABEL_25531",
"LABEL_25532",
"LABEL_25533",
"LABEL_25534",
"LABEL_25535",
"LABEL_25536",
"LABEL_25537",
"LABEL_25538",
"LABEL_25539",
"LABEL_2554",
"LABEL_25540",
"LABEL_25541",
"LABEL_25542",
"LABEL_25543",
"LABEL_25544",
"LABEL_25545",
"LABEL_25546",
"LABEL_25547",
"LABEL_25548",
"LABEL_25549",
"LABEL_2555",
"LABEL_25550",
"LABEL_25551",
"LABEL_25552",
"LABEL_25553",
"LABEL_25554",
"LABEL_25555",
"LABEL_25556",
"LABEL_25557",
"LABEL_25558",
"LABEL_25559",
"LABEL_2556",
"LABEL_25560",
"LABEL_25561",
"LABEL_25562",
"LABEL_25563",
"LABEL_25564",
"LABEL_25565",
"LABEL_25566",
"LABEL_25567",
"LABEL_25568",
"LABEL_25569",
"LABEL_2557",
"LABEL_25570",
"LABEL_25571",
"LABEL_25572",
"LABEL_25573",
"LABEL_25574",
"LABEL_25575",
"LABEL_25576",
"LABEL_25577",
"LABEL_25578",
"LABEL_25579",
"LABEL_2558",
"LABEL_25580",
"LABEL_25581",
"LABEL_25582",
"LABEL_25583",
"LABEL_25584",
"LABEL_25585",
"LABEL_25586",
"LABEL_25587",
"LABEL_25588",
"LABEL_25589",
"LABEL_2559",
"LABEL_25590",
"LABEL_25591",
"LABEL_25592",
"LABEL_25593",
"LABEL_25594",
"LABEL_25595",
"LABEL_25596",
"LABEL_25597",
"LABEL_25598",
"LABEL_25599",
"LABEL_256",
"LABEL_2560",
"LABEL_25600",
"LABEL_25601",
"LABEL_25602",
"LABEL_25603",
"LABEL_25604",
"LABEL_25605",
"LABEL_25606",
"LABEL_25607",
"LABEL_25608",
"LABEL_25609",
"LABEL_2561",
"LABEL_25610",
"LABEL_25611",
"LABEL_25612",
"LABEL_25613",
"LABEL_25614",
"LABEL_25615",
"LABEL_25616",
"LABEL_25617",
"LABEL_25618",
"LABEL_25619",
"LABEL_2562",
"LABEL_25620",
"LABEL_25621",
"LABEL_25622",
"LABEL_25623",
"LABEL_25624",
"LABEL_25625",
"LABEL_25626",
"LABEL_25627",
"LABEL_25628",
"LABEL_25629",
"LABEL_2563",
"LABEL_25630",
"LABEL_25631",
"LABEL_25632",
"LABEL_25633",
"LABEL_25634",
"LABEL_25635",
"LABEL_25636",
"LABEL_25637",
"LABEL_25638",
"LABEL_25639",
"LABEL_2564",
"LABEL_25640",
"LABEL_25641",
"LABEL_25642",
"LABEL_25643",
"LABEL_25644",
"LABEL_25645",
"LABEL_25646",
"LABEL_25647",
"LABEL_25648",
"LABEL_25649",
"LABEL_2565",
"LABEL_25650",
"LABEL_25651",
"LABEL_25652",
"LABEL_25653",
"LABEL_25654",
"LABEL_25655",
"LABEL_25656",
"LABEL_25657",
"LABEL_25658",
"LABEL_25659",
"LABEL_2566",
"LABEL_25660",
"LABEL_25661",
"LABEL_25662",
"LABEL_25663",
"LABEL_25664",
"LABEL_25665",
"LABEL_25666",
"LABEL_25667",
"LABEL_25668",
"LABEL_25669",
"LABEL_2567",
"LABEL_25670",
"LABEL_25671",
"LABEL_25672",
"LABEL_25673",
"LABEL_25674",
"LABEL_25675",
"LABEL_25676",
"LABEL_25677",
"LABEL_25678",
"LABEL_25679",
"LABEL_2568",
"LABEL_25680",
"LABEL_25681",
"LABEL_25682",
"LABEL_25683",
"LABEL_25684",
"LABEL_25685",
"LABEL_25686",
"LABEL_25687",
"LABEL_25688",
"LABEL_25689",
"LABEL_2569",
"LABEL_25690",
"LABEL_25691",
"LABEL_25692",
"LABEL_25693",
"LABEL_25694",
"LABEL_25695",
"LABEL_25696",
"LABEL_25697",
"LABEL_25698",
"LABEL_25699",
"LABEL_257",
"LABEL_2570",
"LABEL_25700",
"LABEL_25701",
"LABEL_25702",
"LABEL_25703",
"LABEL_25704",
"LABEL_25705",
"LABEL_25706",
"LABEL_25707",
"LABEL_25708",
"LABEL_25709",
"LABEL_2571",
"LABEL_25710",
"LABEL_25711",
"LABEL_25712",
"LABEL_25713",
"LABEL_25714",
"LABEL_25715",
"LABEL_25716",
"LABEL_25717",
"LABEL_25718",
"LABEL_25719",
"LABEL_2572",
"LABEL_25720",
"LABEL_25721",
"LABEL_25722",
"LABEL_25723",
"LABEL_25724",
"LABEL_25725",
"LABEL_25726",
"LABEL_25727",
"LABEL_25728",
"LABEL_25729",
"LABEL_2573",
"LABEL_25730",
"LABEL_25731",
"LABEL_25732",
"LABEL_25733",
"LABEL_25734",
"LABEL_25735",
"LABEL_25736",
"LABEL_25737",
"LABEL_25738",
"LABEL_25739",
"LABEL_2574",
"LABEL_25740",
"LABEL_25741",
"LABEL_25742",
"LABEL_25743",
"LABEL_25744",
"LABEL_25745",
"LABEL_25746",
"LABEL_25747",
"LABEL_25748",
"LABEL_25749",
"LABEL_2575",
"LABEL_25750",
"LABEL_25751",
"LABEL_25752",
"LABEL_25753",
"LABEL_25754",
"LABEL_25755",
"LABEL_25756",
"LABEL_25757",
"LABEL_25758",
"LABEL_25759",
"LABEL_2576",
"LABEL_25760",
"LABEL_25761",
"LABEL_25762",
"LABEL_25763",
"LABEL_25764",
"LABEL_25765",
"LABEL_25766",
"LABEL_25767",
"LABEL_25768",
"LABEL_25769",
"LABEL_2577",
"LABEL_25770",
"LABEL_25771",
"LABEL_25772",
"LABEL_25773",
"LABEL_25774",
"LABEL_25775",
"LABEL_25776",
"LABEL_25777",
"LABEL_25778",
"LABEL_25779",
"LABEL_2578",
"LABEL_25780",
"LABEL_25781",
"LABEL_25782",
"LABEL_25783",
"LABEL_25784",
"LABEL_25785",
"LABEL_25786",
"LABEL_25787",
"LABEL_25788",
"LABEL_25789",
"LABEL_2579",
"LABEL_25790",
"LABEL_25791",
"LABEL_25792",
"LABEL_25793",
"LABEL_25794",
"LABEL_25795",
"LABEL_25796",
"LABEL_25797",
"LABEL_25798",
"LABEL_25799",
"LABEL_258",
"LABEL_2580",
"LABEL_25800",
"LABEL_25801",
"LABEL_25802",
"LABEL_25803",
"LABEL_25804",
"LABEL_25805",
"LABEL_25806",
"LABEL_25807",
"LABEL_25808",
"LABEL_25809",
"LABEL_2581",
"LABEL_25810",
"LABEL_25811",
"LABEL_25812",
"LABEL_25813",
"LABEL_25814",
"LABEL_25815",
"LABEL_25816",
"LABEL_25817",
"LABEL_25818",
"LABEL_25819",
"LABEL_2582",
"LABEL_25820",
"LABEL_25821",
"LABEL_25822",
"LABEL_25823",
"LABEL_25824",
"LABEL_25825",
"LABEL_25826",
"LABEL_25827",
"LABEL_25828",
"LABEL_25829",
"LABEL_2583",
"LABEL_25830",
"LABEL_25831",
"LABEL_25832",
"LABEL_25833",
"LABEL_25834",
"LABEL_25835",
"LABEL_25836",
"LABEL_25837",
"LABEL_25838",
"LABEL_25839",
"LABEL_2584",
"LABEL_25840",
"LABEL_25841",
"LABEL_25842",
"LABEL_25843",
"LABEL_25844",
"LABEL_25845",
"LABEL_25846",
"LABEL_25847",
"LABEL_25848",
"LABEL_25849",
"LABEL_2585",
"LABEL_25850",
"LABEL_25851",
"LABEL_25852",
"LABEL_25853",
"LABEL_25854",
"LABEL_25855",
"LABEL_25856",
"LABEL_25857",
"LABEL_25858",
"LABEL_25859",
"LABEL_2586",
"LABEL_25860",
"LABEL_25861",
"LABEL_25862",
"LABEL_25863",
"LABEL_25864",
"LABEL_25865",
"LABEL_25866",
"LABEL_25867",
"LABEL_25868",
"LABEL_25869",
"LABEL_2587",
"LABEL_25870",
"LABEL_25871",
"LABEL_25872",
"LABEL_25873",
"LABEL_25874",
"LABEL_25875",
"LABEL_25876",
"LABEL_25877",
"LABEL_25878",
"LABEL_25879",
"LABEL_2588",
"LABEL_25880",
"LABEL_25881",
"LABEL_25882",
"LABEL_25883",
"LABEL_25884",
"LABEL_25885",
"LABEL_25886",
"LABEL_25887",
"LABEL_25888",
"LABEL_25889",
"LABEL_2589",
"LABEL_25890",
"LABEL_25891",
"LABEL_25892",
"LABEL_25893",
"LABEL_25894",
"LABEL_25895",
"LABEL_25896",
"LABEL_25897",
"LABEL_25898",
"LABEL_25899",
"LABEL_259",
"LABEL_2590",
"LABEL_25900",
"LABEL_25901",
"LABEL_25902",
"LABEL_25903",
"LABEL_25904",
"LABEL_25905",
"LABEL_25906",
"LABEL_25907",
"LABEL_25908",
"LABEL_25909",
"LABEL_2591",
"LABEL_25910",
"LABEL_25911",
"LABEL_25912",
"LABEL_25913",
"LABEL_25914",
"LABEL_25915",
"LABEL_25916",
"LABEL_25917",
"LABEL_25918",
"LABEL_25919",
"LABEL_2592",
"LABEL_25920",
"LABEL_25921",
"LABEL_25922",
"LABEL_25923",
"LABEL_25924",
"LABEL_25925",
"LABEL_25926",
"LABEL_25927",
"LABEL_25928",
"LABEL_25929",
"LABEL_2593",
"LABEL_25930",
"LABEL_25931",
"LABEL_25932",
"LABEL_25933",
"LABEL_25934",
"LABEL_25935",
"LABEL_25936",
"LABEL_25937",
"LABEL_25938",
"LABEL_25939",
"LABEL_2594",
"LABEL_25940",
"LABEL_25941",
"LABEL_25942",
"LABEL_25943",
"LABEL_25944",
"LABEL_25945",
"LABEL_25946",
"LABEL_25947",
"LABEL_25948",
"LABEL_25949",
"LABEL_2595",
"LABEL_25950",
"LABEL_25951",
"LABEL_25952",
"LABEL_25953",
"LABEL_25954",
"LABEL_25955",
"LABEL_25956",
"LABEL_25957",
"LABEL_25958",
"LABEL_25959",
"LABEL_2596",
"LABEL_25960",
"LABEL_25961",
"LABEL_25962",
"LABEL_25963",
"LABEL_25964",
"LABEL_25965",
"LABEL_25966",
"LABEL_25967",
"LABEL_25968",
"LABEL_25969",
"LABEL_2597",
"LABEL_25970",
"LABEL_25971",
"LABEL_25972",
"LABEL_25973",
"LABEL_25974",
"LABEL_25975",
"LABEL_25976",
"LABEL_25977",
"LABEL_25978",
"LABEL_25979",
"LABEL_2598",
"LABEL_25980",
"LABEL_25981",
"LABEL_25982",
"LABEL_25983",
"LABEL_25984",
"LABEL_25985",
"LABEL_25986",
"LABEL_25987",
"LABEL_25988",
"LABEL_25989",
"LABEL_2599",
"LABEL_25990",
"LABEL_25991",
"LABEL_25992",
"LABEL_25993",
"LABEL_25994",
"LABEL_25995",
"LABEL_25996",
"LABEL_25997",
"LABEL_25998",
"LABEL_25999",
"LABEL_26",
"LABEL_260",
"LABEL_2600",
"LABEL_26000",
"LABEL_26001",
"LABEL_26002",
"LABEL_26003",
"LABEL_26004",
"LABEL_26005",
"LABEL_26006",
"LABEL_26007",
"LABEL_26008",
"LABEL_26009",
"LABEL_2601",
"LABEL_26010",
"LABEL_26011",
"LABEL_26012",
"LABEL_26013",
"LABEL_26014",
"LABEL_26015",
"LABEL_26016",
"LABEL_26017",
"LABEL_26018",
"LABEL_26019",
"LABEL_2602",
"LABEL_26020",
"LABEL_26021",
"LABEL_26022",
"LABEL_26023",
"LABEL_26024",
"LABEL_26025",
"LABEL_26026",
"LABEL_26027",
"LABEL_26028",
"LABEL_26029",
"LABEL_2603",
"LABEL_26030",
"LABEL_26031",
"LABEL_26032",
"LABEL_26033",
"LABEL_26034",
"LABEL_26035",
"LABEL_26036",
"LABEL_26037",
"LABEL_26038",
"LABEL_26039",
"LABEL_2604",
"LABEL_26040",
"LABEL_26041",
"LABEL_26042",
"LABEL_26043",
"LABEL_26044",
"LABEL_26045",
"LABEL_26046",
"LABEL_26047",
"LABEL_26048",
"LABEL_26049",
"LABEL_2605",
"LABEL_26050",
"LABEL_26051",
"LABEL_26052",
"LABEL_26053",
"LABEL_26054",
"LABEL_26055",
"LABEL_26056",
"LABEL_26057",
"LABEL_26058",
"LABEL_26059",
"LABEL_2606",
"LABEL_26060",
"LABEL_26061",
"LABEL_26062",
"LABEL_26063",
"LABEL_26064",
"LABEL_26065",
"LABEL_26066",
"LABEL_26067",
"LABEL_26068",
"LABEL_26069",
"LABEL_2607",
"LABEL_26070",
"LABEL_26071",
"LABEL_26072",
"LABEL_26073",
"LABEL_26074",
"LABEL_26075",
"LABEL_26076",
"LABEL_26077",
"LABEL_26078",
"LABEL_26079",
"LABEL_2608",
"LABEL_26080",
"LABEL_26081",
"LABEL_26082",
"LABEL_26083",
"LABEL_26084",
"LABEL_26085",
"LABEL_26086",
"LABEL_26087",
"LABEL_26088",
"LABEL_26089",
"LABEL_2609",
"LABEL_26090",
"LABEL_26091",
"LABEL_26092",
"LABEL_26093",
"LABEL_26094",
"LABEL_26095",
"LABEL_26096",
"LABEL_26097",
"LABEL_26098",
"LABEL_26099",
"LABEL_261",
"LABEL_2610",
"LABEL_26100",
"LABEL_26101",
"LABEL_26102",
"LABEL_26103",
"LABEL_26104",
"LABEL_26105",
"LABEL_26106",
"LABEL_26107",
"LABEL_26108",
"LABEL_26109",
"LABEL_2611",
"LABEL_26110",
"LABEL_26111",
"LABEL_26112",
"LABEL_26113",
"LABEL_26114",
"LABEL_26115",
"LABEL_26116",
"LABEL_26117",
"LABEL_26118",
"LABEL_26119",
"LABEL_2612",
"LABEL_26120",
"LABEL_26121",
"LABEL_26122",
"LABEL_26123",
"LABEL_26124",
"LABEL_26125",
"LABEL_26126",
"LABEL_26127",
"LABEL_26128",
"LABEL_26129",
"LABEL_2613",
"LABEL_26130",
"LABEL_26131",
"LABEL_26132",
"LABEL_26133",
"LABEL_26134",
"LABEL_26135",
"LABEL_26136",
"LABEL_26137",
"LABEL_26138",
"LABEL_26139",
"LABEL_2614",
"LABEL_26140",
"LABEL_26141",
"LABEL_26142",
"LABEL_26143",
"LABEL_26144",
"LABEL_26145",
"LABEL_26146",
"LABEL_26147",
"LABEL_26148",
"LABEL_26149",
"LABEL_2615",
"LABEL_26150",
"LABEL_26151",
"LABEL_26152",
"LABEL_26153",
"LABEL_26154",
"LABEL_26155",
"LABEL_26156",
"LABEL_26157",
"LABEL_26158",
"LABEL_26159",
"LABEL_2616",
"LABEL_26160",
"LABEL_26161",
"LABEL_26162",
"LABEL_26163",
"LABEL_26164",
"LABEL_26165",
"LABEL_26166",
"LABEL_26167",
"LABEL_26168",
"LABEL_26169",
"LABEL_2617",
"LABEL_26170",
"LABEL_26171",
"LABEL_26172",
"LABEL_26173",
"LABEL_26174",
"LABEL_26175",
"LABEL_26176",
"LABEL_26177",
"LABEL_26178",
"LABEL_26179",
"LABEL_2618",
"LABEL_26180",
"LABEL_26181",
"LABEL_26182",
"LABEL_26183",
"LABEL_26184",
"LABEL_26185",
"LABEL_26186",
"LABEL_26187",
"LABEL_26188",
"LABEL_26189",
"LABEL_2619",
"LABEL_26190",
"LABEL_26191",
"LABEL_26192",
"LABEL_26193",
"LABEL_26194",
"LABEL_26195",
"LABEL_26196",
"LABEL_26197",
"LABEL_26198",
"LABEL_26199",
"LABEL_262",
"LABEL_2620",
"LABEL_26200",
"LABEL_26201",
"LABEL_26202",
"LABEL_26203",
"LABEL_26204",
"LABEL_26205",
"LABEL_26206",
"LABEL_26207",
"LABEL_26208",
"LABEL_26209",
"LABEL_2621",
"LABEL_26210",
"LABEL_26211",
"LABEL_26212",
"LABEL_26213",
"LABEL_26214",
"LABEL_26215",
"LABEL_26216",
"LABEL_26217",
"LABEL_26218",
"LABEL_26219",
"LABEL_2622",
"LABEL_26220",
"LABEL_26221",
"LABEL_26222",
"LABEL_26223",
"LABEL_26224",
"LABEL_26225",
"LABEL_26226",
"LABEL_26227",
"LABEL_26228",
"LABEL_26229",
"LABEL_2623",
"LABEL_26230",
"LABEL_26231",
"LABEL_26232",
"LABEL_26233",
"LABEL_26234",
"LABEL_26235",
"LABEL_26236",
"LABEL_26237",
"LABEL_26238",
"LABEL_26239",
"LABEL_2624",
"LABEL_26240",
"LABEL_26241",
"LABEL_26242",
"LABEL_26243",
"LABEL_26244",
"LABEL_26245",
"LABEL_26246",
"LABEL_26247",
"LABEL_26248",
"LABEL_26249",
"LABEL_2625",
"LABEL_26250",
"LABEL_26251",
"LABEL_26252",
"LABEL_26253",
"LABEL_26254",
"LABEL_26255",
"LABEL_26256",
"LABEL_26257",
"LABEL_26258",
"LABEL_26259",
"LABEL_2626",
"LABEL_26260",
"LABEL_26261",
"LABEL_26262",
"LABEL_26263",
"LABEL_26264",
"LABEL_26265",
"LABEL_26266",
"LABEL_26267",
"LABEL_26268",
"LABEL_26269",
"LABEL_2627",
"LABEL_26270",
"LABEL_26271",
"LABEL_26272",
"LABEL_26273",
"LABEL_26274",
"LABEL_26275",
"LABEL_26276",
"LABEL_26277",
"LABEL_26278",
"LABEL_26279",
"LABEL_2628",
"LABEL_26280",
"LABEL_26281",
"LABEL_26282",
"LABEL_26283",
"LABEL_26284",
"LABEL_26285",
"LABEL_26286",
"LABEL_26287",
"LABEL_26288",
"LABEL_26289",
"LABEL_2629",
"LABEL_26290",
"LABEL_26291",
"LABEL_26292",
"LABEL_26293",
"LABEL_26294",
"LABEL_26295",
"LABEL_26296",
"LABEL_26297",
"LABEL_26298",
"LABEL_26299",
"LABEL_263",
"LABEL_2630",
"LABEL_26300",
"LABEL_26301",
"LABEL_26302",
"LABEL_26303",
"LABEL_26304",
"LABEL_26305",
"LABEL_26306",
"LABEL_26307",
"LABEL_26308",
"LABEL_26309",
"LABEL_2631",
"LABEL_26310",
"LABEL_26311",
"LABEL_26312",
"LABEL_26313",
"LABEL_26314",
"LABEL_26315",
"LABEL_26316",
"LABEL_26317",
"LABEL_26318",
"LABEL_26319",
"LABEL_2632",
"LABEL_26320",
"LABEL_26321",
"LABEL_26322",
"LABEL_26323",
"LABEL_26324",
"LABEL_26325",
"LABEL_26326",
"LABEL_26327",
"LABEL_26328",
"LABEL_26329",
"LABEL_2633",
"LABEL_26330",
"LABEL_26331",
"LABEL_26332",
"LABEL_26333",
"LABEL_26334",
"LABEL_26335",
"LABEL_26336",
"LABEL_26337",
"LABEL_26338",
"LABEL_26339",
"LABEL_2634",
"LABEL_26340",
"LABEL_26341",
"LABEL_26342",
"LABEL_26343",
"LABEL_26344",
"LABEL_26345",
"LABEL_26346",
"LABEL_26347",
"LABEL_26348",
"LABEL_26349",
"LABEL_2635",
"LABEL_26350",
"LABEL_26351",
"LABEL_26352",
"LABEL_26353",
"LABEL_26354",
"LABEL_26355",
"LABEL_26356",
"LABEL_26357",
"LABEL_26358",
"LABEL_26359",
"LABEL_2636",
"LABEL_26360",
"LABEL_26361",
"LABEL_26362",
"LABEL_26363",
"LABEL_26364",
"LABEL_26365",
"LABEL_26366",
"LABEL_26367",
"LABEL_26368",
"LABEL_26369",
"LABEL_2637",
"LABEL_26370",
"LABEL_26371",
"LABEL_26372",
"LABEL_26373",
"LABEL_26374",
"LABEL_26375",
"LABEL_26376",
"LABEL_26377",
"LABEL_26378",
"LABEL_26379",
"LABEL_2638",
"LABEL_26380",
"LABEL_26381",
"LABEL_26382",
"LABEL_26383",
"LABEL_26384",
"LABEL_26385",
"LABEL_26386",
"LABEL_26387",
"LABEL_26388",
"LABEL_26389",
"LABEL_2639",
"LABEL_26390",
"LABEL_26391",
"LABEL_26392",
"LABEL_26393",
"LABEL_26394",
"LABEL_26395",
"LABEL_26396",
"LABEL_26397",
"LABEL_26398",
"LABEL_26399",
"LABEL_264",
"LABEL_2640",
"LABEL_26400",
"LABEL_26401",
"LABEL_26402",
"LABEL_26403",
"LABEL_26404",
"LABEL_26405",
"LABEL_26406",
"LABEL_26407",
"LABEL_26408",
"LABEL_26409",
"LABEL_2641",
"LABEL_26410",
"LABEL_26411",
"LABEL_26412",
"LABEL_26413",
"LABEL_26414",
"LABEL_26415",
"LABEL_26416",
"LABEL_26417",
"LABEL_26418",
"LABEL_26419",
"LABEL_2642",
"LABEL_26420",
"LABEL_26421",
"LABEL_26422",
"LABEL_26423",
"LABEL_26424",
"LABEL_26425",
"LABEL_26426",
"LABEL_26427",
"LABEL_26428",
"LABEL_26429",
"LABEL_2643",
"LABEL_26430",
"LABEL_26431",
"LABEL_26432",
"LABEL_26433",
"LABEL_26434",
"LABEL_26435",
"LABEL_26436",
"LABEL_26437",
"LABEL_26438",
"LABEL_26439",
"LABEL_2644",
"LABEL_26440",
"LABEL_26441",
"LABEL_26442",
"LABEL_26443",
"LABEL_26444",
"LABEL_26445",
"LABEL_26446",
"LABEL_26447",
"LABEL_26448",
"LABEL_26449",
"LABEL_2645",
"LABEL_26450",
"LABEL_26451",
"LABEL_26452",
"LABEL_26453",
"LABEL_26454",
"LABEL_26455",
"LABEL_26456",
"LABEL_26457",
"LABEL_26458",
"LABEL_26459",
"LABEL_2646",
"LABEL_26460",
"LABEL_26461",
"LABEL_26462",
"LABEL_26463",
"LABEL_26464",
"LABEL_26465",
"LABEL_26466",
"LABEL_26467",
"LABEL_26468",
"LABEL_26469",
"LABEL_2647",
"LABEL_26470",
"LABEL_26471",
"LABEL_26472",
"LABEL_26473",
"LABEL_26474",
"LABEL_26475",
"LABEL_26476",
"LABEL_26477",
"LABEL_26478",
"LABEL_26479",
"LABEL_2648",
"LABEL_26480",
"LABEL_26481",
"LABEL_26482",
"LABEL_26483",
"LABEL_26484",
"LABEL_26485",
"LABEL_26486",
"LABEL_26487",
"LABEL_26488",
"LABEL_26489",
"LABEL_2649",
"LABEL_26490",
"LABEL_26491",
"LABEL_26492",
"LABEL_26493",
"LABEL_26494",
"LABEL_26495",
"LABEL_26496",
"LABEL_26497",
"LABEL_26498",
"LABEL_26499",
"LABEL_265",
"LABEL_2650",
"LABEL_26500",
"LABEL_26501",
"LABEL_26502",
"LABEL_26503",
"LABEL_26504",
"LABEL_26505",
"LABEL_26506",
"LABEL_26507",
"LABEL_26508",
"LABEL_26509",
"LABEL_2651",
"LABEL_26510",
"LABEL_26511",
"LABEL_26512",
"LABEL_26513",
"LABEL_26514",
"LABEL_26515",
"LABEL_26516",
"LABEL_26517",
"LABEL_26518",
"LABEL_26519",
"LABEL_2652",
"LABEL_26520",
"LABEL_26521",
"LABEL_26522",
"LABEL_26523",
"LABEL_26524",
"LABEL_26525",
"LABEL_26526",
"LABEL_26527",
"LABEL_26528",
"LABEL_26529",
"LABEL_2653",
"LABEL_26530",
"LABEL_26531",
"LABEL_26532",
"LABEL_26533",
"LABEL_26534",
"LABEL_26535",
"LABEL_26536",
"LABEL_26537",
"LABEL_26538",
"LABEL_26539",
"LABEL_2654",
"LABEL_26540",
"LABEL_26541",
"LABEL_26542",
"LABEL_26543",
"LABEL_26544",
"LABEL_26545",
"LABEL_26546",
"LABEL_26547",
"LABEL_26548",
"LABEL_26549",
"LABEL_2655",
"LABEL_26550",
"LABEL_26551",
"LABEL_26552",
"LABEL_26553",
"LABEL_26554",
"LABEL_26555",
"LABEL_26556",
"LABEL_26557",
"LABEL_26558",
"LABEL_26559",
"LABEL_2656",
"LABEL_26560",
"LABEL_26561",
"LABEL_26562",
"LABEL_26563",
"LABEL_26564",
"LABEL_26565",
"LABEL_26566",
"LABEL_26567",
"LABEL_26568",
"LABEL_26569",
"LABEL_2657",
"LABEL_26570",
"LABEL_26571",
"LABEL_26572",
"LABEL_26573",
"LABEL_26574",
"LABEL_26575",
"LABEL_26576",
"LABEL_26577",
"LABEL_26578",
"LABEL_26579",
"LABEL_2658",
"LABEL_26580",
"LABEL_26581",
"LABEL_26582",
"LABEL_26583",
"LABEL_26584",
"LABEL_26585",
"LABEL_26586",
"LABEL_26587",
"LABEL_26588",
"LABEL_26589",
"LABEL_2659",
"LABEL_26590",
"LABEL_26591",
"LABEL_26592",
"LABEL_26593",
"LABEL_26594",
"LABEL_26595",
"LABEL_26596",
"LABEL_26597",
"LABEL_26598",
"LABEL_26599",
"LABEL_266",
"LABEL_2660",
"LABEL_26600",
"LABEL_26601",
"LABEL_26602",
"LABEL_26603",
"LABEL_26604",
"LABEL_26605",
"LABEL_26606",
"LABEL_26607",
"LABEL_26608",
"LABEL_26609",
"LABEL_2661",
"LABEL_26610",
"LABEL_26611",
"LABEL_26612",
"LABEL_26613",
"LABEL_26614",
"LABEL_26615",
"LABEL_26616",
"LABEL_26617",
"LABEL_26618",
"LABEL_26619",
"LABEL_2662",
"LABEL_26620",
"LABEL_26621",
"LABEL_26622",
"LABEL_26623",
"LABEL_26624",
"LABEL_26625",
"LABEL_26626",
"LABEL_26627",
"LABEL_26628",
"LABEL_26629",
"LABEL_2663",
"LABEL_26630",
"LABEL_26631",
"LABEL_26632",
"LABEL_26633",
"LABEL_26634",
"LABEL_26635",
"LABEL_26636",
"LABEL_26637",
"LABEL_26638",
"LABEL_26639",
"LABEL_2664",
"LABEL_26640",
"LABEL_26641",
"LABEL_26642",
"LABEL_26643",
"LABEL_26644",
"LABEL_26645",
"LABEL_26646",
"LABEL_26647",
"LABEL_26648",
"LABEL_26649",
"LABEL_2665",
"LABEL_26650",
"LABEL_26651",
"LABEL_26652",
"LABEL_26653",
"LABEL_26654",
"LABEL_26655",
"LABEL_26656",
"LABEL_26657",
"LABEL_26658",
"LABEL_26659",
"LABEL_2666",
"LABEL_26660",
"LABEL_26661",
"LABEL_26662",
"LABEL_26663",
"LABEL_26664",
"LABEL_26665",
"LABEL_26666",
"LABEL_26667",
"LABEL_26668",
"LABEL_26669",
"LABEL_2667",
"LABEL_26670",
"LABEL_26671",
"LABEL_26672",
"LABEL_26673",
"LABEL_26674",
"LABEL_26675",
"LABEL_26676",
"LABEL_26677",
"LABEL_26678",
"LABEL_26679",
"LABEL_2668",
"LABEL_26680",
"LABEL_26681",
"LABEL_26682",
"LABEL_26683",
"LABEL_26684",
"LABEL_26685",
"LABEL_26686",
"LABEL_26687",
"LABEL_26688",
"LABEL_26689",
"LABEL_2669",
"LABEL_26690",
"LABEL_26691",
"LABEL_26692",
"LABEL_26693",
"LABEL_26694",
"LABEL_26695",
"LABEL_26696",
"LABEL_26697",
"LABEL_26698",
"LABEL_26699",
"LABEL_267",
"LABEL_2670",
"LABEL_26700",
"LABEL_26701",
"LABEL_26702",
"LABEL_26703",
"LABEL_26704",
"LABEL_26705",
"LABEL_26706",
"LABEL_26707",
"LABEL_26708",
"LABEL_26709",
"LABEL_2671",
"LABEL_26710",
"LABEL_26711",
"LABEL_26712",
"LABEL_26713",
"LABEL_26714",
"LABEL_26715",
"LABEL_26716",
"LABEL_26717",
"LABEL_26718",
"LABEL_26719",
"LABEL_2672",
"LABEL_26720",
"LABEL_26721",
"LABEL_26722",
"LABEL_26723",
"LABEL_26724",
"LABEL_26725",
"LABEL_26726",
"LABEL_26727",
"LABEL_26728",
"LABEL_26729",
"LABEL_2673",
"LABEL_26730",
"LABEL_26731",
"LABEL_26732",
"LABEL_26733",
"LABEL_26734",
"LABEL_26735",
"LABEL_26736",
"LABEL_26737",
"LABEL_26738",
"LABEL_26739",
"LABEL_2674",
"LABEL_26740",
"LABEL_26741",
"LABEL_26742",
"LABEL_26743",
"LABEL_26744",
"LABEL_26745",
"LABEL_26746",
"LABEL_26747",
"LABEL_26748",
"LABEL_26749",
"LABEL_2675",
"LABEL_26750",
"LABEL_26751",
"LABEL_26752",
"LABEL_26753",
"LABEL_26754",
"LABEL_26755",
"LABEL_26756",
"LABEL_26757",
"LABEL_26758",
"LABEL_26759",
"LABEL_2676",
"LABEL_26760",
"LABEL_26761",
"LABEL_26762",
"LABEL_26763",
"LABEL_26764",
"LABEL_26765",
"LABEL_26766",
"LABEL_26767",
"LABEL_26768",
"LABEL_26769",
"LABEL_2677",
"LABEL_26770",
"LABEL_26771",
"LABEL_26772",
"LABEL_26773",
"LABEL_26774",
"LABEL_26775",
"LABEL_26776",
"LABEL_26777",
"LABEL_26778",
"LABEL_26779",
"LABEL_2678",
"LABEL_26780",
"LABEL_26781",
"LABEL_26782",
"LABEL_26783",
"LABEL_26784",
"LABEL_26785",
"LABEL_26786",
"LABEL_26787",
"LABEL_26788",
"LABEL_26789",
"LABEL_2679",
"LABEL_26790",
"LABEL_26791",
"LABEL_26792",
"LABEL_26793",
"LABEL_26794",
"LABEL_26795",
"LABEL_26796",
"LABEL_26797",
"LABEL_26798",
"LABEL_26799",
"LABEL_268",
"LABEL_2680",
"LABEL_26800",
"LABEL_26801",
"LABEL_26802",
"LABEL_26803",
"LABEL_26804",
"LABEL_26805",
"LABEL_26806",
"LABEL_26807",
"LABEL_26808",
"LABEL_26809",
"LABEL_2681",
"LABEL_26810",
"LABEL_26811",
"LABEL_26812",
"LABEL_26813",
"LABEL_26814",
"LABEL_26815",
"LABEL_26816",
"LABEL_26817",
"LABEL_26818",
"LABEL_26819",
"LABEL_2682",
"LABEL_26820",
"LABEL_26821",
"LABEL_26822",
"LABEL_26823",
"LABEL_26824",
"LABEL_26825",
"LABEL_26826",
"LABEL_26827",
"LABEL_26828",
"LABEL_26829",
"LABEL_2683",
"LABEL_26830",
"LABEL_26831",
"LABEL_26832",
"LABEL_26833",
"LABEL_26834",
"LABEL_26835",
"LABEL_26836",
"LABEL_26837",
"LABEL_26838",
"LABEL_26839",
"LABEL_2684",
"LABEL_26840",
"LABEL_26841",
"LABEL_26842",
"LABEL_26843",
"LABEL_26844",
"LABEL_26845",
"LABEL_26846",
"LABEL_26847",
"LABEL_26848",
"LABEL_26849",
"LABEL_2685",
"LABEL_26850",
"LABEL_26851",
"LABEL_26852",
"LABEL_26853",
"LABEL_26854",
"LABEL_26855",
"LABEL_26856",
"LABEL_26857",
"LABEL_26858",
"LABEL_26859",
"LABEL_2686",
"LABEL_26860",
"LABEL_26861",
"LABEL_26862",
"LABEL_26863",
"LABEL_26864",
"LABEL_26865",
"LABEL_26866",
"LABEL_26867",
"LABEL_26868",
"LABEL_26869",
"LABEL_2687",
"LABEL_26870",
"LABEL_26871",
"LABEL_26872",
"LABEL_26873",
"LABEL_26874",
"LABEL_26875",
"LABEL_26876",
"LABEL_26877",
"LABEL_26878",
"LABEL_26879",
"LABEL_2688",
"LABEL_26880",
"LABEL_26881",
"LABEL_26882",
"LABEL_26883",
"LABEL_26884",
"LABEL_26885",
"LABEL_26886",
"LABEL_26887",
"LABEL_26888",
"LABEL_26889",
"LABEL_2689",
"LABEL_26890",
"LABEL_26891",
"LABEL_26892",
"LABEL_26893",
"LABEL_26894",
"LABEL_26895",
"LABEL_26896",
"LABEL_26897",
"LABEL_26898",
"LABEL_26899",
"LABEL_269",
"LABEL_2690",
"LABEL_26900",
"LABEL_26901",
"LABEL_26902",
"LABEL_26903",
"LABEL_26904",
"LABEL_26905",
"LABEL_26906",
"LABEL_26907",
"LABEL_26908",
"LABEL_26909",
"LABEL_2691",
"LABEL_26910",
"LABEL_26911",
"LABEL_26912",
"LABEL_26913",
"LABEL_26914",
"LABEL_26915",
"LABEL_26916",
"LABEL_26917",
"LABEL_26918",
"LABEL_26919",
"LABEL_2692",
"LABEL_26920",
"LABEL_26921",
"LABEL_26922",
"LABEL_26923",
"LABEL_26924",
"LABEL_26925",
"LABEL_26926",
"LABEL_26927",
"LABEL_26928",
"LABEL_26929",
"LABEL_2693",
"LABEL_26930",
"LABEL_26931",
"LABEL_26932",
"LABEL_26933",
"LABEL_26934",
"LABEL_26935",
"LABEL_26936",
"LABEL_26937",
"LABEL_26938",
"LABEL_26939",
"LABEL_2694",
"LABEL_26940",
"LABEL_26941",
"LABEL_26942",
"LABEL_26943",
"LABEL_26944",
"LABEL_26945",
"LABEL_26946",
"LABEL_26947",
"LABEL_26948",
"LABEL_26949",
"LABEL_2695",
"LABEL_26950",
"LABEL_26951",
"LABEL_26952",
"LABEL_26953",
"LABEL_26954",
"LABEL_26955",
"LABEL_26956",
"LABEL_26957",
"LABEL_26958",
"LABEL_26959",
"LABEL_2696",
"LABEL_26960",
"LABEL_26961",
"LABEL_26962",
"LABEL_26963",
"LABEL_26964",
"LABEL_26965",
"LABEL_26966",
"LABEL_26967",
"LABEL_26968",
"LABEL_26969",
"LABEL_2697",
"LABEL_26970",
"LABEL_26971",
"LABEL_26972",
"LABEL_26973",
"LABEL_26974",
"LABEL_26975",
"LABEL_26976",
"LABEL_26977",
"LABEL_26978",
"LABEL_26979",
"LABEL_2698",
"LABEL_26980",
"LABEL_26981",
"LABEL_26982",
"LABEL_26983",
"LABEL_26984",
"LABEL_26985",
"LABEL_26986",
"LABEL_26987",
"LABEL_26988",
"LABEL_26989",
"LABEL_2699",
"LABEL_26990",
"LABEL_26991",
"LABEL_26992",
"LABEL_26993",
"LABEL_26994",
"LABEL_26995",
"LABEL_26996",
"LABEL_26997",
"LABEL_26998",
"LABEL_26999",
"LABEL_27",
"LABEL_270",
"LABEL_2700",
"LABEL_27000",
"LABEL_27001",
"LABEL_27002",
"LABEL_27003",
"LABEL_27004",
"LABEL_27005",
"LABEL_27006",
"LABEL_27007",
"LABEL_27008",
"LABEL_27009",
"LABEL_2701",
"LABEL_27010",
"LABEL_27011",
"LABEL_27012",
"LABEL_27013",
"LABEL_27014",
"LABEL_27015",
"LABEL_27016",
"LABEL_27017",
"LABEL_27018",
"LABEL_27019",
"LABEL_2702",
"LABEL_27020",
"LABEL_27021",
"LABEL_27022",
"LABEL_27023",
"LABEL_27024",
"LABEL_27025",
"LABEL_27026",
"LABEL_27027",
"LABEL_27028",
"LABEL_27029",
"LABEL_2703",
"LABEL_27030",
"LABEL_27031",
"LABEL_27032",
"LABEL_27033",
"LABEL_27034",
"LABEL_27035",
"LABEL_27036",
"LABEL_27037",
"LABEL_27038",
"LABEL_27039",
"LABEL_2704",
"LABEL_27040",
"LABEL_27041",
"LABEL_27042",
"LABEL_27043",
"LABEL_27044",
"LABEL_27045",
"LABEL_27046",
"LABEL_27047",
"LABEL_27048",
"LABEL_27049",
"LABEL_2705",
"LABEL_27050",
"LABEL_27051",
"LABEL_27052",
"LABEL_27053",
"LABEL_27054",
"LABEL_27055",
"LABEL_27056",
"LABEL_27057",
"LABEL_27058",
"LABEL_27059",
"LABEL_2706",
"LABEL_27060",
"LABEL_27061",
"LABEL_27062",
"LABEL_27063",
"LABEL_27064",
"LABEL_27065",
"LABEL_27066",
"LABEL_27067",
"LABEL_27068",
"LABEL_27069",
"LABEL_2707",
"LABEL_27070",
"LABEL_27071",
"LABEL_27072",
"LABEL_27073",
"LABEL_27074",
"LABEL_27075",
"LABEL_27076",
"LABEL_27077",
"LABEL_27078",
"LABEL_27079",
"LABEL_2708",
"LABEL_27080",
"LABEL_27081",
"LABEL_27082",
"LABEL_27083",
"LABEL_27084",
"LABEL_27085",
"LABEL_27086",
"LABEL_27087",
"LABEL_27088",
"LABEL_27089",
"LABEL_2709",
"LABEL_27090",
"LABEL_27091",
"LABEL_27092",
"LABEL_27093",
"LABEL_27094",
"LABEL_27095",
"LABEL_27096",
"LABEL_27097",
"LABEL_27098",
"LABEL_27099",
"LABEL_271",
"LABEL_2710",
"LABEL_27100",
"LABEL_27101",
"LABEL_27102",
"LABEL_27103",
"LABEL_27104",
"LABEL_27105",
"LABEL_27106",
"LABEL_27107",
"LABEL_27108",
"LABEL_27109",
"LABEL_2711",
"LABEL_27110",
"LABEL_27111",
"LABEL_27112",
"LABEL_27113",
"LABEL_27114",
"LABEL_27115",
"LABEL_27116",
"LABEL_27117",
"LABEL_27118",
"LABEL_27119",
"LABEL_2712",
"LABEL_27120",
"LABEL_27121",
"LABEL_27122",
"LABEL_27123",
"LABEL_27124",
"LABEL_27125",
"LABEL_27126",
"LABEL_27127",
"LABEL_27128",
"LABEL_27129",
"LABEL_2713",
"LABEL_27130",
"LABEL_27131",
"LABEL_27132",
"LABEL_27133",
"LABEL_27134",
"LABEL_27135",
"LABEL_27136",
"LABEL_27137",
"LABEL_27138",
"LABEL_27139",
"LABEL_2714",
"LABEL_27140",
"LABEL_27141",
"LABEL_27142",
"LABEL_27143",
"LABEL_27144",
"LABEL_27145",
"LABEL_27146",
"LABEL_27147",
"LABEL_27148",
"LABEL_27149",
"LABEL_2715",
"LABEL_27150",
"LABEL_27151",
"LABEL_27152",
"LABEL_27153",
"LABEL_27154",
"LABEL_27155",
"LABEL_27156",
"LABEL_27157",
"LABEL_27158",
"LABEL_27159",
"LABEL_2716",
"LABEL_27160",
"LABEL_27161",
"LABEL_27162",
"LABEL_27163",
"LABEL_27164",
"LABEL_27165",
"LABEL_27166",
"LABEL_27167",
"LABEL_27168",
"LABEL_27169",
"LABEL_2717",
"LABEL_27170",
"LABEL_27171",
"LABEL_27172",
"LABEL_27173",
"LABEL_27174",
"LABEL_27175",
"LABEL_27176",
"LABEL_27177",
"LABEL_27178",
"LABEL_27179",
"LABEL_2718",
"LABEL_27180",
"LABEL_27181",
"LABEL_27182",
"LABEL_27183",
"LABEL_27184",
"LABEL_27185",
"LABEL_27186",
"LABEL_27187",
"LABEL_27188",
"LABEL_27189",
"LABEL_2719",
"LABEL_27190",
"LABEL_27191",
"LABEL_27192",
"LABEL_27193",
"LABEL_27194",
"LABEL_27195",
"LABEL_27196",
"LABEL_27197",
"LABEL_27198",
"LABEL_27199",
"LABEL_272",
"LABEL_2720",
"LABEL_27200",
"LABEL_27201",
"LABEL_27202",
"LABEL_27203",
"LABEL_27204",
"LABEL_27205",
"LABEL_27206",
"LABEL_27207",
"LABEL_27208",
"LABEL_27209",
"LABEL_2721",
"LABEL_27210",
"LABEL_27211",
"LABEL_27212",
"LABEL_27213",
"LABEL_27214",
"LABEL_27215",
"LABEL_27216",
"LABEL_27217",
"LABEL_27218",
"LABEL_27219",
"LABEL_2722",
"LABEL_27220",
"LABEL_27221",
"LABEL_27222",
"LABEL_27223",
"LABEL_27224",
"LABEL_27225",
"LABEL_27226",
"LABEL_27227",
"LABEL_27228",
"LABEL_27229",
"LABEL_2723",
"LABEL_27230",
"LABEL_27231",
"LABEL_27232",
"LABEL_27233",
"LABEL_27234",
"LABEL_27235",
"LABEL_27236",
"LABEL_27237",
"LABEL_27238",
"LABEL_27239",
"LABEL_2724",
"LABEL_27240",
"LABEL_27241",
"LABEL_27242",
"LABEL_27243",
"LABEL_27244",
"LABEL_27245",
"LABEL_27246",
"LABEL_27247",
"LABEL_27248",
"LABEL_27249",
"LABEL_2725",
"LABEL_27250",
"LABEL_27251",
"LABEL_27252",
"LABEL_27253",
"LABEL_27254",
"LABEL_27255",
"LABEL_27256",
"LABEL_27257",
"LABEL_27258",
"LABEL_27259",
"LABEL_2726",
"LABEL_27260",
"LABEL_27261",
"LABEL_27262",
"LABEL_27263",
"LABEL_27264",
"LABEL_27265",
"LABEL_27266",
"LABEL_27267",
"LABEL_27268",
"LABEL_27269",
"LABEL_2727",
"LABEL_27270",
"LABEL_27271",
"LABEL_27272",
"LABEL_27273",
"LABEL_27274",
"LABEL_27275",
"LABEL_27276",
"LABEL_27277",
"LABEL_27278",
"LABEL_27279",
"LABEL_2728",
"LABEL_27280",
"LABEL_27281",
"LABEL_27282",
"LABEL_27283",
"LABEL_27284",
"LABEL_27285",
"LABEL_27286",
"LABEL_27287",
"LABEL_27288",
"LABEL_27289",
"LABEL_2729",
"LABEL_27290",
"LABEL_27291",
"LABEL_27292",
"LABEL_27293",
"LABEL_27294",
"LABEL_27295",
"LABEL_27296",
"LABEL_27297",
"LABEL_27298",
"LABEL_27299",
"LABEL_273",
"LABEL_2730",
"LABEL_27300",
"LABEL_27301",
"LABEL_27302",
"LABEL_27303",
"LABEL_27304",
"LABEL_27305",
"LABEL_27306",
"LABEL_27307",
"LABEL_27308",
"LABEL_27309",
"LABEL_2731",
"LABEL_27310",
"LABEL_27311",
"LABEL_27312",
"LABEL_27313",
"LABEL_27314",
"LABEL_27315",
"LABEL_27316",
"LABEL_27317",
"LABEL_27318",
"LABEL_27319",
"LABEL_2732",
"LABEL_27320",
"LABEL_27321",
"LABEL_27322",
"LABEL_27323",
"LABEL_27324",
"LABEL_27325",
"LABEL_27326",
"LABEL_27327",
"LABEL_27328",
"LABEL_27329",
"LABEL_2733",
"LABEL_27330",
"LABEL_27331",
"LABEL_27332",
"LABEL_27333",
"LABEL_27334",
"LABEL_27335",
"LABEL_27336",
"LABEL_27337",
"LABEL_27338",
"LABEL_27339",
"LABEL_2734",
"LABEL_27340",
"LABEL_27341",
"LABEL_27342",
"LABEL_27343",
"LABEL_27344",
"LABEL_27345",
"LABEL_27346",
"LABEL_27347",
"LABEL_27348",
"LABEL_27349",
"LABEL_2735",
"LABEL_27350",
"LABEL_27351",
"LABEL_27352",
"LABEL_27353",
"LABEL_27354",
"LABEL_27355",
"LABEL_27356",
"LABEL_27357",
"LABEL_27358",
"LABEL_27359",
"LABEL_2736",
"LABEL_27360",
"LABEL_27361",
"LABEL_27362",
"LABEL_27363",
"LABEL_27364",
"LABEL_27365",
"LABEL_27366",
"LABEL_27367",
"LABEL_27368",
"LABEL_27369",
"LABEL_2737",
"LABEL_27370",
"LABEL_27371",
"LABEL_27372",
"LABEL_27373",
"LABEL_27374",
"LABEL_27375",
"LABEL_27376",
"LABEL_27377",
"LABEL_27378",
"LABEL_27379",
"LABEL_2738",
"LABEL_27380",
"LABEL_27381",
"LABEL_27382",
"LABEL_27383",
"LABEL_27384",
"LABEL_27385",
"LABEL_27386",
"LABEL_27387",
"LABEL_27388",
"LABEL_27389",
"LABEL_2739",
"LABEL_27390",
"LABEL_27391",
"LABEL_27392",
"LABEL_27393",
"LABEL_27394",
"LABEL_27395",
"LABEL_27396",
"LABEL_27397",
"LABEL_27398",
"LABEL_27399",
"LABEL_274",
"LABEL_2740",
"LABEL_27400",
"LABEL_27401",
"LABEL_27402",
"LABEL_27403",
"LABEL_27404",
"LABEL_27405",
"LABEL_27406",
"LABEL_27407",
"LABEL_27408",
"LABEL_27409",
"LABEL_2741",
"LABEL_27410",
"LABEL_27411",
"LABEL_27412",
"LABEL_27413",
"LABEL_27414",
"LABEL_27415",
"LABEL_27416",
"LABEL_27417",
"LABEL_27418",
"LABEL_27419",
"LABEL_2742",
"LABEL_27420",
"LABEL_27421",
"LABEL_27422",
"LABEL_27423",
"LABEL_27424",
"LABEL_27425",
"LABEL_27426",
"LABEL_27427",
"LABEL_27428",
"LABEL_27429",
"LABEL_2743",
"LABEL_27430",
"LABEL_27431",
"LABEL_27432",
"LABEL_27433",
"LABEL_27434",
"LABEL_27435",
"LABEL_27436",
"LABEL_27437",
"LABEL_27438",
"LABEL_27439",
"LABEL_2744",
"LABEL_27440",
"LABEL_27441",
"LABEL_27442",
"LABEL_27443",
"LABEL_27444",
"LABEL_27445",
"LABEL_27446",
"LABEL_27447",
"LABEL_27448",
"LABEL_27449",
"LABEL_2745",
"LABEL_27450",
"LABEL_27451",
"LABEL_27452",
"LABEL_27453",
"LABEL_27454",
"LABEL_27455",
"LABEL_27456",
"LABEL_27457",
"LABEL_27458",
"LABEL_27459",
"LABEL_2746",
"LABEL_27460",
"LABEL_27461",
"LABEL_27462",
"LABEL_27463",
"LABEL_27464",
"LABEL_27465",
"LABEL_27466",
"LABEL_27467",
"LABEL_27468",
"LABEL_27469",
"LABEL_2747",
"LABEL_27470",
"LABEL_27471",
"LABEL_27472",
"LABEL_27473",
"LABEL_27474",
"LABEL_27475",
"LABEL_27476",
"LABEL_27477",
"LABEL_27478",
"LABEL_27479",
"LABEL_2748",
"LABEL_27480",
"LABEL_27481",
"LABEL_27482",
"LABEL_27483",
"LABEL_27484",
"LABEL_27485",
"LABEL_27486",
"LABEL_27487",
"LABEL_27488",
"LABEL_27489",
"LABEL_2749",
"LABEL_27490",
"LABEL_27491",
"LABEL_27492",
"LABEL_27493",
"LABEL_27494",
"LABEL_27495",
"LABEL_27496",
"LABEL_27497",
"LABEL_27498",
"LABEL_27499",
"LABEL_275",
"LABEL_2750",
"LABEL_27500",
"LABEL_27501",
"LABEL_27502",
"LABEL_27503",
"LABEL_27504",
"LABEL_27505",
"LABEL_27506",
"LABEL_27507",
"LABEL_27508",
"LABEL_27509",
"LABEL_2751",
"LABEL_27510",
"LABEL_27511",
"LABEL_27512",
"LABEL_27513",
"LABEL_27514",
"LABEL_27515",
"LABEL_27516",
"LABEL_27517",
"LABEL_27518",
"LABEL_27519",
"LABEL_2752",
"LABEL_27520",
"LABEL_27521",
"LABEL_27522",
"LABEL_27523",
"LABEL_27524",
"LABEL_27525",
"LABEL_27526",
"LABEL_27527",
"LABEL_27528",
"LABEL_27529",
"LABEL_2753",
"LABEL_27530",
"LABEL_27531",
"LABEL_27532",
"LABEL_27533",
"LABEL_27534",
"LABEL_27535",
"LABEL_27536",
"LABEL_27537",
"LABEL_27538",
"LABEL_27539",
"LABEL_2754",
"LABEL_27540",
"LABEL_27541",
"LABEL_27542",
"LABEL_27543",
"LABEL_27544",
"LABEL_27545",
"LABEL_27546",
"LABEL_27547",
"LABEL_27548",
"LABEL_27549",
"LABEL_2755",
"LABEL_27550",
"LABEL_27551",
"LABEL_27552",
"LABEL_27553",
"LABEL_27554",
"LABEL_27555",
"LABEL_27556",
"LABEL_27557",
"LABEL_27558",
"LABEL_27559",
"LABEL_2756",
"LABEL_27560",
"LABEL_27561",
"LABEL_27562",
"LABEL_27563",
"LABEL_27564",
"LABEL_27565",
"LABEL_27566",
"LABEL_27567",
"LABEL_27568",
"LABEL_27569",
"LABEL_2757",
"LABEL_27570",
"LABEL_27571",
"LABEL_27572",
"LABEL_27573",
"LABEL_27574",
"LABEL_27575",
"LABEL_27576",
"LABEL_27577",
"LABEL_27578",
"LABEL_27579",
"LABEL_2758",
"LABEL_27580",
"LABEL_27581",
"LABEL_27582",
"LABEL_27583",
"LABEL_27584",
"LABEL_27585",
"LABEL_27586",
"LABEL_27587",
"LABEL_27588",
"LABEL_27589",
"LABEL_2759",
"LABEL_27590",
"LABEL_27591",
"LABEL_27592",
"LABEL_27593",
"LABEL_27594",
"LABEL_27595",
"LABEL_27596",
"LABEL_27597",
"LABEL_27598",
"LABEL_27599",
"LABEL_276",
"LABEL_2760",
"LABEL_27600",
"LABEL_27601",
"LABEL_27602",
"LABEL_27603",
"LABEL_27604",
"LABEL_27605",
"LABEL_27606",
"LABEL_27607",
"LABEL_27608",
"LABEL_27609",
"LABEL_2761",
"LABEL_27610",
"LABEL_27611",
"LABEL_27612",
"LABEL_27613",
"LABEL_27614",
"LABEL_27615",
"LABEL_27616",
"LABEL_27617",
"LABEL_27618",
"LABEL_27619",
"LABEL_2762",
"LABEL_27620",
"LABEL_27621",
"LABEL_27622",
"LABEL_27623",
"LABEL_27624",
"LABEL_27625",
"LABEL_27626",
"LABEL_27627",
"LABEL_27628",
"LABEL_27629",
"LABEL_2763",
"LABEL_27630",
"LABEL_27631",
"LABEL_27632",
"LABEL_27633",
"LABEL_27634",
"LABEL_27635",
"LABEL_27636",
"LABEL_27637",
"LABEL_27638",
"LABEL_27639",
"LABEL_2764",
"LABEL_27640",
"LABEL_27641",
"LABEL_27642",
"LABEL_27643",
"LABEL_27644",
"LABEL_27645",
"LABEL_27646",
"LABEL_27647",
"LABEL_27648",
"LABEL_27649",
"LABEL_2765",
"LABEL_27650",
"LABEL_27651",
"LABEL_27652",
"LABEL_27653",
"LABEL_27654",
"LABEL_27655",
"LABEL_27656",
"LABEL_27657",
"LABEL_27658",
"LABEL_27659",
"LABEL_2766",
"LABEL_27660",
"LABEL_27661",
"LABEL_27662",
"LABEL_27663",
"LABEL_27664",
"LABEL_27665",
"LABEL_27666",
"LABEL_27667",
"LABEL_27668",
"LABEL_27669",
"LABEL_2767",
"LABEL_27670",
"LABEL_27671",
"LABEL_27672",
"LABEL_27673",
"LABEL_27674",
"LABEL_27675",
"LABEL_27676",
"LABEL_27677",
"LABEL_27678",
"LABEL_27679",
"LABEL_2768",
"LABEL_27680",
"LABEL_27681",
"LABEL_27682",
"LABEL_27683",
"LABEL_27684",
"LABEL_27685",
"LABEL_27686",
"LABEL_27687",
"LABEL_27688",
"LABEL_27689",
"LABEL_2769",
"LABEL_27690",
"LABEL_27691",
"LABEL_27692",
"LABEL_27693",
"LABEL_27694",
"LABEL_27695",
"LABEL_27696",
"LABEL_27697",
"LABEL_27698",
"LABEL_27699",
"LABEL_277",
"LABEL_2770",
"LABEL_27700",
"LABEL_27701",
"LABEL_27702",
"LABEL_27703",
"LABEL_27704",
"LABEL_27705",
"LABEL_27706",
"LABEL_27707",
"LABEL_27708",
"LABEL_27709",
"LABEL_2771",
"LABEL_27710",
"LABEL_27711",
"LABEL_27712",
"LABEL_27713",
"LABEL_27714",
"LABEL_27715",
"LABEL_27716",
"LABEL_27717",
"LABEL_27718",
"LABEL_27719",
"LABEL_2772",
"LABEL_27720",
"LABEL_27721",
"LABEL_27722",
"LABEL_27723",
"LABEL_27724",
"LABEL_27725",
"LABEL_27726",
"LABEL_27727",
"LABEL_27728",
"LABEL_27729",
"LABEL_2773",
"LABEL_27730",
"LABEL_27731",
"LABEL_27732",
"LABEL_27733",
"LABEL_27734",
"LABEL_27735",
"LABEL_27736",
"LABEL_27737",
"LABEL_27738",
"LABEL_27739",
"LABEL_2774",
"LABEL_27740",
"LABEL_27741",
"LABEL_27742",
"LABEL_27743",
"LABEL_27744",
"LABEL_27745",
"LABEL_27746",
"LABEL_27747",
"LABEL_27748",
"LABEL_27749",
"LABEL_2775",
"LABEL_27750",
"LABEL_27751",
"LABEL_27752",
"LABEL_27753",
"LABEL_27754",
"LABEL_27755",
"LABEL_27756",
"LABEL_27757",
"LABEL_27758",
"LABEL_27759",
"LABEL_2776",
"LABEL_27760",
"LABEL_27761",
"LABEL_27762",
"LABEL_27763",
"LABEL_27764",
"LABEL_27765",
"LABEL_27766",
"LABEL_27767",
"LABEL_27768",
"LABEL_27769",
"LABEL_2777",
"LABEL_27770",
"LABEL_27771",
"LABEL_27772",
"LABEL_27773",
"LABEL_27774",
"LABEL_27775",
"LABEL_27776",
"LABEL_27777",
"LABEL_27778",
"LABEL_27779",
"LABEL_2778",
"LABEL_27780",
"LABEL_27781",
"LABEL_27782",
"LABEL_27783",
"LABEL_27784",
"LABEL_27785",
"LABEL_27786",
"LABEL_27787",
"LABEL_27788",
"LABEL_27789",
"LABEL_2779",
"LABEL_27790",
"LABEL_27791",
"LABEL_27792",
"LABEL_27793",
"LABEL_27794",
"LABEL_27795",
"LABEL_27796",
"LABEL_27797",
"LABEL_27798",
"LABEL_27799",
"LABEL_278",
"LABEL_2780",
"LABEL_27800",
"LABEL_27801",
"LABEL_27802",
"LABEL_27803",
"LABEL_27804",
"LABEL_27805",
"LABEL_27806",
"LABEL_27807",
"LABEL_27808",
"LABEL_27809",
"LABEL_2781",
"LABEL_27810",
"LABEL_27811",
"LABEL_27812",
"LABEL_27813",
"LABEL_27814",
"LABEL_27815",
"LABEL_27816",
"LABEL_27817",
"LABEL_27818",
"LABEL_27819",
"LABEL_2782",
"LABEL_27820",
"LABEL_27821",
"LABEL_27822",
"LABEL_27823",
"LABEL_27824",
"LABEL_27825",
"LABEL_27826",
"LABEL_27827",
"LABEL_27828",
"LABEL_27829",
"LABEL_2783",
"LABEL_27830",
"LABEL_27831",
"LABEL_27832",
"LABEL_27833",
"LABEL_27834",
"LABEL_27835",
"LABEL_27836",
"LABEL_27837",
"LABEL_27838",
"LABEL_27839",
"LABEL_2784",
"LABEL_27840",
"LABEL_27841",
"LABEL_27842",
"LABEL_27843",
"LABEL_27844",
"LABEL_27845",
"LABEL_27846",
"LABEL_27847",
"LABEL_27848",
"LABEL_27849",
"LABEL_2785",
"LABEL_27850",
"LABEL_27851",
"LABEL_27852",
"LABEL_27853",
"LABEL_27854",
"LABEL_27855",
"LABEL_27856",
"LABEL_27857",
"LABEL_27858",
"LABEL_27859",
"LABEL_2786",
"LABEL_27860",
"LABEL_27861",
"LABEL_27862",
"LABEL_27863",
"LABEL_27864",
"LABEL_27865",
"LABEL_27866",
"LABEL_27867",
"LABEL_27868",
"LABEL_27869",
"LABEL_2787",
"LABEL_27870",
"LABEL_27871",
"LABEL_27872",
"LABEL_27873",
"LABEL_27874",
"LABEL_27875",
"LABEL_27876",
"LABEL_27877",
"LABEL_27878",
"LABEL_27879",
"LABEL_2788",
"LABEL_27880",
"LABEL_27881",
"LABEL_27882",
"LABEL_27883",
"LABEL_27884",
"LABEL_27885",
"LABEL_27886",
"LABEL_27887",
"LABEL_27888",
"LABEL_27889",
"LABEL_2789",
"LABEL_27890",
"LABEL_27891",
"LABEL_27892",
"LABEL_27893",
"LABEL_27894",
"LABEL_27895",
"LABEL_27896",
"LABEL_27897",
"LABEL_27898",
"LABEL_27899",
"LABEL_279",
"LABEL_2790",
"LABEL_27900",
"LABEL_27901",
"LABEL_27902",
"LABEL_27903",
"LABEL_27904",
"LABEL_27905",
"LABEL_27906",
"LABEL_27907",
"LABEL_27908",
"LABEL_27909",
"LABEL_2791",
"LABEL_27910",
"LABEL_27911",
"LABEL_27912",
"LABEL_27913",
"LABEL_27914",
"LABEL_27915",
"LABEL_27916",
"LABEL_27917",
"LABEL_27918",
"LABEL_27919",
"LABEL_2792",
"LABEL_27920",
"LABEL_27921",
"LABEL_27922",
"LABEL_27923",
"LABEL_27924",
"LABEL_27925",
"LABEL_27926",
"LABEL_27927",
"LABEL_27928",
"LABEL_27929",
"LABEL_2793",
"LABEL_27930",
"LABEL_27931",
"LABEL_27932",
"LABEL_27933",
"LABEL_27934",
"LABEL_27935",
"LABEL_27936",
"LABEL_27937",
"LABEL_27938",
"LABEL_27939",
"LABEL_2794",
"LABEL_27940",
"LABEL_27941",
"LABEL_27942",
"LABEL_27943",
"LABEL_27944",
"LABEL_27945",
"LABEL_27946",
"LABEL_27947",
"LABEL_27948",
"LABEL_27949",
"LABEL_2795",
"LABEL_27950",
"LABEL_27951",
"LABEL_27952",
"LABEL_27953",
"LABEL_27954",
"LABEL_27955",
"LABEL_27956",
"LABEL_27957",
"LABEL_27958",
"LABEL_27959",
"LABEL_2796",
"LABEL_27960",
"LABEL_27961",
"LABEL_27962",
"LABEL_27963",
"LABEL_27964",
"LABEL_27965",
"LABEL_27966",
"LABEL_27967",
"LABEL_27968",
"LABEL_27969",
"LABEL_2797",
"LABEL_27970",
"LABEL_27971",
"LABEL_27972",
"LABEL_27973",
"LABEL_27974",
"LABEL_27975",
"LABEL_27976",
"LABEL_27977",
"LABEL_27978",
"LABEL_27979",
"LABEL_2798",
"LABEL_27980",
"LABEL_27981",
"LABEL_27982",
"LABEL_27983",
"LABEL_27984",
"LABEL_27985",
"LABEL_27986",
"LABEL_27987",
"LABEL_27988",
"LABEL_27989",
"LABEL_2799",
"LABEL_27990",
"LABEL_27991",
"LABEL_27992",
"LABEL_27993",
"LABEL_27994",
"LABEL_27995",
"LABEL_27996",
"LABEL_27997",
"LABEL_27998",
"LABEL_27999",
"LABEL_28",
"LABEL_280",
"LABEL_2800",
"LABEL_28000",
"LABEL_28001",
"LABEL_28002",
"LABEL_28003",
"LABEL_28004",
"LABEL_28005",
"LABEL_28006",
"LABEL_28007",
"LABEL_28008",
"LABEL_28009",
"LABEL_2801",
"LABEL_28010",
"LABEL_28011",
"LABEL_28012",
"LABEL_28013",
"LABEL_28014",
"LABEL_28015",
"LABEL_28016",
"LABEL_28017",
"LABEL_28018",
"LABEL_28019",
"LABEL_2802",
"LABEL_28020",
"LABEL_28021",
"LABEL_28022",
"LABEL_28023",
"LABEL_28024",
"LABEL_28025",
"LABEL_28026",
"LABEL_28027",
"LABEL_28028",
"LABEL_28029",
"LABEL_2803",
"LABEL_28030",
"LABEL_28031",
"LABEL_28032",
"LABEL_28033",
"LABEL_28034",
"LABEL_28035",
"LABEL_28036",
"LABEL_28037",
"LABEL_28038",
"LABEL_28039",
"LABEL_2804",
"LABEL_28040",
"LABEL_28041",
"LABEL_28042",
"LABEL_28043",
"LABEL_28044",
"LABEL_28045",
"LABEL_28046",
"LABEL_28047",
"LABEL_28048",
"LABEL_28049",
"LABEL_2805",
"LABEL_28050",
"LABEL_28051",
"LABEL_28052",
"LABEL_28053",
"LABEL_28054",
"LABEL_28055",
"LABEL_28056",
"LABEL_28057",
"LABEL_28058",
"LABEL_28059",
"LABEL_2806",
"LABEL_28060",
"LABEL_28061",
"LABEL_28062",
"LABEL_28063",
"LABEL_28064",
"LABEL_28065",
"LABEL_28066",
"LABEL_28067",
"LABEL_28068",
"LABEL_28069",
"LABEL_2807",
"LABEL_28070",
"LABEL_28071",
"LABEL_28072",
"LABEL_28073",
"LABEL_28074",
"LABEL_28075",
"LABEL_28076",
"LABEL_28077",
"LABEL_28078",
"LABEL_28079",
"LABEL_2808",
"LABEL_28080",
"LABEL_28081",
"LABEL_28082",
"LABEL_28083",
"LABEL_28084",
"LABEL_28085",
"LABEL_28086",
"LABEL_28087",
"LABEL_28088",
"LABEL_28089",
"LABEL_2809",
"LABEL_28090",
"LABEL_28091",
"LABEL_28092",
"LABEL_28093",
"LABEL_28094",
"LABEL_28095",
"LABEL_28096",
"LABEL_28097",
"LABEL_28098",
"LABEL_28099",
"LABEL_281",
"LABEL_2810",
"LABEL_28100",
"LABEL_28101",
"LABEL_28102",
"LABEL_28103",
"LABEL_28104",
"LABEL_28105",
"LABEL_28106",
"LABEL_28107",
"LABEL_28108",
"LABEL_28109",
"LABEL_2811",
"LABEL_28110",
"LABEL_28111",
"LABEL_28112",
"LABEL_28113",
"LABEL_28114",
"LABEL_28115",
"LABEL_28116",
"LABEL_28117",
"LABEL_28118",
"LABEL_28119",
"LABEL_2812",
"LABEL_28120",
"LABEL_28121",
"LABEL_28122",
"LABEL_28123",
"LABEL_28124",
"LABEL_28125",
"LABEL_28126",
"LABEL_28127",
"LABEL_28128",
"LABEL_28129",
"LABEL_2813",
"LABEL_28130",
"LABEL_28131",
"LABEL_28132",
"LABEL_28133",
"LABEL_28134",
"LABEL_28135",
"LABEL_28136",
"LABEL_28137",
"LABEL_28138",
"LABEL_28139",
"LABEL_2814",
"LABEL_28140",
"LABEL_28141",
"LABEL_28142",
"LABEL_28143",
"LABEL_28144",
"LABEL_28145",
"LABEL_28146",
"LABEL_28147",
"LABEL_28148",
"LABEL_28149",
"LABEL_2815",
"LABEL_28150",
"LABEL_28151",
"LABEL_28152",
"LABEL_28153",
"LABEL_28154",
"LABEL_28155",
"LABEL_28156",
"LABEL_28157",
"LABEL_28158",
"LABEL_28159",
"LABEL_2816",
"LABEL_28160",
"LABEL_28161",
"LABEL_28162",
"LABEL_28163",
"LABEL_28164",
"LABEL_28165",
"LABEL_28166",
"LABEL_28167",
"LABEL_28168",
"LABEL_28169",
"LABEL_2817",
"LABEL_28170",
"LABEL_28171",
"LABEL_28172",
"LABEL_28173",
"LABEL_28174",
"LABEL_28175",
"LABEL_28176",
"LABEL_28177",
"LABEL_28178",
"LABEL_28179",
"LABEL_2818",
"LABEL_28180",
"LABEL_28181",
"LABEL_28182",
"LABEL_28183",
"LABEL_28184",
"LABEL_28185",
"LABEL_28186",
"LABEL_28187",
"LABEL_28188",
"LABEL_28189",
"LABEL_2819",
"LABEL_28190",
"LABEL_28191",
"LABEL_28192",
"LABEL_28193",
"LABEL_28194",
"LABEL_28195",
"LABEL_28196",
"LABEL_28197",
"LABEL_28198",
"LABEL_28199",
"LABEL_282",
"LABEL_2820",
"LABEL_28200",
"LABEL_28201",
"LABEL_28202",
"LABEL_28203",
"LABEL_28204",
"LABEL_28205",
"LABEL_28206",
"LABEL_28207",
"LABEL_28208",
"LABEL_28209",
"LABEL_2821",
"LABEL_28210",
"LABEL_28211",
"LABEL_28212",
"LABEL_28213",
"LABEL_28214",
"LABEL_28215",
"LABEL_28216",
"LABEL_28217",
"LABEL_28218",
"LABEL_28219",
"LABEL_2822",
"LABEL_28220",
"LABEL_28221",
"LABEL_28222",
"LABEL_28223",
"LABEL_28224",
"LABEL_28225",
"LABEL_28226",
"LABEL_28227",
"LABEL_28228",
"LABEL_28229",
"LABEL_2823",
"LABEL_28230",
"LABEL_28231",
"LABEL_28232",
"LABEL_28233",
"LABEL_28234",
"LABEL_28235",
"LABEL_28236",
"LABEL_28237",
"LABEL_28238",
"LABEL_28239",
"LABEL_2824",
"LABEL_28240",
"LABEL_28241",
"LABEL_28242",
"LABEL_28243",
"LABEL_28244",
"LABEL_28245",
"LABEL_28246",
"LABEL_28247",
"LABEL_28248",
"LABEL_28249",
"LABEL_2825",
"LABEL_28250",
"LABEL_28251",
"LABEL_28252",
"LABEL_28253",
"LABEL_28254",
"LABEL_28255",
"LABEL_28256",
"LABEL_28257",
"LABEL_28258",
"LABEL_28259",
"LABEL_2826",
"LABEL_28260",
"LABEL_28261",
"LABEL_28262",
"LABEL_28263",
"LABEL_28264",
"LABEL_28265",
"LABEL_28266",
"LABEL_28267",
"LABEL_28268",
"LABEL_28269",
"LABEL_2827",
"LABEL_28270",
"LABEL_28271",
"LABEL_28272",
"LABEL_28273",
"LABEL_28274",
"LABEL_28275",
"LABEL_28276",
"LABEL_28277",
"LABEL_28278",
"LABEL_28279",
"LABEL_2828",
"LABEL_28280",
"LABEL_28281",
"LABEL_28282",
"LABEL_28283",
"LABEL_28284",
"LABEL_28285",
"LABEL_28286",
"LABEL_28287",
"LABEL_28288",
"LABEL_28289",
"LABEL_2829",
"LABEL_28290",
"LABEL_28291",
"LABEL_28292",
"LABEL_28293",
"LABEL_28294",
"LABEL_28295",
"LABEL_28296",
"LABEL_28297",
"LABEL_28298",
"LABEL_28299",
"LABEL_283",
"LABEL_2830",
"LABEL_28300",
"LABEL_28301",
"LABEL_28302",
"LABEL_28303",
"LABEL_28304",
"LABEL_28305",
"LABEL_28306",
"LABEL_28307",
"LABEL_28308",
"LABEL_28309",
"LABEL_2831",
"LABEL_28310",
"LABEL_28311",
"LABEL_28312",
"LABEL_28313",
"LABEL_28314",
"LABEL_28315",
"LABEL_28316",
"LABEL_28317",
"LABEL_28318",
"LABEL_28319",
"LABEL_2832",
"LABEL_28320",
"LABEL_28321",
"LABEL_28322",
"LABEL_28323",
"LABEL_28324",
"LABEL_28325",
"LABEL_28326",
"LABEL_28327",
"LABEL_28328",
"LABEL_28329",
"LABEL_2833",
"LABEL_28330",
"LABEL_28331",
"LABEL_28332",
"LABEL_28333",
"LABEL_28334",
"LABEL_28335",
"LABEL_28336",
"LABEL_28337",
"LABEL_28338",
"LABEL_28339",
"LABEL_2834",
"LABEL_28340",
"LABEL_28341",
"LABEL_28342",
"LABEL_28343",
"LABEL_28344",
"LABEL_28345",
"LABEL_28346",
"LABEL_28347",
"LABEL_28348",
"LABEL_28349",
"LABEL_2835",
"LABEL_28350",
"LABEL_28351",
"LABEL_28352",
"LABEL_28353",
"LABEL_28354",
"LABEL_28355",
"LABEL_28356",
"LABEL_28357",
"LABEL_28358",
"LABEL_28359",
"LABEL_2836",
"LABEL_28360",
"LABEL_28361",
"LABEL_28362",
"LABEL_28363",
"LABEL_28364",
"LABEL_28365",
"LABEL_28366",
"LABEL_28367",
"LABEL_28368",
"LABEL_28369",
"LABEL_2837",
"LABEL_28370",
"LABEL_28371",
"LABEL_28372",
"LABEL_28373",
"LABEL_28374",
"LABEL_28375",
"LABEL_28376",
"LABEL_28377",
"LABEL_28378",
"LABEL_28379",
"LABEL_2838",
"LABEL_28380",
"LABEL_28381",
"LABEL_28382",
"LABEL_28383",
"LABEL_28384",
"LABEL_28385",
"LABEL_28386",
"LABEL_28387",
"LABEL_28388",
"LABEL_28389",
"LABEL_2839",
"LABEL_28390",
"LABEL_28391",
"LABEL_28392",
"LABEL_28393",
"LABEL_28394",
"LABEL_28395",
"LABEL_28396",
"LABEL_28397",
"LABEL_28398",
"LABEL_28399",
"LABEL_284",
"LABEL_2840",
"LABEL_28400",
"LABEL_28401",
"LABEL_28402",
"LABEL_28403",
"LABEL_28404",
"LABEL_28405",
"LABEL_28406",
"LABEL_28407",
"LABEL_28408",
"LABEL_28409",
"LABEL_2841",
"LABEL_28410",
"LABEL_28411",
"LABEL_28412",
"LABEL_28413",
"LABEL_28414",
"LABEL_28415",
"LABEL_28416",
"LABEL_28417",
"LABEL_28418",
"LABEL_28419",
"LABEL_2842",
"LABEL_28420",
"LABEL_28421",
"LABEL_28422",
"LABEL_28423",
"LABEL_28424",
"LABEL_28425",
"LABEL_28426",
"LABEL_28427",
"LABEL_28428",
"LABEL_28429",
"LABEL_2843",
"LABEL_28430",
"LABEL_28431",
"LABEL_28432",
"LABEL_28433",
"LABEL_28434",
"LABEL_28435",
"LABEL_28436",
"LABEL_28437",
"LABEL_28438",
"LABEL_28439",
"LABEL_2844",
"LABEL_28440",
"LABEL_28441",
"LABEL_28442",
"LABEL_28443",
"LABEL_28444",
"LABEL_28445",
"LABEL_28446",
"LABEL_28447",
"LABEL_28448",
"LABEL_28449",
"LABEL_2845",
"LABEL_28450",
"LABEL_28451",
"LABEL_28452",
"LABEL_28453",
"LABEL_28454",
"LABEL_28455",
"LABEL_28456",
"LABEL_28457",
"LABEL_28458",
"LABEL_28459",
"LABEL_2846",
"LABEL_28460",
"LABEL_28461",
"LABEL_28462",
"LABEL_28463",
"LABEL_28464",
"LABEL_28465",
"LABEL_28466",
"LABEL_28467",
"LABEL_28468",
"LABEL_28469",
"LABEL_2847",
"LABEL_28470",
"LABEL_28471",
"LABEL_28472",
"LABEL_28473",
"LABEL_28474",
"LABEL_28475",
"LABEL_28476",
"LABEL_28477",
"LABEL_28478",
"LABEL_28479",
"LABEL_2848",
"LABEL_28480",
"LABEL_28481",
"LABEL_28482",
"LABEL_28483",
"LABEL_28484",
"LABEL_28485",
"LABEL_28486",
"LABEL_28487",
"LABEL_28488",
"LABEL_28489",
"LABEL_2849",
"LABEL_28490",
"LABEL_28491",
"LABEL_28492",
"LABEL_28493",
"LABEL_28494",
"LABEL_28495",
"LABEL_28496",
"LABEL_28497",
"LABEL_28498",
"LABEL_28499",
"LABEL_285",
"LABEL_2850",
"LABEL_28500",
"LABEL_28501",
"LABEL_28502",
"LABEL_28503",
"LABEL_28504",
"LABEL_28505",
"LABEL_28506",
"LABEL_28507",
"LABEL_28508",
"LABEL_28509",
"LABEL_2851",
"LABEL_28510",
"LABEL_28511",
"LABEL_28512",
"LABEL_28513",
"LABEL_28514",
"LABEL_28515",
"LABEL_28516",
"LABEL_28517",
"LABEL_28518",
"LABEL_28519",
"LABEL_2852",
"LABEL_28520",
"LABEL_28521",
"LABEL_28522",
"LABEL_28523",
"LABEL_28524",
"LABEL_28525",
"LABEL_28526",
"LABEL_28527",
"LABEL_28528",
"LABEL_28529",
"LABEL_2853",
"LABEL_28530",
"LABEL_28531",
"LABEL_28532",
"LABEL_28533",
"LABEL_28534",
"LABEL_28535",
"LABEL_28536",
"LABEL_28537",
"LABEL_28538",
"LABEL_28539",
"LABEL_2854",
"LABEL_28540",
"LABEL_28541",
"LABEL_28542",
"LABEL_28543",
"LABEL_28544",
"LABEL_28545",
"LABEL_28546",
"LABEL_28547",
"LABEL_28548",
"LABEL_28549",
"LABEL_2855",
"LABEL_28550",
"LABEL_28551",
"LABEL_28552",
"LABEL_28553",
"LABEL_28554",
"LABEL_28555",
"LABEL_28556",
"LABEL_28557",
"LABEL_28558",
"LABEL_28559",
"LABEL_2856",
"LABEL_28560",
"LABEL_28561",
"LABEL_28562",
"LABEL_28563",
"LABEL_28564",
"LABEL_28565",
"LABEL_28566",
"LABEL_28567",
"LABEL_28568",
"LABEL_28569",
"LABEL_2857",
"LABEL_28570",
"LABEL_28571",
"LABEL_28572",
"LABEL_28573",
"LABEL_28574",
"LABEL_28575",
"LABEL_28576",
"LABEL_28577",
"LABEL_28578",
"LABEL_28579",
"LABEL_2858",
"LABEL_28580",
"LABEL_28581",
"LABEL_28582",
"LABEL_28583",
"LABEL_28584",
"LABEL_28585",
"LABEL_28586",
"LABEL_28587",
"LABEL_28588",
"LABEL_28589",
"LABEL_2859",
"LABEL_28590",
"LABEL_28591",
"LABEL_28592",
"LABEL_28593",
"LABEL_28594",
"LABEL_28595",
"LABEL_28596",
"LABEL_28597",
"LABEL_28598",
"LABEL_28599",
"LABEL_286",
"LABEL_2860",
"LABEL_28600",
"LABEL_28601",
"LABEL_28602",
"LABEL_28603",
"LABEL_28604",
"LABEL_28605",
"LABEL_28606",
"LABEL_28607",
"LABEL_28608",
"LABEL_28609",
"LABEL_2861",
"LABEL_28610",
"LABEL_28611",
"LABEL_28612",
"LABEL_28613",
"LABEL_28614",
"LABEL_28615",
"LABEL_28616",
"LABEL_28617",
"LABEL_28618",
"LABEL_28619",
"LABEL_2862",
"LABEL_28620",
"LABEL_28621",
"LABEL_28622",
"LABEL_28623",
"LABEL_28624",
"LABEL_28625",
"LABEL_28626",
"LABEL_28627",
"LABEL_28628",
"LABEL_28629",
"LABEL_2863",
"LABEL_28630",
"LABEL_28631",
"LABEL_28632",
"LABEL_28633",
"LABEL_28634",
"LABEL_28635",
"LABEL_28636",
"LABEL_28637",
"LABEL_28638",
"LABEL_28639",
"LABEL_2864",
"LABEL_28640",
"LABEL_28641",
"LABEL_28642",
"LABEL_28643",
"LABEL_28644",
"LABEL_28645",
"LABEL_28646",
"LABEL_28647",
"LABEL_28648",
"LABEL_28649",
"LABEL_2865",
"LABEL_28650",
"LABEL_28651",
"LABEL_28652",
"LABEL_28653",
"LABEL_28654",
"LABEL_28655",
"LABEL_28656",
"LABEL_28657",
"LABEL_28658",
"LABEL_28659",
"LABEL_2866",
"LABEL_28660",
"LABEL_28661",
"LABEL_28662",
"LABEL_28663",
"LABEL_28664",
"LABEL_28665",
"LABEL_28666",
"LABEL_28667",
"LABEL_28668",
"LABEL_28669",
"LABEL_2867",
"LABEL_28670",
"LABEL_28671",
"LABEL_28672",
"LABEL_28673",
"LABEL_28674",
"LABEL_28675",
"LABEL_28676",
"LABEL_28677",
"LABEL_28678",
"LABEL_28679",
"LABEL_2868",
"LABEL_28680",
"LABEL_28681",
"LABEL_28682",
"LABEL_28683",
"LABEL_28684",
"LABEL_28685",
"LABEL_28686",
"LABEL_28687",
"LABEL_28688",
"LABEL_28689",
"LABEL_2869",
"LABEL_28690",
"LABEL_28691",
"LABEL_28692",
"LABEL_28693",
"LABEL_28694",
"LABEL_28695",
"LABEL_28696",
"LABEL_28697",
"LABEL_28698",
"LABEL_28699",
"LABEL_287",
"LABEL_2870",
"LABEL_28700",
"LABEL_28701",
"LABEL_28702",
"LABEL_28703",
"LABEL_28704",
"LABEL_28705",
"LABEL_28706",
"LABEL_28707",
"LABEL_28708",
"LABEL_28709",
"LABEL_2871",
"LABEL_28710",
"LABEL_28711",
"LABEL_28712",
"LABEL_28713",
"LABEL_28714",
"LABEL_28715",
"LABEL_28716",
"LABEL_28717",
"LABEL_28718",
"LABEL_28719",
"LABEL_2872",
"LABEL_28720",
"LABEL_28721",
"LABEL_28722",
"LABEL_28723",
"LABEL_28724",
"LABEL_28725",
"LABEL_28726",
"LABEL_28727",
"LABEL_28728",
"LABEL_28729",
"LABEL_2873",
"LABEL_28730",
"LABEL_28731",
"LABEL_28732",
"LABEL_28733",
"LABEL_28734",
"LABEL_28735",
"LABEL_28736",
"LABEL_28737",
"LABEL_28738",
"LABEL_28739",
"LABEL_2874",
"LABEL_28740",
"LABEL_28741",
"LABEL_28742",
"LABEL_28743",
"LABEL_28744",
"LABEL_28745",
"LABEL_28746",
"LABEL_28747",
"LABEL_28748",
"LABEL_28749",
"LABEL_2875",
"LABEL_28750",
"LABEL_28751",
"LABEL_28752",
"LABEL_28753",
"LABEL_28754",
"LABEL_28755",
"LABEL_28756",
"LABEL_28757",
"LABEL_28758",
"LABEL_28759",
"LABEL_2876",
"LABEL_28760",
"LABEL_2877",
"LABEL_2878",
"LABEL_2879",
"LABEL_288",
"LABEL_2880",
"LABEL_2881",
"LABEL_2882",
"LABEL_2883",
"LABEL_2884",
"LABEL_2885",
"LABEL_2886",
"LABEL_2887",
"LABEL_2888",
"LABEL_2889",
"LABEL_289",
"LABEL_2890",
"LABEL_2891",
"LABEL_2892",
"LABEL_2893",
"LABEL_2894",
"LABEL_2895",
"LABEL_2896",
"LABEL_2897",
"LABEL_2898",
"LABEL_2899",
"LABEL_29",
"LABEL_290",
"LABEL_2900",
"LABEL_2901",
"LABEL_2902",
"LABEL_2903",
"LABEL_2904",
"LABEL_2905",
"LABEL_2906",
"LABEL_2907",
"LABEL_2908",
"LABEL_2909",
"LABEL_291",
"LABEL_2910",
"LABEL_2911",
"LABEL_2912",
"LABEL_2913",
"LABEL_2914",
"LABEL_2915",
"LABEL_2916",
"LABEL_2917",
"LABEL_2918",
"LABEL_2919",
"LABEL_292",
"LABEL_2920",
"LABEL_2921",
"LABEL_2922",
"LABEL_2923",
"LABEL_2924",
"LABEL_2925",
"LABEL_2926",
"LABEL_2927",
"LABEL_2928",
"LABEL_2929",
"LABEL_293",
"LABEL_2930",
"LABEL_2931",
"LABEL_2932",
"LABEL_2933",
"LABEL_2934",
"LABEL_2935",
"LABEL_2936",
"LABEL_2937",
"LABEL_2938",
"LABEL_2939",
"LABEL_294",
"LABEL_2940",
"LABEL_2941",
"LABEL_2942",
"LABEL_2943",
"LABEL_2944",
"LABEL_2945",
"LABEL_2946",
"LABEL_2947",
"LABEL_2948",
"LABEL_2949",
"LABEL_295",
"LABEL_2950",
"LABEL_2951",
"LABEL_2952",
"LABEL_2953",
"LABEL_2954",
"LABEL_2955",
"LABEL_2956",
"LABEL_2957",
"LABEL_2958",
"LABEL_2959",
"LABEL_296",
"LABEL_2960",
"LABEL_2961",
"LABEL_2962",
"LABEL_2963",
"LABEL_2964",
"LABEL_2965",
"LABEL_2966",
"LABEL_2967",
"LABEL_2968",
"LABEL_2969",
"LABEL_297",
"LABEL_2970",
"LABEL_2971",
"LABEL_2972",
"LABEL_2973",
"LABEL_2974",
"LABEL_2975",
"LABEL_2976",
"LABEL_2977",
"LABEL_2978",
"LABEL_2979",
"LABEL_298",
"LABEL_2980",
"LABEL_2981",
"LABEL_2982",
"LABEL_2983",
"LABEL_2984",
"LABEL_2985",
"LABEL_2986",
"LABEL_2987",
"LABEL_2988",
"LABEL_2989",
"LABEL_299",
"LABEL_2990",
"LABEL_2991",
"LABEL_2992",
"LABEL_2993",
"LABEL_2994",
"LABEL_2995",
"LABEL_2996",
"LABEL_2997",
"LABEL_2998",
"LABEL_2999",
"LABEL_3",
"LABEL_30",
"LABEL_300",
"LABEL_3000",
"LABEL_3001",
"LABEL_3002",
"LABEL_3003",
"LABEL_3004",
"LABEL_3005",
"LABEL_3006",
"LABEL_3007",
"LABEL_3008",
"LABEL_3009",
"LABEL_301",
"LABEL_3010",
"LABEL_3011",
"LABEL_3012",
"LABEL_3013",
"LABEL_3014",
"LABEL_3015",
"LABEL_3016",
"LABEL_3017",
"LABEL_3018",
"LABEL_3019",
"LABEL_302",
"LABEL_3020",
"LABEL_3021",
"LABEL_3022",
"LABEL_3023",
"LABEL_3024",
"LABEL_3025",
"LABEL_3026",
"LABEL_3027",
"LABEL_3028",
"LABEL_3029",
"LABEL_303",
"LABEL_3030",
"LABEL_3031",
"LABEL_3032",
"LABEL_3033",
"LABEL_3034",
"LABEL_3035",
"LABEL_3036",
"LABEL_3037",
"LABEL_3038",
"LABEL_3039",
"LABEL_304",
"LABEL_3040",
"LABEL_3041",
"LABEL_3042",
"LABEL_3043",
"LABEL_3044",
"LABEL_3045",
"LABEL_3046",
"LABEL_3047",
"LABEL_3048",
"LABEL_3049",
"LABEL_305",
"LABEL_3050",
"LABEL_3051",
"LABEL_3052",
"LABEL_3053",
"LABEL_3054",
"LABEL_3055",
"LABEL_3056",
"LABEL_3057",
"LABEL_3058",
"LABEL_3059",
"LABEL_306",
"LABEL_3060",
"LABEL_3061",
"LABEL_3062",
"LABEL_3063",
"LABEL_3064",
"LABEL_3065",
"LABEL_3066",
"LABEL_3067",
"LABEL_3068",
"LABEL_3069",
"LABEL_307",
"LABEL_3070",
"LABEL_3071",
"LABEL_3072",
"LABEL_3073",
"LABEL_3074",
"LABEL_3075",
"LABEL_3076",
"LABEL_3077",
"LABEL_3078",
"LABEL_3079",
"LABEL_308",
"LABEL_3080",
"LABEL_3081",
"LABEL_3082",
"LABEL_3083",
"LABEL_3084",
"LABEL_3085",
"LABEL_3086",
"LABEL_3087",
"LABEL_3088",
"LABEL_3089",
"LABEL_309",
"LABEL_3090",
"LABEL_3091",
"LABEL_3092",
"LABEL_3093",
"LABEL_3094",
"LABEL_3095",
"LABEL_3096",
"LABEL_3097",
"LABEL_3098",
"LABEL_3099",
"LABEL_31",
"LABEL_310",
"LABEL_3100",
"LABEL_3101",
"LABEL_3102",
"LABEL_3103",
"LABEL_3104",
"LABEL_3105",
"LABEL_3106",
"LABEL_3107",
"LABEL_3108",
"LABEL_3109",
"LABEL_311",
"LABEL_3110",
"LABEL_3111",
"LABEL_3112",
"LABEL_3113",
"LABEL_3114",
"LABEL_3115",
"LABEL_3116",
"LABEL_3117",
"LABEL_3118",
"LABEL_3119",
"LABEL_312",
"LABEL_3120",
"LABEL_3121",
"LABEL_3122",
"LABEL_3123",
"LABEL_3124",
"LABEL_3125",
"LABEL_3126",
"LABEL_3127",
"LABEL_3128",
"LABEL_3129",
"LABEL_313",
"LABEL_3130",
"LABEL_3131",
"LABEL_3132",
"LABEL_3133",
"LABEL_3134",
"LABEL_3135",
"LABEL_3136",
"LABEL_3137",
"LABEL_3138",
"LABEL_3139",
"LABEL_314",
"LABEL_3140",
"LABEL_3141",
"LABEL_3142",
"LABEL_3143",
"LABEL_3144",
"LABEL_3145",
"LABEL_3146",
"LABEL_3147",
"LABEL_3148",
"LABEL_3149",
"LABEL_315",
"LABEL_3150",
"LABEL_3151",
"LABEL_3152",
"LABEL_3153",
"LABEL_3154",
"LABEL_3155",
"LABEL_3156",
"LABEL_3157",
"LABEL_3158",
"LABEL_3159",
"LABEL_316",
"LABEL_3160",
"LABEL_3161",
"LABEL_3162",
"LABEL_3163",
"LABEL_3164",
"LABEL_3165",
"LABEL_3166",
"LABEL_3167",
"LABEL_3168",
"LABEL_3169",
"LABEL_317",
"LABEL_3170",
"LABEL_3171",
"LABEL_3172",
"LABEL_3173",
"LABEL_3174",
"LABEL_3175",
"LABEL_3176",
"LABEL_3177",
"LABEL_3178",
"LABEL_3179",
"LABEL_318",
"LABEL_3180",
"LABEL_3181",
"LABEL_3182",
"LABEL_3183",
"LABEL_3184",
"LABEL_3185",
"LABEL_3186",
"LABEL_3187",
"LABEL_3188",
"LABEL_3189",
"LABEL_319",
"LABEL_3190",
"LABEL_3191",
"LABEL_3192",
"LABEL_3193",
"LABEL_3194",
"LABEL_3195",
"LABEL_3196",
"LABEL_3197",
"LABEL_3198",
"LABEL_3199",
"LABEL_32",
"LABEL_320",
"LABEL_3200",
"LABEL_3201",
"LABEL_3202",
"LABEL_3203",
"LABEL_3204",
"LABEL_3205",
"LABEL_3206",
"LABEL_3207",
"LABEL_3208",
"LABEL_3209",
"LABEL_321",
"LABEL_3210",
"LABEL_3211",
"LABEL_3212",
"LABEL_3213",
"LABEL_3214",
"LABEL_3215",
"LABEL_3216",
"LABEL_3217",
"LABEL_3218",
"LABEL_3219",
"LABEL_322",
"LABEL_3220",
"LABEL_3221",
"LABEL_3222",
"LABEL_3223",
"LABEL_3224",
"LABEL_3225",
"LABEL_3226",
"LABEL_3227",
"LABEL_3228",
"LABEL_3229",
"LABEL_323",
"LABEL_3230",
"LABEL_3231",
"LABEL_3232",
"LABEL_3233",
"LABEL_3234",
"LABEL_3235",
"LABEL_3236",
"LABEL_3237",
"LABEL_3238",
"LABEL_3239",
"LABEL_324",
"LABEL_3240",
"LABEL_3241",
"LABEL_3242",
"LABEL_3243",
"LABEL_3244",
"LABEL_3245",
"LABEL_3246",
"LABEL_3247",
"LABEL_3248",
"LABEL_3249",
"LABEL_325",
"LABEL_3250",
"LABEL_3251",
"LABEL_3252",
"LABEL_3253",
"LABEL_3254",
"LABEL_3255",
"LABEL_3256",
"LABEL_3257",
"LABEL_3258",
"LABEL_3259",
"LABEL_326",
"LABEL_3260",
"LABEL_3261",
"LABEL_3262",
"LABEL_3263",
"LABEL_3264",
"LABEL_3265",
"LABEL_3266",
"LABEL_3267",
"LABEL_3268",
"LABEL_3269",
"LABEL_327",
"LABEL_3270",
"LABEL_3271",
"LABEL_3272",
"LABEL_3273",
"LABEL_3274",
"LABEL_3275",
"LABEL_3276",
"LABEL_3277",
"LABEL_3278",
"LABEL_3279",
"LABEL_328",
"LABEL_3280",
"LABEL_3281",
"LABEL_3282",
"LABEL_3283",
"LABEL_3284",
"LABEL_3285",
"LABEL_3286",
"LABEL_3287",
"LABEL_3288",
"LABEL_3289",
"LABEL_329",
"LABEL_3290",
"LABEL_3291",
"LABEL_3292",
"LABEL_3293",
"LABEL_3294",
"LABEL_3295",
"LABEL_3296",
"LABEL_3297",
"LABEL_3298",
"LABEL_3299",
"LABEL_33",
"LABEL_330",
"LABEL_3300",
"LABEL_3301",
"LABEL_3302",
"LABEL_3303",
"LABEL_3304",
"LABEL_3305",
"LABEL_3306",
"LABEL_3307",
"LABEL_3308",
"LABEL_3309",
"LABEL_331",
"LABEL_3310",
"LABEL_3311",
"LABEL_3312",
"LABEL_3313",
"LABEL_3314",
"LABEL_3315",
"LABEL_3316",
"LABEL_3317",
"LABEL_3318",
"LABEL_3319",
"LABEL_332",
"LABEL_3320",
"LABEL_3321",
"LABEL_3322",
"LABEL_3323",
"LABEL_3324",
"LABEL_3325",
"LABEL_3326",
"LABEL_3327",
"LABEL_3328",
"LABEL_3329",
"LABEL_333",
"LABEL_3330",
"LABEL_3331",
"LABEL_3332",
"LABEL_3333",
"LABEL_3334",
"LABEL_3335",
"LABEL_3336",
"LABEL_3337",
"LABEL_3338",
"LABEL_3339",
"LABEL_334",
"LABEL_3340",
"LABEL_3341",
"LABEL_3342",
"LABEL_3343",
"LABEL_3344",
"LABEL_3345",
"LABEL_3346",
"LABEL_3347",
"LABEL_3348",
"LABEL_3349",
"LABEL_335",
"LABEL_3350",
"LABEL_3351",
"LABEL_3352",
"LABEL_3353",
"LABEL_3354",
"LABEL_3355",
"LABEL_3356",
"LABEL_3357",
"LABEL_3358",
"LABEL_3359",
"LABEL_336",
"LABEL_3360",
"LABEL_3361",
"LABEL_3362",
"LABEL_3363",
"LABEL_3364",
"LABEL_3365",
"LABEL_3366",
"LABEL_3367",
"LABEL_3368",
"LABEL_3369",
"LABEL_337",
"LABEL_3370",
"LABEL_3371",
"LABEL_3372",
"LABEL_3373",
"LABEL_3374",
"LABEL_3375",
"LABEL_3376",
"LABEL_3377",
"LABEL_3378",
"LABEL_3379",
"LABEL_338",
"LABEL_3380",
"LABEL_3381",
"LABEL_3382",
"LABEL_3383",
"LABEL_3384",
"LABEL_3385",
"LABEL_3386",
"LABEL_3387",
"LABEL_3388",
"LABEL_3389",
"LABEL_339",
"LABEL_3390",
"LABEL_3391",
"LABEL_3392",
"LABEL_3393",
"LABEL_3394",
"LABEL_3395",
"LABEL_3396",
"LABEL_3397",
"LABEL_3398",
"LABEL_3399",
"LABEL_34",
"LABEL_340",
"LABEL_3400",
"LABEL_3401",
"LABEL_3402",
"LABEL_3403",
"LABEL_3404",
"LABEL_3405",
"LABEL_3406",
"LABEL_3407",
"LABEL_3408",
"LABEL_3409",
"LABEL_341",
"LABEL_3410",
"LABEL_3411",
"LABEL_3412",
"LABEL_3413",
"LABEL_3414",
"LABEL_3415",
"LABEL_3416",
"LABEL_3417",
"LABEL_3418",
"LABEL_3419",
"LABEL_342",
"LABEL_3420",
"LABEL_3421",
"LABEL_3422",
"LABEL_3423",
"LABEL_3424",
"LABEL_3425",
"LABEL_3426",
"LABEL_3427",
"LABEL_3428",
"LABEL_3429",
"LABEL_343",
"LABEL_3430",
"LABEL_3431",
"LABEL_3432",
"LABEL_3433",
"LABEL_3434",
"LABEL_3435",
"LABEL_3436",
"LABEL_3437",
"LABEL_3438",
"LABEL_3439",
"LABEL_344",
"LABEL_3440",
"LABEL_3441",
"LABEL_3442",
"LABEL_3443",
"LABEL_3444",
"LABEL_3445",
"LABEL_3446",
"LABEL_3447",
"LABEL_3448",
"LABEL_3449",
"LABEL_345",
"LABEL_3450",
"LABEL_3451",
"LABEL_3452",
"LABEL_3453",
"LABEL_3454",
"LABEL_3455",
"LABEL_3456",
"LABEL_3457",
"LABEL_3458",
"LABEL_3459",
"LABEL_346",
"LABEL_3460",
"LABEL_3461",
"LABEL_3462",
"LABEL_3463",
"LABEL_3464",
"LABEL_3465",
"LABEL_3466",
"LABEL_3467",
"LABEL_3468",
"LABEL_3469",
"LABEL_347",
"LABEL_3470",
"LABEL_3471",
"LABEL_3472",
"LABEL_3473",
"LABEL_3474",
"LABEL_3475",
"LABEL_3476",
"LABEL_3477",
"LABEL_3478",
"LABEL_3479",
"LABEL_348",
"LABEL_3480",
"LABEL_3481",
"LABEL_3482",
"LABEL_3483",
"LABEL_3484",
"LABEL_3485",
"LABEL_3486",
"LABEL_3487",
"LABEL_3488",
"LABEL_3489",
"LABEL_349",
"LABEL_3490",
"LABEL_3491",
"LABEL_3492",
"LABEL_3493",
"LABEL_3494",
"LABEL_3495",
"LABEL_3496",
"LABEL_3497",
"LABEL_3498",
"LABEL_3499",
"LABEL_35",
"LABEL_350",
"LABEL_3500",
"LABEL_3501",
"LABEL_3502",
"LABEL_3503",
"LABEL_3504",
"LABEL_3505",
"LABEL_3506",
"LABEL_3507",
"LABEL_3508",
"LABEL_3509",
"LABEL_351",
"LABEL_3510",
"LABEL_3511",
"LABEL_3512",
"LABEL_3513",
"LABEL_3514",
"LABEL_3515",
"LABEL_3516",
"LABEL_3517",
"LABEL_3518",
"LABEL_3519",
"LABEL_352",
"LABEL_3520",
"LABEL_3521",
"LABEL_3522",
"LABEL_3523",
"LABEL_3524",
"LABEL_3525",
"LABEL_3526",
"LABEL_3527",
"LABEL_3528",
"LABEL_3529",
"LABEL_353",
"LABEL_3530",
"LABEL_3531",
"LABEL_3532",
"LABEL_3533",
"LABEL_3534",
"LABEL_3535",
"LABEL_3536",
"LABEL_3537",
"LABEL_3538",
"LABEL_3539",
"LABEL_354",
"LABEL_3540",
"LABEL_3541",
"LABEL_3542",
"LABEL_3543",
"LABEL_3544",
"LABEL_3545",
"LABEL_3546",
"LABEL_3547",
"LABEL_3548",
"LABEL_3549",
"LABEL_355",
"LABEL_3550",
"LABEL_3551",
"LABEL_3552",
"LABEL_3553",
"LABEL_3554",
"LABEL_3555",
"LABEL_3556",
"LABEL_3557",
"LABEL_3558",
"LABEL_3559",
"LABEL_356",
"LABEL_3560",
"LABEL_3561",
"LABEL_3562",
"LABEL_3563",
"LABEL_3564",
"LABEL_3565",
"LABEL_3566",
"LABEL_3567",
"LABEL_3568",
"LABEL_3569",
"LABEL_357",
"LABEL_3570",
"LABEL_3571",
"LABEL_3572",
"LABEL_3573",
"LABEL_3574",
"LABEL_3575",
"LABEL_3576",
"LABEL_3577",
"LABEL_3578",
"LABEL_3579",
"LABEL_358",
"LABEL_3580",
"LABEL_3581",
"LABEL_3582",
"LABEL_3583",
"LABEL_3584",
"LABEL_3585",
"LABEL_3586",
"LABEL_3587",
"LABEL_3588",
"LABEL_3589",
"LABEL_359",
"LABEL_3590",
"LABEL_3591",
"LABEL_3592",
"LABEL_3593",
"LABEL_3594",
"LABEL_3595",
"LABEL_3596",
"LABEL_3597",
"LABEL_3598",
"LABEL_3599",
"LABEL_36",
"LABEL_360",
"LABEL_3600",
"LABEL_3601",
"LABEL_3602",
"LABEL_3603",
"LABEL_3604",
"LABEL_3605",
"LABEL_3606",
"LABEL_3607",
"LABEL_3608",
"LABEL_3609",
"LABEL_361",
"LABEL_3610",
"LABEL_3611",
"LABEL_3612",
"LABEL_3613",
"LABEL_3614",
"LABEL_3615",
"LABEL_3616",
"LABEL_3617",
"LABEL_3618",
"LABEL_3619",
"LABEL_362",
"LABEL_3620",
"LABEL_3621",
"LABEL_3622",
"LABEL_3623",
"LABEL_3624",
"LABEL_3625",
"LABEL_3626",
"LABEL_3627",
"LABEL_3628",
"LABEL_3629",
"LABEL_363",
"LABEL_3630",
"LABEL_3631",
"LABEL_3632",
"LABEL_3633",
"LABEL_3634",
"LABEL_3635",
"LABEL_3636",
"LABEL_3637",
"LABEL_3638",
"LABEL_3639",
"LABEL_364",
"LABEL_3640",
"LABEL_3641",
"LABEL_3642",
"LABEL_3643",
"LABEL_3644",
"LABEL_3645",
"LABEL_3646",
"LABEL_3647",
"LABEL_3648",
"LABEL_3649",
"LABEL_365",
"LABEL_3650",
"LABEL_3651",
"LABEL_3652",
"LABEL_3653",
"LABEL_3654",
"LABEL_3655",
"LABEL_3656",
"LABEL_3657",
"LABEL_3658",
"LABEL_3659",
"LABEL_366",
"LABEL_3660",
"LABEL_3661",
"LABEL_3662",
"LABEL_3663",
"LABEL_3664",
"LABEL_3665",
"LABEL_3666",
"LABEL_3667",
"LABEL_3668",
"LABEL_3669",
"LABEL_367",
"LABEL_3670",
"LABEL_3671",
"LABEL_3672",
"LABEL_3673",
"LABEL_3674",
"LABEL_3675",
"LABEL_3676",
"LABEL_3677",
"LABEL_3678",
"LABEL_3679",
"LABEL_368",
"LABEL_3680",
"LABEL_3681",
"LABEL_3682",
"LABEL_3683",
"LABEL_3684",
"LABEL_3685",
"LABEL_3686",
"LABEL_3687",
"LABEL_3688",
"LABEL_3689",
"LABEL_369",
"LABEL_3690",
"LABEL_3691",
"LABEL_3692",
"LABEL_3693",
"LABEL_3694",
"LABEL_3695",
"LABEL_3696",
"LABEL_3697",
"LABEL_3698",
"LABEL_3699",
"LABEL_37",
"LABEL_370",
"LABEL_3700",
"LABEL_3701",
"LABEL_3702",
"LABEL_3703",
"LABEL_3704",
"LABEL_3705",
"LABEL_3706",
"LABEL_3707",
"LABEL_3708",
"LABEL_3709",
"LABEL_371",
"LABEL_3710",
"LABEL_3711",
"LABEL_3712",
"LABEL_3713",
"LABEL_3714",
"LABEL_3715",
"LABEL_3716",
"LABEL_3717",
"LABEL_3718",
"LABEL_3719",
"LABEL_372",
"LABEL_3720",
"LABEL_3721",
"LABEL_3722",
"LABEL_3723",
"LABEL_3724",
"LABEL_3725",
"LABEL_3726",
"LABEL_3727",
"LABEL_3728",
"LABEL_3729",
"LABEL_373",
"LABEL_3730",
"LABEL_3731",
"LABEL_3732",
"LABEL_3733",
"LABEL_3734",
"LABEL_3735",
"LABEL_3736",
"LABEL_3737",
"LABEL_3738",
"LABEL_3739",
"LABEL_374",
"LABEL_3740",
"LABEL_3741",
"LABEL_3742",
"LABEL_3743",
"LABEL_3744",
"LABEL_3745",
"LABEL_3746",
"LABEL_3747",
"LABEL_3748",
"LABEL_3749",
"LABEL_375",
"LABEL_3750",
"LABEL_3751",
"LABEL_3752",
"LABEL_3753",
"LABEL_3754",
"LABEL_3755",
"LABEL_3756",
"LABEL_3757",
"LABEL_3758",
"LABEL_3759",
"LABEL_376",
"LABEL_3760",
"LABEL_3761",
"LABEL_3762",
"LABEL_3763",
"LABEL_3764",
"LABEL_3765",
"LABEL_3766",
"LABEL_3767",
"LABEL_3768",
"LABEL_3769",
"LABEL_377",
"LABEL_3770",
"LABEL_3771",
"LABEL_3772",
"LABEL_3773",
"LABEL_3774",
"LABEL_3775",
"LABEL_3776",
"LABEL_3777",
"LABEL_3778",
"LABEL_3779",
"LABEL_378",
"LABEL_3780",
"LABEL_3781",
"LABEL_3782",
"LABEL_3783",
"LABEL_3784",
"LABEL_3785",
"LABEL_3786",
"LABEL_3787",
"LABEL_3788",
"LABEL_3789",
"LABEL_379",
"LABEL_3790",
"LABEL_3791",
"LABEL_3792",
"LABEL_3793",
"LABEL_3794",
"LABEL_3795",
"LABEL_3796",
"LABEL_3797",
"LABEL_3798",
"LABEL_3799",
"LABEL_38",
"LABEL_380",
"LABEL_3800",
"LABEL_3801",
"LABEL_3802",
"LABEL_3803",
"LABEL_3804",
"LABEL_3805",
"LABEL_3806",
"LABEL_3807",
"LABEL_3808",
"LABEL_3809",
"LABEL_381",
"LABEL_3810",
"LABEL_3811",
"LABEL_3812",
"LABEL_3813",
"LABEL_3814",
"LABEL_3815",
"LABEL_3816",
"LABEL_3817",
"LABEL_3818",
"LABEL_3819",
"LABEL_382",
"LABEL_3820",
"LABEL_3821",
"LABEL_3822",
"LABEL_3823",
"LABEL_3824",
"LABEL_3825",
"LABEL_3826",
"LABEL_3827",
"LABEL_3828",
"LABEL_3829",
"LABEL_383",
"LABEL_3830",
"LABEL_3831",
"LABEL_3832",
"LABEL_3833",
"LABEL_3834",
"LABEL_3835",
"LABEL_3836",
"LABEL_3837",
"LABEL_3838",
"LABEL_3839",
"LABEL_384",
"LABEL_3840",
"LABEL_3841",
"LABEL_3842",
"LABEL_3843",
"LABEL_3844",
"LABEL_3845",
"LABEL_3846",
"LABEL_3847",
"LABEL_3848",
"LABEL_3849",
"LABEL_385",
"LABEL_3850",
"LABEL_3851",
"LABEL_3852",
"LABEL_3853",
"LABEL_3854",
"LABEL_3855",
"LABEL_3856",
"LABEL_3857",
"LABEL_3858",
"LABEL_3859",
"LABEL_386",
"LABEL_3860",
"LABEL_3861",
"LABEL_3862",
"LABEL_3863",
"LABEL_3864",
"LABEL_3865",
"LABEL_3866",
"LABEL_3867",
"LABEL_3868",
"LABEL_3869",
"LABEL_387",
"LABEL_3870",
"LABEL_3871",
"LABEL_3872",
"LABEL_3873",
"LABEL_3874",
"LABEL_3875",
"LABEL_3876",
"LABEL_3877",
"LABEL_3878",
"LABEL_3879",
"LABEL_388",
"LABEL_3880",
"LABEL_3881",
"LABEL_3882",
"LABEL_3883",
"LABEL_3884",
"LABEL_3885",
"LABEL_3886",
"LABEL_3887",
"LABEL_3888",
"LABEL_3889",
"LABEL_389",
"LABEL_3890",
"LABEL_3891",
"LABEL_3892",
"LABEL_3893",
"LABEL_3894",
"LABEL_3895",
"LABEL_3896",
"LABEL_3897",
"LABEL_3898",
"LABEL_3899",
"LABEL_39",
"LABEL_390",
"LABEL_3900",
"LABEL_3901",
"LABEL_3902",
"LABEL_3903",
"LABEL_3904",
"LABEL_3905",
"LABEL_3906",
"LABEL_3907",
"LABEL_3908",
"LABEL_3909",
"LABEL_391",
"LABEL_3910",
"LABEL_3911",
"LABEL_3912",
"LABEL_3913",
"LABEL_3914",
"LABEL_3915",
"LABEL_3916",
"LABEL_3917",
"LABEL_3918",
"LABEL_3919",
"LABEL_392",
"LABEL_3920",
"LABEL_3921",
"LABEL_3922",
"LABEL_3923",
"LABEL_3924",
"LABEL_3925",
"LABEL_3926",
"LABEL_3927",
"LABEL_3928",
"LABEL_3929",
"LABEL_393",
"LABEL_3930",
"LABEL_3931",
"LABEL_3932",
"LABEL_3933",
"LABEL_3934",
"LABEL_3935",
"LABEL_3936",
"LABEL_3937",
"LABEL_3938",
"LABEL_3939",
"LABEL_394",
"LABEL_3940",
"LABEL_3941",
"LABEL_3942",
"LABEL_3943",
"LABEL_3944",
"LABEL_3945",
"LABEL_3946",
"LABEL_3947",
"LABEL_3948",
"LABEL_3949",
"LABEL_395",
"LABEL_3950",
"LABEL_3951",
"LABEL_3952",
"LABEL_3953",
"LABEL_3954",
"LABEL_3955",
"LABEL_3956",
"LABEL_3957",
"LABEL_3958",
"LABEL_3959",
"LABEL_396",
"LABEL_3960",
"LABEL_3961",
"LABEL_3962",
"LABEL_3963",
"LABEL_3964",
"LABEL_3965",
"LABEL_3966",
"LABEL_3967",
"LABEL_3968",
"LABEL_3969",
"LABEL_397",
"LABEL_3970",
"LABEL_3971",
"LABEL_3972",
"LABEL_3973",
"LABEL_3974",
"LABEL_3975",
"LABEL_3976",
"LABEL_3977",
"LABEL_3978",
"LABEL_3979",
"LABEL_398",
"LABEL_3980",
"LABEL_3981",
"LABEL_3982",
"LABEL_3983",
"LABEL_3984",
"LABEL_3985",
"LABEL_3986",
"LABEL_3987",
"LABEL_3988",
"LABEL_3989",
"LABEL_399",
"LABEL_3990",
"LABEL_3991",
"LABEL_3992",
"LABEL_3993",
"LABEL_3994",
"LABEL_3995",
"LABEL_3996",
"LABEL_3997",
"LABEL_3998",
"LABEL_3999",
"LABEL_4",
"LABEL_40",
"LABEL_400",
"LABEL_4000",
"LABEL_4001",
"LABEL_4002",
"LABEL_4003",
"LABEL_4004",
"LABEL_4005",
"LABEL_4006",
"LABEL_4007",
"LABEL_4008",
"LABEL_4009",
"LABEL_401",
"LABEL_4010",
"LABEL_4011",
"LABEL_4012",
"LABEL_4013",
"LABEL_4014",
"LABEL_4015",
"LABEL_4016",
"LABEL_4017",
"LABEL_4018",
"LABEL_4019",
"LABEL_402",
"LABEL_4020",
"LABEL_4021",
"LABEL_4022",
"LABEL_4023",
"LABEL_4024",
"LABEL_4025",
"LABEL_4026",
"LABEL_4027",
"LABEL_4028",
"LABEL_4029",
"LABEL_403",
"LABEL_4030",
"LABEL_4031",
"LABEL_4032",
"LABEL_4033",
"LABEL_4034",
"LABEL_4035",
"LABEL_4036",
"LABEL_4037",
"LABEL_4038",
"LABEL_4039",
"LABEL_404",
"LABEL_4040",
"LABEL_4041",
"LABEL_4042",
"LABEL_4043",
"LABEL_4044",
"LABEL_4045",
"LABEL_4046",
"LABEL_4047",
"LABEL_4048",
"LABEL_4049",
"LABEL_405",
"LABEL_4050",
"LABEL_4051",
"LABEL_4052",
"LABEL_4053",
"LABEL_4054",
"LABEL_4055",
"LABEL_4056",
"LABEL_4057",
"LABEL_4058",
"LABEL_4059",
"LABEL_406",
"LABEL_4060",
"LABEL_4061",
"LABEL_4062",
"LABEL_4063",
"LABEL_4064",
"LABEL_4065",
"LABEL_4066",
"LABEL_4067",
"LABEL_4068",
"LABEL_4069",
"LABEL_407",
"LABEL_4070",
"LABEL_4071",
"LABEL_4072",
"LABEL_4073",
"LABEL_4074",
"LABEL_4075",
"LABEL_4076",
"LABEL_4077",
"LABEL_4078",
"LABEL_4079",
"LABEL_408",
"LABEL_4080",
"LABEL_4081",
"LABEL_4082",
"LABEL_4083",
"LABEL_4084",
"LABEL_4085",
"LABEL_4086",
"LABEL_4087",
"LABEL_4088",
"LABEL_4089",
"LABEL_409",
"LABEL_4090",
"LABEL_4091",
"LABEL_4092",
"LABEL_4093",
"LABEL_4094",
"LABEL_4095",
"LABEL_4096",
"LABEL_4097",
"LABEL_4098",
"LABEL_4099",
"LABEL_41",
"LABEL_410",
"LABEL_4100",
"LABEL_4101",
"LABEL_4102",
"LABEL_4103",
"LABEL_4104",
"LABEL_4105",
"LABEL_4106",
"LABEL_4107",
"LABEL_4108",
"LABEL_4109",
"LABEL_411",
"LABEL_4110",
"LABEL_4111",
"LABEL_4112",
"LABEL_4113",
"LABEL_4114",
"LABEL_4115",
"LABEL_4116",
"LABEL_4117",
"LABEL_4118",
"LABEL_4119",
"LABEL_412",
"LABEL_4120",
"LABEL_4121",
"LABEL_4122",
"LABEL_4123",
"LABEL_4124",
"LABEL_4125",
"LABEL_4126",
"LABEL_4127",
"LABEL_4128",
"LABEL_4129",
"LABEL_413",
"LABEL_4130",
"LABEL_4131",
"LABEL_4132",
"LABEL_4133",
"LABEL_4134",
"LABEL_4135",
"LABEL_4136",
"LABEL_4137",
"LABEL_4138",
"LABEL_4139",
"LABEL_414",
"LABEL_4140",
"LABEL_4141",
"LABEL_4142",
"LABEL_4143",
"LABEL_4144",
"LABEL_4145",
"LABEL_4146",
"LABEL_4147",
"LABEL_4148",
"LABEL_4149",
"LABEL_415",
"LABEL_4150",
"LABEL_4151",
"LABEL_4152",
"LABEL_4153",
"LABEL_4154",
"LABEL_4155",
"LABEL_4156",
"LABEL_4157",
"LABEL_4158",
"LABEL_4159",
"LABEL_416",
"LABEL_4160",
"LABEL_4161",
"LABEL_4162",
"LABEL_4163",
"LABEL_4164",
"LABEL_4165",
"LABEL_4166",
"LABEL_4167",
"LABEL_4168",
"LABEL_4169",
"LABEL_417",
"LABEL_4170",
"LABEL_4171",
"LABEL_4172",
"LABEL_4173",
"LABEL_4174",
"LABEL_4175",
"LABEL_4176",
"LABEL_4177",
"LABEL_4178",
"LABEL_4179",
"LABEL_418",
"LABEL_4180",
"LABEL_4181",
"LABEL_4182",
"LABEL_4183",
"LABEL_4184",
"LABEL_4185",
"LABEL_4186",
"LABEL_4187",
"LABEL_4188",
"LABEL_4189",
"LABEL_419",
"LABEL_4190",
"LABEL_4191",
"LABEL_4192",
"LABEL_4193",
"LABEL_4194",
"LABEL_4195",
"LABEL_4196",
"LABEL_4197",
"LABEL_4198",
"LABEL_4199",
"LABEL_42",
"LABEL_420",
"LABEL_4200",
"LABEL_4201",
"LABEL_4202",
"LABEL_4203",
"LABEL_4204",
"LABEL_4205",
"LABEL_4206",
"LABEL_4207",
"LABEL_4208",
"LABEL_4209",
"LABEL_421",
"LABEL_4210",
"LABEL_4211",
"LABEL_4212",
"LABEL_4213",
"LABEL_4214",
"LABEL_4215",
"LABEL_4216",
"LABEL_4217",
"LABEL_4218",
"LABEL_4219",
"LABEL_422",
"LABEL_4220",
"LABEL_4221",
"LABEL_4222",
"LABEL_4223",
"LABEL_4224",
"LABEL_4225",
"LABEL_4226",
"LABEL_4227",
"LABEL_4228",
"LABEL_4229",
"LABEL_423",
"LABEL_4230",
"LABEL_4231",
"LABEL_4232",
"LABEL_4233",
"LABEL_4234",
"LABEL_4235",
"LABEL_4236",
"LABEL_4237",
"LABEL_4238",
"LABEL_4239",
"LABEL_424",
"LABEL_4240",
"LABEL_4241",
"LABEL_4242",
"LABEL_4243",
"LABEL_4244",
"LABEL_4245",
"LABEL_4246",
"LABEL_4247",
"LABEL_4248",
"LABEL_4249",
"LABEL_425",
"LABEL_4250",
"LABEL_4251",
"LABEL_4252",
"LABEL_4253",
"LABEL_4254",
"LABEL_4255",
"LABEL_4256",
"LABEL_4257",
"LABEL_4258",
"LABEL_4259",
"LABEL_426",
"LABEL_4260",
"LABEL_4261",
"LABEL_4262",
"LABEL_4263",
"LABEL_4264",
"LABEL_4265",
"LABEL_4266",
"LABEL_4267",
"LABEL_4268",
"LABEL_4269",
"LABEL_427",
"LABEL_4270",
"LABEL_4271",
"LABEL_4272",
"LABEL_4273",
"LABEL_4274",
"LABEL_4275",
"LABEL_4276",
"LABEL_4277",
"LABEL_4278",
"LABEL_4279",
"LABEL_428",
"LABEL_4280",
"LABEL_4281",
"LABEL_4282",
"LABEL_4283",
"LABEL_4284",
"LABEL_4285",
"LABEL_4286",
"LABEL_4287",
"LABEL_4288",
"LABEL_4289",
"LABEL_429",
"LABEL_4290",
"LABEL_4291",
"LABEL_4292",
"LABEL_4293",
"LABEL_4294",
"LABEL_4295",
"LABEL_4296",
"LABEL_4297",
"LABEL_4298",
"LABEL_4299",
"LABEL_43",
"LABEL_430",
"LABEL_4300",
"LABEL_4301",
"LABEL_4302",
"LABEL_4303",
"LABEL_4304",
"LABEL_4305",
"LABEL_4306",
"LABEL_4307",
"LABEL_4308",
"LABEL_4309",
"LABEL_431",
"LABEL_4310",
"LABEL_4311",
"LABEL_4312",
"LABEL_4313",
"LABEL_4314",
"LABEL_4315",
"LABEL_4316",
"LABEL_4317",
"LABEL_4318",
"LABEL_4319",
"LABEL_432",
"LABEL_4320",
"LABEL_4321",
"LABEL_4322",
"LABEL_4323",
"LABEL_4324",
"LABEL_4325",
"LABEL_4326",
"LABEL_4327",
"LABEL_4328",
"LABEL_4329",
"LABEL_433",
"LABEL_4330",
"LABEL_4331",
"LABEL_4332",
"LABEL_4333",
"LABEL_4334",
"LABEL_4335",
"LABEL_4336",
"LABEL_4337",
"LABEL_4338",
"LABEL_4339",
"LABEL_434",
"LABEL_4340",
"LABEL_4341",
"LABEL_4342",
"LABEL_4343",
"LABEL_4344",
"LABEL_4345",
"LABEL_4346",
"LABEL_4347",
"LABEL_4348",
"LABEL_4349",
"LABEL_435",
"LABEL_4350",
"LABEL_4351",
"LABEL_4352",
"LABEL_4353",
"LABEL_4354",
"LABEL_4355",
"LABEL_4356",
"LABEL_4357",
"LABEL_4358",
"LABEL_4359",
"LABEL_436",
"LABEL_4360",
"LABEL_4361",
"LABEL_4362",
"LABEL_4363",
"LABEL_4364",
"LABEL_4365",
"LABEL_4366",
"LABEL_4367",
"LABEL_4368",
"LABEL_4369",
"LABEL_437",
"LABEL_4370",
"LABEL_4371",
"LABEL_4372",
"LABEL_4373",
"LABEL_4374",
"LABEL_4375",
"LABEL_4376",
"LABEL_4377",
"LABEL_4378",
"LABEL_4379",
"LABEL_438",
"LABEL_4380",
"LABEL_4381",
"LABEL_4382",
"LABEL_4383",
"LABEL_4384",
"LABEL_4385",
"LABEL_4386",
"LABEL_4387",
"LABEL_4388",
"LABEL_4389",
"LABEL_439",
"LABEL_4390",
"LABEL_4391",
"LABEL_4392",
"LABEL_4393",
"LABEL_4394",
"LABEL_4395",
"LABEL_4396",
"LABEL_4397",
"LABEL_4398",
"LABEL_4399",
"LABEL_44",
"LABEL_440",
"LABEL_4400",
"LABEL_4401",
"LABEL_4402",
"LABEL_4403",
"LABEL_4404",
"LABEL_4405",
"LABEL_4406",
"LABEL_4407",
"LABEL_4408",
"LABEL_4409",
"LABEL_441",
"LABEL_4410",
"LABEL_4411",
"LABEL_4412",
"LABEL_4413",
"LABEL_4414",
"LABEL_4415",
"LABEL_4416",
"LABEL_4417",
"LABEL_4418",
"LABEL_4419",
"LABEL_442",
"LABEL_4420",
"LABEL_4421",
"LABEL_4422",
"LABEL_4423",
"LABEL_4424",
"LABEL_4425",
"LABEL_4426",
"LABEL_4427",
"LABEL_4428",
"LABEL_4429",
"LABEL_443",
"LABEL_4430",
"LABEL_4431",
"LABEL_4432",
"LABEL_4433",
"LABEL_4434",
"LABEL_4435",
"LABEL_4436",
"LABEL_4437",
"LABEL_4438",
"LABEL_4439",
"LABEL_444",
"LABEL_4440",
"LABEL_4441",
"LABEL_4442",
"LABEL_4443",
"LABEL_4444",
"LABEL_4445",
"LABEL_4446",
"LABEL_4447",
"LABEL_4448",
"LABEL_4449",
"LABEL_445",
"LABEL_4450",
"LABEL_4451",
"LABEL_4452",
"LABEL_4453",
"LABEL_4454",
"LABEL_4455",
"LABEL_4456",
"LABEL_4457",
"LABEL_4458",
"LABEL_4459",
"LABEL_446",
"LABEL_4460",
"LABEL_4461",
"LABEL_4462",
"LABEL_4463",
"LABEL_4464",
"LABEL_4465",
"LABEL_4466",
"LABEL_4467",
"LABEL_4468",
"LABEL_4469",
"LABEL_447",
"LABEL_4470",
"LABEL_4471",
"LABEL_4472",
"LABEL_4473",
"LABEL_4474",
"LABEL_4475",
"LABEL_4476",
"LABEL_4477",
"LABEL_4478",
"LABEL_4479",
"LABEL_448",
"LABEL_4480",
"LABEL_4481",
"LABEL_4482",
"LABEL_4483",
"LABEL_4484",
"LABEL_4485",
"LABEL_4486",
"LABEL_4487",
"LABEL_4488",
"LABEL_4489",
"LABEL_449",
"LABEL_4490",
"LABEL_4491",
"LABEL_4492",
"LABEL_4493",
"LABEL_4494",
"LABEL_4495",
"LABEL_4496",
"LABEL_4497",
"LABEL_4498",
"LABEL_4499",
"LABEL_45",
"LABEL_450",
"LABEL_4500",
"LABEL_4501",
"LABEL_4502",
"LABEL_4503",
"LABEL_4504",
"LABEL_4505",
"LABEL_4506",
"LABEL_4507",
"LABEL_4508",
"LABEL_4509",
"LABEL_451",
"LABEL_4510",
"LABEL_4511",
"LABEL_4512",
"LABEL_4513",
"LABEL_4514",
"LABEL_4515",
"LABEL_4516",
"LABEL_4517",
"LABEL_4518",
"LABEL_4519",
"LABEL_452",
"LABEL_4520",
"LABEL_4521",
"LABEL_4522",
"LABEL_4523",
"LABEL_4524",
"LABEL_4525",
"LABEL_4526",
"LABEL_4527",
"LABEL_4528",
"LABEL_4529",
"LABEL_453",
"LABEL_4530",
"LABEL_4531",
"LABEL_4532",
"LABEL_4533",
"LABEL_4534",
"LABEL_4535",
"LABEL_4536",
"LABEL_4537",
"LABEL_4538",
"LABEL_4539",
"LABEL_454",
"LABEL_4540",
"LABEL_4541",
"LABEL_4542",
"LABEL_4543",
"LABEL_4544",
"LABEL_4545",
"LABEL_4546",
"LABEL_4547",
"LABEL_4548",
"LABEL_4549",
"LABEL_455",
"LABEL_4550",
"LABEL_4551",
"LABEL_4552",
"LABEL_4553",
"LABEL_4554",
"LABEL_4555",
"LABEL_4556",
"LABEL_4557",
"LABEL_4558",
"LABEL_4559",
"LABEL_456",
"LABEL_4560",
"LABEL_4561",
"LABEL_4562",
"LABEL_4563",
"LABEL_4564",
"LABEL_4565",
"LABEL_4566",
"LABEL_4567",
"LABEL_4568",
"LABEL_4569",
"LABEL_457",
"LABEL_4570",
"LABEL_4571",
"LABEL_4572",
"LABEL_4573",
"LABEL_4574",
"LABEL_4575",
"LABEL_4576",
"LABEL_4577",
"LABEL_4578",
"LABEL_4579",
"LABEL_458",
"LABEL_4580",
"LABEL_4581",
"LABEL_4582",
"LABEL_4583",
"LABEL_4584",
"LABEL_4585",
"LABEL_4586",
"LABEL_4587",
"LABEL_4588",
"LABEL_4589",
"LABEL_459",
"LABEL_4590",
"LABEL_4591",
"LABEL_4592",
"LABEL_4593",
"LABEL_4594",
"LABEL_4595",
"LABEL_4596",
"LABEL_4597",
"LABEL_4598",
"LABEL_4599",
"LABEL_46",
"LABEL_460",
"LABEL_4600",
"LABEL_4601",
"LABEL_4602",
"LABEL_4603",
"LABEL_4604",
"LABEL_4605",
"LABEL_4606",
"LABEL_4607",
"LABEL_4608",
"LABEL_4609",
"LABEL_461",
"LABEL_4610",
"LABEL_4611",
"LABEL_4612",
"LABEL_4613",
"LABEL_4614",
"LABEL_4615",
"LABEL_4616",
"LABEL_4617",
"LABEL_4618",
"LABEL_4619",
"LABEL_462",
"LABEL_4620",
"LABEL_4621",
"LABEL_4622",
"LABEL_4623",
"LABEL_4624",
"LABEL_4625",
"LABEL_4626",
"LABEL_4627",
"LABEL_4628",
"LABEL_4629",
"LABEL_463",
"LABEL_4630",
"LABEL_4631",
"LABEL_4632",
"LABEL_4633",
"LABEL_4634",
"LABEL_4635",
"LABEL_4636",
"LABEL_4637",
"LABEL_4638",
"LABEL_4639",
"LABEL_464",
"LABEL_4640",
"LABEL_4641",
"LABEL_4642",
"LABEL_4643",
"LABEL_4644",
"LABEL_4645",
"LABEL_4646",
"LABEL_4647",
"LABEL_4648",
"LABEL_4649",
"LABEL_465",
"LABEL_4650",
"LABEL_4651",
"LABEL_4652",
"LABEL_4653",
"LABEL_4654",
"LABEL_4655",
"LABEL_4656",
"LABEL_4657",
"LABEL_4658",
"LABEL_4659",
"LABEL_466",
"LABEL_4660",
"LABEL_4661",
"LABEL_4662",
"LABEL_4663",
"LABEL_4664",
"LABEL_4665",
"LABEL_4666",
"LABEL_4667",
"LABEL_4668",
"LABEL_4669",
"LABEL_467",
"LABEL_4670",
"LABEL_4671",
"LABEL_4672",
"LABEL_4673",
"LABEL_4674",
"LABEL_4675",
"LABEL_4676",
"LABEL_4677",
"LABEL_4678",
"LABEL_4679",
"LABEL_468",
"LABEL_4680",
"LABEL_4681",
"LABEL_4682",
"LABEL_4683",
"LABEL_4684",
"LABEL_4685",
"LABEL_4686",
"LABEL_4687",
"LABEL_4688",
"LABEL_4689",
"LABEL_469",
"LABEL_4690",
"LABEL_4691",
"LABEL_4692",
"LABEL_4693",
"LABEL_4694",
"LABEL_4695",
"LABEL_4696",
"LABEL_4697",
"LABEL_4698",
"LABEL_4699",
"LABEL_47",
"LABEL_470",
"LABEL_4700",
"LABEL_4701",
"LABEL_4702",
"LABEL_4703",
"LABEL_4704",
"LABEL_4705",
"LABEL_4706",
"LABEL_4707",
"LABEL_4708",
"LABEL_4709",
"LABEL_471",
"LABEL_4710",
"LABEL_4711",
"LABEL_4712",
"LABEL_4713",
"LABEL_4714",
"LABEL_4715",
"LABEL_4716",
"LABEL_4717",
"LABEL_4718",
"LABEL_4719",
"LABEL_472",
"LABEL_4720",
"LABEL_4721",
"LABEL_4722",
"LABEL_4723",
"LABEL_4724",
"LABEL_4725",
"LABEL_4726",
"LABEL_4727",
"LABEL_4728",
"LABEL_4729",
"LABEL_473",
"LABEL_4730",
"LABEL_4731",
"LABEL_4732",
"LABEL_4733",
"LABEL_4734",
"LABEL_4735",
"LABEL_4736",
"LABEL_4737",
"LABEL_4738",
"LABEL_4739",
"LABEL_474",
"LABEL_4740",
"LABEL_4741",
"LABEL_4742",
"LABEL_4743",
"LABEL_4744",
"LABEL_4745",
"LABEL_4746",
"LABEL_4747",
"LABEL_4748",
"LABEL_4749",
"LABEL_475",
"LABEL_4750",
"LABEL_4751",
"LABEL_4752",
"LABEL_4753",
"LABEL_4754",
"LABEL_4755",
"LABEL_4756",
"LABEL_4757",
"LABEL_4758",
"LABEL_4759",
"LABEL_476",
"LABEL_4760",
"LABEL_4761",
"LABEL_4762",
"LABEL_4763",
"LABEL_4764",
"LABEL_4765",
"LABEL_4766",
"LABEL_4767",
"LABEL_4768",
"LABEL_4769",
"LABEL_477",
"LABEL_4770",
"LABEL_4771",
"LABEL_4772",
"LABEL_4773",
"LABEL_4774",
"LABEL_4775",
"LABEL_4776",
"LABEL_4777",
"LABEL_4778",
"LABEL_4779",
"LABEL_478",
"LABEL_4780",
"LABEL_4781",
"LABEL_4782",
"LABEL_4783",
"LABEL_4784",
"LABEL_4785",
"LABEL_4786",
"LABEL_4787",
"LABEL_4788",
"LABEL_4789",
"LABEL_479",
"LABEL_4790",
"LABEL_4791",
"LABEL_4792",
"LABEL_4793",
"LABEL_4794",
"LABEL_4795",
"LABEL_4796",
"LABEL_4797",
"LABEL_4798",
"LABEL_4799",
"LABEL_48",
"LABEL_480",
"LABEL_4800",
"LABEL_4801",
"LABEL_4802",
"LABEL_4803",
"LABEL_4804",
"LABEL_4805",
"LABEL_4806",
"LABEL_4807",
"LABEL_4808",
"LABEL_4809",
"LABEL_481",
"LABEL_4810",
"LABEL_4811",
"LABEL_4812",
"LABEL_4813",
"LABEL_4814",
"LABEL_4815",
"LABEL_4816",
"LABEL_4817",
"LABEL_4818",
"LABEL_4819",
"LABEL_482",
"LABEL_4820",
"LABEL_4821",
"LABEL_4822",
"LABEL_4823",
"LABEL_4824",
"LABEL_4825",
"LABEL_4826",
"LABEL_4827",
"LABEL_4828",
"LABEL_4829",
"LABEL_483",
"LABEL_4830",
"LABEL_4831",
"LABEL_4832",
"LABEL_4833",
"LABEL_4834",
"LABEL_4835",
"LABEL_4836",
"LABEL_4837",
"LABEL_4838",
"LABEL_4839",
"LABEL_484",
"LABEL_4840",
"LABEL_4841",
"LABEL_4842",
"LABEL_4843",
"LABEL_4844",
"LABEL_4845",
"LABEL_4846",
"LABEL_4847",
"LABEL_4848",
"LABEL_4849",
"LABEL_485",
"LABEL_4850",
"LABEL_4851",
"LABEL_4852",
"LABEL_4853",
"LABEL_4854",
"LABEL_4855",
"LABEL_4856",
"LABEL_4857",
"LABEL_4858",
"LABEL_4859",
"LABEL_486",
"LABEL_4860",
"LABEL_4861",
"LABEL_4862",
"LABEL_4863",
"LABEL_4864",
"LABEL_4865",
"LABEL_4866",
"LABEL_4867",
"LABEL_4868",
"LABEL_4869",
"LABEL_487",
"LABEL_4870",
"LABEL_4871",
"LABEL_4872",
"LABEL_4873",
"LABEL_4874",
"LABEL_4875",
"LABEL_4876",
"LABEL_4877",
"LABEL_4878",
"LABEL_4879",
"LABEL_488",
"LABEL_4880",
"LABEL_4881",
"LABEL_4882",
"LABEL_4883",
"LABEL_4884",
"LABEL_4885",
"LABEL_4886",
"LABEL_4887",
"LABEL_4888",
"LABEL_4889",
"LABEL_489",
"LABEL_4890",
"LABEL_4891",
"LABEL_4892",
"LABEL_4893",
"LABEL_4894",
"LABEL_4895",
"LABEL_4896",
"LABEL_4897",
"LABEL_4898",
"LABEL_4899",
"LABEL_49",
"LABEL_490",
"LABEL_4900",
"LABEL_4901",
"LABEL_4902",
"LABEL_4903",
"LABEL_4904",
"LABEL_4905",
"LABEL_4906",
"LABEL_4907",
"LABEL_4908",
"LABEL_4909",
"LABEL_491",
"LABEL_4910",
"LABEL_4911",
"LABEL_4912",
"LABEL_4913",
"LABEL_4914",
"LABEL_4915",
"LABEL_4916",
"LABEL_4917",
"LABEL_4918",
"LABEL_4919",
"LABEL_492",
"LABEL_4920",
"LABEL_4921",
"LABEL_4922",
"LABEL_4923",
"LABEL_4924",
"LABEL_4925",
"LABEL_4926",
"LABEL_4927",
"LABEL_4928",
"LABEL_4929",
"LABEL_493",
"LABEL_4930",
"LABEL_4931",
"LABEL_4932",
"LABEL_4933",
"LABEL_4934",
"LABEL_4935",
"LABEL_4936",
"LABEL_4937",
"LABEL_4938",
"LABEL_4939",
"LABEL_494",
"LABEL_4940",
"LABEL_4941",
"LABEL_4942",
"LABEL_4943",
"LABEL_4944",
"LABEL_4945",
"LABEL_4946",
"LABEL_4947",
"LABEL_4948",
"LABEL_4949",
"LABEL_495",
"LABEL_4950",
"LABEL_4951",
"LABEL_4952",
"LABEL_4953",
"LABEL_4954",
"LABEL_4955",
"LABEL_4956",
"LABEL_4957",
"LABEL_4958",
"LABEL_4959",
"LABEL_496",
"LABEL_4960",
"LABEL_4961",
"LABEL_4962",
"LABEL_4963",
"LABEL_4964",
"LABEL_4965",
"LABEL_4966",
"LABEL_4967",
"LABEL_4968",
"LABEL_4969",
"LABEL_497",
"LABEL_4970",
"LABEL_4971",
"LABEL_4972",
"LABEL_4973",
"LABEL_4974",
"LABEL_4975",
"LABEL_4976",
"LABEL_4977",
"LABEL_4978",
"LABEL_4979",
"LABEL_498",
"LABEL_4980",
"LABEL_4981",
"LABEL_4982",
"LABEL_4983",
"LABEL_4984",
"LABEL_4985",
"LABEL_4986",
"LABEL_4987",
"LABEL_4988",
"LABEL_4989",
"LABEL_499",
"LABEL_4990",
"LABEL_4991",
"LABEL_4992",
"LABEL_4993",
"LABEL_4994",
"LABEL_4995",
"LABEL_4996",
"LABEL_4997",
"LABEL_4998",
"LABEL_4999",
"LABEL_5",
"LABEL_50",
"LABEL_500",
"LABEL_5000",
"LABEL_5001",
"LABEL_5002",
"LABEL_5003",
"LABEL_5004",
"LABEL_5005",
"LABEL_5006",
"LABEL_5007",
"LABEL_5008",
"LABEL_5009",
"LABEL_501",
"LABEL_5010",
"LABEL_5011",
"LABEL_5012",
"LABEL_5013",
"LABEL_5014",
"LABEL_5015",
"LABEL_5016",
"LABEL_5017",
"LABEL_5018",
"LABEL_5019",
"LABEL_502",
"LABEL_5020",
"LABEL_5021",
"LABEL_5022",
"LABEL_5023",
"LABEL_5024",
"LABEL_5025",
"LABEL_5026",
"LABEL_5027",
"LABEL_5028",
"LABEL_5029",
"LABEL_503",
"LABEL_5030",
"LABEL_5031",
"LABEL_5032",
"LABEL_5033",
"LABEL_5034",
"LABEL_5035",
"LABEL_5036",
"LABEL_5037",
"LABEL_5038",
"LABEL_5039",
"LABEL_504",
"LABEL_5040",
"LABEL_5041",
"LABEL_5042",
"LABEL_5043",
"LABEL_5044",
"LABEL_5045",
"LABEL_5046",
"LABEL_5047",
"LABEL_5048",
"LABEL_5049",
"LABEL_505",
"LABEL_5050",
"LABEL_5051",
"LABEL_5052",
"LABEL_5053",
"LABEL_5054",
"LABEL_5055",
"LABEL_5056",
"LABEL_5057",
"LABEL_5058",
"LABEL_5059",
"LABEL_506",
"LABEL_5060",
"LABEL_5061",
"LABEL_5062",
"LABEL_5063",
"LABEL_5064",
"LABEL_5065",
"LABEL_5066",
"LABEL_5067",
"LABEL_5068",
"LABEL_5069",
"LABEL_507",
"LABEL_5070",
"LABEL_5071",
"LABEL_5072",
"LABEL_5073",
"LABEL_5074",
"LABEL_5075",
"LABEL_5076",
"LABEL_5077",
"LABEL_5078",
"LABEL_5079",
"LABEL_508",
"LABEL_5080",
"LABEL_5081",
"LABEL_5082",
"LABEL_5083",
"LABEL_5084",
"LABEL_5085",
"LABEL_5086",
"LABEL_5087",
"LABEL_5088",
"LABEL_5089",
"LABEL_509",
"LABEL_5090",
"LABEL_5091",
"LABEL_5092",
"LABEL_5093",
"LABEL_5094",
"LABEL_5095",
"LABEL_5096",
"LABEL_5097",
"LABEL_5098",
"LABEL_5099",
"LABEL_51",
"LABEL_510",
"LABEL_5100",
"LABEL_5101",
"LABEL_5102",
"LABEL_5103",
"LABEL_5104",
"LABEL_5105",
"LABEL_5106",
"LABEL_5107",
"LABEL_5108",
"LABEL_5109",
"LABEL_511",
"LABEL_5110",
"LABEL_5111",
"LABEL_5112",
"LABEL_5113",
"LABEL_5114",
"LABEL_5115",
"LABEL_5116",
"LABEL_5117",
"LABEL_5118",
"LABEL_5119",
"LABEL_512",
"LABEL_5120",
"LABEL_5121",
"LABEL_5122",
"LABEL_5123",
"LABEL_5124",
"LABEL_5125",
"LABEL_5126",
"LABEL_5127",
"LABEL_5128",
"LABEL_5129",
"LABEL_513",
"LABEL_5130",
"LABEL_5131",
"LABEL_5132",
"LABEL_5133",
"LABEL_5134",
"LABEL_5135",
"LABEL_5136",
"LABEL_5137",
"LABEL_5138",
"LABEL_5139",
"LABEL_514",
"LABEL_5140",
"LABEL_5141",
"LABEL_5142",
"LABEL_5143",
"LABEL_5144",
"LABEL_5145",
"LABEL_5146",
"LABEL_5147",
"LABEL_5148",
"LABEL_5149",
"LABEL_515",
"LABEL_5150",
"LABEL_5151",
"LABEL_5152",
"LABEL_5153",
"LABEL_5154",
"LABEL_5155",
"LABEL_5156",
"LABEL_5157",
"LABEL_5158",
"LABEL_5159",
"LABEL_516",
"LABEL_5160",
"LABEL_5161",
"LABEL_5162",
"LABEL_5163",
"LABEL_5164",
"LABEL_5165",
"LABEL_5166",
"LABEL_5167",
"LABEL_5168",
"LABEL_5169",
"LABEL_517",
"LABEL_5170",
"LABEL_5171",
"LABEL_5172",
"LABEL_5173",
"LABEL_5174",
"LABEL_5175",
"LABEL_5176",
"LABEL_5177",
"LABEL_5178",
"LABEL_5179",
"LABEL_518",
"LABEL_5180",
"LABEL_5181",
"LABEL_5182",
"LABEL_5183",
"LABEL_5184",
"LABEL_5185",
"LABEL_5186",
"LABEL_5187",
"LABEL_5188",
"LABEL_5189",
"LABEL_519",
"LABEL_5190",
"LABEL_5191",
"LABEL_5192",
"LABEL_5193",
"LABEL_5194",
"LABEL_5195",
"LABEL_5196",
"LABEL_5197",
"LABEL_5198",
"LABEL_5199",
"LABEL_52",
"LABEL_520",
"LABEL_5200",
"LABEL_5201",
"LABEL_5202",
"LABEL_5203",
"LABEL_5204",
"LABEL_5205",
"LABEL_5206",
"LABEL_5207",
"LABEL_5208",
"LABEL_5209",
"LABEL_521",
"LABEL_5210",
"LABEL_5211",
"LABEL_5212",
"LABEL_5213",
"LABEL_5214",
"LABEL_5215",
"LABEL_5216",
"LABEL_5217",
"LABEL_5218",
"LABEL_5219",
"LABEL_522",
"LABEL_5220",
"LABEL_5221",
"LABEL_5222",
"LABEL_5223",
"LABEL_5224",
"LABEL_5225",
"LABEL_5226",
"LABEL_5227",
"LABEL_5228",
"LABEL_5229",
"LABEL_523",
"LABEL_5230",
"LABEL_5231",
"LABEL_5232",
"LABEL_5233",
"LABEL_5234",
"LABEL_5235",
"LABEL_5236",
"LABEL_5237",
"LABEL_5238",
"LABEL_5239",
"LABEL_524",
"LABEL_5240",
"LABEL_5241",
"LABEL_5242",
"LABEL_5243",
"LABEL_5244",
"LABEL_5245",
"LABEL_5246",
"LABEL_5247",
"LABEL_5248",
"LABEL_5249",
"LABEL_525",
"LABEL_5250",
"LABEL_5251",
"LABEL_5252",
"LABEL_5253",
"LABEL_5254",
"LABEL_5255",
"LABEL_5256",
"LABEL_5257",
"LABEL_5258",
"LABEL_5259",
"LABEL_526",
"LABEL_5260",
"LABEL_5261",
"LABEL_5262",
"LABEL_5263",
"LABEL_5264",
"LABEL_5265",
"LABEL_5266",
"LABEL_5267",
"LABEL_5268",
"LABEL_5269",
"LABEL_527",
"LABEL_5270",
"LABEL_5271",
"LABEL_5272",
"LABEL_5273",
"LABEL_5274",
"LABEL_5275",
"LABEL_5276",
"LABEL_5277",
"LABEL_5278",
"LABEL_5279",
"LABEL_528",
"LABEL_5280",
"LABEL_5281",
"LABEL_5282",
"LABEL_5283",
"LABEL_5284",
"LABEL_5285",
"LABEL_5286",
"LABEL_5287",
"LABEL_5288",
"LABEL_5289",
"LABEL_529",
"LABEL_5290",
"LABEL_5291",
"LABEL_5292",
"LABEL_5293",
"LABEL_5294",
"LABEL_5295",
"LABEL_5296",
"LABEL_5297",
"LABEL_5298",
"LABEL_5299",
"LABEL_53",
"LABEL_530",
"LABEL_5300",
"LABEL_5301",
"LABEL_5302",
"LABEL_5303",
"LABEL_5304",
"LABEL_5305",
"LABEL_5306",
"LABEL_5307",
"LABEL_5308",
"LABEL_5309",
"LABEL_531",
"LABEL_5310",
"LABEL_5311",
"LABEL_5312",
"LABEL_5313",
"LABEL_5314",
"LABEL_5315",
"LABEL_5316",
"LABEL_5317",
"LABEL_5318",
"LABEL_5319",
"LABEL_532",
"LABEL_5320",
"LABEL_5321",
"LABEL_5322",
"LABEL_5323",
"LABEL_5324",
"LABEL_5325",
"LABEL_5326",
"LABEL_5327",
"LABEL_5328",
"LABEL_5329",
"LABEL_533",
"LABEL_5330",
"LABEL_5331",
"LABEL_5332",
"LABEL_5333",
"LABEL_5334",
"LABEL_5335",
"LABEL_5336",
"LABEL_5337",
"LABEL_5338",
"LABEL_5339",
"LABEL_534",
"LABEL_5340",
"LABEL_5341",
"LABEL_5342",
"LABEL_5343",
"LABEL_5344",
"LABEL_5345",
"LABEL_5346",
"LABEL_5347",
"LABEL_5348",
"LABEL_5349",
"LABEL_535",
"LABEL_5350",
"LABEL_5351",
"LABEL_5352",
"LABEL_5353",
"LABEL_5354",
"LABEL_5355",
"LABEL_5356",
"LABEL_5357",
"LABEL_5358",
"LABEL_5359",
"LABEL_536",
"LABEL_5360",
"LABEL_5361",
"LABEL_5362",
"LABEL_5363",
"LABEL_5364",
"LABEL_5365",
"LABEL_5366",
"LABEL_5367",
"LABEL_5368",
"LABEL_5369",
"LABEL_537",
"LABEL_5370",
"LABEL_5371",
"LABEL_5372",
"LABEL_5373",
"LABEL_5374",
"LABEL_5375",
"LABEL_5376",
"LABEL_5377",
"LABEL_5378",
"LABEL_5379",
"LABEL_538",
"LABEL_5380",
"LABEL_5381",
"LABEL_5382",
"LABEL_5383",
"LABEL_5384",
"LABEL_5385",
"LABEL_5386",
"LABEL_5387",
"LABEL_5388",
"LABEL_5389",
"LABEL_539",
"LABEL_5390",
"LABEL_5391",
"LABEL_5392",
"LABEL_5393",
"LABEL_5394",
"LABEL_5395",
"LABEL_5396",
"LABEL_5397",
"LABEL_5398",
"LABEL_5399",
"LABEL_54",
"LABEL_540",
"LABEL_5400",
"LABEL_5401",
"LABEL_5402",
"LABEL_5403",
"LABEL_5404",
"LABEL_5405",
"LABEL_5406",
"LABEL_5407",
"LABEL_5408",
"LABEL_5409",
"LABEL_541",
"LABEL_5410",
"LABEL_5411",
"LABEL_5412",
"LABEL_5413",
"LABEL_5414",
"LABEL_5415",
"LABEL_5416",
"LABEL_5417",
"LABEL_5418",
"LABEL_5419",
"LABEL_542",
"LABEL_5420",
"LABEL_5421",
"LABEL_5422",
"LABEL_5423",
"LABEL_5424",
"LABEL_5425",
"LABEL_5426",
"LABEL_5427",
"LABEL_5428",
"LABEL_5429",
"LABEL_543",
"LABEL_5430",
"LABEL_5431",
"LABEL_5432",
"LABEL_5433",
"LABEL_5434",
"LABEL_5435",
"LABEL_5436",
"LABEL_5437",
"LABEL_5438",
"LABEL_5439",
"LABEL_544",
"LABEL_5440",
"LABEL_5441",
"LABEL_5442",
"LABEL_5443",
"LABEL_5444",
"LABEL_5445",
"LABEL_5446",
"LABEL_5447",
"LABEL_5448",
"LABEL_5449",
"LABEL_545",
"LABEL_5450",
"LABEL_5451",
"LABEL_5452",
"LABEL_5453",
"LABEL_5454",
"LABEL_5455",
"LABEL_5456",
"LABEL_5457",
"LABEL_5458",
"LABEL_5459",
"LABEL_546",
"LABEL_5460",
"LABEL_5461",
"LABEL_5462",
"LABEL_5463",
"LABEL_5464",
"LABEL_5465",
"LABEL_5466",
"LABEL_5467",
"LABEL_5468",
"LABEL_5469",
"LABEL_547",
"LABEL_5470",
"LABEL_5471",
"LABEL_5472",
"LABEL_5473",
"LABEL_5474",
"LABEL_5475",
"LABEL_5476",
"LABEL_5477",
"LABEL_5478",
"LABEL_5479",
"LABEL_548",
"LABEL_5480",
"LABEL_5481",
"LABEL_5482",
"LABEL_5483",
"LABEL_5484",
"LABEL_5485",
"LABEL_5486",
"LABEL_5487",
"LABEL_5488",
"LABEL_5489",
"LABEL_549",
"LABEL_5490",
"LABEL_5491",
"LABEL_5492",
"LABEL_5493",
"LABEL_5494",
"LABEL_5495",
"LABEL_5496",
"LABEL_5497",
"LABEL_5498",
"LABEL_5499",
"LABEL_55",
"LABEL_550",
"LABEL_5500",
"LABEL_5501",
"LABEL_5502",
"LABEL_5503",
"LABEL_5504",
"LABEL_5505",
"LABEL_5506",
"LABEL_5507",
"LABEL_5508",
"LABEL_5509",
"LABEL_551",
"LABEL_5510",
"LABEL_5511",
"LABEL_5512",
"LABEL_5513",
"LABEL_5514",
"LABEL_5515",
"LABEL_5516",
"LABEL_5517",
"LABEL_5518",
"LABEL_5519",
"LABEL_552",
"LABEL_5520",
"LABEL_5521",
"LABEL_5522",
"LABEL_5523",
"LABEL_5524",
"LABEL_5525",
"LABEL_5526",
"LABEL_5527",
"LABEL_5528",
"LABEL_5529",
"LABEL_553",
"LABEL_5530",
"LABEL_5531",
"LABEL_5532",
"LABEL_5533",
"LABEL_5534",
"LABEL_5535",
"LABEL_5536",
"LABEL_5537",
"LABEL_5538",
"LABEL_5539",
"LABEL_554",
"LABEL_5540",
"LABEL_5541",
"LABEL_5542",
"LABEL_5543",
"LABEL_5544",
"LABEL_5545",
"LABEL_5546",
"LABEL_5547",
"LABEL_5548",
"LABEL_5549",
"LABEL_555",
"LABEL_5550",
"LABEL_5551",
"LABEL_5552",
"LABEL_5553",
"LABEL_5554",
"LABEL_5555",
"LABEL_5556",
"LABEL_5557",
"LABEL_5558",
"LABEL_5559",
"LABEL_556",
"LABEL_5560",
"LABEL_5561",
"LABEL_5562",
"LABEL_5563",
"LABEL_5564",
"LABEL_5565",
"LABEL_5566",
"LABEL_5567",
"LABEL_5568",
"LABEL_5569",
"LABEL_557",
"LABEL_5570",
"LABEL_5571",
"LABEL_5572",
"LABEL_5573",
"LABEL_5574",
"LABEL_5575",
"LABEL_5576",
"LABEL_5577",
"LABEL_5578",
"LABEL_5579",
"LABEL_558",
"LABEL_5580",
"LABEL_5581",
"LABEL_5582",
"LABEL_5583",
"LABEL_5584",
"LABEL_5585",
"LABEL_5586",
"LABEL_5587",
"LABEL_5588",
"LABEL_5589",
"LABEL_559",
"LABEL_5590",
"LABEL_5591",
"LABEL_5592",
"LABEL_5593",
"LABEL_5594",
"LABEL_5595",
"LABEL_5596",
"LABEL_5597",
"LABEL_5598",
"LABEL_5599",
"LABEL_56",
"LABEL_560",
"LABEL_5600",
"LABEL_5601",
"LABEL_5602",
"LABEL_5603",
"LABEL_5604",
"LABEL_5605",
"LABEL_5606",
"LABEL_5607",
"LABEL_5608",
"LABEL_5609",
"LABEL_561",
"LABEL_5610",
"LABEL_5611",
"LABEL_5612",
"LABEL_5613",
"LABEL_5614",
"LABEL_5615",
"LABEL_5616",
"LABEL_5617",
"LABEL_5618",
"LABEL_5619",
"LABEL_562",
"LABEL_5620",
"LABEL_5621",
"LABEL_5622",
"LABEL_5623",
"LABEL_5624",
"LABEL_5625",
"LABEL_5626",
"LABEL_5627",
"LABEL_5628",
"LABEL_5629",
"LABEL_563",
"LABEL_5630",
"LABEL_5631",
"LABEL_5632",
"LABEL_5633",
"LABEL_5634",
"LABEL_5635",
"LABEL_5636",
"LABEL_5637",
"LABEL_5638",
"LABEL_5639",
"LABEL_564",
"LABEL_5640",
"LABEL_5641",
"LABEL_5642",
"LABEL_5643",
"LABEL_5644",
"LABEL_5645",
"LABEL_5646",
"LABEL_5647",
"LABEL_5648",
"LABEL_5649",
"LABEL_565",
"LABEL_5650",
"LABEL_5651",
"LABEL_5652",
"LABEL_5653",
"LABEL_5654",
"LABEL_5655",
"LABEL_5656",
"LABEL_5657",
"LABEL_5658",
"LABEL_5659",
"LABEL_566",
"LABEL_5660",
"LABEL_5661",
"LABEL_5662",
"LABEL_5663",
"LABEL_5664",
"LABEL_5665",
"LABEL_5666",
"LABEL_5667",
"LABEL_5668",
"LABEL_5669",
"LABEL_567",
"LABEL_5670",
"LABEL_5671",
"LABEL_5672",
"LABEL_5673",
"LABEL_5674",
"LABEL_5675",
"LABEL_5676",
"LABEL_5677",
"LABEL_5678",
"LABEL_5679",
"LABEL_568",
"LABEL_5680",
"LABEL_5681",
"LABEL_5682",
"LABEL_5683",
"LABEL_5684",
"LABEL_5685",
"LABEL_5686",
"LABEL_5687",
"LABEL_5688",
"LABEL_5689",
"LABEL_569",
"LABEL_5690",
"LABEL_5691",
"LABEL_5692",
"LABEL_5693",
"LABEL_5694",
"LABEL_5695",
"LABEL_5696",
"LABEL_5697",
"LABEL_5698",
"LABEL_5699",
"LABEL_57",
"LABEL_570",
"LABEL_5700",
"LABEL_5701",
"LABEL_5702",
"LABEL_5703",
"LABEL_5704",
"LABEL_5705",
"LABEL_5706",
"LABEL_5707",
"LABEL_5708",
"LABEL_5709",
"LABEL_571",
"LABEL_5710",
"LABEL_5711",
"LABEL_5712",
"LABEL_5713",
"LABEL_5714",
"LABEL_5715",
"LABEL_5716",
"LABEL_5717",
"LABEL_5718",
"LABEL_5719",
"LABEL_572",
"LABEL_5720",
"LABEL_5721",
"LABEL_5722",
"LABEL_5723",
"LABEL_5724",
"LABEL_5725",
"LABEL_5726",
"LABEL_5727",
"LABEL_5728",
"LABEL_5729",
"LABEL_573",
"LABEL_5730",
"LABEL_5731",
"LABEL_5732",
"LABEL_5733",
"LABEL_5734",
"LABEL_5735",
"LABEL_5736",
"LABEL_5737",
"LABEL_5738",
"LABEL_5739",
"LABEL_574",
"LABEL_5740",
"LABEL_5741",
"LABEL_5742",
"LABEL_5743",
"LABEL_5744",
"LABEL_5745",
"LABEL_5746",
"LABEL_5747",
"LABEL_5748",
"LABEL_5749",
"LABEL_575",
"LABEL_5750",
"LABEL_5751",
"LABEL_5752",
"LABEL_5753",
"LABEL_5754",
"LABEL_5755",
"LABEL_5756",
"LABEL_5757",
"LABEL_5758",
"LABEL_5759",
"LABEL_576",
"LABEL_5760",
"LABEL_5761",
"LABEL_5762",
"LABEL_5763",
"LABEL_5764",
"LABEL_5765",
"LABEL_5766",
"LABEL_5767",
"LABEL_5768",
"LABEL_5769",
"LABEL_577",
"LABEL_5770",
"LABEL_5771",
"LABEL_5772",
"LABEL_5773",
"LABEL_5774",
"LABEL_5775",
"LABEL_5776",
"LABEL_5777",
"LABEL_5778",
"LABEL_5779",
"LABEL_578",
"LABEL_5780",
"LABEL_5781",
"LABEL_5782",
"LABEL_5783",
"LABEL_5784",
"LABEL_5785",
"LABEL_5786",
"LABEL_5787",
"LABEL_5788",
"LABEL_5789",
"LABEL_579",
"LABEL_5790",
"LABEL_5791",
"LABEL_5792",
"LABEL_5793",
"LABEL_5794",
"LABEL_5795",
"LABEL_5796",
"LABEL_5797",
"LABEL_5798",
"LABEL_5799",
"LABEL_58",
"LABEL_580",
"LABEL_5800",
"LABEL_5801",
"LABEL_5802",
"LABEL_5803",
"LABEL_5804",
"LABEL_5805",
"LABEL_5806",
"LABEL_5807",
"LABEL_5808",
"LABEL_5809",
"LABEL_581",
"LABEL_5810",
"LABEL_5811",
"LABEL_5812",
"LABEL_5813",
"LABEL_5814",
"LABEL_5815",
"LABEL_5816",
"LABEL_5817",
"LABEL_5818",
"LABEL_5819",
"LABEL_582",
"LABEL_5820",
"LABEL_5821",
"LABEL_5822",
"LABEL_5823",
"LABEL_5824",
"LABEL_5825",
"LABEL_5826",
"LABEL_5827",
"LABEL_5828",
"LABEL_5829",
"LABEL_583",
"LABEL_5830",
"LABEL_5831",
"LABEL_5832",
"LABEL_5833",
"LABEL_5834",
"LABEL_5835",
"LABEL_5836",
"LABEL_5837",
"LABEL_5838",
"LABEL_5839",
"LABEL_584",
"LABEL_5840",
"LABEL_5841",
"LABEL_5842",
"LABEL_5843",
"LABEL_5844",
"LABEL_5845",
"LABEL_5846",
"LABEL_5847",
"LABEL_5848",
"LABEL_5849",
"LABEL_585",
"LABEL_5850",
"LABEL_5851",
"LABEL_5852",
"LABEL_5853",
"LABEL_5854",
"LABEL_5855",
"LABEL_5856",
"LABEL_5857",
"LABEL_5858",
"LABEL_5859",
"LABEL_586",
"LABEL_5860",
"LABEL_5861",
"LABEL_5862",
"LABEL_5863",
"LABEL_5864",
"LABEL_5865",
"LABEL_5866",
"LABEL_5867",
"LABEL_5868",
"LABEL_5869",
"LABEL_587",
"LABEL_5870",
"LABEL_5871",
"LABEL_5872",
"LABEL_5873",
"LABEL_5874",
"LABEL_5875",
"LABEL_5876",
"LABEL_5877",
"LABEL_5878",
"LABEL_5879",
"LABEL_588",
"LABEL_5880",
"LABEL_5881",
"LABEL_5882",
"LABEL_5883",
"LABEL_5884",
"LABEL_5885",
"LABEL_5886",
"LABEL_5887",
"LABEL_5888",
"LABEL_5889",
"LABEL_589",
"LABEL_5890",
"LABEL_5891",
"LABEL_5892",
"LABEL_5893",
"LABEL_5894",
"LABEL_5895",
"LABEL_5896",
"LABEL_5897",
"LABEL_5898",
"LABEL_5899",
"LABEL_59",
"LABEL_590",
"LABEL_5900",
"LABEL_5901",
"LABEL_5902",
"LABEL_5903",
"LABEL_5904",
"LABEL_5905",
"LABEL_5906",
"LABEL_5907",
"LABEL_5908",
"LABEL_5909",
"LABEL_591",
"LABEL_5910",
"LABEL_5911",
"LABEL_5912",
"LABEL_5913",
"LABEL_5914",
"LABEL_5915",
"LABEL_5916",
"LABEL_5917",
"LABEL_5918",
"LABEL_5919",
"LABEL_592",
"LABEL_5920",
"LABEL_5921",
"LABEL_5922",
"LABEL_5923",
"LABEL_5924",
"LABEL_5925",
"LABEL_5926",
"LABEL_5927",
"LABEL_5928",
"LABEL_5929",
"LABEL_593",
"LABEL_5930",
"LABEL_5931",
"LABEL_5932",
"LABEL_5933",
"LABEL_5934",
"LABEL_5935",
"LABEL_5936",
"LABEL_5937",
"LABEL_5938",
"LABEL_5939",
"LABEL_594",
"LABEL_5940",
"LABEL_5941",
"LABEL_5942",
"LABEL_5943",
"LABEL_5944",
"LABEL_5945",
"LABEL_5946",
"LABEL_5947",
"LABEL_5948",
"LABEL_5949",
"LABEL_595",
"LABEL_5950",
"LABEL_5951",
"LABEL_5952",
"LABEL_5953",
"LABEL_5954",
"LABEL_5955",
"LABEL_5956",
"LABEL_5957",
"LABEL_5958",
"LABEL_5959",
"LABEL_596",
"LABEL_5960",
"LABEL_5961",
"LABEL_5962",
"LABEL_5963",
"LABEL_5964",
"LABEL_5965",
"LABEL_5966",
"LABEL_5967",
"LABEL_5968",
"LABEL_5969",
"LABEL_597",
"LABEL_5970",
"LABEL_5971",
"LABEL_5972",
"LABEL_5973",
"LABEL_5974",
"LABEL_5975",
"LABEL_5976",
"LABEL_5977",
"LABEL_5978",
"LABEL_5979",
"LABEL_598",
"LABEL_5980",
"LABEL_5981",
"LABEL_5982",
"LABEL_5983",
"LABEL_5984",
"LABEL_5985",
"LABEL_5986",
"LABEL_5987",
"LABEL_5988",
"LABEL_5989",
"LABEL_599",
"LABEL_5990",
"LABEL_5991",
"LABEL_5992",
"LABEL_5993",
"LABEL_5994",
"LABEL_5995",
"LABEL_5996",
"LABEL_5997",
"LABEL_5998",
"LABEL_5999",
"LABEL_6",
"LABEL_60",
"LABEL_600",
"LABEL_6000",
"LABEL_6001",
"LABEL_6002",
"LABEL_6003",
"LABEL_6004",
"LABEL_6005",
"LABEL_6006",
"LABEL_6007",
"LABEL_6008",
"LABEL_6009",
"LABEL_601",
"LABEL_6010",
"LABEL_6011",
"LABEL_6012",
"LABEL_6013",
"LABEL_6014",
"LABEL_6015",
"LABEL_6016",
"LABEL_6017",
"LABEL_6018",
"LABEL_6019",
"LABEL_602",
"LABEL_6020",
"LABEL_6021",
"LABEL_6022",
"LABEL_6023",
"LABEL_6024",
"LABEL_6025",
"LABEL_6026",
"LABEL_6027",
"LABEL_6028",
"LABEL_6029",
"LABEL_603",
"LABEL_6030",
"LABEL_6031",
"LABEL_6032",
"LABEL_6033",
"LABEL_6034",
"LABEL_6035",
"LABEL_6036",
"LABEL_6037",
"LABEL_6038",
"LABEL_6039",
"LABEL_604",
"LABEL_6040",
"LABEL_6041",
"LABEL_6042",
"LABEL_6043",
"LABEL_6044",
"LABEL_6045",
"LABEL_6046",
"LABEL_6047",
"LABEL_6048",
"LABEL_6049",
"LABEL_605",
"LABEL_6050",
"LABEL_6051",
"LABEL_6052",
"LABEL_6053",
"LABEL_6054",
"LABEL_6055",
"LABEL_6056",
"LABEL_6057",
"LABEL_6058",
"LABEL_6059",
"LABEL_606",
"LABEL_6060",
"LABEL_6061",
"LABEL_6062",
"LABEL_6063",
"LABEL_6064",
"LABEL_6065",
"LABEL_6066",
"LABEL_6067",
"LABEL_6068",
"LABEL_6069",
"LABEL_607",
"LABEL_6070",
"LABEL_6071",
"LABEL_6072",
"LABEL_6073",
"LABEL_6074",
"LABEL_6075",
"LABEL_6076",
"LABEL_6077",
"LABEL_6078",
"LABEL_6079",
"LABEL_608",
"LABEL_6080",
"LABEL_6081",
"LABEL_6082",
"LABEL_6083",
"LABEL_6084",
"LABEL_6085",
"LABEL_6086",
"LABEL_6087",
"LABEL_6088",
"LABEL_6089",
"LABEL_609",
"LABEL_6090",
"LABEL_6091",
"LABEL_6092",
"LABEL_6093",
"LABEL_6094",
"LABEL_6095",
"LABEL_6096",
"LABEL_6097",
"LABEL_6098",
"LABEL_6099",
"LABEL_61",
"LABEL_610",
"LABEL_6100",
"LABEL_6101",
"LABEL_6102",
"LABEL_6103",
"LABEL_6104",
"LABEL_6105",
"LABEL_6106",
"LABEL_6107",
"LABEL_6108",
"LABEL_6109",
"LABEL_611",
"LABEL_6110",
"LABEL_6111",
"LABEL_6112",
"LABEL_6113",
"LABEL_6114",
"LABEL_6115",
"LABEL_6116",
"LABEL_6117",
"LABEL_6118",
"LABEL_6119",
"LABEL_612",
"LABEL_6120",
"LABEL_6121",
"LABEL_6122",
"LABEL_6123",
"LABEL_6124",
"LABEL_6125",
"LABEL_6126",
"LABEL_6127",
"LABEL_6128",
"LABEL_6129",
"LABEL_613",
"LABEL_6130",
"LABEL_6131",
"LABEL_6132",
"LABEL_6133",
"LABEL_6134",
"LABEL_6135",
"LABEL_6136",
"LABEL_6137",
"LABEL_6138",
"LABEL_6139",
"LABEL_614",
"LABEL_6140",
"LABEL_6141",
"LABEL_6142",
"LABEL_6143",
"LABEL_6144",
"LABEL_6145",
"LABEL_6146",
"LABEL_6147",
"LABEL_6148",
"LABEL_6149",
"LABEL_615",
"LABEL_6150",
"LABEL_6151",
"LABEL_6152",
"LABEL_6153",
"LABEL_6154",
"LABEL_6155",
"LABEL_6156",
"LABEL_6157",
"LABEL_6158",
"LABEL_6159",
"LABEL_616",
"LABEL_6160",
"LABEL_6161",
"LABEL_6162",
"LABEL_6163",
"LABEL_6164",
"LABEL_6165",
"LABEL_6166",
"LABEL_6167",
"LABEL_6168",
"LABEL_6169",
"LABEL_617",
"LABEL_6170",
"LABEL_6171",
"LABEL_6172",
"LABEL_6173",
"LABEL_6174",
"LABEL_6175",
"LABEL_6176",
"LABEL_6177",
"LABEL_6178",
"LABEL_6179",
"LABEL_618",
"LABEL_6180",
"LABEL_6181",
"LABEL_6182",
"LABEL_6183",
"LABEL_6184",
"LABEL_6185",
"LABEL_6186",
"LABEL_6187",
"LABEL_6188",
"LABEL_6189",
"LABEL_619",
"LABEL_6190",
"LABEL_6191",
"LABEL_6192",
"LABEL_6193",
"LABEL_6194",
"LABEL_6195",
"LABEL_6196",
"LABEL_6197",
"LABEL_6198",
"LABEL_6199",
"LABEL_62",
"LABEL_620",
"LABEL_6200",
"LABEL_6201",
"LABEL_6202",
"LABEL_6203",
"LABEL_6204",
"LABEL_6205",
"LABEL_6206",
"LABEL_6207",
"LABEL_6208",
"LABEL_6209",
"LABEL_621",
"LABEL_6210",
"LABEL_6211",
"LABEL_6212",
"LABEL_6213",
"LABEL_6214",
"LABEL_6215",
"LABEL_6216",
"LABEL_6217",
"LABEL_6218",
"LABEL_6219",
"LABEL_622",
"LABEL_6220",
"LABEL_6221",
"LABEL_6222",
"LABEL_6223",
"LABEL_6224",
"LABEL_6225",
"LABEL_6226",
"LABEL_6227",
"LABEL_6228",
"LABEL_6229",
"LABEL_623",
"LABEL_6230",
"LABEL_6231",
"LABEL_6232",
"LABEL_6233",
"LABEL_6234",
"LABEL_6235",
"LABEL_6236",
"LABEL_6237",
"LABEL_6238",
"LABEL_6239",
"LABEL_624",
"LABEL_6240",
"LABEL_6241",
"LABEL_6242",
"LABEL_6243",
"LABEL_6244",
"LABEL_6245",
"LABEL_6246",
"LABEL_6247",
"LABEL_6248",
"LABEL_6249",
"LABEL_625",
"LABEL_6250",
"LABEL_6251",
"LABEL_6252",
"LABEL_6253",
"LABEL_6254",
"LABEL_6255",
"LABEL_6256",
"LABEL_6257",
"LABEL_6258",
"LABEL_6259",
"LABEL_626",
"LABEL_6260",
"LABEL_6261",
"LABEL_6262",
"LABEL_6263",
"LABEL_6264",
"LABEL_6265",
"LABEL_6266",
"LABEL_6267",
"LABEL_6268",
"LABEL_6269",
"LABEL_627",
"LABEL_6270",
"LABEL_6271",
"LABEL_6272",
"LABEL_6273",
"LABEL_6274",
"LABEL_6275",
"LABEL_6276",
"LABEL_6277",
"LABEL_6278",
"LABEL_6279",
"LABEL_628",
"LABEL_6280",
"LABEL_6281",
"LABEL_6282",
"LABEL_6283",
"LABEL_6284",
"LABEL_6285",
"LABEL_6286",
"LABEL_6287",
"LABEL_6288",
"LABEL_6289",
"LABEL_629",
"LABEL_6290",
"LABEL_6291",
"LABEL_6292",
"LABEL_6293",
"LABEL_6294",
"LABEL_6295",
"LABEL_6296",
"LABEL_6297",
"LABEL_6298",
"LABEL_6299",
"LABEL_63",
"LABEL_630",
"LABEL_6300",
"LABEL_6301",
"LABEL_6302",
"LABEL_6303",
"LABEL_6304",
"LABEL_6305",
"LABEL_6306",
"LABEL_6307",
"LABEL_6308",
"LABEL_6309",
"LABEL_631",
"LABEL_6310",
"LABEL_6311",
"LABEL_6312",
"LABEL_6313",
"LABEL_6314",
"LABEL_6315",
"LABEL_6316",
"LABEL_6317",
"LABEL_6318",
"LABEL_6319",
"LABEL_632",
"LABEL_6320",
"LABEL_6321",
"LABEL_6322",
"LABEL_6323",
"LABEL_6324",
"LABEL_6325",
"LABEL_6326",
"LABEL_6327",
"LABEL_6328",
"LABEL_6329",
"LABEL_633",
"LABEL_6330",
"LABEL_6331",
"LABEL_6332",
"LABEL_6333",
"LABEL_6334",
"LABEL_6335",
"LABEL_6336",
"LABEL_6337",
"LABEL_6338",
"LABEL_6339",
"LABEL_634",
"LABEL_6340",
"LABEL_6341",
"LABEL_6342",
"LABEL_6343",
"LABEL_6344",
"LABEL_6345",
"LABEL_6346",
"LABEL_6347",
"LABEL_6348",
"LABEL_6349",
"LABEL_635",
"LABEL_6350",
"LABEL_6351",
"LABEL_6352",
"LABEL_6353",
"LABEL_6354",
"LABEL_6355",
"LABEL_6356",
"LABEL_6357",
"LABEL_6358",
"LABEL_6359",
"LABEL_636",
"LABEL_6360",
"LABEL_6361",
"LABEL_6362",
"LABEL_6363",
"LABEL_6364",
"LABEL_6365",
"LABEL_6366",
"LABEL_6367",
"LABEL_6368",
"LABEL_6369",
"LABEL_637",
"LABEL_6370",
"LABEL_6371",
"LABEL_6372",
"LABEL_6373",
"LABEL_6374",
"LABEL_6375",
"LABEL_6376",
"LABEL_6377",
"LABEL_6378",
"LABEL_6379",
"LABEL_638",
"LABEL_6380",
"LABEL_6381",
"LABEL_6382",
"LABEL_6383",
"LABEL_6384",
"LABEL_6385",
"LABEL_6386",
"LABEL_6387",
"LABEL_6388",
"LABEL_6389",
"LABEL_639",
"LABEL_6390",
"LABEL_6391",
"LABEL_6392",
"LABEL_6393",
"LABEL_6394",
"LABEL_6395",
"LABEL_6396",
"LABEL_6397",
"LABEL_6398",
"LABEL_6399",
"LABEL_64",
"LABEL_640",
"LABEL_6400",
"LABEL_6401",
"LABEL_6402",
"LABEL_6403",
"LABEL_6404",
"LABEL_6405",
"LABEL_6406",
"LABEL_6407",
"LABEL_6408",
"LABEL_6409",
"LABEL_641",
"LABEL_6410",
"LABEL_6411",
"LABEL_6412",
"LABEL_6413",
"LABEL_6414",
"LABEL_6415",
"LABEL_6416",
"LABEL_6417",
"LABEL_6418",
"LABEL_6419",
"LABEL_642",
"LABEL_6420",
"LABEL_6421",
"LABEL_6422",
"LABEL_6423",
"LABEL_6424",
"LABEL_6425",
"LABEL_6426",
"LABEL_6427",
"LABEL_6428",
"LABEL_6429",
"LABEL_643",
"LABEL_6430",
"LABEL_6431",
"LABEL_6432",
"LABEL_6433",
"LABEL_6434",
"LABEL_6435",
"LABEL_6436",
"LABEL_6437",
"LABEL_6438",
"LABEL_6439",
"LABEL_644",
"LABEL_6440",
"LABEL_6441",
"LABEL_6442",
"LABEL_6443",
"LABEL_6444",
"LABEL_6445",
"LABEL_6446",
"LABEL_6447",
"LABEL_6448",
"LABEL_6449",
"LABEL_645",
"LABEL_6450",
"LABEL_6451",
"LABEL_6452",
"LABEL_6453",
"LABEL_6454",
"LABEL_6455",
"LABEL_6456",
"LABEL_6457",
"LABEL_6458",
"LABEL_6459",
"LABEL_646",
"LABEL_6460",
"LABEL_6461",
"LABEL_6462",
"LABEL_6463",
"LABEL_6464",
"LABEL_6465",
"LABEL_6466",
"LABEL_6467",
"LABEL_6468",
"LABEL_6469",
"LABEL_647",
"LABEL_6470",
"LABEL_6471",
"LABEL_6472",
"LABEL_6473",
"LABEL_6474",
"LABEL_6475",
"LABEL_6476",
"LABEL_6477",
"LABEL_6478",
"LABEL_6479",
"LABEL_648",
"LABEL_6480",
"LABEL_6481",
"LABEL_6482",
"LABEL_6483",
"LABEL_6484",
"LABEL_6485",
"LABEL_6486",
"LABEL_6487",
"LABEL_6488",
"LABEL_6489",
"LABEL_649",
"LABEL_6490",
"LABEL_6491",
"LABEL_6492",
"LABEL_6493",
"LABEL_6494",
"LABEL_6495",
"LABEL_6496",
"LABEL_6497",
"LABEL_6498",
"LABEL_6499",
"LABEL_65",
"LABEL_650",
"LABEL_6500",
"LABEL_6501",
"LABEL_6502",
"LABEL_6503",
"LABEL_6504",
"LABEL_6505",
"LABEL_6506",
"LABEL_6507",
"LABEL_6508",
"LABEL_6509",
"LABEL_651",
"LABEL_6510",
"LABEL_6511",
"LABEL_6512",
"LABEL_6513",
"LABEL_6514",
"LABEL_6515",
"LABEL_6516",
"LABEL_6517",
"LABEL_6518",
"LABEL_6519",
"LABEL_652",
"LABEL_6520",
"LABEL_6521",
"LABEL_6522",
"LABEL_6523",
"LABEL_6524",
"LABEL_6525",
"LABEL_6526",
"LABEL_6527",
"LABEL_6528",
"LABEL_6529",
"LABEL_653",
"LABEL_6530",
"LABEL_6531",
"LABEL_6532",
"LABEL_6533",
"LABEL_6534",
"LABEL_6535",
"LABEL_6536",
"LABEL_6537",
"LABEL_6538",
"LABEL_6539",
"LABEL_654",
"LABEL_6540",
"LABEL_6541",
"LABEL_6542",
"LABEL_6543",
"LABEL_6544",
"LABEL_6545",
"LABEL_6546",
"LABEL_6547",
"LABEL_6548",
"LABEL_6549",
"LABEL_655",
"LABEL_6550",
"LABEL_6551",
"LABEL_6552",
"LABEL_6553",
"LABEL_6554",
"LABEL_6555",
"LABEL_6556",
"LABEL_6557",
"LABEL_6558",
"LABEL_6559",
"LABEL_656",
"LABEL_6560",
"LABEL_6561",
"LABEL_6562",
"LABEL_6563",
"LABEL_6564",
"LABEL_6565",
"LABEL_6566",
"LABEL_6567",
"LABEL_6568",
"LABEL_6569",
"LABEL_657",
"LABEL_6570",
"LABEL_6571",
"LABEL_6572",
"LABEL_6573",
"LABEL_6574",
"LABEL_6575",
"LABEL_6576",
"LABEL_6577",
"LABEL_6578",
"LABEL_6579",
"LABEL_658",
"LABEL_6580",
"LABEL_6581",
"LABEL_6582",
"LABEL_6583",
"LABEL_6584",
"LABEL_6585",
"LABEL_6586",
"LABEL_6587",
"LABEL_6588",
"LABEL_6589",
"LABEL_659",
"LABEL_6590",
"LABEL_6591",
"LABEL_6592",
"LABEL_6593",
"LABEL_6594",
"LABEL_6595",
"LABEL_6596",
"LABEL_6597",
"LABEL_6598",
"LABEL_6599",
"LABEL_66",
"LABEL_660",
"LABEL_6600",
"LABEL_6601",
"LABEL_6602",
"LABEL_6603",
"LABEL_6604",
"LABEL_6605",
"LABEL_6606",
"LABEL_6607",
"LABEL_6608",
"LABEL_6609",
"LABEL_661",
"LABEL_6610",
"LABEL_6611",
"LABEL_6612",
"LABEL_6613",
"LABEL_6614",
"LABEL_6615",
"LABEL_6616",
"LABEL_6617",
"LABEL_6618",
"LABEL_6619",
"LABEL_662",
"LABEL_6620",
"LABEL_6621",
"LABEL_6622",
"LABEL_6623",
"LABEL_6624",
"LABEL_6625",
"LABEL_6626",
"LABEL_6627",
"LABEL_6628",
"LABEL_6629",
"LABEL_663",
"LABEL_6630",
"LABEL_6631",
"LABEL_6632",
"LABEL_6633",
"LABEL_6634",
"LABEL_6635",
"LABEL_6636",
"LABEL_6637",
"LABEL_6638",
"LABEL_6639",
"LABEL_664",
"LABEL_6640",
"LABEL_6641",
"LABEL_6642",
"LABEL_6643",
"LABEL_6644",
"LABEL_6645",
"LABEL_6646",
"LABEL_6647",
"LABEL_6648",
"LABEL_6649",
"LABEL_665",
"LABEL_6650",
"LABEL_6651",
"LABEL_6652",
"LABEL_6653",
"LABEL_6654",
"LABEL_6655",
"LABEL_6656",
"LABEL_6657",
"LABEL_6658",
"LABEL_6659",
"LABEL_666",
"LABEL_6660",
"LABEL_6661",
"LABEL_6662",
"LABEL_6663",
"LABEL_6664",
"LABEL_6665",
"LABEL_6666",
"LABEL_6667",
"LABEL_6668",
"LABEL_6669",
"LABEL_667",
"LABEL_6670",
"LABEL_6671",
"LABEL_6672",
"LABEL_6673",
"LABEL_6674",
"LABEL_6675",
"LABEL_6676",
"LABEL_6677",
"LABEL_6678",
"LABEL_6679",
"LABEL_668",
"LABEL_6680",
"LABEL_6681",
"LABEL_6682",
"LABEL_6683",
"LABEL_6684",
"LABEL_6685",
"LABEL_6686",
"LABEL_6687",
"LABEL_6688",
"LABEL_6689",
"LABEL_669",
"LABEL_6690",
"LABEL_6691",
"LABEL_6692",
"LABEL_6693",
"LABEL_6694",
"LABEL_6695",
"LABEL_6696",
"LABEL_6697",
"LABEL_6698",
"LABEL_6699",
"LABEL_67",
"LABEL_670",
"LABEL_6700",
"LABEL_6701",
"LABEL_6702",
"LABEL_6703",
"LABEL_6704",
"LABEL_6705",
"LABEL_6706",
"LABEL_6707",
"LABEL_6708",
"LABEL_6709",
"LABEL_671",
"LABEL_6710",
"LABEL_6711",
"LABEL_6712",
"LABEL_6713",
"LABEL_6714",
"LABEL_6715",
"LABEL_6716",
"LABEL_6717",
"LABEL_6718",
"LABEL_6719",
"LABEL_672",
"LABEL_6720",
"LABEL_6721",
"LABEL_6722",
"LABEL_6723",
"LABEL_6724",
"LABEL_6725",
"LABEL_6726",
"LABEL_6727",
"LABEL_6728",
"LABEL_6729",
"LABEL_673",
"LABEL_6730",
"LABEL_6731",
"LABEL_6732",
"LABEL_6733",
"LABEL_6734",
"LABEL_6735",
"LABEL_6736",
"LABEL_6737",
"LABEL_6738",
"LABEL_6739",
"LABEL_674",
"LABEL_6740",
"LABEL_6741",
"LABEL_6742",
"LABEL_6743",
"LABEL_6744",
"LABEL_6745",
"LABEL_6746",
"LABEL_6747",
"LABEL_6748",
"LABEL_6749",
"LABEL_675",
"LABEL_6750",
"LABEL_6751",
"LABEL_6752",
"LABEL_6753",
"LABEL_6754",
"LABEL_6755",
"LABEL_6756",
"LABEL_6757",
"LABEL_6758",
"LABEL_6759",
"LABEL_676",
"LABEL_6760",
"LABEL_6761",
"LABEL_6762",
"LABEL_6763",
"LABEL_6764",
"LABEL_6765",
"LABEL_6766",
"LABEL_6767",
"LABEL_6768",
"LABEL_6769",
"LABEL_677",
"LABEL_6770",
"LABEL_6771",
"LABEL_6772",
"LABEL_6773",
"LABEL_6774",
"LABEL_6775",
"LABEL_6776",
"LABEL_6777",
"LABEL_6778",
"LABEL_6779",
"LABEL_678",
"LABEL_6780",
"LABEL_6781",
"LABEL_6782",
"LABEL_6783",
"LABEL_6784",
"LABEL_6785",
"LABEL_6786",
"LABEL_6787",
"LABEL_6788",
"LABEL_6789",
"LABEL_679",
"LABEL_6790",
"LABEL_6791",
"LABEL_6792",
"LABEL_6793",
"LABEL_6794",
"LABEL_6795",
"LABEL_6796",
"LABEL_6797",
"LABEL_6798",
"LABEL_6799",
"LABEL_68",
"LABEL_680",
"LABEL_6800",
"LABEL_6801",
"LABEL_6802",
"LABEL_6803",
"LABEL_6804",
"LABEL_6805",
"LABEL_6806",
"LABEL_6807",
"LABEL_6808",
"LABEL_6809",
"LABEL_681",
"LABEL_6810",
"LABEL_6811",
"LABEL_6812",
"LABEL_6813",
"LABEL_6814",
"LABEL_6815",
"LABEL_6816",
"LABEL_6817",
"LABEL_6818",
"LABEL_6819",
"LABEL_682",
"LABEL_6820",
"LABEL_6821",
"LABEL_6822",
"LABEL_6823",
"LABEL_6824",
"LABEL_6825",
"LABEL_6826",
"LABEL_6827",
"LABEL_6828",
"LABEL_6829",
"LABEL_683",
"LABEL_6830",
"LABEL_6831",
"LABEL_6832",
"LABEL_6833",
"LABEL_6834",
"LABEL_6835",
"LABEL_6836",
"LABEL_6837",
"LABEL_6838",
"LABEL_6839",
"LABEL_684",
"LABEL_6840",
"LABEL_6841",
"LABEL_6842",
"LABEL_6843",
"LABEL_6844",
"LABEL_6845",
"LABEL_6846",
"LABEL_6847",
"LABEL_6848",
"LABEL_6849",
"LABEL_685",
"LABEL_6850",
"LABEL_6851",
"LABEL_6852",
"LABEL_6853",
"LABEL_6854",
"LABEL_6855",
"LABEL_6856",
"LABEL_6857",
"LABEL_6858",
"LABEL_6859",
"LABEL_686",
"LABEL_6860",
"LABEL_6861",
"LABEL_6862",
"LABEL_6863",
"LABEL_6864",
"LABEL_6865",
"LABEL_6866",
"LABEL_6867",
"LABEL_6868",
"LABEL_6869",
"LABEL_687",
"LABEL_6870",
"LABEL_6871",
"LABEL_6872",
"LABEL_6873",
"LABEL_6874",
"LABEL_6875",
"LABEL_6876",
"LABEL_6877",
"LABEL_6878",
"LABEL_6879",
"LABEL_688",
"LABEL_6880",
"LABEL_6881",
"LABEL_6882",
"LABEL_6883",
"LABEL_6884",
"LABEL_6885",
"LABEL_6886",
"LABEL_6887",
"LABEL_6888",
"LABEL_6889",
"LABEL_689",
"LABEL_6890",
"LABEL_6891",
"LABEL_6892",
"LABEL_6893",
"LABEL_6894",
"LABEL_6895",
"LABEL_6896",
"LABEL_6897",
"LABEL_6898",
"LABEL_6899",
"LABEL_69",
"LABEL_690",
"LABEL_6900",
"LABEL_6901",
"LABEL_6902",
"LABEL_6903",
"LABEL_6904",
"LABEL_6905",
"LABEL_6906",
"LABEL_6907",
"LABEL_6908",
"LABEL_6909",
"LABEL_691",
"LABEL_6910",
"LABEL_6911",
"LABEL_6912",
"LABEL_6913",
"LABEL_6914",
"LABEL_6915",
"LABEL_6916",
"LABEL_6917",
"LABEL_6918",
"LABEL_6919",
"LABEL_692",
"LABEL_6920",
"LABEL_6921",
"LABEL_6922",
"LABEL_6923",
"LABEL_6924",
"LABEL_6925",
"LABEL_6926",
"LABEL_6927",
"LABEL_6928",
"LABEL_6929",
"LABEL_693",
"LABEL_6930",
"LABEL_6931",
"LABEL_6932",
"LABEL_6933",
"LABEL_6934",
"LABEL_6935",
"LABEL_6936",
"LABEL_6937",
"LABEL_6938",
"LABEL_6939",
"LABEL_694",
"LABEL_6940",
"LABEL_6941",
"LABEL_6942",
"LABEL_6943",
"LABEL_6944",
"LABEL_6945",
"LABEL_6946",
"LABEL_6947",
"LABEL_6948",
"LABEL_6949",
"LABEL_695",
"LABEL_6950",
"LABEL_6951",
"LABEL_6952",
"LABEL_6953",
"LABEL_6954",
"LABEL_6955",
"LABEL_6956",
"LABEL_6957",
"LABEL_6958",
"LABEL_6959",
"LABEL_696",
"LABEL_6960",
"LABEL_6961",
"LABEL_6962",
"LABEL_6963",
"LABEL_6964",
"LABEL_6965",
"LABEL_6966",
"LABEL_6967",
"LABEL_6968",
"LABEL_6969",
"LABEL_697",
"LABEL_6970",
"LABEL_6971",
"LABEL_6972",
"LABEL_6973",
"LABEL_6974",
"LABEL_6975",
"LABEL_6976",
"LABEL_6977",
"LABEL_6978",
"LABEL_6979",
"LABEL_698",
"LABEL_6980",
"LABEL_6981",
"LABEL_6982",
"LABEL_6983",
"LABEL_6984",
"LABEL_6985",
"LABEL_6986",
"LABEL_6987",
"LABEL_6988",
"LABEL_6989",
"LABEL_699",
"LABEL_6990",
"LABEL_6991",
"LABEL_6992",
"LABEL_6993",
"LABEL_6994",
"LABEL_6995",
"LABEL_6996",
"LABEL_6997",
"LABEL_6998",
"LABEL_6999",
"LABEL_7",
"LABEL_70",
"LABEL_700",
"LABEL_7000",
"LABEL_7001",
"LABEL_7002",
"LABEL_7003",
"LABEL_7004",
"LABEL_7005",
"LABEL_7006",
"LABEL_7007",
"LABEL_7008",
"LABEL_7009",
"LABEL_701",
"LABEL_7010",
"LABEL_7011",
"LABEL_7012",
"LABEL_7013",
"LABEL_7014",
"LABEL_7015",
"LABEL_7016",
"LABEL_7017",
"LABEL_7018",
"LABEL_7019",
"LABEL_702",
"LABEL_7020",
"LABEL_7021",
"LABEL_7022",
"LABEL_7023",
"LABEL_7024",
"LABEL_7025",
"LABEL_7026",
"LABEL_7027",
"LABEL_7028",
"LABEL_7029",
"LABEL_703",
"LABEL_7030",
"LABEL_7031",
"LABEL_7032",
"LABEL_7033",
"LABEL_7034",
"LABEL_7035",
"LABEL_7036",
"LABEL_7037",
"LABEL_7038",
"LABEL_7039",
"LABEL_704",
"LABEL_7040",
"LABEL_7041",
"LABEL_7042",
"LABEL_7043",
"LABEL_7044",
"LABEL_7045",
"LABEL_7046",
"LABEL_7047",
"LABEL_7048",
"LABEL_7049",
"LABEL_705",
"LABEL_7050",
"LABEL_7051",
"LABEL_7052",
"LABEL_7053",
"LABEL_7054",
"LABEL_7055",
"LABEL_7056",
"LABEL_7057",
"LABEL_7058",
"LABEL_7059",
"LABEL_706",
"LABEL_7060",
"LABEL_7061",
"LABEL_7062",
"LABEL_7063",
"LABEL_7064",
"LABEL_7065",
"LABEL_7066",
"LABEL_7067",
"LABEL_7068",
"LABEL_7069",
"LABEL_707",
"LABEL_7070",
"LABEL_7071",
"LABEL_7072",
"LABEL_7073",
"LABEL_7074",
"LABEL_7075",
"LABEL_7076",
"LABEL_7077",
"LABEL_7078",
"LABEL_7079",
"LABEL_708",
"LABEL_7080",
"LABEL_7081",
"LABEL_7082",
"LABEL_7083",
"LABEL_7084",
"LABEL_7085",
"LABEL_7086",
"LABEL_7087",
"LABEL_7088",
"LABEL_7089",
"LABEL_709",
"LABEL_7090",
"LABEL_7091",
"LABEL_7092",
"LABEL_7093",
"LABEL_7094",
"LABEL_7095",
"LABEL_7096",
"LABEL_7097",
"LABEL_7098",
"LABEL_7099",
"LABEL_71",
"LABEL_710",
"LABEL_7100",
"LABEL_7101",
"LABEL_7102",
"LABEL_7103",
"LABEL_7104",
"LABEL_7105",
"LABEL_7106",
"LABEL_7107",
"LABEL_7108",
"LABEL_7109",
"LABEL_711",
"LABEL_7110",
"LABEL_7111",
"LABEL_7112",
"LABEL_7113",
"LABEL_7114",
"LABEL_7115",
"LABEL_7116",
"LABEL_7117",
"LABEL_7118",
"LABEL_7119",
"LABEL_712",
"LABEL_7120",
"LABEL_7121",
"LABEL_7122",
"LABEL_7123",
"LABEL_7124",
"LABEL_7125",
"LABEL_7126",
"LABEL_7127",
"LABEL_7128",
"LABEL_7129",
"LABEL_713",
"LABEL_7130",
"LABEL_7131",
"LABEL_7132",
"LABEL_7133",
"LABEL_7134",
"LABEL_7135",
"LABEL_7136",
"LABEL_7137",
"LABEL_7138",
"LABEL_7139",
"LABEL_714",
"LABEL_7140",
"LABEL_7141",
"LABEL_7142",
"LABEL_7143",
"LABEL_7144",
"LABEL_7145",
"LABEL_7146",
"LABEL_7147",
"LABEL_7148",
"LABEL_7149",
"LABEL_715",
"LABEL_7150",
"LABEL_7151",
"LABEL_7152",
"LABEL_7153",
"LABEL_7154",
"LABEL_7155",
"LABEL_7156",
"LABEL_7157",
"LABEL_7158",
"LABEL_7159",
"LABEL_716",
"LABEL_7160",
"LABEL_7161",
"LABEL_7162",
"LABEL_7163",
"LABEL_7164",
"LABEL_7165",
"LABEL_7166",
"LABEL_7167",
"LABEL_7168",
"LABEL_7169",
"LABEL_717",
"LABEL_7170",
"LABEL_7171",
"LABEL_7172",
"LABEL_7173",
"LABEL_7174",
"LABEL_7175",
"LABEL_7176",
"LABEL_7177",
"LABEL_7178",
"LABEL_7179",
"LABEL_718",
"LABEL_7180",
"LABEL_7181",
"LABEL_7182",
"LABEL_7183",
"LABEL_7184",
"LABEL_7185",
"LABEL_7186",
"LABEL_7187",
"LABEL_7188",
"LABEL_7189",
"LABEL_719",
"LABEL_7190",
"LABEL_7191",
"LABEL_7192",
"LABEL_7193",
"LABEL_7194",
"LABEL_7195",
"LABEL_7196",
"LABEL_7197",
"LABEL_7198",
"LABEL_7199",
"LABEL_72",
"LABEL_720",
"LABEL_7200",
"LABEL_7201",
"LABEL_7202",
"LABEL_7203",
"LABEL_7204",
"LABEL_7205",
"LABEL_7206",
"LABEL_7207",
"LABEL_7208",
"LABEL_7209",
"LABEL_721",
"LABEL_7210",
"LABEL_7211",
"LABEL_7212",
"LABEL_7213",
"LABEL_7214",
"LABEL_7215",
"LABEL_7216",
"LABEL_7217",
"LABEL_7218",
"LABEL_7219",
"LABEL_722",
"LABEL_7220",
"LABEL_7221",
"LABEL_7222",
"LABEL_7223",
"LABEL_7224",
"LABEL_7225",
"LABEL_7226",
"LABEL_7227",
"LABEL_7228",
"LABEL_7229",
"LABEL_723",
"LABEL_7230",
"LABEL_7231",
"LABEL_7232",
"LABEL_7233",
"LABEL_7234",
"LABEL_7235",
"LABEL_7236",
"LABEL_7237",
"LABEL_7238",
"LABEL_7239",
"LABEL_724",
"LABEL_7240",
"LABEL_7241",
"LABEL_7242",
"LABEL_7243",
"LABEL_7244",
"LABEL_7245",
"LABEL_7246",
"LABEL_7247",
"LABEL_7248",
"LABEL_7249",
"LABEL_725",
"LABEL_7250",
"LABEL_7251",
"LABEL_7252",
"LABEL_7253",
"LABEL_7254",
"LABEL_7255",
"LABEL_7256",
"LABEL_7257",
"LABEL_7258",
"LABEL_7259",
"LABEL_726",
"LABEL_7260",
"LABEL_7261",
"LABEL_7262",
"LABEL_7263",
"LABEL_7264",
"LABEL_7265",
"LABEL_7266",
"LABEL_7267",
"LABEL_7268",
"LABEL_7269",
"LABEL_727",
"LABEL_7270",
"LABEL_7271",
"LABEL_7272",
"LABEL_7273",
"LABEL_7274",
"LABEL_7275",
"LABEL_7276",
"LABEL_7277",
"LABEL_7278",
"LABEL_7279",
"LABEL_728",
"LABEL_7280",
"LABEL_7281",
"LABEL_7282",
"LABEL_7283",
"LABEL_7284",
"LABEL_7285",
"LABEL_7286",
"LABEL_7287",
"LABEL_7288",
"LABEL_7289",
"LABEL_729",
"LABEL_7290",
"LABEL_7291",
"LABEL_7292",
"LABEL_7293",
"LABEL_7294",
"LABEL_7295",
"LABEL_7296",
"LABEL_7297",
"LABEL_7298",
"LABEL_7299",
"LABEL_73",
"LABEL_730",
"LABEL_7300",
"LABEL_7301",
"LABEL_7302",
"LABEL_7303",
"LABEL_7304",
"LABEL_7305",
"LABEL_7306",
"LABEL_7307",
"LABEL_7308",
"LABEL_7309",
"LABEL_731",
"LABEL_7310",
"LABEL_7311",
"LABEL_7312",
"LABEL_7313",
"LABEL_7314",
"LABEL_7315",
"LABEL_7316",
"LABEL_7317",
"LABEL_7318",
"LABEL_7319",
"LABEL_732",
"LABEL_7320",
"LABEL_7321",
"LABEL_7322",
"LABEL_7323",
"LABEL_7324",
"LABEL_7325",
"LABEL_7326",
"LABEL_7327",
"LABEL_7328",
"LABEL_7329",
"LABEL_733",
"LABEL_7330",
"LABEL_7331",
"LABEL_7332",
"LABEL_7333",
"LABEL_7334",
"LABEL_7335",
"LABEL_7336",
"LABEL_7337",
"LABEL_7338",
"LABEL_7339",
"LABEL_734",
"LABEL_7340",
"LABEL_7341",
"LABEL_7342",
"LABEL_7343",
"LABEL_7344",
"LABEL_7345",
"LABEL_7346",
"LABEL_7347",
"LABEL_7348",
"LABEL_7349",
"LABEL_735",
"LABEL_7350",
"LABEL_7351",
"LABEL_7352",
"LABEL_7353",
"LABEL_7354",
"LABEL_7355",
"LABEL_7356",
"LABEL_7357",
"LABEL_7358",
"LABEL_7359",
"LABEL_736",
"LABEL_7360",
"LABEL_7361",
"LABEL_7362",
"LABEL_7363",
"LABEL_7364",
"LABEL_7365",
"LABEL_7366",
"LABEL_7367",
"LABEL_7368",
"LABEL_7369",
"LABEL_737",
"LABEL_7370",
"LABEL_7371",
"LABEL_7372",
"LABEL_7373",
"LABEL_7374",
"LABEL_7375",
"LABEL_7376",
"LABEL_7377",
"LABEL_7378",
"LABEL_7379",
"LABEL_738",
"LABEL_7380",
"LABEL_7381",
"LABEL_7382",
"LABEL_7383",
"LABEL_7384",
"LABEL_7385",
"LABEL_7386",
"LABEL_7387",
"LABEL_7388",
"LABEL_7389",
"LABEL_739",
"LABEL_7390",
"LABEL_7391",
"LABEL_7392",
"LABEL_7393",
"LABEL_7394",
"LABEL_7395",
"LABEL_7396",
"LABEL_7397",
"LABEL_7398",
"LABEL_7399",
"LABEL_74",
"LABEL_740",
"LABEL_7400",
"LABEL_7401",
"LABEL_7402",
"LABEL_7403",
"LABEL_7404",
"LABEL_7405",
"LABEL_7406",
"LABEL_7407",
"LABEL_7408",
"LABEL_7409",
"LABEL_741",
"LABEL_7410",
"LABEL_7411",
"LABEL_7412",
"LABEL_7413",
"LABEL_7414",
"LABEL_7415",
"LABEL_7416",
"LABEL_7417",
"LABEL_7418",
"LABEL_7419",
"LABEL_742",
"LABEL_7420",
"LABEL_7421",
"LABEL_7422",
"LABEL_7423",
"LABEL_7424",
"LABEL_7425",
"LABEL_7426",
"LABEL_7427",
"LABEL_7428",
"LABEL_7429",
"LABEL_743",
"LABEL_7430",
"LABEL_7431",
"LABEL_7432",
"LABEL_7433",
"LABEL_7434",
"LABEL_7435",
"LABEL_7436",
"LABEL_7437",
"LABEL_7438",
"LABEL_7439",
"LABEL_744",
"LABEL_7440",
"LABEL_7441",
"LABEL_7442",
"LABEL_7443",
"LABEL_7444",
"LABEL_7445",
"LABEL_7446",
"LABEL_7447",
"LABEL_7448",
"LABEL_7449",
"LABEL_745",
"LABEL_7450",
"LABEL_7451",
"LABEL_7452",
"LABEL_7453",
"LABEL_7454",
"LABEL_7455",
"LABEL_7456",
"LABEL_7457",
"LABEL_7458",
"LABEL_7459",
"LABEL_746",
"LABEL_7460",
"LABEL_7461",
"LABEL_7462",
"LABEL_7463",
"LABEL_7464",
"LABEL_7465",
"LABEL_7466",
"LABEL_7467",
"LABEL_7468",
"LABEL_7469",
"LABEL_747",
"LABEL_7470",
"LABEL_7471",
"LABEL_7472",
"LABEL_7473",
"LABEL_7474",
"LABEL_7475",
"LABEL_7476",
"LABEL_7477",
"LABEL_7478",
"LABEL_7479",
"LABEL_748",
"LABEL_7480",
"LABEL_7481",
"LABEL_7482",
"LABEL_7483",
"LABEL_7484",
"LABEL_7485",
"LABEL_7486",
"LABEL_7487",
"LABEL_7488",
"LABEL_7489",
"LABEL_749",
"LABEL_7490",
"LABEL_7491",
"LABEL_7492",
"LABEL_7493",
"LABEL_7494",
"LABEL_7495",
"LABEL_7496",
"LABEL_7497",
"LABEL_7498",
"LABEL_7499",
"LABEL_75",
"LABEL_750",
"LABEL_7500",
"LABEL_7501",
"LABEL_7502",
"LABEL_7503",
"LABEL_7504",
"LABEL_7505",
"LABEL_7506",
"LABEL_7507",
"LABEL_7508",
"LABEL_7509",
"LABEL_751",
"LABEL_7510",
"LABEL_7511",
"LABEL_7512",
"LABEL_7513",
"LABEL_7514",
"LABEL_7515",
"LABEL_7516",
"LABEL_7517",
"LABEL_7518",
"LABEL_7519",
"LABEL_752",
"LABEL_7520",
"LABEL_7521",
"LABEL_7522",
"LABEL_7523",
"LABEL_7524",
"LABEL_7525",
"LABEL_7526",
"LABEL_7527",
"LABEL_7528",
"LABEL_7529",
"LABEL_753",
"LABEL_7530",
"LABEL_7531",
"LABEL_7532",
"LABEL_7533",
"LABEL_7534",
"LABEL_7535",
"LABEL_7536",
"LABEL_7537",
"LABEL_7538",
"LABEL_7539",
"LABEL_754",
"LABEL_7540",
"LABEL_7541",
"LABEL_7542",
"LABEL_7543",
"LABEL_7544",
"LABEL_7545",
"LABEL_7546",
"LABEL_7547",
"LABEL_7548",
"LABEL_7549",
"LABEL_755",
"LABEL_7550",
"LABEL_7551",
"LABEL_7552",
"LABEL_7553",
"LABEL_7554",
"LABEL_7555",
"LABEL_7556",
"LABEL_7557",
"LABEL_7558",
"LABEL_7559",
"LABEL_756",
"LABEL_7560",
"LABEL_7561",
"LABEL_7562",
"LABEL_7563",
"LABEL_7564",
"LABEL_7565",
"LABEL_7566",
"LABEL_7567",
"LABEL_7568",
"LABEL_7569",
"LABEL_757",
"LABEL_7570",
"LABEL_7571",
"LABEL_7572",
"LABEL_7573",
"LABEL_7574",
"LABEL_7575",
"LABEL_7576",
"LABEL_7577",
"LABEL_7578",
"LABEL_7579",
"LABEL_758",
"LABEL_7580",
"LABEL_7581",
"LABEL_7582",
"LABEL_7583",
"LABEL_7584",
"LABEL_7585",
"LABEL_7586",
"LABEL_7587",
"LABEL_7588",
"LABEL_7589",
"LABEL_759",
"LABEL_7590",
"LABEL_7591",
"LABEL_7592",
"LABEL_7593",
"LABEL_7594",
"LABEL_7595",
"LABEL_7596",
"LABEL_7597",
"LABEL_7598",
"LABEL_7599",
"LABEL_76",
"LABEL_760",
"LABEL_7600",
"LABEL_7601",
"LABEL_7602",
"LABEL_7603",
"LABEL_7604",
"LABEL_7605",
"LABEL_7606",
"LABEL_7607",
"LABEL_7608",
"LABEL_7609",
"LABEL_761",
"LABEL_7610",
"LABEL_7611",
"LABEL_7612",
"LABEL_7613",
"LABEL_7614",
"LABEL_7615",
"LABEL_7616",
"LABEL_7617",
"LABEL_7618",
"LABEL_7619",
"LABEL_762",
"LABEL_7620",
"LABEL_7621",
"LABEL_7622",
"LABEL_7623",
"LABEL_7624",
"LABEL_7625",
"LABEL_7626",
"LABEL_7627",
"LABEL_7628",
"LABEL_7629",
"LABEL_763",
"LABEL_7630",
"LABEL_7631",
"LABEL_7632",
"LABEL_7633",
"LABEL_7634",
"LABEL_7635",
"LABEL_7636",
"LABEL_7637",
"LABEL_7638",
"LABEL_7639",
"LABEL_764",
"LABEL_7640",
"LABEL_7641",
"LABEL_7642",
"LABEL_7643",
"LABEL_7644",
"LABEL_7645",
"LABEL_7646",
"LABEL_7647",
"LABEL_7648",
"LABEL_7649",
"LABEL_765",
"LABEL_7650",
"LABEL_7651",
"LABEL_7652",
"LABEL_7653",
"LABEL_7654",
"LABEL_7655",
"LABEL_7656",
"LABEL_7657",
"LABEL_7658",
"LABEL_7659",
"LABEL_766",
"LABEL_7660",
"LABEL_7661",
"LABEL_7662",
"LABEL_7663",
"LABEL_7664",
"LABEL_7665",
"LABEL_7666",
"LABEL_7667",
"LABEL_7668",
"LABEL_7669",
"LABEL_767",
"LABEL_7670",
"LABEL_7671",
"LABEL_7672",
"LABEL_7673",
"LABEL_7674",
"LABEL_7675",
"LABEL_7676",
"LABEL_7677",
"LABEL_7678",
"LABEL_7679",
"LABEL_768",
"LABEL_7680",
"LABEL_7681",
"LABEL_7682",
"LABEL_7683",
"LABEL_7684",
"LABEL_7685",
"LABEL_7686",
"LABEL_7687",
"LABEL_7688",
"LABEL_7689",
"LABEL_769",
"LABEL_7690",
"LABEL_7691",
"LABEL_7692",
"LABEL_7693",
"LABEL_7694",
"LABEL_7695",
"LABEL_7696",
"LABEL_7697",
"LABEL_7698",
"LABEL_7699",
"LABEL_77",
"LABEL_770",
"LABEL_7700",
"LABEL_7701",
"LABEL_7702",
"LABEL_7703",
"LABEL_7704",
"LABEL_7705",
"LABEL_7706",
"LABEL_7707",
"LABEL_7708",
"LABEL_7709",
"LABEL_771",
"LABEL_7710",
"LABEL_7711",
"LABEL_7712",
"LABEL_7713",
"LABEL_7714",
"LABEL_7715",
"LABEL_7716",
"LABEL_7717",
"LABEL_7718",
"LABEL_7719",
"LABEL_772",
"LABEL_7720",
"LABEL_7721",
"LABEL_7722",
"LABEL_7723",
"LABEL_7724",
"LABEL_7725",
"LABEL_7726",
"LABEL_7727",
"LABEL_7728",
"LABEL_7729",
"LABEL_773",
"LABEL_7730",
"LABEL_7731",
"LABEL_7732",
"LABEL_7733",
"LABEL_7734",
"LABEL_7735",
"LABEL_7736",
"LABEL_7737",
"LABEL_7738",
"LABEL_7739",
"LABEL_774",
"LABEL_7740",
"LABEL_7741",
"LABEL_7742",
"LABEL_7743",
"LABEL_7744",
"LABEL_7745",
"LABEL_7746",
"LABEL_7747",
"LABEL_7748",
"LABEL_7749",
"LABEL_775",
"LABEL_7750",
"LABEL_7751",
"LABEL_7752",
"LABEL_7753",
"LABEL_7754",
"LABEL_7755",
"LABEL_7756",
"LABEL_7757",
"LABEL_7758",
"LABEL_7759",
"LABEL_776",
"LABEL_7760",
"LABEL_7761",
"LABEL_7762",
"LABEL_7763",
"LABEL_7764",
"LABEL_7765",
"LABEL_7766",
"LABEL_7767",
"LABEL_7768",
"LABEL_7769",
"LABEL_777",
"LABEL_7770",
"LABEL_7771",
"LABEL_7772",
"LABEL_7773",
"LABEL_7774",
"LABEL_7775",
"LABEL_7776",
"LABEL_7777",
"LABEL_7778",
"LABEL_7779",
"LABEL_778",
"LABEL_7780",
"LABEL_7781",
"LABEL_7782",
"LABEL_7783",
"LABEL_7784",
"LABEL_7785",
"LABEL_7786",
"LABEL_7787",
"LABEL_7788",
"LABEL_7789",
"LABEL_779",
"LABEL_7790",
"LABEL_7791",
"LABEL_7792",
"LABEL_7793",
"LABEL_7794",
"LABEL_7795",
"LABEL_7796",
"LABEL_7797",
"LABEL_7798",
"LABEL_7799",
"LABEL_78",
"LABEL_780",
"LABEL_7800",
"LABEL_7801",
"LABEL_7802",
"LABEL_7803",
"LABEL_7804",
"LABEL_7805",
"LABEL_7806",
"LABEL_7807",
"LABEL_7808",
"LABEL_7809",
"LABEL_781",
"LABEL_7810",
"LABEL_7811",
"LABEL_7812",
"LABEL_7813",
"LABEL_7814",
"LABEL_7815",
"LABEL_7816",
"LABEL_7817",
"LABEL_7818",
"LABEL_7819",
"LABEL_782",
"LABEL_7820",
"LABEL_7821",
"LABEL_7822",
"LABEL_7823",
"LABEL_7824",
"LABEL_7825",
"LABEL_7826",
"LABEL_7827",
"LABEL_7828",
"LABEL_7829",
"LABEL_783",
"LABEL_7830",
"LABEL_7831",
"LABEL_7832",
"LABEL_7833",
"LABEL_7834",
"LABEL_7835",
"LABEL_7836",
"LABEL_7837",
"LABEL_7838",
"LABEL_7839",
"LABEL_784",
"LABEL_7840",
"LABEL_7841",
"LABEL_7842",
"LABEL_7843",
"LABEL_7844",
"LABEL_7845",
"LABEL_7846",
"LABEL_7847",
"LABEL_7848",
"LABEL_7849",
"LABEL_785",
"LABEL_7850",
"LABEL_7851",
"LABEL_7852",
"LABEL_7853",
"LABEL_7854",
"LABEL_7855",
"LABEL_7856",
"LABEL_7857",
"LABEL_7858",
"LABEL_7859",
"LABEL_786",
"LABEL_7860",
"LABEL_7861",
"LABEL_7862",
"LABEL_7863",
"LABEL_7864",
"LABEL_7865",
"LABEL_7866",
"LABEL_7867",
"LABEL_7868",
"LABEL_7869",
"LABEL_787",
"LABEL_7870",
"LABEL_7871",
"LABEL_7872",
"LABEL_7873",
"LABEL_7874",
"LABEL_7875",
"LABEL_7876",
"LABEL_7877",
"LABEL_7878",
"LABEL_7879",
"LABEL_788",
"LABEL_7880",
"LABEL_7881",
"LABEL_7882",
"LABEL_7883",
"LABEL_7884",
"LABEL_7885",
"LABEL_7886",
"LABEL_7887",
"LABEL_7888",
"LABEL_7889",
"LABEL_789",
"LABEL_7890",
"LABEL_7891",
"LABEL_7892",
"LABEL_7893",
"LABEL_7894",
"LABEL_7895",
"LABEL_7896",
"LABEL_7897",
"LABEL_7898",
"LABEL_7899",
"LABEL_79",
"LABEL_790",
"LABEL_7900",
"LABEL_7901",
"LABEL_7902",
"LABEL_7903",
"LABEL_7904",
"LABEL_7905",
"LABEL_7906",
"LABEL_7907",
"LABEL_7908",
"LABEL_7909",
"LABEL_791",
"LABEL_7910",
"LABEL_7911",
"LABEL_7912",
"LABEL_7913",
"LABEL_7914",
"LABEL_7915",
"LABEL_7916",
"LABEL_7917",
"LABEL_7918",
"LABEL_7919",
"LABEL_792",
"LABEL_7920",
"LABEL_7921",
"LABEL_7922",
"LABEL_7923",
"LABEL_7924",
"LABEL_7925",
"LABEL_7926",
"LABEL_7927",
"LABEL_7928",
"LABEL_7929",
"LABEL_793",
"LABEL_7930",
"LABEL_7931",
"LABEL_7932",
"LABEL_7933",
"LABEL_7934",
"LABEL_7935",
"LABEL_7936",
"LABEL_7937",
"LABEL_7938",
"LABEL_7939",
"LABEL_794",
"LABEL_7940",
"LABEL_7941",
"LABEL_7942",
"LABEL_7943",
"LABEL_7944",
"LABEL_7945",
"LABEL_7946",
"LABEL_7947",
"LABEL_7948",
"LABEL_7949",
"LABEL_795",
"LABEL_7950",
"LABEL_7951",
"LABEL_7952",
"LABEL_7953",
"LABEL_7954",
"LABEL_7955",
"LABEL_7956",
"LABEL_7957",
"LABEL_7958",
"LABEL_7959",
"LABEL_796",
"LABEL_7960",
"LABEL_7961",
"LABEL_7962",
"LABEL_7963",
"LABEL_7964",
"LABEL_7965",
"LABEL_7966",
"LABEL_7967",
"LABEL_7968",
"LABEL_7969",
"LABEL_797",
"LABEL_7970",
"LABEL_7971",
"LABEL_7972",
"LABEL_7973",
"LABEL_7974",
"LABEL_7975",
"LABEL_7976",
"LABEL_7977",
"LABEL_7978",
"LABEL_7979",
"LABEL_798",
"LABEL_7980",
"LABEL_7981",
"LABEL_7982",
"LABEL_7983",
"LABEL_7984",
"LABEL_7985",
"LABEL_7986",
"LABEL_7987",
"LABEL_7988",
"LABEL_7989",
"LABEL_799",
"LABEL_7990",
"LABEL_7991",
"LABEL_7992",
"LABEL_7993",
"LABEL_7994",
"LABEL_7995",
"LABEL_7996",
"LABEL_7997",
"LABEL_7998",
"LABEL_7999",
"LABEL_8",
"LABEL_80",
"LABEL_800",
"LABEL_8000",
"LABEL_8001",
"LABEL_8002",
"LABEL_8003",
"LABEL_8004",
"LABEL_8005",
"LABEL_8006",
"LABEL_8007",
"LABEL_8008",
"LABEL_8009",
"LABEL_801",
"LABEL_8010",
"LABEL_8011",
"LABEL_8012",
"LABEL_8013",
"LABEL_8014",
"LABEL_8015",
"LABEL_8016",
"LABEL_8017",
"LABEL_8018",
"LABEL_8019",
"LABEL_802",
"LABEL_8020",
"LABEL_8021",
"LABEL_8022",
"LABEL_8023",
"LABEL_8024",
"LABEL_8025",
"LABEL_8026",
"LABEL_8027",
"LABEL_8028",
"LABEL_8029",
"LABEL_803",
"LABEL_8030",
"LABEL_8031",
"LABEL_8032",
"LABEL_8033",
"LABEL_8034",
"LABEL_8035",
"LABEL_8036",
"LABEL_8037",
"LABEL_8038",
"LABEL_8039",
"LABEL_804",
"LABEL_8040",
"LABEL_8041",
"LABEL_8042",
"LABEL_8043",
"LABEL_8044",
"LABEL_8045",
"LABEL_8046",
"LABEL_8047",
"LABEL_8048",
"LABEL_8049",
"LABEL_805",
"LABEL_8050",
"LABEL_8051",
"LABEL_8052",
"LABEL_8053",
"LABEL_8054",
"LABEL_8055",
"LABEL_8056",
"LABEL_8057",
"LABEL_8058",
"LABEL_8059",
"LABEL_806",
"LABEL_8060",
"LABEL_8061",
"LABEL_8062",
"LABEL_8063",
"LABEL_8064",
"LABEL_8065",
"LABEL_8066",
"LABEL_8067",
"LABEL_8068",
"LABEL_8069",
"LABEL_807",
"LABEL_8070",
"LABEL_8071",
"LABEL_8072",
"LABEL_8073",
"LABEL_8074",
"LABEL_8075",
"LABEL_8076",
"LABEL_8077",
"LABEL_8078",
"LABEL_8079",
"LABEL_808",
"LABEL_8080",
"LABEL_8081",
"LABEL_8082",
"LABEL_8083",
"LABEL_8084",
"LABEL_8085",
"LABEL_8086",
"LABEL_8087",
"LABEL_8088",
"LABEL_8089",
"LABEL_809",
"LABEL_8090",
"LABEL_8091",
"LABEL_8092",
"LABEL_8093",
"LABEL_8094",
"LABEL_8095",
"LABEL_8096",
"LABEL_8097",
"LABEL_8098",
"LABEL_8099",
"LABEL_81",
"LABEL_810",
"LABEL_8100",
"LABEL_8101",
"LABEL_8102",
"LABEL_8103",
"LABEL_8104",
"LABEL_8105",
"LABEL_8106",
"LABEL_8107",
"LABEL_8108",
"LABEL_8109",
"LABEL_811",
"LABEL_8110",
"LABEL_8111",
"LABEL_8112",
"LABEL_8113",
"LABEL_8114",
"LABEL_8115",
"LABEL_8116",
"LABEL_8117",
"LABEL_8118",
"LABEL_8119",
"LABEL_812",
"LABEL_8120",
"LABEL_8121",
"LABEL_8122",
"LABEL_8123",
"LABEL_8124",
"LABEL_8125",
"LABEL_8126",
"LABEL_8127",
"LABEL_8128",
"LABEL_8129",
"LABEL_813",
"LABEL_8130",
"LABEL_8131",
"LABEL_8132",
"LABEL_8133",
"LABEL_8134",
"LABEL_8135",
"LABEL_8136",
"LABEL_8137",
"LABEL_8138",
"LABEL_8139",
"LABEL_814",
"LABEL_8140",
"LABEL_8141",
"LABEL_8142",
"LABEL_8143",
"LABEL_8144",
"LABEL_8145",
"LABEL_8146",
"LABEL_8147",
"LABEL_8148",
"LABEL_8149",
"LABEL_815",
"LABEL_8150",
"LABEL_8151",
"LABEL_8152",
"LABEL_8153",
"LABEL_8154",
"LABEL_8155",
"LABEL_8156",
"LABEL_8157",
"LABEL_8158",
"LABEL_8159",
"LABEL_816",
"LABEL_8160",
"LABEL_8161",
"LABEL_8162",
"LABEL_8163",
"LABEL_8164",
"LABEL_8165",
"LABEL_8166",
"LABEL_8167",
"LABEL_8168",
"LABEL_8169",
"LABEL_817",
"LABEL_8170",
"LABEL_8171",
"LABEL_8172",
"LABEL_8173",
"LABEL_8174",
"LABEL_8175",
"LABEL_8176",
"LABEL_8177",
"LABEL_8178",
"LABEL_8179",
"LABEL_818",
"LABEL_8180",
"LABEL_8181",
"LABEL_8182",
"LABEL_8183",
"LABEL_8184",
"LABEL_8185",
"LABEL_8186",
"LABEL_8187",
"LABEL_8188",
"LABEL_8189",
"LABEL_819",
"LABEL_8190",
"LABEL_8191",
"LABEL_8192",
"LABEL_8193",
"LABEL_8194",
"LABEL_8195",
"LABEL_8196",
"LABEL_8197",
"LABEL_8198",
"LABEL_8199",
"LABEL_82",
"LABEL_820",
"LABEL_8200",
"LABEL_8201",
"LABEL_8202",
"LABEL_8203",
"LABEL_8204",
"LABEL_8205",
"LABEL_8206",
"LABEL_8207",
"LABEL_8208",
"LABEL_8209",
"LABEL_821",
"LABEL_8210",
"LABEL_8211",
"LABEL_8212",
"LABEL_8213",
"LABEL_8214",
"LABEL_8215",
"LABEL_8216",
"LABEL_8217",
"LABEL_8218",
"LABEL_8219",
"LABEL_822",
"LABEL_8220",
"LABEL_8221",
"LABEL_8222",
"LABEL_8223",
"LABEL_8224",
"LABEL_8225",
"LABEL_8226",
"LABEL_8227",
"LABEL_8228",
"LABEL_8229",
"LABEL_823",
"LABEL_8230",
"LABEL_8231",
"LABEL_8232",
"LABEL_8233",
"LABEL_8234",
"LABEL_8235",
"LABEL_8236",
"LABEL_8237",
"LABEL_8238",
"LABEL_8239",
"LABEL_824",
"LABEL_8240",
"LABEL_8241",
"LABEL_8242",
"LABEL_8243",
"LABEL_8244",
"LABEL_8245",
"LABEL_8246",
"LABEL_8247",
"LABEL_8248",
"LABEL_8249",
"LABEL_825",
"LABEL_8250",
"LABEL_8251",
"LABEL_8252",
"LABEL_8253",
"LABEL_8254",
"LABEL_8255",
"LABEL_8256",
"LABEL_8257",
"LABEL_8258",
"LABEL_8259",
"LABEL_826",
"LABEL_8260",
"LABEL_8261",
"LABEL_8262",
"LABEL_8263",
"LABEL_8264",
"LABEL_8265",
"LABEL_8266",
"LABEL_8267",
"LABEL_8268",
"LABEL_8269",
"LABEL_827",
"LABEL_8270",
"LABEL_8271",
"LABEL_8272",
"LABEL_8273",
"LABEL_8274",
"LABEL_8275",
"LABEL_8276",
"LABEL_8277",
"LABEL_8278",
"LABEL_8279",
"LABEL_828",
"LABEL_8280",
"LABEL_8281",
"LABEL_8282",
"LABEL_8283",
"LABEL_8284",
"LABEL_8285",
"LABEL_8286",
"LABEL_8287",
"LABEL_8288",
"LABEL_8289",
"LABEL_829",
"LABEL_8290",
"LABEL_8291",
"LABEL_8292",
"LABEL_8293",
"LABEL_8294",
"LABEL_8295",
"LABEL_8296",
"LABEL_8297",
"LABEL_8298",
"LABEL_8299",
"LABEL_83",
"LABEL_830",
"LABEL_8300",
"LABEL_8301",
"LABEL_8302",
"LABEL_8303",
"LABEL_8304",
"LABEL_8305",
"LABEL_8306",
"LABEL_8307",
"LABEL_8308",
"LABEL_8309",
"LABEL_831",
"LABEL_8310",
"LABEL_8311",
"LABEL_8312",
"LABEL_8313",
"LABEL_8314",
"LABEL_8315",
"LABEL_8316",
"LABEL_8317",
"LABEL_8318",
"LABEL_8319",
"LABEL_832",
"LABEL_8320",
"LABEL_8321",
"LABEL_8322",
"LABEL_8323",
"LABEL_8324",
"LABEL_8325",
"LABEL_8326",
"LABEL_8327",
"LABEL_8328",
"LABEL_8329",
"LABEL_833",
"LABEL_8330",
"LABEL_8331",
"LABEL_8332",
"LABEL_8333",
"LABEL_8334",
"LABEL_8335",
"LABEL_8336",
"LABEL_8337",
"LABEL_8338",
"LABEL_8339",
"LABEL_834",
"LABEL_8340",
"LABEL_8341",
"LABEL_8342",
"LABEL_8343",
"LABEL_8344",
"LABEL_8345",
"LABEL_8346",
"LABEL_8347",
"LABEL_8348",
"LABEL_8349",
"LABEL_835",
"LABEL_8350",
"LABEL_8351",
"LABEL_8352",
"LABEL_8353",
"LABEL_8354",
"LABEL_8355",
"LABEL_8356",
"LABEL_8357",
"LABEL_8358",
"LABEL_8359",
"LABEL_836",
"LABEL_8360",
"LABEL_8361",
"LABEL_8362",
"LABEL_8363",
"LABEL_8364",
"LABEL_8365",
"LABEL_8366",
"LABEL_8367",
"LABEL_8368",
"LABEL_8369",
"LABEL_837",
"LABEL_8370",
"LABEL_8371",
"LABEL_8372",
"LABEL_8373",
"LABEL_8374",
"LABEL_8375",
"LABEL_8376",
"LABEL_8377",
"LABEL_8378",
"LABEL_8379",
"LABEL_838",
"LABEL_8380",
"LABEL_8381",
"LABEL_8382",
"LABEL_8383",
"LABEL_8384",
"LABEL_8385",
"LABEL_8386",
"LABEL_8387",
"LABEL_8388",
"LABEL_8389",
"LABEL_839",
"LABEL_8390",
"LABEL_8391",
"LABEL_8392",
"LABEL_8393",
"LABEL_8394",
"LABEL_8395",
"LABEL_8396",
"LABEL_8397",
"LABEL_8398",
"LABEL_8399",
"LABEL_84",
"LABEL_840",
"LABEL_8400",
"LABEL_8401",
"LABEL_8402",
"LABEL_8403",
"LABEL_8404",
"LABEL_8405",
"LABEL_8406",
"LABEL_8407",
"LABEL_8408",
"LABEL_8409",
"LABEL_841",
"LABEL_8410",
"LABEL_8411",
"LABEL_8412",
"LABEL_8413",
"LABEL_8414",
"LABEL_8415",
"LABEL_8416",
"LABEL_8417",
"LABEL_8418",
"LABEL_8419",
"LABEL_842",
"LABEL_8420",
"LABEL_8421",
"LABEL_8422",
"LABEL_8423",
"LABEL_8424",
"LABEL_8425",
"LABEL_8426",
"LABEL_8427",
"LABEL_8428",
"LABEL_8429",
"LABEL_843",
"LABEL_8430",
"LABEL_8431",
"LABEL_8432",
"LABEL_8433",
"LABEL_8434",
"LABEL_8435",
"LABEL_8436",
"LABEL_8437",
"LABEL_8438",
"LABEL_8439",
"LABEL_844",
"LABEL_8440",
"LABEL_8441",
"LABEL_8442",
"LABEL_8443",
"LABEL_8444",
"LABEL_8445",
"LABEL_8446",
"LABEL_8447",
"LABEL_8448",
"LABEL_8449",
"LABEL_845",
"LABEL_8450",
"LABEL_8451",
"LABEL_8452",
"LABEL_8453",
"LABEL_8454",
"LABEL_8455",
"LABEL_8456",
"LABEL_8457",
"LABEL_8458",
"LABEL_8459",
"LABEL_846",
"LABEL_8460",
"LABEL_8461",
"LABEL_8462",
"LABEL_8463",
"LABEL_8464",
"LABEL_8465",
"LABEL_8466",
"LABEL_8467",
"LABEL_8468",
"LABEL_8469",
"LABEL_847",
"LABEL_8470",
"LABEL_8471",
"LABEL_8472",
"LABEL_8473",
"LABEL_8474",
"LABEL_8475",
"LABEL_8476",
"LABEL_8477",
"LABEL_8478",
"LABEL_8479",
"LABEL_848",
"LABEL_8480",
"LABEL_8481",
"LABEL_8482",
"LABEL_8483",
"LABEL_8484",
"LABEL_8485",
"LABEL_8486",
"LABEL_8487",
"LABEL_8488",
"LABEL_8489",
"LABEL_849",
"LABEL_8490",
"LABEL_8491",
"LABEL_8492",
"LABEL_8493",
"LABEL_8494",
"LABEL_8495",
"LABEL_8496",
"LABEL_8497",
"LABEL_8498",
"LABEL_8499",
"LABEL_85",
"LABEL_850",
"LABEL_8500",
"LABEL_8501",
"LABEL_8502",
"LABEL_8503",
"LABEL_8504",
"LABEL_8505",
"LABEL_8506",
"LABEL_8507",
"LABEL_8508",
"LABEL_8509",
"LABEL_851",
"LABEL_8510",
"LABEL_8511",
"LABEL_8512",
"LABEL_8513",
"LABEL_8514",
"LABEL_8515",
"LABEL_8516",
"LABEL_8517",
"LABEL_8518",
"LABEL_8519",
"LABEL_852",
"LABEL_8520",
"LABEL_8521",
"LABEL_8522",
"LABEL_8523",
"LABEL_8524",
"LABEL_8525",
"LABEL_8526",
"LABEL_8527",
"LABEL_8528",
"LABEL_8529",
"LABEL_853",
"LABEL_8530",
"LABEL_8531",
"LABEL_8532",
"LABEL_8533",
"LABEL_8534",
"LABEL_8535",
"LABEL_8536",
"LABEL_8537",
"LABEL_8538",
"LABEL_8539",
"LABEL_854",
"LABEL_8540",
"LABEL_8541",
"LABEL_8542",
"LABEL_8543",
"LABEL_8544",
"LABEL_8545",
"LABEL_8546",
"LABEL_8547",
"LABEL_8548",
"LABEL_8549",
"LABEL_855",
"LABEL_8550",
"LABEL_8551",
"LABEL_8552",
"LABEL_8553",
"LABEL_8554",
"LABEL_8555",
"LABEL_8556",
"LABEL_8557",
"LABEL_8558",
"LABEL_8559",
"LABEL_856",
"LABEL_8560",
"LABEL_8561",
"LABEL_8562",
"LABEL_8563",
"LABEL_8564",
"LABEL_8565",
"LABEL_8566",
"LABEL_8567",
"LABEL_8568",
"LABEL_8569",
"LABEL_857",
"LABEL_8570",
"LABEL_8571",
"LABEL_8572",
"LABEL_8573",
"LABEL_8574",
"LABEL_8575",
"LABEL_8576",
"LABEL_8577",
"LABEL_8578",
"LABEL_8579",
"LABEL_858",
"LABEL_8580",
"LABEL_8581",
"LABEL_8582",
"LABEL_8583",
"LABEL_8584",
"LABEL_8585",
"LABEL_8586",
"LABEL_8587",
"LABEL_8588",
"LABEL_8589",
"LABEL_859",
"LABEL_8590",
"LABEL_8591",
"LABEL_8592",
"LABEL_8593",
"LABEL_8594",
"LABEL_8595",
"LABEL_8596",
"LABEL_8597",
"LABEL_8598",
"LABEL_8599",
"LABEL_86",
"LABEL_860",
"LABEL_8600",
"LABEL_8601",
"LABEL_8602",
"LABEL_8603",
"LABEL_8604",
"LABEL_8605",
"LABEL_8606",
"LABEL_8607",
"LABEL_8608",
"LABEL_8609",
"LABEL_861",
"LABEL_8610",
"LABEL_8611",
"LABEL_8612",
"LABEL_8613",
"LABEL_8614",
"LABEL_8615",
"LABEL_8616",
"LABEL_8617",
"LABEL_8618",
"LABEL_8619",
"LABEL_862",
"LABEL_8620",
"LABEL_8621",
"LABEL_8622",
"LABEL_8623",
"LABEL_8624",
"LABEL_8625",
"LABEL_8626",
"LABEL_8627",
"LABEL_8628",
"LABEL_8629",
"LABEL_863",
"LABEL_8630",
"LABEL_8631",
"LABEL_8632",
"LABEL_8633",
"LABEL_8634",
"LABEL_8635",
"LABEL_8636",
"LABEL_8637",
"LABEL_8638",
"LABEL_8639",
"LABEL_864",
"LABEL_8640",
"LABEL_8641",
"LABEL_8642",
"LABEL_8643",
"LABEL_8644",
"LABEL_8645",
"LABEL_8646",
"LABEL_8647",
"LABEL_8648",
"LABEL_8649",
"LABEL_865",
"LABEL_8650",
"LABEL_8651",
"LABEL_8652",
"LABEL_8653",
"LABEL_8654",
"LABEL_8655",
"LABEL_8656",
"LABEL_8657",
"LABEL_8658",
"LABEL_8659",
"LABEL_866",
"LABEL_8660",
"LABEL_8661",
"LABEL_8662",
"LABEL_8663",
"LABEL_8664",
"LABEL_8665",
"LABEL_8666",
"LABEL_8667",
"LABEL_8668",
"LABEL_8669",
"LABEL_867",
"LABEL_8670",
"LABEL_8671",
"LABEL_8672",
"LABEL_8673",
"LABEL_8674",
"LABEL_8675",
"LABEL_8676",
"LABEL_8677",
"LABEL_8678",
"LABEL_8679",
"LABEL_868",
"LABEL_8680",
"LABEL_8681",
"LABEL_8682",
"LABEL_8683",
"LABEL_8684",
"LABEL_8685",
"LABEL_8686",
"LABEL_8687",
"LABEL_8688",
"LABEL_8689",
"LABEL_869",
"LABEL_8690",
"LABEL_8691",
"LABEL_8692",
"LABEL_8693",
"LABEL_8694",
"LABEL_8695",
"LABEL_8696",
"LABEL_8697",
"LABEL_8698",
"LABEL_8699",
"LABEL_87",
"LABEL_870",
"LABEL_8700",
"LABEL_8701",
"LABEL_8702",
"LABEL_8703",
"LABEL_8704",
"LABEL_8705",
"LABEL_8706",
"LABEL_8707",
"LABEL_8708",
"LABEL_8709",
"LABEL_871",
"LABEL_8710",
"LABEL_8711",
"LABEL_8712",
"LABEL_8713",
"LABEL_8714",
"LABEL_8715",
"LABEL_8716",
"LABEL_8717",
"LABEL_8718",
"LABEL_8719",
"LABEL_872",
"LABEL_8720",
"LABEL_8721",
"LABEL_8722",
"LABEL_8723",
"LABEL_8724",
"LABEL_8725",
"LABEL_8726",
"LABEL_8727",
"LABEL_8728",
"LABEL_8729",
"LABEL_873",
"LABEL_8730",
"LABEL_8731",
"LABEL_8732",
"LABEL_8733",
"LABEL_8734",
"LABEL_8735",
"LABEL_8736",
"LABEL_8737",
"LABEL_8738",
"LABEL_8739",
"LABEL_874",
"LABEL_8740",
"LABEL_8741",
"LABEL_8742",
"LABEL_8743",
"LABEL_8744",
"LABEL_8745",
"LABEL_8746",
"LABEL_8747",
"LABEL_8748",
"LABEL_8749",
"LABEL_875",
"LABEL_8750",
"LABEL_8751",
"LABEL_8752",
"LABEL_8753",
"LABEL_8754",
"LABEL_8755",
"LABEL_8756",
"LABEL_8757",
"LABEL_8758",
"LABEL_8759",
"LABEL_876",
"LABEL_8760",
"LABEL_8761",
"LABEL_8762",
"LABEL_8763",
"LABEL_8764",
"LABEL_8765",
"LABEL_8766",
"LABEL_8767",
"LABEL_8768",
"LABEL_8769",
"LABEL_877",
"LABEL_8770",
"LABEL_8771",
"LABEL_8772",
"LABEL_8773",
"LABEL_8774",
"LABEL_8775",
"LABEL_8776",
"LABEL_8777",
"LABEL_8778",
"LABEL_8779",
"LABEL_878",
"LABEL_8780",
"LABEL_8781",
"LABEL_8782",
"LABEL_8783",
"LABEL_8784",
"LABEL_8785",
"LABEL_8786",
"LABEL_8787",
"LABEL_8788",
"LABEL_8789",
"LABEL_879",
"LABEL_8790",
"LABEL_8791",
"LABEL_8792",
"LABEL_8793",
"LABEL_8794",
"LABEL_8795",
"LABEL_8796",
"LABEL_8797",
"LABEL_8798",
"LABEL_8799",
"LABEL_88",
"LABEL_880",
"LABEL_8800",
"LABEL_8801",
"LABEL_8802",
"LABEL_8803",
"LABEL_8804",
"LABEL_8805",
"LABEL_8806",
"LABEL_8807",
"LABEL_8808",
"LABEL_8809",
"LABEL_881",
"LABEL_8810",
"LABEL_8811",
"LABEL_8812",
"LABEL_8813",
"LABEL_8814",
"LABEL_8815",
"LABEL_8816",
"LABEL_8817",
"LABEL_8818",
"LABEL_8819",
"LABEL_882",
"LABEL_8820",
"LABEL_8821",
"LABEL_8822",
"LABEL_8823",
"LABEL_8824",
"LABEL_8825",
"LABEL_8826",
"LABEL_8827",
"LABEL_8828",
"LABEL_8829",
"LABEL_883",
"LABEL_8830",
"LABEL_8831",
"LABEL_8832",
"LABEL_8833",
"LABEL_8834",
"LABEL_8835",
"LABEL_8836",
"LABEL_8837",
"LABEL_8838",
"LABEL_8839",
"LABEL_884",
"LABEL_8840",
"LABEL_8841",
"LABEL_8842",
"LABEL_8843",
"LABEL_8844",
"LABEL_8845",
"LABEL_8846",
"LABEL_8847",
"LABEL_8848",
"LABEL_8849",
"LABEL_885",
"LABEL_8850",
"LABEL_8851",
"LABEL_8852",
"LABEL_8853",
"LABEL_8854",
"LABEL_8855",
"LABEL_8856",
"LABEL_8857",
"LABEL_8858",
"LABEL_8859",
"LABEL_886",
"LABEL_8860",
"LABEL_8861",
"LABEL_8862",
"LABEL_8863",
"LABEL_8864",
"LABEL_8865",
"LABEL_8866",
"LABEL_8867",
"LABEL_8868",
"LABEL_8869",
"LABEL_887",
"LABEL_8870",
"LABEL_8871",
"LABEL_8872",
"LABEL_8873",
"LABEL_8874",
"LABEL_8875",
"LABEL_8876",
"LABEL_8877",
"LABEL_8878",
"LABEL_8879",
"LABEL_888",
"LABEL_8880",
"LABEL_8881",
"LABEL_8882",
"LABEL_8883",
"LABEL_8884",
"LABEL_8885",
"LABEL_8886",
"LABEL_8887",
"LABEL_8888",
"LABEL_8889",
"LABEL_889",
"LABEL_8890",
"LABEL_8891",
"LABEL_8892",
"LABEL_8893",
"LABEL_8894",
"LABEL_8895",
"LABEL_8896",
"LABEL_8897",
"LABEL_8898",
"LABEL_8899",
"LABEL_89",
"LABEL_890",
"LABEL_8900",
"LABEL_8901",
"LABEL_8902",
"LABEL_8903",
"LABEL_8904",
"LABEL_8905",
"LABEL_8906",
"LABEL_8907",
"LABEL_8908",
"LABEL_8909",
"LABEL_891",
"LABEL_8910",
"LABEL_8911",
"LABEL_8912",
"LABEL_8913",
"LABEL_8914",
"LABEL_8915",
"LABEL_8916",
"LABEL_8917",
"LABEL_8918",
"LABEL_8919",
"LABEL_892",
"LABEL_8920",
"LABEL_8921",
"LABEL_8922",
"LABEL_8923",
"LABEL_8924",
"LABEL_8925",
"LABEL_8926",
"LABEL_8927",
"LABEL_8928",
"LABEL_8929",
"LABEL_893",
"LABEL_8930",
"LABEL_8931",
"LABEL_8932",
"LABEL_8933",
"LABEL_8934",
"LABEL_8935",
"LABEL_8936",
"LABEL_8937",
"LABEL_8938",
"LABEL_8939",
"LABEL_894",
"LABEL_8940",
"LABEL_8941",
"LABEL_8942",
"LABEL_8943",
"LABEL_8944",
"LABEL_8945",
"LABEL_8946",
"LABEL_8947",
"LABEL_8948",
"LABEL_8949",
"LABEL_895",
"LABEL_8950",
"LABEL_8951",
"LABEL_8952",
"LABEL_8953",
"LABEL_8954",
"LABEL_8955",
"LABEL_8956",
"LABEL_8957",
"LABEL_8958",
"LABEL_8959",
"LABEL_896",
"LABEL_8960",
"LABEL_8961",
"LABEL_8962",
"LABEL_8963",
"LABEL_8964",
"LABEL_8965",
"LABEL_8966",
"LABEL_8967",
"LABEL_8968",
"LABEL_8969",
"LABEL_897",
"LABEL_8970",
"LABEL_8971",
"LABEL_8972",
"LABEL_8973",
"LABEL_8974",
"LABEL_8975",
"LABEL_8976",
"LABEL_8977",
"LABEL_8978",
"LABEL_8979",
"LABEL_898",
"LABEL_8980",
"LABEL_8981",
"LABEL_8982",
"LABEL_8983",
"LABEL_8984",
"LABEL_8985",
"LABEL_8986",
"LABEL_8987",
"LABEL_8988",
"LABEL_8989",
"LABEL_899",
"LABEL_8990",
"LABEL_8991",
"LABEL_8992",
"LABEL_8993",
"LABEL_8994",
"LABEL_8995",
"LABEL_8996",
"LABEL_8997",
"LABEL_8998",
"LABEL_8999",
"LABEL_9",
"LABEL_90",
"LABEL_900",
"LABEL_9000",
"LABEL_9001",
"LABEL_9002",
"LABEL_9003",
"LABEL_9004",
"LABEL_9005",
"LABEL_9006",
"LABEL_9007",
"LABEL_9008",
"LABEL_9009",
"LABEL_901",
"LABEL_9010",
"LABEL_9011",
"LABEL_9012",
"LABEL_9013",
"LABEL_9014",
"LABEL_9015",
"LABEL_9016",
"LABEL_9017",
"LABEL_9018",
"LABEL_9019",
"LABEL_902",
"LABEL_9020",
"LABEL_9021",
"LABEL_9022",
"LABEL_9023",
"LABEL_9024",
"LABEL_9025",
"LABEL_9026",
"LABEL_9027",
"LABEL_9028",
"LABEL_9029",
"LABEL_903",
"LABEL_9030",
"LABEL_9031",
"LABEL_9032",
"LABEL_9033",
"LABEL_9034",
"LABEL_9035",
"LABEL_9036",
"LABEL_9037",
"LABEL_9038",
"LABEL_9039",
"LABEL_904",
"LABEL_9040",
"LABEL_9041",
"LABEL_9042",
"LABEL_9043",
"LABEL_9044",
"LABEL_9045",
"LABEL_9046",
"LABEL_9047",
"LABEL_9048",
"LABEL_9049",
"LABEL_905",
"LABEL_9050",
"LABEL_9051",
"LABEL_9052",
"LABEL_9053",
"LABEL_9054",
"LABEL_9055",
"LABEL_9056",
"LABEL_9057",
"LABEL_9058",
"LABEL_9059",
"LABEL_906",
"LABEL_9060",
"LABEL_9061",
"LABEL_9062",
"LABEL_9063",
"LABEL_9064",
"LABEL_9065",
"LABEL_9066",
"LABEL_9067",
"LABEL_9068",
"LABEL_9069",
"LABEL_907",
"LABEL_9070",
"LABEL_9071",
"LABEL_9072",
"LABEL_9073",
"LABEL_9074",
"LABEL_9075",
"LABEL_9076",
"LABEL_9077",
"LABEL_9078",
"LABEL_9079",
"LABEL_908",
"LABEL_9080",
"LABEL_9081",
"LABEL_9082",
"LABEL_9083",
"LABEL_9084",
"LABEL_9085",
"LABEL_9086",
"LABEL_9087",
"LABEL_9088",
"LABEL_9089",
"LABEL_909",
"LABEL_9090",
"LABEL_9091",
"LABEL_9092",
"LABEL_9093",
"LABEL_9094",
"LABEL_9095",
"LABEL_9096",
"LABEL_9097",
"LABEL_9098",
"LABEL_9099",
"LABEL_91",
"LABEL_910",
"LABEL_9100",
"LABEL_9101",
"LABEL_9102",
"LABEL_9103",
"LABEL_9104",
"LABEL_9105",
"LABEL_9106",
"LABEL_9107",
"LABEL_9108",
"LABEL_9109",
"LABEL_911",
"LABEL_9110",
"LABEL_9111",
"LABEL_9112",
"LABEL_9113",
"LABEL_9114",
"LABEL_9115",
"LABEL_9116",
"LABEL_9117",
"LABEL_9118",
"LABEL_9119",
"LABEL_912",
"LABEL_9120",
"LABEL_9121",
"LABEL_9122",
"LABEL_9123",
"LABEL_9124",
"LABEL_9125",
"LABEL_9126",
"LABEL_9127",
"LABEL_9128",
"LABEL_9129",
"LABEL_913",
"LABEL_9130",
"LABEL_9131",
"LABEL_9132",
"LABEL_9133",
"LABEL_9134",
"LABEL_9135",
"LABEL_9136",
"LABEL_9137",
"LABEL_9138",
"LABEL_9139",
"LABEL_914",
"LABEL_9140",
"LABEL_9141",
"LABEL_9142",
"LABEL_9143",
"LABEL_9144",
"LABEL_9145",
"LABEL_9146",
"LABEL_9147",
"LABEL_9148",
"LABEL_9149",
"LABEL_915",
"LABEL_9150",
"LABEL_9151",
"LABEL_9152",
"LABEL_9153",
"LABEL_9154",
"LABEL_9155",
"LABEL_9156",
"LABEL_9157",
"LABEL_9158",
"LABEL_9159",
"LABEL_916",
"LABEL_9160",
"LABEL_9161",
"LABEL_9162",
"LABEL_9163",
"LABEL_9164",
"LABEL_9165",
"LABEL_9166",
"LABEL_9167",
"LABEL_9168",
"LABEL_9169",
"LABEL_917",
"LABEL_9170",
"LABEL_9171",
"LABEL_9172",
"LABEL_9173",
"LABEL_9174",
"LABEL_9175",
"LABEL_9176",
"LABEL_9177",
"LABEL_9178",
"LABEL_9179",
"LABEL_918",
"LABEL_9180",
"LABEL_9181",
"LABEL_9182",
"LABEL_9183",
"LABEL_9184",
"LABEL_9185",
"LABEL_9186",
"LABEL_9187",
"LABEL_9188",
"LABEL_9189",
"LABEL_919",
"LABEL_9190",
"LABEL_9191",
"LABEL_9192",
"LABEL_9193",
"LABEL_9194",
"LABEL_9195",
"LABEL_9196",
"LABEL_9197",
"LABEL_9198",
"LABEL_9199",
"LABEL_92",
"LABEL_920",
"LABEL_9200",
"LABEL_9201",
"LABEL_9202",
"LABEL_9203",
"LABEL_9204",
"LABEL_9205",
"LABEL_9206",
"LABEL_9207",
"LABEL_9208",
"LABEL_9209",
"LABEL_921",
"LABEL_9210",
"LABEL_9211",
"LABEL_9212",
"LABEL_9213",
"LABEL_9214",
"LABEL_9215",
"LABEL_9216",
"LABEL_9217",
"LABEL_9218",
"LABEL_9219",
"LABEL_922",
"LABEL_9220",
"LABEL_9221",
"LABEL_9222",
"LABEL_9223",
"LABEL_9224",
"LABEL_9225",
"LABEL_9226",
"LABEL_9227",
"LABEL_9228",
"LABEL_9229",
"LABEL_923",
"LABEL_9230",
"LABEL_9231",
"LABEL_9232",
"LABEL_9233",
"LABEL_9234",
"LABEL_9235",
"LABEL_9236",
"LABEL_9237",
"LABEL_9238",
"LABEL_9239",
"LABEL_924",
"LABEL_9240",
"LABEL_9241",
"LABEL_9242",
"LABEL_9243",
"LABEL_9244",
"LABEL_9245",
"LABEL_9246",
"LABEL_9247",
"LABEL_9248",
"LABEL_9249",
"LABEL_925",
"LABEL_9250",
"LABEL_9251",
"LABEL_9252",
"LABEL_9253",
"LABEL_9254",
"LABEL_9255",
"LABEL_9256",
"LABEL_9257",
"LABEL_9258",
"LABEL_9259",
"LABEL_926",
"LABEL_9260",
"LABEL_9261",
"LABEL_9262",
"LABEL_9263",
"LABEL_9264",
"LABEL_9265",
"LABEL_9266",
"LABEL_9267",
"LABEL_9268",
"LABEL_9269",
"LABEL_927",
"LABEL_9270",
"LABEL_9271",
"LABEL_9272",
"LABEL_9273",
"LABEL_9274",
"LABEL_9275",
"LABEL_9276",
"LABEL_9277",
"LABEL_9278",
"LABEL_9279",
"LABEL_928",
"LABEL_9280",
"LABEL_9281",
"LABEL_9282",
"LABEL_9283",
"LABEL_9284",
"LABEL_9285",
"LABEL_9286",
"LABEL_9287",
"LABEL_9288",
"LABEL_9289",
"LABEL_929",
"LABEL_9290",
"LABEL_9291",
"LABEL_9292",
"LABEL_9293",
"LABEL_9294",
"LABEL_9295",
"LABEL_9296",
"LABEL_9297",
"LABEL_9298",
"LABEL_9299",
"LABEL_93",
"LABEL_930",
"LABEL_9300",
"LABEL_9301",
"LABEL_9302",
"LABEL_9303",
"LABEL_9304",
"LABEL_9305",
"LABEL_9306",
"LABEL_9307",
"LABEL_9308",
"LABEL_9309",
"LABEL_931",
"LABEL_9310",
"LABEL_9311",
"LABEL_9312",
"LABEL_9313",
"LABEL_9314",
"LABEL_9315",
"LABEL_9316",
"LABEL_9317",
"LABEL_9318",
"LABEL_9319",
"LABEL_932",
"LABEL_9320",
"LABEL_9321",
"LABEL_9322",
"LABEL_9323",
"LABEL_9324",
"LABEL_9325",
"LABEL_9326",
"LABEL_9327",
"LABEL_9328",
"LABEL_9329",
"LABEL_933",
"LABEL_9330",
"LABEL_9331",
"LABEL_9332",
"LABEL_9333",
"LABEL_9334",
"LABEL_9335",
"LABEL_9336",
"LABEL_9337",
"LABEL_9338",
"LABEL_9339",
"LABEL_934",
"LABEL_9340",
"LABEL_9341",
"LABEL_9342",
"LABEL_9343",
"LABEL_9344",
"LABEL_9345",
"LABEL_9346",
"LABEL_9347",
"LABEL_9348",
"LABEL_9349",
"LABEL_935",
"LABEL_9350",
"LABEL_9351",
"LABEL_9352",
"LABEL_9353",
"LABEL_9354",
"LABEL_9355",
"LABEL_9356",
"LABEL_9357",
"LABEL_9358",
"LABEL_9359",
"LABEL_936",
"LABEL_9360",
"LABEL_9361",
"LABEL_9362",
"LABEL_9363",
"LABEL_9364",
"LABEL_9365",
"LABEL_9366",
"LABEL_9367",
"LABEL_9368",
"LABEL_9369",
"LABEL_937",
"LABEL_9370",
"LABEL_9371",
"LABEL_9372",
"LABEL_9373",
"LABEL_9374",
"LABEL_9375",
"LABEL_9376",
"LABEL_9377",
"LABEL_9378",
"LABEL_9379",
"LABEL_938",
"LABEL_9380",
"LABEL_9381",
"LABEL_9382",
"LABEL_9383",
"LABEL_9384",
"LABEL_9385",
"LABEL_9386",
"LABEL_9387",
"LABEL_9388",
"LABEL_9389",
"LABEL_939",
"LABEL_9390",
"LABEL_9391",
"LABEL_9392",
"LABEL_9393",
"LABEL_9394",
"LABEL_9395",
"LABEL_9396",
"LABEL_9397",
"LABEL_9398",
"LABEL_9399",
"LABEL_94",
"LABEL_940",
"LABEL_9400",
"LABEL_9401",
"LABEL_9402",
"LABEL_9403",
"LABEL_9404",
"LABEL_9405",
"LABEL_9406",
"LABEL_9407",
"LABEL_9408",
"LABEL_9409",
"LABEL_941",
"LABEL_9410",
"LABEL_9411",
"LABEL_9412",
"LABEL_9413",
"LABEL_9414",
"LABEL_9415",
"LABEL_9416",
"LABEL_9417",
"LABEL_9418",
"LABEL_9419",
"LABEL_942",
"LABEL_9420",
"LABEL_9421",
"LABEL_9422",
"LABEL_9423",
"LABEL_9424",
"LABEL_9425",
"LABEL_9426",
"LABEL_9427",
"LABEL_9428",
"LABEL_9429",
"LABEL_943",
"LABEL_9430",
"LABEL_9431",
"LABEL_9432",
"LABEL_9433",
"LABEL_9434",
"LABEL_9435",
"LABEL_9436",
"LABEL_9437",
"LABEL_9438",
"LABEL_9439",
"LABEL_944",
"LABEL_9440",
"LABEL_9441",
"LABEL_9442",
"LABEL_9443",
"LABEL_9444",
"LABEL_9445",
"LABEL_9446",
"LABEL_9447",
"LABEL_9448",
"LABEL_9449",
"LABEL_945",
"LABEL_9450",
"LABEL_9451",
"LABEL_9452",
"LABEL_9453",
"LABEL_9454",
"LABEL_9455",
"LABEL_9456",
"LABEL_9457",
"LABEL_9458",
"LABEL_9459",
"LABEL_946",
"LABEL_9460",
"LABEL_9461",
"LABEL_9462",
"LABEL_9463",
"LABEL_9464",
"LABEL_9465",
"LABEL_9466",
"LABEL_9467",
"LABEL_9468",
"LABEL_9469",
"LABEL_947",
"LABEL_9470",
"LABEL_9471",
"LABEL_9472",
"LABEL_9473",
"LABEL_9474",
"LABEL_9475",
"LABEL_9476",
"LABEL_9477",
"LABEL_9478",
"LABEL_9479",
"LABEL_948",
"LABEL_9480",
"LABEL_9481",
"LABEL_9482",
"LABEL_9483",
"LABEL_9484",
"LABEL_9485",
"LABEL_9486",
"LABEL_9487",
"LABEL_9488",
"LABEL_9489",
"LABEL_949",
"LABEL_9490",
"LABEL_9491",
"LABEL_9492",
"LABEL_9493",
"LABEL_9494",
"LABEL_9495",
"LABEL_9496",
"LABEL_9497",
"LABEL_9498",
"LABEL_9499",
"LABEL_95",
"LABEL_950",
"LABEL_9500",
"LABEL_9501",
"LABEL_9502",
"LABEL_9503",
"LABEL_9504",
"LABEL_9505",
"LABEL_9506",
"LABEL_9507",
"LABEL_9508",
"LABEL_9509",
"LABEL_951",
"LABEL_9510",
"LABEL_9511",
"LABEL_9512",
"LABEL_9513",
"LABEL_9514",
"LABEL_9515",
"LABEL_9516",
"LABEL_9517",
"LABEL_9518",
"LABEL_9519",
"LABEL_952",
"LABEL_9520",
"LABEL_9521",
"LABEL_9522",
"LABEL_9523",
"LABEL_9524",
"LABEL_9525",
"LABEL_9526",
"LABEL_9527",
"LABEL_9528",
"LABEL_9529",
"LABEL_953",
"LABEL_9530",
"LABEL_9531",
"LABEL_9532",
"LABEL_9533",
"LABEL_9534",
"LABEL_9535",
"LABEL_9536",
"LABEL_9537",
"LABEL_9538",
"LABEL_9539",
"LABEL_954",
"LABEL_9540",
"LABEL_9541",
"LABEL_9542",
"LABEL_9543",
"LABEL_9544",
"LABEL_9545",
"LABEL_9546",
"LABEL_9547",
"LABEL_9548",
"LABEL_9549",
"LABEL_955",
"LABEL_9550",
"LABEL_9551",
"LABEL_9552",
"LABEL_9553",
"LABEL_9554",
"LABEL_9555",
"LABEL_9556",
"LABEL_9557",
"LABEL_9558",
"LABEL_9559",
"LABEL_956",
"LABEL_9560",
"LABEL_9561",
"LABEL_9562",
"LABEL_9563",
"LABEL_9564",
"LABEL_9565",
"LABEL_9566",
"LABEL_9567",
"LABEL_9568",
"LABEL_9569",
"LABEL_957",
"LABEL_9570",
"LABEL_9571",
"LABEL_9572",
"LABEL_9573",
"LABEL_9574",
"LABEL_9575",
"LABEL_9576",
"LABEL_9577",
"LABEL_9578",
"LABEL_9579",
"LABEL_958",
"LABEL_9580",
"LABEL_9581",
"LABEL_9582",
"LABEL_9583",
"LABEL_9584",
"LABEL_9585",
"LABEL_9586",
"LABEL_9587",
"LABEL_9588",
"LABEL_9589",
"LABEL_959",
"LABEL_9590",
"LABEL_9591",
"LABEL_9592",
"LABEL_9593",
"LABEL_9594",
"LABEL_9595",
"LABEL_9596",
"LABEL_9597",
"LABEL_9598",
"LABEL_9599",
"LABEL_96",
"LABEL_960",
"LABEL_9600",
"LABEL_9601",
"LABEL_9602",
"LABEL_9603",
"LABEL_9604",
"LABEL_9605",
"LABEL_9606",
"LABEL_9607",
"LABEL_9608",
"LABEL_9609",
"LABEL_961",
"LABEL_9610",
"LABEL_9611",
"LABEL_9612",
"LABEL_9613",
"LABEL_9614",
"LABEL_9615",
"LABEL_9616",
"LABEL_9617",
"LABEL_9618",
"LABEL_9619",
"LABEL_962",
"LABEL_9620",
"LABEL_9621",
"LABEL_9622",
"LABEL_9623",
"LABEL_9624",
"LABEL_9625",
"LABEL_9626",
"LABEL_9627",
"LABEL_9628",
"LABEL_9629",
"LABEL_963",
"LABEL_9630",
"LABEL_9631",
"LABEL_9632",
"LABEL_9633",
"LABEL_9634",
"LABEL_9635",
"LABEL_9636",
"LABEL_9637",
"LABEL_9638",
"LABEL_9639",
"LABEL_964",
"LABEL_9640",
"LABEL_9641",
"LABEL_9642",
"LABEL_9643",
"LABEL_9644",
"LABEL_9645",
"LABEL_9646",
"LABEL_9647",
"LABEL_9648",
"LABEL_9649",
"LABEL_965",
"LABEL_9650",
"LABEL_9651",
"LABEL_9652",
"LABEL_9653",
"LABEL_9654",
"LABEL_9655",
"LABEL_9656",
"LABEL_9657",
"LABEL_9658",
"LABEL_9659",
"LABEL_966",
"LABEL_9660",
"LABEL_9661",
"LABEL_9662",
"LABEL_9663",
"LABEL_9664",
"LABEL_9665",
"LABEL_9666",
"LABEL_9667",
"LABEL_9668",
"LABEL_9669",
"LABEL_967",
"LABEL_9670",
"LABEL_9671",
"LABEL_9672",
"LABEL_9673",
"LABEL_9674",
"LABEL_9675",
"LABEL_9676",
"LABEL_9677",
"LABEL_9678",
"LABEL_9679",
"LABEL_968",
"LABEL_9680",
"LABEL_9681",
"LABEL_9682",
"LABEL_9683",
"LABEL_9684",
"LABEL_9685",
"LABEL_9686",
"LABEL_9687",
"LABEL_9688",
"LABEL_9689",
"LABEL_969",
"LABEL_9690",
"LABEL_9691",
"LABEL_9692",
"LABEL_9693",
"LABEL_9694",
"LABEL_9695",
"LABEL_9696",
"LABEL_9697",
"LABEL_9698",
"LABEL_9699",
"LABEL_97",
"LABEL_970",
"LABEL_9700",
"LABEL_9701",
"LABEL_9702",
"LABEL_9703",
"LABEL_9704",
"LABEL_9705",
"LABEL_9706",
"LABEL_9707",
"LABEL_9708",
"LABEL_9709",
"LABEL_971",
"LABEL_9710",
"LABEL_9711",
"LABEL_9712",
"LABEL_9713",
"LABEL_9714",
"LABEL_9715",
"LABEL_9716",
"LABEL_9717",
"LABEL_9718",
"LABEL_9719",
"LABEL_972",
"LABEL_9720",
"LABEL_9721",
"LABEL_9722",
"LABEL_9723",
"LABEL_9724",
"LABEL_9725",
"LABEL_9726",
"LABEL_9727",
"LABEL_9728",
"LABEL_9729",
"LABEL_973",
"LABEL_9730",
"LABEL_9731",
"LABEL_9732",
"LABEL_9733",
"LABEL_9734",
"LABEL_9735",
"LABEL_9736",
"LABEL_9737",
"LABEL_9738",
"LABEL_9739",
"LABEL_974",
"LABEL_9740",
"LABEL_9741",
"LABEL_9742",
"LABEL_9743",
"LABEL_9744",
"LABEL_9745",
"LABEL_9746",
"LABEL_9747",
"LABEL_9748",
"LABEL_9749",
"LABEL_975",
"LABEL_9750",
"LABEL_9751",
"LABEL_9752",
"LABEL_9753",
"LABEL_9754",
"LABEL_9755",
"LABEL_9756",
"LABEL_9757",
"LABEL_9758",
"LABEL_9759",
"LABEL_976",
"LABEL_9760",
"LABEL_9761",
"LABEL_9762",
"LABEL_9763",
"LABEL_9764",
"LABEL_9765",
"LABEL_9766",
"LABEL_9767",
"LABEL_9768",
"LABEL_9769",
"LABEL_977",
"LABEL_9770",
"LABEL_9771",
"LABEL_9772",
"LABEL_9773",
"LABEL_9774",
"LABEL_9775",
"LABEL_9776",
"LABEL_9777",
"LABEL_9778",
"LABEL_9779",
"LABEL_978",
"LABEL_9780",
"LABEL_9781",
"LABEL_9782",
"LABEL_9783",
"LABEL_9784",
"LABEL_9785",
"LABEL_9786",
"LABEL_9787",
"LABEL_9788",
"LABEL_9789",
"LABEL_979",
"LABEL_9790",
"LABEL_9791",
"LABEL_9792",
"LABEL_9793",
"LABEL_9794",
"LABEL_9795",
"LABEL_9796",
"LABEL_9797",
"LABEL_9798",
"LABEL_9799",
"LABEL_98",
"LABEL_980",
"LABEL_9800",
"LABEL_9801",
"LABEL_9802",
"LABEL_9803",
"LABEL_9804",
"LABEL_9805",
"LABEL_9806",
"LABEL_9807",
"LABEL_9808",
"LABEL_9809",
"LABEL_981",
"LABEL_9810",
"LABEL_9811",
"LABEL_9812",
"LABEL_9813",
"LABEL_9814",
"LABEL_9815",
"LABEL_9816",
"LABEL_9817",
"LABEL_9818",
"LABEL_9819",
"LABEL_982",
"LABEL_9820",
"LABEL_9821",
"LABEL_9822",
"LABEL_9823",
"LABEL_9824",
"LABEL_9825",
"LABEL_9826",
"LABEL_9827",
"LABEL_9828",
"LABEL_9829",
"LABEL_983",
"LABEL_9830",
"LABEL_9831",
"LABEL_9832",
"LABEL_9833",
"LABEL_9834",
"LABEL_9835",
"LABEL_9836",
"LABEL_9837",
"LABEL_9838",
"LABEL_9839",
"LABEL_984",
"LABEL_9840",
"LABEL_9841",
"LABEL_9842",
"LABEL_9843",
"LABEL_9844",
"LABEL_9845",
"LABEL_9846",
"LABEL_9847",
"LABEL_9848",
"LABEL_9849",
"LABEL_985",
"LABEL_9850",
"LABEL_9851",
"LABEL_9852",
"LABEL_9853",
"LABEL_9854",
"LABEL_9855",
"LABEL_9856",
"LABEL_9857",
"LABEL_9858",
"LABEL_9859",
"LABEL_986",
"LABEL_9860",
"LABEL_9861",
"LABEL_9862",
"LABEL_9863",
"LABEL_9864",
"LABEL_9865",
"LABEL_9866",
"LABEL_9867",
"LABEL_9868",
"LABEL_9869",
"LABEL_987",
"LABEL_9870",
"LABEL_9871",
"LABEL_9872",
"LABEL_9873",
"LABEL_9874",
"LABEL_9875",
"LABEL_9876",
"LABEL_9877",
"LABEL_9878",
"LABEL_9879",
"LABEL_988",
"LABEL_9880",
"LABEL_9881",
"LABEL_9882",
"LABEL_9883",
"LABEL_9884",
"LABEL_9885",
"LABEL_9886",
"LABEL_9887",
"LABEL_9888",
"LABEL_9889",
"LABEL_989",
"LABEL_9890",
"LABEL_9891",
"LABEL_9892",
"LABEL_9893",
"LABEL_9894",
"LABEL_9895",
"LABEL_9896",
"LABEL_9897",
"LABEL_9898",
"LABEL_9899",
"LABEL_99",
"LABEL_990",
"LABEL_9900",
"LABEL_9901",
"LABEL_9902",
"LABEL_9903",
"LABEL_9904",
"LABEL_9905",
"LABEL_9906",
"LABEL_9907",
"LABEL_9908",
"LABEL_9909",
"LABEL_991",
"LABEL_9910",
"LABEL_9911",
"LABEL_9912",
"LABEL_9913",
"LABEL_9914",
"LABEL_9915",
"LABEL_9916",
"LABEL_9917",
"LABEL_9918",
"LABEL_9919",
"LABEL_992",
"LABEL_9920",
"LABEL_9921",
"LABEL_9922",
"LABEL_9923",
"LABEL_9924",
"LABEL_9925",
"LABEL_9926",
"LABEL_9927",
"LABEL_9928",
"LABEL_9929",
"LABEL_993",
"LABEL_9930",
"LABEL_9931",
"LABEL_9932",
"LABEL_9933",
"LABEL_9934",
"LABEL_9935",
"LABEL_9936",
"LABEL_9937",
"LABEL_9938",
"LABEL_9939",
"LABEL_994",
"LABEL_9940",
"LABEL_9941",
"LABEL_9942",
"LABEL_9943",
"LABEL_9944",
"LABEL_9945",
"LABEL_9946",
"LABEL_9947",
"LABEL_9948",
"LABEL_9949",
"LABEL_995",
"LABEL_9950",
"LABEL_9951",
"LABEL_9952",
"LABEL_9953",
"LABEL_9954",
"LABEL_9955",
"LABEL_9956",
"LABEL_9957",
"LABEL_9958",
"LABEL_9959",
"LABEL_996",
"LABEL_9960",
"LABEL_9961",
"LABEL_9962",
"LABEL_9963",
"LABEL_9964",
"LABEL_9965",
"LABEL_9966",
"LABEL_9967",
"LABEL_9968",
"LABEL_9969",
"LABEL_997",
"LABEL_9970",
"LABEL_9971",
"LABEL_9972",
"LABEL_9973",
"LABEL_9974",
"LABEL_9975",
"LABEL_9976",
"LABEL_9977",
"LABEL_9978",
"LABEL_9979",
"LABEL_998",
"LABEL_9980",
"LABEL_9981",
"LABEL_9982",
"LABEL_9983",
"LABEL_9984",
"LABEL_9985",
"LABEL_9986",
"LABEL_9987",
"LABEL_9988",
"LABEL_9989",
"LABEL_999",
"LABEL_9990",
"LABEL_9991",
"LABEL_9992",
"LABEL_9993",
"LABEL_9994",
"LABEL_9995",
"LABEL_9996",
"LABEL_9997",
"LABEL_9998",
"LABEL_9999"
] | ---
license: apache-2.0
pipeline_tag: text-classification
---
# WellcomeBertMesh
WellcomeBertMesh was built by the data science team at the Wellcome Trust to tag biomedical grants with Medical Subject Headings ([MeSH](https://www.nlm.nih.gov/mesh/meshhome.html)). Although it was developed with research grants in mind, it should be applicable to any biomedical text close to the domain it was trained on, namely abstracts from biomedical publications.
# Model description
The model is inspired by [BertMesh](https://pubmed.ncbi.nlm.nih.gov/32976559/), which is trained on the full text of biomedical publications and uses BioBERT as its pretrained model.
WellcomeBertMesh is utilising the latest state of the art model in the biomedical domain which is [PubMedBert](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract) from Microsoft and attach a Multilabel attention head which essentially allows the model to pay attention to different tokens per label to decide whether it applies.
We train the model using data from the [BioASQ](http://bioasq.org) competition, which consists of abstracts from PubMed publications. We use 2016-2019 data for training and 2020-2021 data for testing, which gives us ~2.5M publications for training and 220K for testing, out of a total of 14M publications. Training WellcomeBertMesh takes 4 days on 8 Nvidia P100 GPUs.
The model achieves 63% micro F1 with a 0.5 threshold for all labels.
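For reference, micro F1 in the multilabel setting pools true positives, false positives, and false negatives across all labels and documents before computing a single F1 score. A minimal sketch (the gold and predicted MeSH terms below are made up for illustration):

```python
def micro_f1(true_sets, pred_sets):
    # Pool counts over all documents and labels, then compute F1 once
    tp = sum(len(t & p) for t, p in zip(true_sets, pred_sets))
    fp = sum(len(p - t) for t, p in zip(true_sets, pred_sets))
    fn = sum(len(t - p) for t, p in zip(true_sets, pred_sets))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical gold vs. predicted MeSH terms for two abstracts
gold = [{"Malaria", "Humans"}, {"HIV Infections"}]
pred = [{"Malaria"}, {"HIV Infections", "Humans"}]
print(micro_f1(gold, pred))  # tp=2, fp=1, fn=1 -> 0.666...
```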
The code for developing the model is open source and can be found at https://github.com/wellcometrust/grants_tagger
# How to use
⚠️ You need transformers 4.17+ for the example to work due to its recent support for custom models.
You can use the model straight from the hub, but because it contains a custom forward function (due to the multilabel attention head) you have to pass `trust_remote_code=True`. You can get the probabilities for all labels by omitting `return_labels=True`.
```
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained(
"Wellcome/WellcomeBertMesh"
)
model = AutoModel.from_pretrained(
"Wellcome/WellcomeBertMesh",
trust_remote_code=True
)
text = "This grant is about malaria and not about HIV."
inputs = tokenizer([text], padding="max_length")
labels = model(**inputs, return_labels=True)
print(labels)
```
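When `return_labels=True` is omitted, the head yields one probability per MeSH label. As a hedged sketch (the id-to-label mapping and probabilities below are made up, not the model's real vocabulary), turning those probabilities into label strings with the 0.5 threshold mentioned above could look like:

```python
def probs_to_labels(probs, id2label, threshold=0.5):
    # Keep every label whose probability clears the threshold
    return [id2label[i] for i, p in enumerate(probs) if p >= threshold]

# Hypothetical id -> MeSH term mapping and per-label probabilities
id2label = {0: "Malaria", 1: "HIV Infections", 2: "Humans"}
probs = [0.91, 0.12, 0.73]
print(probs_to_labels(probs, id2label))  # -> ['Malaria', 'Humans']
```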
You can inspect the model code by navigating to the repository files and opening `model.py`. |
533 | Worldman/distilbert-base-uncased-finetuned-emotion | [
"LABEL_0",
"LABEL_1",
"LABEL_2",
"LABEL_3",
"LABEL_4",
"LABEL_5"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.9225
- name: F1
type: f1
value: 0.9227046184638882
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2162
- Accuracy: 0.9225
- F1: 0.9227
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8437 | 1.0 | 250 | 0.3153 | 0.903 | 0.9005 |
| 0.2467 | 2.0 | 500 | 0.2162 | 0.9225 | 0.9227 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.2+cpu
- Datasets 1.18.3
- Tokenizers 0.11.0
|
538 | XYHY/autonlp-123-478412765 | [
"0",
"1"
] | ---
tags: autonlp
language: unk
widget:
- text: "I love AutoNLP 🤗"
datasets:
- XYHY/autonlp-data-123
co2_eq_emissions: 69.86520391863117
---
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 478412765
- CO2 Emissions (in grams): 69.86520391863117
## Validation Metrics
- Loss: 0.186362624168396
- Accuracy: 0.9539955699437723
- Precision: 0.9527454242928453
- Recall: 0.9572049481778669
- AUC: 0.9903929997079495
- F1: 0.9549699799866577
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/XYHY/autonlp-123-478412765
```
Or Python API:
```
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("XYHY/autonlp-123-478412765", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("XYHY/autonlp-123-478412765", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
``` |
540 | Yah216/Sentiment_Analysis_CAMelBERT_msa_sixteenth_HARD | [
"NEGATIVE",
"NEUTRAL",
"POSITIVE"
] | ---
language: ar
widget:
- text: "ممتاز"
- text: "أنا حزين"
- text: "لا شيء"
---
# Model description
This model is a pretrained Arabic-language sentiment analysis model.
It is built on top of the CAMelBERT_msa_sixteenth BERT-based model.
We used the HARD dataset of hotel reviews to fine-tune the model.
The dataset's original five-star rating labels were mapped to three labels:
- POSITIVE: for ratings > 3 stars
- NEUTRAL: for a 3-star rating
- NEGATIVE: for ratings < 3 stars
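The mapping above can be sketched as a small helper (the function name is ours, for illustration only):

```python
def stars_to_label(stars: int) -> str:
    # Map HARD's five-star ratings onto the three sentiment labels
    if stars > 3:
        return "POSITIVE"
    if stars == 3:
        return "NEUTRAL"
    return "NEGATIVE"

print([stars_to_label(s) for s in (1, 3, 5)])  # -> ['NEGATIVE', 'NEUTRAL', 'POSITIVE']
```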
This first prototype was trained for 3 epochs over 1 hour using Colab with TPU acceleration.
# Examples
Here are some examples in Arabic to test:
- Excellent -> ممتاز (Positive)
- I am sad -> أنا حزين (Negative)
- Nothing -> لا شيء (Neutral)
# Contact
If you have questions or improvement remarks, feel free to contact me on my LinkedIn profile: https://www.linkedin.com/in/yahya-ghrab/ |
541 | Yaia/distilbert-base-uncased-finetuned-emotion | [
"LABEL_0",
"LABEL_1",
"LABEL_2",
"LABEL_3",
"LABEL_4",
"LABEL_5"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.9255
- name: F1
type: f1
value: 0.9257196896784097
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2086
- Accuracy: 0.9255
- F1: 0.9257
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8249 | 1.0 | 250 | 0.3042 | 0.9085 | 0.9068 |
| 0.2437 | 2.0 | 500 | 0.2086 | 0.9255 | 0.9257 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.1
- Datasets 1.17.0
- Tokenizers 0.10.3
|
542 | Yanjie/message-intent | [
"goodbye",
"discount",
"can_i_help",
"other",
"escalation",
"goodbye|purchase",
"restock",
"subscription",
"discount|other",
"subscription|removal",
"goodbye|anything_else",
"issue|query_clarification",
"order|query_order_number",
"shipping|policy",
"shopping|query_link_item",
"shopping|query_first_time",
"issue|query_screenshot",
"issue|query_different_browser",
"shipping|cost",
"checkout",
"order|tracking_info",
"order|query_order_time",
"escalation|waiting",
"issue|query_screenshot_cart",
"order|query_tracking_info",
"issue|query_error_message",
"shopping|query_screenshot_cart",
"product",
"return|policy",
"shopping|query_item_info",
"shopping|query_other_item",
"issue|query_spam_folder",
"warranty",
"checkout|giftcard",
"shipping|other"
] | This is the concierge intent model, fine-tuned from the DistilBERT uncased model. |
543 | Yanjie/message-preamble | [
"blank",
"great",
"welcome",
"no_worries",
"thanks",
"sorry",
"sure",
"got_it",
"alright",
"no_rush",
"confirmation",
"disagreement",
"will_do",
"understand",
"funny"
] | This is the concierge preamble model, fine-tuned from the DistilBERT uncased model. |
544 | Yuri/xlm-roberta-base-finetuned-marc | [
"good",
"great",
"ok",
"poor",
"terrible"
] | ---
license: mit
tags:
- generated_from_trainer
datasets:
- amazon_reviews_multi
model-index:
- name: xlm-roberta-base-finetuned-marc
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-marc
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the amazon_reviews_multi dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9825
- Mae: 0.4956
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mae |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.1432 | 1.0 | 308 | 1.0559 | 0.5133 |
| 0.9883 | 2.0 | 616 | 0.9825 | 0.4956 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.9.0+cu111
- Datasets 1.13.3
- Tokenizers 0.10.3
|
591 | abdelkader/distilbert-base-uncased-distilled-clinc | [
"accept_reservations",
"account_blocked",
"alarm",
"application_status",
"apr",
"are_you_a_bot",
"balance",
"bill_balance",
"bill_due",
"book_flight",
"book_hotel",
"calculator",
"calendar",
"calendar_update",
"calories",
"cancel",
"cancel_reservation",
"car_rental",
"card_declined",
"carry_on",
"change_accent",
"change_ai_name",
"change_language",
"change_speed",
"change_user_name",
"change_volume",
"confirm_reservation",
"cook_time",
"credit_limit",
"credit_limit_change",
"credit_score",
"current_location",
"damaged_card",
"date",
"definition",
"direct_deposit",
"directions",
"distance",
"do_you_have_pets",
"exchange_rate",
"expiration_date",
"find_phone",
"flight_status",
"flip_coin",
"food_last",
"freeze_account",
"fun_fact",
"gas",
"gas_type",
"goodbye",
"greeting",
"how_busy",
"how_old_are_you",
"improve_credit_score",
"income",
"ingredient_substitution",
"ingredients_list",
"insurance",
"insurance_change",
"interest_rate",
"international_fees",
"international_visa",
"jump_start",
"last_maintenance",
"lost_luggage",
"make_call",
"maybe",
"meal_suggestion",
"meaning_of_life",
"measurement_conversion",
"meeting_schedule",
"min_payment",
"mpg",
"new_card",
"next_holiday",
"next_song",
"no",
"nutrition_info",
"oil_change_how",
"oil_change_when",
"oos",
"order",
"order_checks",
"order_status",
"pay_bill",
"payday",
"pin_change",
"play_music",
"plug_type",
"pto_balance",
"pto_request",
"pto_request_status",
"pto_used",
"recipe",
"redeem_rewards",
"reminder",
"reminder_update",
"repeat",
"replacement_card_duration",
"report_fraud",
"report_lost_card",
"reset_settings",
"restaurant_reservation",
"restaurant_reviews",
"restaurant_suggestion",
"rewards_balance",
"roll_dice",
"rollover_401k",
"routing",
"schedule_maintenance",
"schedule_meeting",
"share_location",
"shopping_list",
"shopping_list_update",
"smart_home",
"spelling",
"spending_history",
"sync_device",
"taxes",
"tell_joke",
"text",
"thank_you",
"time",
"timer",
"timezone",
"tire_change",
"tire_pressure",
"todo_list",
"todo_list_update",
"traffic",
"transactions",
"transfer",
"translate",
"travel_alert",
"travel_notification",
"travel_suggestion",
"uber",
"update_playlist",
"user_name",
"vaccines",
"w2",
"weather",
"what_are_your_hobbies",
"what_can_i_ask_you",
"what_is_your_name",
"what_song",
"where_are_you_from",
"whisper_mode",
"who_do_you_work_for",
"who_made_you",
"yes"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- clinc_oos
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-distilled-clinc
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: clinc_oos
type: clinc_oos
args: plus
metrics:
- name: Accuracy
type: accuracy
value: 0.9464516129032258
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-distilled-clinc
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3038
- Accuracy: 0.9465
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 318 | 2.8460 | 0.7506 |
| 3.322 | 2.0 | 636 | 1.4301 | 0.8532 |
| 3.322 | 3.0 | 954 | 0.7377 | 0.9152 |
| 1.2296 | 4.0 | 1272 | 0.4784 | 0.9316 |
| 0.449 | 5.0 | 1590 | 0.3730 | 0.9390 |
| 0.449 | 6.0 | 1908 | 0.3367 | 0.9429 |
| 0.2424 | 7.0 | 2226 | 0.3163 | 0.9468 |
| 0.1741 | 8.0 | 2544 | 0.3074 | 0.9452 |
| 0.1741 | 9.0 | 2862 | 0.3054 | 0.9458 |
| 0.1501 | 10.0 | 3180 | 0.3038 | 0.9465 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
592 | abdelkader/distilbert-base-uncased-finetuned-clinc | [
"accept_reservations",
"account_blocked",
"alarm",
"application_status",
"apr",
"are_you_a_bot",
"balance",
"bill_balance",
"bill_due",
"book_flight",
"book_hotel",
"calculator",
"calendar",
"calendar_update",
"calories",
"cancel",
"cancel_reservation",
"car_rental",
"card_declined",
"carry_on",
"change_accent",
"change_ai_name",
"change_language",
"change_speed",
"change_user_name",
"change_volume",
"confirm_reservation",
"cook_time",
"credit_limit",
"credit_limit_change",
"credit_score",
"current_location",
"damaged_card",
"date",
"definition",
"direct_deposit",
"directions",
"distance",
"do_you_have_pets",
"exchange_rate",
"expiration_date",
"find_phone",
"flight_status",
"flip_coin",
"food_last",
"freeze_account",
"fun_fact",
"gas",
"gas_type",
"goodbye",
"greeting",
"how_busy",
"how_old_are_you",
"improve_credit_score",
"income",
"ingredient_substitution",
"ingredients_list",
"insurance",
"insurance_change",
"interest_rate",
"international_fees",
"international_visa",
"jump_start",
"last_maintenance",
"lost_luggage",
"make_call",
"maybe",
"meal_suggestion",
"meaning_of_life",
"measurement_conversion",
"meeting_schedule",
"min_payment",
"mpg",
"new_card",
"next_holiday",
"next_song",
"no",
"nutrition_info",
"oil_change_how",
"oil_change_when",
"oos",
"order",
"order_checks",
"order_status",
"pay_bill",
"payday",
"pin_change",
"play_music",
"plug_type",
"pto_balance",
"pto_request",
"pto_request_status",
"pto_used",
"recipe",
"redeem_rewards",
"reminder",
"reminder_update",
"repeat",
"replacement_card_duration",
"report_fraud",
"report_lost_card",
"reset_settings",
"restaurant_reservation",
"restaurant_reviews",
"restaurant_suggestion",
"rewards_balance",
"roll_dice",
"rollover_401k",
"routing",
"schedule_maintenance",
"schedule_meeting",
"share_location",
"shopping_list",
"shopping_list_update",
"smart_home",
"spelling",
"spending_history",
"sync_device",
"taxes",
"tell_joke",
"text",
"thank_you",
"time",
"timer",
"timezone",
"tire_change",
"tire_pressure",
"todo_list",
"todo_list_update",
"traffic",
"transactions",
"transfer",
"translate",
"travel_alert",
"travel_notification",
"travel_suggestion",
"uber",
"update_playlist",
"user_name",
"vaccines",
"w2",
"weather",
"what_are_your_hobbies",
"what_can_i_ask_you",
"what_is_your_name",
"what_song",
"where_are_you_from",
"whisper_mode",
"who_do_you_work_for",
"who_made_you",
"yes"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- clinc_oos
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-clinc
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: clinc_oos
type: clinc_oos
args: plus
metrics:
- name: Accuracy
type: accuracy
value: 0.9174193548387096
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-clinc
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7713
- Accuracy: 0.9174
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 318 | 3.2831 | 0.7426 |
| 3.785 | 2.0 | 636 | 1.8739 | 0.8335 |
| 3.785 | 3.0 | 954 | 1.1525 | 0.8926 |
| 1.6894 | 4.0 | 1272 | 0.8569 | 0.91 |
| 0.897 | 5.0 | 1590 | 0.7713 | 0.9174 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
593 | abdelkader/distilbert-base-uncased-finetuned-emotion | [
"LABEL_0",
"LABEL_1",
"LABEL_2",
"LABEL_3",
"LABEL_4",
"LABEL_5"
] | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.9215
- name: F1
type: f1
value: 0.9215604730468001
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2162
- Accuracy: 0.9215
- F1: 0.9216
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8007 | 1.0 | 250 | 0.3082 | 0.907 | 0.9045 |
| 0.2438 | 2.0 | 500 | 0.2162 | 0.9215 | 0.9216 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
594 | abhishek/autonlp-bbc-news-classification-37229289 | [
"business",
"entertainment",
"politics",
"sport",
"tech"
] | ---
tags: autonlp
language: en
widget:
- text: "I love AutoNLP 🤗"
datasets:
- abhishek/autonlp-data-bbc-news-classification
co2_eq_emissions: 5.448567309047846
---
# Model Trained Using AutoNLP
- Problem type: Multi-class Classification
- Model ID: 37229289
- CO2 Emissions (in grams): 5.448567309047846
## Validation Metrics
- Loss: 0.07081354409456253
- Accuracy: 0.9867109634551495
- Macro F1: 0.9859067529980614
- Micro F1: 0.9867109634551495
- Weighted F1: 0.9866417220968429
- Macro Precision: 0.9868771404595043
- Micro Precision: 0.9867109634551495
- Weighted Precision: 0.9869289511551576
- Macro Recall: 0.9853173241852486
- Micro Recall: 0.9867109634551495
- Weighted Recall: 0.9867109634551495
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/abhishek/autonlp-bbc-news-classification-37229289
```
Or Python API:
```
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("abhishek/autonlp-bbc-news-classification-37229289", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("abhishek/autonlp-bbc-news-classification-37229289", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
``` |
595 | abhishek/autonlp-bbc-roberta-37249301 | [
"business",
"entertainment",
"politics",
"sport",
"tech"
] | ---
tags: autonlp
language: unk
widget:
- text: "I love AutoNLP 🤗"
datasets:
- abhishek/autonlp-data-bbc-roberta
co2_eq_emissions: 1.9859980179658823
---
# Model Trained Using AutoNLP
- Problem type: Multi-class Classification
- Model ID: 37249301
- CO2 Emissions (in grams): 1.9859980179658823
## Validation Metrics
- Loss: 0.06406362354755402
- Accuracy: 0.9833887043189369
- Macro F1: 0.9832763664701248
- Micro F1: 0.9833887043189369
- Weighted F1: 0.9833288528828136
- Macro Precision: 0.9847257743677181
- Micro Precision: 0.9833887043189369
- Weighted Precision: 0.9835392869652073
- Macro Recall: 0.982101705176067
- Micro Recall: 0.9833887043189369
- Weighted Recall: 0.9833887043189369
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/abhishek/autonlp-bbc-roberta-37249301
```
Or Python API:
```
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("abhishek/autonlp-bbc-roberta-37249301", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("abhishek/autonlp-bbc-roberta-37249301", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
``` |
596 | abhishek/autonlp-ferd1-2652021 | [
"0",
"1"
] | ---
tags: autonlp
language: en
widget:
- text: "I love AutoNLP 🤗"
datasets:
- abhishek/autonlp-data-ferd1
---
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 2652021
## Validation Metrics
- Loss: 0.3934604227542877
- Accuracy: 0.8411030860144452
- Precision: 0.8201550387596899
- Recall: 0.8076335877862595
- AUC: 0.8946767157983608
- F1: 0.8138461538461538
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/abhishek/autonlp-ferd1-2652021
```
Or Python API:
```
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("abhishek/autonlp-ferd1-2652021", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("abhishek/autonlp-ferd1-2652021", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
``` |
597 | abhishek/autonlp-fred2-2682064 | [
"0",
"1"
] | ---
tags: autonlp
language: en
widget:
- text: "I love AutoNLP 🤗"
datasets:
- abhishek/autonlp-data-fred2
---
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 2682064
## Validation Metrics
- Loss: 0.4454168379306793
- Accuracy: 0.8188976377952756
- Precision: 0.8442028985507246
- Recall: 0.7103658536585366
- AUC: 0.8699702146791053
- F1: 0.771523178807947
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/abhishek/autonlp-fred2-2682064
```
Or Python API:
```
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("abhishek/autonlp-fred2-2682064", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("abhishek/autonlp-fred2-2682064", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
``` |
598 | abhishek/autonlp-imdb-roberta-base-3662644 | [
"neg",
"pos"
] | ---
tags: autonlp
language: unk
widget:
- text: "I love AutoNLP 🤗"
datasets:
- abhishek/autonlp-data-imdb-roberta-base
co2_eq_emissions: 25.894117734124272
---
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 3662644
- CO2 Emissions (in grams): 25.894117734124272
## Validation Metrics
- Loss: 0.20277436077594757
- Accuracy: 0.92604
- Precision: 0.9560674830864092
- Recall: 0.89312
- AUC: 0.9814625504000001
- F1: 0.9235223559581421
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/abhishek/autonlp-imdb-roberta-base-3662644
```
Or Python API:
```
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("abhishek/autonlp-imdb-roberta-base-3662644", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("abhishek/autonlp-imdb-roberta-base-3662644", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
``` |