DoNotChoke committed (verified)
Commit: 8d8ec12
Parent(s): 4906166

abte-restaurants-distilbert-base-uncased

README.md ADDED
@@ -0,0 +1,158 @@
---
library_name: transformers
license: apache-2.0
base_model: distilbert/distilbert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: abte-restaurants-distilbert-base-uncased
  results: []
---

# abte-restaurants-distilbert-base-uncased

This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3605
- F1-score: 0.8429
## Model description

DistilBERT (uncased) fine-tuned for token classification with a three-label BIO scheme (`O`, `B-Term`, `I-Term`), as declared in `config.json`. Judging by the model name, it tags aspect terms in restaurant reviews (aspect-based term extraction, ABTE).

## Intended uses & limitations

Intended for extracting aspect terms from English restaurant-review text. The base model is uncased, so input casing is ignored, and sequences are limited to 512 tokens; performance outside the restaurant domain has not been evaluated.

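A minimal usage sketch with the standard `transformers` pipeline API. The repo id below is an assumption inferred from the commit author and model name; point it at wherever the checkpoint actually lives.

```python
from transformers import pipeline

# Hypothetical repo id, inferred from the commit author and model name.
tagger = pipeline(
    "token-classification",
    model="DoNotChoke/abte-restaurants-distilbert-base-uncased",
    aggregation_strategy="simple",  # merge B-Term/I-Term word pieces into term spans
)

for term in tagger("The pizza was great but the service was painfully slow."):
    print(term["word"], term["entity_group"], round(term["score"], 3))
```

With `aggregation_strategy="simple"`, the `B-Term`/`I-Term` tags are collapsed so each extracted aspect term comes back as a single `Term` span.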
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100

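Reproducing this configuration with the `Trainer` API might look like the following sketch (dataset preparation, the model, and `compute_metrics` are omitted; the `output_dir` name is an assumption). `optim="adamw_torch"` is already the default and matches the optimizer listed above.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
args = TrainingArguments(
    output_dir="abte-restaurants-distilbert-base-uncased",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=42,
    optim="adamw_torch",          # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="epoch",        # the results table logs one eval per epoch
    logging_strategy="epoch",
)
```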
### Training results

| Training Loss | Epoch | Step | Validation Loss | F1-score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6511 | 1.0 | 15 | 0.5160 | 0.0210 |
| 0.3533 | 2.0 | 30 | 0.2970 | 0.5713 |
| 0.2243 | 3.0 | 45 | 0.2558 | 0.6359 |
| 0.1706 | 4.0 | 60 | 0.2319 | 0.6803 |
| 0.1363 | 5.0 | 75 | 0.2149 | 0.7386 |
| 0.0983 | 6.0 | 90 | 0.2058 | 0.7840 |
| 0.0763 | 7.0 | 105 | 0.2034 | 0.8062 |
| 0.0614 | 8.0 | 120 | 0.2150 | 0.8121 |
| 0.0484 | 9.0 | 135 | 0.2192 | 0.8166 |
| 0.0406 | 10.0 | 150 | 0.2291 | 0.8243 |
| 0.0341 | 11.0 | 165 | 0.2317 | 0.8284 |
| 0.0278 | 12.0 | 180 | 0.2352 | 0.8334 |
| 0.0244 | 13.0 | 195 | 0.2480 | 0.8261 |
| 0.0221 | 14.0 | 210 | 0.2546 | 0.8288 |
| 0.0208 | 15.0 | 225 | 0.2558 | 0.8288 |
| 0.0175 | 16.0 | 240 | 0.2678 | 0.8317 |
| 0.0164 | 17.0 | 255 | 0.2712 | 0.8225 |
| 0.0141 | 18.0 | 270 | 0.2635 | 0.8365 |
| 0.0128 | 19.0 | 285 | 0.2720 | 0.8356 |
| 0.012 | 20.0 | 300 | 0.2800 | 0.8332 |
| 0.0118 | 21.0 | 315 | 0.2837 | 0.8378 |
| 0.0115 | 22.0 | 330 | 0.2866 | 0.8378 |
| 0.0108 | 23.0 | 345 | 0.2893 | 0.8354 |
| 0.0099 | 24.0 | 360 | 0.2955 | 0.8362 |
| 0.0087 | 25.0 | 375 | 0.2979 | 0.8353 |
| 0.0082 | 26.0 | 390 | 0.2957 | 0.8393 |
| 0.0074 | 27.0 | 405 | 0.3025 | 0.8391 |
| 0.0072 | 28.0 | 420 | 0.3022 | 0.8376 |
| 0.0079 | 29.0 | 435 | 0.3137 | 0.8360 |
| 0.0066 | 30.0 | 450 | 0.3118 | 0.8338 |
| 0.0068 | 31.0 | 465 | 0.3132 | 0.8424 |
| 0.0073 | 32.0 | 480 | 0.3071 | 0.8413 |
| 0.0059 | 33.0 | 495 | 0.3048 | 0.8365 |
| 0.0064 | 34.0 | 510 | 0.3218 | 0.8407 |
| 0.0083 | 35.0 | 525 | 0.3187 | 0.8392 |
| 0.006 | 36.0 | 540 | 0.3218 | 0.8396 |
| 0.0056 | 37.0 | 555 | 0.3167 | 0.8431 |
| 0.0051 | 38.0 | 570 | 0.3160 | 0.8404 |
| 0.006 | 39.0 | 585 | 0.3229 | 0.8421 |
| 0.005 | 40.0 | 600 | 0.3178 | 0.8408 |
| 0.0049 | 41.0 | 615 | 0.3275 | 0.8388 |
| 0.005 | 42.0 | 630 | 0.3265 | 0.8409 |
| 0.0048 | 43.0 | 645 | 0.3221 | 0.8403 |
| 0.0047 | 44.0 | 660 | 0.3212 | 0.8402 |
| 0.0044 | 45.0 | 675 | 0.3221 | 0.8413 |
| 0.0049 | 46.0 | 690 | 0.3278 | 0.8405 |
| 0.0046 | 47.0 | 705 | 0.3348 | 0.8408 |
| 0.0044 | 48.0 | 720 | 0.3305 | 0.8414 |
| 0.0038 | 49.0 | 735 | 0.3358 | 0.8420 |
| 0.0052 | 50.0 | 750 | 0.3368 | 0.8416 |
| 0.0042 | 51.0 | 765 | 0.3298 | 0.8410 |
| 0.004 | 52.0 | 780 | 0.3412 | 0.8359 |
| 0.0045 | 53.0 | 795 | 0.3404 | 0.8371 |
| 0.004 | 54.0 | 810 | 0.3332 | 0.8410 |
| 0.0041 | 55.0 | 825 | 0.3361 | 0.8428 |
| 0.0036 | 56.0 | 840 | 0.3355 | 0.8413 |
| 0.0041 | 57.0 | 855 | 0.3396 | 0.8413 |
| 0.0039 | 58.0 | 870 | 0.3441 | 0.8412 |
| 0.004 | 59.0 | 885 | 0.3437 | 0.8419 |
| 0.0039 | 60.0 | 900 | 0.3470 | 0.8407 |
| 0.0037 | 61.0 | 915 | 0.3478 | 0.8434 |
| 0.0036 | 62.0 | 930 | 0.3499 | 0.8454 |
| 0.0036 | 63.0 | 945 | 0.3492 | 0.8437 |
| 0.0043 | 64.0 | 960 | 0.3477 | 0.8429 |
| 0.0039 | 65.0 | 975 | 0.3431 | 0.8409 |
| 0.0035 | 66.0 | 990 | 0.3474 | 0.8434 |
| 0.004 | 67.0 | 1005 | 0.3478 | 0.8436 |
| 0.0034 | 68.0 | 1020 | 0.3526 | 0.8421 |
| 0.0035 | 69.0 | 1035 | 0.3514 | 0.8459 |
| 0.0033 | 70.0 | 1050 | 0.3527 | 0.8443 |
| 0.0036 | 71.0 | 1065 | 0.3485 | 0.8430 |
| 0.0036 | 72.0 | 1080 | 0.3521 | 0.8456 |
| 0.0036 | 73.0 | 1095 | 0.3535 | 0.8433 |
| 0.0036 | 74.0 | 1110 | 0.3578 | 0.8405 |
| 0.0031 | 75.0 | 1125 | 0.3609 | 0.8414 |
| 0.0033 | 76.0 | 1140 | 0.3563 | 0.8426 |
| 0.0033 | 77.0 | 1155 | 0.3561 | 0.8441 |
| 0.0032 | 78.0 | 1170 | 0.3550 | 0.8423 |
| 0.0032 | 79.0 | 1185 | 0.3554 | 0.8414 |
| 0.0031 | 80.0 | 1200 | 0.3554 | 0.8404 |
| 0.0039 | 81.0 | 1215 | 0.3549 | 0.8413 |
| 0.0034 | 82.0 | 1230 | 0.3548 | 0.8405 |
| 0.0029 | 83.0 | 1245 | 0.3575 | 0.8443 |
| 0.0032 | 84.0 | 1260 | 0.3579 | 0.8416 |
| 0.0029 | 85.0 | 1275 | 0.3603 | 0.8408 |
| 0.0031 | 86.0 | 1290 | 0.3611 | 0.8445 |
| 0.0031 | 87.0 | 1305 | 0.3612 | 0.8444 |
| 0.0029 | 88.0 | 1320 | 0.3620 | 0.8447 |
| 0.0032 | 89.0 | 1335 | 0.3594 | 0.8416 |
| 0.0041 | 90.0 | 1350 | 0.3586 | 0.8423 |
| 0.0032 | 91.0 | 1365 | 0.3599 | 0.8423 |
| 0.0031 | 92.0 | 1380 | 0.3598 | 0.8409 |
| 0.0033 | 93.0 | 1395 | 0.3593 | 0.8424 |
| 0.0029 | 94.0 | 1410 | 0.3593 | 0.8422 |
| 0.003 | 95.0 | 1425 | 0.3607 | 0.8426 |
| 0.0028 | 96.0 | 1440 | 0.3610 | 0.8449 |
| 0.0029 | 97.0 | 1455 | 0.3607 | 0.8424 |
| 0.003 | 98.0 | 1470 | 0.3609 | 0.8422 |
| 0.0029 | 99.0 | 1485 | 0.3606 | 0.8433 |
| 0.003 | 100.0 | 1500 | 0.3605 | 0.8429 |

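The F1-score column scores the predicted aspect-term spans. As an illustration (not necessarily the exact metric code used here), a self-contained sketch of exact-match span F1 over BIO tag sequences:

```python
def bio_spans(tags):
    """Collect (start, end) term spans from a BIO tag sequence."""
    spans, start = [], None
    for i, t in enumerate(tags):
        if t == "B-Term":
            if start is not None:          # previous span ends where a new one begins
                spans.append((start, i))
            start = i
        elif t == "I-Term":
            if start is None:              # ill-formed I- without B-: start a span anyway
                start = i
        else:                              # "O" closes any open span
            if start is not None:
                spans.append((start, i))
                start = None
    if start is not None:                  # span running to the end of the sequence
        spans.append((start, len(tags)))
    return set(spans)

def span_f1(gold_tags, pred_tags):
    """Exact-match span precision/recall harmonic mean."""
    gold, pred = bio_spans(gold_tags), bio_spans(pred_tags)
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

gold = ["O", "B-Term", "I-Term", "O", "B-Term"]
pred = ["O", "B-Term", "I-Term", "O", "O"]
print(span_f1(gold, pred))  # one of the two gold spans matched exactly
```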
### Framework versions

- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
config.json ADDED
@@ -0,0 +1,34 @@
{
  "_name_or_path": "distilbert/distilbert-base-uncased",
  "activation": "gelu",
  "architectures": [
    "DistilBertForTokenClassification"
  ],
  "attention_dropout": 0.1,
  "dim": 768,
  "dropout": 0.1,
  "hidden_dim": 3072,
  "id2label": {
    "0": "O",
    "1": "B-Term",
    "2": "I-Term"
  },
  "initializer_range": 0.02,
  "label2id": {
    "B-Term": 1,
    "I-Term": 2,
    "O": 0
  },
  "max_position_embeddings": 512,
  "model_type": "distilbert",
  "n_heads": 12,
  "n_layers": 6,
  "pad_token_id": 0,
  "qa_dropout": 0.1,
  "seq_classif_dropout": 0.2,
  "sinusoidal_pos_embds": false,
  "tie_weights_": true,
  "torch_dtype": "float32",
  "transformers_version": "4.48.3",
  "vocab_size": 30522
}
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:12d395ea0c8bb8afff0a739e2ab5826ead88c6c961cbbc9a87392a7de820a3c9
size 265473092
special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
{
  "cls_token": "[CLS]",
  "mask_token": "[MASK]",
  "pad_token": "[PAD]",
  "sep_token": "[SEP]",
  "unk_token": "[UNK]"
}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,56 @@
{
  "added_tokens_decoder": {
    "0": {
      "content": "[PAD]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "100": {
      "content": "[UNK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "101": {
      "content": "[CLS]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "102": {
      "content": "[SEP]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "103": {
      "content": "[MASK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    }
  },
  "clean_up_tokenization_spaces": false,
  "cls_token": "[CLS]",
  "do_lower_case": true,
  "extra_special_tokens": {},
  "mask_token": "[MASK]",
  "model_max_length": 512,
  "pad_token": "[PAD]",
  "sep_token": "[SEP]",
  "strip_accents": null,
  "tokenize_chinese_chars": true,
  "tokenizer_class": "DistilBertTokenizer",
  "unk_token": "[UNK]"
}
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0ecf67565b542bd0c7a5adb3c402743548a98b904968a03f3a858036e0c09bb1
size 5304
vocab.txt ADDED
The diff for this file is too large to render. See raw diff