apwic committed on
Commit 05811ba · verified · 1 Parent(s): 2914782

Model save

README.md ADDED
---
license: mit
base_model: indolem/indobert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: nerui-pt-pl50-2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# nerui-pt-pl50-2

This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0795
- Location Precision: 0.8911
- Location Recall: 0.9677
- Location F1: 0.9278
- Location Number: 93
- Organization Precision: 0.9202
- Organization Recall: 0.9036
- Organization F1: 0.9119
- Organization Number: 166
- Person Precision: 0.9583
- Person Recall: 0.9718
- Person F1: 0.9650
- Person Number: 142
- Overall Precision: 0.9265
- Overall Recall: 0.9426
- Overall F1: 0.9345
- Overall Accuracy: 0.9857

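As a quick sanity check, the per-entity F1 scores above are consistent with F1 = 2PR / (P + R), and the overall recall is the support-weighted average of the per-entity recalls. A minimal sketch, with values transcribed from the list above (the Organization F1 is skipped because rounding of its precision/recall shifts the last digit):

```python
def f1(p, r):
    # Harmonic mean of precision and recall.
    return 2 * p * r / (p + r)

# Per-entity F1 recomputed from the reported precision/recall.
assert round(f1(0.8911, 0.9677), 4) == 0.9278  # Location
assert round(f1(0.9583, 0.9718), 4) == 0.9650  # Person
assert round(f1(0.9265, 0.9426), 4) == 0.9345  # Overall

# Overall recall as a support-weighted average over the three entity types.
supports, recalls = [93, 166, 142], [0.9677, 0.9036, 0.9718]
micro_recall = sum(s * r for s, r in zip(supports, recalls)) / sum(supports)
assert round(micro_recall, 4) == 0.9426
```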
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100.0

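The step counts in the results table follow from these settings: at a train batch size of 16 with 96 optimizer steps per epoch, the training split presumably holds on the order of 96 × 16 ≈ 1536 examples (an inference from the table, not stated anywhere in this card), and 100 epochs yield the table's final step count of 9600:

```python
# Hypothetical back-of-the-envelope check; the dataset size is inferred,
# not documented in this card.
steps_per_epoch = 96       # from the "Step" column: 96 steps after epoch 1
train_batch_size = 16      # from the hyperparameters above
num_epochs = 100

approx_train_examples = steps_per_epoch * train_batch_size
total_steps = steps_per_epoch * num_epochs

assert approx_train_examples == 1536
assert total_steps == 9600
```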
### Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.8529 | 1.0 | 96 | 0.4086 | 0.0 | 0.0 | 0.0 | 93 | 0.2301 | 0.1566 | 0.1864 | 166 | 0.3086 | 0.1761 | 0.2242 | 142 | 0.2589 | 0.1272 | 0.1706 | 0.8527 |
| 0.3531 | 2.0 | 192 | 0.1972 | 0.3952 | 0.5269 | 0.4516 | 93 | 0.5864 | 0.5723 | 0.5793 | 166 | 0.6742 | 0.8451 | 0.7500 | 142 | 0.5690 | 0.6584 | 0.6104 | 0.9432 |
| 0.1852 | 3.0 | 288 | 0.0960 | 0.8261 | 0.8172 | 0.8216 | 93 | 0.7474 | 0.8735 | 0.8056 | 166 | 0.9592 | 0.9930 | 0.9758 | 142 | 0.8360 | 0.9027 | 0.8681 | 0.9737 |
| 0.126 | 4.0 | 384 | 0.0836 | 0.7736 | 0.8817 | 0.8241 | 93 | 0.7824 | 0.9096 | 0.8412 | 166 | 0.9586 | 0.9789 | 0.9686 | 142 | 0.8378 | 0.9277 | 0.8805 | 0.9750 |
| 0.1027 | 5.0 | 480 | 0.0584 | 0.8763 | 0.9140 | 0.8947 | 93 | 0.8795 | 0.8795 | 0.8795 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9066 | 0.9202 | 0.9134 | 0.9830 |
| 0.0925 | 6.0 | 576 | 0.0524 | 0.7876 | 0.9570 | 0.8641 | 93 | 0.9221 | 0.8554 | 0.8875 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.8978 | 0.9202 | 0.9089 | 0.9824 |
| 0.0803 | 7.0 | 672 | 0.0523 | 0.8542 | 0.8817 | 0.8677 | 93 | 0.8495 | 0.9518 | 0.8977 | 166 | 0.9586 | 0.9789 | 0.9686 | 142 | 0.8876 | 0.9451 | 0.9155 | 0.9833 |
| 0.0733 | 8.0 | 768 | 0.0525 | 0.8333 | 0.9677 | 0.8955 | 93 | 0.9241 | 0.8795 | 0.9012 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9144 | 0.9327 | 0.9235 | 0.9824 |
| 0.0684 | 9.0 | 864 | 0.0438 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9171 | 0.9377 | 0.9273 | 0.9852 |
| 0.0627 | 10.0 | 960 | 0.0445 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9586 | 0.9789 | 0.9686 | 142 | 0.9173 | 0.9401 | 0.9286 | 0.9849 |
| 0.059 | 11.0 | 1056 | 0.0453 | 0.8641 | 0.9570 | 0.9082 | 93 | 0.9 | 0.9217 | 0.9107 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9155 | 0.9451 | 0.9301 | 0.9855 |
| 0.0535 | 12.0 | 1152 | 0.0456 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9852 |
| 0.05 | 13.0 | 1248 | 0.0437 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.8889 | 0.9157 | 0.9021 | 166 | 0.9714 | 0.9577 | 0.9645 | 142 | 0.9171 | 0.9377 | 0.9273 | 0.9860 |
| 0.0481 | 14.0 | 1344 | 0.0442 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9866 |
| 0.0456 | 15.0 | 1440 | 0.0437 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9866 |
| 0.0435 | 16.0 | 1536 | 0.0495 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9187 | 0.8855 | 0.9018 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9855 |
| 0.0412 | 17.0 | 1632 | 0.0444 | 0.9167 | 0.9462 | 0.9312 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9868 |
| 0.0369 | 18.0 | 1728 | 0.0475 | 0.8667 | 0.9785 | 0.9192 | 93 | 0.9427 | 0.8916 | 0.9164 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9863 |
| 0.04 | 19.0 | 1824 | 0.0397 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.8988 | 0.9096 | 0.9042 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9874 |
| 0.0355 | 20.0 | 1920 | 0.0478 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.8947 | 0.9217 | 0.9080 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9246 | 0.9476 | 0.9360 | 0.9868 |
| 0.0369 | 21.0 | 2016 | 0.0561 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9355 | 0.8735 | 0.9034 | 166 | 0.9857 | 0.9718 | 0.9787 | 142 | 0.9372 | 0.9302 | 0.9337 | 0.9855 |
| 0.0338 | 22.0 | 2112 | 0.0521 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9379 | 0.9577 | 0.9477 | 142 | 0.9031 | 0.9302 | 0.9165 | 0.9841 |
| 0.033 | 23.0 | 2208 | 0.0519 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.9146 | 0.9352 | 0.9248 | 0.9852 |
| 0.032 | 24.0 | 2304 | 0.0583 | 0.8505 | 0.9785 | 0.91 | 93 | 0.8957 | 0.8795 | 0.8875 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9080 | 0.9352 | 0.9214 | 0.9835 |
| 0.0303 | 25.0 | 2400 | 0.0572 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9036 | 0.9036 | 0.9036 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9855 |
| 0.0286 | 26.0 | 2496 | 0.0588 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9214 | 0.9352 | 0.9282 | 0.9846 |
| 0.0288 | 27.0 | 2592 | 0.0550 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9259 | 0.9352 | 0.9305 | 0.9849 |
| 0.0267 | 28.0 | 2688 | 0.0563 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9223 | 0.9476 | 0.9348 | 0.9855 |
| 0.0257 | 29.0 | 2784 | 0.0529 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9860 |
| 0.0252 | 30.0 | 2880 | 0.0557 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.9118 | 0.9337 | 0.9226 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9863 |
| 0.0258 | 31.0 | 2976 | 0.0524 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9868 |
| 0.0221 | 32.0 | 3072 | 0.0587 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9214 | 0.9352 | 0.9282 | 0.9857 |
| 0.023 | 33.0 | 3168 | 0.0571 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9852 |
| 0.0236 | 34.0 | 3264 | 0.0558 | 0.9278 | 0.9677 | 0.9474 | 93 | 0.9222 | 0.9277 | 0.9249 | 166 | 0.9786 | 0.9648 | 0.9716 | 142 | 0.9431 | 0.9501 | 0.9466 | 0.9871 |
| 0.0195 | 35.0 | 3360 | 0.0618 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9857 |
| 0.0238 | 36.0 | 3456 | 0.0569 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9304 | 0.8855 | 0.9074 | 166 | 0.9510 | 0.9577 | 0.9544 | 142 | 0.9257 | 0.9327 | 0.9292 | 0.9863 |
| 0.0219 | 37.0 | 3552 | 0.0601 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9860 |
| 0.0188 | 38.0 | 3648 | 0.0608 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9866 |
| 0.0193 | 39.0 | 3744 | 0.0613 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9308 | 0.8916 | 0.9108 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9307 | 0.9377 | 0.9342 | 0.9852 |
| 0.0192 | 40.0 | 3840 | 0.0613 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9863 |
| 0.0166 | 41.0 | 3936 | 0.0598 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9246 | 0.9476 | 0.9360 | 0.9863 |
| 0.0162 | 42.0 | 4032 | 0.0590 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9268 | 0.9476 | 0.9371 | 0.9868 |
| 0.0163 | 43.0 | 4128 | 0.0629 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9852 |
| 0.0168 | 44.0 | 4224 | 0.0631 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9291 | 0.9476 | 0.9383 | 0.9857 |
| 0.0152 | 45.0 | 4320 | 0.0669 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9860 |
| 0.0138 | 46.0 | 4416 | 0.0695 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9182 | 0.8795 | 0.8985 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9212 | 0.9327 | 0.9269 | 0.9844 |
| 0.0142 | 47.0 | 4512 | 0.0683 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.8876 | 0.9036 | 0.8955 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9852 |
| 0.0156 | 48.0 | 4608 | 0.0628 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9866 |
| 0.0139 | 49.0 | 4704 | 0.0686 | 0.9263 | 0.9462 | 0.9362 | 93 | 0.8947 | 0.9217 | 0.9080 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9855 |
| 0.016 | 50.0 | 4800 | 0.0620 | 0.9167 | 0.9462 | 0.9312 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9863 |
| 0.0145 | 51.0 | 4896 | 0.0674 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9863 |
| 0.0126 | 52.0 | 4992 | 0.0710 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9852 |
| 0.0132 | 53.0 | 5088 | 0.0719 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9314 | 0.9476 | 0.9394 | 0.9841 |
| 0.014 | 54.0 | 5184 | 0.0706 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9024 | 0.8916 | 0.8970 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9214 | 0.9352 | 0.9282 | 0.9852 |
| 0.014 | 55.0 | 5280 | 0.0654 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9313 | 0.8976 | 0.9141 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9305 | 0.9352 | 0.9328 | 0.9866 |
| 0.0119 | 56.0 | 5376 | 0.0683 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9863 |
| 0.0117 | 57.0 | 5472 | 0.0686 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9860 |
| 0.0113 | 58.0 | 5568 | 0.0691 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9182 | 0.8795 | 0.8985 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.9144 | 0.9327 | 0.9235 | 0.9849 |
| 0.0126 | 59.0 | 5664 | 0.0714 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9863 |
| 0.0106 | 60.0 | 5760 | 0.0744 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9259 | 0.9352 | 0.9305 | 0.9849 |
| 0.0116 | 61.0 | 5856 | 0.0741 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9849 |
| 0.0122 | 62.0 | 5952 | 0.0684 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9860 |
| 0.0103 | 63.0 | 6048 | 0.0751 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9846 |
| 0.0099 | 64.0 | 6144 | 0.0667 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9871 |
| 0.0089 | 65.0 | 6240 | 0.0764 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9130 | 0.8855 | 0.8991 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.9167 | 0.9327 | 0.9246 | 0.9841 |
| 0.0098 | 66.0 | 6336 | 0.0752 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9857 |
| 0.0097 | 67.0 | 6432 | 0.0784 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9846 |
| 0.0097 | 68.0 | 6528 | 0.0771 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9852 |
| 0.0094 | 69.0 | 6624 | 0.0737 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9857 |
| 0.0083 | 70.0 | 6720 | 0.0739 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9852 |
| 0.0087 | 71.0 | 6816 | 0.0715 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9866 |
| 0.0084 | 72.0 | 6912 | 0.0727 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9871 |
| 0.0076 | 73.0 | 7008 | 0.0726 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9868 |
| 0.0087 | 74.0 | 7104 | 0.0800 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9841 |
| 0.0079 | 75.0 | 7200 | 0.0772 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9857 |
| 0.007 | 76.0 | 7296 | 0.0782 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9863 |
| 0.0089 | 77.0 | 7392 | 0.0773 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9855 |
| 0.008 | 78.0 | 7488 | 0.0786 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9238 | 0.9377 | 0.9307 | 0.9838 |
| 0.0082 | 79.0 | 7584 | 0.0727 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9849 |
| 0.0086 | 80.0 | 7680 | 0.0743 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9085 | 0.8976 | 0.9030 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9846 |
| 0.0085 | 81.0 | 7776 | 0.0710 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9860 |
| 0.0072 | 82.0 | 7872 | 0.0770 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9849 |
| 0.0067 | 83.0 | 7968 | 0.0810 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9841 |
| 0.0067 | 84.0 | 8064 | 0.0766 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9855 |
| 0.0058 | 85.0 | 8160 | 0.0795 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9863 |
| 0.0079 | 86.0 | 8256 | 0.0777 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9846 |
| 0.0058 | 87.0 | 8352 | 0.0786 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9846 |
| 0.0071 | 88.0 | 8448 | 0.0757 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9855 |
| 0.0068 | 89.0 | 8544 | 0.0806 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9846 |
| 0.0061 | 90.0 | 8640 | 0.0750 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9855 |
| 0.0056 | 91.0 | 8736 | 0.0774 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9855 |
| 0.0054 | 92.0 | 8832 | 0.0808 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9849 |
| 0.0058 | 93.0 | 8928 | 0.0783 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9218 | 0.9401 | 0.9309 | 0.9852 |
| 0.0064 | 94.0 | 9024 | 0.0802 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9852 |
| 0.0069 | 95.0 | 9120 | 0.0824 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9855 |
| 0.0059 | 96.0 | 9216 | 0.0814 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9193 | 0.9377 | 0.9284 | 0.9852 |
| 0.0058 | 97.0 | 9312 | 0.0790 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9860 |
| 0.0069 | 98.0 | 9408 | 0.0797 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9857 |
| 0.0061 | 99.0 | 9504 | 0.0795 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9857 |
| 0.0067 | 100.0 | 9600 | 0.0795 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9857 |

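Over such a long run the table is mainly useful for checkpoint selection: validation loss bottoms out at epoch 19 (0.0397) while overall F1 peaks at epoch 34 (0.9466), and the headline metrics at the top of this card match the epoch-100 row. A minimal sketch of picking the lowest-loss epoch from a few transcribed rows:

```python
# A few (epoch -> validation loss) pairs transcribed from the table above.
val_loss = {9: 0.0438, 19: 0.0397, 34: 0.0558, 64: 0.0667, 100: 0.0795}

best_epoch = min(val_loss, key=val_loss.get)  # epoch with the lowest loss
assert best_epoch == 19
```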
### Framework versions

- Transformers 4.40.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
adapter-ner/adapter_config.json ADDED
{
  "config": {
    "architecture": "prefix_tuning",
    "bottleneck_size": 512,
    "cross_prefix": true,
    "dropout": 0.0,
    "encoder_prefix": true,
    "flat": false,
    "leave_out": [],
    "non_linearity": "tanh",
    "prefix_length": 50,
    "shared_gating": true,
    "use_gating": false
  },
  "config_id": "516cc5189dd85939",
  "hidden_size": 768,
  "model_class": "BertForTokenClassification",
  "model_name": "indolem/indobert-base-uncased",
  "model_type": "bert",
  "name": "adapter-ner",
  "version": "0.2.2"
}
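This config describes a prefix-tuning adapter (prefix length 50) produced with the `adapters` library (version 0.2.2), not a full fine-tuned checkpoint, so the base model stays frozen and only the adapter weights in `adapter-ner/` are trained. A hedged sketch of how such an adapter is typically loaded; the repository id `apwic/nerui-pt-pl50-2` is inferred from this commit's author and model name and may differ:

```python
from adapters import AutoAdapterModel  # pip install adapters

# Load the frozen base model, then attach the prefix-tuning adapter
# and its token-classification head saved under adapter-ner/.
model = AutoAdapterModel.from_pretrained("indolem/indobert-base-uncased")
adapter_name = model.load_adapter("apwic/nerui-pt-pl50-2")  # assumed repo id
model.set_active_adapters(adapter_name)
```

This requires downloading the base model and adapter weights at runtime.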
adapter-ner/head_config.json ADDED
{
  "config": null,
  "hidden_size": 768,
  "label2id": {
    "B-LOCATION": 0,
    "B-ORGANIZATION": 1,
    "B-PERSON": 2,
    "I-LOCATION": 3,
    "I-ORGANIZATION": 4,
    "I-PERSON": 5,
    "O": 6
  },
  "model_class": "BertForTokenClassification",
  "model_name": "indolem/indobert-base-uncased",
  "model_type": "bert",
  "name": null,
  "num_labels": 7,
  "version": "0.2.2"
}
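The head predicts one of the seven BIO labels above per token; turning token-level predictions into entity spans means collapsing B-/I- runs. A minimal, hypothetical decoder (not part of this repository) illustrating that step:

```python
def bio_to_spans(tags):
    """Collapse a BIO tag sequence into (label, start, end) spans, end exclusive."""
    spans = []
    start = label = None
    for i, tag in enumerate(tags):
        if tag.startswith("B-") or (tag.startswith("I-") and tag[2:] != label):
            # A B- tag, or an I- tag that does not continue the open span,
            # starts a new entity (a common, lenient decoding convention).
            if label is not None:
                spans.append((label, start, i))
            start, label = i, tag[2:]
        elif tag == "O":
            if label is not None:
                spans.append((label, start, i))
            start = label = None
    if label is not None:
        spans.append((label, start, len(tags)))
    return spans

print(bio_to_spans(["B-PERSON", "I-PERSON", "O", "B-LOCATION"]))
# [('PERSON', 0, 2), ('LOCATION', 3, 4)]
```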
adapter-ner/pytorch_adapter.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:5e9b4332f723d8625cf99819240f21578bce51b9940d56506a902c70a8bcbad9
size 39565332
adapter-ner/pytorch_model_head.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:410581abae1911034624f4282b74911d027563f5d8699f5b2b88cca01daa0332
size 23066