nerui-pt-pl50-0

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0690
  • Location Precision: 0.9
  • Location Recall: 0.9574
  • Location F1: 0.9278
  • Location Number: 94
  • Organization Precision: 0.9379
  • Organization Recall: 0.9042
  • Organization F1: 0.9207
  • Organization Number: 167
  • Person Precision: 0.9706
  • Person Recall: 0.9635
  • Person F1: 0.9670
  • Person Number: 137
  • Overall Precision: 0.9395
  • Overall Recall: 0.9372
  • Overall F1: 0.9384
  • Overall Accuracy: 0.9881
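As a sanity check, the overall F1 above is the harmonic mean of the overall precision and recall. A minimal sketch (the small gap to the reported value comes from the inputs already being rounded to four decimals):

```python
# Harmonic mean of the reported overall precision and recall.
precision = 0.9395  # Overall Precision
recall = 0.9372     # Overall Recall

f1 = 2 * precision * recall / (precision + recall)
print(f1)  # ~0.9383, matching the reported Overall F1 of 0.9384 up to input rounding
```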

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
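With lr_scheduler_type: linear, the learning rate presumably decays from learning_rate down to zero over the whole run. A minimal sketch, assuming zero warmup steps and the 9600 total optimization steps shown in the training results (96 steps/epoch × 100 epochs):

```python
BASE_LR = 5e-05      # learning_rate
TOTAL_STEPS = 9600   # 96 steps/epoch * 100 epochs (see training results)

def linear_lr(step: int) -> float:
    """Linearly decayed learning rate, assuming zero warmup steps."""
    return BASE_LR * max(0.0, 1.0 - step / TOTAL_STEPS)

print(linear_lr(0))     # 5e-05 at the start of training
print(linear_lr(4800))  # 2.5e-05 halfway through
print(linear_lr(9600))  # 0.0 at the final step
```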

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8736 | 1.0 | 96 | 0.4345 | 0.0 | 0.0 | 0.0 | 94 | 0.2048 | 0.1018 | 0.1360 | 167 | 0.3019 | 0.1168 | 0.1684 | 137 | 0.2409 | 0.0829 | 0.1234 | 0.8453 |
| 0.3697 | 2.0 | 192 | 0.2212 | 0.3288 | 0.5106 | 0.4 | 94 | 0.5408 | 0.6347 | 0.5840 | 167 | 0.8207 | 0.8686 | 0.8440 | 137 | 0.5606 | 0.6859 | 0.6169 | 0.9312 |
| 0.203 | 3.0 | 288 | 0.1151 | 0.7723 | 0.8298 | 0.8000 | 94 | 0.7316 | 0.8323 | 0.7787 | 167 | 0.9184 | 0.9854 | 0.9507 | 137 | 0.8037 | 0.8844 | 0.8421 | 0.9660 |
| 0.1351 | 4.0 | 384 | 0.0815 | 0.7265 | 0.9043 | 0.8057 | 94 | 0.8187 | 0.7844 | 0.8012 | 167 | 0.9643 | 0.9854 | 0.9747 | 137 | 0.8417 | 0.8819 | 0.8613 | 0.9718 |
| 0.1121 | 5.0 | 480 | 0.0613 | 0.8866 | 0.9149 | 0.9005 | 94 | 0.8580 | 0.8683 | 0.8631 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9104 | 0.9196 | 0.9150 | 0.9807 |
| 0.0964 | 6.0 | 576 | 0.0584 | 0.8198 | 0.9681 | 0.8878 | 94 | 0.8788 | 0.8683 | 0.8735 | 167 | 0.9783 | 0.9854 | 0.9818 | 137 | 0.8961 | 0.9322 | 0.9138 | 0.9820 |
| 0.0832 | 7.0 | 672 | 0.0550 | 0.8558 | 0.9468 | 0.8990 | 94 | 0.8634 | 0.8323 | 0.8476 | 167 | 0.95 | 0.9708 | 0.9603 | 137 | 0.8914 | 0.9070 | 0.8991 | 0.9815 |
| 0.078 | 8.0 | 768 | 0.0495 | 0.8165 | 0.9468 | 0.8768 | 94 | 0.8742 | 0.8323 | 0.8528 | 167 | 0.9708 | 0.9708 | 0.9708 | 137 | 0.8914 | 0.9070 | 0.8991 | 0.9823 |
| 0.0701 | 9.0 | 864 | 0.0479 | 0.8108 | 0.9574 | 0.8780 | 94 | 0.8954 | 0.8204 | 0.8563 | 167 | 0.9710 | 0.9781 | 0.9745 | 137 | 0.8980 | 0.9070 | 0.9025 | 0.9831 |
| 0.0639 | 10.0 | 960 | 0.0465 | 0.89 | 0.9468 | 0.9175 | 94 | 0.8793 | 0.9162 | 0.8974 | 167 | 0.9638 | 0.9708 | 0.9673 | 137 | 0.9102 | 0.9422 | 0.9259 | 0.9854 |
| 0.0619 | 11.0 | 1056 | 0.0412 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9329 | 0.9162 | 0.9245 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9403 | 0.9497 | 0.9450 | 0.9873 |
| 0.056 | 12.0 | 1152 | 0.0478 | 0.8738 | 0.9574 | 0.9137 | 94 | 0.9167 | 0.8563 | 0.8854 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9340 | 0.9246 | 0.9293 | 0.9862 |
| 0.0502 | 13.0 | 1248 | 0.0438 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.9182 | 0.8743 | 0.8957 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9270 | 0.9246 | 0.9258 | 0.9862 |
| 0.0495 | 14.0 | 1344 | 0.0441 | 0.8505 | 0.9681 | 0.9055 | 94 | 0.9080 | 0.8862 | 0.8970 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9187 | 0.9372 | 0.9279 | 0.9854 |
| 0.0465 | 15.0 | 1440 | 0.0424 | 0.9072 | 0.9362 | 0.9215 | 94 | 0.9042 | 0.9042 | 0.9042 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.93 | 0.9347 | 0.9323 | 0.9870 |
| 0.045 | 16.0 | 1536 | 0.0447 | 0.8969 | 0.9255 | 0.9110 | 94 | 0.8966 | 0.9341 | 0.9150 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9238 | 0.9447 | 0.9342 | 0.9865 |
| 0.0427 | 17.0 | 1632 | 0.0470 | 0.8889 | 0.9362 | 0.9119 | 94 | 0.9146 | 0.8982 | 0.9063 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9298 | 0.9322 | 0.9310 | 0.9865 |
| 0.04 | 18.0 | 1728 | 0.0523 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.875 | 0.8383 | 0.8563 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9121 | 0.9121 | 0.9121 | 0.9826 |
| 0.0402 | 19.0 | 1824 | 0.0501 | 0.9062 | 0.9255 | 0.9158 | 94 | 0.9085 | 0.8922 | 0.9003 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9318 | 0.9271 | 0.9295 | 0.9865 |
| 0.0382 | 20.0 | 1920 | 0.0509 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9157 | 0.9102 | 0.9129 | 167 | 0.9708 | 0.9708 | 0.9708 | 137 | 0.9327 | 0.9397 | 0.9362 | 0.9865 |
| 0.0356 | 21.0 | 2016 | 0.0450 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9162 | 0.9162 | 0.9162 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9377 | 0.9447 | 0.9412 | 0.9876 |
| 0.0352 | 22.0 | 2112 | 0.0510 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9212 | 0.9102 | 0.9157 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9424 | 0.9447 | 0.9435 | 0.9867 |
| 0.0316 | 23.0 | 2208 | 0.0481 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9264 | 0.9042 | 0.9152 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9348 | 0.9372 | 0.9360 | 0.9870 |
| 0.03 | 24.0 | 2304 | 0.0503 | 0.8544 | 0.9362 | 0.8934 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9279 | 0.9372 | 0.9325 | 0.9856 |
| 0.0304 | 25.0 | 2400 | 0.0464 | 0.88 | 0.9362 | 0.9072 | 94 | 0.9102 | 0.9102 | 0.9102 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9280 | 0.9397 | 0.9338 | 0.9859 |
| 0.0292 | 26.0 | 2496 | 0.0471 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9212 | 0.9102 | 0.9157 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9352 | 0.9422 | 0.9387 | 0.9865 |
| 0.0279 | 27.0 | 2592 | 0.0479 | 0.8812 | 0.9468 | 0.9128 | 94 | 0.9068 | 0.8743 | 0.8902 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9271 | 0.9271 | 0.9271 | 0.9862 |
| 0.028 | 28.0 | 2688 | 0.0530 | 0.8491 | 0.9574 | 0.9 | 94 | 0.9304 | 0.8802 | 0.9046 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9252 | 0.9322 | 0.9287 | 0.9859 |
| 0.0266 | 29.0 | 2784 | 0.0534 | 0.8738 | 0.9574 | 0.9137 | 94 | 0.9367 | 0.8862 | 0.9108 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9370 | 0.9347 | 0.9358 | 0.9870 |
| 0.0247 | 30.0 | 2880 | 0.0493 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9152 | 0.9042 | 0.9096 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9330 | 0.9447 | 0.9388 | 0.9876 |
| 0.0232 | 31.0 | 2976 | 0.0528 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9207 | 0.9042 | 0.9124 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9397 | 0.9397 | 0.9397 | 0.9876 |
| 0.0249 | 32.0 | 3072 | 0.0561 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9053 | 0.9162 | 0.9107 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9307 | 0.9447 | 0.9377 | 0.9859 |
| 0.0235 | 33.0 | 3168 | 0.0554 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9379 | 0.9042 | 0.9207 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9492 | 0.9397 | 0.9444 | 0.9878 |
| 0.0226 | 34.0 | 3264 | 0.0601 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.925 | 0.8862 | 0.9052 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9296 | 0.9296 | 0.9296 | 0.9862 |
| 0.0216 | 35.0 | 3360 | 0.0611 | 0.8571 | 0.9574 | 0.9045 | 94 | 0.9423 | 0.8802 | 0.9102 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9322 | 0.9322 | 0.9322 | 0.9859 |
| 0.0211 | 36.0 | 3456 | 0.0548 | 0.8476 | 0.9468 | 0.8945 | 94 | 0.9610 | 0.8862 | 0.9221 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9367 | 0.9296 | 0.9332 | 0.9873 |
| 0.0209 | 37.0 | 3552 | 0.0531 | 0.8558 | 0.9468 | 0.8990 | 94 | 0.9367 | 0.8862 | 0.9108 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9347 | 0.9347 | 0.9347 | 0.9867 |
| 0.0178 | 38.0 | 3648 | 0.0520 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9397 | 0.9397 | 0.9397 | 0.9881 |
| 0.019 | 39.0 | 3744 | 0.0580 | 0.8654 | 0.9574 | 0.9091 | 94 | 0.9545 | 0.8802 | 0.9159 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9442 | 0.9347 | 0.9394 | 0.9876 |
| 0.0182 | 40.0 | 3840 | 0.0594 | 0.8585 | 0.9681 | 0.91 | 94 | 0.9419 | 0.8743 | 0.9068 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9320 | 0.9296 | 0.9308 | 0.9867 |
| 0.0175 | 41.0 | 3936 | 0.0499 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.9325 | 0.9102 | 0.9212 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9426 | 0.9497 | 0.9462 | 0.9884 |
| 0.017 | 42.0 | 4032 | 0.0531 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9347 | 0.9347 | 0.9347 | 0.9881 |
| 0.016 | 43.0 | 4128 | 0.0585 | 0.8738 | 0.9574 | 0.9137 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9352 | 0.9422 | 0.9387 | 0.9867 |
| 0.0176 | 44.0 | 4224 | 0.0629 | 0.8654 | 0.9574 | 0.9091 | 94 | 0.9241 | 0.8743 | 0.8985 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9320 | 0.9296 | 0.9308 | 0.9856 |
| 0.0183 | 45.0 | 4320 | 0.0527 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9193 | 0.8862 | 0.9024 | 167 | 0.9778 | 0.9635 | 0.9706 | 137 | 0.9342 | 0.9271 | 0.9306 | 0.9867 |
| 0.0154 | 46.0 | 4416 | 0.0611 | 0.8558 | 0.9468 | 0.8990 | 94 | 0.9481 | 0.8743 | 0.9097 | 167 | 0.9778 | 0.9635 | 0.9706 | 137 | 0.9338 | 0.9221 | 0.9279 | 0.9865 |
| 0.0133 | 47.0 | 4512 | 0.0661 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.8982 | 0.8982 | 0.8982 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9233 | 0.9372 | 0.9302 | 0.9865 |
| 0.0153 | 48.0 | 4608 | 0.0627 | 0.8641 | 0.9468 | 0.9036 | 94 | 0.9125 | 0.8743 | 0.8930 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9248 | 0.9271 | 0.9260 | 0.9865 |
| 0.0152 | 49.0 | 4704 | 0.0618 | 0.8641 | 0.9468 | 0.9036 | 94 | 0.9304 | 0.8802 | 0.9046 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9318 | 0.9271 | 0.9295 | 0.9867 |
| 0.0139 | 50.0 | 4800 | 0.0601 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9371 | 0.8922 | 0.9141 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9369 | 0.9322 | 0.9345 | 0.9876 |
| 0.0127 | 51.0 | 4896 | 0.0601 | 0.8812 | 0.9468 | 0.9128 | 94 | 0.9193 | 0.8862 | 0.9024 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9296 | 0.9296 | 0.9296 | 0.9873 |
| 0.0121 | 52.0 | 4992 | 0.0666 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9370 | 0.9347 | 0.9358 | 0.9876 |
| 0.0126 | 53.0 | 5088 | 0.0586 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9397 | 0.9397 | 0.9397 | 0.9878 |
| 0.0124 | 54.0 | 5184 | 0.0609 | 0.9462 | 0.9362 | 0.9412 | 94 | 0.9281 | 0.9281 | 0.9281 | 167 | 0.9708 | 0.9708 | 0.9708 | 137 | 0.9471 | 0.9447 | 0.9459 | 0.9887 |
| 0.0131 | 55.0 | 5280 | 0.0614 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9317 | 0.8982 | 0.9146 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9443 | 0.9372 | 0.9407 | 0.9881 |
| 0.013 | 56.0 | 5376 | 0.0632 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9325 | 0.9102 | 0.9212 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9422 | 0.9422 | 0.9422 | 0.9884 |
| 0.0119 | 57.0 | 5472 | 0.0645 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9146 | 0.8982 | 0.9063 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9279 | 0.9372 | 0.9325 | 0.9865 |
| 0.0121 | 58.0 | 5568 | 0.0631 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.925 | 0.8862 | 0.9052 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9322 | 0.9322 | 0.9322 | 0.9873 |
| 0.0109 | 59.0 | 5664 | 0.0638 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9375 | 0.8982 | 0.9174 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9418 | 0.9347 | 0.9382 | 0.9878 |
| 0.0117 | 60.0 | 5760 | 0.0665 | 0.8812 | 0.9468 | 0.9128 | 94 | 0.9371 | 0.8922 | 0.9141 | 167 | 0.9638 | 0.9708 | 0.9673 | 137 | 0.9322 | 0.9322 | 0.9322 | 0.9862 |
| 0.01 | 61.0 | 5856 | 0.0647 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9141 | 0.8922 | 0.9030 | 167 | 0.9708 | 0.9708 | 0.9708 | 137 | 0.9254 | 0.9347 | 0.93 | 0.9867 |
| 0.0108 | 62.0 | 5952 | 0.0601 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9317 | 0.8982 | 0.9146 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9876 |
| 0.0109 | 63.0 | 6048 | 0.0592 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9554 | 0.8982 | 0.9259 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9515 | 0.9372 | 0.9443 | 0.9890 |
| 0.0101 | 64.0 | 6144 | 0.0609 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.9212 | 0.9102 | 0.9157 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9280 | 0.9397 | 0.9338 | 0.9867 |
| 0.0115 | 65.0 | 6240 | 0.0620 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.9487 | 0.8862 | 0.9164 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9391 | 0.9296 | 0.9343 | 0.9870 |
| 0.0093 | 66.0 | 6336 | 0.0619 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9152 | 0.9042 | 0.9096 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9302 | 0.9372 | 0.9337 | 0.9870 |
| 0.0105 | 67.0 | 6432 | 0.0596 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9434 | 0.8982 | 0.9202 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9442 | 0.9347 | 0.9394 | 0.9878 |
| 0.0094 | 68.0 | 6528 | 0.0608 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9437 | 0.9042 | 0.9235 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9494 | 0.9422 | 0.9458 | 0.9887 |
| 0.0096 | 69.0 | 6624 | 0.0638 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9618 | 0.9042 | 0.9321 | 167 | 0.9778 | 0.9635 | 0.9706 | 137 | 0.9514 | 0.9347 | 0.9430 | 0.9884 |
| 0.0089 | 70.0 | 6720 | 0.0630 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9679 | 0.9042 | 0.9350 | 167 | 0.9778 | 0.9635 | 0.9706 | 137 | 0.9538 | 0.9347 | 0.9442 | 0.9887 |
| 0.0097 | 71.0 | 6816 | 0.0628 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9264 | 0.9042 | 0.9152 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9373 | 0.9397 | 0.9385 | 0.9878 |
| 0.0092 | 72.0 | 6912 | 0.0650 | 0.8812 | 0.9468 | 0.9128 | 94 | 0.9430 | 0.8922 | 0.9169 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9392 | 0.9322 | 0.9357 | 0.9873 |
| 0.0089 | 73.0 | 7008 | 0.0663 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9370 | 0.9347 | 0.9358 | 0.9878 |
| 0.0088 | 74.0 | 7104 | 0.0663 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9322 | 0.9322 | 0.9322 | 0.9876 |
| 0.0076 | 75.0 | 7200 | 0.0646 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9434 | 0.8982 | 0.9202 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9443 | 0.9372 | 0.9407 | 0.9884 |
| 0.0084 | 76.0 | 7296 | 0.0634 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9379 | 0.9042 | 0.9207 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9884 |
| 0.0079 | 77.0 | 7392 | 0.0647 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9347 | 0.9347 | 0.9347 | 0.9878 |
| 0.0078 | 78.0 | 7488 | 0.0669 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9325 | 0.9102 | 0.9212 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9881 |
| 0.0082 | 79.0 | 7584 | 0.0679 | 0.8812 | 0.9468 | 0.9128 | 94 | 0.9313 | 0.8922 | 0.9113 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9320 | 0.9296 | 0.9308 | 0.9870 |
| 0.0075 | 80.0 | 7680 | 0.0651 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9152 | 0.9042 | 0.9096 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.935 | 0.9397 | 0.9373 | 0.9884 |
| 0.0086 | 81.0 | 7776 | 0.0650 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9202 | 0.8982 | 0.9091 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9348 | 0.9372 | 0.9360 | 0.9873 |
| 0.0064 | 82.0 | 7872 | 0.0652 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9421 | 0.9397 | 0.9409 | 0.9881 |
| 0.006 | 83.0 | 7968 | 0.0676 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9264 | 0.9042 | 0.9152 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9397 | 0.9397 | 0.9397 | 0.9884 |
| 0.0079 | 84.0 | 8064 | 0.0677 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9268 | 0.9102 | 0.9184 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9373 | 0.9397 | 0.9385 | 0.9878 |
| 0.0074 | 85.0 | 8160 | 0.0699 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9778 | 0.9635 | 0.9706 | 137 | 0.9418 | 0.9347 | 0.9382 | 0.9878 |
| 0.0084 | 86.0 | 8256 | 0.0696 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9202 | 0.8982 | 0.9091 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9348 | 0.9372 | 0.9360 | 0.9878 |
| 0.007 | 87.0 | 8352 | 0.0680 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9372 | 0.9372 | 0.9372 | 0.9878 |
| 0.0069 | 88.0 | 8448 | 0.0665 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9430 | 0.8922 | 0.9169 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9418 | 0.9347 | 0.9382 | 0.9884 |
| 0.0066 | 89.0 | 8544 | 0.0673 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.925 | 0.8862 | 0.9052 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9370 | 0.9347 | 0.9358 | 0.9878 |
| 0.0072 | 90.0 | 8640 | 0.0687 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9487 | 0.8862 | 0.9164 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9440 | 0.9322 | 0.9381 | 0.9878 |
| 0.0067 | 91.0 | 8736 | 0.0693 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9255 | 0.8922 | 0.9085 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9298 | 0.9322 | 0.9310 | 0.9873 |
| 0.0064 | 92.0 | 8832 | 0.0697 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9372 | 0.9372 | 0.9372 | 0.9881 |
| 0.007 | 93.0 | 8928 | 0.0683 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9437 | 0.9042 | 0.9235 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9444 | 0.9397 | 0.9421 | 0.9890 |
| 0.0061 | 94.0 | 9024 | 0.0676 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9325 | 0.9102 | 0.9212 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9397 | 0.9397 | 0.9397 | 0.9884 |
| 0.0066 | 95.0 | 9120 | 0.0701 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9430 | 0.8922 | 0.9169 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9369 | 0.9322 | 0.9345 | 0.9881 |
| 0.0052 | 96.0 | 9216 | 0.0689 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9379 | 0.9042 | 0.9207 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9881 |
| 0.0052 | 97.0 | 9312 | 0.0702 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9434 | 0.8982 | 0.9202 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9394 | 0.9347 | 0.9370 | 0.9884 |
| 0.0073 | 98.0 | 9408 | 0.0692 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9379 | 0.9042 | 0.9207 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9881 |
| 0.0052 | 99.0 | 9504 | 0.0690 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9379 | 0.9042 | 0.9207 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9881 |
| 0.0063 | 100.0 | 9600 | 0.0690 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9379 | 0.9042 | 0.9207 | 167 | 0.9706 | 0.9635 | 0.9670 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9881 |
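The per-entity and overall columns can be reconciled: overall recall is micro-averaged, i.e. total true positives divided by total gold entities. A minimal sketch reconstructing it from the final epoch's per-entity recalls and supports (the inputs are rounded, so the match holds up to rounding):

```python
# Final-epoch per-entity (recall, Number) pairs from the table above.
entities = {
    "Location":     (0.9574, 94),
    "Organization": (0.9042, 167),
    "Person":       (0.9635, 137),
}

# recall * support recovers the (approximate) true-positive count per entity type.
true_positives = sum(recall * n for recall, n in entities.values())
total_gold = sum(n for _, n in entities.values())

micro_recall = true_positives / total_gold
print(round(micro_recall, 4))  # matches the reported Overall Recall of 0.9372
```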

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1

Model tree for apwic/nerui-pt-pl50-0

  • Base model: indolem/indobert-base-uncased