nerui-pt-pl20-0

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0562
  • Location Precision: 0.9000
  • Location Recall: 0.9574
  • Location F1: 0.9278
  • Location Number: 94
  • Organization Precision: 0.9268
  • Organization Recall: 0.9102
  • Organization F1: 0.9184
  • Organization Number: 167
  • Person Precision: 0.9926
  • Person Recall: 0.9854
  • Person F1: 0.9890
  • Person Number: 137
  • Overall Precision: 0.9425
  • Overall Recall: 0.9472
  • Overall F1: 0.9449
  • Overall Accuracy: 0.9876
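As a sanity check, the reported overall F1 is the harmonic mean of the overall precision and recall above; a quick sketch (small last-digit drift is expected since the published P/R are themselves rounded):

```python
# F1 is the harmonic mean of precision and recall.
def f1_score(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

# Reported overall metrics from the evaluation set above.
print(round(f1_score(0.9425, 0.9472), 3))  # ~0.945, matching the reported 0.9449
```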

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
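Assuming the scheduler ran with zero warmup steps (the Trainer default), a linear schedule decays the learning rate from 5e-05 to zero over the full 9,600 optimizer steps (100 epochs × 96 steps per epoch, per the results table below). A minimal plain-Python sketch of that decay (`linear_lr` is an illustrative helper, not the Transformers API):

```python
BASE_LR = 5e-05
TOTAL_STEPS = 9600  # 100 epochs x 96 optimizer steps per epoch

def linear_lr(step: int, base_lr: float = BASE_LR, total_steps: int = TOTAL_STEPS) -> float:
    """Linear decay to zero with no warmup, mirroring lr_scheduler_type=linear."""
    remaining = max(0, total_steps - step)
    return base_lr * remaining / total_steps

print(linear_lr(0))     # 5e-05 at the start of training
print(linear_lr(4800))  # 2.5e-05 halfway through
print(linear_lr(9600))  # 0.0 at the end
```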

Training results

Columns: Training Loss · Epoch · Step · Validation Loss · then Precision, Recall, F1, Number for each of Location, Organization, Person · then Overall Precision, Recall, F1, Accuracy
0.8717 1.0 96 0.4085 0.0 0.0 0.0 94 0.2308 0.2156 0.2229 167 0.2595 0.3504 0.2981 137 0.2456 0.2111 0.2270 0.8638
0.3803 2.0 192 0.2545 0.3913 0.2872 0.3313 94 0.4693 0.6407 0.5418 167 0.6034 0.7883 0.6835 137 0.5084 0.6080 0.5538 0.9238
0.204 3.0 288 0.1028 0.8041 0.8298 0.8168 94 0.7487 0.8383 0.7910 167 0.9714 0.9927 0.9819 137 0.8349 0.8894 0.8613 0.9702
0.1305 4.0 384 0.0795 0.8447 0.9255 0.8832 94 0.8068 0.8503 0.8280 167 0.9645 0.9927 0.9784 137 0.8690 0.9171 0.8924 0.9738
0.1104 5.0 480 0.0698 0.8131 0.9255 0.8657 94 0.8535 0.8024 0.8272 167 0.9855 0.9927 0.9891 137 0.8881 0.8970 0.8925 0.9782
0.098 6.0 576 0.0536 0.8476 0.9468 0.8945 94 0.8690 0.8743 0.8716 167 0.9783 0.9854 0.9818 137 0.9002 0.9296 0.9147 0.9820
0.0876 7.0 672 0.0461 0.9010 0.9681 0.9333 94 0.8636 0.9102 0.8863 167 0.9926 0.9854 0.9890 137 0.9153 0.9497 0.9322 0.9851
0.0784 8.0 768 0.0440 0.8911 0.9574 0.9231 94 0.8795 0.8743 0.8769 167 0.9854 0.9854 0.9854 137 0.9183 0.9322 0.9252 0.9859
0.0721 9.0 864 0.0417 0.8364 0.9787 0.9020 94 0.9152 0.9042 0.9096 167 0.9926 0.9854 0.9890 137 0.9197 0.9497 0.9345 0.9854
0.0659 10.0 960 0.0440 0.8788 0.9255 0.9016 94 0.8791 0.9581 0.9169 167 0.9854 0.9854 0.9854 137 0.9139 0.9598 0.9363 0.9856
0.063 11.0 1056 0.0420 0.8288 0.9787 0.8976 94 0.9080 0.8862 0.8970 167 0.9926 0.9781 0.9853 137 0.9144 0.9397 0.9269 0.9845
0.0583 12.0 1152 0.0463 0.8529 0.9255 0.8878 94 0.8929 0.8982 0.8955 167 1.0 0.9854 0.9926 137 0.9185 0.9347 0.9265 0.9845
0.0562 13.0 1248 0.0454 0.87 0.9255 0.8969 94 0.8857 0.9281 0.9064 167 0.9926 0.9854 0.9890 137 0.9173 0.9472 0.9320 0.9840
0.0528 14.0 1344 0.0391 0.8365 0.9255 0.8788 94 0.8994 0.9102 0.9048 167 0.9927 0.9927 0.9927 137 0.9146 0.9422 0.9282 0.9867
0.048 15.0 1440 0.0409 0.8627 0.9362 0.8980 94 0.8895 0.9162 0.9027 167 0.9853 0.9781 0.9817 137 0.9146 0.9422 0.9282 0.9854
0.0464 16.0 1536 0.0360 0.8911 0.9574 0.9231 94 0.9329 0.9162 0.9245 167 1.0 0.9854 0.9926 137 0.945 0.9497 0.9474 0.9878
0.0466 17.0 1632 0.0425 0.8491 0.9574 0.9 94 0.9273 0.9162 0.9217 167 1.0 0.9781 0.9889 137 0.9309 0.9472 0.9390 0.9867
0.041 18.0 1728 0.0423 0.8333 0.9574 0.8911 94 0.9474 0.8623 0.9028 167 1.0 0.9854 0.9926 137 0.9342 0.9271 0.9306 0.9865
0.0423 19.0 1824 0.0420 0.8738 0.9574 0.9137 94 0.9141 0.8922 0.9030 167 1.0 0.9854 0.9926 137 0.9327 0.9397 0.9362 0.9856
0.0385 20.0 1920 0.0412 0.8505 0.9681 0.9055 94 0.9130 0.8802 0.8963 167 0.9926 0.9854 0.9890 137 0.9233 0.9372 0.9302 0.9854
0.0372 21.0 2016 0.0327 0.91 0.9681 0.9381 94 0.9286 0.9341 0.9313 167 0.9926 0.9854 0.9890 137 0.9455 0.9598 0.9526 0.9890
0.0336 22.0 2112 0.0359 0.9010 0.9681 0.9333 94 0.9186 0.9461 0.9322 167 0.9926 0.9854 0.9890 137 0.9389 0.9648 0.9517 0.9876
0.0345 23.0 2208 0.0414 0.8571 0.9574 0.9045 94 0.9375 0.8982 0.9174 167 1.0 0.9854 0.9926 137 0.9375 0.9422 0.9398 0.9870
0.0342 24.0 2304 0.0430 0.8911 0.9574 0.9231 94 0.925 0.8862 0.9052 167 0.9926 0.9781 0.9853 137 0.9394 0.9347 0.9370 0.9865
0.0315 25.0 2400 0.0396 0.9192 0.9681 0.9430 94 0.9136 0.8862 0.8997 167 0.9781 0.9781 0.9781 137 0.9372 0.9372 0.9372 0.9865
0.0306 26.0 2496 0.0405 0.8738 0.9574 0.9137 94 0.9202 0.8982 0.9091 167 0.9926 0.9854 0.9890 137 0.9328 0.9422 0.9375 0.9865
0.0302 27.0 2592 0.0496 0.8627 0.9362 0.8980 94 0.9295 0.8683 0.8978 167 1.0 0.9854 0.9926 137 0.9364 0.9246 0.9305 0.9865
0.0287 28.0 2688 0.0392 0.8641 0.9468 0.9036 94 0.9130 0.8802 0.8963 167 0.9926 0.9781 0.9853 137 0.9273 0.9296 0.9285 0.9859
0.0314 29.0 2784 0.0426 0.8654 0.9574 0.9091 94 0.9152 0.9042 0.9096 167 1.0 0.9781 0.9889 137 0.9305 0.9422 0.9363 0.9870
0.0259 30.0 2880 0.0466 0.9 0.9574 0.9278 94 0.9152 0.9042 0.9096 167 0.9926 0.9854 0.9890 137 0.9377 0.9447 0.9412 0.9862
0.0285 31.0 2976 0.0450 0.8990 0.9468 0.9223 94 0.9273 0.9162 0.9217 167 0.9781 0.9781 0.9781 137 0.9377 0.9447 0.9412 0.9862
0.0272 32.0 3072 0.0547 0.8396 0.9468 0.89 94 0.9042 0.9042 0.9042 167 1.0 0.9854 0.9926 137 0.9191 0.9422 0.9305 0.9837
0.0262 33.0 3168 0.0352 0.88 0.9362 0.9072 94 0.9277 0.9222 0.9249 167 0.9779 0.9708 0.9744 137 0.9328 0.9422 0.9375 0.9876
0.0255 34.0 3264 0.0422 0.8980 0.9362 0.9167 94 0.9198 0.8922 0.9058 167 0.9640 0.9781 0.9710 137 0.9298 0.9322 0.9310 0.9854
0.024 35.0 3360 0.0519 0.8505 0.9681 0.9055 94 0.9241 0.8743 0.8985 167 0.9854 0.9854 0.9854 137 0.9254 0.9347 0.93 0.9848
0.0223 36.0 3456 0.0559 0.8738 0.9574 0.9137 94 0.9375 0.8982 0.9174 167 0.9854 0.9854 0.9854 137 0.9375 0.9422 0.9398 0.9856
0.0213 37.0 3552 0.0485 0.8627 0.9362 0.8980 94 0.9042 0.9042 0.9042 167 0.9926 0.9854 0.9890 137 0.9235 0.9397 0.9315 0.9848
0.0221 38.0 3648 0.0481 0.9091 0.9574 0.9326 94 0.9112 0.9222 0.9167 167 0.9926 0.9781 0.9853 137 0.9380 0.9497 0.9438 0.9870
0.0217 39.0 3744 0.0462 0.875 0.9681 0.9192 94 0.9152 0.9042 0.9096 167 0.9926 0.9854 0.9890 137 0.9309 0.9472 0.9390 0.9859
0.0243 40.0 3840 0.0416 0.8932 0.9787 0.9340 94 0.9217 0.9162 0.9189 167 0.9853 0.9781 0.9817 137 0.9358 0.9523 0.9440 0.9876
0.0202 41.0 3936 0.0496 0.8725 0.9468 0.9082 94 0.9222 0.9222 0.9222 167 1.0 0.9854 0.9926 137 0.9356 0.9497 0.9426 0.9862
0.0191 42.0 4032 0.0408 0.8846 0.9787 0.9293 94 0.9207 0.9042 0.9124 167 0.9926 0.9854 0.9890 137 0.9356 0.9497 0.9426 0.9881
0.0191 43.0 4128 0.0425 0.875 0.9681 0.9192 94 0.9317 0.8982 0.9146 167 0.9926 0.9854 0.9890 137 0.9377 0.9447 0.9412 0.9873
0.0187 44.0 4224 0.0535 0.8411 0.9574 0.8955 94 0.9182 0.8743 0.8957 167 1.0 0.9854 0.9926 137 0.9252 0.9322 0.9287 0.9851
0.0189 45.0 4320 0.0353 0.89 0.9468 0.9175 94 0.9337 0.9281 0.9309 167 0.9926 0.9854 0.9890 137 0.9428 0.9523 0.9475 0.9895
0.0194 46.0 4416 0.0437 0.9010 0.9681 0.9333 94 0.9317 0.8982 0.9146 167 0.9926 0.9781 0.9853 137 0.9446 0.9422 0.9434 0.9881
0.0168 47.0 4512 0.0428 0.9010 0.9681 0.9333 94 0.9006 0.9222 0.9112 167 0.9926 0.9781 0.9853 137 0.9312 0.9523 0.9416 0.9878
0.0149 48.0 4608 0.0447 0.89 0.9468 0.9175 94 0.9268 0.9102 0.9184 167 0.9853 0.9781 0.9817 137 0.9375 0.9422 0.9398 0.9873
0.0167 49.0 4704 0.0453 0.8679 0.9787 0.9200 94 0.9444 0.9162 0.9301 167 1.0 0.9854 0.9926 137 0.9429 0.9548 0.9488 0.9884
0.0173 50.0 4800 0.0481 0.8835 0.9681 0.9239 94 0.9557 0.9042 0.9292 167 0.9854 0.9854 0.9854 137 0.9472 0.9472 0.9472 0.9876
0.0171 51.0 4896 0.0443 0.9 0.9574 0.9278 94 0.9371 0.8922 0.9141 167 0.9781 0.9781 0.9781 137 0.9419 0.9372 0.9395 0.9870
0.016 52.0 4992 0.0398 0.875 0.9681 0.9192 94 0.8988 0.9042 0.9015 167 0.9779 0.9708 0.9744 137 0.9191 0.9422 0.9305 0.9870
0.0163 53.0 5088 0.0472 0.8667 0.9681 0.9146 94 0.9371 0.8922 0.9141 167 0.9854 0.9854 0.9854 137 0.9352 0.9422 0.9387 0.9873
0.0147 54.0 5184 0.0444 0.9029 0.9894 0.9442 94 0.9390 0.9222 0.9305 167 0.9926 0.9854 0.9890 137 0.9479 0.9598 0.9538 0.9881
0.0143 55.0 5280 0.0496 0.8835 0.9681 0.9239 94 0.9096 0.9042 0.9069 167 0.9781 0.9781 0.9781 137 0.9261 0.9447 0.9353 0.9867
0.0146 56.0 5376 0.0445 0.9192 0.9681 0.9430 94 0.9321 0.9042 0.9179 167 0.9926 0.9854 0.9890 137 0.9496 0.9472 0.9484 0.9884
0.0129 57.0 5472 0.0445 0.9010 0.9681 0.9333 94 0.9557 0.9042 0.9292 167 0.9926 0.9854 0.9890 137 0.9544 0.9472 0.9508 0.9881
0.0139 58.0 5568 0.0445 0.8922 0.9681 0.9286 94 0.9490 0.8922 0.9198 167 0.9926 0.9854 0.9890 137 0.9494 0.9422 0.9458 0.9878
0.0136 59.0 5664 0.0508 0.8835 0.9681 0.9239 94 0.9329 0.9162 0.9245 167 0.9854 0.9854 0.9854 137 0.9381 0.9523 0.9451 0.9873
0.0119 60.0 5760 0.0508 0.8835 0.9681 0.9239 94 0.9162 0.9162 0.9162 167 0.9926 0.9781 0.9853 137 0.9333 0.9497 0.9415 0.9862
0.013 61.0 5856 0.0566 0.8889 0.9362 0.9119 94 0.9212 0.9102 0.9157 167 0.9926 0.9854 0.9890 137 0.9375 0.9422 0.9398 0.9859
0.0123 62.0 5952 0.0436 0.9167 0.9362 0.9263 94 0.9394 0.9281 0.9337 167 0.9854 0.9854 0.9854 137 0.9497 0.9497 0.9497 0.9892
0.0126 63.0 6048 0.0445 0.9192 0.9681 0.9430 94 0.9162 0.9162 0.9162 167 0.9926 0.9854 0.9890 137 0.9428 0.9523 0.9475 0.9873
0.0124 64.0 6144 0.0504 0.8762 0.9787 0.9246 94 0.9427 0.8862 0.9136 167 0.9926 0.9854 0.9890 137 0.9422 0.9422 0.9422 0.9878
0.0126 65.0 6240 0.0495 0.8911 0.9574 0.9231 94 0.9317 0.8982 0.9146 167 0.9926 0.9854 0.9890 137 0.9422 0.9422 0.9422 0.9870
0.013 66.0 6336 0.0517 0.8922 0.9681 0.9286 94 0.9308 0.8862 0.9080 167 0.9926 0.9854 0.9890 137 0.9421 0.9397 0.9409 0.9878
0.0112 67.0 6432 0.0500 0.8835 0.9681 0.9239 94 0.9487 0.8862 0.9164 167 0.9926 0.9854 0.9890 137 0.9468 0.9397 0.9433 0.9878
0.0103 68.0 6528 0.0515 0.9091 0.9574 0.9326 94 0.9281 0.9281 0.9281 167 0.9926 0.9854 0.9890 137 0.9453 0.9548 0.95 0.9878
0.0115 69.0 6624 0.0459 0.9010 0.9681 0.9333 94 0.9423 0.8802 0.9102 167 0.9926 0.9854 0.9890 137 0.9491 0.9372 0.9431 0.9878
0.0129 70.0 6720 0.0516 0.9184 0.9574 0.9375 94 0.9217 0.9162 0.9189 167 0.9926 0.9854 0.9890 137 0.945 0.9497 0.9474 0.9878
0.0105 71.0 6816 0.0486 0.8889 0.9362 0.9119 94 0.9207 0.9042 0.9124 167 0.9853 0.9781 0.9817 137 0.9348 0.9372 0.9360 0.9862
0.0098 72.0 6912 0.0553 0.875 0.9681 0.9192 94 0.9259 0.8982 0.9119 167 0.9779 0.9708 0.9744 137 0.9303 0.9397 0.9350 0.9862
0.0097 73.0 7008 0.0510 0.9 0.9574 0.9278 94 0.9497 0.9042 0.9264 167 0.9926 0.9854 0.9890 137 0.9519 0.9447 0.9483 0.9884
0.0109 74.0 7104 0.0485 0.91 0.9681 0.9381 94 0.9444 0.9162 0.9301 167 0.9781 0.9781 0.9781 137 0.9474 0.9497 0.9486 0.9884
0.0087 75.0 7200 0.0580 0.8738 0.9574 0.9137 94 0.9375 0.8982 0.9174 167 0.9926 0.9854 0.9890 137 0.9398 0.9422 0.9410 0.9867
0.0108 76.0 7296 0.0486 0.9010 0.9681 0.9333 94 0.9375 0.8982 0.9174 167 0.9926 0.9854 0.9890 137 0.9471 0.9447 0.9459 0.9878
0.0096 77.0 7392 0.0497 0.91 0.9681 0.9381 94 0.9434 0.8982 0.9202 167 0.9926 0.9854 0.9890 137 0.9519 0.9447 0.9483 0.9884
0.0091 78.0 7488 0.0498 0.9091 0.9574 0.9326 94 0.9375 0.8982 0.9174 167 0.9926 0.9854 0.9890 137 0.9494 0.9422 0.9458 0.9876
0.0106 79.0 7584 0.0502 0.8922 0.9681 0.9286 94 0.9390 0.9222 0.9305 167 0.9926 0.9854 0.9890 137 0.9453 0.9548 0.95 0.9881
0.0084 80.0 7680 0.0556 0.9 0.9574 0.9278 94 0.9255 0.8922 0.9085 167 0.9926 0.9854 0.9890 137 0.9421 0.9397 0.9409 0.9867
0.0096 81.0 7776 0.0591 0.8835 0.9681 0.9239 94 0.9146 0.8982 0.9063 167 0.9853 0.9781 0.9817 137 0.9305 0.9422 0.9363 0.9862
0.0087 82.0 7872 0.0520 0.9 0.9574 0.9278 94 0.9255 0.8922 0.9085 167 0.9926 0.9854 0.9890 137 0.9421 0.9397 0.9409 0.9867
0.0087 83.0 7968 0.0596 0.8824 0.9574 0.9184 94 0.9325 0.9102 0.9212 167 0.9926 0.9854 0.9890 137 0.9401 0.9472 0.9437 0.9870
0.0076 84.0 8064 0.0571 0.8911 0.9574 0.9231 94 0.9273 0.9162 0.9217 167 0.9926 0.9854 0.9890 137 0.9403 0.9497 0.9450 0.9870
0.0083 85.0 8160 0.0572 0.8835 0.9681 0.9239 94 0.9255 0.8922 0.9085 167 0.9926 0.9854 0.9890 137 0.9375 0.9422 0.9398 0.9867
0.0089 86.0 8256 0.0548 0.8922 0.9681 0.9286 94 0.9325 0.9102 0.9212 167 0.9926 0.9854 0.9890 137 0.9426 0.9497 0.9462 0.9873
0.0078 87.0 8352 0.0571 0.8911 0.9574 0.9231 94 0.9217 0.9162 0.9189 167 0.9926 0.9854 0.9890 137 0.9380 0.9497 0.9438 0.9870
0.0088 88.0 8448 0.0538 0.9 0.9574 0.9278 94 0.9212 0.9102 0.9157 167 0.9926 0.9854 0.9890 137 0.9401 0.9472 0.9437 0.9876
0.0079 89.0 8544 0.0579 0.8835 0.9681 0.9239 94 0.9383 0.9102 0.9240 167 0.9926 0.9854 0.9890 137 0.9426 0.9497 0.9462 0.9876
0.0094 90.0 8640 0.0556 0.9010 0.9681 0.9333 94 0.9325 0.9102 0.9212 167 0.9926 0.9854 0.9890 137 0.945 0.9497 0.9474 0.9878
0.0081 91.0 8736 0.0570 0.8980 0.9362 0.9167 94 0.9217 0.9162 0.9189 167 0.9926 0.9854 0.9890 137 0.94 0.9447 0.9424 0.9870
0.0073 92.0 8832 0.0574 0.8835 0.9681 0.9239 94 0.9264 0.9042 0.9152 167 0.9926 0.9854 0.9890 137 0.9378 0.9472 0.9425 0.9870
0.0076 93.0 8928 0.0575 0.8911 0.9574 0.9231 94 0.9264 0.9042 0.9152 167 0.9926 0.9854 0.9890 137 0.94 0.9447 0.9424 0.9873
0.0089 94.0 9024 0.0568 0.8911 0.9574 0.9231 94 0.9264 0.9042 0.9152 167 0.9926 0.9854 0.9890 137 0.94 0.9447 0.9424 0.9873
0.0081 95.0 9120 0.0586 0.8835 0.9681 0.9239 94 0.9321 0.9042 0.9179 167 0.9926 0.9854 0.9890 137 0.9401 0.9472 0.9437 0.9873
0.0077 96.0 9216 0.0569 0.89 0.9468 0.9175 94 0.9207 0.9042 0.9124 167 0.9926 0.9854 0.9890 137 0.9375 0.9422 0.9398 0.9870
0.0069 97.0 9312 0.0575 0.8911 0.9574 0.9231 94 0.9264 0.9042 0.9152 167 0.9926 0.9854 0.9890 137 0.94 0.9447 0.9424 0.9873
0.0086 98.0 9408 0.0583 0.8824 0.9574 0.9184 94 0.9264 0.9042 0.9152 167 0.9926 0.9854 0.9890 137 0.9377 0.9447 0.9412 0.9870
0.0073 99.0 9504 0.0564 0.9 0.9574 0.9278 94 0.9268 0.9102 0.9184 167 0.9926 0.9854 0.9890 137 0.9425 0.9472 0.9449 0.9876
0.0072 100.0 9600 0.0562 0.9 0.9574 0.9278 94 0.9268 0.9102 0.9184 167 0.9926 0.9854 0.9890 137 0.9425 0.9472 0.9449 0.9876
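The per-entity precision/recall/F1 figures in the table are entity-level scores (an entity counts only on an exact span-and-type match), as computed by libraries such as seqeval. A minimal pure-Python sketch of that matching, using made-up example spans:

```python
def entity_prf(gold: set, pred: set) -> tuple:
    """Entity-level scores: an entity counts only on an exact (type, start, end) match."""
    tp = len(gold & pred)  # true positives: exact matches between gold and predicted spans
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical spans: (label, start_token, end_token)
gold = {("PER", 0, 1), ("LOC", 5, 5), ("ORG", 8, 10)}
pred = {("PER", 0, 1), ("LOC", 5, 6), ("ORG", 8, 10)}  # LOC boundary is off by one token
print(entity_prf(gold, pred))  # each score is 2/3: the mismatched LOC span gets no credit
```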

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2
Model tree for apwic/nerui-pt-pl20-0