nerui-pt-pl5-2

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0678
  • Location Precision: 0.9255
  • Location Recall: 0.9355
  • Location F1: 0.9305
  • Location Number: 93
  • Organization Precision: 0.9162
  • Organization Recall: 0.9217
  • Organization F1: 0.9189
  • Organization Number: 166
  • Person Precision: 0.9718
  • Person Recall: 0.9718
  • Person F1: 0.9718
  • Person Number: 142
  • Overall Precision: 0.9380
  • Overall Recall: 0.9426
  • Overall F1: 0.9403
  • Overall Accuracy: 0.9877
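The overall recall above is consistent with micro-averaging across the three entity types (the convention used by seqeval-style NER evaluation): total true positives divided by total gold entities. A minimal sketch verifying this from the per-entity recalls and supports listed above:

```python
# Per-entity (recall, gold support) from the evaluation results above.
entities = {
    "Location": (0.9355, 93),
    "Organization": (0.9217, 166),
    "Person": (0.9718, 142),
}

# Recover approximate true-positive counts, then micro-average:
# overall recall = total true positives / total gold entities.
true_positives = sum(recall * support for recall, support in entities.values())
total_entities = sum(support for _, support in entities.values())
overall_recall = true_positives / total_entities

print(round(overall_recall, 4))  # 0.9426, matching the reported overall recall
```

The same reconstruction is not possible for overall precision from this table alone, since precision is weighted by predicted entity counts, which are not reported.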

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
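The hyperparameters above map onto a transformers `TrainingArguments` configuration along these lines (a sketch only: `output_dir` is an illustrative placeholder, and any argument not listed above is left at its Trainer default):

```python
from transformers import TrainingArguments

# Sketch of the training configuration implied by the hyperparameters above.
# `output_dir` is a placeholder, not taken from the model card.
training_args = TrainingArguments(
    output_dir="nerui-pt-pl5-2",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
)
```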

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.881 | 1.0 | 96 | 0.3962 | 0.3333 | 0.0108 | 0.0208 | 93 | 0.2688 | 0.3012 | 0.2841 | 166 | 0.2834 | 0.3732 | 0.3222 | 142 | 0.2766 | 0.2594 | 0.2677 | 0.8672 |
| 0.3506 | 2.0 | 192 | 0.1969 | 0.3094 | 0.4624 | 0.3707 | 93 | 0.64 | 0.6747 | 0.6569 | 166 | 0.6437 | 0.7887 | 0.7089 | 142 | 0.5471 | 0.6658 | 0.6007 | 0.9418 |
| 0.1965 | 3.0 | 288 | 0.1110 | 0.7470 | 0.6667 | 0.7045 | 93 | 0.6591 | 0.8735 | 0.7513 | 166 | 0.9459 | 0.9859 | 0.9655 | 142 | 0.7694 | 0.8653 | 0.8146 | 0.9638 |
| 0.1379 | 4.0 | 384 | 0.0912 | 0.6957 | 0.8602 | 0.7692 | 93 | 0.7772 | 0.8614 | 0.8171 | 166 | 0.9792 | 0.9930 | 0.9860 | 142 | 0.8217 | 0.9077 | 0.8626 | 0.9701 |
| 0.1107 | 5.0 | 480 | 0.0687 | 0.8265 | 0.8710 | 0.8482 | 93 | 0.8268 | 0.8916 | 0.8580 | 166 | 0.9722 | 0.9859 | 0.9790 | 142 | 0.8765 | 0.9202 | 0.8978 | 0.9794 |
| 0.0972 | 6.0 | 576 | 0.0764 | 0.7213 | 0.9462 | 0.8186 | 93 | 0.8618 | 0.7892 | 0.8239 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.8606 | 0.8928 | 0.8764 | 0.9759 |
| 0.086 | 7.0 | 672 | 0.0620 | 0.8265 | 0.8710 | 0.8482 | 93 | 0.8158 | 0.9337 | 0.8708 | 166 | 0.9655 | 0.9859 | 0.9756 | 142 | 0.8684 | 0.9377 | 0.9017 | 0.9789 |
| 0.0786 | 8.0 | 768 | 0.0522 | 0.8131 | 0.9355 | 0.8700 | 93 | 0.8539 | 0.9157 | 0.8837 | 166 | 0.9857 | 0.9718 | 0.9787 | 142 | 0.8871 | 0.9401 | 0.9128 | 0.9835 |
| 0.0709 | 9.0 | 864 | 0.0522 | 0.7757 | 0.8925 | 0.83 | 93 | 0.8820 | 0.8554 | 0.8685 | 166 | 0.9857 | 0.9718 | 0.9787 | 142 | 0.8897 | 0.9052 | 0.8974 | 0.9808 |
| 0.0665 | 10.0 | 960 | 0.0494 | 0.8 | 0.9032 | 0.8485 | 93 | 0.8895 | 0.9217 | 0.9053 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.8952 | 0.9377 | 0.9160 | 0.9838 |
| 0.0598 | 11.0 | 1056 | 0.0489 | 0.8333 | 0.9140 | 0.8718 | 93 | 0.8793 | 0.9217 | 0.9 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9017 | 0.9377 | 0.9193 | 0.9838 |
| 0.0584 | 12.0 | 1152 | 0.0502 | 0.8485 | 0.9032 | 0.875 | 93 | 0.8844 | 0.9217 | 0.9027 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9058 | 0.9352 | 0.9202 | 0.9849 |
| 0.0536 | 13.0 | 1248 | 0.0502 | 0.8673 | 0.9140 | 0.8901 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9252 | 0.9252 | 0.9252 | 0.9855 |
| 0.0519 | 14.0 | 1344 | 0.0527 | 0.9043 | 0.9140 | 0.9091 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9372 | 0.9302 | 0.9337 | 0.9855 |
| 0.0507 | 15.0 | 1440 | 0.0450 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9328 | 0.9352 | 0.9340 | 0.9879 |
| 0.0451 | 16.0 | 1536 | 0.0458 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9261 | 0.9377 | 0.9318 | 0.9866 |
| 0.0438 | 17.0 | 1632 | 0.0516 | 0.8381 | 0.9462 | 0.8889 | 93 | 0.9313 | 0.8976 | 0.9141 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9191 | 0.9352 | 0.9271 | 0.9860 |
| 0.0415 | 18.0 | 1728 | 0.0491 | 0.8866 | 0.9247 | 0.9053 | 93 | 0.9085 | 0.8976 | 0.9030 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9233 | 0.9302 | 0.9267 | 0.9866 |
| 0.04 | 19.0 | 1824 | 0.0481 | 0.9348 | 0.9247 | 0.9297 | 93 | 0.9167 | 0.9277 | 0.9222 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9380 | 0.9426 | 0.9403 | 0.9879 |
| 0.0383 | 20.0 | 1920 | 0.0501 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.9394 | 0.9337 | 0.9366 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9432 | 0.9526 | 0.9479 | 0.9890 |
| 0.0391 | 21.0 | 2016 | 0.0523 | 0.8462 | 0.9462 | 0.8934 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9214 | 0.9352 | 0.9282 | 0.9860 |
| 0.0353 | 22.0 | 2112 | 0.0451 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9231 | 0.9398 | 0.9313 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9363 | 0.9526 | 0.9444 | 0.9890 |
| 0.0356 | 23.0 | 2208 | 0.0490 | 0.9062 | 0.9355 | 0.9206 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9874 |
| 0.0333 | 24.0 | 2304 | 0.0491 | 0.8641 | 0.9570 | 0.9082 | 93 | 0.9576 | 0.9518 | 0.9547 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9390 | 0.9601 | 0.9494 | 0.9885 |
| 0.033 | 25.0 | 2400 | 0.0544 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9852 |
| 0.0333 | 26.0 | 2496 | 0.0555 | 0.9247 | 0.9247 | 0.9247 | 93 | 0.9394 | 0.9337 | 0.9366 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9451 | 0.9451 | 0.9451 | 0.9866 |
| 0.0309 | 27.0 | 2592 | 0.0562 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9448 | 0.9277 | 0.9362 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9479 | 0.9526 | 0.9502 | 0.9879 |
| 0.0311 | 28.0 | 2688 | 0.0547 | 0.8804 | 0.8710 | 0.8757 | 93 | 0.8851 | 0.9277 | 0.9059 | 166 | 0.9786 | 0.9648 | 0.9716 | 142 | 0.9163 | 0.9277 | 0.9219 | 0.9866 |
| 0.0272 | 29.0 | 2784 | 0.0543 | 0.8544 | 0.9462 | 0.8980 | 93 | 0.9187 | 0.8855 | 0.9018 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9208 | 0.9277 | 0.9242 | 0.9868 |
| 0.0257 | 30.0 | 2880 | 0.0589 | 0.8660 | 0.9032 | 0.8842 | 93 | 0.8571 | 0.9398 | 0.8966 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9024 | 0.9451 | 0.9233 | 0.9844 |
| 0.0263 | 31.0 | 2976 | 0.0547 | 0.9167 | 0.9462 | 0.9312 | 93 | 0.9387 | 0.9217 | 0.9301 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9475 | 0.9451 | 0.9463 | 0.9882 |
| 0.0271 | 32.0 | 3072 | 0.0531 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9383 | 0.9157 | 0.9268 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9406 | 0.9476 | 0.9441 | 0.9874 |
| 0.0271 | 33.0 | 3168 | 0.0502 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9379 | 0.9096 | 0.9235 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9401 | 0.9401 | 0.9401 | 0.9874 |
| 0.0255 | 34.0 | 3264 | 0.0632 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9451 | 0.9337 | 0.9394 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9410 | 0.9551 | 0.9480 | 0.9874 |
| 0.0235 | 35.0 | 3360 | 0.0592 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9394 | 0.9337 | 0.9366 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9863 |
| 0.0243 | 36.0 | 3456 | 0.0575 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9560 | 0.9157 | 0.9354 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9360 | 0.9476 | 0.9418 | 0.9877 |
| 0.0225 | 37.0 | 3552 | 0.0586 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9586 | 0.9789 | 0.9686 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9868 |
| 0.0239 | 38.0 | 3648 | 0.0495 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9885 |
| 0.0202 | 39.0 | 3744 | 0.0533 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9437 | 0.9096 | 0.9264 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9406 | 0.9476 | 0.9441 | 0.9890 |
| 0.0199 | 40.0 | 3840 | 0.0547 | 0.9053 | 0.9247 | 0.9149 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9332 | 0.9401 | 0.9366 | 0.9879 |
| 0.0219 | 41.0 | 3936 | 0.0562 | 0.8947 | 0.9140 | 0.9043 | 93 | 0.9172 | 0.9337 | 0.9254 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9874 |
| 0.0209 | 42.0 | 4032 | 0.0575 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9437 | 0.9096 | 0.9264 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9403 | 0.9426 | 0.9415 | 0.9874 |
| 0.0188 | 43.0 | 4128 | 0.0590 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9273 | 0.9217 | 0.9245 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9337 | 0.9476 | 0.9406 | 0.9874 |
| 0.0209 | 44.0 | 4224 | 0.0552 | 0.8763 | 0.9140 | 0.8947 | 93 | 0.8830 | 0.9096 | 0.8961 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9144 | 0.9327 | 0.9235 | 0.9855 |
| 0.0179 | 45.0 | 4320 | 0.0590 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9375 | 0.9036 | 0.9202 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9355 | 0.9401 | 0.9378 | 0.9871 |
| 0.0173 | 46.0 | 4416 | 0.0643 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9259 | 0.9352 | 0.9305 | 0.9863 |
| 0.0199 | 47.0 | 4512 | 0.0652 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9855 |
| 0.0183 | 48.0 | 4608 | 0.0625 | 0.9158 | 0.9355 | 0.9255 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9868 |
| 0.0176 | 49.0 | 4704 | 0.0577 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9885 |
| 0.0176 | 50.0 | 4800 | 0.0624 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9786 | 0.9648 | 0.9716 | 142 | 0.9355 | 0.9401 | 0.9378 | 0.9877 |
| 0.017 | 51.0 | 4896 | 0.0606 | 0.9062 | 0.9355 | 0.9206 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9871 |
| 0.0174 | 52.0 | 4992 | 0.0637 | 0.9255 | 0.9355 | 0.9305 | 93 | 0.9172 | 0.9337 | 0.9254 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9874 |
| 0.0172 | 53.0 | 5088 | 0.0618 | 0.9149 | 0.9247 | 0.9198 | 93 | 0.8994 | 0.9157 | 0.9075 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9868 |
| 0.0162 | 54.0 | 5184 | 0.0582 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.8994 | 0.9157 | 0.9075 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9871 |
| 0.0158 | 55.0 | 5280 | 0.0613 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9330 | 0.9377 | 0.9353 | 0.9868 |
| 0.0154 | 56.0 | 5376 | 0.0612 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9874 |
| 0.0167 | 57.0 | 5472 | 0.0595 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9866 |
| 0.0161 | 58.0 | 5568 | 0.0642 | 0.8544 | 0.9462 | 0.8980 | 93 | 0.9430 | 0.8976 | 0.9198 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9305 | 0.9352 | 0.9328 | 0.9882 |
| 0.0148 | 59.0 | 5664 | 0.0570 | 0.9062 | 0.9355 | 0.9206 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9888 |
| 0.0143 | 60.0 | 5760 | 0.0591 | 0.9167 | 0.9462 | 0.9312 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9383 | 0.9476 | 0.9429 | 0.9877 |
| 0.0123 | 61.0 | 5856 | 0.0630 | 0.8842 | 0.9032 | 0.8936 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9233 | 0.9302 | 0.9267 | 0.9860 |
| 0.013 | 62.0 | 5952 | 0.0677 | 0.9062 | 0.9355 | 0.9206 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9330 | 0.9377 | 0.9353 | 0.9868 |
| 0.0136 | 63.0 | 6048 | 0.0608 | 0.9255 | 0.9355 | 0.9305 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9874 |
| 0.0123 | 64.0 | 6144 | 0.0662 | 0.9158 | 0.9355 | 0.9255 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9356 | 0.9426 | 0.9391 | 0.9868 |
| 0.0138 | 65.0 | 6240 | 0.0638 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9866 |
| 0.0126 | 66.0 | 6336 | 0.0646 | 0.9072 | 0.9462 | 0.9263 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9330 | 0.9377 | 0.9353 | 0.9871 |
| 0.0132 | 67.0 | 6432 | 0.0695 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9214 | 0.9352 | 0.9282 | 0.9866 |
| 0.0117 | 68.0 | 6528 | 0.0637 | 0.9158 | 0.9355 | 0.9255 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9355 | 0.9401 | 0.9378 | 0.9866 |
| 0.0132 | 69.0 | 6624 | 0.0665 | 0.9255 | 0.9355 | 0.9305 | 93 | 0.9222 | 0.9277 | 0.9249 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9380 | 0.9426 | 0.9403 | 0.9877 |
| 0.012 | 70.0 | 6720 | 0.0669 | 0.9149 | 0.9247 | 0.9198 | 93 | 0.9172 | 0.9337 | 0.9254 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9868 |
| 0.0126 | 71.0 | 6816 | 0.0645 | 0.9062 | 0.9355 | 0.9206 | 93 | 0.9281 | 0.9337 | 0.9309 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9882 |
| 0.0122 | 72.0 | 6912 | 0.0649 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.9387 | 0.9217 | 0.9301 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9403 | 0.9426 | 0.9415 | 0.9871 |
| 0.0124 | 73.0 | 7008 | 0.0610 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.9333 | 0.9277 | 0.9305 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9882 |
| 0.0112 | 74.0 | 7104 | 0.0656 | 0.8866 | 0.9247 | 0.9053 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9328 | 0.9352 | 0.9340 | 0.9866 |
| 0.011 | 75.0 | 7200 | 0.0644 | 0.8866 | 0.9247 | 0.9053 | 93 | 0.9337 | 0.9337 | 0.9337 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9360 | 0.9476 | 0.9418 | 0.9874 |
| 0.0095 | 76.0 | 7296 | 0.0660 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9877 |
| 0.0103 | 77.0 | 7392 | 0.0669 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9307 | 0.9377 | 0.9342 | 0.9866 |
| 0.0102 | 78.0 | 7488 | 0.0640 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9329 | 0.9217 | 0.9273 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9380 | 0.9426 | 0.9403 | 0.9882 |
| 0.0105 | 79.0 | 7584 | 0.0625 | 0.9355 | 0.9355 | 0.9355 | 93 | 0.9387 | 0.9217 | 0.9301 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9475 | 0.9451 | 0.9463 | 0.9882 |
| 0.0099 | 80.0 | 7680 | 0.0648 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9868 |
| 0.009 | 81.0 | 7776 | 0.0639 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9868 |
| 0.01 | 82.0 | 7872 | 0.0651 | 0.9457 | 0.9355 | 0.9405 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9428 | 0.9451 | 0.9440 | 0.9888 |
| 0.0093 | 83.0 | 7968 | 0.0670 | 0.9062 | 0.9355 | 0.9206 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9380 | 0.9426 | 0.9403 | 0.9874 |
| 0.0094 | 84.0 | 8064 | 0.0659 | 0.9255 | 0.9355 | 0.9305 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9401 | 0.9401 | 0.9401 | 0.9877 |
| 0.0118 | 85.0 | 8160 | 0.0645 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9305 | 0.9352 | 0.9328 | 0.9874 |
| 0.0091 | 86.0 | 8256 | 0.0641 | 0.9348 | 0.9247 | 0.9297 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9425 | 0.9401 | 0.9413 | 0.9882 |
| 0.0094 | 87.0 | 8352 | 0.0650 | 0.9247 | 0.9247 | 0.9247 | 93 | 0.9167 | 0.9277 | 0.9222 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9355 | 0.9401 | 0.9378 | 0.9877 |
| 0.0093 | 88.0 | 8448 | 0.0646 | 0.9149 | 0.9247 | 0.9198 | 93 | 0.9333 | 0.9277 | 0.9305 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9403 | 0.9426 | 0.9415 | 0.9879 |
| 0.009 | 89.0 | 8544 | 0.0634 | 0.9247 | 0.9247 | 0.9247 | 93 | 0.9222 | 0.9277 | 0.9249 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9380 | 0.9426 | 0.9403 | 0.9874 |
| 0.0081 | 90.0 | 8640 | 0.0668 | 0.9355 | 0.9355 | 0.9355 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9428 | 0.9451 | 0.9440 | 0.9885 |
| 0.0089 | 91.0 | 8736 | 0.0670 | 0.9247 | 0.9247 | 0.9247 | 93 | 0.9281 | 0.9337 | 0.9309 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9428 | 0.9451 | 0.9440 | 0.9877 |
| 0.0085 | 92.0 | 8832 | 0.0690 | 0.9158 | 0.9355 | 0.9255 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9355 | 0.9401 | 0.9378 | 0.9874 |
| 0.0099 | 93.0 | 8928 | 0.0674 | 0.9355 | 0.9355 | 0.9355 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9451 | 0.9451 | 0.9451 | 0.9882 |
| 0.0095 | 94.0 | 9024 | 0.0682 | 0.9457 | 0.9355 | 0.9405 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9475 | 0.9451 | 0.9463 | 0.9882 |
| 0.0086 | 95.0 | 9120 | 0.0678 | 0.9457 | 0.9355 | 0.9405 | 93 | 0.9222 | 0.9277 | 0.9249 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9451 | 0.9451 | 0.9451 | 0.9882 |
| 0.0079 | 96.0 | 9216 | 0.0662 | 0.9247 | 0.9247 | 0.9247 | 93 | 0.9281 | 0.9337 | 0.9309 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9428 | 0.9451 | 0.9440 | 0.9879 |
| 0.0084 | 97.0 | 9312 | 0.0669 | 0.9255 | 0.9355 | 0.9305 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9403 | 0.9426 | 0.9415 | 0.9879 |
| 0.0085 | 98.0 | 9408 | 0.0679 | 0.9255 | 0.9355 | 0.9305 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9403 | 0.9426 | 0.9415 | 0.9874 |
| 0.0071 | 99.0 | 9504 | 0.0679 | 0.9255 | 0.9355 | 0.9305 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9380 | 0.9426 | 0.9403 | 0.9877 |
| 0.0079 | 100.0 | 9600 | 0.0678 | 0.9255 | 0.9355 | 0.9305 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9380 | 0.9426 | 0.9403 | 0.9877 |

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1

Model tree for apwic/nerui-pt-pl5-2

Finetuned from indolem/indobert-base-uncased