nerui-pt-pl50-2

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0795
  • Location Precision: 0.8911
  • Location Recall: 0.9677
  • Location F1: 0.9278
  • Location Number: 93
  • Organization Precision: 0.9202
  • Organization Recall: 0.9036
  • Organization F1: 0.9119
  • Organization Number: 166
  • Person Precision: 0.9583
  • Person Recall: 0.9718
  • Person F1: 0.9650
  • Person Number: 142
  • Overall Precision: 0.9265
  • Overall Recall: 0.9426
  • Overall F1: 0.9345
  • Overall Accuracy: 0.9857
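Each F1 score above is the harmonic mean of the corresponding precision and recall, so the reported values can be checked directly. A minimal sketch (the `f1` helper is illustrative, not part of the training code):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Reproduce the evaluation-set scores listed above (to 4 decimal places).
print(round(f1(0.8911, 0.9677), 4))  # Location F1 -> 0.9278
print(round(f1(0.9265, 0.9426), 4))  # Overall F1  -> 0.9345
```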

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
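With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 5e-05 to 0 over the run. The results table below implies 96 optimizer steps per epoch (9600 steps over 100 epochs), so the schedule can be sketched as follows (the `linear_lr` helper and the zero-warmup assumption are mine, not from the training script):

```python
def linear_lr(step: int, base_lr: float = 5e-05, total_steps: int = 9600) -> float:
    """Linear decay from base_lr to 0 over total_steps, assuming no warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # 5e-05 at the first step
print(linear_lr(4800))  # 2.5e-05 halfway through (epoch 50)
print(linear_lr(9600))  # 0.0 at the final step
```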

Training results

In the table below, P = precision, R = recall, and N = entity count (support); Loc./Org./Per. abbreviate Location/Organization/Person.

| Training Loss | Epoch | Step | Validation Loss | Loc. P | Loc. R | Loc. F1 | Loc. N | Org. P | Org. R | Org. F1 | Org. N | Per. P | Per. R | Per. F1 | Per. N | Overall P | Overall R | Overall F1 | Overall Acc. |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8529 | 1.0 | 96 | 0.4086 | 0.0 | 0.0 | 0.0 | 93 | 0.2301 | 0.1566 | 0.1864 | 166 | 0.3086 | 0.1761 | 0.2242 | 142 | 0.2589 | 0.1272 | 0.1706 | 0.8527 |
| 0.3531 | 2.0 | 192 | 0.1972 | 0.3952 | 0.5269 | 0.4516 | 93 | 0.5864 | 0.5723 | 0.5793 | 166 | 0.6742 | 0.8451 | 0.7500 | 142 | 0.5690 | 0.6584 | 0.6104 | 0.9432 |
| 0.1852 | 3.0 | 288 | 0.0960 | 0.8261 | 0.8172 | 0.8216 | 93 | 0.7474 | 0.8735 | 0.8056 | 166 | 0.9592 | 0.9930 | 0.9758 | 142 | 0.8360 | 0.9027 | 0.8681 | 0.9737 |
| 0.126 | 4.0 | 384 | 0.0836 | 0.7736 | 0.8817 | 0.8241 | 93 | 0.7824 | 0.9096 | 0.8412 | 166 | 0.9586 | 0.9789 | 0.9686 | 142 | 0.8378 | 0.9277 | 0.8805 | 0.9750 |
| 0.1027 | 5.0 | 480 | 0.0584 | 0.8763 | 0.9140 | 0.8947 | 93 | 0.8795 | 0.8795 | 0.8795 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9066 | 0.9202 | 0.9134 | 0.9830 |
| 0.0925 | 6.0 | 576 | 0.0524 | 0.7876 | 0.9570 | 0.8641 | 93 | 0.9221 | 0.8554 | 0.8875 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.8978 | 0.9202 | 0.9089 | 0.9824 |
| 0.0803 | 7.0 | 672 | 0.0523 | 0.8542 | 0.8817 | 0.8677 | 93 | 0.8495 | 0.9518 | 0.8977 | 166 | 0.9586 | 0.9789 | 0.9686 | 142 | 0.8876 | 0.9451 | 0.9155 | 0.9833 |
| 0.0733 | 8.0 | 768 | 0.0525 | 0.8333 | 0.9677 | 0.8955 | 93 | 0.9241 | 0.8795 | 0.9012 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9144 | 0.9327 | 0.9235 | 0.9824 |
| 0.0684 | 9.0 | 864 | 0.0438 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9171 | 0.9377 | 0.9273 | 0.9852 |
| 0.0627 | 10.0 | 960 | 0.0445 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9586 | 0.9789 | 0.9686 | 142 | 0.9173 | 0.9401 | 0.9286 | 0.9849 |
| 0.059 | 11.0 | 1056 | 0.0453 | 0.8641 | 0.9570 | 0.9082 | 93 | 0.9 | 0.9217 | 0.9107 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9155 | 0.9451 | 0.9301 | 0.9855 |
| 0.0535 | 12.0 | 1152 | 0.0456 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9852 |
| 0.05 | 13.0 | 1248 | 0.0437 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.8889 | 0.9157 | 0.9021 | 166 | 0.9714 | 0.9577 | 0.9645 | 142 | 0.9171 | 0.9377 | 0.9273 | 0.9860 |
| 0.0481 | 14.0 | 1344 | 0.0442 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9866 |
| 0.0456 | 15.0 | 1440 | 0.0437 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9866 |
| 0.0435 | 16.0 | 1536 | 0.0495 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9187 | 0.8855 | 0.9018 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9855 |
| 0.0412 | 17.0 | 1632 | 0.0444 | 0.9167 | 0.9462 | 0.9312 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9868 |
| 0.0369 | 18.0 | 1728 | 0.0475 | 0.8667 | 0.9785 | 0.9192 | 93 | 0.9427 | 0.8916 | 0.9164 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9863 |
| 0.04 | 19.0 | 1824 | 0.0397 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.8988 | 0.9096 | 0.9042 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9874 |
| 0.0355 | 20.0 | 1920 | 0.0478 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.8947 | 0.9217 | 0.9080 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9246 | 0.9476 | 0.9360 | 0.9868 |
| 0.0369 | 21.0 | 2016 | 0.0561 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9355 | 0.8735 | 0.9034 | 166 | 0.9857 | 0.9718 | 0.9787 | 142 | 0.9372 | 0.9302 | 0.9337 | 0.9855 |
| 0.0338 | 22.0 | 2112 | 0.0521 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9379 | 0.9577 | 0.9477 | 142 | 0.9031 | 0.9302 | 0.9165 | 0.9841 |
| 0.033 | 23.0 | 2208 | 0.0519 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.9146 | 0.9352 | 0.9248 | 0.9852 |
| 0.032 | 24.0 | 2304 | 0.0583 | 0.8505 | 0.9785 | 0.91 | 93 | 0.8957 | 0.8795 | 0.8875 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9080 | 0.9352 | 0.9214 | 0.9835 |
| 0.0303 | 25.0 | 2400 | 0.0572 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9036 | 0.9036 | 0.9036 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9855 |
| 0.0286 | 26.0 | 2496 | 0.0588 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9214 | 0.9352 | 0.9282 | 0.9846 |
| 0.0288 | 27.0 | 2592 | 0.0550 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9259 | 0.9352 | 0.9305 | 0.9849 |
| 0.0267 | 28.0 | 2688 | 0.0563 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9223 | 0.9476 | 0.9348 | 0.9855 |
| 0.0257 | 29.0 | 2784 | 0.0529 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9860 |
| 0.0252 | 30.0 | 2880 | 0.0557 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.9118 | 0.9337 | 0.9226 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9863 |
| 0.0258 | 31.0 | 2976 | 0.0524 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9868 |
| 0.0221 | 32.0 | 3072 | 0.0587 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9214 | 0.9352 | 0.9282 | 0.9857 |
| 0.023 | 33.0 | 3168 | 0.0571 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9852 |
| 0.0236 | 34.0 | 3264 | 0.0558 | 0.9278 | 0.9677 | 0.9474 | 93 | 0.9222 | 0.9277 | 0.9249 | 166 | 0.9786 | 0.9648 | 0.9716 | 142 | 0.9431 | 0.9501 | 0.9466 | 0.9871 |
| 0.0195 | 35.0 | 3360 | 0.0618 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9857 |
| 0.0238 | 36.0 | 3456 | 0.0569 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9304 | 0.8855 | 0.9074 | 166 | 0.9510 | 0.9577 | 0.9544 | 142 | 0.9257 | 0.9327 | 0.9292 | 0.9863 |
| 0.0219 | 37.0 | 3552 | 0.0601 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9860 |
| 0.0188 | 38.0 | 3648 | 0.0608 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9866 |
| 0.0193 | 39.0 | 3744 | 0.0613 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9308 | 0.8916 | 0.9108 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9307 | 0.9377 | 0.9342 | 0.9852 |
| 0.0192 | 40.0 | 3840 | 0.0613 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9863 |
| 0.0166 | 41.0 | 3936 | 0.0598 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9053 | 0.9217 | 0.9134 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9246 | 0.9476 | 0.9360 | 0.9863 |
| 0.0162 | 42.0 | 4032 | 0.0590 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9268 | 0.9476 | 0.9371 | 0.9868 |
| 0.0163 | 43.0 | 4128 | 0.0629 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9852 |
| 0.0168 | 44.0 | 4224 | 0.0631 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9291 | 0.9476 | 0.9383 | 0.9857 |
| 0.0152 | 45.0 | 4320 | 0.0669 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9860 |
| 0.0138 | 46.0 | 4416 | 0.0695 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9182 | 0.8795 | 0.8985 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9212 | 0.9327 | 0.9269 | 0.9844 |
| 0.0142 | 47.0 | 4512 | 0.0683 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.8876 | 0.9036 | 0.8955 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9852 |
| 0.0156 | 48.0 | 4608 | 0.0628 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9866 |
| 0.0139 | 49.0 | 4704 | 0.0686 | 0.9263 | 0.9462 | 0.9362 | 93 | 0.8947 | 0.9217 | 0.9080 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9855 |
| 0.016 | 50.0 | 4800 | 0.0620 | 0.9167 | 0.9462 | 0.9312 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9863 |
| 0.0145 | 51.0 | 4896 | 0.0674 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9863 |
| 0.0126 | 52.0 | 4992 | 0.0710 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9852 |
| 0.0132 | 53.0 | 5088 | 0.0719 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9314 | 0.9476 | 0.9394 | 0.9841 |
| 0.014 | 54.0 | 5184 | 0.0706 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9024 | 0.8916 | 0.8970 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9214 | 0.9352 | 0.9282 | 0.9852 |
| 0.014 | 55.0 | 5280 | 0.0654 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.9313 | 0.8976 | 0.9141 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9305 | 0.9352 | 0.9328 | 0.9866 |
| 0.0119 | 56.0 | 5376 | 0.0683 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9863 |
| 0.0117 | 57.0 | 5472 | 0.0686 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9860 |
| 0.0113 | 58.0 | 5568 | 0.0691 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9182 | 0.8795 | 0.8985 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.9144 | 0.9327 | 0.9235 | 0.9849 |
| 0.0126 | 59.0 | 5664 | 0.0714 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9863 |
| 0.0106 | 60.0 | 5760 | 0.0744 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9716 | 0.9648 | 0.9682 | 142 | 0.9259 | 0.9352 | 0.9305 | 0.9849 |
| 0.0116 | 61.0 | 5856 | 0.0741 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9236 | 0.9352 | 0.9294 | 0.9849 |
| 0.0122 | 62.0 | 5952 | 0.0684 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9648 | 0.9648 | 0.9648 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9860 |
| 0.0103 | 63.0 | 6048 | 0.0751 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9514 | 0.9648 | 0.9580 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9846 |
| 0.0099 | 64.0 | 6144 | 0.0667 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9871 |
| 0.0089 | 65.0 | 6240 | 0.0764 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9130 | 0.8855 | 0.8991 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.9167 | 0.9327 | 0.9246 | 0.9841 |
| 0.0098 | 66.0 | 6336 | 0.0752 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9857 |
| 0.0097 | 67.0 | 6432 | 0.0784 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9846 |
| 0.0097 | 68.0 | 6528 | 0.0771 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9852 |
| 0.0094 | 69.0 | 6624 | 0.0737 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9857 |
| 0.0083 | 70.0 | 6720 | 0.0739 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9852 |
| 0.0087 | 71.0 | 6816 | 0.0715 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9866 |
| 0.0084 | 72.0 | 6912 | 0.0727 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9871 |
| 0.0076 | 73.0 | 7008 | 0.0726 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9333 | 0.9426 | 0.9380 | 0.9868 |
| 0.0087 | 74.0 | 7104 | 0.0800 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9841 |
| 0.0079 | 75.0 | 7200 | 0.0772 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9857 |
| 0.007 | 76.0 | 7296 | 0.0782 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9863 |
| 0.0089 | 77.0 | 7392 | 0.0773 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9855 |
| 0.008 | 78.0 | 7488 | 0.0786 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9238 | 0.9377 | 0.9307 | 0.9838 |
| 0.0082 | 79.0 | 7584 | 0.0727 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9849 |
| 0.0086 | 80.0 | 7680 | 0.0743 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9085 | 0.8976 | 0.9030 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9846 |
| 0.0085 | 81.0 | 7776 | 0.0710 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9860 |
| 0.0072 | 82.0 | 7872 | 0.0770 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9849 |
| 0.0067 | 83.0 | 7968 | 0.0810 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9841 |
| 0.0067 | 84.0 | 8064 | 0.0766 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9855 |
| 0.0058 | 85.0 | 8160 | 0.0795 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9863 |
| 0.0079 | 86.0 | 8256 | 0.0777 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9846 |
| 0.0058 | 87.0 | 8352 | 0.0786 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9846 |
| 0.0071 | 88.0 | 8448 | 0.0757 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9855 |
| 0.0068 | 89.0 | 8544 | 0.0806 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9846 |
| 0.0061 | 90.0 | 8640 | 0.0750 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9855 |
| 0.0056 | 91.0 | 8736 | 0.0774 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9855 |
| 0.0054 | 92.0 | 8832 | 0.0808 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9195 | 0.9401 | 0.9297 | 0.9849 |
| 0.0058 | 93.0 | 8928 | 0.0783 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9218 | 0.9401 | 0.9309 | 0.9852 |
| 0.0064 | 94.0 | 9024 | 0.0802 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9852 |
| 0.0069 | 95.0 | 9120 | 0.0824 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9141 | 0.8976 | 0.9058 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9855 |
| 0.0059 | 96.0 | 9216 | 0.0814 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9193 | 0.9377 | 0.9284 | 0.9852 |
| 0.0058 | 97.0 | 9312 | 0.0790 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9860 |
| 0.0069 | 98.0 | 9408 | 0.0797 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9857 |
| 0.0061 | 99.0 | 9504 | 0.0795 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9857 |
| 0.0067 | 100.0 | 9600 | 0.0795 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9857 |
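The overall scores in the table are entity-level micro-averages: true-positive and predicted counts are summed across the three entity types before precision and recall are computed. A sanity-check sketch using the final-epoch per-type scores and supports (the integer rounding of recovered counts is my assumption, since the raw counts are not reported):

```python
# Final-epoch (precision, recall, support) per entity type, from the table above.
per_type = {
    "Location":     (0.8911, 0.9677, 93),
    "Organization": (0.9202, 0.9036, 166),
    "Person":       (0.9583, 0.9718, 142),
}

tp = pred = gold = 0
for p, r, n in per_type.values():
    matched = round(r * n)      # recovered true positives for this type
    tp += matched
    pred += round(matched / p)  # recovered predicted entities of this type
    gold += n

micro_p = tp / pred  # micro-averaged precision
micro_r = tp / gold  # micro-averaged recall
print(round(micro_p, 4), round(micro_r, 4))  # 0.9265 0.9426, matching the overall scores
```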

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1

Model tree for apwic/nerui-pt-pl50-2

This model was fine-tuned from indolem/indobert-base-uncased.