nerui-pt-pl5-0

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0573
  • Location Precision: 0.8911
  • Location Recall: 0.9574
  • Location F1: 0.9231
  • Location Number: 94
  • Organization Precision: 0.9187
  • Organization Recall: 0.8802
  • Organization F1: 0.8991
  • Organization Number: 167
  • Person Precision: 0.9853
  • Person Recall: 0.9781
  • Person F1: 0.9817
  • Person Number: 137
  • Overall Precision: 0.9345
  • Overall Recall: 0.9322
  • Overall F1: 0.9333
  • Overall Accuracy: 0.9859
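
As a sanity check, each per-entity F1 above is the harmonic mean of the corresponding precision and recall. A minimal verification, with values copied from the list above (last-digit differences are possible because the listed precision and recall are themselves rounded):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Values from the evaluation summary above.
loc = f1(0.8911, 0.9574)  # Location  -> ~0.9231
org = f1(0.9187, 0.8802)  # Organization -> ~0.8991 (within rounding)
per = f1(0.9853, 0.9781)  # Person    -> ~0.9817

print(round(loc, 4), round(org, 4), round(per, 4))
```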

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
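
With a linear scheduler and (apparently) no warmup, the learning rate decays from 5e-05 to 0 over the run. A sketch of that schedule, assuming the 9,600 total optimization steps shown in the results table (96 steps/epoch × 100 epochs); the helper function is illustrative, not part of the training script:

```python
LEARNING_RATE = 5e-5
TOTAL_STEPS = 9_600  # 96 steps/epoch x 100 epochs, per the results table

def linear_lr(step: int, base_lr: float = LEARNING_RATE,
              total_steps: int = TOTAL_STEPS, warmup_steps: int = 0) -> float:
    """Linear decay to zero after an optional warmup (assumed 0 here)."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))     # 5e-05 at the start
print(linear_lr(4800))  # 2.5e-05 halfway through
```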

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.839 | 1.0 | 96 | 0.3844 | 0.25 | 0.0106 | 0.0204 | 94 | 0.2767 | 0.4192 | 0.3333 | 167 | 0.2259 | 0.4453 | 0.2998 | 137 | 0.2505 | 0.3317 | 0.2854 | 0.8696 |
| 0.3364 | 2.0 | 192 | 0.1918 | 0.5714 | 0.4255 | 0.4878 | 94 | 0.5545 | 0.7305 | 0.6305 | 167 | 0.8591 | 0.9343 | 0.8951 | 137 | 0.6606 | 0.7286 | 0.6930 | 0.9439 |
| 0.1872 | 3.0 | 288 | 0.1045 | 0.7473 | 0.7234 | 0.7351 | 94 | 0.7487 | 0.8383 | 0.7910 | 167 | 0.9441 | 0.9854 | 0.9643 | 137 | 0.8147 | 0.8618 | 0.8376 | 0.9655 |
| 0.1357 | 4.0 | 384 | 0.0794 | 0.7822 | 0.8404 | 0.8103 | 94 | 0.7765 | 0.8323 | 0.8035 | 167 | 0.9714 | 0.9927 | 0.9819 | 137 | 0.8429 | 0.8894 | 0.8655 | 0.9715 |
| 0.1109 | 5.0 | 480 | 0.0615 | 0.8095 | 0.9043 | 0.8543 | 94 | 0.8253 | 0.8204 | 0.8228 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.8771 | 0.8970 | 0.8870 | 0.9785 |
| 0.0945 | 6.0 | 576 | 0.0598 | 0.7679 | 0.9149 | 0.8350 | 94 | 0.8383 | 0.8383 | 0.8383 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.8699 | 0.9070 | 0.8881 | 0.9793 |
| 0.0834 | 7.0 | 672 | 0.0513 | 0.83 | 0.8830 | 0.8557 | 94 | 0.8588 | 0.8743 | 0.8665 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.8941 | 0.9121 | 0.9030 | 0.9820 |
| 0.0779 | 8.0 | 768 | 0.0559 | 0.7586 | 0.9362 | 0.8381 | 94 | 0.9073 | 0.8204 | 0.8616 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.8911 | 0.9045 | 0.8978 | 0.9801 |
| 0.0707 | 9.0 | 864 | 0.0503 | 0.7672 | 0.9468 | 0.8476 | 94 | 0.8831 | 0.8144 | 0.8474 | 167 | 0.9783 | 0.9854 | 0.9818 | 137 | 0.8824 | 0.9045 | 0.8933 | 0.9812 |
| 0.0677 | 10.0 | 960 | 0.0476 | 0.8148 | 0.9362 | 0.8713 | 94 | 0.8834 | 0.8623 | 0.8727 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9017 | 0.9221 | 0.9118 | 0.9834 |
| 0.0634 | 11.0 | 1056 | 0.0480 | 0.8396 | 0.9468 | 0.89 | 94 | 0.8994 | 0.8563 | 0.8773 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9175 | 0.9221 | 0.9198 | 0.9843 |
| 0.0569 | 12.0 | 1152 | 0.0517 | 0.85 | 0.9043 | 0.8763 | 94 | 0.8690 | 0.8743 | 0.8716 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9082 | 0.9196 | 0.9139 | 0.9823 |
| 0.0583 | 13.0 | 1248 | 0.0423 | 0.8958 | 0.9149 | 0.9053 | 94 | 0.8605 | 0.8862 | 0.8732 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9134 | 0.9271 | 0.9202 | 0.9848 |
| 0.0538 | 14.0 | 1344 | 0.0422 | 0.8738 | 0.9574 | 0.9137 | 94 | 0.8855 | 0.8802 | 0.8829 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9208 | 0.9347 | 0.9277 | 0.9859 |
| 0.0497 | 15.0 | 1440 | 0.0519 | 0.8713 | 0.9362 | 0.9026 | 94 | 0.8690 | 0.8743 | 0.8716 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9111 | 0.9271 | 0.9191 | 0.9845 |
| 0.0472 | 16.0 | 1536 | 0.0422 | 0.8426 | 0.9681 | 0.9010 | 94 | 0.9051 | 0.8563 | 0.88 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9179 | 0.9271 | 0.9225 | 0.9862 |
| 0.0437 | 17.0 | 1632 | 0.0422 | 0.8598 | 0.9787 | 0.9154 | 94 | 0.9136 | 0.8862 | 0.8997 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9282 | 0.9422 | 0.9352 | 0.9870 |
| 0.0433 | 18.0 | 1728 | 0.0486 | 0.8679 | 0.9787 | 0.9200 | 94 | 0.9145 | 0.8323 | 0.8715 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9289 | 0.9196 | 0.9242 | 0.9854 |
| 0.0397 | 19.0 | 1824 | 0.0541 | 0.7965 | 0.9574 | 0.8696 | 94 | 0.9375 | 0.8084 | 0.8682 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9137 | 0.9045 | 0.9091 | 0.9829 |
| 0.0408 | 20.0 | 1920 | 0.0469 | 0.8519 | 0.9787 | 0.9109 | 94 | 0.9281 | 0.8503 | 0.8875 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9271 | 0.9271 | 0.9271 | 0.9859 |
| 0.0388 | 21.0 | 2016 | 0.0433 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9036 | 0.8982 | 0.9009 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9330 | 0.9447 | 0.9388 | 0.9876 |
| 0.0373 | 22.0 | 2112 | 0.0441 | 0.8679 | 0.9787 | 0.9200 | 94 | 0.8976 | 0.8922 | 0.8949 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9216 | 0.9447 | 0.9330 | 0.9867 |
| 0.0348 | 23.0 | 2208 | 0.0519 | 0.8182 | 0.9574 | 0.8824 | 94 | 0.9156 | 0.8443 | 0.8785 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.915 | 0.9196 | 0.9173 | 0.9840 |
| 0.0343 | 24.0 | 2304 | 0.0432 | 0.8812 | 0.9468 | 0.9128 | 94 | 0.9042 | 0.9042 | 0.9042 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9305 | 0.9422 | 0.9363 | 0.9870 |
| 0.0371 | 25.0 | 2400 | 0.0434 | 0.8667 | 0.9681 | 0.9146 | 94 | 0.8896 | 0.8683 | 0.8788 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9181 | 0.9296 | 0.9238 | 0.9851 |
| 0.033 | 26.0 | 2496 | 0.0433 | 0.9072 | 0.9362 | 0.9215 | 94 | 0.9080 | 0.8862 | 0.8970 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9367 | 0.9296 | 0.9332 | 0.9867 |
| 0.0315 | 27.0 | 2592 | 0.0424 | 0.9149 | 0.9149 | 0.9149 | 94 | 0.8935 | 0.9042 | 0.8988 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9323 | 0.9347 | 0.9335 | 0.9862 |
| 0.0287 | 28.0 | 2688 | 0.0464 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.9097 | 0.8443 | 0.8758 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9315 | 0.9221 | 0.9268 | 0.9867 |
| 0.0285 | 29.0 | 2784 | 0.0545 | 0.8182 | 0.9574 | 0.8824 | 94 | 0.9315 | 0.8144 | 0.8690 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9233 | 0.9070 | 0.9151 | 0.9848 |
| 0.0287 | 30.0 | 2880 | 0.0494 | 0.8333 | 0.9574 | 0.8911 | 94 | 0.8970 | 0.8862 | 0.8916 | 167 | 0.9927 | 0.9927 | 0.9927 | 137 | 0.9122 | 0.9397 | 0.9257 | 0.9854 |
| 0.0276 | 31.0 | 2976 | 0.0416 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.9182 | 0.8743 | 0.8957 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9394 | 0.9347 | 0.9370 | 0.9878 |
| 0.027 | 32.0 | 3072 | 0.0478 | 0.8426 | 0.9681 | 0.9010 | 94 | 0.9167 | 0.8563 | 0.8854 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9248 | 0.9271 | 0.9260 | 0.9873 |
| 0.0269 | 33.0 | 3168 | 0.0442 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9245 | 0.8802 | 0.9018 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9418 | 0.9347 | 0.9382 | 0.9876 |
| 0.0231 | 34.0 | 3264 | 0.0465 | 0.875 | 0.9681 | 0.9192 | 94 | 0.9299 | 0.8743 | 0.9012 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9369 | 0.9322 | 0.9345 | 0.9878 |
| 0.0249 | 35.0 | 3360 | 0.0489 | 0.8273 | 0.9681 | 0.8922 | 94 | 0.9172 | 0.8623 | 0.8889 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9204 | 0.9296 | 0.925 | 0.9870 |
| 0.0245 | 36.0 | 3456 | 0.0452 | 0.875 | 0.9681 | 0.9192 | 94 | 0.9255 | 0.8922 | 0.9085 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9352 | 0.9422 | 0.9387 | 0.9881 |
| 0.0236 | 37.0 | 3552 | 0.0536 | 0.8411 | 0.9574 | 0.8955 | 94 | 0.9068 | 0.8743 | 0.8902 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9206 | 0.9322 | 0.9263 | 0.9851 |
| 0.0225 | 38.0 | 3648 | 0.0538 | 0.8411 | 0.9574 | 0.8955 | 94 | 0.9 | 0.8623 | 0.8807 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9179 | 0.9271 | 0.9225 | 0.9851 |
| 0.0246 | 39.0 | 3744 | 0.0508 | 0.88 | 0.9362 | 0.9072 | 94 | 0.9036 | 0.8982 | 0.9009 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9279 | 0.9372 | 0.9325 | 0.9859 |
| 0.0233 | 40.0 | 3840 | 0.0528 | 0.8411 | 0.9574 | 0.8955 | 94 | 0.9193 | 0.8862 | 0.9024 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9256 | 0.9372 | 0.9313 | 0.9856 |
| 0.0222 | 41.0 | 3936 | 0.0449 | 0.8878 | 0.9255 | 0.9062 | 94 | 0.9042 | 0.9042 | 0.9042 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9325 | 0.9372 | 0.9348 | 0.9865 |
| 0.0208 | 42.0 | 4032 | 0.0472 | 0.9010 | 0.9681 | 0.9333 | 94 | 0.9096 | 0.9042 | 0.9069 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9355 | 0.9472 | 0.9413 | 0.9876 |
| 0.0199 | 43.0 | 4128 | 0.0570 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.8916 | 0.8862 | 0.8889 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9233 | 0.9372 | 0.9302 | 0.9851 |
| 0.0206 | 44.0 | 4224 | 0.0619 | 0.8491 | 0.9574 | 0.9 | 94 | 0.9026 | 0.8323 | 0.8660 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9141 | 0.9095 | 0.9118 | 0.9840 |
| 0.0219 | 45.0 | 4320 | 0.0458 | 0.9175 | 0.9468 | 0.9319 | 94 | 0.8817 | 0.8922 | 0.8869 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9252 | 0.9322 | 0.9287 | 0.9867 |
| 0.0188 | 46.0 | 4416 | 0.0546 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.8896 | 0.8683 | 0.8788 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9316 | 0.9246 | 0.9281 | 0.9845 |
| 0.0195 | 47.0 | 4512 | 0.0473 | 0.8932 | 0.9787 | 0.9340 | 94 | 0.9141 | 0.8922 | 0.9030 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9377 | 0.9447 | 0.9412 | 0.9878 |
| 0.0174 | 48.0 | 4608 | 0.0576 | 0.8654 | 0.9574 | 0.9091 | 94 | 0.8994 | 0.8563 | 0.8773 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9223 | 0.9246 | 0.9235 | 0.9845 |
| 0.0188 | 49.0 | 4704 | 0.0533 | 0.8396 | 0.9468 | 0.89 | 94 | 0.9006 | 0.8683 | 0.8841 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9177 | 0.9246 | 0.9212 | 0.9845 |
| 0.0201 | 50.0 | 4800 | 0.0522 | 0.8641 | 0.9468 | 0.9036 | 94 | 0.9068 | 0.8743 | 0.8902 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9225 | 0.9271 | 0.9248 | 0.9865 |
| 0.0184 | 51.0 | 4896 | 0.0543 | 0.8529 | 0.9255 | 0.8878 | 94 | 0.9024 | 0.8862 | 0.8943 | 167 | 1.0 | 0.9781 | 0.9889 | 137 | 0.9225 | 0.9271 | 0.9248 | 0.9865 |
| 0.0167 | 52.0 | 4992 | 0.0494 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9030 | 0.8922 | 0.8976 | 167 | 0.9851 | 0.9635 | 0.9742 | 137 | 0.9320 | 0.9296 | 0.9308 | 0.9873 |
| 0.0149 | 53.0 | 5088 | 0.0619 | 0.8544 | 0.9362 | 0.8934 | 94 | 0.9241 | 0.8743 | 0.8985 | 167 | 0.9925 | 0.9708 | 0.9815 | 137 | 0.9291 | 0.9221 | 0.9256 | 0.9851 |
| 0.0175 | 54.0 | 5184 | 0.0522 | 0.9167 | 0.9362 | 0.9263 | 94 | 0.8988 | 0.9042 | 0.9015 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9323 | 0.9347 | 0.9335 | 0.9870 |
| 0.0145 | 55.0 | 5280 | 0.0509 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9146 | 0.8982 | 0.9063 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9373 | 0.9397 | 0.9385 | 0.9884 |
| 0.0146 | 56.0 | 5376 | 0.0553 | 0.8969 | 0.9255 | 0.9110 | 94 | 0.9152 | 0.9042 | 0.9096 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9873 |
| 0.0155 | 57.0 | 5472 | 0.0610 | 0.8505 | 0.9681 | 0.9055 | 94 | 0.9108 | 0.8563 | 0.8827 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9225 | 0.9271 | 0.9248 | 0.9854 |
| 0.0145 | 58.0 | 5568 | 0.0565 | 0.9167 | 0.9362 | 0.9263 | 94 | 0.8916 | 0.8862 | 0.8889 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9296 | 0.9296 | 0.9296 | 0.9851 |
| 0.0174 | 59.0 | 5664 | 0.0595 | 0.8302 | 0.9362 | 0.88 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9015 | 0.9196 | 0.9104 | 0.9834 |
| 0.0168 | 60.0 | 5760 | 0.0559 | 0.8654 | 0.9574 | 0.9091 | 94 | 0.9119 | 0.8683 | 0.8896 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9273 | 0.9296 | 0.9285 | 0.9859 |
| 0.0172 | 61.0 | 5856 | 0.0563 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.8817 | 0.8922 | 0.8869 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9256 | 0.9372 | 0.9313 | 0.9859 |
| 0.0142 | 62.0 | 5952 | 0.0599 | 0.875 | 0.9681 | 0.9192 | 94 | 0.9119 | 0.8683 | 0.8896 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9273 | 0.9296 | 0.9285 | 0.9854 |
| 0.013 | 63.0 | 6048 | 0.0518 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.8855 | 0.8802 | 0.8829 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9231 | 0.9347 | 0.9288 | 0.9867 |
| 0.0137 | 64.0 | 6144 | 0.0639 | 0.8476 | 0.9468 | 0.8945 | 94 | 0.9103 | 0.8503 | 0.8793 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9219 | 0.9196 | 0.9208 | 0.9854 |
| 0.0125 | 65.0 | 6240 | 0.0558 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.9299 | 0.8743 | 0.9012 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9391 | 0.9296 | 0.9343 | 0.9870 |
| 0.0129 | 66.0 | 6336 | 0.0599 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.9119 | 0.8683 | 0.8896 | 167 | 0.9778 | 0.9635 | 0.9706 | 137 | 0.9242 | 0.9196 | 0.9219 | 0.9856 |
| 0.0124 | 67.0 | 6432 | 0.0655 | 0.8333 | 0.9574 | 0.8911 | 94 | 0.9032 | 0.8383 | 0.8696 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9123 | 0.9146 | 0.9134 | 0.9840 |
| 0.0146 | 68.0 | 6528 | 0.0561 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9062 | 0.8683 | 0.8869 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9320 | 0.9296 | 0.9308 | 0.9862 |
| 0.0125 | 69.0 | 6624 | 0.0558 | 0.8641 | 0.9468 | 0.9036 | 94 | 0.8810 | 0.8862 | 0.8836 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9140 | 0.9347 | 0.9242 | 0.9862 |
| 0.0113 | 70.0 | 6720 | 0.0547 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9074 | 0.8802 | 0.8936 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9298 | 0.9322 | 0.9310 | 0.9867 |
| 0.0121 | 71.0 | 6816 | 0.0568 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.9182 | 0.8743 | 0.8957 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9295 | 0.9271 | 0.9283 | 0.9862 |
| 0.0116 | 72.0 | 6912 | 0.0549 | 0.8641 | 0.9468 | 0.9036 | 94 | 0.9012 | 0.8743 | 0.8875 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9227 | 0.9296 | 0.9262 | 0.9862 |
| 0.0139 | 73.0 | 7008 | 0.0564 | 0.8641 | 0.9468 | 0.9036 | 94 | 0.9074 | 0.8802 | 0.8936 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9227 | 0.9296 | 0.9262 | 0.9856 |
| 0.012 | 74.0 | 7104 | 0.0532 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9308 | 0.8862 | 0.9080 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9345 | 0.9322 | 0.9333 | 0.9867 |
| 0.0122 | 75.0 | 7200 | 0.0576 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9074 | 0.8802 | 0.8936 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9322 | 0.9322 | 0.9322 | 0.9865 |
| 0.0112 | 76.0 | 7296 | 0.0608 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.9012 | 0.8743 | 0.8875 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.92 | 0.9246 | 0.9223 | 0.9859 |
| 0.0108 | 77.0 | 7392 | 0.0643 | 0.8558 | 0.9468 | 0.8990 | 94 | 0.9221 | 0.8503 | 0.8847 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9264 | 0.9171 | 0.9217 | 0.9851 |
| 0.0098 | 78.0 | 7488 | 0.0656 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9006 | 0.8683 | 0.8841 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9273 | 0.9296 | 0.9285 | 0.9851 |
| 0.0107 | 79.0 | 7584 | 0.0604 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9024 | 0.8862 | 0.8943 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9279 | 0.9372 | 0.9325 | 0.9859 |
| 0.0102 | 80.0 | 7680 | 0.0597 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9085 | 0.8922 | 0.9003 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9277 | 0.9347 | 0.9312 | 0.9867 |
| 0.0106 | 81.0 | 7776 | 0.0614 | 0.8641 | 0.9468 | 0.9036 | 94 | 0.9241 | 0.8743 | 0.8985 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9295 | 0.9271 | 0.9283 | 0.9862 |
| 0.011 | 82.0 | 7872 | 0.0560 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9255 | 0.8922 | 0.9085 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9394 | 0.9347 | 0.9370 | 0.9870 |
| 0.01 | 83.0 | 7968 | 0.0602 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.9182 | 0.8743 | 0.8957 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9293 | 0.9246 | 0.9270 | 0.9865 |
| 0.0098 | 84.0 | 8064 | 0.0569 | 0.8738 | 0.9574 | 0.9137 | 94 | 0.9068 | 0.8743 | 0.8902 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9273 | 0.9296 | 0.9285 | 0.9859 |
| 0.0105 | 85.0 | 8160 | 0.0580 | 0.8558 | 0.9468 | 0.8990 | 94 | 0.9172 | 0.8623 | 0.8889 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9242 | 0.9196 | 0.9219 | 0.9854 |
| 0.0108 | 86.0 | 8256 | 0.0577 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9187 | 0.8802 | 0.8991 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9322 | 0.9322 | 0.9322 | 0.9865 |
| 0.0092 | 87.0 | 8352 | 0.0594 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.8802 | 0.8802 | 0.8802 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9208 | 0.9347 | 0.9277 | 0.9859 |
| 0.01 | 88.0 | 8448 | 0.0583 | 0.8667 | 0.9681 | 0.9146 | 94 | 0.9 | 0.8623 | 0.8807 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9227 | 0.9296 | 0.9262 | 0.9859 |
| 0.0098 | 89.0 | 8544 | 0.0587 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9018 | 0.8802 | 0.8909 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9277 | 0.9347 | 0.9312 | 0.9867 |
| 0.0093 | 90.0 | 8640 | 0.0555 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9130 | 0.8802 | 0.8963 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9323 | 0.9347 | 0.9335 | 0.9870 |
| 0.0085 | 91.0 | 8736 | 0.0550 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9074 | 0.8802 | 0.8936 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9298 | 0.9322 | 0.9310 | 0.9865 |
| 0.0081 | 92.0 | 8832 | 0.0573 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9074 | 0.8802 | 0.8936 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9298 | 0.9322 | 0.9310 | 0.9865 |
| 0.0101 | 93.0 | 8928 | 0.0610 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9187 | 0.8802 | 0.8991 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9370 | 0.9347 | 0.9358 | 0.9859 |
| 0.0095 | 94.0 | 9024 | 0.0596 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.9304 | 0.8802 | 0.9046 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9419 | 0.9372 | 0.9395 | 0.9865 |
| 0.0092 | 95.0 | 9120 | 0.0559 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9080 | 0.8862 | 0.8970 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9348 | 0.9372 | 0.9360 | 0.9873 |
| 0.0089 | 96.0 | 9216 | 0.0566 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.8963 | 0.8802 | 0.8882 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9252 | 0.9322 | 0.9287 | 0.9865 |
| 0.0085 | 97.0 | 9312 | 0.0570 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9245 | 0.8802 | 0.9018 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9369 | 0.9322 | 0.9345 | 0.9862 |
| 0.0084 | 98.0 | 9408 | 0.0571 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9245 | 0.8802 | 0.9018 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9369 | 0.9322 | 0.9345 | 0.9862 |
| 0.0089 | 99.0 | 9504 | 0.0571 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9187 | 0.8802 | 0.8991 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9345 | 0.9322 | 0.9333 | 0.9859 |
| 0.0093 | 100.0 | 9600 | 0.0573 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9187 | 0.8802 | 0.8991 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9345 | 0.9322 | 0.9333 | 0.9859 |
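
Note that the summary metrics at the top of this card correspond to the final (epoch 100) checkpoint, while the best overall validation F1 in the table occurs earlier in training. An illustrative snippet picking the best epoch from a few rows hand-copied from the table above (not part of the original training script):

```python
# (epoch, validation_loss, overall_f1) triples copied from selected rows above.
rows = [
    (17, 0.0422, 0.9352),
    (42, 0.0472, 0.9413),
    (47, 0.0473, 0.9412),
    (100, 0.0573, 0.9333),
]

# Select the row with the highest overall F1.
best = max(rows, key=lambda r: r[2])
print(best)  # -> (42, 0.0472, 0.9413): the final epoch is not the best one
```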

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
