nerui-unipelt-2

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0711
  • Location Precision: 0.9184
  • Location Recall: 0.9677
  • Location F1: 0.9424
  • Location Number: 93
  • Organization Precision: 0.9152
  • Organization Recall: 0.9096
  • Organization F1: 0.9124
  • Organization Number: 166
  • Person Precision: 0.9650
  • Person Recall: 0.9718
  • Person F1: 0.9684
  • Person Number: 142
  • Overall Precision: 0.9335
  • Overall Recall: 0.9451
  • Overall F1: 0.9393
  • Overall Accuracy: 0.9879
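
The snippet below is a minimal, hedged inference sketch using the transformers pipeline. It assumes the checkpoint loads as a plain token-classification model; since UniPELT is an adapter-based tuning method, the weights may instead need to be loaded through the adapters library. The input sentence is purely illustrative.

```python
# Minimal inference sketch (assumption: the checkpoint loads as a standard
# token-classification model; adapter-based checkpoints may instead require
# the adapters library).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="apwic/nerui-unipelt-2",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

# Hypothetical Indonesian input; the entity types are LOC, ORG, and PER.
print(ner("Presiden Joko Widodo mengunjungi kantor Pertamina di Jakarta."))
```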

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
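
The sketch below shows how these hyperparameters would map onto a transformers TrainingArguments object. It is a reconstruction under that assumption, not the author's actual training script, and output_dir is a placeholder.

```python
# Hedged reconstruction of the training configuration from the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="nerui-unipelt-2",   # placeholder, not the author's path
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Transformers
    # default optimizer, so no explicit optimizer arguments are needed.
)
```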

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8565 | 1.0 | 96 | 0.4994 | 0.0 | 0.0 | 0.0 | 93 | 0.3333 | 0.0120 | 0.0233 | 166 | 0.0 | 0.0 | 0.0 | 142 | 0.2222 | 0.0050 | 0.0098 | 0.8351 |
| 0.3861 | 2.0 | 192 | 0.2398 | 0.3150 | 0.4301 | 0.3636 | 93 | 0.5410 | 0.5964 | 0.5673 | 166 | 0.5238 | 0.7746 | 0.625 | 142 | 0.4788 | 0.6209 | 0.5407 | 0.9309 |
| 0.1946 | 3.0 | 288 | 0.1054 | 0.7742 | 0.7742 | 0.7742 | 93 | 0.7189 | 0.8012 | 0.7578 | 166 | 0.9139 | 0.9718 | 0.9420 | 142 | 0.7995 | 0.8554 | 0.8265 | 0.9695 |
| 0.1288 | 4.0 | 384 | 0.0858 | 0.6412 | 0.9032 | 0.75 | 93 | 0.7955 | 0.8434 | 0.8187 | 166 | 0.9456 | 0.9789 | 0.9619 | 142 | 0.7996 | 0.9052 | 0.8491 | 0.9709 |
| 0.0996 | 5.0 | 480 | 0.0602 | 0.7810 | 0.8817 | 0.8283 | 93 | 0.8855 | 0.8855 | 0.8855 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.8889 | 0.9177 | 0.9031 | 0.9816 |
| 0.0883 | 6.0 | 576 | 0.0645 | 0.7273 | 0.9462 | 0.8224 | 93 | 0.9247 | 0.8133 | 0.8654 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.8808 | 0.9027 | 0.8916 | 0.9789 |
| 0.0736 | 7.0 | 672 | 0.0552 | 0.8252 | 0.9140 | 0.8673 | 93 | 0.8793 | 0.9217 | 0.9 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.8998 | 0.9401 | 0.9195 | 0.9827 |
| 0.0677 | 8.0 | 768 | 0.0476 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9294 | 0.9526 | 0.9409 | 0.9857 |
| 0.0588 | 9.0 | 864 | 0.0466 | 0.8667 | 0.9785 | 0.9192 | 93 | 0.9325 | 0.9157 | 0.9240 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9857 |
| 0.0558 | 10.0 | 960 | 0.0521 | 0.7857 | 0.9462 | 0.8585 | 93 | 0.8862 | 0.8916 | 0.8889 | 166 | 0.9790 | 0.9859 | 0.9825 | 142 | 0.8910 | 0.9377 | 0.9137 | 0.9830 |
| 0.051 | 11.0 | 1056 | 0.0407 | 0.8515 | 0.9247 | 0.8866 | 93 | 0.9172 | 0.9337 | 0.9254 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9223 | 0.9476 | 0.9348 | 0.9871 |
| 0.0456 | 12.0 | 1152 | 0.0446 | 0.8558 | 0.9570 | 0.9036 | 93 | 0.9068 | 0.8795 | 0.8930 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9167 | 0.9327 | 0.9246 | 0.9849 |
| 0.0423 | 13.0 | 1248 | 0.0421 | 0.8571 | 0.9677 | 0.9091 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9866 |
| 0.0412 | 14.0 | 1344 | 0.0549 | 0.7739 | 0.9570 | 0.8558 | 93 | 0.92 | 0.8313 | 0.8734 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.8971 | 0.9127 | 0.9048 | 0.9811 |
| 0.0383 | 15.0 | 1440 | 0.0400 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9874 |
| 0.0341 | 16.0 | 1536 | 0.0489 | 0.8426 | 0.9785 | 0.9055 | 93 | 0.9057 | 0.8675 | 0.8862 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9144 | 0.9327 | 0.9235 | 0.9835 |
| 0.0348 | 17.0 | 1632 | 0.0456 | 0.8571 | 0.9677 | 0.9091 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9857 |
| 0.0317 | 18.0 | 1728 | 0.0449 | 0.8447 | 0.9355 | 0.8878 | 93 | 0.8841 | 0.8735 | 0.8788 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9071 | 0.9252 | 0.9160 | 0.9844 |
| 0.0282 | 19.0 | 1824 | 0.0486 | 0.8958 | 0.9247 | 0.9101 | 93 | 0.9074 | 0.8855 | 0.8963 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9277 | 0.9277 | 0.9277 | 0.9860 |
| 0.0272 | 20.0 | 1920 | 0.0504 | 0.8641 | 0.9570 | 0.9082 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9844 |
| 0.0256 | 21.0 | 2016 | 0.0461 | 0.8846 | 0.9892 | 0.9340 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9860 |
| 0.0246 | 22.0 | 2112 | 0.0486 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9248 | 0.9501 | 0.9373 | 0.9849 |
| 0.023 | 23.0 | 2208 | 0.0429 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9866 |
| 0.0234 | 24.0 | 2304 | 0.0489 | 0.9072 | 0.9462 | 0.9263 | 93 | 0.9176 | 0.9398 | 0.9286 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9341 | 0.9551 | 0.9445 | 0.9868 |
| 0.0214 | 25.0 | 2400 | 0.0578 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9241 | 0.8795 | 0.9012 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9328 | 0.9352 | 0.9340 | 0.9852 |
| 0.0193 | 26.0 | 2496 | 0.0503 | 0.91 | 0.9785 | 0.9430 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9871 |
| 0.0165 | 27.0 | 2592 | 0.0521 | 0.91 | 0.9785 | 0.9430 | 93 | 0.9390 | 0.9277 | 0.9333 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9435 | 0.9576 | 0.9505 | 0.9871 |
| 0.0185 | 28.0 | 2688 | 0.0514 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9857 |
| 0.0156 | 29.0 | 2784 | 0.0486 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9868 |
| 0.0159 | 30.0 | 2880 | 0.0513 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9159 | 0.9501 | 0.9327 | 0.9863 |
| 0.0146 | 31.0 | 2976 | 0.0495 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9868 |
| 0.0138 | 32.0 | 3072 | 0.0492 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9313 | 0.8976 | 0.9141 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9401 | 0.9401 | 0.9401 | 0.9885 |
| 0.0133 | 33.0 | 3168 | 0.0494 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.8982 | 0.9036 | 0.9009 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9877 |
| 0.0138 | 34.0 | 3264 | 0.0502 | 0.91 | 0.9785 | 0.9430 | 93 | 0.9383 | 0.9157 | 0.9268 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9479 | 0.9526 | 0.9502 | 0.9885 |
| 0.0135 | 35.0 | 3360 | 0.0503 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9866 |
| 0.0123 | 36.0 | 3456 | 0.0552 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9437 | 0.9096 | 0.9264 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9454 | 0.9501 | 0.9478 | 0.9879 |
| 0.0129 | 37.0 | 3552 | 0.0468 | 0.9375 | 0.9677 | 0.9524 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9410 | 0.9551 | 0.9480 | 0.9890 |
| 0.0103 | 38.0 | 3648 | 0.0517 | 0.91 | 0.9785 | 0.9430 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9356 | 0.9426 | 0.9391 | 0.9882 |
| 0.0101 | 39.0 | 3744 | 0.0496 | 0.89 | 0.9570 | 0.9223 | 93 | 0.8994 | 0.9157 | 0.9075 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9882 |
| 0.0094 | 40.0 | 3840 | 0.0470 | 0.9278 | 0.9677 | 0.9474 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9429 | 0.9476 | 0.9453 | 0.9890 |
| 0.0084 | 41.0 | 3936 | 0.0556 | 0.9271 | 0.9570 | 0.9418 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9882 |
| 0.0094 | 42.0 | 4032 | 0.0609 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9580 | 0.9648 | 0.9614 | 142 | 0.9257 | 0.9327 | 0.9292 | 0.9855 |
| 0.0098 | 43.0 | 4128 | 0.0554 | 0.9286 | 0.9785 | 0.9529 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9882 |
| 0.0087 | 44.0 | 4224 | 0.0571 | 0.9286 | 0.9785 | 0.9529 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9407 | 0.9501 | 0.9454 | 0.9877 |
| 0.0084 | 45.0 | 4320 | 0.0642 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9313 | 0.8976 | 0.9141 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9426 | 0.9426 | 0.9426 | 0.9871 |
| 0.0087 | 46.0 | 4416 | 0.0656 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9860 |
| 0.0086 | 47.0 | 4512 | 0.0597 | 0.9192 | 0.9785 | 0.9479 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9383 | 0.9476 | 0.9429 | 0.9874 |
| 0.007 | 48.0 | 4608 | 0.0635 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9871 |
| 0.0079 | 49.0 | 4704 | 0.0611 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.8982 | 0.9036 | 0.9009 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9868 |
| 0.0069 | 50.0 | 4800 | 0.0602 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.8988 | 0.9096 | 0.9042 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9868 |
| 0.0068 | 51.0 | 4896 | 0.0651 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9866 |
| 0.0054 | 52.0 | 4992 | 0.0533 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9871 |
| 0.0059 | 53.0 | 5088 | 0.0594 | 0.9271 | 0.9570 | 0.9418 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9476 | 0.9476 | 0.9476 | 0.9888 |
| 0.0058 | 54.0 | 5184 | 0.0636 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9877 |
| 0.0063 | 55.0 | 5280 | 0.0577 | 0.9278 | 0.9677 | 0.9474 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9409 | 0.9526 | 0.9467 | 0.9882 |
| 0.0045 | 56.0 | 5376 | 0.0619 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9356 | 0.9426 | 0.9391 | 0.9879 |
| 0.0072 | 57.0 | 5472 | 0.0723 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9378 | 0.9401 | 0.9390 | 0.9871 |
| 0.0059 | 58.0 | 5568 | 0.0681 | 0.91 | 0.9785 | 0.9430 | 93 | 0.9313 | 0.8976 | 0.9141 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9404 | 0.9451 | 0.9428 | 0.9882 |
| 0.0048 | 59.0 | 5664 | 0.0655 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9877 |
| 0.0046 | 60.0 | 5760 | 0.0759 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9085 | 0.8976 | 0.9030 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9860 |
| 0.0051 | 61.0 | 5856 | 0.0614 | 0.9381 | 0.9785 | 0.9579 | 93 | 0.95 | 0.9157 | 0.9325 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9574 | 0.9526 | 0.955 | 0.9893 |
| 0.0046 | 62.0 | 5952 | 0.0698 | 0.9192 | 0.9785 | 0.9479 | 93 | 0.9371 | 0.8976 | 0.9169 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9475 | 0.9451 | 0.9463 | 0.9888 |
| 0.0056 | 63.0 | 6048 | 0.0614 | 0.9286 | 0.9785 | 0.9529 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9476 | 0.9476 | 0.9476 | 0.9888 |
| 0.004 | 64.0 | 6144 | 0.0680 | 0.9192 | 0.9785 | 0.9479 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9404 | 0.9451 | 0.9428 | 0.9882 |
| 0.0034 | 65.0 | 6240 | 0.0743 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9429 | 0.9476 | 0.9453 | 0.9877 |
| 0.005 | 66.0 | 6336 | 0.0712 | 0.9192 | 0.9785 | 0.9479 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9406 | 0.9476 | 0.9441 | 0.9879 |
| 0.0034 | 67.0 | 6432 | 0.0715 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9085 | 0.8976 | 0.9030 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9871 |
| 0.0044 | 68.0 | 6528 | 0.0722 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9866 |
| 0.0034 | 69.0 | 6624 | 0.0667 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9307 | 0.9377 | 0.9342 | 0.9868 |
| 0.0037 | 70.0 | 6720 | 0.0798 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9193 | 0.8916 | 0.9052 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9852 |
| 0.0038 | 71.0 | 6816 | 0.0615 | 0.9462 | 0.9462 | 0.9462 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9428 | 0.9451 | 0.9440 | 0.9885 |
| 0.0036 | 72.0 | 6912 | 0.0726 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9356 | 0.9426 | 0.9391 | 0.9863 |
| 0.004 | 73.0 | 7008 | 0.0673 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9428 | 0.9451 | 0.9440 | 0.9877 |
| 0.0031 | 74.0 | 7104 | 0.0689 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9863 |
| 0.0035 | 75.0 | 7200 | 0.0677 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9404 | 0.9451 | 0.9428 | 0.9877 |
| 0.0025 | 76.0 | 7296 | 0.0668 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9882 |
| 0.0028 | 77.0 | 7392 | 0.0716 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9877 |
| 0.003 | 78.0 | 7488 | 0.0737 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9080 | 0.8916 | 0.8997 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9860 |
| 0.0043 | 79.0 | 7584 | 0.0663 | 0.9286 | 0.9785 | 0.9529 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9879 |
| 0.0025 | 80.0 | 7680 | 0.0699 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9879 |
| 0.0034 | 81.0 | 7776 | 0.0677 | 0.9278 | 0.9677 | 0.9474 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9868 |
| 0.003 | 82.0 | 7872 | 0.0684 | 0.9381 | 0.9785 | 0.9579 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9431 | 0.9501 | 0.9466 | 0.9877 |
| 0.0033 | 83.0 | 7968 | 0.0668 | 0.9192 | 0.9785 | 0.9479 | 93 | 0.9085 | 0.8976 | 0.9030 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9882 |
| 0.0033 | 84.0 | 8064 | 0.0656 | 0.9286 | 0.9785 | 0.9529 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9360 | 0.9476 | 0.9418 | 0.9882 |
| 0.0034 | 85.0 | 8160 | 0.0705 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9406 | 0.9476 | 0.9441 | 0.9877 |
| 0.0024 | 86.0 | 8256 | 0.0707 | 0.9192 | 0.9785 | 0.9479 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9383 | 0.9476 | 0.9429 | 0.9882 |
| 0.0033 | 87.0 | 8352 | 0.0658 | 0.9286 | 0.9785 | 0.9529 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9454 | 0.9501 | 0.9478 | 0.9893 |
| 0.0027 | 88.0 | 8448 | 0.0741 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9877 |
| 0.0031 | 89.0 | 8544 | 0.0754 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9874 |
| 0.0027 | 90.0 | 8640 | 0.0722 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9877 |
| 0.0029 | 91.0 | 8736 | 0.0713 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9356 | 0.9426 | 0.9391 | 0.9879 |
| 0.0032 | 92.0 | 8832 | 0.0720 | 0.9192 | 0.9785 | 0.9479 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9877 |
| 0.0028 | 93.0 | 8928 | 0.0708 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9877 |
| 0.0032 | 94.0 | 9024 | 0.0721 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9882 |
| 0.0019 | 95.0 | 9120 | 0.0721 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9286 | 0.9401 | 0.9343 | 0.9871 |
| 0.0031 | 96.0 | 9216 | 0.0707 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9877 |
| 0.0024 | 97.0 | 9312 | 0.0704 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9874 |
| 0.0029 | 98.0 | 9408 | 0.0709 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9310 | 0.9426 | 0.9368 | 0.9877 |
| 0.002 | 99.0 | 9504 | 0.0713 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9879 |
| 0.0017 | 100.0 | 9600 | 0.0711 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9879 |
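
The per-type and overall precision/recall/F1 figures above are entity-level metrics of the kind conventionally produced with seqeval. The sketch below illustrates that computation on made-up BIO tag sequences; the labels are illustrative assumptions, not samples from the evaluation data.

```python
# Minimal sketch of entity-level metric computation with seqeval.
# The tag sequences are invented for illustration only.
from seqeval.metrics import classification_report

y_true = [["B-PER", "I-PER", "O", "B-ORG", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "B-ORG", "O", "O"]]

# Prints per-type (LOC, ORG, PER) and overall precision, recall, and F1,
# matching the column layout of the table above.
print(classification_report(y_true, y_pred, digits=4))
```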

Framework versions

  • Transformers 4.39.3
  • PyTorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2