nerui-lora-r16-2

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0417
  • Location Precision: 0.8713
  • Location Recall: 0.9462
  • Location F1: 0.9072
  • Location Number: 93
  • Organization Precision: 0.8909
  • Organization Recall: 0.8855
  • Organization F1: 0.8882
  • Organization Number: 166
  • Person Precision: 0.9787
  • Person Recall: 0.9718
  • Person F1: 0.9753
  • Person Number: 142
  • Overall Precision: 0.9165
  • Overall Recall: 0.9302
  • Overall F1: 0.9233
  • Overall Accuracy: 0.9868
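The overall figures are micro-averages over the three entity types, so they can be sanity-checked from the per-type numbers: overall recall is the support-weighted combination of the per-type recalls, and overall F1 is the harmonic mean of overall precision and recall. A minimal sketch using the values listed above (true-positive counts are recovered by rounding recall × support back to integers):

```python
# (recall, support) for each entity type, copied from the evaluation results above.
per_entity = {
    "Location": (0.9462, 93),
    "Organization": (0.8855, 166),
    "Person": (0.9718, 142),
}

# Recover integer true-positive counts from the rounded recalls.
tp = {name: round(r * n) for name, (r, n) in per_entity.items()}
total = sum(n for _, n in per_entity.values())

overall_recall = sum(tp.values()) / total          # micro-averaged recall
overall_precision = 0.9165                         # as reported above
overall_f1 = (2 * overall_precision * overall_recall
              / (overall_precision + overall_recall))

print(round(overall_recall, 4), round(overall_f1, 4))  # 0.9302 0.9233
```

Both values match the reported Overall Recall and Overall F1.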

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
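The list above maps directly onto a Hugging Face `TrainingArguments` plus a PEFT `LoraConfig`. A minimal sketch of that configuration — note that the LoRA rank r=16 is only inferred from the model name, and that `output_dir`, `lora_alpha`, and `lora_dropout` are illustrative assumptions, not values documented in this card:

```python
from transformers import TrainingArguments
from peft import LoraConfig, TaskType

training_args = TrainingArguments(
    output_dir="nerui-lora-r16-2",    # assumed, not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 matches the Trainer's
    # default AdamW settings, so no explicit optimizer override is needed.
)

lora_config = LoraConfig(
    task_type=TaskType.TOKEN_CLS,  # token classification (NER)
    r=16,                          # inferred from the model name "lora-r16"
    lora_alpha=32,                 # assumption; not recorded in the card
    lora_dropout=0.1,              # assumption; not recorded in the card
)
```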

Training results

Columns: training loss, epoch, step, and validation loss, followed by Precision / Recall / F1 / Number (support) for each of Location, Organization, and Person, then Overall Precision / Recall / F1 / Accuracy.

Train Loss  Epoch  Step  Val Loss  Loc P  Loc R  Loc F1  Loc N  Org P  Org R  Org F1  Org N  Per P  Per R  Per F1  Per N  Ov P  Ov R  Ov F1  Ov Acc
1.0607 1.0 96 0.6772 0.0 0.0 0.0 93 0.0 0.0 0.0 166 0.0 0.0 0.0 142 0.0 0.0 0.0 0.8343
0.6351 2.0 192 0.5251 0.0 0.0 0.0 93 0.5 0.0120 0.0235 166 0.0 0.0 0.0 142 0.3333 0.0050 0.0098 0.8348
0.4897 3.0 288 0.3649 0.0 0.0 0.0 93 0.3529 0.2169 0.2687 166 0.3286 0.3239 0.3262 142 0.3267 0.2045 0.2515 0.8763
0.335 4.0 384 0.2323 0.3684 0.3011 0.3314 93 0.5099 0.6205 0.5598 166 0.5683 0.7324 0.6400 142 0.5098 0.5860 0.5452 0.9289
0.2342 5.0 480 0.1642 0.5895 0.6022 0.5957 93 0.6396 0.7590 0.6942 166 0.8269 0.9085 0.8658 142 0.6942 0.7756 0.7326 0.9564
0.1832 6.0 576 0.1316 0.7027 0.8387 0.7647 93 0.7432 0.8193 0.7794 166 0.9257 0.9648 0.9448 142 0.7941 0.8753 0.8327 0.9657
0.1526 7.0 672 0.1085 0.7692 0.8602 0.8122 93 0.7433 0.8373 0.7875 166 0.9079 0.9718 0.9388 142 0.8059 0.8903 0.8460 0.9690
0.136 8.0 768 0.0910 0.75 0.8710 0.8060 93 0.8011 0.8494 0.8246 166 0.9262 0.9718 0.9485 142 0.8314 0.8978 0.8633 0.9734
0.1234 9.0 864 0.0817 0.7981 0.8925 0.8426 93 0.8229 0.8675 0.8446 166 0.9133 0.9648 0.9384 142 0.8485 0.9077 0.8771 0.9753
0.1123 10.0 960 0.0774 0.7981 0.8925 0.8426 93 0.8207 0.9096 0.8629 166 0.9388 0.9718 0.9550 142 0.8552 0.9277 0.8900 0.9772
0.1042 11.0 1056 0.0683 0.8039 0.8817 0.8410 93 0.8371 0.8976 0.8663 166 0.9448 0.9648 0.9547 142 0.8659 0.9177 0.8910 0.9789
0.1 12.0 1152 0.0661 0.8317 0.9032 0.8660 93 0.8436 0.9096 0.8754 166 0.9514 0.9648 0.9580 142 0.8774 0.9277 0.9018 0.9800
0.0949 13.0 1248 0.0622 0.8416 0.9140 0.8763 93 0.8571 0.9036 0.8798 166 0.9580 0.9648 0.9614 142 0.8878 0.9277 0.9073 0.9811
0.091 14.0 1344 0.0597 0.8173 0.9140 0.8629 93 0.8788 0.8735 0.8761 166 0.9580 0.9648 0.9614 142 0.8908 0.9152 0.9028 0.9802
0.0852 15.0 1440 0.0593 0.84 0.9032 0.8705 93 0.8306 0.9157 0.8711 166 0.9650 0.9718 0.9684 142 0.8779 0.9327 0.9045 0.9800
0.0874 16.0 1536 0.0591 0.7838 0.9355 0.8529 93 0.8538 0.8795 0.8665 166 0.9514 0.9648 0.9580 142 0.8685 0.9227 0.8948 0.9797
0.0817 17.0 1632 0.0538 0.8350 0.9247 0.8776 93 0.8876 0.9036 0.8955 166 0.9580 0.9648 0.9614 142 0.8988 0.9302 0.9142 0.9830
0.0784 18.0 1728 0.0511 0.8350 0.9247 0.8776 93 0.8830 0.9096 0.8961 166 0.9580 0.9648 0.9614 142 0.8969 0.9327 0.9144 0.9833
0.0764 19.0 1824 0.0523 0.7890 0.9247 0.8515 93 0.8841 0.8735 0.8788 166 0.9718 0.9718 0.9718 142 0.8892 0.9202 0.9044 0.9822
0.0735 20.0 1920 0.0524 0.8018 0.9570 0.8725 93 0.8889 0.8675 0.8780 166 0.9718 0.9718 0.9718 142 0.8940 0.9252 0.9093 0.9819
0.074 21.0 2016 0.0519 0.8 0.9462 0.8670 93 0.8788 0.8735 0.8761 166 0.9718 0.9718 0.9718 142 0.8897 0.9252 0.9071 0.9822
0.0695 22.0 2112 0.0529 0.7857 0.9462 0.8585 93 0.8353 0.8554 0.8452 166 0.9718 0.9718 0.9718 142 0.8679 0.9177 0.8921 0.9805
0.0673 23.0 2208 0.0519 0.8056 0.9355 0.8657 93 0.9045 0.8554 0.8793 166 0.9718 0.9718 0.9718 142 0.9017 0.9152 0.9084 0.9824
0.0677 24.0 2304 0.0530 0.7982 0.9355 0.8614 93 0.9045 0.8554 0.8793 166 0.9718 0.9718 0.9718 142 0.8995 0.9152 0.9073 0.9811
0.0649 25.0 2400 0.0501 0.8018 0.9570 0.8725 93 0.8994 0.8614 0.8800 166 0.9718 0.9718 0.9718 142 0.8981 0.9227 0.9102 0.9822
0.0647 26.0 2496 0.0478 0.8365 0.9355 0.8832 93 0.9057 0.8675 0.8862 166 0.9718 0.9718 0.9718 142 0.9111 0.9202 0.9156 0.9838
0.0579 27.0 2592 0.0466 0.8208 0.9355 0.8744 93 0.8963 0.8855 0.8909 166 0.9718 0.9718 0.9718 142 0.9029 0.9277 0.9151 0.9835
0.0627 28.0 2688 0.0488 0.8131 0.9355 0.8700 93 0.8855 0.8855 0.8855 166 0.9718 0.9718 0.9718 142 0.8964 0.9277 0.9118 0.9819
0.0601 29.0 2784 0.0487 0.8131 0.9355 0.8700 93 0.8882 0.9096 0.8988 166 0.9718 0.9718 0.9718 142 0.8974 0.9377 0.9171 0.9827
0.0575 30.0 2880 0.0459 0.8286 0.9355 0.8788 93 0.8922 0.8976 0.8949 166 0.9718 0.9718 0.9718 142 0.9034 0.9327 0.9178 0.9833
0.0569 31.0 2976 0.0455 0.8073 0.9462 0.8713 93 0.8951 0.8735 0.8841 166 0.9718 0.9718 0.9718 142 0.8983 0.9252 0.9115 0.9841
0.0548 32.0 3072 0.0445 0.8224 0.9462 0.88 93 0.8889 0.8675 0.8780 166 0.9718 0.9718 0.9718 142 0.9002 0.9227 0.9113 0.9846
0.0528 33.0 3168 0.0471 0.7946 0.9570 0.8683 93 0.8944 0.8675 0.8807 166 0.9858 0.9789 0.9823 142 0.8986 0.9277 0.9129 0.9827
0.0533 34.0 3264 0.0445 0.8073 0.9462 0.8713 93 0.8802 0.8855 0.8829 166 0.9789 0.9789 0.9789 142 0.8947 0.9327 0.9133 0.9833
0.0503 35.0 3360 0.0425 0.8286 0.9355 0.8788 93 0.8922 0.8976 0.8949 166 0.9718 0.9718 0.9718 142 0.9034 0.9327 0.9178 0.9852
0.0531 36.0 3456 0.0447 0.7928 0.9462 0.8627 93 0.8957 0.8795 0.8875 166 0.9648 0.9648 0.9648 142 0.8918 0.9252 0.9082 0.9830
0.0493 37.0 3552 0.0442 0.8365 0.9355 0.8832 93 0.8970 0.8916 0.8943 166 0.9718 0.9718 0.9718 142 0.9075 0.9302 0.9187 0.9841
0.05 38.0 3648 0.0423 0.87 0.9355 0.9016 93 0.9042 0.9096 0.9069 166 0.9718 0.9718 0.9718 142 0.9193 0.9377 0.9284 0.9857
0.0489 39.0 3744 0.0416 0.8529 0.9355 0.8923 93 0.8994 0.9157 0.9075 166 0.9718 0.9718 0.9718 142 0.9128 0.9401 0.9263 0.9855
0.0481 40.0 3840 0.0411 0.8544 0.9462 0.8980 93 0.9068 0.8795 0.8930 166 0.9718 0.9718 0.9718 142 0.9163 0.9277 0.9219 0.9852
0.0462 41.0 3936 0.0429 0.8286 0.9355 0.8788 93 0.9036 0.9036 0.9036 166 0.9787 0.9718 0.9753 142 0.9102 0.9352 0.9225 0.9855
0.0468 42.0 4032 0.0435 0.8302 0.9462 0.8844 93 0.9030 0.8976 0.9003 166 0.9858 0.9789 0.9823 142 0.9126 0.9377 0.9250 0.9846
0.0469 43.0 4128 0.0423 0.8878 0.9355 0.9110 93 0.8976 0.8976 0.8976 166 0.9858 0.9789 0.9823 142 0.9259 0.9352 0.9305 0.9860
0.0472 44.0 4224 0.0460 0.8148 0.9462 0.8756 93 0.8938 0.8614 0.8773 166 0.9718 0.9718 0.9718 142 0.9 0.9202 0.9100 0.9830
0.0468 45.0 4320 0.0420 0.8713 0.9462 0.9072 93 0.9062 0.8735 0.8896 166 0.9858 0.9789 0.9823 142 0.9254 0.9277 0.9265 0.9852
0.0453 46.0 4416 0.0425 0.8462 0.9462 0.8934 93 0.8994 0.8614 0.8800 166 0.9718 0.9718 0.9718 142 0.9111 0.9202 0.9156 0.9852
0.0428 47.0 4512 0.0432 0.8788 0.9355 0.9062 93 0.8902 0.9277 0.9086 166 0.9787 0.9718 0.9753 142 0.9177 0.9451 0.9312 0.9855
0.043 48.0 4608 0.0433 0.8381 0.9462 0.8889 93 0.8924 0.8494 0.8704 166 0.9718 0.9718 0.9718 142 0.9062 0.9152 0.9107 0.9841
0.0443 49.0 4704 0.0437 0.8529 0.9355 0.8923 93 0.8929 0.9036 0.8982 166 0.9648 0.9648 0.9648 142 0.9078 0.9327 0.9200 0.9846
0.0466 50.0 4800 0.0430 0.8627 0.9462 0.9026 93 0.8922 0.8976 0.8949 166 0.9787 0.9718 0.9753 142 0.9146 0.9352 0.9248 0.9860
0.0419 51.0 4896 0.0430 0.8462 0.9462 0.8934 93 0.8951 0.8735 0.8841 166 0.9787 0.9718 0.9753 142 0.9115 0.9252 0.9183 0.9852
0.0421 52.0 4992 0.0404 0.9158 0.9355 0.9255 93 0.8953 0.9277 0.9112 166 0.9787 0.9718 0.9753 142 0.9289 0.9451 0.9370 0.9874
0.0409 53.0 5088 0.0431 0.8462 0.9462 0.8934 93 0.8982 0.9036 0.9009 166 0.9787 0.9718 0.9753 142 0.9126 0.9377 0.9250 0.9857
0.0391 54.0 5184 0.0417 0.8969 0.9355 0.9158 93 0.9012 0.9337 0.9172 166 0.9787 0.9718 0.9753 142 0.9268 0.9476 0.9371 0.9868
0.0383 55.0 5280 0.0402 0.8980 0.9462 0.9215 93 0.9053 0.9217 0.9134 166 0.9787 0.9718 0.9753 142 0.9289 0.9451 0.9370 0.9877
0.0399 56.0 5376 0.0431 0.8627 0.9462 0.9026 93 0.9048 0.9157 0.9102 166 0.9787 0.9718 0.9753 142 0.9197 0.9426 0.9310 0.9855
0.04 57.0 5472 0.0425 0.8544 0.9462 0.8980 93 0.9024 0.8916 0.8970 166 0.9787 0.9718 0.9753 142 0.9167 0.9327 0.9246 0.9855
0.04 58.0 5568 0.0422 0.8713 0.9462 0.9072 93 0.9146 0.9036 0.9091 166 0.9787 0.9718 0.9753 142 0.9261 0.9377 0.9318 0.9868
0.0372 59.0 5664 0.0425 0.8713 0.9462 0.9072 93 0.9036 0.9036 0.9036 166 0.9787 0.9718 0.9753 142 0.9216 0.9377 0.9295 0.9863
0.0384 60.0 5760 0.0422 0.8713 0.9462 0.9072 93 0.9146 0.9036 0.9091 166 0.9787 0.9718 0.9753 142 0.9261 0.9377 0.9318 0.9866
0.0379 61.0 5856 0.0402 0.8627 0.9462 0.9026 93 0.9091 0.9036 0.9063 166 0.9787 0.9718 0.9753 142 0.9216 0.9377 0.9295 0.9877
0.0362 62.0 5952 0.0387 0.8889 0.9462 0.9167 93 0.9036 0.9036 0.9036 166 0.9648 0.9648 0.9648 142 0.9214 0.9352 0.9282 0.9871
0.036 63.0 6048 0.0424 0.8381 0.9462 0.8889 93 0.9030 0.8976 0.9003 166 0.9787 0.9718 0.9753 142 0.9124 0.9352 0.9236 0.9852
0.036 64.0 6144 0.0404 0.88 0.9462 0.9119 93 0.9024 0.8916 0.8970 166 0.9580 0.9648 0.9614 142 0.9165 0.9302 0.9233 0.9857
0.033 65.0 6240 0.0419 0.8544 0.9462 0.8980 93 0.9030 0.8976 0.9003 166 0.9787 0.9718 0.9753 142 0.9169 0.9352 0.9259 0.9857
0.0348 66.0 6336 0.0396 0.88 0.9462 0.9119 93 0.9024 0.8916 0.8970 166 0.9787 0.9718 0.9753 142 0.9235 0.9327 0.9280 0.9868
0.0346 67.0 6432 0.0410 0.8627 0.9462 0.9026 93 0.8862 0.8916 0.8889 166 0.9648 0.9648 0.9648 142 0.9075 0.9302 0.9187 0.9849
0.0337 68.0 6528 0.0416 0.8544 0.9462 0.8980 93 0.9030 0.8976 0.9003 166 0.9787 0.9718 0.9753 142 0.9169 0.9352 0.9259 0.9857
0.0355 69.0 6624 0.0418 0.8627 0.9462 0.9026 93 0.8909 0.8855 0.8882 166 0.9787 0.9718 0.9753 142 0.9142 0.9302 0.9221 0.9855
0.0337 70.0 6720 0.0408 0.8713 0.9462 0.9072 93 0.9146 0.9036 0.9091 166 0.9718 0.9718 0.9718 142 0.9238 0.9377 0.9307 0.9863
0.0351 71.0 6816 0.0411 0.8713 0.9462 0.9072 93 0.9152 0.9096 0.9124 166 0.9787 0.9718 0.9753 142 0.9263 0.9401 0.9332 0.9860
0.0337 72.0 6912 0.0411 0.9072 0.9462 0.9263 93 0.8929 0.9036 0.8982 166 0.9787 0.9718 0.9753 142 0.9261 0.9377 0.9318 0.9866
0.0317 73.0 7008 0.0415 0.8713 0.9462 0.9072 93 0.9036 0.9036 0.9036 166 0.9787 0.9718 0.9753 142 0.9216 0.9377 0.9295 0.9860
0.0308 74.0 7104 0.0442 0.8558 0.9570 0.9036 93 0.9202 0.9036 0.9119 166 0.9787 0.9718 0.9753 142 0.9240 0.9401 0.9320 0.9860
0.0331 75.0 7200 0.0416 0.9072 0.9462 0.9263 93 0.9053 0.9217 0.9134 166 0.9787 0.9718 0.9753 142 0.9312 0.9451 0.9381 0.9879
0.0307 76.0 7296 0.0426 0.8725 0.9570 0.9128 93 0.8963 0.8855 0.8909 166 0.9787 0.9718 0.9753 142 0.9189 0.9327 0.9257 0.9860
0.0311 77.0 7392 0.0411 0.8889 0.9462 0.9167 93 0.8869 0.8976 0.8922 166 0.9787 0.9718 0.9753 142 0.9191 0.9352 0.9271 0.9871
0.0321 78.0 7488 0.0421 0.8713 0.9462 0.9072 93 0.8862 0.8916 0.8889 166 0.9787 0.9718 0.9753 142 0.9144 0.9327 0.9235 0.9863
0.0314 79.0 7584 0.0419 0.88 0.9462 0.9119 93 0.8869 0.8976 0.8922 166 0.9787 0.9718 0.9753 142 0.9169 0.9352 0.9259 0.9866
0.0327 80.0 7680 0.0420 0.88 0.9462 0.9119 93 0.9096 0.9096 0.9096 166 0.9787 0.9718 0.9753 142 0.9263 0.9401 0.9332 0.9868
0.0338 81.0 7776 0.0423 0.8713 0.9462 0.9072 93 0.9091 0.9036 0.9063 166 0.9787 0.9718 0.9753 142 0.9238 0.9377 0.9307 0.9871
0.0326 82.0 7872 0.0430 0.8713 0.9462 0.9072 93 0.9080 0.8916 0.8997 166 0.9787 0.9718 0.9753 142 0.9235 0.9327 0.9280 0.9857
0.0311 83.0 7968 0.0420 0.8889 0.9462 0.9167 93 0.8970 0.8916 0.8943 166 0.9787 0.9718 0.9753 142 0.9235 0.9327 0.9280 0.9857
0.0319 84.0 8064 0.0435 0.8462 0.9462 0.8934 93 0.8970 0.8916 0.8943 166 0.9787 0.9718 0.9753 142 0.9122 0.9327 0.9223 0.9855
0.0312 85.0 8160 0.0414 0.88 0.9462 0.9119 93 0.8909 0.8855 0.8882 166 0.9787 0.9718 0.9753 142 0.9187 0.9302 0.9244 0.9863
0.0313 86.0 8256 0.0418 0.88 0.9462 0.9119 93 0.8862 0.8916 0.8889 166 0.9787 0.9718 0.9753 142 0.9167 0.9327 0.9246 0.9866
0.0315 87.0 8352 0.0414 0.88 0.9462 0.9119 93 0.8916 0.8916 0.8916 166 0.9787 0.9718 0.9753 142 0.9189 0.9327 0.9257 0.9868
0.0314 88.0 8448 0.0415 0.88 0.9462 0.9119 93 0.9024 0.8916 0.8970 166 0.9787 0.9718 0.9753 142 0.9235 0.9327 0.9280 0.9866
0.0301 89.0 8544 0.0416 0.88 0.9462 0.9119 93 0.8970 0.8916 0.8943 166 0.9787 0.9718 0.9753 142 0.9212 0.9327 0.9269 0.9868
0.0303 90.0 8640 0.0410 0.88 0.9462 0.9119 93 0.9030 0.8976 0.9003 166 0.9787 0.9718 0.9753 142 0.9236 0.9352 0.9294 0.9866
0.0292 91.0 8736 0.0412 0.8713 0.9462 0.9072 93 0.8909 0.8855 0.8882 166 0.9787 0.9718 0.9753 142 0.9165 0.9302 0.9233 0.9863
0.0292 92.0 8832 0.0424 0.88 0.9462 0.9119 93 0.9080 0.8916 0.8997 166 0.9787 0.9718 0.9753 142 0.9257 0.9327 0.9292 0.9868
0.0295 93.0 8928 0.0426 0.88 0.9462 0.9119 93 0.9080 0.8916 0.8997 166 0.9787 0.9718 0.9753 142 0.9257 0.9327 0.9292 0.9866
0.0304 94.0 9024 0.0422 0.88 0.9462 0.9119 93 0.8963 0.8855 0.8909 166 0.9787 0.9718 0.9753 142 0.9210 0.9302 0.9256 0.9866
0.0304 95.0 9120 0.0415 0.8713 0.9462 0.9072 93 0.8855 0.8855 0.8855 166 0.9787 0.9718 0.9753 142 0.9142 0.9302 0.9221 0.9866
0.0312 96.0 9216 0.0415 0.88 0.9462 0.9119 93 0.8862 0.8916 0.8889 166 0.9787 0.9718 0.9753 142 0.9167 0.9327 0.9246 0.9868
0.0291 97.0 9312 0.0418 0.8713 0.9462 0.9072 93 0.8855 0.8855 0.8855 166 0.9787 0.9718 0.9753 142 0.9142 0.9302 0.9221 0.9866
0.0306 98.0 9408 0.0417 0.88 0.9462 0.9119 93 0.8916 0.8916 0.8916 166 0.9787 0.9718 0.9753 142 0.9189 0.9327 0.9257 0.9871
0.0293 99.0 9504 0.0417 0.8713 0.9462 0.9072 93 0.8909 0.8855 0.8882 166 0.9787 0.9718 0.9753 142 0.9165 0.9302 0.9233 0.9868
0.0302 100.0 9600 0.0417 0.8713 0.9462 0.9072 93 0.8909 0.8855 0.8882 166 0.9787 0.9718 0.9753 142 0.9165 0.9302 0.9233 0.9868
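Validation loss bottoms out at 0.0387 around epoch 62 and then plateaus, while the reported final metrics come from the last epoch (100, loss 0.0417); the best overall F1 (0.9381) appears at epoch 75. Selecting a checkpoint programmatically is straightforward; a sketch over a handful of rows sampled from the table above:

```python
# (epoch, validation_loss, overall_f1) rows sampled from the table above.
rows = [
    (52, 0.0404, 0.9370),
    (55, 0.0402, 0.9370),
    (62, 0.0387, 0.9282),
    (75, 0.0416, 0.9381),
    (100, 0.0417, 0.9233),
]

best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
best_by_f1 = max(rows, key=lambda r: r[2])    # highest overall F1

print(best_by_loss[0], best_by_f1[0])  # 62 75
```

The two criteria disagree here, which is worth keeping in mind if re-running this training with checkpoint selection (e.g. `load_best_model_at_end` in the Trainer) instead of keeping the final epoch.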

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2
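To reproduce this environment, the versions above can be pinned at install time. A sketch — the `peft` package is needed to load the LoRA adapter but its version is not recorded in this card, and the CUDA 12.1 wheel index is an assumption matching the `+cu121` tag:

```shell
pip install "transformers==4.39.3" "datasets==2.19.1" "tokenizers==0.15.2" peft
pip install "torch==2.3.0" --index-url https://download.pytorch.org/whl/cu121
```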

Model tree for apwic/nerui-lora-r16-2

  • Base model: indolem/indobert-base-uncased