nerui-seq_bn-rf64-1

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0340
  • Location Precision: 0.9160
  • Location Recall: 0.9397
  • Location F1: 0.9277
  • Location Number: 116
  • Organization Precision: 0.9371
  • Organization Recall: 0.9430
  • Organization F1: 0.9401
  • Organization Number: 158
  • Person Precision: 0.9685
  • Person Recall: 0.9919
  • Person F1: 0.9801
  • Person Number: 124
  • Overall Precision: 0.9407
  • Overall Recall: 0.9573
  • Overall F1: 0.9489
  • Overall Accuracy: 0.9882
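The overall numbers are the micro average of the per-entity scores: true positives and entity counts are pooled across the three labels before computing precision and recall, which is the convention used by seqeval-style NER evaluation. A minimal sketch of that aggregation; the per-entity true-positive and predicted counts below are back-computed from the reported precision/recall and supports, not taken from a training log:

```python
# Micro-averaged NER metrics: pool counts across entity types, then compute
# precision/recall once. TP (correctly predicted entities) and "pred"
# (total predicted entities) per type are inferred from the reported scores
# and the gold supports (116 / 158 / 124).
counts = {
    #        TP, predicted, gold support
    "LOC": (109, 119, 116),
    "ORG": (149, 159, 158),
    "PER": (123, 127, 124),
}

tp = sum(c[0] for c in counts.values())    # 381 pooled true positives
pred = sum(c[1] for c in counts.values())  # 405 pooled predictions
gold = sum(c[2] for c in counts.values())  # 398 pooled gold entities

precision = tp / pred
recall = tp / gold
f1 = 2 * precision * recall / (precision + recall)

print(round(precision, 4), round(recall, 4), round(f1, 4))
# 0.9407 0.9573 0.9489 — matches the reported overall scores
```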

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
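With the linear scheduler, the learning rate decays from the 5e-05 peak to zero over the whole run; the step column in the results table implies 96 optimizer steps per epoch, i.e. 9,600 steps in total. A small sketch of the schedule (zero warmup is an assumption, since no warmup is listed in the hyperparameters):

```python
# Linear LR decay without warmup: lr falls from the peak to 0 over all steps.
# 9600 total steps = 96 steps/epoch (from the results table) * 100 epochs.
PEAK_LR = 5e-05
TOTAL_STEPS = 9600

def linear_lr(step: int) -> float:
    """Learning rate after `step` optimizer steps under linear decay."""
    return PEAK_LR * max(0.0, 1.0 - step / TOTAL_STEPS)

print(linear_lr(0))      # 5e-05 at the start
print(linear_lr(4800))   # 2.5e-05 halfway through (after epoch 50)
print(linear_lr(9600))   # 0.0 at the end
```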

Training results

Per-epoch results on the evaluation set (P = precision, R = recall, # = support; Loc/Org/Per = Location/Organization/Person):

| Training Loss | Epoch | Step | Validation Loss | Loc P | Loc R | Loc F1 | Loc # | Org P | Org R | Org F1 | Org # | Per P | Per R | Per F1 | Per # | Overall P | Overall R | Overall F1 | Overall Acc |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.0547 | 1.0 | 96 | 0.6439 | 0.0 | 0.0 | 0.0 | 116 | 0.0 | 0.0 | 0.0 | 158 | 0.0 | 0.0 | 0.0 | 124 | 0.0 | 0.0 | 0.0 | 0.8394 |
| 0.5973 | 2.0 | 192 | 0.4455 | 0.0 | 0.0 | 0.0 | 116 | 0.3636 | 0.0506 | 0.0889 | 158 | 0.3448 | 0.0806 | 0.1307 | 124 | 0.3396 | 0.0452 | 0.0798 | 0.8457 |
| 0.4304 | 3.0 | 288 | 0.3282 | 0.3793 | 0.1897 | 0.2529 | 116 | 0.3692 | 0.5 | 0.4247 | 158 | 0.3520 | 0.5565 | 0.4313 | 124 | 0.3632 | 0.4271 | 0.3926 | 0.8973 |
| 0.3462 | 4.0 | 384 | 0.2760 | 0.4767 | 0.3534 | 0.4059 | 116 | 0.4635 | 0.5633 | 0.5086 | 158 | 0.3865 | 0.6452 | 0.4834 | 124 | 0.4330 | 0.5276 | 0.4757 | 0.9171 |
| 0.3083 | 5.0 | 480 | 0.2443 | 0.4891 | 0.3879 | 0.4327 | 116 | 0.4848 | 0.7089 | 0.5758 | 158 | 0.5269 | 0.7097 | 0.6048 | 124 | 0.5 | 0.6156 | 0.5518 | 0.9306 |
| 0.2762 | 6.0 | 576 | 0.2112 | 0.5299 | 0.5345 | 0.5322 | 116 | 0.5476 | 0.7278 | 0.625 | 158 | 0.6196 | 0.8145 | 0.7038 | 124 | 0.5673 | 0.6985 | 0.6261 | 0.9448 |
| 0.2372 | 7.0 | 672 | 0.1746 | 0.56 | 0.6034 | 0.5809 | 116 | 0.6205 | 0.7658 | 0.6856 | 158 | 0.7415 | 0.8790 | 0.8044 | 124 | 0.6424 | 0.7538 | 0.6936 | 0.9555 |
| 0.2032 | 8.0 | 768 | 0.1440 | 0.6423 | 0.6810 | 0.6611 | 116 | 0.6978 | 0.8038 | 0.7471 | 158 | 0.7832 | 0.9032 | 0.8390 | 124 | 0.7098 | 0.7990 | 0.7518 | 0.9649 |
| 0.1698 | 9.0 | 864 | 0.1209 | 0.7016 | 0.75 | 0.725 | 116 | 0.7374 | 0.8354 | 0.7834 | 158 | 0.8156 | 0.9274 | 0.8679 | 124 | 0.7523 | 0.8392 | 0.7933 | 0.9679 |
| 0.1454 | 10.0 | 960 | 0.1023 | 0.7742 | 0.8276 | 0.8000 | 116 | 0.7882 | 0.8481 | 0.8171 | 158 | 0.9023 | 0.9677 | 0.9339 | 124 | 0.8197 | 0.8794 | 0.8485 | 0.9739 |
| 0.1335 | 11.0 | 1056 | 0.0897 | 0.8 | 0.8621 | 0.8299 | 116 | 0.8171 | 0.8481 | 0.8323 | 158 | 0.9030 | 0.9758 | 0.9380 | 124 | 0.8392 | 0.8920 | 0.8648 | 0.9756 |
| 0.1188 | 12.0 | 1152 | 0.0829 | 0.7891 | 0.8707 | 0.8279 | 116 | 0.8193 | 0.8608 | 0.8395 | 158 | 0.9037 | 0.9839 | 0.9421 | 124 | 0.8368 | 0.9020 | 0.8682 | 0.9753 |
| 0.1117 | 13.0 | 1248 | 0.0753 | 0.8211 | 0.8707 | 0.8452 | 116 | 0.8333 | 0.8861 | 0.8589 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.8667 | 0.9146 | 0.8900 | 0.9783 |
| 0.1055 | 14.0 | 1344 | 0.0695 | 0.8145 | 0.8707 | 0.8417 | 116 | 0.8415 | 0.8734 | 0.8571 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.8702 | 0.9095 | 0.8894 | 0.9783 |
| 0.1008 | 15.0 | 1440 | 0.0685 | 0.8443 | 0.8879 | 0.8655 | 116 | 0.8263 | 0.8734 | 0.8492 | 158 | 0.9389 | 0.9919 | 0.9647 | 124 | 0.8667 | 0.9146 | 0.8900 | 0.9772 |
| 0.0978 | 16.0 | 1536 | 0.0678 | 0.8062 | 0.8966 | 0.8490 | 116 | 0.8225 | 0.8797 | 0.8502 | 158 | 0.9462 | 0.9919 | 0.9685 | 124 | 0.8551 | 0.9196 | 0.8862 | 0.9764 |
| 0.0929 | 17.0 | 1632 | 0.0605 | 0.824 | 0.8879 | 0.8548 | 116 | 0.85 | 0.8608 | 0.8553 | 158 | 0.9609 | 0.9919 | 0.9762 | 124 | 0.8765 | 0.9095 | 0.8927 | 0.9797 |
| 0.0854 | 18.0 | 1728 | 0.0591 | 0.8346 | 0.9138 | 0.8724 | 116 | 0.8742 | 0.8797 | 0.8770 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.8932 | 0.9246 | 0.9086 | 0.9811 |
| 0.0834 | 19.0 | 1824 | 0.0554 | 0.8618 | 0.9138 | 0.8870 | 116 | 0.8868 | 0.8924 | 0.8896 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9046 | 0.9296 | 0.9170 | 0.9824 |
| 0.0815 | 20.0 | 1920 | 0.0561 | 0.848 | 0.9138 | 0.8797 | 116 | 0.8812 | 0.8924 | 0.8868 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9002 | 0.9296 | 0.9147 | 0.9805 |
| 0.0807 | 21.0 | 2016 | 0.0549 | 0.8492 | 0.9224 | 0.8843 | 116 | 0.8659 | 0.8987 | 0.8820 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.8942 | 0.9347 | 0.9140 | 0.9808 |
| 0.078 | 22.0 | 2112 | 0.0527 | 0.8689 | 0.9138 | 0.8908 | 116 | 0.8889 | 0.9114 | 0.9000 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9098 | 0.9372 | 0.9233 | 0.9824 |
| 0.0738 | 23.0 | 2208 | 0.0506 | 0.8710 | 0.9310 | 0.9 | 116 | 0.9045 | 0.8987 | 0.9016 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9165 | 0.9372 | 0.9267 | 0.9835 |
| 0.0733 | 24.0 | 2304 | 0.0498 | 0.8699 | 0.9224 | 0.8954 | 116 | 0.8812 | 0.8924 | 0.8868 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9071 | 0.9322 | 0.9195 | 0.9830 |
| 0.0704 | 25.0 | 2400 | 0.0477 | 0.8843 | 0.9224 | 0.9030 | 116 | 0.9051 | 0.9051 | 0.9051 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9210 | 0.9372 | 0.9290 | 0.9841 |
| 0.0698 | 26.0 | 2496 | 0.0467 | 0.8824 | 0.9052 | 0.8936 | 116 | 0.9006 | 0.9177 | 0.9091 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9187 | 0.9372 | 0.9279 | 0.9844 |
| 0.0666 | 27.0 | 2592 | 0.0470 | 0.8926 | 0.9310 | 0.9114 | 116 | 0.8795 | 0.9241 | 0.9012 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9128 | 0.9472 | 0.9297 | 0.9835 |
| 0.0649 | 28.0 | 2688 | 0.0465 | 0.9016 | 0.9483 | 0.9244 | 116 | 0.8855 | 0.9304 | 0.9074 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9179 | 0.9548 | 0.9360 | 0.9852 |
| 0.0648 | 29.0 | 2784 | 0.0445 | 0.8908 | 0.9138 | 0.9021 | 116 | 0.8970 | 0.9367 | 0.9164 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9195 | 0.9472 | 0.9332 | 0.9846 |
| 0.0614 | 30.0 | 2880 | 0.0440 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9024 | 0.9367 | 0.9193 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9289 | 0.9523 | 0.9404 | 0.9860 |
| 0.0618 | 31.0 | 2976 | 0.0422 | 0.9 | 0.9310 | 0.9153 | 116 | 0.9068 | 0.9241 | 0.9154 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9263 | 0.9472 | 0.9366 | 0.9857 |
| 0.0613 | 32.0 | 3072 | 0.0432 | 0.9008 | 0.9397 | 0.9198 | 116 | 0.9286 | 0.9051 | 0.9167 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9375 | 0.9422 | 0.9398 | 0.9865 |
| 0.0581 | 33.0 | 3168 | 0.0426 | 0.9091 | 0.9483 | 0.9283 | 116 | 0.9119 | 0.9177 | 0.9148 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9287 | 0.9497 | 0.9391 | 0.9863 |
| 0.0586 | 34.0 | 3264 | 0.0422 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9182 | 0.9241 | 0.9211 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9356 | 0.9497 | 0.9426 | 0.9868 |
| 0.0557 | 35.0 | 3360 | 0.0408 | 0.9231 | 0.9310 | 0.9270 | 116 | 0.8970 | 0.9367 | 0.9164 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9289 | 0.9523 | 0.9404 | 0.9865 |
| 0.0531 | 36.0 | 3456 | 0.0419 | 0.9091 | 0.9483 | 0.9283 | 116 | 0.9018 | 0.9304 | 0.9159 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9246 | 0.9548 | 0.9394 | 0.9863 |
| 0.0546 | 37.0 | 3552 | 0.0402 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9241 | 0.9241 | 0.9241 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9358 | 0.9523 | 0.9440 | 0.9868 |
| 0.0521 | 38.0 | 3648 | 0.0395 | 0.9068 | 0.9224 | 0.9145 | 116 | 0.9245 | 0.9304 | 0.9274 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9378 | 0.9472 | 0.9425 | 0.9871 |
| 0.0535 | 39.0 | 3744 | 0.0391 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9136 | 0.9367 | 0.9250 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9360 | 0.9548 | 0.9453 | 0.9874 |
| 0.0504 | 40.0 | 3840 | 0.0393 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9299 | 0.9241 | 0.9270 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9428 | 0.9523 | 0.9475 | 0.9876 |
| 0.0511 | 41.0 | 3936 | 0.0392 | 0.9083 | 0.9397 | 0.9237 | 116 | 0.8957 | 0.9241 | 0.9097 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9242 | 0.9497 | 0.9368 | 0.9868 |
| 0.0481 | 42.0 | 4032 | 0.0379 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9304 | 0.9304 | 0.9304 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9453 | 0.9548 | 0.95 | 0.9879 |
| 0.0509 | 43.0 | 4128 | 0.0386 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9245 | 0.9304 | 0.9274 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9428 | 0.9523 | 0.9475 | 0.9879 |
| 0.0498 | 44.0 | 4224 | 0.0382 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9125 | 0.9241 | 0.9182 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9358 | 0.9523 | 0.9440 | 0.9879 |
| 0.0468 | 45.0 | 4320 | 0.0376 | 0.9083 | 0.9397 | 0.9237 | 116 | 0.9241 | 0.9241 | 0.9241 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9380 | 0.9497 | 0.9438 | 0.9876 |
| 0.0449 | 46.0 | 4416 | 0.0372 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9187 | 0.9304 | 0.9245 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9383 | 0.9548 | 0.9465 | 0.9882 |
| 0.0466 | 47.0 | 4512 | 0.0368 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9074 | 0.9304 | 0.9187 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9312 | 0.9523 | 0.9416 | 0.9874 |
| 0.0439 | 48.0 | 4608 | 0.0372 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9080 | 0.9367 | 0.9221 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9338 | 0.9573 | 0.9454 | 0.9882 |
| 0.045 | 49.0 | 4704 | 0.0369 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9187 | 0.9304 | 0.9245 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9383 | 0.9548 | 0.9465 | 0.9882 |
| 0.0439 | 50.0 | 4800 | 0.0367 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9187 | 0.9304 | 0.9245 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9360 | 0.9548 | 0.9453 | 0.9885 |
| 0.0453 | 51.0 | 4896 | 0.0372 | 0.9083 | 0.9397 | 0.9237 | 116 | 0.9182 | 0.9241 | 0.9211 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9356 | 0.9497 | 0.9426 | 0.9879 |
| 0.0447 | 52.0 | 4992 | 0.0365 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8855 | 0.9304 | 0.9074 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9220 | 0.9497 | 0.9356 | 0.9868 |
| 0.0424 | 53.0 | 5088 | 0.0366 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9313 | 0.9430 | 0.9371 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9409 | 0.9598 | 0.9502 | 0.9887 |
| 0.042 | 54.0 | 5184 | 0.0360 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9198 | 0.9430 | 0.9312 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9361 | 0.9573 | 0.9466 | 0.9882 |
| 0.0407 | 55.0 | 5280 | 0.0355 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9136 | 0.9367 | 0.9250 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9337 | 0.9548 | 0.9441 | 0.9879 |
| 0.0414 | 56.0 | 5376 | 0.0353 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9198 | 0.9430 | 0.9312 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9361 | 0.9573 | 0.9466 | 0.9882 |
| 0.0417 | 57.0 | 5472 | 0.0356 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9255 | 0.9430 | 0.9342 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9384 | 0.9573 | 0.9478 | 0.9885 |
| 0.0395 | 58.0 | 5568 | 0.0359 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9490 | 0.9430 | 0.9460 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9479 | 0.9598 | 0.9538 | 0.9890 |
| 0.0403 | 59.0 | 5664 | 0.0362 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9198 | 0.9430 | 0.9312 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9361 | 0.9573 | 0.9466 | 0.9876 |
| 0.0413 | 60.0 | 5760 | 0.0354 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9432 | 0.9598 | 0.9514 | 0.9890 |
| 0.0391 | 61.0 | 5856 | 0.0363 | 0.9083 | 0.9397 | 0.9237 | 116 | 0.9304 | 0.9304 | 0.9304 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9381 | 0.9523 | 0.9451 | 0.9887 |
| 0.0381 | 62.0 | 5952 | 0.0360 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9432 | 0.9598 | 0.9514 | 0.9885 |
| 0.0388 | 63.0 | 6048 | 0.0359 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9313 | 0.9430 | 0.9371 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9409 | 0.9598 | 0.9502 | 0.9882 |
| 0.0377 | 64.0 | 6144 | 0.0364 | 0.8992 | 0.9224 | 0.9106 | 116 | 0.9308 | 0.9367 | 0.9338 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9380 | 0.9497 | 0.9438 | 0.9887 |
| 0.0356 | 65.0 | 6240 | 0.0358 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9430 | 0.9430 | 0.9430 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9455 | 0.9598 | 0.9526 | 0.9887 |
| 0.0377 | 66.0 | 6336 | 0.0358 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9313 | 0.9430 | 0.9371 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9882 |
| 0.0356 | 67.0 | 6432 | 0.0357 | 0.9076 | 0.9310 | 0.9191 | 116 | 0.9308 | 0.9367 | 0.9338 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9381 | 0.9523 | 0.9451 | 0.9885 |
| 0.0363 | 68.0 | 6528 | 0.0353 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9454 | 0.9573 | 0.9513 | 0.9893 |
| 0.0354 | 69.0 | 6624 | 0.0352 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9454 | 0.9573 | 0.9513 | 0.9893 |
| 0.0371 | 70.0 | 6720 | 0.0349 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9255 | 0.9430 | 0.9342 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9361 | 0.9573 | 0.9466 | 0.9882 |
| 0.035 | 71.0 | 6816 | 0.0347 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9255 | 0.9430 | 0.9342 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9887 |
| 0.0352 | 72.0 | 6912 | 0.0356 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9427 | 0.9367 | 0.9397 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9454 | 0.9573 | 0.9513 | 0.9896 |
| 0.0346 | 73.0 | 7008 | 0.0349 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9430 | 0.9430 | 0.9430 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9454 | 0.9573 | 0.9513 | 0.9887 |
| 0.036 | 74.0 | 7104 | 0.0350 | 0.9076 | 0.9310 | 0.9191 | 116 | 0.9430 | 0.9430 | 0.9430 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9429 | 0.9548 | 0.9488 | 0.9890 |
| 0.0355 | 75.0 | 7200 | 0.0346 | 0.9076 | 0.9310 | 0.9191 | 116 | 0.9430 | 0.9430 | 0.9430 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9429 | 0.9548 | 0.9488 | 0.9893 |
| 0.0341 | 76.0 | 7296 | 0.0348 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9313 | 0.9430 | 0.9371 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9384 | 0.9573 | 0.9478 | 0.9885 |
| 0.0336 | 77.0 | 7392 | 0.0350 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9487 | 0.9367 | 0.9427 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9501 | 0.9573 | 0.9537 | 0.9896 |
| 0.0339 | 78.0 | 7488 | 0.0341 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9255 | 0.9430 | 0.9342 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9361 | 0.9573 | 0.9466 | 0.9882 |
| 0.034 | 79.0 | 7584 | 0.0350 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9141 | 0.9430 | 0.9283 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9315 | 0.9573 | 0.9442 | 0.9876 |
| 0.0345 | 80.0 | 7680 | 0.0346 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9313 | 0.9430 | 0.9371 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9384 | 0.9573 | 0.9478 | 0.9885 |
| 0.0335 | 81.0 | 7776 | 0.0344 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9198 | 0.9430 | 0.9312 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9338 | 0.9573 | 0.9454 | 0.9879 |
| 0.0326 | 82.0 | 7872 | 0.0342 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9313 | 0.9430 | 0.9371 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9384 | 0.9573 | 0.9478 | 0.9885 |
| 0.034 | 83.0 | 7968 | 0.0339 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9313 | 0.9430 | 0.9371 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9384 | 0.9573 | 0.9478 | 0.9885 |
| 0.0329 | 84.0 | 8064 | 0.0347 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9430 | 0.9430 | 0.9430 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9432 | 0.9598 | 0.9514 | 0.9885 |
| 0.0344 | 85.0 | 8160 | 0.0342 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9313 | 0.9430 | 0.9371 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9384 | 0.9573 | 0.9478 | 0.9885 |
| 0.0321 | 86.0 | 8256 | 0.0342 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9431 | 0.9573 | 0.9501 | 0.9885 |
| 0.0323 | 87.0 | 8352 | 0.0346 | 0.9167 | 0.9483 | 0.9322 | 116 | 0.9430 | 0.9430 | 0.9430 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9455 | 0.9598 | 0.9526 | 0.9887 |
| 0.0316 | 88.0 | 8448 | 0.0345 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9882 |
| 0.0312 | 89.0 | 8544 | 0.0348 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9882 |
| 0.0324 | 90.0 | 8640 | 0.0347 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9490 | 0.9430 | 0.9460 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9478 | 0.9573 | 0.9525 | 0.9893 |
| 0.0338 | 91.0 | 8736 | 0.0343 | 0.9076 | 0.9310 | 0.9191 | 116 | 0.9430 | 0.9430 | 0.9430 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9406 | 0.9548 | 0.9476 | 0.9890 |
| 0.0319 | 92.0 | 8832 | 0.0340 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9882 |
| 0.0322 | 93.0 | 8928 | 0.0339 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9887 |
| 0.031 | 94.0 | 9024 | 0.0338 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9313 | 0.9430 | 0.9371 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9384 | 0.9573 | 0.9478 | 0.9885 |
| 0.0337 | 95.0 | 9120 | 0.0339 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9313 | 0.9430 | 0.9371 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9384 | 0.9573 | 0.9478 | 0.9885 |
| 0.0324 | 96.0 | 9216 | 0.0339 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9882 |
| 0.0318 | 97.0 | 9312 | 0.0339 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9882 |
| 0.0323 | 98.0 | 9408 | 0.0341 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9882 |
| 0.0316 | 99.0 | 9504 | 0.0341 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9882 |
| 0.0325 | 100.0 | 9600 | 0.0340 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9371 | 0.9430 | 0.9401 | 158 | 0.9685 | 0.9919 | 0.9801 | 124 | 0.9407 | 0.9573 | 0.9489 | 0.9882 |

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
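A hedged usage sketch. It assumes the checkpoint is published as a standard token-classification model under the repo id apwic/nerui-seq_bn-rf64-1 (the id from the model tree); note that the "seq_bn-rf64" suffix suggests a sequential bottleneck-adapter fine-tune with reduction factor 64, in which case loading via the adapters library may be required instead of a plain pipeline:

```python
# Hedged sketch: load the checkpoint as a token-classification pipeline.
# The example sentence and the assumption that the repo hosts a full model
# (rather than an adapter-only checkpoint) are ours, not from the card.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="apwic/nerui-seq_bn-rf64-1",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

# Indonesian input: "Joko Widodo visited Surabaya with Pertamina officials."
for entity in ner("Joko Widodo mengunjungi Surabaya bersama pejabat Pertamina."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```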

Model tree for apwic/nerui-seq_bn-rf64-1

  • Fine-tuned from indolem/indobert-base-uncased