nerui-seq_bn-rf64-3

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0452
  • Location Precision: 0.8632
  • Location Recall: 0.9535
  • Location F1: 0.9061
  • Location Number: 86
  • Organization Precision: 0.9111
  • Organization Recall: 0.9213
  • Organization F1: 0.9162
  • Organization Number: 178
  • Person Precision: 0.9688
  • Person Recall: 0.9688
  • Person F1: 0.9688
  • Person Number: 128
  • Overall Precision: 0.9181
  • Overall Recall: 0.9439
  • Overall F1: 0.9308
  • Overall Accuracy: 0.9852
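The overall scores are micro-averages across the three entity types. As a sanity check, they can be re-derived from the per-entity precision, recall, and support listed above; this is a pure-Python sketch in which true-positive and predicted-span counts are recovered by rounding, not taken from the actual evaluation code:

```python
# Per-entity evaluation numbers from the summary above.
supports   = {"Location": 86,     "Organization": 178,    "Person": 128}
recalls    = {"Location": 0.9535, "Organization": 0.9213, "Person": 0.9688}
precisions = {"Location": 0.8632, "Organization": 0.9111, "Person": 0.9688}

# Recover integer counts: TP = recall * support, predicted = TP / precision.
tp   = {k: round(recalls[k] * supports[k]) for k in supports}      # 82 + 164 + 124 = 370
pred = {k: round(tp[k] / precisions[k]) for k in supports}         # 95 + 180 + 128 = 403

micro_recall    = sum(tp.values()) / sum(supports.values())        # 370 / 392
micro_precision = sum(tp.values()) / sum(pred.values())            # 370 / 403
micro_f1 = 2 * micro_precision * micro_recall / (micro_precision + micro_recall)

print(round(micro_precision, 4), round(micro_recall, 4), round(micro_f1, 4))
# → 0.9181 0.9439 0.9308, matching the Overall Precision/Recall/F1 above
```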

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
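The training results table shows 96 optimizer steps per epoch, so 100 epochs give 9,600 steps in total. With a linear scheduler, the learning rate decays from 5e-05 to zero over those steps; the sketch below assumes no warmup (none is listed) and mirrors the shape of transformers' `get_linear_schedule_with_warmup`:

```python
def linear_lr(step, base_lr=5e-05, total_steps=9600, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule.

    Linearly ramps up over warmup_steps (if any), then decays
    linearly to zero at total_steps.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))     # 5e-05 at the start of training
print(linear_lr(4800))  # 2.5e-05 halfway through (epoch 50)
print(linear_lr(9600))  # 0.0 at the final step
```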

Training results

Training Loss | Epoch | Step | Validation Loss | Location P / R / F1 / Number | Organization P / R / F1 / Number | Person P / R / F1 / Number | Overall P / R / F1 / Accuracy
1.062 1.0 96 0.6283 0.0 0.0 0.0 86 0.0 0.0 0.0 178 0.0 0.0 0.0 128 0.0 0.0 0.0 0.8435
0.5934 2.0 192 0.4368 0.0 0.0 0.0 86 0.3462 0.0506 0.0882 178 0.2647 0.0703 0.1111 128 0.2951 0.0459 0.0795 0.8507
0.4331 3.0 288 0.3204 0.5667 0.1977 0.2931 86 0.3977 0.3933 0.3955 178 0.3441 0.5 0.4076 128 0.3852 0.3852 0.3852 0.8955
0.3496 4.0 384 0.2769 0.5323 0.3837 0.4459 86 0.4512 0.5449 0.4936 178 0.4080 0.6406 0.4985 128 0.4435 0.5408 0.4874 0.9193
0.303 5.0 480 0.2427 0.5806 0.4186 0.4865 86 0.4937 0.6629 0.5659 178 0.4946 0.7109 0.5833 128 0.5052 0.625 0.5587 0.9314
0.2698 6.0 576 0.2096 0.6533 0.5698 0.6087 86 0.5701 0.7079 0.6316 178 0.6131 0.8047 0.6959 128 0.5991 0.7092 0.6495 0.9439
0.2305 7.0 672 0.1785 0.6585 0.6279 0.6429 86 0.6301 0.7753 0.6952 178 0.7434 0.8828 0.8071 128 0.6733 0.7781 0.7219 0.9557
0.1941 8.0 768 0.1476 0.7711 0.7442 0.7574 86 0.7340 0.7753 0.7541 178 0.8156 0.8984 0.8550 128 0.7694 0.8087 0.7886 0.9641
0.1651 9.0 864 0.1254 0.7634 0.8256 0.7933 86 0.7475 0.8315 0.7872 178 0.8369 0.9219 0.8773 128 0.7801 0.8597 0.8180 0.9665
0.1432 10.0 960 0.1102 0.7474 0.8256 0.7845 86 0.7708 0.8315 0.8000 178 0.8897 0.9453 0.9167 128 0.8038 0.8673 0.8344 0.9690
0.1296 11.0 1056 0.1010 0.7273 0.8372 0.7784 86 0.7551 0.8315 0.7914 178 0.9118 0.9688 0.9394 128 0.7981 0.8776 0.8360 0.9703
0.1207 12.0 1152 0.0945 0.7526 0.8488 0.7978 86 0.7857 0.8652 0.8235 178 0.8921 0.9688 0.9288 128 0.8125 0.8954 0.8519 0.9714
0.11 13.0 1248 0.0859 0.7579 0.8372 0.7956 86 0.8095 0.8596 0.8338 178 0.9612 0.9688 0.9650 128 0.8450 0.8903 0.8671 0.9752
0.1041 14.0 1344 0.0841 0.7474 0.8256 0.7845 86 0.7978 0.8202 0.8089 178 0.9323 0.9688 0.9502 128 0.8297 0.8699 0.8493 0.9730
0.1023 15.0 1440 0.0778 0.7826 0.8372 0.8090 86 0.8281 0.8933 0.8595 178 0.9538 0.9688 0.9612 128 0.8575 0.9056 0.8809 0.9765
0.0924 16.0 1536 0.0765 0.7660 0.8372 0.8 86 0.8247 0.8989 0.8602 178 0.9323 0.9688 0.9502 128 0.8456 0.9082 0.8758 0.9754
0.0914 17.0 1632 0.0723 0.7935 0.8488 0.8202 86 0.8519 0.9045 0.8774 178 0.9394 0.9688 0.9538 128 0.8668 0.9133 0.8894 0.9773
0.0878 18.0 1728 0.0706 0.7849 0.8488 0.8156 86 0.8182 0.8596 0.8384 178 0.9538 0.9688 0.9612 128 0.8537 0.8929 0.8728 0.9760
0.0838 19.0 1824 0.0692 0.7789 0.8605 0.8177 86 0.8396 0.8820 0.8603 178 0.9538 0.9688 0.9612 128 0.8617 0.9056 0.8831 0.9765
0.0826 20.0 1920 0.0662 0.7979 0.8721 0.8333 86 0.8360 0.8876 0.8610 178 0.9612 0.9688 0.9650 128 0.8665 0.9107 0.8881 0.9779
0.0783 21.0 2016 0.0659 0.8085 0.8837 0.8444 86 0.8579 0.8820 0.8698 178 0.9612 0.9688 0.9650 128 0.8793 0.9107 0.8947 0.9784
0.0792 22.0 2112 0.0644 0.7979 0.8721 0.8333 86 0.8333 0.8989 0.8649 178 0.9612 0.9688 0.9650 128 0.8651 0.9158 0.8897 0.9784
0.0734 23.0 2208 0.0628 0.8105 0.8953 0.8508 86 0.8641 0.8933 0.8785 178 0.9612 0.9688 0.9650 128 0.8824 0.9184 0.9 0.9795
0.0741 24.0 2304 0.0637 0.8021 0.8953 0.8462 86 0.8396 0.8820 0.8603 178 0.9612 0.9688 0.9650 128 0.8689 0.9133 0.8905 0.9773
0.0704 25.0 2400 0.0620 0.8191 0.8953 0.8556 86 0.8449 0.8876 0.8658 178 0.9538 0.9688 0.9612 128 0.8735 0.9158 0.8941 0.9784
0.0691 26.0 2496 0.0611 0.8105 0.8953 0.8508 86 0.8478 0.8764 0.8619 178 0.9612 0.9688 0.9650 128 0.875 0.9107 0.8925 0.9784
0.0668 27.0 2592 0.0581 0.7917 0.8837 0.8352 86 0.8663 0.9101 0.8877 178 0.9612 0.9688 0.9650 128 0.8786 0.9235 0.9005 0.9798
0.0658 28.0 2688 0.0581 0.8021 0.8953 0.8462 86 0.8457 0.8933 0.8689 178 0.9612 0.9688 0.9650 128 0.8717 0.9184 0.8944 0.9795
0.0659 29.0 2784 0.0561 0.8280 0.8953 0.8603 86 0.8571 0.9101 0.8828 178 0.9612 0.9688 0.9650 128 0.8832 0.9260 0.9041 0.9814
0.0629 30.0 2880 0.0548 0.8172 0.8837 0.8492 86 0.8571 0.9101 0.8828 178 0.9612 0.9688 0.9650 128 0.8808 0.9235 0.9016 0.9814
0.0606 31.0 2976 0.0562 0.8 0.8837 0.8398 86 0.8579 0.8820 0.8698 178 0.9612 0.9688 0.9650 128 0.8771 0.9107 0.8936 0.9795
0.0596 32.0 3072 0.0550 0.7857 0.8953 0.8370 86 0.8763 0.9157 0.8956 178 0.9612 0.9688 0.9650 128 0.8814 0.9286 0.9043 0.9806
0.0614 33.0 3168 0.0551 0.8085 0.8837 0.8444 86 0.8895 0.9045 0.8969 178 0.9612 0.9688 0.9650 128 0.8936 0.9209 0.9070 0.9803
0.0585 34.0 3264 0.0532 0.8172 0.8837 0.8492 86 0.8763 0.9157 0.8956 178 0.9612 0.9688 0.9650 128 0.8897 0.9260 0.9075 0.9816
0.0568 35.0 3360 0.0541 0.8105 0.8953 0.8508 86 0.8895 0.9045 0.8969 178 0.9612 0.9688 0.9650 128 0.8938 0.9235 0.9084 0.9816
0.0576 36.0 3456 0.0546 0.7980 0.9186 0.8541 86 0.8703 0.9045 0.8871 178 0.9766 0.9766 0.9766 128 0.8859 0.9311 0.9080 0.9803
0.0554 37.0 3552 0.0523 0.8280 0.8953 0.8603 86 0.8852 0.9101 0.8975 178 0.9612 0.9688 0.9650 128 0.8963 0.9260 0.9109 0.9819
0.0551 38.0 3648 0.0538 0.8191 0.8953 0.8556 86 0.8791 0.8989 0.8889 178 0.9612 0.9688 0.9650 128 0.8914 0.9209 0.9059 0.9811
0.0514 39.0 3744 0.0527 0.8125 0.9070 0.8571 86 0.9061 0.9213 0.9136 178 0.9612 0.9688 0.9650 128 0.9015 0.9337 0.9173 0.9819
0.0536 40.0 3840 0.0524 0.8211 0.9070 0.8619 86 0.8950 0.9101 0.9025 178 0.9612 0.9688 0.9650 128 0.8988 0.9286 0.9134 0.9819
0.0514 41.0 3936 0.0512 0.8105 0.8953 0.8508 86 0.8956 0.9157 0.9056 178 0.9688 0.9688 0.9688 128 0.8988 0.9286 0.9134 0.9822
0.0503 42.0 4032 0.0490 0.8191 0.8953 0.8556 86 0.8877 0.9326 0.9096 178 0.9688 0.9688 0.9688 128 0.8973 0.9362 0.9164 0.9825
0.0512 43.0 4128 0.0497 0.8333 0.9302 0.8791 86 0.9016 0.9270 0.9141 178 0.9688 0.9688 0.9688 128 0.9066 0.9413 0.9237 0.9838
0.051 44.0 4224 0.0492 0.8421 0.9302 0.8840 86 0.8967 0.9270 0.9116 178 0.9688 0.9688 0.9688 128 0.9066 0.9413 0.9237 0.9833
0.0489 45.0 4320 0.0494 0.8191 0.8953 0.8556 86 0.8956 0.9157 0.9056 178 0.9688 0.9688 0.9688 128 0.9010 0.9286 0.9146 0.9822
0.05 46.0 4416 0.0509 0.8265 0.9419 0.8804 86 0.8962 0.9213 0.9086 178 0.9688 0.9688 0.9688 128 0.9022 0.9413 0.9213 0.9822
0.0473 47.0 4512 0.0480 0.8191 0.8953 0.8556 86 0.9071 0.9326 0.9197 178 0.9688 0.9688 0.9688 128 0.9062 0.9362 0.9210 0.9827
0.0454 48.0 4608 0.0489 0.8191 0.8953 0.8556 86 0.9116 0.9270 0.9192 178 0.9688 0.9688 0.9688 128 0.9082 0.9337 0.9208 0.9827
0.0468 49.0 4704 0.0498 0.8125 0.9070 0.8571 86 0.9101 0.9101 0.9101 178 0.9688 0.9688 0.9688 128 0.9055 0.9286 0.9169 0.9835
0.044 50.0 4800 0.0488 0.8105 0.8953 0.8508 86 0.9006 0.9157 0.9081 178 0.9688 0.9688 0.9688 128 0.9010 0.9286 0.9146 0.9838
0.0439 51.0 4896 0.0495 0.8526 0.9419 0.8950 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9136 0.9439 0.9285 0.9835
0.0451 52.0 4992 0.0490 0.8172 0.8837 0.8492 86 0.9106 0.9157 0.9132 178 0.9688 0.9688 0.9688 128 0.9075 0.9260 0.9167 0.9841
0.0438 53.0 5088 0.0491 0.8351 0.9419 0.8852 86 0.9006 0.9157 0.9081 178 0.9688 0.9688 0.9688 128 0.9064 0.9388 0.9223 0.9827
0.0433 54.0 5184 0.0464 0.8387 0.9070 0.8715 86 0.9121 0.9326 0.9222 178 0.9688 0.9688 0.9688 128 0.9132 0.9388 0.9258 0.9838
0.0441 55.0 5280 0.0477 0.8333 0.9302 0.8791 86 0.8967 0.9270 0.9116 178 0.9688 0.9688 0.9688 128 0.9044 0.9413 0.9225 0.9838
0.0441 56.0 5376 0.0470 0.8333 0.9302 0.8791 86 0.8919 0.9270 0.9091 178 0.9688 0.9688 0.9688 128 0.9022 0.9413 0.9213 0.9841
0.0419 57.0 5472 0.0468 0.8454 0.9535 0.8962 86 0.8871 0.9270 0.9066 178 0.9688 0.9688 0.9688 128 0.9027 0.9464 0.9240 0.9852
0.0421 58.0 5568 0.0474 0.8316 0.9186 0.8729 86 0.9101 0.9101 0.9101 178 0.9688 0.9688 0.9688 128 0.9102 0.9311 0.9206 0.9841
0.0385 59.0 5664 0.0472 0.8298 0.9070 0.8667 86 0.9050 0.9101 0.9076 178 0.9688 0.9688 0.9688 128 0.9077 0.9286 0.9180 0.9841
0.0424 60.0 5760 0.0453 0.8298 0.9070 0.8667 86 0.9011 0.9213 0.9111 178 0.9688 0.9688 0.9688 128 0.9059 0.9337 0.9196 0.9838
0.0399 61.0 5856 0.0446 0.8404 0.9186 0.8778 86 0.8919 0.9270 0.9091 178 0.9688 0.9688 0.9688 128 0.9042 0.9388 0.9212 0.9841
0.0393 62.0 5952 0.0454 0.8421 0.9302 0.8840 86 0.9 0.9101 0.9050 178 0.9688 0.9688 0.9688 128 0.9082 0.9337 0.9208 0.9841
0.0386 63.0 6048 0.0455 0.8454 0.9535 0.8962 86 0.8967 0.9270 0.9116 178 0.9688 0.9688 0.9688 128 0.9071 0.9464 0.9263 0.9846
0.0386 64.0 6144 0.0443 0.8387 0.9070 0.8715 86 0.8817 0.9213 0.9011 178 0.9688 0.9688 0.9688 128 0.8993 0.9337 0.9161 0.9843
0.0378 65.0 6240 0.0457 0.8351 0.9419 0.8852 86 0.8973 0.9326 0.9146 178 0.9688 0.9688 0.9688 128 0.9049 0.9464 0.9252 0.9841
0.0376 66.0 6336 0.0458 0.8421 0.9302 0.8840 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9111 0.9413 0.9260 0.9846
0.0362 67.0 6432 0.0456 0.8421 0.9302 0.8840 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9111 0.9413 0.9260 0.9852
0.0366 68.0 6528 0.0458 0.8617 0.9419 0.9000 86 0.9011 0.9213 0.9111 178 0.9688 0.9688 0.9688 128 0.9134 0.9413 0.9271 0.9841
0.038 69.0 6624 0.0459 0.8617 0.9419 0.9000 86 0.9162 0.9213 0.9188 178 0.9688 0.9688 0.9688 128 0.9202 0.9413 0.9306 0.9849
0.0364 70.0 6720 0.0450 0.8526 0.9419 0.8950 86 0.9016 0.9270 0.9141 178 0.9688 0.9688 0.9688 128 0.9113 0.9439 0.9273 0.9849
0.0361 71.0 6816 0.0453 0.8723 0.9535 0.9111 86 0.9022 0.9326 0.9171 178 0.9688 0.9688 0.9688 128 0.9163 0.9490 0.9323 0.9857
0.0358 72.0 6912 0.0475 0.8632 0.9535 0.9061 86 0.9 0.9101 0.9050 178 0.9612 0.9688 0.9650 128 0.9109 0.9388 0.9246 0.9852
0.0356 73.0 7008 0.0463 0.8316 0.9186 0.8729 86 0.9106 0.9157 0.9132 178 0.9688 0.9688 0.9688 128 0.9104 0.9337 0.9219 0.9852
0.0366 74.0 7104 0.0460 0.8723 0.9535 0.9111 86 0.9106 0.9157 0.9132 178 0.9688 0.9688 0.9688 128 0.9202 0.9413 0.9306 0.9846
0.0345 75.0 7200 0.0456 0.8542 0.9535 0.9011 86 0.9061 0.9213 0.9136 178 0.9688 0.9688 0.9688 128 0.9136 0.9439 0.9285 0.9846
0.0352 76.0 7296 0.0455 0.8333 0.9302 0.8791 86 0.9061 0.9213 0.9136 178 0.9688 0.9688 0.9688 128 0.9086 0.9388 0.9235 0.9852
0.0344 77.0 7392 0.0444 0.8723 0.9535 0.9111 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9183 0.9464 0.9322 0.9854
0.0364 78.0 7488 0.0462 0.8723 0.9535 0.9111 86 0.9061 0.9213 0.9136 178 0.9612 0.9688 0.9650 128 0.9158 0.9439 0.9296 0.9854
0.0351 79.0 7584 0.0451 0.8737 0.9651 0.9171 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9185 0.9490 0.9335 0.9857
0.0359 80.0 7680 0.0453 0.8737 0.9651 0.9171 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9185 0.9490 0.9335 0.9857
0.0338 81.0 7776 0.0448 0.8542 0.9535 0.9011 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9138 0.9464 0.9298 0.9857
0.0349 82.0 7872 0.0455 0.8632 0.9535 0.9061 86 0.9056 0.9157 0.9106 178 0.9612 0.9688 0.9650 128 0.9134 0.9413 0.9271 0.9849
0.0346 83.0 7968 0.0448 0.8723 0.9535 0.9111 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9183 0.9464 0.9322 0.9852
0.0339 84.0 8064 0.0456 0.8632 0.9535 0.9061 86 0.9061 0.9213 0.9136 178 0.9688 0.9688 0.9688 128 0.9158 0.9439 0.9296 0.9849
0.0336 85.0 8160 0.0456 0.8723 0.9535 0.9111 86 0.9111 0.9213 0.9162 178 0.9688 0.9688 0.9688 128 0.9204 0.9439 0.9320 0.9852
0.0333 86.0 8256 0.0453 0.8632 0.9535 0.9061 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9160 0.9464 0.9310 0.9849
0.0354 87.0 8352 0.0452 0.8723 0.9535 0.9111 86 0.9106 0.9157 0.9132 178 0.9688 0.9688 0.9688 128 0.9202 0.9413 0.9306 0.9846
0.0348 88.0 8448 0.0451 0.8723 0.9535 0.9111 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9183 0.9464 0.9322 0.9852
0.034 89.0 8544 0.0450 0.8723 0.9535 0.9111 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9183 0.9464 0.9322 0.9852
0.0337 90.0 8640 0.0451 0.8632 0.9535 0.9061 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9160 0.9464 0.9310 0.9849
0.0325 91.0 8736 0.0451 0.8817 0.9535 0.9162 86 0.9106 0.9157 0.9132 178 0.9688 0.9688 0.9688 128 0.9225 0.9413 0.9318 0.9852
0.0325 92.0 8832 0.0452 0.8817 0.9535 0.9162 86 0.9106 0.9157 0.9132 178 0.9688 0.9688 0.9688 128 0.9225 0.9413 0.9318 0.9852
0.0342 93.0 8928 0.0449 0.8632 0.9535 0.9061 86 0.9066 0.9270 0.9167 178 0.9688 0.9688 0.9688 128 0.9160 0.9464 0.9310 0.9849
0.0321 94.0 9024 0.0449 0.8817 0.9535 0.9162 86 0.9111 0.9213 0.9162 178 0.9688 0.9688 0.9688 128 0.9227 0.9439 0.9332 0.9854
0.0335 95.0 9120 0.0450 0.8817 0.9535 0.9162 86 0.9111 0.9213 0.9162 178 0.9688 0.9688 0.9688 128 0.9227 0.9439 0.9332 0.9857
0.033 96.0 9216 0.0450 0.8723 0.9535 0.9111 86 0.9111 0.9213 0.9162 178 0.9688 0.9688 0.9688 128 0.9204 0.9439 0.9320 0.9852
0.0322 97.0 9312 0.0452 0.8723 0.9535 0.9111 86 0.9111 0.9213 0.9162 178 0.9688 0.9688 0.9688 128 0.9204 0.9439 0.9320 0.9854
0.0333 98.0 9408 0.0451 0.8632 0.9535 0.9061 86 0.9116 0.9270 0.9192 178 0.9688 0.9688 0.9688 128 0.9183 0.9464 0.9322 0.9852
0.0317 99.0 9504 0.0452 0.8632 0.9535 0.9061 86 0.9111 0.9213 0.9162 178 0.9688 0.9688 0.9688 128 0.9181 0.9439 0.9308 0.9852
0.0332 100.0 9600 0.0452 0.8632 0.9535 0.9061 86 0.9111 0.9213 0.9162 178 0.9688 0.9688 0.9688 128 0.9181 0.9439 0.9308 0.9852
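The per-entity scores in this table are span-level, not token-level: a prediction counts as correct only if the entire entity span and its type match the reference. A minimal sketch of BIO span extraction in the seqeval style (a hypothetical helper, not the evaluation code actually used):

```python
def bio_spans(tags):
    """Extract (type, start, end_exclusive) entity spans from BIO tags."""
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # trailing "O" flushes the last span
        # A span ends at "O", at a new "B-", or at an "I-" of a different type.
        boundary = tag == "O" or tag.startswith("B-") or (
            tag.startswith("I-") and tag[2:] != etype
        )
        if boundary and start is not None:
            spans.append((etype, start, i))
            start, etype = None, None
        # "B-" always opens a span; a stray "I-" also opens one (seqeval-style).
        if tag.startswith("B-") or (tag.startswith("I-") and start is None):
            start, etype = i, tag[2:]
    return spans

tags = ["B-PER", "I-PER", "O", "B-LOC", "B-ORG", "I-ORG"]
print(bio_spans(tags))
# → [('PER', 0, 2), ('LOC', 3, 4), ('ORG', 4, 6)]
```

Precision, recall, and F1 per entity type then follow from comparing predicted spans against reference spans as sets.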

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1