nerugm-lora-r16-0

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1358
  • Location Precision: 0.7722
  • Location Recall: 0.8841
  • Location F1: 0.8243
  • Location Number: 69
  • Organization Precision: 0.6377
  • Organization Recall: 0.7586
  • Organization F1: 0.6929
  • Organization Number: 58
  • Person Precision: 0.8424
  • Person Recall: 0.9145
  • Person F1: 0.8770
  • Person Number: 152
  • Quantity Precision: 0.6154
  • Quantity Recall: 0.8
  • Quantity F1: 0.6957
  • Quantity Number: 30
  • Time Precision: 0.7576
  • Time Recall: 0.8621
  • Time F1: 0.8065
  • Time Number: 29
  • Overall Precision: 0.7610
  • Overall Recall: 0.8669
  • Overall F1: 0.8105
  • Overall Accuracy: 0.9577
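The card does not name its evaluation library, but the per-entity precision/recall/F1 and the "Number" counts are in the usual entity-level (seqeval-style) form: a prediction counts as correct only when both span and type match, and F1 is the harmonic mean of precision and recall. A minimal sketch of that scoring convention (my own illustration, not the card's evaluation code; re-deriving F1 from the rounded precision/recall above reproduces the reported values to within rounding):

```python
def bio_to_spans(tags):
    # Convert a BIO tag sequence into (entity_type, start, end) spans.
    # The per-entity "Number" values above count gold spans of each type.
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-") or (tag.startswith("I-") and etype != tag[2:]):
            if etype is not None:
                spans.append((etype, start, i))
            start, etype = i, tag[2:]
        elif tag == "O":
            if etype is not None:
                spans.append((etype, start, i))
            start, etype = None, None
    if etype is not None:
        spans.append((etype, start, len(tags)))
    return spans

def f1_score(precision, recall):
    # Harmonic mean of precision and recall (entity-level F1).
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# e.g. overall: f1_score(0.7610, 0.8669) ~ 0.8105, matching the card
```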

Model description

More information needed

Intended uses & limitations

More information needed
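The card ships no usage snippet. Below is a hypothetical loading sketch, assuming the repository hosts a LoRA (PEFT) adapter to be applied on top of the indolem/indobert-base-uncased base model; the `num_labels=11` value is an illustrative guess (five entity types in a BIO scheme gives 2 × 5 + 1 labels), not something the card confirms:

```python
def load_nerugm(adapter_id="apwic/nerugm-lora-r16-0"):
    """Hypothetical loader: adapter_id format and num_labels are assumptions."""
    from transformers import AutoTokenizer, AutoModelForTokenClassification
    from peft import PeftModel  # pip install peft

    base = "indolem/indobert-base-uncased"
    tokenizer = AutoTokenizer.from_pretrained(base)
    # 5 entity types (Location, Organization, Person, Quantity, Time) in BIO
    # would give 11 labels; check the adapter's config for the real mapping.
    model = AutoModelForTokenClassification.from_pretrained(base, num_labels=11)
    model = PeftModel.from_pretrained(model, adapter_id)
    model.eval()
    return tokenizer, model
```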

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
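The listed hyperparameters can be mirrored in code. A hypothetical reconstruction using `TrainingArguments` from transformers plus a peft `LoraConfig`; the rank `r=16` is inferred from the model name, and `lora_alpha`/`lora_dropout` are illustrative placeholders not stated in the card:

```python
def build_training_config(output_dir="nerugm-lora-r16-0"):
    # TrainingArguments values come from the hyperparameter list above;
    # the LoRA settings other than r=16 are assumptions.
    from transformers import TrainingArguments
    from peft import LoraConfig, TaskType

    args = TrainingArguments(
        output_dir=output_dir,
        learning_rate=5e-5,
        per_device_train_batch_size=16,
        per_device_eval_batch_size=64,
        seed=42,
        adam_beta1=0.9,
        adam_beta2=0.999,
        adam_epsilon=1e-8,
        lr_scheduler_type="linear",
        num_train_epochs=100,
    )
    lora = LoraConfig(
        task_type=TaskType.TOKEN_CLS,
        r=16,            # inferred from the model name "lora-r16"
        lora_alpha=32,   # assumption, not stated in the card
        lora_dropout=0.1,  # assumption, not stated in the card
    )
    return args, lora
```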

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Quantity Precision | Quantity Recall | Quantity F1 | Quantity Number | Time Precision | Time Recall | Time F1 | Time Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.1575 | 1.0 | 106 | 0.7082 | 0.0 | 0.0 | 0.0 | 69 | 0.0 | 0.0 | 0.0 | 58 | 0.0 | 0.0 | 0.0 | 152 | 0.0 | 0.0 | 0.0 | 30 | 0.0 | 0.0 | 0.0 | 29 | 0.0 | 0.0 | 0.0 | 0.8397 |
| 0.6803 | 2.0 | 212 | 0.5858 | 0.0 | 0.0 | 0.0 | 69 | 0.0 | 0.0 | 0.0 | 58 | 0.2 | 0.0066 | 0.0127 | 152 | 0.0 | 0.0 | 0.0 | 30 | 0.0 | 0.0 | 0.0 | 29 | 0.1429 | 0.0030 | 0.0058 | 0.8410 |
| 0.5602 | 3.0 | 318 | 0.4571 | 0.2 | 0.0145 | 0.0270 | 69 | 0.0 | 0.0 | 0.0 | 58 | 0.3 | 0.1579 | 0.2069 | 152 | 0.0 | 0.0 | 0.0 | 30 | 0.1429 | 0.0690 | 0.0930 | 29 | 0.27 | 0.0799 | 0.1233 | 0.8612 |
| 0.4308 | 4.0 | 424 | 0.3336 | 0.4 | 0.2899 | 0.3361 | 69 | 0.2 | 0.1552 | 0.1748 | 58 | 0.5309 | 0.6776 | 0.5954 | 152 | 0.2941 | 0.1667 | 0.2128 | 30 | 0.4815 | 0.4483 | 0.4643 | 29 | 0.4505 | 0.4438 | 0.4471 | 0.9047 |
| 0.3238 | 5.0 | 530 | 0.2585 | 0.5488 | 0.6522 | 0.5960 | 69 | 0.3276 | 0.3276 | 0.3276 | 58 | 0.6736 | 0.8553 | 0.7536 | 152 | 0.3611 | 0.4333 | 0.3939 | 30 | 0.5143 | 0.6207 | 0.5625 | 29 | 0.5569 | 0.6657 | 0.6065 | 0.9265 |
| 0.2653 | 6.0 | 636 | 0.2287 | 0.5870 | 0.7826 | 0.6708 | 69 | 0.4118 | 0.4828 | 0.4444 | 58 | 0.6802 | 0.8816 | 0.7679 | 152 | 0.4242 | 0.4667 | 0.4444 | 30 | 0.6061 | 0.6897 | 0.6452 | 29 | 0.5910 | 0.7396 | 0.6570 | 0.9309 |
| 0.2286 | 7.0 | 742 | 0.1973 | 0.6705 | 0.8551 | 0.7516 | 69 | 0.4722 | 0.5862 | 0.5231 | 58 | 0.7433 | 0.9145 | 0.8201 | 152 | 0.5 | 0.6 | 0.5455 | 30 | 0.6176 | 0.7241 | 0.6667 | 29 | 0.6499 | 0.8018 | 0.7179 | 0.9393 |
| 0.2052 | 8.0 | 848 | 0.1775 | 0.6742 | 0.8696 | 0.7595 | 69 | 0.5122 | 0.7241 | 0.6000 | 58 | 0.7514 | 0.8947 | 0.8168 | 152 | 0.4857 | 0.5667 | 0.5231 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.6683 | 0.8284 | 0.7398 | 0.9431 |
| 0.1891 | 9.0 | 954 | 0.1697 | 0.7045 | 0.8986 | 0.7898 | 69 | 0.5375 | 0.7414 | 0.6232 | 58 | 0.7403 | 0.8816 | 0.8048 | 152 | 0.5429 | 0.6333 | 0.5846 | 30 | 0.7273 | 0.8276 | 0.7742 | 29 | 0.6763 | 0.8343 | 0.7470 | 0.9442 |
| 0.176 | 10.0 | 1060 | 0.1571 | 0.7470 | 0.8986 | 0.8158 | 69 | 0.5294 | 0.7759 | 0.6294 | 58 | 0.7644 | 0.875 | 0.8160 | 152 | 0.5641 | 0.7333 | 0.6377 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.6949 | 0.8491 | 0.7643 | 0.9498 |
| 0.1684 | 11.0 | 1166 | 0.1611 | 0.7386 | 0.9420 | 0.8280 | 69 | 0.5301 | 0.7586 | 0.6241 | 58 | 0.7268 | 0.875 | 0.7940 | 152 | 0.6154 | 0.8 | 0.6957 | 30 | 0.7222 | 0.8966 | 0.8000 | 29 | 0.6807 | 0.8639 | 0.7614 | 0.9465 |
| 0.1609 | 12.0 | 1272 | 0.1457 | 0.7558 | 0.9420 | 0.8387 | 69 | 0.5811 | 0.7414 | 0.6515 | 58 | 0.7746 | 0.8816 | 0.8246 | 152 | 0.6857 | 0.8 | 0.7385 | 30 | 0.7941 | 0.9310 | 0.8571 | 29 | 0.7289 | 0.8669 | 0.7919 | 0.9542 |
| 0.1516 | 13.0 | 1378 | 0.1377 | 0.7531 | 0.8841 | 0.8133 | 69 | 0.5867 | 0.7586 | 0.6617 | 58 | 0.7988 | 0.8882 | 0.8411 | 152 | 0.6111 | 0.7333 | 0.6667 | 30 | 0.8182 | 0.9310 | 0.8710 | 29 | 0.7335 | 0.8550 | 0.7896 | 0.9557 |
| 0.147 | 14.0 | 1484 | 0.1513 | 0.7805 | 0.9275 | 0.8477 | 69 | 0.4945 | 0.7759 | 0.6040 | 58 | 0.7907 | 0.8947 | 0.8395 | 152 | 0.7027 | 0.8667 | 0.7761 | 30 | 0.7429 | 0.8966 | 0.8125 | 29 | 0.7122 | 0.8787 | 0.7868 | 0.9506 |
| 0.1421 | 15.0 | 1590 | 0.1350 | 0.8101 | 0.9275 | 0.8649 | 69 | 0.5946 | 0.7586 | 0.6667 | 58 | 0.8133 | 0.8882 | 0.8491 | 152 | 0.6571 | 0.7667 | 0.7077 | 30 | 0.8125 | 0.8966 | 0.8525 | 29 | 0.7565 | 0.8639 | 0.8066 | 0.9580 |
| 0.1368 | 16.0 | 1696 | 0.1334 | 0.7901 | 0.9275 | 0.8533 | 69 | 0.5556 | 0.6897 | 0.6154 | 58 | 0.8036 | 0.8882 | 0.8437 | 152 | 0.6471 | 0.7333 | 0.6875 | 30 | 0.8710 | 0.9310 | 0.9 | 29 | 0.7461 | 0.8521 | 0.7956 | 0.9577 |
| 0.1349 | 17.0 | 1802 | 0.1446 | 0.7529 | 0.9275 | 0.8312 | 69 | 0.5366 | 0.7586 | 0.6286 | 58 | 0.7988 | 0.8882 | 0.8411 | 152 | 0.6410 | 0.8333 | 0.7246 | 30 | 0.7059 | 0.8276 | 0.7619 | 29 | 0.7139 | 0.8639 | 0.7818 | 0.9521 |
| 0.1331 | 18.0 | 1908 | 0.1384 | 0.7711 | 0.9275 | 0.8421 | 69 | 0.5714 | 0.7586 | 0.6519 | 58 | 0.7861 | 0.8947 | 0.8369 | 152 | 0.6579 | 0.8333 | 0.7353 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7284 | 0.8728 | 0.7941 | 0.9547 |
| 0.1281 | 19.0 | 2014 | 0.1298 | 0.7875 | 0.9130 | 0.8456 | 69 | 0.6027 | 0.7586 | 0.6718 | 58 | 0.8047 | 0.8947 | 0.8474 | 152 | 0.6579 | 0.8333 | 0.7353 | 30 | 0.8182 | 0.9310 | 0.8710 | 29 | 0.7506 | 0.8728 | 0.8071 | 0.9580 |
| 0.1238 | 20.0 | 2120 | 0.1383 | 0.7590 | 0.9130 | 0.8289 | 69 | 0.5789 | 0.7586 | 0.6567 | 58 | 0.8012 | 0.9013 | 0.8483 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7143 | 0.8621 | 0.7813 | 29 | 0.7270 | 0.8669 | 0.7908 | 0.9549 |
| 0.1206 | 21.0 | 2226 | 0.1410 | 0.7711 | 0.9275 | 0.8421 | 69 | 0.5476 | 0.7931 | 0.6479 | 58 | 0.8059 | 0.9013 | 0.8509 | 152 | 0.6579 | 0.8333 | 0.7353 | 30 | 0.7429 | 0.8966 | 0.8125 | 29 | 0.7268 | 0.8817 | 0.7968 | 0.9544 |
| 0.1203 | 22.0 | 2332 | 0.1311 | 0.7590 | 0.9130 | 0.8289 | 69 | 0.55 | 0.7586 | 0.6377 | 58 | 0.8204 | 0.9013 | 0.8589 | 152 | 0.7222 | 0.8667 | 0.7879 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.74 | 0.8757 | 0.8022 | 0.9575 |
| 0.1186 | 23.0 | 2438 | 0.1342 | 0.8101 | 0.9275 | 0.8649 | 69 | 0.5921 | 0.7759 | 0.6716 | 58 | 0.7977 | 0.9079 | 0.8492 | 152 | 0.6667 | 0.8 | 0.7273 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7456 | 0.8757 | 0.8054 | 0.9572 |
| 0.1194 | 24.0 | 2544 | 0.1445 | 0.7901 | 0.9275 | 0.8533 | 69 | 0.6076 | 0.8276 | 0.7007 | 58 | 0.7667 | 0.9079 | 0.8313 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7059 | 0.8276 | 0.7619 | 29 | 0.7233 | 0.8817 | 0.7947 | 0.9539 |
| 0.1159 | 25.0 | 2650 | 0.1341 | 0.7683 | 0.9130 | 0.8344 | 69 | 0.5696 | 0.7759 | 0.6569 | 58 | 0.8107 | 0.9013 | 0.8536 | 152 | 0.6667 | 0.8 | 0.7273 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.735 | 0.8698 | 0.7967 | 0.9567 |
| 0.112 | 26.0 | 2756 | 0.1394 | 0.7619 | 0.9275 | 0.8366 | 69 | 0.575 | 0.7931 | 0.6667 | 58 | 0.8059 | 0.9013 | 0.8509 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7266 | 0.8728 | 0.7930 | 0.9554 |
| 0.1099 | 27.0 | 2862 | 0.1319 | 0.7561 | 0.8986 | 0.8212 | 69 | 0.5775 | 0.7069 | 0.6357 | 58 | 0.8274 | 0.9145 | 0.8687 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7417 | 0.8580 | 0.7956 | 0.9567 |
| 0.112 | 28.0 | 2968 | 0.1301 | 0.7683 | 0.9130 | 0.8344 | 69 | 0.5616 | 0.7069 | 0.6260 | 58 | 0.8323 | 0.9145 | 0.8715 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7411 | 0.8639 | 0.7978 | 0.9570 |
| 0.1073 | 29.0 | 3074 | 0.1324 | 0.7805 | 0.9275 | 0.8477 | 69 | 0.5556 | 0.6897 | 0.6154 | 58 | 0.8383 | 0.9211 | 0.8777 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.75 | 0.8276 | 0.7869 | 29 | 0.7442 | 0.8609 | 0.7984 | 0.9565 |
| 0.1072 | 30.0 | 3180 | 0.1271 | 0.7778 | 0.9130 | 0.84 | 69 | 0.5753 | 0.7241 | 0.6412 | 58 | 0.8204 | 0.9013 | 0.8589 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7398 | 0.8580 | 0.7945 | 0.9570 |
| 0.1054 | 31.0 | 3286 | 0.1269 | 0.7875 | 0.9130 | 0.8456 | 69 | 0.6087 | 0.7241 | 0.6614 | 58 | 0.8023 | 0.9079 | 0.8519 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7462 | 0.8609 | 0.7995 | 0.9575 |
| 0.1036 | 32.0 | 3392 | 0.1220 | 0.7949 | 0.8986 | 0.8435 | 69 | 0.6232 | 0.7414 | 0.6772 | 58 | 0.8155 | 0.9013 | 0.8562 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7578 | 0.8609 | 0.8061 | 0.9590 |
| 0.1035 | 33.0 | 3498 | 0.1218 | 0.7848 | 0.8986 | 0.8378 | 69 | 0.6087 | 0.7241 | 0.6614 | 58 | 0.8373 | 0.9145 | 0.8742 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7604 | 0.8639 | 0.8089 | 0.9598 |
| 0.0989 | 34.0 | 3604 | 0.1317 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6 | 0.7759 | 0.6767 | 58 | 0.8758 | 0.9276 | 0.9010 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7273 | 0.8276 | 0.7742 | 29 | 0.7694 | 0.8787 | 0.8204 | 0.9580 |
| 0.0979 | 35.0 | 3710 | 0.1354 | 0.7778 | 0.9130 | 0.84 | 69 | 0.6081 | 0.7759 | 0.6818 | 58 | 0.8129 | 0.9145 | 0.8607 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7431 | 0.8728 | 0.8027 | 0.9562 |
| 0.0983 | 36.0 | 3816 | 0.1285 | 0.7949 | 0.8986 | 0.8435 | 69 | 0.6111 | 0.7586 | 0.6769 | 58 | 0.8047 | 0.8947 | 0.8474 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7273 | 0.8276 | 0.7742 | 29 | 0.7436 | 0.8580 | 0.7967 | 0.9575 |
| 0.1003 | 37.0 | 3922 | 0.1267 | 0.7590 | 0.9130 | 0.8289 | 69 | 0.5890 | 0.7414 | 0.6565 | 58 | 0.8214 | 0.9079 | 0.8625 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7273 | 0.8276 | 0.7742 | 29 | 0.7392 | 0.8639 | 0.7967 | 0.9577 |
| 0.0933 | 38.0 | 4028 | 0.1350 | 0.7875 | 0.9130 | 0.8456 | 69 | 0.6104 | 0.8103 | 0.6963 | 58 | 0.8383 | 0.9211 | 0.8777 | 152 | 0.6410 | 0.8333 | 0.7246 | 30 | 0.75 | 0.8276 | 0.7869 | 29 | 0.7570 | 0.8846 | 0.8158 | 0.9580 |
| 0.0963 | 39.0 | 4134 | 0.1293 | 0.7875 | 0.9130 | 0.8456 | 69 | 0.6197 | 0.7586 | 0.6822 | 58 | 0.8012 | 0.9013 | 0.8483 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7455 | 0.8669 | 0.8016 | 0.9577 |
| 0.0935 | 40.0 | 4240 | 0.1271 | 0.7949 | 0.8986 | 0.8435 | 69 | 0.6286 | 0.7586 | 0.6875 | 58 | 0.8274 | 0.9145 | 0.8687 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7597 | 0.8698 | 0.8110 | 0.9588 |
| 0.0936 | 41.0 | 4346 | 0.1237 | 0.8052 | 0.8986 | 0.8493 | 69 | 0.6164 | 0.7759 | 0.6870 | 58 | 0.8434 | 0.9211 | 0.8805 | 152 | 0.6667 | 0.8 | 0.7273 | 30 | 0.8065 | 0.8621 | 0.8333 | 29 | 0.7728 | 0.8757 | 0.8211 | 0.9598 |
| 0.091 | 42.0 | 4452 | 0.1286 | 0.775 | 0.8986 | 0.8322 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8476 | 0.9145 | 0.8797 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7623 | 0.8728 | 0.8138 | 0.9590 |
| 0.0923 | 43.0 | 4558 | 0.1299 | 0.7875 | 0.9130 | 0.8456 | 69 | 0.6081 | 0.7759 | 0.6818 | 58 | 0.8274 | 0.9145 | 0.8687 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7551 | 0.8757 | 0.8110 | 0.9580 |
| 0.0919 | 44.0 | 4664 | 0.1245 | 0.775 | 0.8986 | 0.8322 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8155 | 0.9013 | 0.8562 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7532 | 0.8669 | 0.8061 | 0.9588 |
| 0.0889 | 45.0 | 4770 | 0.1310 | 0.775 | 0.8986 | 0.8322 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7623 | 0.8728 | 0.8138 | 0.9588 |
| 0.0884 | 46.0 | 4876 | 0.1276 | 0.775 | 0.8986 | 0.8322 | 69 | 0.6 | 0.7759 | 0.6767 | 58 | 0.8466 | 0.9079 | 0.8762 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7558 | 0.8698 | 0.8088 | 0.9583 |
| 0.0887 | 47.0 | 4982 | 0.1309 | 0.7848 | 0.8986 | 0.8378 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8571 | 0.9079 | 0.8818 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7650 | 0.8669 | 0.8128 | 0.9588 |
| 0.0858 | 48.0 | 5088 | 0.1291 | 0.7848 | 0.8986 | 0.8378 | 69 | 0.6522 | 0.7759 | 0.7087 | 58 | 0.8263 | 0.9079 | 0.8652 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7617 | 0.8698 | 0.8122 | 0.9585 |
| 0.0861 | 49.0 | 5194 | 0.1337 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6216 | 0.7931 | 0.6970 | 58 | 0.8434 | 0.9211 | 0.8805 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7661 | 0.8817 | 0.8198 | 0.9580 |
| 0.0859 | 50.0 | 5300 | 0.1189 | 0.8026 | 0.8841 | 0.8414 | 69 | 0.6515 | 0.7414 | 0.6935 | 58 | 0.8405 | 0.9013 | 0.8698 | 152 | 0.6765 | 0.7667 | 0.7188 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7790 | 0.8550 | 0.8152 | 0.9601 |
| 0.0849 | 51.0 | 5406 | 0.1275 | 0.775 | 0.8986 | 0.8322 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8580 | 0.9145 | 0.8854 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7682 | 0.8728 | 0.8172 | 0.9593 |
| 0.0817 | 52.0 | 5512 | 0.1322 | 0.7949 | 0.8986 | 0.8435 | 69 | 0.6164 | 0.7759 | 0.6870 | 58 | 0.8545 | 0.9276 | 0.8896 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7273 | 0.8276 | 0.7742 | 29 | 0.7649 | 0.8757 | 0.8166 | 0.9588 |
| 0.0846 | 53.0 | 5618 | 0.1272 | 0.7848 | 0.8986 | 0.8378 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.8065 | 0.8621 | 0.8333 | 29 | 0.7723 | 0.8728 | 0.8194 | 0.9603 |
| 0.0834 | 54.0 | 5724 | 0.1267 | 0.7683 | 0.9130 | 0.8344 | 69 | 0.6197 | 0.7586 | 0.6822 | 58 | 0.8537 | 0.9211 | 0.8861 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7649 | 0.8757 | 0.8166 | 0.9595 |
| 0.0826 | 55.0 | 5830 | 0.1322 | 0.8101 | 0.9275 | 0.8649 | 69 | 0.6164 | 0.7759 | 0.6870 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.75 | 0.8276 | 0.7869 | 29 | 0.7649 | 0.8757 | 0.8166 | 0.9590 |
| 0.0823 | 56.0 | 5936 | 0.1269 | 0.7778 | 0.9130 | 0.84 | 69 | 0.6562 | 0.7241 | 0.6885 | 58 | 0.8537 | 0.9211 | 0.8861 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7757 | 0.8698 | 0.8201 | 0.9603 |
| 0.0793 | 57.0 | 6042 | 0.1286 | 0.7778 | 0.9130 | 0.84 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8537 | 0.9211 | 0.8861 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7694 | 0.8787 | 0.8204 | 0.9595 |
| 0.0788 | 58.0 | 6148 | 0.1271 | 0.7949 | 0.8986 | 0.8435 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7688 | 0.8757 | 0.8188 | 0.9598 |
| 0.0778 | 59.0 | 6254 | 0.1287 | 0.7778 | 0.9130 | 0.84 | 69 | 0.6164 | 0.7759 | 0.6870 | 58 | 0.8383 | 0.9211 | 0.8777 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7551 | 0.8757 | 0.8110 | 0.9585 |
| 0.0786 | 60.0 | 6360 | 0.1348 | 0.7683 | 0.9130 | 0.8344 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7590 | 0.8757 | 0.8132 | 0.9583 |
| 0.0782 | 61.0 | 6466 | 0.1307 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7668 | 0.8757 | 0.8177 | 0.9590 |
| 0.0782 | 62.0 | 6572 | 0.1289 | 0.7654 | 0.8986 | 0.8267 | 69 | 0.5972 | 0.7414 | 0.6615 | 58 | 0.8204 | 0.9013 | 0.8589 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7417 | 0.8580 | 0.7956 | 0.9570 |
| 0.0774 | 63.0 | 6678 | 0.1315 | 0.7778 | 0.9130 | 0.84 | 69 | 0.6143 | 0.7414 | 0.6719 | 58 | 0.8274 | 0.9145 | 0.8687 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7513 | 0.8669 | 0.8049 | 0.9572 |
| 0.0782 | 64.0 | 6784 | 0.1349 | 0.7778 | 0.9130 | 0.84 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8393 | 0.9276 | 0.8812 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7621 | 0.8817 | 0.8176 | 0.9585 |
| 0.0779 | 65.0 | 6890 | 0.1343 | 0.7683 | 0.9130 | 0.8344 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7635 | 0.8787 | 0.8171 | 0.9590 |
| 0.0769 | 66.0 | 6996 | 0.1354 | 0.7683 | 0.9130 | 0.8344 | 69 | 0.6027 | 0.7586 | 0.6718 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7570 | 0.8757 | 0.8121 | 0.9580 |
| 0.0738 | 67.0 | 7102 | 0.1397 | 0.7901 | 0.9275 | 0.8533 | 69 | 0.6081 | 0.7759 | 0.6818 | 58 | 0.8383 | 0.9211 | 0.8777 | 152 | 0.6579 | 0.8333 | 0.7353 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7608 | 0.8846 | 0.8181 | 0.9577 |
| 0.0758 | 68.0 | 7208 | 0.1368 | 0.7590 | 0.9130 | 0.8289 | 69 | 0.5946 | 0.7586 | 0.6667 | 58 | 0.8598 | 0.9276 | 0.8924 | 152 | 0.6410 | 0.8333 | 0.7246 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7583 | 0.8817 | 0.8153 | 0.9580 |
| 0.0772 | 69.0 | 7314 | 0.1326 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6286 | 0.7586 | 0.6875 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6667 | 0.8 | 0.7273 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7717 | 0.8698 | 0.8178 | 0.9595 |
| 0.0733 | 70.0 | 7420 | 0.1326 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6286 | 0.7586 | 0.6875 | 58 | 0.8494 | 0.9276 | 0.8868 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7702 | 0.8728 | 0.8183 | 0.9593 |
| 0.0744 | 71.0 | 7526 | 0.1298 | 0.7848 | 0.8986 | 0.8378 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8598 | 0.9276 | 0.8924 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7749 | 0.8757 | 0.8222 | 0.9606 |
| 0.0748 | 72.0 | 7632 | 0.1274 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6232 | 0.7414 | 0.6772 | 58 | 0.8476 | 0.9145 | 0.8797 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7658 | 0.8609 | 0.8106 | 0.9593 |
| 0.0741 | 73.0 | 7738 | 0.1280 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6232 | 0.7414 | 0.6772 | 58 | 0.8253 | 0.9013 | 0.8616 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7565 | 0.8550 | 0.8028 | 0.9588 |
| 0.0743 | 74.0 | 7844 | 0.1388 | 0.7778 | 0.9130 | 0.84 | 69 | 0.6164 | 0.7759 | 0.6870 | 58 | 0.8434 | 0.9211 | 0.8805 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7615 | 0.8787 | 0.8159 | 0.9580 |
| 0.0726 | 75.0 | 7950 | 0.1335 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8589 | 0.9211 | 0.8889 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7717 | 0.8698 | 0.8178 | 0.9590 |
| 0.0723 | 76.0 | 8056 | 0.1318 | 0.7722 | 0.8841 | 0.8243 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7670 | 0.8669 | 0.8139 | 0.9588 |
| 0.0727 | 77.0 | 8162 | 0.1338 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6471 | 0.7586 | 0.6984 | 58 | 0.8434 | 0.9211 | 0.8805 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7690 | 0.8669 | 0.8150 | 0.9590 |
| 0.0732 | 78.0 | 8268 | 0.1336 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8373 | 0.9145 | 0.8742 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7650 | 0.8669 | 0.8128 | 0.9590 |
| 0.0725 | 79.0 | 8374 | 0.1303 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6471 | 0.7586 | 0.6984 | 58 | 0.8466 | 0.9079 | 0.8762 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7698 | 0.8609 | 0.8128 | 0.9595 |
| 0.0718 | 80.0 | 8480 | 0.1375 | 0.7778 | 0.9130 | 0.84 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8434 | 0.9211 | 0.8805 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7609 | 0.8757 | 0.8143 | 0.9585 |
| 0.0722 | 81.0 | 8586 | 0.1344 | 0.7848 | 0.8986 | 0.8378 | 69 | 0.6522 | 0.7759 | 0.7087 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6154 | 0.8 | 0.6957 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7682 | 0.8728 | 0.8172 | 0.9590 |
| 0.0707 | 82.0 | 8692 | 0.1353 | 0.7848 | 0.8986 | 0.8378 | 69 | 0.6522 | 0.7759 | 0.7087 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7702 | 0.8728 | 0.8183 | 0.9595 |
| 0.0717 | 83.0 | 8798 | 0.1314 | 0.7722 | 0.8841 | 0.8243 | 69 | 0.6418 | 0.7414 | 0.688 | 58 | 0.8303 | 0.9013 | 0.8644 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7605 | 0.8550 | 0.8050 | 0.9580 |
| 0.0715 | 84.0 | 8904 | 0.1348 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6522 | 0.7759 | 0.7087 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7717 | 0.8698 | 0.8178 | 0.9595 |
| 0.0708 | 85.0 | 9010 | 0.1344 | 0.7848 | 0.8986 | 0.8378 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8476 | 0.9145 | 0.8797 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7682 | 0.8728 | 0.8172 | 0.9590 |
| 0.0712 | 86.0 | 9116 | 0.1333 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6471 | 0.7586 | 0.6984 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7711 | 0.8669 | 0.8162 | 0.9590 |
| 0.0702 | 87.0 | 9222 | 0.1361 | 0.7848 | 0.8986 | 0.8378 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8434 | 0.9211 | 0.8805 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7708 | 0.8757 | 0.8199 | 0.9595 |
| 0.0681 | 88.0 | 9328 | 0.1335 | 0.775 | 0.8986 | 0.8322 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7682 | 0.8728 | 0.8172 | 0.9595 |
| 0.0693 | 89.0 | 9434 | 0.1345 | 0.7722 | 0.8841 | 0.8243 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7676 | 0.8698 | 0.8155 | 0.9590 |
| 0.0674 | 90.0 | 9540 | 0.1362 | 0.7722 | 0.8841 | 0.8243 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7682 | 0.8728 | 0.8172 | 0.9590 |
| 0.0703 | 91.0 | 9646 | 0.1353 | 0.7722 | 0.8841 | 0.8243 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8485 | 0.9211 | 0.8833 | 152 | 0.6316 | 0.8 | 0.7059 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7682 | 0.8728 | 0.8172 | 0.9588 |
| 0.0659 | 92.0 | 9752 | 0.1349 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7670 | 0.8669 | 0.8139 | 0.9588 |
| 0.0674 | 93.0 | 9858 | 0.1367 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6164 | 0.7759 | 0.6870 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6154 | 0.8 | 0.6957 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7597 | 0.8698 | 0.8110 | 0.9577 |
| 0.0673 | 94.0 | 9964 | 0.1321 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8476 | 0.9145 | 0.8797 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7684 | 0.8639 | 0.8134 | 0.9590 |
| 0.07 | 95.0 | 10070 | 0.1335 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7664 | 0.8639 | 0.8122 | 0.9588 |
| 0.0677 | 96.0 | 10176 | 0.1356 | 0.7722 | 0.8841 | 0.8243 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.5897 | 0.7667 | 0.6667 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7584 | 0.8639 | 0.8077 | 0.9575 |
| 0.0715 | 97.0 | 10282 | 0.1343 | 0.7821 | 0.8841 | 0.8299 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7644 | 0.8639 | 0.8111 | 0.9583 |
| 0.0664 | 98.0 | 10388 | 0.1353 | 0.7722 | 0.8841 | 0.8243 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6154 | 0.8 | 0.6957 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7610 | 0.8669 | 0.8105 | 0.9577 |
| 0.0666 | 99.0 | 10494 | 0.1354 | 0.7722 | 0.8841 | 0.8243 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6154 | 0.8 | 0.6957 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7610 | 0.8669 | 0.8105 | 0.9577 |
| 0.0675 | 100.0 | 10600 | 0.1358 | 0.7722 | 0.8841 | 0.8243 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6154 | 0.8 | 0.6957 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7610 | 0.8669 | 0.8105 | 0.9577 |
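Note that the final epoch is not the best one in this log: validation loss bottoms out at epoch 50 (0.1189), while overall F1 peaks at epoch 71 (0.8222); the reported numbers are from the last checkpoint (epoch 100). A small sketch of picking a checkpoint, using a few (epoch, validation loss, overall F1) rows copied from the table:

```python
# (epoch, validation_loss, overall_f1) for selected rows of the log above.
history = [
    (34, 0.1317, 0.8204),
    (50, 0.1189, 0.8152),
    (71, 0.1298, 0.8222),
    (100, 0.1358, 0.8105),
]

best_by_loss = min(history, key=lambda row: row[1])  # epoch 50
best_by_f1 = max(history, key=lambda row: row[2])    # epoch 71
```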

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2