ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k16_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9839
  • QWK (quadratic weighted kappa): 0.6600
  • MSE: 0.9839
  • RMSE: 0.9919
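For readers unfamiliar with the metrics: QWK (quadratic weighted kappa) measures agreement on ordinal labels with a quadratic penalty for larger disagreements, and RMSE is simply the square root of MSE (0.9919 ≈ √0.9839 above). A minimal pure-Python sketch of both, with illustrative function names not taken from this repository:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    n = len(y_true)
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2     # quadratic disagreement penalty
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement from marginals
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Perfect agreement yields QWK = 1.0 and MSE = 0; near-misses on an ordinal scale are penalized far less than distant ones, which is why QWK is a common choice for essay-scoring tasks like this one.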

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
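The hyperparameters above map directly onto the Hugging Face `TrainingArguments` API. A minimal sketch, assuming a standard `Trainer` setup; `output_dir` is a placeholder, not the path used for this run:

```python
from transformers import TrainingArguments

# Sketch reproducing the listed hyperparameters (placeholder output_dir).
training_args = TrainingArguments(
    output_dir="./results",           # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",       # linear decay, per the card
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```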

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0333 2 2.4157 0.0274 2.4157 1.5543
No log 0.0667 4 1.5151 0.2201 1.5151 1.2309
No log 0.1 6 1.4033 0.1357 1.4033 1.1846
No log 0.1333 8 1.5682 0.1289 1.5682 1.2523
No log 0.1667 10 1.6320 0.2036 1.6320 1.2775
No log 0.2 12 1.6296 0.2556 1.6296 1.2765
No log 0.2333 14 1.5266 0.2630 1.5266 1.2355
No log 0.2667 16 1.3791 0.1762 1.3791 1.1744
No log 0.3 18 1.2996 0.2311 1.2996 1.1400
No log 0.3333 20 1.2822 0.2997 1.2822 1.1323
No log 0.3667 22 1.2363 0.2846 1.2363 1.1119
No log 0.4 24 1.2218 0.2865 1.2218 1.1054
No log 0.4333 26 1.3507 0.3560 1.3507 1.1622
No log 0.4667 28 1.5517 0.3758 1.5517 1.2457
No log 0.5 30 1.6100 0.3359 1.6100 1.2689
No log 0.5333 32 1.5186 0.3227 1.5186 1.2323
No log 0.5667 34 1.3769 0.3402 1.3769 1.1734
No log 0.6 36 1.2383 0.3077 1.2383 1.1128
No log 0.6333 38 1.1778 0.2820 1.1778 1.0853
No log 0.6667 40 1.1980 0.3196 1.1980 1.0945
No log 0.7 42 1.1602 0.3055 1.1602 1.0771
No log 0.7333 44 1.1560 0.3800 1.1560 1.0752
No log 0.7667 46 1.2049 0.4444 1.2049 1.0977
No log 0.8 48 1.1669 0.4357 1.1669 1.0802
No log 0.8333 50 1.0859 0.3731 1.0859 1.0421
No log 0.8667 52 1.0622 0.4491 1.0622 1.0306
No log 0.9 54 1.0449 0.4640 1.0449 1.0222
No log 0.9333 56 1.0385 0.4889 1.0385 1.0191
No log 0.9667 58 1.0234 0.5125 1.0234 1.0116
No log 1.0 60 1.0137 0.4863 1.0137 1.0068
No log 1.0333 62 1.0091 0.4484 1.0091 1.0046
No log 1.0667 64 1.0141 0.4815 1.0141 1.0070
No log 1.1 66 1.0328 0.5184 1.0328 1.0163
No log 1.1333 68 1.0401 0.4318 1.0401 1.0198
No log 1.1667 70 1.0344 0.4391 1.0344 1.0171
No log 1.2 72 1.0045 0.4971 1.0045 1.0023
No log 1.2333 74 1.0202 0.5187 1.0202 1.0100
No log 1.2667 76 1.0422 0.5232 1.0422 1.0209
No log 1.3 78 1.0609 0.4566 1.0609 1.0300
No log 1.3333 80 1.0857 0.4387 1.0857 1.0420
No log 1.3667 82 1.0548 0.4259 1.0548 1.0270
No log 1.4 84 1.0490 0.4629 1.0490 1.0242
No log 1.4333 86 1.0677 0.5104 1.0677 1.0333
No log 1.4667 88 1.0210 0.5048 1.0210 1.0105
No log 1.5 90 1.0025 0.5863 1.0025 1.0012
No log 1.5333 92 0.9303 0.5526 0.9303 0.9645
No log 1.5667 94 0.9472 0.5524 0.9472 0.9732
No log 1.6 96 1.0941 0.5673 1.0941 1.0460
No log 1.6333 98 1.3501 0.5113 1.3501 1.1619
No log 1.6667 100 1.3865 0.5121 1.3865 1.1775
No log 1.7 102 1.3482 0.5495 1.3482 1.1611
No log 1.7333 104 1.2064 0.5724 1.2064 1.0984
No log 1.7667 106 1.1595 0.5694 1.1595 1.0768
No log 1.8 108 1.0940 0.6086 1.0940 1.0459
No log 1.8333 110 0.8698 0.6314 0.8698 0.9327
No log 1.8667 112 0.8032 0.6330 0.8032 0.8962
No log 1.9 114 0.8443 0.6443 0.8443 0.9188
No log 1.9333 116 0.9767 0.6385 0.9767 0.9883
No log 1.9667 118 1.1580 0.5868 1.1580 1.0761
No log 2.0 120 1.3615 0.5644 1.3615 1.1668
No log 2.0333 122 1.4609 0.5578 1.4609 1.2087
No log 2.0667 124 1.4484 0.5783 1.4484 1.2035
No log 2.1 126 1.3015 0.5987 1.3015 1.1408
No log 2.1333 128 1.0736 0.6166 1.0736 1.0362
No log 2.1667 130 0.9967 0.6354 0.9967 0.9983
No log 2.2 132 1.1695 0.6258 1.1695 1.0814
No log 2.2333 134 1.5596 0.5841 1.5596 1.2488
No log 2.2667 136 1.9736 0.4646 1.9736 1.4048
No log 2.3 138 1.9037 0.3457 1.9037 1.3798
No log 2.3333 140 1.8486 0.3265 1.8486 1.3596
No log 2.3667 142 1.8199 0.3732 1.8199 1.3490
No log 2.4 144 1.7986 0.4945 1.7986 1.3411
No log 2.4333 146 1.5133 0.5368 1.5133 1.2302
No log 2.4667 148 1.1380 0.5873 1.1380 1.0668
No log 2.5 150 0.8558 0.6330 0.8558 0.9251
No log 2.5333 152 0.8015 0.6219 0.8015 0.8953
No log 2.5667 154 0.8335 0.6087 0.8335 0.9130
No log 2.6 156 0.9986 0.5876 0.9986 0.9993
No log 2.6333 158 1.1777 0.5671 1.1777 1.0852
No log 2.6667 160 1.3090 0.5298 1.3090 1.1441
No log 2.7 162 1.3046 0.5561 1.3046 1.1422
No log 2.7333 164 1.1876 0.5629 1.1876 1.0898
No log 2.7667 166 1.0624 0.5724 1.0624 1.0307
No log 2.8 168 1.0383 0.5821 1.0383 1.0190
No log 2.8333 170 0.9524 0.6114 0.9524 0.9759
No log 2.8667 172 0.7771 0.6481 0.7771 0.8815
No log 2.9 174 0.7095 0.6794 0.7095 0.8423
No log 2.9333 176 0.7327 0.6453 0.7327 0.8560
No log 2.9667 178 0.7551 0.6272 0.7551 0.8690
No log 3.0 180 0.9776 0.6024 0.9776 0.9887
No log 3.0333 182 1.3662 0.5847 1.3662 1.1689
No log 3.0667 184 1.3643 0.5405 1.3643 1.1680
No log 3.1 186 1.0894 0.5819 1.0894 1.0437
No log 3.1333 188 0.8912 0.5984 0.8912 0.9440
No log 3.1667 190 0.8519 0.6077 0.8519 0.9230
No log 3.2 192 0.8651 0.5980 0.8651 0.9301
No log 3.2333 194 1.0166 0.5852 1.0166 1.0082
No log 3.2667 196 1.3292 0.5615 1.3292 1.1529
No log 3.3 198 1.3492 0.5674 1.3492 1.1616
No log 3.3333 200 1.0971 0.6016 1.0971 1.0474
No log 3.3667 202 0.8593 0.6644 0.8593 0.9270
No log 3.4 204 0.8078 0.6509 0.8078 0.8988
No log 3.4333 206 0.8369 0.6539 0.8369 0.9148
No log 3.4667 208 0.9323 0.6475 0.9323 0.9655
No log 3.5 210 0.9815 0.6300 0.9815 0.9907
No log 3.5333 212 1.0482 0.6176 1.0482 1.0238
No log 3.5667 214 1.0926 0.6099 1.0926 1.0453
No log 3.6 216 1.1546 0.5765 1.1546 1.0745
No log 3.6333 218 1.0095 0.6347 1.0095 1.0047
No log 3.6667 220 0.9005 0.6569 0.9005 0.9490
No log 3.7 222 0.8632 0.6811 0.8632 0.9291
No log 3.7333 224 0.8065 0.7168 0.8065 0.8981
No log 3.7667 226 0.8095 0.7052 0.8095 0.8997
No log 3.8 228 0.7798 0.6844 0.7798 0.8831
No log 3.8333 230 0.7234 0.6946 0.7234 0.8505
No log 3.8667 232 0.7410 0.6827 0.7410 0.8608
No log 3.9 234 0.8469 0.6218 0.8469 0.9202
No log 3.9333 236 1.0321 0.6033 1.0321 1.0159
No log 3.9667 238 1.1145 0.5569 1.1145 1.0557
No log 4.0 240 1.1350 0.5756 1.1350 1.0654
No log 4.0333 242 0.9821 0.6211 0.9821 0.9910
No log 4.0667 244 0.8560 0.6199 0.8560 0.9252
No log 4.1 246 0.8690 0.6265 0.8690 0.9322
No log 4.1333 248 0.9292 0.6250 0.9292 0.9639
No log 4.1667 250 0.9994 0.6437 0.9994 0.9997
No log 4.2 252 0.9400 0.6472 0.9400 0.9695
No log 4.2333 254 0.9180 0.6476 0.9180 0.9581
No log 4.2667 256 0.9289 0.6469 0.9289 0.9638
No log 4.3 258 0.8597 0.6833 0.8597 0.9272
No log 4.3333 260 0.7631 0.6544 0.7631 0.8735
No log 4.3667 262 0.7580 0.6544 0.7580 0.8707
No log 4.4 264 0.7883 0.6751 0.7883 0.8879
No log 4.4333 266 0.8963 0.6559 0.8963 0.9467
No log 4.4667 268 1.1381 0.5857 1.1381 1.0668
No log 4.5 270 1.2863 0.5842 1.2863 1.1341
No log 4.5333 272 1.2285 0.6009 1.2285 1.1084
No log 4.5667 274 1.0284 0.6231 1.0284 1.0141
No log 4.6 276 0.8519 0.6842 0.8519 0.9230
No log 4.6333 278 0.8138 0.6882 0.8138 0.9021
No log 4.6667 280 0.8702 0.6610 0.8702 0.9329
No log 4.7 282 0.9314 0.6542 0.9314 0.9651
No log 4.7333 284 0.8824 0.6748 0.8824 0.9393
No log 4.7667 286 0.8242 0.6852 0.8242 0.9078
No log 4.8 288 0.7672 0.6857 0.7672 0.8759
No log 4.8333 290 0.7353 0.7121 0.7353 0.8575
No log 4.8667 292 0.7315 0.7028 0.7315 0.8553
No log 4.9 294 0.7136 0.7151 0.7136 0.8448
No log 4.9333 296 0.7385 0.7216 0.7385 0.8593
No log 4.9667 298 0.8153 0.6859 0.8153 0.9029
No log 5.0 300 0.8701 0.6600 0.8701 0.9328
No log 5.0333 302 0.8750 0.6529 0.8750 0.9354
No log 5.0667 304 0.8663 0.6706 0.8663 0.9307
No log 5.1 306 0.9292 0.6415 0.9292 0.9639
No log 5.1333 308 1.0542 0.5985 1.0542 1.0268
No log 5.1667 310 1.0992 0.5985 1.0992 1.0484
No log 5.2 312 1.0303 0.6041 1.0303 1.0150
No log 5.2333 314 0.9207 0.6437 0.9207 0.9596
No log 5.2667 316 0.8635 0.6651 0.8635 0.9292
No log 5.3 318 0.8354 0.6825 0.8354 0.9140
No log 5.3333 320 0.9047 0.6620 0.9047 0.9511
No log 5.3667 322 1.0802 0.6016 1.0802 1.0393
No log 5.4 324 1.1412 0.5774 1.1412 1.0683
No log 5.4333 326 1.0947 0.5997 1.0947 1.0463
No log 5.4667 328 0.9341 0.6577 0.9341 0.9665
No log 5.5 330 0.7840 0.6954 0.7840 0.8855
No log 5.5333 332 0.7524 0.6980 0.7524 0.8674
No log 5.5667 334 0.7923 0.6954 0.7923 0.8901
No log 5.6 336 0.9105 0.6736 0.9105 0.9542
No log 5.6333 338 0.9536 0.6656 0.9536 0.9765
No log 5.6667 340 0.9205 0.6992 0.9205 0.9594
No log 5.7 342 0.8798 0.6766 0.8798 0.9380
No log 5.7333 344 0.8726 0.6813 0.8726 0.9341
No log 5.7667 346 0.8779 0.6599 0.8779 0.9370
No log 5.8 348 0.9416 0.6610 0.9416 0.9703
No log 5.8333 350 1.0403 0.6318 1.0403 1.0199
No log 5.8667 352 1.1721 0.5725 1.1721 1.0826
No log 5.9 354 1.2267 0.5573 1.2267 1.1076
No log 5.9333 356 1.2223 0.5777 1.2223 1.1056
No log 5.9667 358 1.1276 0.5946 1.1276 1.0619
No log 6.0 360 1.0602 0.6012 1.0602 1.0297
No log 6.0333 362 1.0447 0.6486 1.0447 1.0221
No log 6.0667 364 1.0207 0.6486 1.0207 1.0103
No log 6.1 366 0.9450 0.6393 0.9450 0.9721
No log 6.1333 368 0.8513 0.6503 0.8513 0.9227
No log 6.1667 370 0.7947 0.6513 0.7947 0.8914
No log 6.2 372 0.7574 0.6707 0.7574 0.8703
No log 6.2333 374 0.7244 0.6847 0.7244 0.8511
No log 6.2667 376 0.7255 0.7083 0.7255 0.8517
No log 6.3 378 0.7421 0.7074 0.7421 0.8614
No log 6.3333 380 0.7864 0.6705 0.7864 0.8868
No log 6.3667 382 0.8140 0.6510 0.8140 0.9022
No log 6.4 384 0.8457 0.6573 0.8457 0.9196
No log 6.4333 386 0.8695 0.6628 0.8695 0.9325
No log 6.4667 388 0.8989 0.6672 0.8989 0.9481
No log 6.5 390 0.9385 0.6672 0.9385 0.9688
No log 6.5333 392 0.9441 0.6672 0.9441 0.9716
No log 6.5667 394 0.9094 0.6524 0.9094 0.9536
No log 6.6 396 0.8445 0.6385 0.8445 0.9190
No log 6.6333 398 0.7809 0.6706 0.7809 0.8837
No log 6.6667 400 0.7284 0.7084 0.7284 0.8535
No log 6.7 402 0.7114 0.7234 0.7114 0.8434
No log 6.7333 404 0.7266 0.7081 0.7266 0.8524
No log 6.7667 406 0.7721 0.6751 0.7721 0.8787
No log 6.8 408 0.8464 0.6524 0.8464 0.9200
No log 6.8333 410 0.9225 0.6579 0.9225 0.9605
No log 6.8667 412 0.9582 0.6581 0.9582 0.9789
No log 6.9 414 0.9458 0.6689 0.9458 0.9725
No log 6.9333 416 0.8834 0.6619 0.8834 0.9399
No log 6.9667 418 0.8475 0.6546 0.8475 0.9206
No log 7.0 420 0.8218 0.6482 0.8218 0.9065
No log 7.0333 422 0.8317 0.6616 0.8317 0.9120
No log 7.0667 424 0.8861 0.6581 0.8861 0.9413
No log 7.1 426 0.9578 0.6350 0.9578 0.9787
No log 7.1333 428 1.0235 0.6243 1.0235 1.0117
No log 7.1667 430 1.0229 0.6257 1.0229 1.0114
No log 7.2 432 0.9786 0.6350 0.9786 0.9893
No log 7.2333 434 0.9327 0.6396 0.9327 0.9658
No log 7.2667 436 0.9072 0.6412 0.9072 0.9525
No log 7.3 438 0.8804 0.6456 0.8804 0.9383
No log 7.3333 440 0.8739 0.6546 0.8739 0.9348
No log 7.3667 442 0.8945 0.6581 0.8945 0.9458
No log 7.4 444 0.9113 0.6548 0.9113 0.9546
No log 7.4333 446 0.9389 0.6531 0.9389 0.9690
No log 7.4667 448 0.9909 0.6517 0.9909 0.9955
No log 7.5 450 1.0082 0.6600 1.0082 1.0041
No log 7.5333 452 0.9867 0.6722 0.9867 0.9933
No log 7.5667 454 0.9257 0.6723 0.9257 0.9621
No log 7.6 456 0.8629 0.6679 0.8629 0.9289
No log 7.6333 458 0.8131 0.6924 0.8131 0.9017
No log 7.6667 460 0.7764 0.6902 0.7764 0.8812
No log 7.7 462 0.7639 0.6965 0.7639 0.8740
No log 7.7333 464 0.7836 0.6795 0.7836 0.8852
No log 7.7667 466 0.8112 0.6714 0.8112 0.9007
No log 7.8 468 0.8340 0.6714 0.8340 0.9132
No log 7.8333 470 0.8409 0.6546 0.8409 0.9170
No log 7.8667 472 0.8446 0.6546 0.8446 0.9190
No log 7.9 474 0.8430 0.6546 0.8430 0.9181
No log 7.9333 476 0.8660 0.6546 0.8660 0.9306
No log 7.9667 478 0.8990 0.6513 0.8990 0.9482
No log 8.0 480 0.9278 0.6548 0.9278 0.9632
No log 8.0333 482 0.9255 0.6531 0.9255 0.9620
No log 8.0667 484 0.9057 0.6636 0.9057 0.9517
No log 8.1 486 0.8825 0.6636 0.8825 0.9394
No log 8.1333 488 0.8648 0.6546 0.8648 0.9300
No log 8.1667 490 0.8740 0.6636 0.8740 0.9349
No log 8.2 492 0.8766 0.6636 0.8766 0.9362
No log 8.2333 494 0.9019 0.6619 0.9019 0.9497
No log 8.2667 496 0.9424 0.6587 0.9424 0.9708
No log 8.3 498 0.9706 0.6506 0.9706 0.9852
0.3412 8.3333 500 0.9796 0.6506 0.9796 0.9897
0.3412 8.3667 502 0.9685 0.6506 0.9685 0.9841
0.3412 8.4 504 0.9321 0.6579 0.9321 0.9655
0.3412 8.4333 506 0.8920 0.6636 0.8920 0.9445
0.3412 8.4667 508 0.8622 0.6679 0.8622 0.9286
0.3412 8.5 510 0.8399 0.6590 0.8399 0.9165
0.3412 8.5333 512 0.8251 0.6590 0.8251 0.9084
0.3412 8.5667 514 0.8328 0.6590 0.8328 0.9126
0.3412 8.6 516 0.8391 0.6590 0.8391 0.9160
0.3412 8.6333 518 0.8308 0.6590 0.8308 0.9115
0.3412 8.6667 520 0.8333 0.6590 0.8333 0.9129
0.3412 8.7 522 0.8536 0.6590 0.8536 0.9239
0.3412 8.7333 524 0.8674 0.6546 0.8674 0.9313
0.3412 8.7667 526 0.8845 0.6449 0.8845 0.9405
0.3412 8.8 528 0.8910 0.6325 0.8910 0.9439
0.3412 8.8333 530 0.8934 0.6360 0.8934 0.9452
0.3412 8.8667 532 0.8757 0.6546 0.8757 0.9358
0.3412 8.9 534 0.8654 0.6546 0.8654 0.9303
0.3412 8.9333 536 0.8616 0.6546 0.8616 0.9282
0.3412 8.9667 538 0.8722 0.6546 0.8722 0.9339
0.3412 9.0 540 0.8887 0.6671 0.8887 0.9427
0.3412 9.0333 542 0.9176 0.6453 0.9176 0.9579
0.3412 9.0667 544 0.9444 0.6542 0.9444 0.9718
0.3412 9.1 546 0.9666 0.6506 0.9666 0.9832
0.3412 9.1333 548 0.9886 0.6506 0.9886 0.9943
0.3412 9.1667 550 1.0004 0.6506 1.0004 1.0002
0.3412 9.2 552 1.0154 0.6600 1.0154 1.0077
0.3412 9.2333 554 1.0181 0.6544 1.0181 1.0090
0.3412 9.2667 556 1.0042 0.6600 1.0042 1.0021
0.3412 9.3 558 0.9903 0.6600 0.9903 0.9952
0.3412 9.3333 560 0.9805 0.6542 0.9805 0.9902
0.3412 9.3667 562 0.9735 0.6542 0.9735 0.9866
0.3412 9.4 564 0.9622 0.6472 0.9622 0.9809
0.3412 9.4333 566 0.9523 0.6553 0.9523 0.9759
0.3412 9.4667 568 0.9481 0.6553 0.9481 0.9737
0.3412 9.5 570 0.9410 0.6553 0.9410 0.9701
0.3412 9.5333 572 0.9415 0.6553 0.9415 0.9703
0.3412 9.5667 574 0.9480 0.6553 0.9480 0.9736
0.3412 9.6 576 0.9513 0.6553 0.9513 0.9753
0.3412 9.6333 578 0.9560 0.6537 0.9560 0.9777
0.3412 9.6667 580 0.9596 0.6521 0.9596 0.9796
0.3412 9.7 582 0.9635 0.6521 0.9635 0.9816
0.3412 9.7333 584 0.9644 0.6521 0.9644 0.9820
0.3412 9.7667 586 0.9674 0.6457 0.9674 0.9835
0.3412 9.8 588 0.9707 0.6517 0.9707 0.9852
0.3412 9.8333 590 0.9749 0.6600 0.9749 0.9874
0.3412 9.8667 592 0.9788 0.6600 0.9788 0.9893
0.3412 9.9 594 0.9818 0.6600 0.9818 0.9908
0.3412 9.9333 596 0.9838 0.6600 0.9838 0.9919
0.3412 9.9667 598 0.9842 0.6600 0.9842 0.9921
0.3412 10.0 600 0.9839 0.6600 0.9839 0.9919

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
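With those versions (or newer), the checkpoint should load via the standard transformers auto classes. A sketch, assuming a sequence-classification head, which the QWK/MSE evaluation metrics suggest but the card does not confirm:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Repo id as listed in the model tree below; head type is an assumption.
model_id = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k16_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
```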

Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k16_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02