ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k17_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for recomputing these metrics follows the list):

  • Loss: 0.8635
  • QWK (quadratic weighted kappa): 0.3546
  • MSE: 0.8635
  • RMSE: 0.9293
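
Note that the reported loss equals the MSE, which is consistent with a single-output regression head trained with an MSE objective. Below is a minimal sketch of how these metrics can be recomputed with NumPy and scikit-learn; the `y_true`/`y_pred` arrays are hypothetical placeholders, since the evaluation data is not included in this card.

```python
# Minimal sketch: recompute Loss/QWK/MSE/RMSE from gold scores and
# model predictions. The arrays below are hypothetical placeholders.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3.0, 2.0, 4.0, 1.0])  # hypothetical gold organization scores
y_pred = np.array([2.6, 2.2, 3.9, 1.8])  # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)  # also the eval loss for an MSE head
rmse = np.sqrt(mse)
# QWK compares discrete labels, so regression outputs are rounded first.
qwk = cohen_kappa_score(
    y_true.astype(int),
    np.rint(y_pred).astype(int),
    weights="quadratic",
)
print(f"MSE: {mse:.4f}  RMSE: {rmse:.4f}  QWK: {qwk:.4f}")
```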

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
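
A minimal sketch of a matching Trainer setup, assuming a regression head (num_labels=1) on the base model; the card does not specify the head, the dataset, or the preprocessing, so the placeholders below are hypothetical:

```python
# Minimal reproduction sketch under the assumptions stated above.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
# num_labels=1 assumes a regression head (consistent with the MSE/RMSE metrics).
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)

args = TrainingArguments(
    output_dir="arabert-task2-organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # The listed Adam betas=(0.9, 0.999) and epsilon=1e-08 are the
    # Transformers defaults, so no explicit optimizer arguments are needed.
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=...,  # not specified in this card
#                   eval_dataset=...)
# trainer.train()
```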

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0235 2 4.8734 -0.0020 4.8734 2.2076
No log 0.0471 4 2.9419 -0.0050 2.9419 1.7152
No log 0.0706 6 2.4431 -0.0586 2.4431 1.5631
No log 0.0941 8 1.7615 -0.0295 1.7615 1.3272
No log 0.1176 10 1.2688 0.1335 1.2688 1.1264
No log 0.1412 12 1.2933 0.0741 1.2933 1.1372
No log 0.1647 14 1.4232 0.0964 1.4232 1.1930
No log 0.1882 16 1.4495 0.0964 1.4495 1.2039
No log 0.2118 18 1.1301 0.2424 1.1301 1.0631
No log 0.2353 20 1.0660 0.3411 1.0660 1.0325
No log 0.2588 22 1.2132 0.1912 1.2132 1.1014
No log 0.2824 24 1.8400 0.1824 1.8400 1.3565
No log 0.3059 26 2.0161 0.1876 2.0161 1.4199
No log 0.3294 28 1.7199 0.2408 1.7199 1.3114
No log 0.3529 30 1.2625 0.1549 1.2625 1.1236
No log 0.3765 32 1.0930 0.3302 1.0930 1.0455
No log 0.4 34 1.1758 0.1784 1.1758 1.0843
No log 0.4235 36 1.5675 0.1483 1.5675 1.2520
No log 0.4471 38 1.7061 0.1605 1.7061 1.3062
No log 0.4706 40 1.5320 0.1166 1.5320 1.2378
No log 0.4941 42 1.1486 0.2162 1.1486 1.0717
No log 0.5176 44 0.9626 0.3635 0.9626 0.9811
No log 0.5412 46 0.9848 0.3491 0.9848 0.9924
No log 0.5647 48 1.1716 0.2110 1.1716 1.0824
No log 0.5882 50 1.2941 0.2503 1.2941 1.1376
No log 0.6118 52 1.5736 0.3118 1.5736 1.2544
No log 0.6353 54 1.6046 0.3234 1.6046 1.2667
No log 0.6588 56 1.4058 0.3273 1.4058 1.1857
No log 0.6824 58 1.1036 0.2895 1.1036 1.0505
No log 0.7059 60 0.8257 0.5125 0.8257 0.9087
No log 0.7294 62 0.8304 0.4714 0.8304 0.9113
No log 0.7529 64 0.7987 0.5663 0.7987 0.8937
No log 0.7765 66 1.1723 0.2103 1.1723 1.0827
No log 0.8 68 1.3613 0.2286 1.3613 1.1668
No log 0.8235 70 1.5370 0.1418 1.5370 1.2398
No log 0.8471 72 1.4827 0.1418 1.4827 1.2176
No log 0.8706 74 1.1589 0.2644 1.1589 1.0765
No log 0.8941 76 0.8239 0.5009 0.8239 0.9077
No log 0.9176 78 0.8018 0.5622 0.8018 0.8955
No log 0.9412 80 0.8096 0.5247 0.8096 0.8998
No log 0.9647 82 0.9255 0.3348 0.9255 0.9620
No log 0.9882 84 1.1351 0.2019 1.1351 1.0654
No log 1.0118 86 1.3174 0.2301 1.3174 1.1478
No log 1.0353 88 1.3735 0.2248 1.3735 1.1720
No log 1.0588 90 1.0729 0.2817 1.0729 1.0358
No log 1.0824 92 0.7804 0.5010 0.7804 0.8834
No log 1.1059 94 0.9886 0.3975 0.9886 0.9943
No log 1.1294 96 0.9047 0.4640 0.9047 0.9511
No log 1.1529 98 0.7457 0.4324 0.7457 0.8635
No log 1.1765 100 0.9651 0.3875 0.9651 0.9824
No log 1.2 102 1.2038 0.4013 1.2038 1.0972
No log 1.2235 104 1.3971 0.3268 1.3971 1.1820
No log 1.2471 106 1.2020 0.4013 1.2020 1.0964
No log 1.2706 108 0.8962 0.4570 0.8962 0.9467
No log 1.2941 110 0.7105 0.5010 0.7105 0.8429
No log 1.3176 112 0.6713 0.5404 0.6713 0.8193
No log 1.3412 114 0.6642 0.5404 0.6642 0.8150
No log 1.3647 116 0.7409 0.5467 0.7409 0.8607
No log 1.3882 118 1.0070 0.4944 1.0070 1.0035
No log 1.4118 120 1.1032 0.4653 1.1032 1.0504
No log 1.4353 122 1.0891 0.4386 1.0891 1.0436
No log 1.4588 124 0.9233 0.5543 0.9233 0.9609
No log 1.4824 126 0.6558 0.6070 0.6558 0.8098
No log 1.5059 128 0.6896 0.5662 0.6896 0.8304
No log 1.5294 130 0.7088 0.5662 0.7088 0.8419
No log 1.5529 132 0.6618 0.5552 0.6618 0.8135
No log 1.5765 134 0.7446 0.6380 0.7446 0.8629
No log 1.6 136 0.8752 0.5247 0.8752 0.9355
No log 1.6235 138 0.8991 0.5441 0.8991 0.9482
No log 1.6471 140 0.9230 0.5247 0.9230 0.9607
No log 1.6706 142 0.7303 0.6128 0.7303 0.8546
No log 1.6941 144 0.6870 0.6362 0.6870 0.8288
No log 1.7176 146 0.6999 0.6389 0.6999 0.8366
No log 1.7412 148 0.7884 0.5485 0.7884 0.8879
No log 1.7647 150 0.9645 0.5057 0.9645 0.9821
No log 1.7882 152 0.8833 0.5018 0.8833 0.9398
No log 1.8118 154 0.7494 0.4841 0.7494 0.8657
No log 1.8353 156 0.7460 0.5150 0.7460 0.8637
No log 1.8588 158 0.7378 0.5150 0.7378 0.8589
No log 1.8824 160 0.7630 0.5041 0.7630 0.8735
No log 1.9059 162 0.9079 0.4850 0.9079 0.9528
No log 1.9294 164 0.9929 0.4830 0.9929 0.9965
No log 1.9529 166 0.9009 0.4500 0.9009 0.9492
No log 1.9765 168 0.7439 0.4541 0.7439 0.8625
No log 2.0 170 0.7931 0.5261 0.7931 0.8905
No log 2.0235 172 0.8167 0.5130 0.8167 0.9037
No log 2.0471 174 0.7839 0.5606 0.7839 0.8854
No log 2.0706 176 0.8255 0.3848 0.8255 0.9086
No log 2.0941 178 0.9550 0.4355 0.9550 0.9773
No log 2.1176 180 1.0276 0.4485 1.0276 1.0137
No log 2.1412 182 0.8799 0.4906 0.8799 0.9380
No log 2.1647 184 0.7526 0.5236 0.7526 0.8676
No log 2.1882 186 0.7421 0.5837 0.7421 0.8615
No log 2.2118 188 0.7594 0.5635 0.7594 0.8715
No log 2.2353 190 0.7877 0.5046 0.7877 0.8876
No log 2.2588 192 0.8427 0.4180 0.8427 0.9180
No log 2.2824 194 0.8989 0.4393 0.8989 0.9481
No log 2.3059 196 0.9105 0.4286 0.9105 0.9542
No log 2.3294 198 0.9037 0.4116 0.9037 0.9506
No log 2.3529 200 0.8792 0.3909 0.8792 0.9377
No log 2.3765 202 0.8544 0.4261 0.8544 0.9243
No log 2.4 204 0.8091 0.5046 0.8091 0.8995
No log 2.4235 206 0.8072 0.4908 0.8072 0.8985
No log 2.4471 208 0.8189 0.4563 0.8189 0.9049
No log 2.4706 210 0.8667 0.3011 0.8667 0.9309
No log 2.4941 212 0.8395 0.3762 0.8395 0.9162
No log 2.5176 214 0.8407 0.5080 0.8407 0.9169
No log 2.5412 216 0.8696 0.4697 0.8696 0.9325
No log 2.5647 218 0.8539 0.4221 0.8539 0.9241
No log 2.5882 220 0.9275 0.4213 0.9275 0.9631
No log 2.6118 222 0.8788 0.3544 0.8788 0.9374
No log 2.6353 224 0.8496 0.4286 0.8496 0.9217
No log 2.6588 226 0.8574 0.3840 0.8574 0.9260
No log 2.6824 228 0.8537 0.4139 0.8537 0.9240
No log 2.7059 230 0.8595 0.4244 0.8595 0.9271
No log 2.7294 232 0.8399 0.4676 0.8399 0.9164
No log 2.7529 234 0.8047 0.5150 0.8047 0.8970
No log 2.7765 236 0.8099 0.4772 0.8099 0.8999
No log 2.8 238 0.8797 0.4947 0.8797 0.9379
No log 2.8235 240 0.9649 0.4403 0.9649 0.9823
No log 2.8471 242 0.8200 0.5006 0.8200 0.9055
No log 2.8706 244 0.7556 0.5858 0.7556 0.8692
No log 2.8941 246 0.7641 0.5662 0.7641 0.8741
No log 2.9176 248 0.9227 0.3755 0.9227 0.9606
No log 2.9412 250 0.9787 0.3612 0.9787 0.9893
No log 2.9647 252 0.8641 0.3321 0.8641 0.9295
No log 2.9882 254 0.8208 0.5751 0.8208 0.9060
No log 3.0118 256 0.8858 0.4588 0.8858 0.9412
No log 3.0353 258 0.8331 0.5240 0.8331 0.9128
No log 3.0588 260 0.7871 0.4976 0.7871 0.8872
No log 3.0824 262 0.8967 0.3462 0.8967 0.9469
No log 3.1059 264 0.9483 0.3152 0.9483 0.9738
No log 3.1294 266 0.8321 0.4 0.8321 0.9122
No log 3.1529 268 0.7940 0.5112 0.7940 0.8910
No log 3.1765 270 0.7949 0.5748 0.7949 0.8916
No log 3.2 272 0.7984 0.4760 0.7984 0.8935
No log 3.2235 274 0.8517 0.4175 0.8517 0.9229
No log 3.2471 276 1.0082 0.4748 1.0082 1.0041
No log 3.2706 278 1.1130 0.4444 1.1130 1.0550
No log 3.2941 280 0.9482 0.4890 0.9482 0.9738
No log 3.3176 282 0.7820 0.4652 0.7820 0.8843
No log 3.3412 284 0.7556 0.5253 0.7556 0.8693
No log 3.3647 286 0.7562 0.4839 0.7562 0.8696
No log 3.3882 288 0.7921 0.4465 0.7921 0.8900
No log 3.4118 290 0.7977 0.3646 0.7977 0.8931
No log 3.4353 292 0.7654 0.4465 0.7654 0.8749
No log 3.4588 294 0.7376 0.4661 0.7376 0.8588
No log 3.4824 296 0.7008 0.5735 0.7008 0.8371
No log 3.5059 298 0.6945 0.5954 0.6945 0.8333
No log 3.5294 300 0.7261 0.5376 0.7261 0.8521
No log 3.5529 302 0.7402 0.5376 0.7402 0.8604
No log 3.5765 304 0.7066 0.5648 0.7066 0.8406
No log 3.6 306 0.7131 0.6240 0.7131 0.8444
No log 3.6235 308 0.7121 0.5458 0.7121 0.8439
No log 3.6471 310 0.7822 0.4867 0.7822 0.8844
No log 3.6706 312 0.8635 0.5184 0.8635 0.9292
No log 3.6941 314 0.8377 0.4959 0.8377 0.9152
No log 3.7176 316 0.7801 0.4815 0.7801 0.8832
No log 3.7412 318 0.8261 0.4723 0.8261 0.9089
No log 3.7647 320 0.9021 0.4796 0.9021 0.9498
No log 3.7882 322 0.8078 0.5098 0.8078 0.8988
No log 3.8118 324 0.8057 0.5069 0.8057 0.8976
No log 3.8353 326 0.8866 0.4563 0.8866 0.9416
No log 3.8588 328 0.9820 0.4412 0.9820 0.9910
No log 3.8824 330 1.0951 0.4272 1.0951 1.0465
No log 3.9059 332 0.9629 0.3642 0.9629 0.9813
No log 3.9294 334 0.8322 0.3756 0.8322 0.9123
No log 3.9529 336 0.8132 0.4324 0.8132 0.9018
No log 3.9765 338 0.8169 0.4220 0.8169 0.9038
No log 4.0 340 0.8192 0.3956 0.8192 0.9051
No log 4.0235 342 0.8157 0.3614 0.8157 0.9031
No log 4.0471 344 0.7625 0.4879 0.7625 0.8732
No log 4.0706 346 0.7510 0.6259 0.7510 0.8666
No log 4.0941 348 0.8826 0.5513 0.8826 0.9395
No log 4.1176 350 0.8480 0.5536 0.8480 0.9209
No log 4.1412 352 0.7459 0.6085 0.7459 0.8636
No log 4.1647 354 0.8198 0.4368 0.8198 0.9054
No log 4.1882 356 0.9962 0.3549 0.9962 0.9981
No log 4.2118 358 1.0235 0.3536 1.0235 1.0117
No log 4.2353 360 0.9206 0.3128 0.9206 0.9595
No log 4.2588 362 0.8373 0.3657 0.8373 0.9151
No log 4.2824 364 0.7938 0.5337 0.7938 0.8910
No log 4.3059 366 0.8208 0.5665 0.8208 0.9060
No log 4.3294 368 0.8062 0.5441 0.8062 0.8979
No log 4.3529 370 0.8075 0.4159 0.8075 0.8986
No log 4.3765 372 0.8207 0.3806 0.8207 0.9059
No log 4.4 374 0.7925 0.3637 0.7925 0.8902
No log 4.4235 376 0.7823 0.3733 0.7823 0.8845
No log 4.4471 378 0.7907 0.3637 0.7907 0.8892
No log 4.4706 380 0.7893 0.3596 0.7893 0.8884
No log 4.4941 382 0.8174 0.3596 0.8174 0.9041
No log 4.5176 384 0.8313 0.3830 0.8313 0.9118
No log 4.5412 386 0.8503 0.4244 0.8503 0.9221
No log 4.5647 388 0.8892 0.4554 0.8892 0.9430
No log 4.5882 390 0.8757 0.4456 0.8757 0.9358
No log 4.6118 392 0.8663 0.3318 0.8663 0.9308
No log 4.6353 394 0.8794 0.3411 0.8794 0.9378
No log 4.6588 396 0.8836 0.3110 0.8836 0.9400
No log 4.6824 398 0.8874 0.3269 0.8874 0.9420
No log 4.7059 400 0.9022 0.3548 0.9022 0.9498
No log 4.7294 402 0.8880 0.3269 0.8880 0.9423
No log 4.7529 404 0.8683 0.3269 0.8683 0.9318
No log 4.7765 406 0.8512 0.3462 0.8512 0.9226
No log 4.8 408 0.8272 0.3728 0.8272 0.9095
No log 4.8235 410 0.8249 0.4389 0.8249 0.9082
No log 4.8471 412 0.8213 0.5131 0.8213 0.9063
No log 4.8706 414 0.8491 0.3418 0.8491 0.9215
No log 4.8941 416 0.8956 0.3269 0.8956 0.9463
No log 4.9176 418 0.9095 0.3269 0.9095 0.9537
No log 4.9412 420 0.8904 0.3462 0.8904 0.9436
No log 4.9647 422 0.8446 0.4260 0.8446 0.9190
No log 4.9882 424 0.8302 0.4676 0.8302 0.9111
No log 5.0118 426 0.8180 0.4571 0.8180 0.9044
No log 5.0353 428 0.8048 0.4635 0.8048 0.8971
No log 5.0588 430 0.7847 0.4712 0.7847 0.8858
No log 5.0824 432 0.7692 0.5399 0.7692 0.8770
No log 5.1059 434 0.7745 0.4926 0.7745 0.8801
No log 5.1294 436 0.7957 0.3709 0.7957 0.8920
No log 5.1529 438 0.8241 0.3614 0.8241 0.9078
No log 5.1765 440 0.7685 0.4471 0.7685 0.8766
No log 5.2 442 0.7587 0.4570 0.7587 0.8710
No log 5.2235 444 0.7614 0.4635 0.7614 0.8726
No log 5.2471 446 0.7743 0.4434 0.7743 0.8799
No log 5.2706 448 0.8028 0.3756 0.8028 0.8960
No log 5.2941 450 0.8041 0.3756 0.8041 0.8967
No log 5.3176 452 0.7797 0.4701 0.7797 0.8830
No log 5.3412 454 0.7774 0.4701 0.7774 0.8817
No log 5.3647 456 0.8104 0.4096 0.8104 0.9002
No log 5.3882 458 0.8898 0.3546 0.8898 0.9433
No log 5.4118 460 0.8948 0.3879 0.8948 0.9459
No log 5.4353 462 0.8083 0.3996 0.8083 0.8991
No log 5.4588 464 0.7685 0.4696 0.7685 0.8766
No log 5.4824 466 0.7947 0.4624 0.7947 0.8915
No log 5.5059 468 0.8241 0.4283 0.8241 0.9078
No log 5.5294 470 0.8306 0.4621 0.8306 0.9113
No log 5.5529 472 0.8085 0.4690 0.8085 0.8992
No log 5.5765 474 0.7791 0.4368 0.7791 0.8827
No log 5.6 476 0.7600 0.4661 0.7600 0.8718
No log 5.6235 478 0.7822 0.4137 0.7822 0.8844
No log 5.6471 480 0.7801 0.3855 0.7801 0.8832
No log 5.6706 482 0.7768 0.3855 0.7768 0.8813
No log 5.6941 484 0.7367 0.5183 0.7367 0.8583
No log 5.7176 486 0.7349 0.5150 0.7349 0.8572
No log 5.7412 488 0.7481 0.5183 0.7481 0.8649
No log 5.7647 490 0.8248 0.4076 0.8248 0.9082
No log 5.7882 492 0.8813 0.3548 0.8813 0.9388
No log 5.8118 494 0.8829 0.3548 0.8829 0.9396
No log 5.8353 496 0.8154 0.4039 0.8154 0.9030
No log 5.8588 498 0.7989 0.3668 0.7989 0.8938
0.3763 5.8824 500 0.7942 0.3862 0.7942 0.8912
0.3763 5.9059 502 0.7774 0.4158 0.7774 0.8817
0.3763 5.9294 504 0.7792 0.4284 0.7792 0.8827
0.3763 5.9529 506 0.7827 0.4389 0.7827 0.8847
0.3763 5.9765 508 0.7878 0.4350 0.7878 0.8876
0.3763 6.0 510 0.7746 0.4389 0.7746 0.8801
0.3763 6.0235 512 0.7836 0.4119 0.7836 0.8852
0.3763 6.0471 514 0.8046 0.3756 0.8046 0.8970
0.3763 6.0706 516 0.8586 0.4695 0.8586 0.9266
0.3763 6.0941 518 0.8227 0.4915 0.8227 0.9070
0.3763 6.1176 520 0.7128 0.5299 0.7128 0.8443
0.3763 6.1412 522 0.6895 0.6121 0.6895 0.8304
0.3763 6.1647 524 0.7039 0.6297 0.7039 0.8390
0.3763 6.1882 526 0.7011 0.6217 0.7011 0.8373
0.3763 6.2118 528 0.6953 0.5971 0.6953 0.8339
0.3763 6.2353 530 0.7110 0.4665 0.7110 0.8432
0.3763 6.2588 532 0.7185 0.4077 0.7185 0.8476
0.3763 6.2824 534 0.7192 0.4428 0.7192 0.8481
0.3763 6.3059 536 0.7323 0.4769 0.7323 0.8558
0.3763 6.3294 538 0.7447 0.6071 0.7447 0.8629
0.3763 6.3529 540 0.7236 0.6071 0.7236 0.8507
0.3763 6.3765 542 0.6987 0.5450 0.6987 0.8359
0.3763 6.4 544 0.7000 0.4879 0.7000 0.8367
0.3763 6.4235 546 0.7093 0.4806 0.7093 0.8422
0.3763 6.4471 548 0.7050 0.5649 0.7050 0.8396
0.3763 6.4706 550 0.7087 0.6157 0.7087 0.8419
0.3763 6.4941 552 0.7259 0.6256 0.7259 0.8520
0.3763 6.5176 554 0.7367 0.6046 0.7367 0.8583
0.3763 6.5412 556 0.7295 0.5971 0.7295 0.8541
0.3763 6.5647 558 0.7640 0.4260 0.7640 0.8741
0.3763 6.5882 560 0.8306 0.3462 0.8306 0.9114
0.3763 6.6118 562 0.8620 0.3879 0.8620 0.9284
0.3763 6.6353 564 0.8233 0.3462 0.8233 0.9074
0.3763 6.6588 566 0.7684 0.4803 0.7684 0.8766
0.3763 6.6824 568 0.7394 0.5703 0.7394 0.8599
0.3763 6.7059 570 0.7368 0.5291 0.7368 0.8584
0.3763 6.7294 572 0.7765 0.5069 0.7765 0.8812
0.3763 6.7529 574 0.8713 0.4885 0.8713 0.9335
0.3763 6.7765 576 0.8937 0.5004 0.8937 0.9454
0.3763 6.8 578 0.8904 0.3968 0.8904 0.9436
0.3763 6.8235 580 0.8949 0.3696 0.8949 0.9460
0.3763 6.8471 582 0.9256 0.3578 0.9256 0.9621
0.3763 6.8706 584 0.9391 0.3295 0.9391 0.9691
0.3763 6.8941 586 0.9084 0.3295 0.9084 0.9531
0.3763 6.9176 588 0.8635 0.3546 0.8635 0.9293

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
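
When reproducing these results, it may help to match the versions above, since tokenization and model-loading behavior can differ across releases. A small sketch that compares the installed packages against the listed versions:

```python
# Small sketch: check the local environment against the versions this
# model card lists. Mismatches are flagged, not treated as errors.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0+cu118",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    marker = "" if installed[name] == want else "  <-- differs"
    print(f"{name}: installed {installed[name]}, card lists {want}{marker}")
```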

Model size

  • 0.1B params (Safetensors, F32)