ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k4_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7764
  • Qwk: 0.4546
  • Mse: 0.7764
  • Rmse: 0.8812
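The reported metrics can be reproduced from integer score predictions with standard library calls. The sketch below is a hypothetical illustration (not the author's evaluation code), assuming scikit-learn's quadratic-weighted Cohen's kappa for Qwk:

```python
# Hypothetical metric computation for a score-prediction task
# (not the author's evaluation code).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate_scores(y_true, y_pred):
    """Return (qwk, mse, rmse) for integer-valued score predictions."""
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
    mse = mean_squared_error(y_true, y_pred)                      # Mse
    rmse = float(np.sqrt(mse))                                    # Rmse
    return qwk, mse, rmse

# Toy example with made-up scores
y_true = [0, 1, 2, 3, 2]
y_pred = [0, 2, 2, 3, 1]
qwk, mse, rmse = evaluate_scores(y_true, y_pred)
```

Note that Loss and Mse coincide in the table above, which is consistent with the model being trained as a regressor under an MSE objective.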

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
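With a linear scheduler and no warmup, the learning rate decays from 2e-05 to 0 over the total number of optimizer steps. A minimal sketch of that schedule, assuming zero warmup steps and the 15-steps-per-epoch pace implied by the table below (both assumptions, not stated in the card):

```python
# Minimal sketch of the linear LR schedule listed above, assuming no
# warmup; mirrors the behavior of transformers'
# get_linear_schedule_with_warmup with num_warmup_steps=0.
def linear_lr(step, total_steps, base_lr=2e-5):
    """Linearly decay base_lr to 0 over total_steps."""
    remaining = max(0, total_steps - step)
    return base_lr * remaining / total_steps

# Assumed: 100 epochs * 15 steps/epoch (2 steps per 0.1333 epoch below)
total_steps = 100 * 15
```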

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1333 2 3.7704 -0.0215 3.7704 1.9418
No log 0.2667 4 2.1309 0.0539 2.1309 1.4598
No log 0.4 6 1.6459 0.0757 1.6459 1.2829
No log 0.5333 8 1.2337 0.1406 1.2337 1.1107
No log 0.6667 10 0.9785 0.2441 0.9785 0.9892
No log 0.8 12 0.9752 0.4312 0.9752 0.9875
No log 0.9333 14 1.2161 0.0 1.2161 1.1028
No log 1.0667 16 1.3649 0.0399 1.3649 1.1683
No log 1.2 18 1.2199 0.0769 1.2199 1.1045
No log 1.3333 20 1.0201 0.3396 1.0201 1.0100
No log 1.4667 22 0.9542 0.3094 0.9542 0.9768
No log 1.6 24 0.9642 0.2166 0.9642 0.9819
No log 1.7333 26 0.9843 0.1644 0.9843 0.9921
No log 1.8667 28 1.1396 0.2439 1.1396 1.0675
No log 2.0 30 1.0690 0.3063 1.0690 1.0339
No log 2.1333 32 1.0372 0.1693 1.0372 1.0184
No log 2.2667 34 1.0805 0.1761 1.0805 1.0395
No log 2.4 36 1.0704 0.2057 1.0704 1.0346
No log 2.5333 38 1.0721 0.1491 1.0721 1.0354
No log 2.6667 40 1.0418 0.2693 1.0418 1.0207
No log 2.8 42 1.1194 0.2914 1.1194 1.0580
No log 2.9333 44 1.1782 0.3650 1.1782 1.0855
No log 3.0667 46 1.1018 0.3034 1.1018 1.0496
No log 3.2 48 0.9894 0.2647 0.9894 0.9947
No log 3.3333 50 1.1128 0.2647 1.1128 1.0549
No log 3.4667 52 1.2557 0.1389 1.2557 1.1206
No log 3.6 54 1.3723 0.1447 1.3723 1.1715
No log 3.7333 56 1.4033 0.1854 1.4033 1.1846
No log 3.8667 58 1.3844 0.1545 1.3844 1.1766
No log 4.0 60 1.2960 0.1986 1.2960 1.1384
No log 4.1333 62 1.1294 0.3806 1.1294 1.0627
No log 4.2667 64 0.9919 0.3820 0.9919 0.9960
No log 4.4 66 0.9680 0.4192 0.9680 0.9839
No log 4.5333 68 0.9906 0.3959 0.9906 0.9953
No log 4.6667 70 0.9608 0.2842 0.9608 0.9802
No log 4.8 72 0.8941 0.3455 0.8941 0.9456
No log 4.9333 74 0.8904 0.3552 0.8904 0.9436
No log 5.0667 76 0.8852 0.2910 0.8852 0.9409
No log 5.2 78 0.9291 0.3412 0.9291 0.9639
No log 5.3333 80 1.0637 0.3201 1.0637 1.0314
No log 5.4667 82 1.0592 0.3406 1.0592 1.0292
No log 5.6 84 1.1253 0.3826 1.1253 1.0608
No log 5.7333 86 1.2017 0.3745 1.2017 1.0962
No log 5.8667 88 1.1209 0.2787 1.1209 1.0587
No log 6.0 90 1.1512 0.3537 1.1512 1.0730
No log 6.1333 92 1.2145 0.3800 1.2145 1.1020
No log 6.2667 94 1.0310 0.4252 1.0310 1.0154
No log 6.4 96 0.9638 0.4037 0.9638 0.9817
No log 6.5333 98 1.1982 0.4064 1.1982 1.0946
No log 6.6667 100 1.2995 0.2697 1.2995 1.1399
No log 6.8 102 1.1486 0.4090 1.1486 1.0717
No log 6.9333 104 0.9222 0.4484 0.9222 0.9603
No log 7.0667 106 0.8922 0.3707 0.8922 0.9445
No log 7.2 108 0.9035 0.3824 0.9035 0.9505
No log 7.3333 110 1.0051 0.4492 1.0051 1.0026
No log 7.4667 112 1.1645 0.4497 1.1645 1.0791
No log 7.6 114 1.1540 0.4200 1.1540 1.0742
No log 7.7333 116 1.0351 0.4196 1.0351 1.0174
No log 7.8667 118 0.9771 0.4069 0.9771 0.9885
No log 8.0 120 0.9961 0.4840 0.9961 0.9980
No log 8.1333 122 1.0121 0.4815 1.0121 1.0060
No log 8.2667 124 0.9146 0.4956 0.9146 0.9563
No log 8.4 126 0.8684 0.4730 0.8684 0.9319
No log 8.5333 128 0.8479 0.4842 0.8479 0.9208
No log 8.6667 130 0.8658 0.5458 0.8658 0.9305
No log 8.8 132 1.0719 0.4513 1.0719 1.0353
No log 8.9333 134 1.4628 0.3280 1.4628 1.2095
No log 9.0667 136 1.5683 0.2994 1.5683 1.2523
No log 9.2 138 1.2465 0.3959 1.2465 1.1165
No log 9.3333 140 0.9313 0.5327 0.9313 0.9650
No log 9.4667 142 0.8332 0.5518 0.8332 0.9128
No log 9.6 144 0.8338 0.5002 0.8338 0.9131
No log 9.7333 146 0.9498 0.5012 0.9498 0.9746
No log 9.8667 148 1.0461 0.4367 1.0461 1.0228
No log 10.0 150 0.9696 0.4563 0.9696 0.9847
No log 10.1333 152 0.7929 0.5070 0.7929 0.8905
No log 10.2667 154 0.7593 0.4368 0.7593 0.8714
No log 10.4 156 0.8343 0.4140 0.8343 0.9134
No log 10.5333 158 0.8611 0.3869 0.8611 0.9279
No log 10.6667 160 0.8487 0.4615 0.8487 0.9212
No log 10.8 162 0.8771 0.4800 0.8771 0.9365
No log 10.9333 164 0.8990 0.5291 0.8990 0.9482
No log 11.0667 166 0.8573 0.4889 0.8573 0.9259
No log 11.2 168 0.8661 0.5291 0.8661 0.9306
No log 11.3333 170 0.8885 0.5222 0.8885 0.9426
No log 11.4667 172 0.8575 0.5232 0.8575 0.9260
No log 11.6 174 0.8123 0.5549 0.8123 0.9013
No log 11.7333 176 0.7925 0.4924 0.7925 0.8902
No log 11.8667 178 0.7537 0.5510 0.7537 0.8682
No log 12.0 180 0.7765 0.4836 0.7765 0.8812
No log 12.1333 182 0.7742 0.5072 0.7742 0.8799
No log 12.2667 184 0.7531 0.5407 0.7531 0.8678
No log 12.4 186 0.7302 0.5657 0.7302 0.8545
No log 12.5333 188 0.7494 0.5410 0.7494 0.8657
No log 12.6667 190 0.8562 0.4369 0.8562 0.9253
No log 12.8 192 0.8546 0.4482 0.8546 0.9244
No log 12.9333 194 0.7544 0.5304 0.7544 0.8686
No log 13.0667 196 0.7523 0.5316 0.7523 0.8673
No log 13.2 198 0.7646 0.5197 0.7646 0.8744
No log 13.3333 200 0.8193 0.4935 0.8193 0.9051
No log 13.4667 202 0.9411 0.4468 0.9411 0.9701
No log 13.6 204 0.8695 0.5041 0.8695 0.9325
No log 13.7333 206 0.7827 0.4519 0.7827 0.8847
No log 13.8667 208 0.8272 0.4050 0.8272 0.9095
No log 14.0 210 0.7885 0.5248 0.7885 0.8879
No log 14.1333 212 0.8659 0.5355 0.8659 0.9305
No log 14.2667 214 0.9021 0.5012 0.9021 0.9498
No log 14.4 216 0.8525 0.5244 0.8525 0.9233
No log 14.5333 218 0.8375 0.5044 0.8375 0.9151
No log 14.6667 220 0.8061 0.4857 0.8061 0.8978
No log 14.8 222 0.7823 0.5678 0.7823 0.8845
No log 14.9333 224 0.7845 0.5210 0.7845 0.8857
No log 15.0667 226 0.8143 0.4857 0.8143 0.9024
No log 15.2 228 0.8023 0.5210 0.8023 0.8957
No log 15.3333 230 0.7979 0.4875 0.7979 0.8933
No log 15.4667 232 0.8398 0.4781 0.8398 0.9164
No log 15.6 234 0.8252 0.5160 0.8252 0.9084
No log 15.7333 236 0.8250 0.4843 0.8250 0.9083
No log 15.8667 238 0.8706 0.5255 0.8706 0.9331
No log 16.0 240 0.8594 0.5267 0.8594 0.9270
No log 16.1333 242 0.7989 0.5186 0.7989 0.8938
No log 16.2667 244 0.7888 0.5197 0.7888 0.8882
No log 16.4 246 0.8379 0.4833 0.8379 0.9154
No log 16.5333 248 0.8294 0.4833 0.8294 0.9107
No log 16.6667 250 0.7945 0.4511 0.7945 0.8914
No log 16.8 252 0.8073 0.4840 0.8073 0.8985
No log 16.9333 254 0.8052 0.4850 0.8052 0.8974
No log 17.0667 256 0.8458 0.5032 0.8458 0.9197
No log 17.2 258 0.8380 0.5150 0.8380 0.9154
No log 17.3333 260 0.8496 0.5150 0.8496 0.9218
No log 17.4667 262 0.8554 0.5032 0.8554 0.9249
No log 17.6 264 0.8949 0.5032 0.8949 0.9460
No log 17.7333 266 0.9696 0.5295 0.9696 0.9847
No log 17.8667 268 0.8892 0.5119 0.8892 0.9430
No log 18.0 270 0.7727 0.5062 0.7727 0.8790
No log 18.1333 272 0.7336 0.5247 0.7336 0.8565
No log 18.2667 274 0.7339 0.5467 0.7339 0.8567
No log 18.4 276 0.7673 0.5062 0.7673 0.8760
No log 18.5333 278 0.8691 0.5026 0.8691 0.9323
No log 18.6667 280 0.9664 0.4777 0.9664 0.9831
No log 18.8 282 0.9094 0.5318 0.9094 0.9536
No log 18.9333 284 0.7878 0.5062 0.7878 0.8876
No log 19.0667 286 0.7460 0.5455 0.7460 0.8637
No log 19.2 288 0.7482 0.5455 0.7482 0.8650
No log 19.3333 290 0.7697 0.4976 0.7697 0.8773
No log 19.4667 292 0.9502 0.5102 0.9502 0.9748
No log 19.6 294 1.0991 0.4475 1.0991 1.0484
No log 19.7333 296 1.0194 0.4668 1.0194 1.0096
No log 19.8667 298 0.8216 0.4929 0.8216 0.9064
No log 20.0 300 0.7426 0.4742 0.7426 0.8617
No log 20.1333 302 0.7379 0.4774 0.7379 0.8590
No log 20.2667 304 0.7422 0.4774 0.7422 0.8615
No log 20.4 306 0.7608 0.4741 0.7608 0.8723
No log 20.5333 308 0.8841 0.4911 0.8841 0.9403
No log 20.6667 310 1.0236 0.4668 1.0236 1.0117
No log 20.8 312 0.9889 0.4882 0.9889 0.9945
No log 20.9333 314 0.8993 0.5018 0.8993 0.9483
No log 21.0667 316 0.8149 0.5671 0.8149 0.9027
No log 21.2 318 0.8232 0.5473 0.8232 0.9073
No log 21.3333 320 0.9124 0.4694 0.9124 0.9552
No log 21.4667 322 0.9979 0.4877 0.9979 0.9990
No log 21.6 324 0.9331 0.4681 0.9331 0.9659
No log 21.7333 326 0.8185 0.4832 0.8185 0.9047
No log 21.8667 328 0.7499 0.4782 0.7499 0.8659
No log 22.0 330 0.7554 0.5068 0.7554 0.8692
No log 22.1333 332 0.7439 0.5136 0.7439 0.8625
No log 22.2667 334 0.7696 0.4969 0.7696 0.8773
No log 22.4 336 0.9147 0.4889 0.9147 0.9564
No log 22.5333 338 0.9893 0.4773 0.9893 0.9946
No log 22.6667 340 0.9299 0.4792 0.9299 0.9643
No log 22.8 342 0.8198 0.5039 0.8198 0.9054
No log 22.9333 344 0.7580 0.5188 0.7580 0.8706
No log 23.0667 346 0.7721 0.4829 0.7721 0.8787
No log 23.2 348 0.8521 0.4924 0.8521 0.9231
No log 23.3333 350 0.9099 0.4579 0.9099 0.9539
No log 23.4667 352 0.8975 0.4349 0.8975 0.9474
No log 23.6 354 0.8650 0.4590 0.8650 0.9300
No log 23.7333 356 0.8119 0.4829 0.8119 0.9010
No log 23.8667 358 0.7787 0.4958 0.7787 0.8824
No log 24.0 360 0.8016 0.4958 0.8016 0.8953
No log 24.1333 362 0.7953 0.5363 0.7953 0.8918
No log 24.2667 364 0.8078 0.5338 0.8078 0.8988
No log 24.4 366 0.8395 0.5416 0.8395 0.9162
No log 24.5333 368 0.8734 0.5121 0.8734 0.9346
No log 24.6667 370 0.8109 0.5234 0.8109 0.9005
No log 24.8 372 0.7529 0.5057 0.7529 0.8677
No log 24.9333 374 0.7335 0.5195 0.7335 0.8565
No log 25.0667 376 0.7239 0.5348 0.7239 0.8508
No log 25.2 378 0.7191 0.4745 0.7191 0.8480
No log 25.3333 380 0.7382 0.5195 0.7382 0.8592
No log 25.4667 382 0.7594 0.5383 0.7594 0.8714
No log 25.6 384 0.7314 0.5584 0.7314 0.8552
No log 25.7333 386 0.7147 0.6100 0.7147 0.8454
No log 25.8667 388 0.7452 0.5537 0.7452 0.8633
No log 26.0 390 0.7792 0.5234 0.7792 0.8827
No log 26.1333 392 0.7787 0.5234 0.7787 0.8824
No log 26.2667 394 0.8061 0.5129 0.8061 0.8978
No log 26.4 396 0.8036 0.5129 0.8036 0.8965
No log 26.5333 398 0.7551 0.5554 0.7551 0.8689
No log 26.6667 400 0.7275 0.5964 0.7275 0.8529
No log 26.8 402 0.7361 0.5898 0.7361 0.8580
No log 26.9333 404 0.7329 0.5224 0.7329 0.8561
No log 27.0667 406 0.7698 0.5305 0.7698 0.8774
No log 27.2 408 0.8543 0.5019 0.8543 0.9243
No log 27.3333 410 0.9056 0.4994 0.9056 0.9516
No log 27.4667 412 0.8548 0.5019 0.8548 0.9245
No log 27.6 414 0.7623 0.5510 0.7623 0.8731
No log 27.7333 416 0.7417 0.5562 0.7417 0.8612
No log 27.8667 418 0.7474 0.5847 0.7474 0.8645
No log 28.0 420 0.7308 0.5545 0.7308 0.8548
No log 28.1333 422 0.7416 0.5378 0.7416 0.8612
No log 28.2667 424 0.8130 0.5019 0.8130 0.9016
No log 28.4 426 0.8563 0.5 0.8563 0.9254
No log 28.5333 428 0.8260 0.4796 0.8260 0.9089
No log 28.6667 430 0.7676 0.4711 0.7676 0.8761
No log 28.8 432 0.7312 0.5210 0.7312 0.8551
No log 28.9333 434 0.7370 0.4789 0.7370 0.8585
No log 29.0667 436 0.7431 0.5455 0.7431 0.8620
No log 29.2 438 0.7493 0.5212 0.7493 0.8656
No log 29.3333 440 0.8014 0.4850 0.8014 0.8952
No log 29.4667 442 0.8711 0.5126 0.8711 0.9333
No log 29.6 444 0.8752 0.5126 0.8752 0.9355
No log 29.7333 446 0.8511 0.5144 0.8511 0.9226
No log 29.8667 448 0.7867 0.5419 0.7867 0.8870
No log 30.0 450 0.7537 0.4203 0.7537 0.8682
No log 30.1333 452 0.7563 0.4192 0.7563 0.8696
No log 30.2667 454 0.7611 0.4297 0.7611 0.8724
No log 30.4 456 0.7994 0.4968 0.7994 0.8941
No log 30.5333 458 0.8994 0.5137 0.8994 0.9484
No log 30.6667 460 0.9457 0.4994 0.9457 0.9725
No log 30.8 462 0.9169 0.4560 0.9169 0.9576
No log 30.9333 464 0.8491 0.4940 0.8491 0.9215
No log 31.0667 466 0.8175 0.4948 0.8175 0.9042
No log 31.2 468 0.7933 0.5067 0.7933 0.8907
No log 31.3333 470 0.8128 0.4948 0.8128 0.9015
No log 31.4667 472 0.8848 0.4820 0.8848 0.9407
No log 31.6 474 0.9809 0.4910 0.9809 0.9904
No log 31.7333 476 1.0169 0.4600 1.0169 1.0084
No log 31.8667 478 0.9845 0.4902 0.9845 0.9922
No log 32.0 480 0.8955 0.4810 0.8955 0.9463
No log 32.1333 482 0.8076 0.4948 0.8076 0.8987
No log 32.2667 484 0.7599 0.5669 0.7599 0.8717
No log 32.4 486 0.7583 0.4565 0.7583 0.8708
No log 32.5333 488 0.7626 0.4565 0.7626 0.8733
No log 32.6667 490 0.7706 0.5223 0.7706 0.8778
No log 32.8 492 0.7963 0.4617 0.7963 0.8923
No log 32.9333 494 0.8468 0.4829 0.8468 0.9202
No log 33.0667 496 0.8631 0.5046 0.8631 0.9290
No log 33.2 498 0.8285 0.4956 0.8285 0.9102
0.2435 33.3333 500 0.7920 0.4648 0.7920 0.8900
0.2435 33.4667 502 0.7903 0.5213 0.7903 0.8890
0.2435 33.6 504 0.7980 0.4880 0.7980 0.8933
0.2435 33.7333 506 0.8174 0.5090 0.8174 0.9041
0.2435 33.8667 508 0.8657 0.4840 0.8657 0.9305
0.2435 34.0 510 0.8944 0.4489 0.8944 0.9457
0.2435 34.1333 512 0.8706 0.4829 0.8706 0.9331
0.2435 34.2667 514 0.8507 0.4829 0.8507 0.9223
0.2435 34.4 516 0.8070 0.4839 0.8070 0.8983
0.2435 34.5333 518 0.7738 0.4169 0.7738 0.8796
0.2435 34.6667 520 0.7641 0.4169 0.7641 0.8741
0.2435 34.8 522 0.7683 0.4546 0.7683 0.8765
0.2435 34.9333 524 0.7770 0.4546 0.7770 0.8815
0.2435 35.0667 526 0.7764 0.4546 0.7764 0.8812

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (safetensors, F32)
