ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k5_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6534
  • QWK: 0.5254
  • MSE: 0.6534
  • RMSE: 0.8084
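
The reported metrics can be reproduced from raw predictions: MSE averages the squared error between gold and predicted scores, RMSE is its square root (which is why those two columns track each other), and QWK (quadratic weighted kappa) measures agreement on discrete score classes with a quadratic penalty for larger disagreements. A minimal sketch in plain Python; the helper names are illustrative and not from the actual evaluation script:

```python
# Illustrative metric helpers (not from the training script).
import math
from collections import Counter

def mse(y_true, y_pred):
    # Mean squared error between gold and predicted scores.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    # RMSE is just the square root of MSE.
    return math.sqrt(mse(y_true, y_pred))

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    # Observed confusion matrix over integer score classes 0..n_classes-1.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2      # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n   # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den
```

QWK is 1.0 for perfect agreement, 0.0 for chance-level agreement, and negative when predictions disagree more than chance would.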

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
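
These hyperparameters map directly onto fields of transformers.TrainingArguments. Since the training script is not published, the following is a hypothetical reconstruction of the configuration, not a record of the actual run:

```python
# Hypothetical reconstruction of the training configuration as keyword
# arguments for transformers.TrainingArguments (the real script is not
# published; treat this as a sketch).
training_args_kwargs = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}

# Usage (requires transformers and a prepared dataset; not run here):
# from transformers import TrainingArguments, Trainer
# args = TrainingArguments(output_dir="out", **training_args_kwargs)
# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
# trainer.train()
```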

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0741 2 2.4986 -0.0924 2.4986 1.5807
No log 0.1481 4 1.1492 0.1856 1.1492 1.0720
No log 0.2222 6 1.0818 -0.0550 1.0818 1.0401
No log 0.2963 8 1.0794 0.0442 1.0794 1.0390
No log 0.3704 10 1.2113 -0.0058 1.2113 1.1006
No log 0.4444 12 1.3256 -0.0665 1.3256 1.1514
No log 0.5185 14 1.3889 -0.0517 1.3889 1.1785
No log 0.5926 16 1.1396 0.1469 1.1396 1.0675
No log 0.6667 18 1.0366 0.1469 1.0366 1.0181
No log 0.7407 20 0.9565 0.2114 0.9565 0.9780
No log 0.8148 22 0.9161 0.1570 0.9161 0.9571
No log 0.8889 24 0.9314 0.1452 0.9314 0.9651
No log 0.9630 26 0.9039 0.2163 0.9039 0.9508
No log 1.0370 28 0.8069 0.2456 0.8069 0.8983
No log 1.1111 30 0.7905 0.2204 0.7905 0.8891
No log 1.1852 32 0.7905 0.2149 0.7905 0.8891
No log 1.2593 34 0.7946 0.2712 0.7946 0.8914
No log 1.3333 36 0.9880 0.1451 0.9880 0.9940
No log 1.4074 38 0.9193 0.2080 0.9193 0.9588
No log 1.4815 40 0.7745 0.2872 0.7745 0.8800
No log 1.5556 42 0.7203 0.2506 0.7203 0.8487
No log 1.6296 44 0.7076 0.1807 0.7076 0.8412
No log 1.7037 46 0.7031 0.2181 0.7031 0.8385
No log 1.7778 48 0.7165 0.1981 0.7165 0.8465
No log 1.8519 50 0.7222 0.1979 0.7222 0.8498
No log 1.9259 52 0.6939 0.3229 0.6939 0.8330
No log 2.0 54 0.6992 0.3020 0.6992 0.8362
No log 2.0741 56 0.7742 0.3359 0.7742 0.8799
No log 2.1481 58 0.7674 0.3359 0.7674 0.8760
No log 2.2222 60 0.7663 0.3530 0.7663 0.8754
No log 2.2963 62 0.8405 0.3408 0.8405 0.9168
No log 2.3704 64 0.9533 0.2456 0.9533 0.9764
No log 2.4444 66 0.9732 0.2756 0.9732 0.9865
No log 2.5185 68 0.8163 0.2825 0.8163 0.9035
No log 2.5926 70 0.7167 0.2182 0.7167 0.8466
No log 2.6667 72 0.7294 0.3445 0.7294 0.8541
No log 2.7407 74 0.7023 0.3228 0.7023 0.8380
No log 2.8148 76 0.6920 0.2607 0.6920 0.8319
No log 2.8889 78 0.6981 0.2607 0.6981 0.8356
No log 2.9630 80 0.6861 0.2145 0.6861 0.8283
No log 3.0370 82 0.6761 0.2857 0.6761 0.8223
No log 3.1111 84 0.6966 0.2857 0.6966 0.8346
No log 3.1852 86 0.6821 0.2857 0.6821 0.8259
No log 3.2593 88 0.6495 0.3323 0.6495 0.8059
No log 3.3333 90 0.7237 0.1739 0.7237 0.8507
No log 3.4074 92 0.8812 0.2813 0.8812 0.9387
No log 3.4815 94 0.8390 0.2813 0.8390 0.9160
No log 3.5556 96 0.7116 0.3590 0.7116 0.8436
No log 3.6296 98 0.6633 0.3474 0.6633 0.8145
No log 3.7037 100 0.7756 0.3167 0.7756 0.8807
No log 3.7778 102 0.8524 0.4432 0.8524 0.9233
No log 3.8519 104 0.7608 0.4112 0.7608 0.8723
No log 3.9259 106 0.6092 0.3836 0.6092 0.7805
No log 4.0 108 0.6286 0.3780 0.6286 0.7928
No log 4.0741 110 0.6629 0.2979 0.6629 0.8142
No log 4.1481 112 0.7605 0.1959 0.7605 0.8721
No log 4.2222 114 0.8835 0.2787 0.8835 0.9400
No log 4.2963 116 0.8628 0.3361 0.8628 0.9289
No log 4.3704 118 0.7693 0.3216 0.7693 0.8771
No log 4.4444 120 0.6808 0.2857 0.6808 0.8251
No log 4.5185 122 0.6235 0.3258 0.6235 0.7896
No log 4.5926 124 0.6641 0.4124 0.6641 0.8149
No log 4.6667 126 0.7144 0.4997 0.7144 0.8452
No log 4.7407 128 0.6978 0.4769 0.6978 0.8353
No log 4.8148 130 0.6561 0.4774 0.6561 0.8100
No log 4.8889 132 0.6440 0.3833 0.6440 0.8025
No log 4.9630 134 0.8176 0.3676 0.8176 0.9042
No log 5.0370 136 1.1016 0.2702 1.1016 1.0496
No log 5.1111 138 1.0484 0.2604 1.0484 1.0239
No log 5.1852 140 0.7892 0.3373 0.7892 0.8884
No log 5.2593 142 0.6335 0.3754 0.6335 0.7960
No log 5.3333 144 0.7057 0.4618 0.7057 0.8400
No log 5.4074 146 0.8194 0.3604 0.8194 0.9052
No log 5.4815 148 0.8048 0.3844 0.8048 0.8971
No log 5.5556 150 0.6955 0.4352 0.6955 0.8340
No log 5.6296 152 0.6304 0.3530 0.6304 0.7940
No log 5.7037 154 0.7483 0.3547 0.7483 0.8650
No log 5.7778 156 0.8251 0.3825 0.8251 0.9084
No log 5.8519 158 0.7427 0.3986 0.7427 0.8618
No log 5.9259 160 0.6369 0.3396 0.6369 0.7981
No log 6.0 162 0.6091 0.3580 0.6091 0.7804
No log 6.0741 164 0.6288 0.3836 0.6288 0.7930
No log 6.1481 166 0.6447 0.4027 0.6447 0.8029
No log 6.2222 168 0.6446 0.4147 0.6446 0.8029
No log 6.2963 170 0.6527 0.4314 0.6527 0.8079
No log 6.3704 172 0.6597 0.4157 0.6597 0.8122
No log 6.4444 174 0.6194 0.4397 0.6194 0.7870
No log 6.5185 176 0.6011 0.4027 0.6011 0.7753
No log 6.5926 178 0.6465 0.3566 0.6465 0.8041
No log 6.6667 180 0.6837 0.4037 0.6837 0.8268
No log 6.7407 182 0.6994 0.3695 0.6994 0.8363
No log 6.8148 184 0.6349 0.4200 0.6349 0.7968
No log 6.8889 186 0.6114 0.4091 0.6114 0.7819
No log 6.9630 188 0.6327 0.4788 0.6327 0.7954
No log 7.0370 190 0.6438 0.4622 0.6438 0.8023
No log 7.1111 192 0.6467 0.4393 0.6467 0.8042
No log 7.1852 194 0.6156 0.4397 0.6156 0.7846
No log 7.2593 196 0.6118 0.4027 0.6118 0.7822
No log 7.3333 198 0.6222 0.3995 0.6222 0.7888
No log 7.4074 200 0.6498 0.3373 0.6498 0.8061
No log 7.4815 202 0.6324 0.3859 0.6324 0.7952
No log 7.5556 204 0.6179 0.5411 0.6179 0.7860
No log 7.6296 206 0.7482 0.4837 0.7482 0.8650
No log 7.7037 208 0.9337 0.3094 0.9337 0.9663
No log 7.7778 210 0.8671 0.3333 0.8671 0.9312
No log 7.8519 212 0.7100 0.4550 0.7100 0.8426
No log 7.9259 214 0.6247 0.5256 0.6247 0.7904
No log 8.0 216 0.5923 0.4738 0.5923 0.7696
No log 8.0741 218 0.6256 0.5184 0.6256 0.7910
No log 8.1481 220 0.7461 0.4476 0.7461 0.8637
No log 8.2222 222 0.7909 0.4430 0.7909 0.8893
No log 8.2963 224 0.7130 0.5231 0.7130 0.8444
No log 8.3704 226 0.6563 0.5179 0.6563 0.8101
No log 8.4444 228 0.5965 0.5170 0.5965 0.7723
No log 8.5185 230 0.5841 0.5678 0.5841 0.7643
No log 8.5926 232 0.6381 0.5528 0.6381 0.7988
No log 8.6667 234 0.5711 0.4386 0.5711 0.7557
No log 8.7407 236 0.5661 0.4837 0.5661 0.7524
No log 8.8148 238 0.6068 0.4393 0.6068 0.7790
No log 8.8889 240 0.6384 0.4845 0.6384 0.7990
No log 8.9630 242 0.6978 0.4845 0.6978 0.8354
No log 9.0370 244 0.7474 0.4684 0.7474 0.8645
No log 9.1111 246 0.6859 0.4845 0.6859 0.8282
No log 9.1852 248 0.6079 0.4413 0.6079 0.7797
No log 9.2593 250 0.6019 0.4505 0.6019 0.7758
No log 9.3333 252 0.6126 0.5190 0.6126 0.7827
No log 9.4074 254 0.6082 0.4735 0.6082 0.7799
No log 9.4815 256 0.6293 0.5032 0.6293 0.7933
No log 9.5556 258 0.7107 0.4980 0.7107 0.8430
No log 9.6296 260 0.7482 0.4556 0.7482 0.8650
No log 9.7037 262 0.6611 0.4628 0.6611 0.8131
No log 9.7778 264 0.5669 0.5559 0.5669 0.7529
No log 9.8519 266 0.5529 0.5434 0.5529 0.7436
No log 9.9259 268 0.5685 0.5222 0.5685 0.7540
No log 10.0 270 0.5938 0.4740 0.5938 0.7706
No log 10.0741 272 0.6174 0.5081 0.6174 0.7857
No log 10.1481 274 0.6209 0.5081 0.6209 0.7880
No log 10.2222 276 0.5985 0.4504 0.5985 0.7736
No log 10.2963 278 0.5819 0.4624 0.5819 0.7628
No log 10.3704 280 0.6051 0.4698 0.6051 0.7779
No log 10.4444 282 0.6192 0.4624 0.6192 0.7869
No log 10.5185 284 0.6481 0.4504 0.6481 0.8050
No log 10.5926 286 0.6611 0.4587 0.6611 0.8131
No log 10.6667 288 0.6627 0.4737 0.6627 0.8141
No log 10.7407 290 0.6311 0.4681 0.6311 0.7944
No log 10.8148 292 0.5850 0.4700 0.5850 0.7649
No log 10.8889 294 0.5782 0.4484 0.5782 0.7604
No log 10.9630 296 0.5837 0.4484 0.5837 0.7640
No log 11.0370 298 0.5964 0.4949 0.5964 0.7722
No log 11.1111 300 0.6559 0.4622 0.6559 0.8099
No log 11.1852 302 0.7404 0.4218 0.7404 0.8604
No log 11.2593 304 0.8008 0.4315 0.8008 0.8949
No log 11.3333 306 0.8044 0.4315 0.8044 0.8969
No log 11.4074 308 0.7510 0.4085 0.7510 0.8666
No log 11.4815 310 0.7077 0.4094 0.7077 0.8412
No log 11.5556 312 0.7014 0.3518 0.7014 0.8375
No log 11.6296 314 0.7012 0.4058 0.7012 0.8374
No log 11.7037 316 0.6980 0.3724 0.6980 0.8355
No log 11.7778 318 0.8483 0.4684 0.8483 0.9211
No log 11.8519 320 1.0402 0.4040 1.0402 1.0199
No log 11.9259 322 1.0794 0.3347 1.0794 1.0389
No log 12.0 324 1.0302 0.3347 1.0302 1.0150
No log 12.0741 326 0.9542 0.4149 0.9542 0.9768
No log 12.1481 328 0.7926 0.5339 0.7926 0.8903
No log 12.2222 330 0.6559 0.5164 0.6559 0.8099
No log 12.2963 332 0.6202 0.4488 0.6202 0.7875
No log 12.3704 334 0.6486 0.4618 0.6486 0.8053
No log 12.4444 336 0.6437 0.4841 0.6437 0.8023
No log 12.5185 338 0.6109 0.5067 0.6109 0.7816
No log 12.5926 340 0.6341 0.5334 0.6341 0.7963
No log 12.6667 342 0.6674 0.5538 0.6674 0.8169
No log 12.7407 344 0.6475 0.5658 0.6475 0.8047
No log 12.8148 346 0.6762 0.5204 0.6762 0.8223
No log 12.8889 348 0.6917 0.5133 0.6917 0.8317
No log 12.9630 350 0.7243 0.4836 0.7243 0.8511
No log 13.0370 352 0.8373 0.4667 0.8373 0.9150
No log 13.1111 354 0.8840 0.4183 0.8840 0.9402
No log 13.1852 356 0.8461 0.4683 0.8461 0.9198
No log 13.2593 358 0.7745 0.4887 0.7745 0.8801
No log 13.3333 360 0.6859 0.4898 0.6859 0.8282
No log 13.4074 362 0.6085 0.4684 0.6085 0.7801
No log 13.4815 364 0.5940 0.4534 0.5940 0.7707
No log 13.5556 366 0.5812 0.5324 0.5812 0.7624
No log 13.6296 368 0.5957 0.5157 0.5957 0.7718
No log 13.7037 370 0.6026 0.5157 0.6026 0.7763
No log 13.7778 372 0.6127 0.5239 0.6127 0.7828
No log 13.8519 374 0.6049 0.5708 0.6049 0.7777
No log 13.9259 376 0.5906 0.5653 0.5906 0.7685
No log 14.0 378 0.5722 0.5668 0.5722 0.7564
No log 14.0741 380 0.5518 0.5533 0.5518 0.7429
No log 14.1481 382 0.5486 0.5533 0.5486 0.7407
No log 14.2222 384 0.5502 0.5533 0.5502 0.7417
No log 14.2963 386 0.5467 0.4831 0.5467 0.7394
No log 14.3704 388 0.5466 0.5533 0.5466 0.7393
No log 14.4444 390 0.5614 0.5123 0.5614 0.7493
No log 14.5185 392 0.5598 0.5362 0.5598 0.7482
No log 14.5926 394 0.5746 0.4562 0.5746 0.7580
No log 14.6667 396 0.5610 0.5362 0.5610 0.7490
No log 14.7407 398 0.5454 0.4681 0.5454 0.7385
No log 14.8148 400 0.5589 0.4590 0.5589 0.7476
No log 14.8889 402 0.5541 0.4505 0.5541 0.7444
No log 14.9630 404 0.5571 0.5095 0.5571 0.7464
No log 15.0370 406 0.5704 0.5232 0.5704 0.7553
No log 15.1111 408 0.5957 0.4966 0.5957 0.7718
No log 15.1852 410 0.6102 0.4661 0.6102 0.7811
No log 15.2593 412 0.6172 0.4740 0.6172 0.7856
No log 15.3333 414 0.6161 0.4968 0.6161 0.7849
No log 15.4074 416 0.6293 0.4872 0.6293 0.7933
No log 15.4815 418 0.6291 0.4872 0.6291 0.7931
No log 15.5556 420 0.6318 0.4872 0.6318 0.7949
No log 15.6296 422 0.6332 0.5098 0.6332 0.7957
No log 15.7037 424 0.6137 0.4819 0.6137 0.7834
No log 15.7778 426 0.6253 0.4819 0.6253 0.7907
No log 15.8519 428 0.6465 0.5015 0.6465 0.8041
No log 15.9259 430 0.7223 0.4574 0.7223 0.8499
No log 16.0 432 0.8059 0.5132 0.8059 0.8977
No log 16.0741 434 0.7865 0.4952 0.7865 0.8869
No log 16.1481 436 0.7463 0.4648 0.7463 0.8639
No log 16.2222 438 0.7094 0.5133 0.7094 0.8423
No log 16.2963 440 0.7284 0.5170 0.7284 0.8535
No log 16.3704 442 0.6975 0.4920 0.6975 0.8352
No log 16.4444 444 0.6335 0.4968 0.6335 0.7959
No log 16.5185 446 0.6028 0.5150 0.6028 0.7764
No log 16.5926 448 0.5984 0.4659 0.5984 0.7736
No log 16.6667 450 0.6032 0.4719 0.6032 0.7767
No log 16.7407 452 0.6150 0.4777 0.6150 0.7842
No log 16.8148 454 0.6418 0.5308 0.6418 0.8011
No log 16.8889 456 0.6786 0.5342 0.6786 0.8237
No log 16.9630 458 0.7543 0.5155 0.7543 0.8685
No log 17.0370 460 0.7696 0.5155 0.7696 0.8773
No log 17.1111 462 0.6670 0.5726 0.6670 0.8167
No log 17.1852 464 0.5919 0.5528 0.5919 0.7694
No log 17.2593 466 0.5721 0.4640 0.5721 0.7564
No log 17.3333 468 0.5726 0.4919 0.5726 0.7567
No log 17.4074 470 0.5756 0.4596 0.5756 0.7587
No log 17.4815 472 0.5960 0.4964 0.5960 0.7720
No log 17.5556 474 0.6420 0.5063 0.6420 0.8012
No log 17.6296 476 0.6805 0.4827 0.6805 0.8249
No log 17.7037 478 0.6909 0.4979 0.6909 0.8312
No log 17.7778 480 0.6666 0.5063 0.6666 0.8165
No log 17.8519 482 0.6002 0.5177 0.6002 0.7748
No log 17.9259 484 0.5672 0.5362 0.5672 0.7532
No log 18.0 486 0.5502 0.4789 0.5502 0.7417
No log 18.0741 488 0.5568 0.5301 0.5568 0.7462
No log 18.1481 490 0.5659 0.5509 0.5659 0.7523
No log 18.2222 492 0.5624 0.5509 0.5624 0.7499
No log 18.2963 494 0.5549 0.5383 0.5549 0.7449
No log 18.3704 496 0.5663 0.5503 0.5663 0.7525
No log 18.4444 498 0.5792 0.5411 0.5792 0.7611
0.3356 18.5185 500 0.5968 0.5254 0.5968 0.7726
0.3356 18.5926 502 0.6268 0.4513 0.6268 0.7917
0.3356 18.6667 504 0.6170 0.4769 0.6170 0.7855
0.3356 18.7407 506 0.6012 0.5016 0.6012 0.7754
0.3356 18.8148 508 0.5776 0.5485 0.5776 0.7600
0.3356 18.8889 510 0.5614 0.5476 0.5614 0.7493
0.3356 18.9630 512 0.5803 0.4490 0.5803 0.7617
0.3356 19.0370 514 0.6021 0.5013 0.6021 0.7760
0.3356 19.1111 516 0.6147 0.5497 0.6147 0.7840
0.3356 19.1852 518 0.6174 0.5018 0.6174 0.7857
0.3356 19.2593 520 0.6430 0.5060 0.6430 0.8019
0.3356 19.3333 522 0.7021 0.4779 0.7021 0.8379
0.3356 19.4074 524 0.8075 0.5336 0.8075 0.8986
0.3356 19.4815 526 0.7973 0.5377 0.7973 0.8929
0.3356 19.5556 528 0.7069 0.5185 0.7069 0.8408
0.3356 19.6296 530 0.6609 0.5668 0.6609 0.8129
0.3356 19.7037 532 0.6384 0.5134 0.6384 0.7990
0.3356 19.7778 534 0.6457 0.5195 0.6457 0.8035
0.3356 19.8519 536 0.6863 0.5687 0.6863 0.8284
0.3356 19.9259 538 0.7094 0.5524 0.7094 0.8423
0.3356 20.0 540 0.6941 0.5553 0.6941 0.8332
0.3356 20.0741 542 0.6334 0.5653 0.6334 0.7958
0.3356 20.1481 544 0.6120 0.5473 0.6120 0.7823
0.3356 20.2222 546 0.6096 0.5115 0.6096 0.7807
0.3356 20.2963 548 0.6168 0.5272 0.6168 0.7854
0.3356 20.3704 550 0.6122 0.4763 0.6122 0.7824
0.3356 20.4444 552 0.6198 0.4006 0.6198 0.7873
0.3356 20.5185 554 0.6294 0.4763 0.6294 0.7933
0.3356 20.5926 556 0.6530 0.5254 0.6530 0.8081
0.3356 20.6667 558 0.6534 0.5254 0.6534 0.8084

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
