ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k20_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.5757
  • Qwk: 0.5065
  • Mse: 0.5757
  • Rmse: 0.7587
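
These metrics can presumably be reproduced with standard tooling; the sketch below assumes gold scores and model predictions are available as plain Python lists, and the rounding step before QWK is an assumption (quadratic weighted kappa is defined over discrete labels).

```python
# Minimal metric sketch; `y_true` and `y_pred` are hypothetical placeholders.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [3, 2, 4, 1]          # hypothetical gold organization scores
y_pred = [2.8, 2.1, 3.6, 1.4]  # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
# QWK needs discrete labels, so continuous predictions are rounded first (assumption).
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```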

Model description

More information needed

Intended uses & limitations

More information needed
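
Pending details from the authors, a minimal loading sketch is given below. The assumption that the checkpoint exposes a single regression-style score head is inferred from the MSE/QWK metrics above and is not confirmed by the card.

```python
# Minimal usage sketch (single regression-style output head is an assumption).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k20_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)  # placeholder essay text
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```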

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
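
For reference, these settings correspond roughly to the Trainer configuration sketched below; this is not the authors' exact training script, and `model`, `train_ds`, and `eval_ds` are hypothetical placeholders.

```python
# Rough reconstruction of the listed hyperparameters with the Hugging Face Trainer API.
from transformers import Trainer, TrainingArguments

args = TrainingArguments(
    output_dir="outputs",            # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```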

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0198 2 2.7029 -0.0407 2.7029 1.6441
No log 0.0396 4 1.3029 0.0750 1.3029 1.1415
No log 0.0594 6 1.0034 -0.0970 1.0034 1.0017
No log 0.0792 8 1.0302 0.1011 1.0302 1.0150
No log 0.0990 10 1.0804 0.1312 1.0804 1.0394
No log 0.1188 12 0.8393 0.2116 0.8393 0.9161
No log 0.1386 14 0.8154 0.3020 0.8154 0.9030
No log 0.1584 16 0.7451 0.1139 0.7451 0.8632
No log 0.1782 18 0.7407 0.1187 0.7407 0.8606
No log 0.1980 20 0.7469 0.1617 0.7469 0.8643
No log 0.2178 22 0.7250 -0.0027 0.7250 0.8515
No log 0.2376 24 0.8446 0.1739 0.8446 0.9190
No log 0.2574 26 0.9423 0.2613 0.9423 0.9707
No log 0.2772 28 0.9466 0.2942 0.9466 0.9729
No log 0.2970 30 0.7846 0.2361 0.7846 0.8858
No log 0.3168 32 0.7508 0.1790 0.7508 0.8665
No log 0.3366 34 0.7674 0.1308 0.7674 0.8760
No log 0.3564 36 0.7481 0.2576 0.7481 0.8649
No log 0.3762 38 0.8317 0.1660 0.8317 0.9120
No log 0.3960 40 1.2931 0.0540 1.2931 1.1371
No log 0.4158 42 1.5565 0.0 1.5565 1.2476
No log 0.4356 44 1.4560 0.0150 1.4560 1.2066
No log 0.4554 46 1.2021 0.1251 1.2021 1.0964
No log 0.4752 48 1.0903 0.1808 1.0903 1.0442
No log 0.4950 50 1.0136 0.2460 1.0136 1.0068
No log 0.5149 52 0.9725 0.3230 0.9725 0.9861
No log 0.5347 54 0.9886 0.2942 0.9886 0.9943
No log 0.5545 56 0.9970 0.3598 0.9970 0.9985
No log 0.5743 58 1.0450 0.3192 1.0450 1.0223
No log 0.5941 60 1.0476 0.3926 1.0476 1.0235
No log 0.6139 62 0.9855 0.2109 0.9855 0.9927
No log 0.6337 64 0.9559 0.1358 0.9559 0.9777
No log 0.6535 66 0.9849 0.1345 0.9849 0.9924
No log 0.6733 68 0.9461 0.1315 0.9461 0.9727
No log 0.6931 70 0.9227 0.0460 0.9227 0.9606
No log 0.7129 72 0.9739 0.0509 0.9739 0.9868
No log 0.7327 74 1.0132 0.1754 1.0132 1.0066
No log 0.7525 76 0.9870 0.2132 0.9870 0.9935
No log 0.7723 78 0.9496 0.2132 0.9496 0.9745
No log 0.7921 80 0.8729 0.1770 0.8729 0.9343
No log 0.8119 82 0.8443 0.1754 0.8443 0.9189
No log 0.8317 84 0.8091 0.1315 0.8091 0.8995
No log 0.8515 86 0.7931 0.1315 0.7931 0.8905
No log 0.8713 88 0.7530 0.1181 0.7530 0.8678
No log 0.8911 90 0.7418 0.2181 0.7418 0.8613
No log 0.9109 92 0.7287 0.2540 0.7287 0.8536
No log 0.9307 94 0.8058 0.2031 0.8058 0.8976
No log 0.9505 96 0.9283 0.3094 0.9283 0.9635
No log 0.9703 98 1.0943 0.2746 1.0943 1.0461
No log 0.9901 100 1.0768 0.2820 1.0768 1.0377
No log 1.0099 102 1.0485 0.2651 1.0485 1.0240
No log 1.0297 104 0.9823 0.3270 0.9823 0.9911
No log 1.0495 106 0.9121 0.1724 0.9121 0.9550
No log 1.0693 108 0.9188 0.2109 0.9188 0.9585
No log 1.0891 110 0.9236 0.1739 0.9236 0.9610
No log 1.1089 112 0.9011 0.0919 0.9011 0.9493
No log 1.1287 114 0.8283 0.1660 0.8283 0.9101
No log 1.1485 116 0.8417 0.1660 0.8417 0.9175
No log 1.1683 118 0.9088 0.2375 0.9088 0.9533
No log 1.1881 120 0.9220 0.2920 0.9220 0.9602
No log 1.2079 122 0.8915 0.3151 0.8915 0.9442
No log 1.2277 124 0.9269 0.2678 0.9269 0.9627
No log 1.2475 126 0.8747 0.2711 0.8747 0.9353
No log 1.2673 128 0.8088 0.3425 0.8088 0.8993
No log 1.2871 130 0.7763 0.3096 0.7763 0.8811
No log 1.3069 132 0.8049 0.3467 0.8049 0.8972
No log 1.3267 134 0.8665 0.2652 0.8665 0.9309
No log 1.3465 136 0.9232 0.2651 0.9232 0.9608
No log 1.3663 138 0.8843 0.2977 0.8843 0.9404
No log 1.3861 140 0.7945 0.2991 0.7945 0.8913
No log 1.4059 142 0.7424 0.3373 0.7424 0.8616
No log 1.4257 144 0.7402 0.3279 0.7402 0.8604
No log 1.4455 146 0.7636 0.2879 0.7636 0.8739
No log 1.4653 148 0.8457 0.3410 0.8457 0.9196
No log 1.4851 150 0.8606 0.3761 0.8606 0.9277
No log 1.5050 152 0.7810 0.2874 0.7810 0.8837
No log 1.5248 154 0.7446 0.3171 0.7446 0.8629
No log 1.5446 156 0.8232 0.2564 0.8232 0.9073
No log 1.5644 158 0.7910 0.3475 0.7910 0.8894
No log 1.5842 160 0.7625 0.2270 0.7625 0.8732
No log 1.6040 162 0.7997 0.2048 0.7997 0.8943
No log 1.6238 164 0.8766 0.3313 0.8766 0.9363
No log 1.6436 166 0.9486 0.2369 0.9486 0.9740
No log 1.6634 168 0.8929 0.2730 0.8929 0.9449
No log 1.6832 170 0.8233 0.3392 0.8233 0.9074
No log 1.7030 172 0.8027 0.4051 0.8027 0.8960
No log 1.7228 174 0.7956 0.3925 0.7956 0.8920
No log 1.7426 176 0.8142 0.3661 0.8142 0.9023
No log 1.7624 178 0.8464 0.3239 0.8464 0.9200
No log 1.7822 180 1.0085 0.2474 1.0085 1.0042
No log 1.8020 182 1.0468 0.2191 1.0468 1.0231
No log 1.8218 184 0.9121 0.3216 0.9121 0.9551
No log 1.8416 186 0.8771 0.3312 0.8771 0.9366
No log 1.8614 188 0.9225 0.3176 0.9225 0.9605
No log 1.8812 190 0.8439 0.3827 0.8439 0.9186
No log 1.9010 192 0.7792 0.4170 0.7792 0.8827
No log 1.9208 194 0.7144 0.4410 0.7144 0.8452
No log 1.9406 196 0.6629 0.3972 0.6629 0.8142
No log 1.9604 198 0.6423 0.4619 0.6423 0.8014
No log 1.9802 200 0.6377 0.4179 0.6377 0.7986
No log 2.0 202 0.6638 0.4037 0.6638 0.8147
No log 2.0198 204 0.6475 0.4179 0.6475 0.8046
No log 2.0396 206 0.6886 0.3863 0.6886 0.8298
No log 2.0594 208 0.6789 0.3818 0.6789 0.8239
No log 2.0792 210 0.7035 0.3387 0.7035 0.8387
No log 2.0990 212 0.7583 0.3913 0.7583 0.8708
No log 2.1188 214 0.7061 0.3387 0.7061 0.8403
No log 2.1386 216 0.7014 0.4387 0.7014 0.8375
No log 2.1584 218 0.7039 0.4887 0.7039 0.8390
No log 2.1782 220 0.7209 0.4090 0.7209 0.8491
No log 2.1980 222 0.7514 0.3726 0.7514 0.8668
No log 2.2178 224 0.7279 0.4102 0.7279 0.8532
No log 2.2376 226 0.7068 0.4638 0.7068 0.8407
No log 2.2574 228 0.6923 0.4692 0.6923 0.8321
No log 2.2772 230 0.6602 0.4750 0.6602 0.8125
No log 2.2970 232 0.7080 0.3844 0.7080 0.8414
No log 2.3168 234 0.7039 0.3609 0.7039 0.8390
No log 2.3366 236 0.6586 0.4743 0.6586 0.8115
No log 2.3564 238 0.6748 0.4391 0.6748 0.8215
No log 2.3762 240 0.7448 0.3455 0.7448 0.8630
No log 2.3960 242 0.7671 0.3082 0.7671 0.8759
No log 2.4158 244 0.7480 0.4242 0.7480 0.8648
No log 2.4356 246 0.7176 0.4241 0.7176 0.8471
No log 2.4554 248 0.7316 0.3933 0.7316 0.8553
No log 2.4752 250 0.7321 0.3973 0.7321 0.8556
No log 2.4950 252 0.7214 0.3981 0.7214 0.8494
No log 2.5149 254 0.7225 0.4051 0.7225 0.8500
No log 2.5347 256 0.7309 0.3579 0.7309 0.8549
No log 2.5545 258 0.7467 0.4621 0.7467 0.8641
No log 2.5743 260 0.7284 0.4833 0.7284 0.8535
No log 2.5941 262 0.6809 0.4384 0.6809 0.8252
No log 2.6139 264 0.6946 0.4587 0.6946 0.8334
No log 2.6337 266 0.7342 0.4334 0.7342 0.8569
No log 2.6535 268 0.6463 0.4157 0.6463 0.8039
No log 2.6733 270 0.6006 0.5421 0.6006 0.7750
No log 2.6931 272 0.6245 0.5120 0.6245 0.7903
No log 2.7129 274 0.6524 0.4205 0.6524 0.8077
No log 2.7327 276 0.6826 0.4706 0.6826 0.8262
No log 2.7525 278 0.7164 0.4794 0.7164 0.8464
No log 2.7723 280 0.7004 0.4501 0.7004 0.8369
No log 2.7921 282 0.7120 0.4292 0.7120 0.8438
No log 2.8119 284 0.7625 0.4789 0.7625 0.8732
No log 2.8317 286 0.8491 0.3579 0.8491 0.9215
No log 2.8515 288 0.7982 0.3538 0.7982 0.8934
No log 2.8713 290 0.7237 0.4727 0.7237 0.8507
No log 2.8911 292 0.6892 0.4766 0.6892 0.8302
No log 2.9109 294 0.6657 0.4824 0.6657 0.8159
No log 2.9307 296 0.6661 0.4711 0.6661 0.8161
No log 2.9505 298 0.6944 0.3717 0.6944 0.8333
No log 2.9703 300 0.7331 0.3655 0.7331 0.8562
No log 2.9901 302 0.7095 0.4817 0.7095 0.8423
No log 3.0099 304 0.6308 0.5628 0.6308 0.7942
No log 3.0297 306 0.6005 0.5271 0.6005 0.7749
No log 3.0495 308 0.5984 0.5285 0.5984 0.7736
No log 3.0693 310 0.6226 0.5332 0.6226 0.7890
No log 3.0891 312 0.6435 0.5498 0.6435 0.8022
No log 3.1089 314 0.6469 0.5589 0.6469 0.8043
No log 3.1287 316 0.6571 0.5589 0.6571 0.8106
No log 3.1485 318 0.6455 0.5656 0.6455 0.8034
No log 3.1683 320 0.6920 0.5175 0.6920 0.8319
No log 3.1881 322 0.7113 0.4670 0.7113 0.8434
No log 3.2079 324 0.6353 0.5392 0.6353 0.7971
No log 3.2277 326 0.6129 0.4898 0.6129 0.7829
No log 3.2475 328 0.6118 0.5796 0.6118 0.7821
No log 3.2673 330 0.6810 0.5003 0.6810 0.8252
No log 3.2871 332 0.7654 0.3866 0.7654 0.8749
No log 3.3069 334 0.7243 0.4297 0.7243 0.8510
No log 3.3267 336 0.6082 0.6431 0.6082 0.7799
No log 3.3465 338 0.5868 0.4986 0.5868 0.7660
No log 3.3663 340 0.6537 0.4353 0.6537 0.8085
No log 3.3861 342 0.6116 0.4534 0.6116 0.7820
No log 3.4059 344 0.5535 0.5306 0.5535 0.7440
No log 3.4257 346 0.6153 0.4967 0.6153 0.7844
No log 3.4455 348 0.8461 0.2933 0.8461 0.9198
No log 3.4653 350 1.0214 0.2746 1.0214 1.0106
No log 3.4851 352 1.0504 0.2723 1.0504 1.0249
No log 3.5050 354 0.9324 0.3099 0.9324 0.9656
No log 3.5248 356 0.7342 0.4756 0.7342 0.8568
No log 3.5446 358 0.6022 0.5604 0.6022 0.7760
No log 3.5644 360 0.6379 0.4737 0.6379 0.7987
No log 3.5842 362 0.6149 0.5164 0.6149 0.7842
No log 3.6040 364 0.5855 0.5348 0.5855 0.7652
No log 3.6238 366 0.6287 0.4898 0.6287 0.7929
No log 3.6436 368 0.6030 0.5078 0.6030 0.7765
No log 3.6634 370 0.5589 0.5580 0.5589 0.7476
No log 3.6832 372 0.5508 0.5505 0.5508 0.7422
No log 3.7030 374 0.5535 0.4493 0.5535 0.7440
No log 3.7228 376 0.5648 0.5034 0.5648 0.7515
No log 3.7426 378 0.5351 0.5159 0.5351 0.7315
No log 3.7624 380 0.5309 0.5768 0.5309 0.7286
No log 3.7822 382 0.5405 0.5159 0.5405 0.7352
No log 3.8020 384 0.5442 0.5548 0.5442 0.7377
No log 3.8218 386 0.5613 0.5953 0.5613 0.7492
No log 3.8416 388 0.5813 0.5648 0.5813 0.7624
No log 3.8614 390 0.5937 0.5987 0.5937 0.7705
No log 3.8812 392 0.5819 0.6092 0.5819 0.7628
No log 3.9010 394 0.5739 0.5679 0.5739 0.7575
No log 3.9208 396 0.5720 0.5679 0.5720 0.7563
No log 3.9406 398 0.5754 0.5880 0.5754 0.7585
No log 3.9604 400 0.5795 0.5880 0.5795 0.7612
No log 3.9802 402 0.5759 0.5982 0.5759 0.7589
No log 4.0 404 0.5749 0.6052 0.5749 0.7582
No log 4.0198 406 0.5979 0.5306 0.5979 0.7732
No log 4.0396 408 0.5934 0.5093 0.5934 0.7703
No log 4.0594 410 0.5685 0.5662 0.5685 0.7540
No log 4.0792 412 0.5930 0.5642 0.5930 0.7700
No log 4.0990 414 0.6237 0.5030 0.6237 0.7898
No log 4.1188 416 0.6022 0.5363 0.6022 0.7760
No log 4.1386 418 0.5711 0.6371 0.5711 0.7557
No log 4.1584 420 0.6093 0.4282 0.6093 0.7806
No log 4.1782 422 0.6047 0.4644 0.6047 0.7776
No log 4.1980 424 0.5622 0.6024 0.5622 0.7498
No log 4.2178 426 0.6125 0.4321 0.6125 0.7826
No log 4.2376 428 0.7242 0.3747 0.7242 0.8510
No log 4.2574 430 0.7764 0.3965 0.7764 0.8811
No log 4.2772 432 0.7206 0.4268 0.7206 0.8489
No log 4.2970 434 0.6801 0.4773 0.6801 0.8247
No log 4.3168 436 0.7025 0.4719 0.7025 0.8382
No log 4.3366 438 0.7577 0.4367 0.7577 0.8705
No log 4.3564 440 0.7106 0.4719 0.7106 0.8430
No log 4.3762 442 0.5929 0.5315 0.5929 0.7700
No log 4.3960 444 0.5292 0.5855 0.5292 0.7275
No log 4.4158 446 0.5266 0.6130 0.5266 0.7257
No log 4.4356 448 0.5439 0.6235 0.5439 0.7375
No log 4.4554 450 0.5947 0.5421 0.5947 0.7712
No log 4.4752 452 0.6259 0.5058 0.6259 0.7912
No log 4.4950 454 0.6726 0.5252 0.6726 0.8201
No log 4.5149 456 0.7482 0.5088 0.7482 0.8650
No log 4.5347 458 0.7401 0.4906 0.7401 0.8603
No log 4.5545 460 0.6298 0.5373 0.6298 0.7936
No log 4.5743 462 0.5373 0.5718 0.5373 0.7330
No log 4.5941 464 0.5198 0.5861 0.5198 0.7210
No log 4.6139 466 0.5198 0.5781 0.5198 0.7210
No log 4.6337 468 0.5266 0.5702 0.5266 0.7257
No log 4.6535 470 0.5319 0.5702 0.5319 0.7293
No log 4.6733 472 0.5652 0.5855 0.5652 0.7518
No log 4.6931 474 0.5687 0.6235 0.5687 0.7541
No log 4.7129 476 0.5509 0.5722 0.5509 0.7422
No log 4.7327 478 0.5637 0.4795 0.5637 0.7508
No log 4.7525 480 0.5872 0.4901 0.5872 0.7663
No log 4.7723 482 0.5637 0.4838 0.5637 0.7508
No log 4.7921 484 0.5689 0.4901 0.5689 0.7543
No log 4.8119 486 0.5639 0.4596 0.5639 0.7509
No log 4.8317 488 0.5521 0.5421 0.5521 0.7430
No log 4.8515 490 0.5575 0.5634 0.5575 0.7466
No log 4.8713 492 0.5613 0.4659 0.5613 0.7492
No log 4.8911 494 0.5743 0.5141 0.5743 0.7578
No log 4.9109 496 0.5758 0.5214 0.5758 0.7588
No log 4.9307 498 0.5957 0.4808 0.5957 0.7718
0.3815 4.9505 500 0.5802 0.5214 0.5802 0.7617
0.3815 4.9703 502 0.5707 0.5141 0.5707 0.7554
0.3815 4.9901 504 0.5507 0.5336 0.5507 0.7421
0.3815 5.0099 506 0.5595 0.5683 0.5595 0.7480
0.3815 5.0297 508 0.5587 0.5868 0.5587 0.7475
0.3815 5.0495 510 0.5379 0.5336 0.5379 0.7334
0.3815 5.0693 512 0.5775 0.4618 0.5775 0.7599
0.3815 5.0891 514 0.6341 0.4353 0.6341 0.7963
0.3815 5.1089 516 0.6029 0.4845 0.6029 0.7764
0.3815 5.1287 518 0.5268 0.5214 0.5268 0.7258
0.3815 5.1485 520 0.5335 0.6235 0.5335 0.7304
0.3815 5.1683 522 0.6261 0.4346 0.6261 0.7913
0.3815 5.1881 524 0.6522 0.4511 0.6522 0.8076
0.3815 5.2079 526 0.6015 0.4987 0.6015 0.7755
0.3815 5.2277 528 0.5240 0.6305 0.5240 0.7239
0.3815 5.2475 530 0.5190 0.6142 0.5190 0.7204
0.3815 5.2673 532 0.5216 0.5926 0.5216 0.7222
0.3815 5.2871 534 0.5249 0.6156 0.5249 0.7245
0.3815 5.3069 536 0.5511 0.5283 0.5511 0.7423
0.3815 5.3267 538 0.5889 0.4892 0.5889 0.7674
0.3815 5.3465 540 0.5871 0.5090 0.5871 0.7662
0.3815 5.3663 542 0.5639 0.5267 0.5639 0.7510
0.3815 5.3861 544 0.5586 0.6292 0.5586 0.7474
0.3815 5.4059 546 0.5644 0.5718 0.5644 0.7513
0.3815 5.4257 548 0.5619 0.6127 0.5619 0.7496
0.3815 5.4455 550 0.5712 0.5078 0.5712 0.7558
0.3815 5.4653 552 0.6056 0.5184 0.6056 0.7782
0.3815 5.4851 554 0.6023 0.5184 0.6023 0.7761
0.3815 5.5050 556 0.5667 0.5956 0.5667 0.7528
0.3815 5.5248 558 0.5562 0.6317 0.5562 0.7458
0.3815 5.5446 560 0.5620 0.6317 0.5620 0.7497
0.3815 5.5644 562 0.5874 0.5414 0.5874 0.7664
0.3815 5.5842 564 0.6383 0.4794 0.6383 0.7989
0.3815 5.6040 566 0.6372 0.4429 0.6372 0.7983
0.3815 5.6238 568 0.5867 0.5438 0.5867 0.7660
0.3815 5.6436 570 0.5609 0.4137 0.5609 0.7489
0.3815 5.6634 572 0.5688 0.4124 0.5688 0.7542
0.3815 5.6832 574 0.5638 0.4229 0.5638 0.7509
0.3815 5.7030 576 0.5653 0.4444 0.5653 0.7519
0.3815 5.7228 578 0.5757 0.5065 0.5757 0.7587

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params (Safetensors, F32)