ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card lists it as "None"). It achieves the following results on the evaluation set:

  • Loss: 0.9503
  • Qwk (quadratic weighted kappa): 0.3704
  • Mse (mean squared error): 0.9503
  • Rmse (root mean squared error): 0.9748
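The card doesn't state how these metrics were computed; Qwk is conventionally Cohen's kappa with quadratic weights, and Rmse is the square root of Mse. A minimal, self-contained sketch (pure Python, hypothetical helper names, not the card's actual evaluation code):

```python
from math import sqrt

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed rating matrix: counts of (true, predicted) pairs.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms for the expected-by-chance matrix.
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error; Rmse is sqrt(mse(...))."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```

With perfect agreement the kappa is 1.0; with agreement no better than chance it is 0.0, and large (quadratically penalized) disagreements push it negative, as in the first epochs of the table below.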

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
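With lr_scheduler_type: linear, the learning rate decays linearly to zero over the scheduled run. From the results table, one epoch is 70 optimizer steps, so num_epochs: 100 schedules roughly 7,000 steps. A minimal sketch of that schedule, assuming zero warmup steps (no warmup is listed in the card):

```python
def linear_lr(base_lr, step, total_steps, warmup_steps=0):
    """Hypothetical re-implementation of a linear schedule: ramp up over
    warmup_steps (assumed 0 here, since none is listed), then decay
    linearly to zero at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, at step 500 (where the table's first logged training loss appears) the rate would be about 1.86e-05 under these assumptions, reaching zero only at step 7,000.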

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0286 2 4.6151 -0.0163 4.6151 2.1483
No log 0.0571 4 2.8121 -0.0144 2.8121 1.6769
No log 0.0857 6 1.8270 0.0062 1.8270 1.3517
No log 0.1143 8 1.4869 0.0682 1.4869 1.2194
No log 0.1429 10 1.7000 0.0227 1.7000 1.3038
No log 0.1714 12 2.2905 -0.0361 2.2905 1.5134
No log 0.2 14 1.9694 0.0283 1.9694 1.4034
No log 0.2286 16 1.6821 0.0372 1.6821 1.2970
No log 0.2571 18 1.4628 0.0 1.4628 1.2095
No log 0.2857 20 1.5651 -0.0066 1.5651 1.2511
No log 0.3143 22 1.3884 0.0714 1.3884 1.1783
No log 0.3429 24 1.1266 0.1995 1.1266 1.0614
No log 0.3714 26 1.1221 0.1979 1.1221 1.0593
No log 0.4 28 1.0809 0.2097 1.0809 1.0397
No log 0.4286 30 1.1136 0.3603 1.1136 1.0553
No log 0.4571 32 1.1909 0.2342 1.1909 1.0913
No log 0.4857 34 1.2726 0.1904 1.2726 1.1281
No log 0.5143 36 1.3252 0.0714 1.3252 1.1512
No log 0.5429 38 1.2383 0.1959 1.2383 1.1128
No log 0.5714 40 1.1991 0.2149 1.1991 1.0950
No log 0.6 42 1.2218 0.2149 1.2218 1.1054
No log 0.6286 44 1.3912 0.0537 1.3912 1.1795
No log 0.6571 46 1.4341 0.0818 1.4341 1.1976
No log 0.6857 48 1.3335 0.1865 1.3335 1.1548
No log 0.7143 50 1.1640 0.1999 1.1640 1.0789
No log 0.7429 52 1.1191 0.2735 1.1191 1.0579
No log 0.7714 54 1.2055 0.1622 1.2055 1.0980
No log 0.8 56 1.3122 0.1568 1.3122 1.1455
No log 0.8286 58 1.1146 0.2786 1.1146 1.0558
No log 0.8571 60 1.0613 0.3411 1.0613 1.0302
No log 0.8857 62 1.0794 0.3457 1.0794 1.0389
No log 0.9143 64 1.1111 0.3441 1.1111 1.0541
No log 0.9429 66 1.1814 0.2395 1.1814 1.0869
No log 0.9714 68 1.2592 0.2298 1.2592 1.1221
No log 1.0 70 1.5259 0.1901 1.5259 1.2353
No log 1.0286 72 1.5958 0.1949 1.5958 1.2633
No log 1.0571 74 1.5498 0.1949 1.5498 1.2449
No log 1.0857 76 1.3114 0.1838 1.3114 1.1451
No log 1.1143 78 1.1300 0.2654 1.1300 1.0630
No log 1.1429 80 1.0554 0.3714 1.0554 1.0273
No log 1.1714 82 1.0573 0.3087 1.0573 1.0283
No log 1.2 84 1.0711 0.3087 1.0711 1.0349
No log 1.2286 86 1.1590 0.2298 1.1590 1.0766
No log 1.2571 88 1.2904 0.1784 1.2904 1.1359
No log 1.2857 90 1.5114 0.0749 1.5114 1.2294
No log 1.3143 92 1.5233 0.0766 1.5233 1.2342
No log 1.3429 94 1.3694 0.1080 1.3694 1.1702
No log 1.3714 96 1.1658 0.1865 1.1658 1.0797
No log 1.4 98 1.0079 0.3544 1.0079 1.0039
No log 1.4286 100 0.9663 0.3779 0.9663 0.9830
No log 1.4571 102 0.9437 0.3779 0.9437 0.9715
No log 1.4857 104 0.9041 0.4075 0.9041 0.9508
No log 1.5143 106 0.9107 0.3909 0.9107 0.9543
No log 1.5429 108 1.0489 0.3791 1.0489 1.0242
No log 1.5714 110 1.2791 0.3679 1.2791 1.1310
No log 1.6 112 1.3188 0.3907 1.3188 1.1484
No log 1.6286 114 1.4219 0.3659 1.4219 1.1924
No log 1.6571 116 1.3685 0.3898 1.3685 1.1698
No log 1.6857 118 1.1491 0.3839 1.1491 1.0720
No log 1.7143 120 1.0806 0.3763 1.0806 1.0395
No log 1.7429 122 1.0132 0.4338 1.0132 1.0066
No log 1.7714 124 1.0549 0.3330 1.0549 1.0271
No log 1.8 126 0.9481 0.4978 0.9481 0.9737
No log 1.8286 128 0.9422 0.5647 0.9422 0.9706
No log 1.8571 130 1.0265 0.4392 1.0265 1.0132
No log 1.8857 132 0.9991 0.4176 0.9991 0.9995
No log 1.9143 134 0.9463 0.5302 0.9463 0.9728
No log 1.9429 136 0.9528 0.5777 0.9528 0.9761
No log 1.9714 138 0.9447 0.5622 0.9447 0.9719
No log 2.0 140 0.9343 0.5701 0.9343 0.9666
No log 2.0286 142 0.9813 0.4425 0.9813 0.9906
No log 2.0571 144 1.0714 0.3255 1.0714 1.0351
No log 2.0857 146 1.0559 0.3389 1.0559 1.0276
No log 2.1143 148 1.0323 0.3347 1.0323 1.0160
No log 2.1429 150 1.0034 0.4514 1.0034 1.0017
No log 2.1714 152 1.0406 0.3342 1.0406 1.0201
No log 2.2 154 1.0687 0.3793 1.0687 1.0338
No log 2.2286 156 1.0265 0.4463 1.0265 1.0132
No log 2.2571 158 1.1680 0.3232 1.1680 1.0807
No log 2.2857 160 1.2498 0.3361 1.2498 1.1180
No log 2.3143 162 1.0661 0.3634 1.0661 1.0325
No log 2.3429 164 0.9727 0.3802 0.9727 0.9863
No log 2.3714 166 1.0254 0.4074 1.0254 1.0126
No log 2.4 168 0.9761 0.3621 0.9761 0.9880
No log 2.4286 170 0.9217 0.3927 0.9217 0.9601
No log 2.4571 172 0.9735 0.4061 0.9735 0.9866
No log 2.4857 174 1.1712 0.3080 1.1712 1.0822
No log 2.5143 176 1.1696 0.3080 1.1696 1.0815
No log 2.5429 178 0.9836 0.4380 0.9836 0.9917
No log 2.5714 180 0.8916 0.4434 0.8916 0.9442
No log 2.6 182 0.9185 0.4465 0.9185 0.9584
No log 2.6286 184 0.9384 0.4470 0.9384 0.9687
No log 2.6571 186 0.9281 0.3909 0.9281 0.9634
No log 2.6857 188 1.0160 0.3981 1.0160 1.0080
No log 2.7143 190 1.1389 0.3429 1.1389 1.0672
No log 2.7429 192 1.1568 0.3377 1.1568 1.0756
No log 2.7714 194 1.1221 0.3622 1.1221 1.0593
No log 2.8 196 1.0622 0.4155 1.0622 1.0307
No log 2.8286 198 0.9829 0.3942 0.9829 0.9914
No log 2.8571 200 0.9494 0.4137 0.9494 0.9744
No log 2.8857 202 0.9855 0.4606 0.9855 0.9927
No log 2.9143 204 1.1219 0.4701 1.1219 1.0592
No log 2.9429 206 1.0924 0.4453 1.0924 1.0452
No log 2.9714 208 0.9960 0.4392 0.9960 0.9980
No log 3.0 210 1.0384 0.4058 1.0384 1.0190
No log 3.0286 212 1.0931 0.3674 1.0931 1.0455
No log 3.0571 214 1.0780 0.2703 1.0780 1.0383
No log 3.0857 216 1.1083 0.3472 1.1083 1.0528
No log 3.1143 218 1.1700 0.3480 1.1700 1.0816
No log 3.1429 220 1.1326 0.3806 1.1326 1.0642
No log 3.1714 222 1.0810 0.3354 1.0810 1.0397
No log 3.2 224 1.0886 0.2230 1.0886 1.0434
No log 3.2286 226 1.0827 0.3015 1.0827 1.0405
No log 3.2571 228 1.1704 0.3128 1.1704 1.0819
No log 3.2857 230 1.3991 0.2191 1.3991 1.1828
No log 3.3143 232 1.4633 0.2453 1.4633 1.2097
No log 3.3429 234 1.3011 0.2840 1.3011 1.1406
No log 3.3714 236 1.1110 0.3759 1.1110 1.0540
No log 3.4 238 1.0336 0.3486 1.0336 1.0167
No log 3.4286 240 1.0204 0.3969 1.0204 1.0101
No log 3.4571 242 1.0476 0.3516 1.0476 1.0235
No log 3.4857 244 1.1159 0.3078 1.1159 1.0564
No log 3.5143 246 1.1580 0.3590 1.1580 1.0761
No log 3.5429 248 1.2181 0.3202 1.2181 1.1037
No log 3.5714 250 1.1790 0.3549 1.1790 1.0858
No log 3.6 252 1.0970 0.3601 1.0970 1.0474
No log 3.6286 254 0.9996 0.3256 0.9996 0.9998
No log 3.6571 256 0.9635 0.3529 0.9635 0.9816
No log 3.6857 258 1.0094 0.3584 1.0094 1.0047
No log 3.7143 260 1.0350 0.3624 1.0350 1.0174
No log 3.7429 262 1.0446 0.3928 1.0446 1.0221
No log 3.7714 264 0.9997 0.3494 0.9997 0.9999
No log 3.8 266 0.9379 0.4181 0.9379 0.9685
No log 3.8286 268 0.9295 0.4963 0.9295 0.9641
No log 3.8571 270 0.9242 0.4181 0.9242 0.9614
No log 3.8857 272 0.9317 0.4120 0.9317 0.9652
No log 3.9143 274 0.9563 0.3584 0.9563 0.9779
No log 3.9429 276 0.9979 0.3841 0.9979 0.9989
No log 3.9714 278 0.9796 0.4104 0.9796 0.9897
No log 4.0 280 0.9280 0.4658 0.9280 0.9633
No log 4.0286 282 0.9314 0.4535 0.9314 0.9651
No log 4.0571 284 0.9645 0.4792 0.9645 0.9821
No log 4.0857 286 0.9977 0.4082 0.9977 0.9988
No log 4.1143 288 0.9675 0.4328 0.9675 0.9836
No log 4.1429 290 0.9530 0.3953 0.9530 0.9762
No log 4.1714 292 0.9612 0.4123 0.9612 0.9804
No log 4.2 294 1.0426 0.3805 1.0426 1.0211
No log 4.2286 296 1.2338 0.3482 1.2338 1.1108
No log 4.2571 298 1.2266 0.3482 1.2266 1.1075
No log 4.2857 300 1.0627 0.3534 1.0627 1.0309
No log 4.3143 302 0.9520 0.3497 0.9520 0.9757
No log 4.3429 304 0.9375 0.4381 0.9375 0.9683
No log 4.3714 306 0.9347 0.3938 0.9347 0.9668
No log 4.4 308 0.9851 0.3572 0.9851 0.9925
No log 4.4286 310 1.0331 0.3705 1.0331 1.0164
No log 4.4571 312 0.9708 0.3852 0.9708 0.9853
No log 4.4857 314 0.9298 0.3830 0.9298 0.9643
No log 4.5143 316 0.9544 0.3070 0.9544 0.9769
No log 4.5429 318 0.9645 0.3223 0.9645 0.9821
No log 4.5714 320 0.9736 0.2681 0.9736 0.9867
No log 4.6 322 1.0138 0.2986 1.0138 1.0069
No log 4.6286 324 1.0893 0.3696 1.0893 1.0437
No log 4.6571 326 1.0790 0.3601 1.0790 1.0388
No log 4.6857 328 0.9740 0.2795 0.9740 0.9869
No log 4.7143 330 0.9469 0.3643 0.9469 0.9731
No log 4.7429 332 0.9955 0.3285 0.9955 0.9978
No log 4.7714 334 0.9857 0.3196 0.9857 0.9928
No log 4.8 336 0.9250 0.3771 0.9250 0.9618
No log 4.8286 338 0.9571 0.4175 0.9571 0.9783
No log 4.8571 340 0.9950 0.3935 0.9950 0.9975
No log 4.8857 342 0.9603 0.4500 0.9603 0.9800
No log 4.9143 344 0.9578 0.4368 0.9578 0.9787
No log 4.9429 346 0.9388 0.4257 0.9388 0.9689
No log 4.9714 348 0.9587 0.2970 0.9587 0.9791
No log 5.0 350 0.9636 0.3326 0.9636 0.9817
No log 5.0286 352 0.9743 0.3095 0.9743 0.9871
No log 5.0571 354 0.9993 0.3559 0.9993 0.9996
No log 5.0857 356 1.0415 0.3665 1.0415 1.0206
No log 5.1143 358 1.1234 0.3972 1.1234 1.0599
No log 5.1429 360 1.0823 0.3665 1.0823 1.0403
No log 5.1714 362 0.9900 0.3294 0.9900 0.9950
No log 5.2 364 0.9679 0.3539 0.9679 0.9838
No log 5.2286 366 0.9710 0.3974 0.9710 0.9854
No log 5.2571 368 0.9977 0.4 0.9977 0.9989
No log 5.2857 370 1.0032 0.3806 1.0032 1.0016
No log 5.3143 372 0.9730 0.3974 0.9730 0.9864
No log 5.3429 374 0.9775 0.2968 0.9775 0.9887
No log 5.3714 376 0.9898 0.3788 0.9898 0.9949
No log 5.4 378 1.0251 0.3256 1.0251 1.0125
No log 5.4286 380 1.1786 0.4005 1.1786 1.0856
No log 5.4571 382 1.3447 0.2942 1.3447 1.1596
No log 5.4857 384 1.2976 0.2942 1.2976 1.1391
No log 5.5143 386 1.1709 0.3535 1.1709 1.0821
No log 5.5429 388 1.0465 0.3897 1.0465 1.0230
No log 5.5714 390 1.0027 0.2707 1.0027 1.0013
No log 5.6 392 1.0157 0.2904 1.0157 1.0078
No log 5.6286 394 0.9961 0.2707 0.9961 0.9980
No log 5.6571 396 1.0022 0.4615 1.0022 1.0011
No log 5.6857 398 1.1216 0.3883 1.1216 1.0591
No log 5.7143 400 1.2050 0.3293 1.2050 1.0977
No log 5.7429 402 1.1260 0.3584 1.1260 1.0611
No log 5.7714 404 1.0182 0.4234 1.0182 1.0091
No log 5.8 406 1.0262 0.3191 1.0262 1.0130
No log 5.8286 408 1.0579 0.3464 1.0579 1.0285
No log 5.8571 410 1.0181 0.3191 1.0181 1.0090
No log 5.8857 412 0.9701 0.2503 0.9701 0.9849
No log 5.9143 414 1.0008 0.3747 1.0008 1.0004
No log 5.9429 416 1.0229 0.3839 1.0229 1.0114
No log 5.9714 418 0.9874 0.3806 0.9874 0.9937
No log 6.0 420 0.9424 0.3933 0.9424 0.9708
No log 6.0286 422 0.9638 0.2673 0.9638 0.9817
No log 6.0571 424 1.0188 0.3451 1.0188 1.0094
No log 6.0857 426 1.0029 0.3645 1.0029 1.0015
No log 6.1143 428 0.9382 0.2640 0.9382 0.9686
No log 6.1429 430 0.9243 0.2939 0.9243 0.9614
No log 6.1714 432 0.9507 0.3708 0.9507 0.9750
No log 6.2 434 0.9538 0.3708 0.9538 0.9766
No log 6.2286 436 0.9508 0.3663 0.9508 0.9751
No log 6.2571 438 0.9567 0.3557 0.9567 0.9781
No log 6.2857 440 0.9529 0.3145 0.9529 0.9761
No log 6.3143 442 0.9391 0.2743 0.9391 0.9690
No log 6.3429 444 0.9267 0.3263 0.9267 0.9627
No log 6.3714 446 0.9053 0.3511 0.9053 0.9515
No log 6.4 448 0.8823 0.4122 0.8823 0.9393
No log 6.4286 450 0.8698 0.4120 0.8698 0.9326
No log 6.4571 452 0.8615 0.4120 0.8615 0.9282
No log 6.4857 454 0.8553 0.4257 0.8553 0.9248
No log 6.5143 456 0.8524 0.3979 0.8524 0.9233
No log 6.5429 458 0.8708 0.4359 0.8708 0.9332
No log 6.5714 460 0.9282 0.4096 0.9282 0.9634
No log 6.6 462 0.9881 0.4100 0.9881 0.9940
No log 6.6286 464 0.9711 0.3082 0.9711 0.9855
No log 6.6571 466 0.9121 0.4158 0.9121 0.9550
No log 6.6857 468 0.8982 0.3744 0.8982 0.9477
No log 6.7143 470 0.9195 0.3609 0.9195 0.9589
No log 6.7429 472 0.9110 0.3408 0.9110 0.9545
No log 6.7714 474 0.9210 0.3408 0.9210 0.9597
No log 6.8 476 0.9380 0.3190 0.9380 0.9685
No log 6.8286 478 0.9623 0.3337 0.9623 0.9810
No log 6.8571 480 0.9661 0.3583 0.9661 0.9829
No log 6.8857 482 0.9660 0.3869 0.9660 0.9829
No log 6.9143 484 1.0447 0.3843 1.0447 1.0221
No log 6.9429 486 1.0984 0.3710 1.0984 1.0480
No log 6.9714 488 1.0134 0.3634 1.0134 1.0067
No log 7.0 490 0.9665 0.3542 0.9665 0.9831
No log 7.0286 492 0.9447 0.3527 0.9447 0.9720
No log 7.0571 494 0.9440 0.3571 0.9440 0.9716
No log 7.0857 496 0.9912 0.4116 0.9912 0.9956
No log 7.1143 498 1.0466 0.3875 1.0466 1.0230
0.3733 7.1429 500 1.1242 0.3955 1.1242 1.0603
0.3733 7.1714 502 1.1250 0.3650 1.1250 1.0607
0.3733 7.2 504 1.0117 0.4283 1.0117 1.0058
0.3733 7.2286 506 0.9447 0.4435 0.9447 0.9720
0.3733 7.2571 508 0.9213 0.4004 0.9213 0.9598
0.3733 7.2857 510 0.9356 0.3942 0.9356 0.9673
0.3733 7.3143 512 0.9202 0.4175 0.9202 0.9593
0.3733 7.3429 514 0.9075 0.3714 0.9075 0.9526
0.3733 7.3714 516 0.9172 0.3714 0.9172 0.9577
0.3733 7.4 518 0.9488 0.3602 0.9488 0.9741
0.3733 7.4286 520 0.9703 0.3256 0.9703 0.9850
0.3733 7.4571 522 0.9503 0.3704 0.9503 0.9748
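Validation Loss and Mse are identical in every row, which suggests a single-output regression head trained with MSE; Qwk would then be computed after mapping the continuous scores back to integer grades. A hedged sketch of that mapping (the helper name and the grade range 0..n_classes-1 are assumptions, not confirmed by the card):

```python
def scores_to_labels(scores, n_classes):
    """Clip and round continuous regression outputs to integer grades
    before computing QWK -- a common setup when the loss equals MSE,
    as in the table above."""
    return [min(n_classes - 1, max(0, round(s))) for s in scores]
```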

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task2_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.