ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k11_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8506
  • Qwk (quadratic weighted kappa): 0.3922
  • Mse (mean squared error): 0.8506
  • Rmse (root mean squared error): 0.9223
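Qwk here is quadratic weighted kappa, an agreement statistic for ordinal labels such as essay scores, and Rmse is simply the square root of Mse (√0.8506 ≈ 0.9223, consistent with the numbers above). A minimal pure-Python sketch of all three metrics; the 0–4 score range and the toy labels below are hypothetical, not the actual evaluation data:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: chance-corrected agreement on ordinal labels."""
    n = len(y_true)
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Expected matrix from the two marginal histograms (outer product / n)
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)] for i in range(n_classes)]
    # Quadratic disagreement weights: off-by-k errors cost k^2
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)] for i in range(n_classes)]
    num = sum(w[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy ordinal scores on a 0-4 scale (hypothetical, for illustration only)
gold = [0, 1, 2, 3, 4, 2, 1]
pred = [0, 1, 2, 2, 4, 3, 1]
print(round(quadratic_weighted_kappa(gold, pred, 5), 4))  # 0.9079
print(round(mse(gold, pred), 4))                          # 0.2857
print(round(math.sqrt(mse(gold, pred)), 4))               # RMSE = sqrt(MSE) = 0.5345
```

Because the quadratic weights penalize large ordinal disagreements far more than off-by-one errors, QWK is a common headline metric for automated essay scoring.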

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
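With lr_scheduler_type: linear, transformers' linear schedule ramps the learning rate up during any warmup steps and then decays it linearly to zero over the total number of optimizer steps. A pure-Python sketch of that schedule; the 52-steps-per-epoch figure is read off the log below (epoch 1.0 lands on step 52), and note that the log stops near epoch 10 even though num_epochs is 100, so the schedule's tail was likely never reached:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule in the style of transformers'
    get_linear_schedule_with_warmup: ramp up over warmup_steps,
    then decay linearly to zero at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# 52 optimizer steps per epoch, so a full 100-epoch run schedules 5200 steps.
total = 52 * 100
print(linear_lr(0, total))           # 2e-05 at the start (no warmup configured)
print(linear_lr(total // 2, total))  # 1e-05 at the midpoint
print(linear_lr(total, total))       # 0.0 at the end
```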

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 4.0729 -0.0033 4.0729 2.0181
No log 0.0769 4 2.4546 -0.0553 2.4546 1.5667
No log 0.1154 6 1.8473 -0.0773 1.8473 1.3592
No log 0.1538 8 1.2596 0.1003 1.2596 1.1223
No log 0.1923 10 1.2313 0.1698 1.2313 1.1097
No log 0.2308 12 1.1099 0.0792 1.1099 1.0535
No log 0.2692 14 1.1135 0.1046 1.1135 1.0552
No log 0.3077 16 1.1180 0.1417 1.1180 1.0573
No log 0.3462 18 1.0967 0.2068 1.0967 1.0472
No log 0.3846 20 1.3614 0.1057 1.3614 1.1668
No log 0.4231 22 1.2397 0.1564 1.2397 1.1134
No log 0.4615 24 1.0416 0.3407 1.0416 1.0206
No log 0.5 26 1.0236 0.2286 1.0236 1.0117
No log 0.5385 28 1.0784 0.2522 1.0784 1.0385
No log 0.5769 30 1.0442 0.2207 1.0442 1.0219
No log 0.6154 32 1.1120 0.2138 1.1120 1.0545
No log 0.6538 34 1.1089 0.2163 1.1089 1.0531
No log 0.6923 36 1.0045 0.3466 1.0045 1.0023
No log 0.7308 38 1.0071 0.2745 1.0071 1.0036
No log 0.7692 40 0.9977 0.2360 0.9977 0.9988
No log 0.8077 42 0.9735 0.3215 0.9735 0.9867
No log 0.8462 44 1.0445 0.3418 1.0445 1.0220
No log 0.8846 46 1.0278 0.3365 1.0278 1.0138
No log 0.9231 48 1.0027 0.3467 1.0027 1.0013
No log 0.9615 50 1.0538 0.3278 1.0538 1.0266
No log 1.0 52 1.0437 0.2933 1.0437 1.0216
No log 1.0385 54 1.0233 0.2667 1.0233 1.0116
No log 1.0769 56 1.0127 0.2057 1.0127 1.0063
No log 1.1154 58 0.9119 0.2619 0.9119 0.9549
No log 1.1538 60 0.8966 0.3610 0.8966 0.9469
No log 1.1923 62 0.8999 0.3773 0.8999 0.9486
No log 1.2308 64 0.9018 0.3651 0.9018 0.9496
No log 1.2692 66 0.8654 0.4038 0.8654 0.9303
No log 1.3077 68 0.8371 0.4124 0.8371 0.9149
No log 1.3462 70 0.8174 0.4641 0.8174 0.9041
No log 1.3846 72 0.8116 0.4 0.8116 0.9009
No log 1.4231 74 0.8153 0.4409 0.8153 0.9029
No log 1.4615 76 0.8708 0.4161 0.8708 0.9332
No log 1.5 78 0.8743 0.4433 0.8743 0.9351
No log 1.5385 80 0.8011 0.4133 0.8011 0.8951
No log 1.5769 82 0.8154 0.3721 0.8154 0.9030
No log 1.6154 84 0.8257 0.3994 0.8257 0.9087
No log 1.6538 86 0.8360 0.4097 0.8360 0.9144
No log 1.6923 88 0.8330 0.4254 0.8330 0.9127
No log 1.7308 90 0.8494 0.4025 0.8494 0.9216
No log 1.7692 92 0.8438 0.4025 0.8438 0.9186
No log 1.8077 94 0.8554 0.4025 0.8554 0.9249
No log 1.8462 96 0.9538 0.4018 0.9538 0.9766
No log 1.8846 98 1.2600 0.3093 1.2600 1.1225
No log 1.9231 100 1.4184 0.1190 1.4184 1.1910
No log 1.9615 102 1.2524 0.1889 1.2524 1.1191
No log 2.0 104 0.9904 0.3503 0.9904 0.9952
No log 2.0385 106 0.8127 0.4794 0.8127 0.9015
No log 2.0769 108 0.8537 0.4656 0.8537 0.9239
No log 2.1154 110 0.8395 0.4204 0.8395 0.9162
No log 2.1538 112 0.8300 0.4204 0.8300 0.9110
No log 2.1923 114 0.8776 0.3107 0.8776 0.9368
No log 2.2308 116 0.9626 0.4020 0.9626 0.9811
No log 2.2692 118 0.9048 0.3785 0.9048 0.9512
No log 2.3077 120 0.8177 0.4359 0.8177 0.9042
No log 2.3462 122 0.8084 0.4772 0.8084 0.8991
No log 2.3846 124 0.8193 0.4261 0.8193 0.9052
No log 2.4231 126 0.8919 0.4186 0.8919 0.9444
No log 2.4615 128 1.1434 0.3744 1.1434 1.0693
No log 2.5 130 1.2687 0.2667 1.2687 1.1264
No log 2.5385 132 1.1824 0.3744 1.1824 1.0874
No log 2.5769 134 0.9450 0.3565 0.9450 0.9721
No log 2.6154 136 0.8150 0.5179 0.8150 0.9028
No log 2.6538 138 0.9933 0.4339 0.9933 0.9966
No log 2.6923 140 0.9726 0.4641 0.9726 0.9862
No log 2.7308 142 0.7921 0.5424 0.7921 0.8900
No log 2.7692 144 0.7509 0.5117 0.7509 0.8665
No log 2.8077 146 0.7951 0.4192 0.7951 0.8917
No log 2.8462 148 0.8196 0.3067 0.8196 0.9053
No log 2.8846 150 0.8782 0.3716 0.8782 0.9371
No log 2.9231 152 0.9014 0.3863 0.9014 0.9494
No log 2.9615 154 0.9009 0.4417 0.9009 0.9492
No log 3.0 156 0.8844 0.4417 0.8844 0.9404
No log 3.0385 158 0.8830 0.3824 0.8830 0.9397
No log 3.0769 160 0.8644 0.3804 0.8644 0.9297
No log 3.1154 162 0.8737 0.3503 0.8737 0.9347
No log 3.1538 164 0.9029 0.3342 0.9029 0.9502
No log 3.1923 166 0.8641 0.3617 0.8641 0.9296
No log 3.2308 168 0.8224 0.3902 0.8224 0.9069
No log 3.2692 170 0.8382 0.4041 0.8382 0.9155
No log 3.3077 172 0.8901 0.2886 0.8901 0.9434
No log 3.3462 174 0.8668 0.4231 0.8668 0.9310
No log 3.3846 176 0.8587 0.4257 0.8587 0.9266
No log 3.4231 178 0.8425 0.4123 0.8425 0.9179
No log 3.4615 180 0.8768 0.3027 0.8768 0.9364
No log 3.5 182 0.9965 0.3041 0.9965 0.9982
No log 3.5385 184 1.0694 0.2481 1.0694 1.0341
No log 3.5769 186 1.0131 0.3119 1.0131 1.0065
No log 3.6154 188 0.9120 0.2865 0.9120 0.9550
No log 3.6538 190 0.9386 0.2886 0.9386 0.9688
No log 3.6923 192 1.0754 0.3571 1.0754 1.0370
No log 3.7308 194 1.1571 0.3744 1.1571 1.0757
No log 3.7692 196 1.0780 0.3514 1.0780 1.0383
No log 3.8077 198 0.9389 0.3472 0.9389 0.9689
No log 3.8462 200 0.8819 0.4385 0.8819 0.9391
No log 3.8846 202 0.8956 0.3347 0.8956 0.9464
No log 3.9231 204 0.9345 0.3044 0.9345 0.9667
No log 3.9615 206 0.9622 0.3063 0.9622 0.9809
No log 4.0 208 0.9839 0.3902 0.9839 0.9919
No log 4.0385 210 1.0268 0.3902 1.0268 1.0133
No log 4.0769 212 0.9668 0.4174 0.9668 0.9832
No log 4.1154 214 0.9084 0.3725 0.9084 0.9531
No log 4.1538 216 0.9029 0.3576 0.9029 0.9502
No log 4.1923 218 0.9083 0.3145 0.9083 0.9530
No log 4.2308 220 0.9647 0.4039 0.9647 0.9822
No log 4.2692 222 0.9535 0.3902 0.9535 0.9765
No log 4.3077 224 0.8790 0.3044 0.8790 0.9375
No log 4.3462 226 0.8314 0.3382 0.8314 0.9118
No log 4.3846 228 0.8209 0.3693 0.8209 0.9060
No log 4.4231 230 0.8224 0.3693 0.8224 0.9069
No log 4.4615 232 0.8735 0.3611 0.8735 0.9346
No log 4.5 234 1.0350 0.2574 1.0350 1.0173
No log 4.5385 236 1.1642 0.3250 1.1642 1.0790
No log 4.5769 238 1.1361 0.3659 1.1361 1.0659
No log 4.6154 240 1.0474 0.4036 1.0474 1.0234
No log 4.6538 242 1.0220 0.4036 1.0220 1.0109
No log 4.6923 244 1.0696 0.2939 1.0696 1.0342
No log 4.7308 246 1.1212 0.1998 1.1212 1.0589
No log 4.7692 248 1.0633 0.1998 1.0633 1.0312
No log 4.8077 250 0.9874 0.1671 0.9874 0.9937
No log 4.8462 252 0.9326 0.3360 0.9326 0.9657
No log 4.8846 254 0.8822 0.3335 0.8822 0.9392
No log 4.9231 256 0.8714 0.3194 0.8714 0.9335
No log 4.9615 258 0.8780 0.3922 0.8780 0.9370
No log 5.0 260 0.9027 0.3169 0.9027 0.9501
No log 5.0385 262 0.9418 0.2505 0.9418 0.9705
No log 5.0769 264 1.0396 0.3519 1.0396 1.0196
No log 5.1154 266 1.0511 0.3654 1.0511 1.0252
No log 5.1538 268 0.9685 0.4214 0.9685 0.9841
No log 5.1923 270 0.9080 0.3992 0.9080 0.9529
No log 5.2308 272 0.9105 0.3740 0.9105 0.9542
No log 5.2692 274 0.9342 0.3622 0.9342 0.9666
No log 5.3077 276 0.9731 0.2748 0.9731 0.9864
No log 5.3462 278 1.0703 0.2175 1.0703 1.0346
No log 5.3846 280 1.2458 0.1170 1.2458 1.1161
No log 5.4231 282 1.3473 0.2864 1.3473 1.1607
No log 5.4615 284 1.2878 0.2627 1.2878 1.1348
No log 5.5 286 1.1141 0.3344 1.1141 1.0555
No log 5.5385 288 0.9475 0.5279 0.9475 0.9734
No log 5.5769 290 0.8416 0.4336 0.8416 0.9174
No log 5.6154 292 0.8474 0.3922 0.8474 0.9206
No log 5.6538 294 0.8920 0.3780 0.8920 0.9445
No log 5.6923 296 0.9741 0.3229 0.9741 0.9870
No log 5.7308 298 1.0811 0.2773 1.0811 1.0397
No log 5.7692 300 1.0999 0.4197 1.0999 1.0487
No log 5.8077 302 1.0484 0.3780 1.0484 1.0239
No log 5.8462 304 0.9417 0.5154 0.9417 0.9704
No log 5.8846 306 0.8513 0.4697 0.8513 0.9226
No log 5.9231 308 0.7906 0.4456 0.7906 0.8891
No log 5.9615 310 0.7741 0.4345 0.7741 0.8798
No log 6.0 312 0.7933 0.4490 0.7933 0.8907
No log 6.0385 314 0.8293 0.3922 0.8293 0.9107
No log 6.0769 316 0.8306 0.3922 0.8306 0.9113
No log 6.1154 318 0.8103 0.4327 0.8103 0.9002
No log 6.1538 320 0.8491 0.4456 0.8491 0.9214
No log 6.1923 322 0.8312 0.5319 0.8312 0.9117
No log 6.2308 324 0.8058 0.4483 0.8058 0.8977
No log 6.2692 326 0.7937 0.4612 0.7937 0.8909
No log 6.3077 328 0.8138 0.4819 0.8138 0.9021
No log 6.3462 330 0.9075 0.4686 0.9075 0.9526
No log 6.3846 332 0.9128 0.4907 0.9128 0.9554
No log 6.4231 334 0.8280 0.4471 0.8280 0.9099
No log 6.4615 336 0.8452 0.4023 0.8452 0.9193
No log 6.5 338 0.9196 0.4460 0.9196 0.9589
No log 6.5385 340 0.9031 0.4943 0.9031 0.9503
No log 6.5769 342 0.8524 0.4612 0.8524 0.9232
No log 6.6154 344 0.8467 0.4101 0.8467 0.9201
No log 6.6538 346 0.8566 0.4101 0.8566 0.9255
No log 6.6923 348 0.8609 0.4101 0.8609 0.9278
No log 6.7308 350 0.8667 0.4729 0.8667 0.9310
No log 6.7692 352 0.9208 0.4335 0.9208 0.9596
No log 6.8077 354 1.0118 0.3462 1.0118 1.0059
No log 6.8462 356 1.0540 0.3677 1.0540 1.0266
No log 6.8846 358 0.9910 0.3654 0.9910 0.9955
No log 6.9231 360 0.8875 0.3192 0.8875 0.9421
No log 6.9615 362 0.8582 0.3658 0.8582 0.9264
No log 7.0 364 0.8421 0.4608 0.8421 0.9177
No log 7.0385 366 0.8536 0.4608 0.8536 0.9239
No log 7.0769 368 0.8692 0.4608 0.8692 0.9323
No log 7.1154 370 0.9083 0.3658 0.9083 0.9530
No log 7.1538 372 1.0066 0.2961 1.0066 1.0033
No log 7.1923 374 1.0977 0.3374 1.0977 1.0477
No log 7.2308 376 1.0776 0.3374 1.0776 1.0381
No log 7.2692 378 0.9933 0.3229 0.9933 0.9966
No log 7.3077 380 0.9622 0.3229 0.9622 0.9809
No log 7.3462 382 0.9750 0.3229 0.9750 0.9874
No log 7.3846 384 0.9293 0.3637 0.9293 0.9640
No log 7.4231 386 0.9239 0.3637 0.9239 0.9612
No log 7.4615 388 0.9345 0.3590 0.9345 0.9667
No log 7.5 390 0.8765 0.3736 0.8765 0.9362
No log 7.5385 392 0.8292 0.3837 0.8292 0.9106
No log 7.5769 394 0.8511 0.3756 0.8511 0.9225
No log 7.6154 396 0.8183 0.4 0.8183 0.9046
No log 7.6538 398 0.8020 0.4230 0.8020 0.8955
No log 7.6923 400 0.7995 0.4160 0.7995 0.8942
No log 7.7308 402 0.8290 0.3733 0.8290 0.9105
No log 7.7692 404 0.8205 0.3733 0.8205 0.9058
No log 7.8077 406 0.7833 0.4086 0.7833 0.8850
No log 7.8462 408 0.7643 0.3590 0.7643 0.8742
No log 7.8846 410 0.7576 0.4345 0.7576 0.8704
No log 7.9231 412 0.8034 0.4444 0.8034 0.8963
No log 7.9615 414 0.8300 0.4300 0.8300 0.9110
No log 8.0 416 0.8709 0.3432 0.8709 0.9332
No log 8.0385 418 0.8679 0.3432 0.8679 0.9316
No log 8.0769 420 0.8851 0.3432 0.8851 0.9408
No log 8.1154 422 0.9187 0.3130 0.9187 0.9585
No log 8.1538 424 0.9596 0.3443 0.9596 0.9796
No log 8.1923 426 0.9517 0.3443 0.9517 0.9756
No log 8.2308 428 0.9214 0.3130 0.9214 0.9599
No log 8.2692 430 0.9552 0.3443 0.9552 0.9774
No log 8.3077 432 0.9673 0.3443 0.9673 0.9835
No log 8.3462 434 0.9631 0.4023 0.9631 0.9814
No log 8.3846 436 0.9257 0.4192 0.9257 0.9621
No log 8.4231 438 0.8505 0.4461 0.8505 0.9222
No log 8.4615 440 0.8117 0.3979 0.8117 0.9010
No log 8.5 442 0.8124 0.3693 0.8124 0.9013
No log 8.5385 444 0.8059 0.3693 0.8059 0.8977
No log 8.5769 446 0.8059 0.3979 0.8059 0.8977
No log 8.6154 448 0.8474 0.4461 0.8474 0.9206
No log 8.6538 450 0.8694 0.4461 0.8694 0.9324
No log 8.6923 452 0.8486 0.4227 0.8486 0.9212
No log 8.7308 454 0.8378 0.3860 0.8378 0.9153
No log 8.7692 456 0.8368 0.4227 0.8368 0.9147
No log 8.8077 458 0.8397 0.3860 0.8397 0.9163
No log 8.8462 460 0.8271 0.3860 0.8271 0.9094
No log 8.8846 462 0.8226 0.3733 0.8226 0.9070
No log 8.9231 464 0.8577 0.3773 0.8577 0.9261
No log 8.9615 466 0.8569 0.3896 0.8569 0.9257
No log 9.0 468 0.8457 0.3876 0.8457 0.9196
No log 9.0385 470 0.8423 0.3446 0.8423 0.9178
No log 9.0769 472 0.8503 0.3446 0.8503 0.9221
No log 9.1154 474 0.8603 0.3314 0.8603 0.9275
No log 9.1538 476 0.8526 0.3717 0.8526 0.9234
No log 9.1923 478 0.8523 0.3435 0.8523 0.9232
No log 9.2308 480 0.8164 0.3858 0.8164 0.9035
No log 9.2692 482 0.7988 0.4133 0.7988 0.8938
No log 9.3077 484 0.8179 0.4234 0.8179 0.9044
No log 9.3462 486 0.8488 0.4025 0.8488 0.9213
No log 9.3846 488 0.8993 0.3590 0.8993 0.9483
No log 9.4231 490 0.9099 0.3590 0.9099 0.9539
No log 9.4615 492 0.8716 0.3153 0.8716 0.9336
No log 9.5 494 0.8709 0.3214 0.8709 0.9332
No log 9.5385 496 0.9225 0.3590 0.9225 0.9605
No log 9.5769 498 0.9827 0.3766 0.9827 0.9913
0.2698 9.6154 500 1.0078 0.4036 1.0078 1.0039
0.2698 9.6538 502 0.9576 0.3956 0.9576 0.9786
0.2698 9.6923 504 0.8931 0.4192 0.8931 0.9450
0.2698 9.7308 506 0.8656 0.3940 0.8656 0.9304
0.2698 9.7692 508 0.8789 0.4060 0.8789 0.9375
0.2698 9.8077 510 0.8980 0.3001 0.8980 0.9476
0.2698 9.8462 512 0.8959 0.3001 0.8959 0.9465
0.2698 9.8846 514 0.8750 0.3663 0.8750 0.9354
0.2698 9.9231 516 0.8456 0.4746 0.8456 0.9195
0.2698 9.9615 518 0.8301 0.3552 0.8301 0.9111
0.2698 10.0 520 0.8343 0.4288 0.8343 0.9134
0.2698 10.0385 522 0.8910 0.4712 0.8910 0.9439
0.2698 10.0769 524 0.9741 0.4439 0.9741 0.9870
0.2698 10.1154 526 1.0110 0.3085 1.0110 1.0055
0.2698 10.1538 528 1.0247 0.2599 1.0247 1.0123
0.2698 10.1923 530 0.9531 0.3085 0.9531 0.9763
0.2698 10.2308 532 0.8749 0.3922 0.8749 0.9354
0.2698 10.2692 534 0.8506 0.3922 0.8506 0.9223

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
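To reproduce this environment, a pinned install along these lines should work (package names are the standard PyPI ones; the +cu118 PyTorch build comes from the official PyTorch wheel index rather than PyPI):

```shell
# Pin the library versions listed above
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
# CUDA 11.8 build of PyTorch from the PyTorch wheel index
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```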
The weights are stored in Safetensors format (about 0.1B parameters, F32 tensors).

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k11_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.