ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for recomputing these metrics follows the list):

  • Loss: 0.6688
  • Qwk: 0.5412
  • Mse: 0.6688
  • Rmse: 0.8178
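
Qwk here is quadratically weighted Cohen's kappa, and Rmse is the square root of Mse (which matches the MSE-style validation loss, so Loss and Mse coincide). A minimal sketch of how these metrics can be recomputed with scikit-learn, assuming gold labels and continuous model outputs on an integer score scale (all values below are illustrative placeholders):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative placeholders: gold organization scores and raw model outputs.
y_true = np.array([3, 2, 4, 1, 3])
y_pred = np.array([2.8, 2.1, 3.6, 1.4, 3.2])

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))

# QWK compares discrete ratings, so continuous outputs are rounded first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Mse: {mse:.4f}  Rmse: {rmse:.4f}  Qwk: {qwk:.4f}")
```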

Model description

More information needed

Intended uses & limitations

More information needed. Judging by the model name, it appears intended for scoring the "organization" trait of Arabic essays; a hypothetical usage sketch follows.
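
A minimal sketch, assuming a single-logit regression head (consistent with the MSE-style validation loss above); the model id is taken from this card and the example text is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = (
    "MayBashendy/"
    "ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k19_task5_organization"
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Placeholder essay text; real inputs would be full Arabic essays.
essay = "تعد القراءة من أهم العادات التي تنمي العقل وتوسع المدارك."

inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Under the regression-head assumption, the single logit is the predicted
# organization score.
print(logits.squeeze().item())
```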

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Trainer sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
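
A sketch of how these settings map onto the transformers Trainer API; the base checkpoint comes from this card, while num_labels=1, the dataset objects, the metric function, and the eval/logging cadence (inferred from the results table below) are assumptions:

```python
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

# num_labels=1 (regression head) is an assumption consistent with the
# MSE-style validation loss reported in this card.
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", num_labels=1
)

# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the transformers defaults,
# matching the optimizer settings listed above.
args = TrainingArguments(
    output_dir="arabert-task5-organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # the results table logs an eval every 2 steps
    eval_steps=2,
    logging_steps=500,      # training loss first appears at step 500 ("No log" before)
)

# `train_ds`, `eval_ds`, and `compute_metrics` are placeholders for the
# (unspecified) tokenized datasets and the Qwk/Mse/Rmse metric function.
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    compute_metrics=compute_metrics,
)
trainer.train()
```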

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0333 | 2 | 4.1552 | 0.0130 | 4.1552 | 2.0384 |
| No log | 0.0667 | 4 | 2.5427 | -0.0274 | 2.5427 | 1.5946 |
| No log | 0.1 | 6 | 1.5145 | -0.0078 | 1.5145 | 1.2306 |
| No log | 0.1333 | 8 | 1.2325 | 0.0761 | 1.2325 | 1.1102 |
| No log | 0.1667 | 10 | 1.2366 | 0.0791 | 1.2366 | 1.1120 |
| No log | 0.2 | 12 | 1.3236 | -0.0278 | 1.3236 | 1.1505 |
| No log | 0.2333 | 14 | 1.9060 | 0.0632 | 1.9060 | 1.3806 |
| No log | 0.2667 | 16 | 1.9723 | 0.0219 | 1.9723 | 1.4044 |
| No log | 0.3 | 18 | 1.7620 | -0.0469 | 1.7620 | 1.3274 |
| No log | 0.3333 | 20 | 1.4636 | 0.0380 | 1.4636 | 1.2098 |
| No log | 0.3667 | 22 | 1.3111 | 0.0318 | 1.3111 | 1.1451 |
| No log | 0.4 | 24 | 1.1310 | 0.1416 | 1.1310 | 1.0635 |
| No log | 0.4333 | 26 | 1.0274 | 0.2166 | 1.0274 | 1.0136 |
| No log | 0.4667 | 28 | 1.0581 | 0.2834 | 1.0581 | 1.0286 |
| No log | 0.5 | 30 | 1.2430 | 0.1142 | 1.2430 | 1.1149 |
| No log | 0.5333 | 32 | 1.1691 | 0.1114 | 1.1691 | 1.0812 |
| No log | 0.5667 | 34 | 1.0593 | 0.2221 | 1.0593 | 1.0292 |
| No log | 0.6 | 36 | 0.9277 | 0.2811 | 0.9277 | 0.9632 |
| No log | 0.6333 | 38 | 0.9421 | 0.2314 | 0.9421 | 0.9706 |
| No log | 0.6667 | 40 | 1.0042 | 0.2263 | 1.0042 | 1.0021 |
| No log | 0.7 | 42 | 0.8985 | 0.4395 | 0.8985 | 0.9479 |
| No log | 0.7333 | 44 | 1.3636 | 0.2707 | 1.3636 | 1.1677 |
| No log | 0.7667 | 46 | 1.7538 | 0.2605 | 1.7538 | 1.3243 |
| No log | 0.8 | 48 | 1.5271 | 0.2972 | 1.5271 | 1.2358 |
| No log | 0.8333 | 50 | 1.0886 | 0.3452 | 1.0886 | 1.0433 |
| No log | 0.8667 | 52 | 0.9047 | 0.3970 | 0.9047 | 0.9511 |
| No log | 0.9 | 54 | 0.8828 | 0.3188 | 0.8828 | 0.9396 |
| No log | 0.9333 | 56 | 0.8672 | 0.4269 | 0.8672 | 0.9312 |
| No log | 0.9667 | 58 | 0.9075 | 0.4135 | 0.9075 | 0.9526 |
| No log | 1.0 | 60 | 1.2217 | 0.3125 | 1.2217 | 1.1053 |
| No log | 1.0333 | 62 | 1.6775 | 0.2867 | 1.6775 | 1.2952 |
| No log | 1.0667 | 64 | 1.6174 | 0.2591 | 1.6174 | 1.2718 |
| No log | 1.1 | 66 | 1.2787 | 0.3323 | 1.2787 | 1.1308 |
| No log | 1.1333 | 68 | 0.9986 | 0.3433 | 0.9986 | 0.9993 |
| No log | 1.1667 | 70 | 0.8736 | 0.4186 | 0.8736 | 0.9347 |
| No log | 1.2 | 72 | 0.8096 | 0.5517 | 0.8096 | 0.8998 |
| No log | 1.2333 | 74 | 0.8008 | 0.5463 | 0.8008 | 0.8948 |
| No log | 1.2667 | 76 | 0.8565 | 0.5418 | 0.8565 | 0.9255 |
| No log | 1.3 | 78 | 0.8999 | 0.5130 | 0.8999 | 0.9486 |
| No log | 1.3333 | 80 | 0.8170 | 0.6217 | 0.8170 | 0.9039 |
| No log | 1.3667 | 82 | 0.9141 | 0.4186 | 0.9141 | 0.9561 |
| No log | 1.4 | 84 | 0.9950 | 0.4575 | 0.9950 | 0.9975 |
| No log | 1.4333 | 86 | 0.8827 | 0.4455 | 0.8827 | 0.9395 |
| No log | 1.4667 | 88 | 0.8350 | 0.5103 | 0.8350 | 0.9138 |
| No log | 1.5 | 90 | 0.8053 | 0.4416 | 0.8053 | 0.8974 |
| No log | 1.5333 | 92 | 0.8375 | 0.3719 | 0.8375 | 0.9152 |
| No log | 1.5667 | 94 | 0.7795 | 0.5503 | 0.7795 | 0.8829 |
| No log | 1.6 | 96 | 0.8842 | 0.4696 | 0.8842 | 0.9403 |
| No log | 1.6333 | 98 | 0.8879 | 0.5137 | 0.8879 | 0.9423 |
| No log | 1.6667 | 100 | 0.7867 | 0.5835 | 0.7867 | 0.8870 |
| No log | 1.7 | 102 | 0.8761 | 0.4479 | 0.8761 | 0.9360 |
| No log | 1.7333 | 104 | 0.9217 | 0.4376 | 0.9217 | 0.9600 |
| No log | 1.7667 | 106 | 0.8571 | 0.4304 | 0.8571 | 0.9258 |
| No log | 1.8 | 108 | 0.8543 | 0.4304 | 0.8543 | 0.9243 |
| No log | 1.8333 | 110 | 0.8711 | 0.4035 | 0.8711 | 0.9333 |
| No log | 1.8667 | 112 | 0.8522 | 0.3673 | 0.8522 | 0.9231 |
| No log | 1.9 | 114 | 0.9402 | 0.4455 | 0.9402 | 0.9696 |
| No log | 1.9333 | 116 | 0.9472 | 0.4681 | 0.9472 | 0.9732 |
| No log | 1.9667 | 118 | 0.8869 | 0.4835 | 0.8869 | 0.9417 |
| No log | 2.0 | 120 | 0.8004 | 0.5117 | 0.8004 | 0.8947 |
| No log | 2.0333 | 122 | 0.7592 | 0.5582 | 0.7592 | 0.8713 |
| No log | 2.0667 | 124 | 0.7539 | 0.5106 | 0.7539 | 0.8683 |
| No log | 2.1 | 126 | 0.7969 | 0.5256 | 0.7969 | 0.8927 |
| No log | 2.1333 | 128 | 0.8067 | 0.5353 | 0.8067 | 0.8981 |
| No log | 2.1667 | 130 | 0.7200 | 0.6229 | 0.7200 | 0.8485 |
| No log | 2.2 | 132 | 0.8109 | 0.5151 | 0.8109 | 0.9005 |
| No log | 2.2333 | 134 | 0.9043 | 0.4668 | 0.9043 | 0.9509 |
| No log | 2.2667 | 136 | 0.7757 | 0.5348 | 0.7757 | 0.8807 |
| No log | 2.3 | 138 | 0.7215 | 0.6435 | 0.7215 | 0.8494 |
| No log | 2.3333 | 140 | 0.8793 | 0.4470 | 0.8793 | 0.9377 |
| No log | 2.3667 | 142 | 0.8358 | 0.4815 | 0.8358 | 0.9142 |
| No log | 2.4 | 144 | 0.7130 | 0.6073 | 0.7130 | 0.8444 |
| No log | 2.4333 | 146 | 0.7363 | 0.5645 | 0.7363 | 0.8581 |
| No log | 2.4667 | 148 | 0.7281 | 0.5442 | 0.7281 | 0.8533 |
| No log | 2.5 | 150 | 0.6912 | 0.6254 | 0.6912 | 0.8314 |
| No log | 2.5333 | 152 | 0.8062 | 0.5083 | 0.8062 | 0.8979 |
| No log | 2.5667 | 154 | 0.9078 | 0.4552 | 0.9078 | 0.9528 |
| No log | 2.6 | 156 | 0.8675 | 0.4923 | 0.8675 | 0.9314 |
| No log | 2.6333 | 158 | 0.7580 | 0.4576 | 0.7580 | 0.8706 |
| No log | 2.6667 | 160 | 0.7112 | 0.5247 | 0.7112 | 0.8433 |
| No log | 2.7 | 162 | 0.6799 | 0.5939 | 0.6799 | 0.8246 |
| No log | 2.7333 | 164 | 0.6618 | 0.6461 | 0.6618 | 0.8135 |
| No log | 2.7667 | 166 | 0.6495 | 0.6561 | 0.6495 | 0.8059 |
| No log | 2.8 | 168 | 0.6707 | 0.6501 | 0.6707 | 0.8190 |
| No log | 2.8333 | 170 | 0.6895 | 0.6493 | 0.6895 | 0.8304 |
| No log | 2.8667 | 172 | 0.6725 | 0.6528 | 0.6725 | 0.8201 |
| No log | 2.9 | 174 | 0.6433 | 0.6032 | 0.6433 | 0.8021 |
| No log | 2.9333 | 176 | 0.6920 | 0.5806 | 0.6920 | 0.8319 |
| No log | 2.9667 | 178 | 0.6637 | 0.5302 | 0.6637 | 0.8147 |
| No log | 3.0 | 180 | 0.7236 | 0.4599 | 0.7236 | 0.8507 |
| No log | 3.0333 | 182 | 0.7658 | 0.4471 | 0.7658 | 0.8751 |
| No log | 3.0667 | 184 | 0.7131 | 0.5231 | 0.7131 | 0.8445 |
| No log | 3.1 | 186 | 0.6565 | 0.5042 | 0.6565 | 0.8102 |
| No log | 3.1333 | 188 | 0.7207 | 0.5255 | 0.7207 | 0.8490 |
| No log | 3.1667 | 190 | 0.7379 | 0.5027 | 0.7379 | 0.8590 |
| No log | 3.2 | 192 | 0.6632 | 0.4975 | 0.6632 | 0.8144 |
| No log | 3.2333 | 194 | 0.6846 | 0.5746 | 0.6846 | 0.8274 |
| No log | 3.2667 | 196 | 0.8359 | 0.4573 | 0.8359 | 0.9142 |
| No log | 3.3 | 198 | 0.9977 | 0.4152 | 0.9977 | 0.9988 |
| No log | 3.3333 | 200 | 0.9992 | 0.3970 | 0.9992 | 0.9996 |
| No log | 3.3667 | 202 | 0.8781 | 0.3956 | 0.8781 | 0.9371 |
| No log | 3.4 | 204 | 0.7929 | 0.4952 | 0.7929 | 0.8905 |
| No log | 3.4333 | 206 | 0.7743 | 0.4697 | 0.7743 | 0.8800 |
| No log | 3.4667 | 208 | 0.7778 | 0.3523 | 0.7778 | 0.8819 |
| No log | 3.5 | 210 | 0.7947 | 0.4318 | 0.7947 | 0.8914 |
| No log | 3.5333 | 212 | 0.7615 | 0.5056 | 0.7615 | 0.8727 |
| No log | 3.5667 | 214 | 0.7164 | 0.5103 | 0.7164 | 0.8464 |
| No log | 3.6 | 216 | 0.6679 | 0.5505 | 0.6679 | 0.8172 |
| No log | 3.6333 | 218 | 0.6852 | 0.5737 | 0.6852 | 0.8278 |
| No log | 3.6667 | 220 | 0.6404 | 0.6025 | 0.6404 | 0.8002 |
| No log | 3.7 | 222 | 0.6256 | 0.5796 | 0.6256 | 0.7909 |
| No log | 3.7333 | 224 | 0.6237 | 0.6133 | 0.6237 | 0.7897 |
| No log | 3.7667 | 226 | 0.6246 | 0.6175 | 0.6246 | 0.7903 |
| No log | 3.8 | 228 | 0.6645 | 0.5917 | 0.6645 | 0.8152 |
| No log | 3.8333 | 230 | 0.6928 | 0.5446 | 0.6928 | 0.8323 |
| No log | 3.8667 | 232 | 0.7155 | 0.5035 | 0.7155 | 0.8459 |
| No log | 3.9 | 234 | 0.8157 | 0.4697 | 0.8157 | 0.9032 |
| No log | 3.9333 | 236 | 0.8513 | 0.4162 | 0.8513 | 0.9227 |
| No log | 3.9667 | 238 | 0.7916 | 0.4028 | 0.7916 | 0.8897 |
| No log | 4.0 | 240 | 0.7346 | 0.4898 | 0.7346 | 0.8571 |
| No log | 4.0333 | 242 | 0.7170 | 0.5331 | 0.7170 | 0.8468 |
| No log | 4.0667 | 244 | 0.7070 | 0.5573 | 0.7070 | 0.8408 |
| No log | 4.1 | 246 | 0.6703 | 0.5794 | 0.6703 | 0.8187 |
| No log | 4.1333 | 248 | 0.6445 | 0.5669 | 0.6445 | 0.8028 |
| No log | 4.1667 | 250 | 0.6245 | 0.6778 | 0.6245 | 0.7902 |
| No log | 4.2 | 252 | 0.6120 | 0.6890 | 0.6120 | 0.7823 |
| No log | 4.2333 | 254 | 0.6144 | 0.6057 | 0.6144 | 0.7838 |
| No log | 4.2667 | 256 | 0.6156 | 0.6787 | 0.6156 | 0.7846 |
| No log | 4.3 | 258 | 0.6629 | 0.6120 | 0.6629 | 0.8142 |
| No log | 4.3333 | 260 | 0.6602 | 0.5314 | 0.6602 | 0.8125 |
| No log | 4.3667 | 262 | 0.6983 | 0.5747 | 0.6983 | 0.8357 |
| No log | 4.4 | 264 | 0.7455 | 0.4858 | 0.7455 | 0.8634 |
| No log | 4.4333 | 266 | 0.7750 | 0.4209 | 0.7750 | 0.8803 |
| No log | 4.4667 | 268 | 0.7991 | 0.4209 | 0.7991 | 0.8939 |
| No log | 4.5 | 270 | 0.7849 | 0.4327 | 0.7849 | 0.8859 |
| No log | 4.5333 | 272 | 0.7184 | 0.4524 | 0.7184 | 0.8476 |
| No log | 4.5667 | 274 | 0.7069 | 0.5315 | 0.7069 | 0.8408 |
| No log | 4.6 | 276 | 0.7286 | 0.5215 | 0.7286 | 0.8536 |
| No log | 4.6333 | 278 | 0.7250 | 0.5188 | 0.7250 | 0.8515 |
| No log | 4.6667 | 280 | 0.7466 | 0.5121 | 0.7466 | 0.8640 |
| No log | 4.7 | 282 | 0.7995 | 0.4696 | 0.7995 | 0.8941 |
| No log | 4.7333 | 284 | 0.7799 | 0.4712 | 0.7799 | 0.8831 |
| No log | 4.7667 | 286 | 0.7189 | 0.5274 | 0.7189 | 0.8479 |
| No log | 4.8 | 288 | 0.7098 | 0.5647 | 0.7098 | 0.8425 |
| No log | 4.8333 | 290 | 0.7003 | 0.5759 | 0.7003 | 0.8368 |
| No log | 4.8667 | 292 | 0.6957 | 0.5656 | 0.6957 | 0.8341 |
| No log | 4.9 | 294 | 0.6886 | 0.5882 | 0.6886 | 0.8298 |
| No log | 4.9333 | 296 | 0.6725 | 0.5831 | 0.6725 | 0.8201 |
| No log | 4.9667 | 298 | 0.6573 | 0.6324 | 0.6573 | 0.8107 |
| No log | 5.0 | 300 | 0.6236 | 0.5994 | 0.6236 | 0.7897 |
| No log | 5.0333 | 302 | 0.6234 | 0.5568 | 0.6234 | 0.7896 |
| No log | 5.0667 | 304 | 0.6322 | 0.4868 | 0.6322 | 0.7951 |
| No log | 5.1 | 306 | 0.6148 | 0.5782 | 0.6148 | 0.7841 |
| No log | 5.1333 | 308 | 0.6191 | 0.6510 | 0.6191 | 0.7869 |
| No log | 5.1667 | 310 | 0.6469 | 0.7084 | 0.6469 | 0.8043 |
| No log | 5.2 | 312 | 0.6684 | 0.6990 | 0.6684 | 0.8175 |
| No log | 5.2333 | 314 | 0.6647 | 0.7033 | 0.6647 | 0.8153 |
| No log | 5.2667 | 316 | 0.6502 | 0.6614 | 0.6502 | 0.8064 |
| No log | 5.3 | 318 | 0.6583 | 0.6119 | 0.6583 | 0.8114 |
| No log | 5.3333 | 320 | 0.6337 | 0.6455 | 0.6337 | 0.7961 |
| No log | 5.3667 | 322 | 0.6174 | 0.5438 | 0.6174 | 0.7858 |
| No log | 5.4 | 324 | 0.6172 | 0.5202 | 0.6172 | 0.7856 |
| No log | 5.4333 | 326 | 0.6220 | 0.5074 | 0.6220 | 0.7887 |
| No log | 5.4667 | 328 | 0.6419 | 0.5107 | 0.6419 | 0.8012 |
| No log | 5.5 | 330 | 0.6594 | 0.5197 | 0.6594 | 0.8120 |
| No log | 5.5333 | 332 | 0.7026 | 0.5439 | 0.7026 | 0.8382 |
| No log | 5.5667 | 334 | 0.6440 | 0.5993 | 0.6440 | 0.8025 |
| No log | 5.6 | 336 | 0.6010 | 0.6174 | 0.6010 | 0.7752 |
| No log | 5.6333 | 338 | 0.6060 | 0.6438 | 0.6060 | 0.7785 |
| No log | 5.6667 | 340 | 0.6005 | 0.6347 | 0.6005 | 0.7749 |
| No log | 5.7 | 342 | 0.6220 | 0.5710 | 0.6220 | 0.7887 |
| No log | 5.7333 | 344 | 0.6647 | 0.5521 | 0.6647 | 0.8153 |
| No log | 5.7667 | 346 | 0.7012 | 0.5455 | 0.7012 | 0.8374 |
| No log | 5.8 | 348 | 0.6624 | 0.5153 | 0.6624 | 0.8139 |
| No log | 5.8333 | 350 | 0.5983 | 0.6452 | 0.5983 | 0.7735 |
| No log | 5.8667 | 352 | 0.6207 | 0.6308 | 0.6207 | 0.7878 |
| No log | 5.9 | 354 | 0.7104 | 0.5372 | 0.7104 | 0.8428 |
| No log | 5.9333 | 356 | 0.7335 | 0.4964 | 0.7335 | 0.8565 |
| No log | 5.9667 | 358 | 0.6511 | 0.6457 | 0.6511 | 0.8069 |
| No log | 6.0 | 360 | 0.6276 | 0.5622 | 0.6276 | 0.7922 |
| No log | 6.0333 | 362 | 0.6767 | 0.4608 | 0.6767 | 0.8226 |
| No log | 6.0667 | 364 | 0.6606 | 0.4503 | 0.6606 | 0.8128 |
| No log | 6.1 | 366 | 0.6472 | 0.4833 | 0.6472 | 0.8045 |
| No log | 6.1333 | 368 | 0.6820 | 0.5828 | 0.6820 | 0.8258 |
| No log | 6.1667 | 370 | 0.6834 | 0.5946 | 0.6834 | 0.8267 |
| No log | 6.2 | 372 | 0.6177 | 0.6025 | 0.6177 | 0.7859 |
| No log | 6.2333 | 374 | 0.5759 | 0.6317 | 0.5759 | 0.7589 |
| No log | 6.2667 | 376 | 0.5826 | 0.6427 | 0.5826 | 0.7633 |
| No log | 6.3 | 378 | 0.5876 | 0.6427 | 0.5876 | 0.7666 |
| No log | 6.3333 | 380 | 0.6114 | 0.5794 | 0.6114 | 0.7819 |
| No log | 6.3667 | 382 | 0.6191 | 0.6129 | 0.6191 | 0.7868 |
| No log | 6.4 | 384 | 0.6274 | 0.5805 | 0.6274 | 0.7921 |
| No log | 6.4333 | 386 | 0.6387 | 0.6014 | 0.6387 | 0.7992 |
| No log | 6.4667 | 388 | 0.6332 | 0.6014 | 0.6332 | 0.7957 |
| No log | 6.5 | 390 | 0.6198 | 0.5455 | 0.6198 | 0.7873 |
| No log | 6.5333 | 392 | 0.6083 | 0.5316 | 0.6083 | 0.7800 |
| No log | 6.5667 | 394 | 0.6186 | 0.5510 | 0.6186 | 0.7865 |
| No log | 6.6 | 396 | 0.6307 | 0.5174 | 0.6307 | 0.7941 |
| No log | 6.6333 | 398 | 0.6532 | 0.5421 | 0.6532 | 0.8082 |
| No log | 6.6667 | 400 | 0.6872 | 0.5542 | 0.6872 | 0.8290 |
| No log | 6.7 | 402 | 0.6899 | 0.4866 | 0.6899 | 0.8306 |
| No log | 6.7333 | 404 | 0.7039 | 0.4494 | 0.7039 | 0.8390 |
| No log | 6.7667 | 406 | 0.7320 | 0.4595 | 0.7320 | 0.8556 |
| No log | 6.8 | 408 | 0.7086 | 0.3922 | 0.7086 | 0.8418 |
| No log | 6.8333 | 410 | 0.6842 | 0.4641 | 0.6842 | 0.8271 |
| No log | 6.8667 | 412 | 0.6873 | 0.5704 | 0.6873 | 0.8290 |
| No log | 6.9 | 414 | 0.6613 | 0.6139 | 0.6613 | 0.8132 |
| No log | 6.9333 | 416 | 0.6428 | 0.6007 | 0.6428 | 0.8017 |
| No log | 6.9667 | 418 | 0.6633 | 0.5777 | 0.6633 | 0.8144 |
| No log | 7.0 | 420 | 0.6277 | 0.6406 | 0.6277 | 0.7922 |
| No log | 7.0333 | 422 | 0.6265 | 0.6238 | 0.6265 | 0.7915 |
| No log | 7.0667 | 424 | 0.6711 | 0.6050 | 0.6711 | 0.8192 |
| No log | 7.1 | 426 | 0.6574 | 0.5706 | 0.6574 | 0.8108 |
| No log | 7.1333 | 428 | 0.6413 | 0.5536 | 0.6413 | 0.8008 |
| No log | 7.1667 | 430 | 0.6608 | 0.4927 | 0.6608 | 0.8129 |
| No log | 7.2 | 432 | 0.7003 | 0.5545 | 0.7003 | 0.8368 |
| No log | 7.2333 | 434 | 0.7134 | 0.5592 | 0.7134 | 0.8446 |
| No log | 7.2667 | 436 | 0.7086 | 0.5224 | 0.7086 | 0.8418 |
| No log | 7.3 | 438 | 0.7039 | 0.5224 | 0.7039 | 0.8390 |
| No log | 7.3333 | 440 | 0.7176 | 0.5016 | 0.7176 | 0.8471 |
| No log | 7.3667 | 442 | 0.7919 | 0.3351 | 0.7919 | 0.8899 |
| No log | 7.4 | 444 | 0.8240 | 0.2721 | 0.8240 | 0.9077 |
| No log | 7.4333 | 446 | 0.8022 | 0.3498 | 0.8022 | 0.8956 |
| No log | 7.4667 | 448 | 0.7522 | 0.4365 | 0.7522 | 0.8673 |
| No log | 7.5 | 450 | 0.7082 | 0.4658 | 0.7082 | 0.8416 |
| No log | 7.5333 | 452 | 0.6827 | 0.4520 | 0.6827 | 0.8263 |
| No log | 7.5667 | 454 | 0.7021 | 0.4995 | 0.7021 | 0.8379 |
| No log | 7.6 | 456 | 0.7271 | 0.4968 | 0.7271 | 0.8527 |
| No log | 7.6333 | 458 | 0.6974 | 0.5221 | 0.6974 | 0.8351 |
| No log | 7.6667 | 460 | 0.6731 | 0.5402 | 0.6731 | 0.8204 |
| No log | 7.7 | 462 | 0.6664 | 0.4381 | 0.6664 | 0.8163 |
| No log | 7.7333 | 464 | 0.6697 | 0.4520 | 0.6697 | 0.8183 |
| No log | 7.7667 | 466 | 0.6786 | 0.5073 | 0.6786 | 0.8238 |
| No log | 7.8 | 468 | 0.6797 | 0.4919 | 0.6797 | 0.8244 |
| No log | 7.8333 | 470 | 0.6781 | 0.5057 | 0.6781 | 0.8235 |
| No log | 7.8667 | 472 | 0.6849 | 0.4960 | 0.6849 | 0.8276 |
| No log | 7.9 | 474 | 0.6879 | 0.5342 | 0.6879 | 0.8294 |
| No log | 7.9333 | 476 | 0.6802 | 0.5441 | 0.6802 | 0.8247 |
| No log | 7.9667 | 478 | 0.7058 | 0.5480 | 0.7058 | 0.8401 |
| No log | 8.0 | 480 | 0.7537 | 0.4737 | 0.7537 | 0.8682 |
| No log | 8.0333 | 482 | 0.7636 | 0.4840 | 0.7636 | 0.8739 |
| No log | 8.0667 | 484 | 0.7620 | 0.4257 | 0.7620 | 0.8729 |
| No log | 8.1 | 486 | 0.7860 | 0.4192 | 0.7860 | 0.8866 |
| No log | 8.1333 | 488 | 0.8290 | 0.3921 | 0.8290 | 0.9105 |
| No log | 8.1667 | 490 | 0.8016 | 0.4054 | 0.8016 | 0.8953 |
| No log | 8.2 | 492 | 0.7339 | 0.4511 | 0.7339 | 0.8567 |
| No log | 8.2333 | 494 | 0.6713 | 0.5939 | 0.6713 | 0.8193 |
| No log | 8.2667 | 496 | 0.6465 | 0.5770 | 0.6465 | 0.8040 |
| No log | 8.3 | 498 | 0.6349 | 0.6021 | 0.6349 | 0.7968 |
| 0.2777 | 8.3333 | 500 | 0.6425 | 0.5586 | 0.6425 | 0.8016 |
| 0.2777 | 8.3667 | 502 | 0.6375 | 0.5696 | 0.6375 | 0.7985 |
| 0.2777 | 8.4 | 504 | 0.6304 | 0.5868 | 0.6304 | 0.7940 |
| 0.2777 | 8.4333 | 506 | 0.6469 | 0.5439 | 0.6469 | 0.8043 |
| 0.2777 | 8.4667 | 508 | 0.6421 | 0.5692 | 0.6421 | 0.8013 |
| 0.2777 | 8.5 | 510 | 0.6388 | 0.5455 | 0.6388 | 0.7992 |
| 0.2777 | 8.5333 | 512 | 0.6929 | 0.5951 | 0.6929 | 0.8324 |
| 0.2777 | 8.5667 | 514 | 0.7227 | 0.5828 | 0.7227 | 0.8501 |
| 0.2777 | 8.6 | 516 | 0.6890 | 0.5951 | 0.6890 | 0.8301 |
| 0.2777 | 8.6333 | 518 | 0.6390 | 0.5581 | 0.6390 | 0.7994 |
| 0.2777 | 8.6667 | 520 | 0.6304 | 0.5783 | 0.6304 | 0.7940 |
| 0.2777 | 8.7 | 522 | 0.6386 | 0.5215 | 0.6386 | 0.7991 |
| 0.2777 | 8.7333 | 524 | 0.6891 | 0.5493 | 0.6891 | 0.8301 |
| 0.2777 | 8.7667 | 526 | 0.7772 | 0.5614 | 0.7772 | 0.8816 |
| 0.2777 | 8.8 | 528 | 0.7674 | 0.5721 | 0.7674 | 0.8760 |
| 0.2777 | 8.8333 | 530 | 0.7102 | 0.5697 | 0.7102 | 0.8427 |
| 0.2777 | 8.8667 | 532 | 0.7002 | 0.5806 | 0.7002 | 0.8368 |
| 0.2777 | 8.9 | 534 | 0.6836 | 0.5446 | 0.6836 | 0.8268 |
| 0.2777 | 8.9333 | 536 | 0.7013 | 0.5078 | 0.7013 | 0.8374 |
| 0.2777 | 8.9667 | 538 | 0.6954 | 0.4692 | 0.6954 | 0.8339 |
| 0.2777 | 9.0 | 540 | 0.6774 | 0.4938 | 0.6774 | 0.8231 |
| 0.2777 | 9.0333 | 542 | 0.6688 | 0.5412 | 0.6688 | 0.8178 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params (Safetensors, F32 tensors)
