ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k11_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset field of the training script was not set). It achieves the following results on the evaluation set:

  • Loss: 0.6591
  • Qwk: 0.5963
  • Mse: 0.6591
  • Rmse: 0.8118
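Here, Qwk is quadratic weighted kappa (agreement between predicted and reference scores, weighted by squared distance) and Rmse is the square root of the Mse column. As a minimal, dependency-free sketch (not the training script's own code), the two metrics can be computed as:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..num_classes-1."""
    n = len(y_true)
    # Observed confusion matrix and marginal histograms
    observed = [[0] * num_classes for _ in range(num_classes)]
    hist_true = [0] * num_classes
    hist_pred = [0] * num_classes
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
        hist_true[t] += 1
        hist_pred[p] += 1
    num = 0.0  # weighted observed disagreement
    den = 0.0  # weighted expected disagreement under independence
    for i in range(num_classes):
        for j in range(num_classes):
            w = ((i - j) ** 2) / ((num_classes - 1) ** 2)
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Perfect agreement yields a kappa of 1.0 and complete disagreement on a binary scale yields -1.0; `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")` computes the same quantity.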

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
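The hyperparameters above map onto Hugging Face `TrainingArguments` roughly as follows. This is a hypothetical reconstruction, not the original script: the output directory and the evaluation cadence are assumptions (the results table logs a validation row every 2 steps, which suggests step-based evaluation).

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the run's configuration.
# output_dir, eval_strategy, and eval_steps are assumptions.
training_args = TrainingArguments(
    output_dir="./results",            # assumed; not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",        # Adam betas/epsilon are the defaults listed above
    eval_strategy="steps",             # assumed from the every-2-steps eval rows
    eval_steps=2,
)
```

The Adam betas (0.9, 0.999) and epsilon 1e-08 listed above are the optimizer defaults, so they need no explicit arguments.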

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse

(Training loss appears as "No log" until the first logging step at step 500, where it is reported as 0.2599.)
No log 0.0556 2 4.3305 0.0138 4.3305 2.0810
No log 0.1111 4 2.4410 0.0301 2.4410 1.5624
No log 0.1667 6 1.4279 0.0 1.4279 1.1950
No log 0.2222 8 1.0606 0.375 1.0606 1.0298
No log 0.2778 10 1.0393 0.3519 1.0393 1.0195
No log 0.3333 12 1.1853 0.0201 1.1853 1.0887
No log 0.3889 14 1.3117 0.0 1.3117 1.1453
No log 0.4444 16 1.2800 0.0 1.2800 1.1314
No log 0.5 18 1.3159 0.0 1.3159 1.1471
No log 0.5556 20 1.3959 0.0 1.3959 1.1815
No log 0.6111 22 1.3638 0.0 1.3638 1.1678
No log 0.6667 24 1.2999 0.0 1.2999 1.1401
No log 0.7222 26 1.1893 0.1407 1.1893 1.0905
No log 0.7778 28 1.1566 0.1764 1.1566 1.0754
No log 0.8333 30 1.1526 0.1591 1.1526 1.0736
No log 0.8889 32 1.1581 0.2125 1.1581 1.0762
No log 0.9444 34 1.1946 0.1618 1.1946 1.0930
No log 1.0 36 1.1673 0.1444 1.1673 1.0804
No log 1.0556 38 1.1658 0.1028 1.1658 1.0797
No log 1.1111 40 1.3275 0.0116 1.3275 1.1522
No log 1.1667 42 1.4147 0.0 1.4147 1.1894
No log 1.2222 44 1.3718 0.1886 1.3718 1.1712
No log 1.2778 46 1.1501 0.1268 1.1501 1.0724
No log 1.3333 48 1.2534 0.2590 1.2534 1.1196
No log 1.3889 50 1.3482 0.2002 1.3482 1.1611
No log 1.4444 52 1.2281 0.1810 1.2281 1.1082
No log 1.5 54 0.9890 0.2015 0.9890 0.9945
No log 1.5556 56 1.0263 0.1908 1.0263 1.0131
No log 1.6111 58 0.9921 0.2179 0.9921 0.9960
No log 1.6667 60 1.0699 0.1971 1.0699 1.0344
No log 1.7222 62 1.1108 0.1482 1.1108 1.0539
No log 1.7778 64 1.1610 0.2367 1.1610 1.0775
No log 1.8333 66 1.0679 0.1901 1.0679 1.0334
No log 1.8889 68 1.0077 0.0912 1.0077 1.0039
No log 1.9444 70 0.9418 0.3314 0.9418 0.9705
No log 2.0 72 0.9217 0.3209 0.9217 0.9600
No log 2.0556 74 0.8975 0.2974 0.8975 0.9474
No log 2.1111 76 0.9137 0.3236 0.9137 0.9559
No log 2.1667 78 0.9046 0.2689 0.9046 0.9511
No log 2.2222 80 0.9009 0.4710 0.9009 0.9492
No log 2.2778 82 1.2362 0.3310 1.2362 1.1119
No log 2.3333 84 1.2833 0.3191 1.2833 1.1328
No log 2.3889 86 1.1297 0.4030 1.1297 1.0629
No log 2.4444 88 0.9440 0.3879 0.9440 0.9716
No log 2.5 90 0.9082 0.4597 0.9082 0.9530
No log 2.5556 92 0.8538 0.4847 0.8538 0.9240
No log 2.6111 94 0.8630 0.4478 0.8630 0.9290
No log 2.6667 96 0.9605 0.4812 0.9605 0.9801
No log 2.7222 98 0.9795 0.4444 0.9795 0.9897
No log 2.7778 100 0.9741 0.3976 0.9741 0.9869
No log 2.8333 102 0.8584 0.4730 0.8584 0.9265
No log 2.8889 104 0.8039 0.4180 0.8039 0.8966
No log 2.9444 106 0.8040 0.3215 0.8040 0.8967
No log 3.0 108 0.8198 0.3878 0.8198 0.9054
No log 3.0556 110 0.9783 0.4292 0.9783 0.9891
No log 3.1111 112 1.0618 0.2864 1.0618 1.0304
No log 3.1667 114 0.9251 0.4681 0.9251 0.9618
No log 3.2222 116 0.9006 0.4220 0.9006 0.9490
No log 3.2778 118 0.9300 0.4460 0.9300 0.9644
No log 3.3333 120 1.0024 0.4150 1.0024 1.0012
No log 3.3889 122 1.0828 0.3913 1.0828 1.0406
No log 3.4444 124 0.9995 0.2958 0.9995 0.9997
No log 3.5 126 1.0351 0.4018 1.0351 1.0174
No log 3.5556 128 1.0149 0.4356 1.0149 1.0074
No log 3.6111 130 0.9742 0.5134 0.9742 0.9870
No log 3.6667 132 0.9456 0.4310 0.9456 0.9724
No log 3.7222 134 0.8921 0.4209 0.8921 0.9445
No log 3.7778 136 0.8227 0.4878 0.8227 0.9070
No log 3.8333 138 0.8418 0.4862 0.8418 0.9175
No log 3.8889 140 0.8811 0.4812 0.8811 0.9387
No log 3.9444 142 0.8980 0.4214 0.8980 0.9476
No log 4.0 144 0.8930 0.3327 0.8930 0.9450
No log 4.0556 146 0.9073 0.3715 0.9073 0.9525
No log 4.1111 148 0.9100 0.2951 0.9100 0.9540
No log 4.1667 150 0.9587 0.4532 0.9587 0.9791
No log 4.2222 152 0.9466 0.4532 0.9466 0.9729
No log 4.2778 154 0.9154 0.4284 0.9154 0.9568
No log 4.3333 156 0.8959 0.4370 0.8959 0.9465
No log 4.3889 158 0.8560 0.4577 0.8560 0.9252
No log 4.4444 160 0.8535 0.4761 0.8535 0.9239
No log 4.5 162 0.8304 0.5491 0.8304 0.9113
No log 4.5556 164 0.8089 0.5481 0.8089 0.8994
No log 4.6111 166 0.7916 0.5578 0.7916 0.8897
No log 4.6667 168 0.8367 0.5854 0.8367 0.9147
No log 4.7222 170 0.8985 0.56 0.8985 0.9479
No log 4.7778 172 0.7912 0.5305 0.7912 0.8895
No log 4.8333 174 0.7544 0.5418 0.7544 0.8686
No log 4.8889 176 0.7573 0.4737 0.7573 0.8703
No log 4.9444 178 0.8077 0.5450 0.8077 0.8987
No log 5.0 180 1.0517 0.4430 1.0517 1.0255
No log 5.0556 182 1.1155 0.4036 1.1155 1.0562
No log 5.1111 184 0.9098 0.4202 0.9098 0.9538
No log 5.1667 186 0.7624 0.4646 0.7624 0.8732
No log 5.2222 188 0.7649 0.4646 0.7649 0.8746
No log 5.2778 190 0.7776 0.4754 0.7776 0.8818
No log 5.3333 192 0.8340 0.4476 0.8340 0.9133
No log 5.3889 194 0.9766 0.4458 0.9766 0.9882
No log 5.4444 196 1.0456 0.3942 1.0456 1.0226
No log 5.5 198 0.9209 0.4787 0.9209 0.9596
No log 5.5556 200 0.8602 0.4115 0.8602 0.9275
No log 5.6111 202 0.9314 0.5332 0.9314 0.9651
No log 5.6667 204 1.0431 0.4050 1.0431 1.0213
No log 5.7222 206 1.0216 0.4158 1.0216 1.0107
No log 5.7778 208 0.7954 0.5153 0.7954 0.8919
No log 5.8333 210 0.7075 0.5260 0.7075 0.8411
No log 5.8889 212 0.6909 0.5260 0.6909 0.8312
No log 5.9444 214 0.7052 0.5698 0.7052 0.8398
No log 6.0 216 0.7275 0.5663 0.7275 0.8529
No log 6.0556 218 0.6782 0.5932 0.6782 0.8235
No log 6.1111 220 0.6638 0.5820 0.6638 0.8147
No log 6.1667 222 0.6929 0.6043 0.6929 0.8324
No log 6.2222 224 0.6955 0.5712 0.6955 0.8340
No log 6.2778 226 0.6901 0.4932 0.6901 0.8307
No log 6.3333 228 0.7798 0.5019 0.7798 0.8831
No log 6.3889 230 0.8147 0.4214 0.8147 0.9026
No log 6.4444 232 0.7462 0.4461 0.7462 0.8638
No log 6.5 234 0.7357 0.4388 0.7357 0.8577
No log 6.5556 236 0.7249 0.4660 0.7249 0.8514
No log 6.6111 238 0.7196 0.4831 0.7196 0.8483
No log 6.6667 240 0.7559 0.4903 0.7559 0.8694
No log 6.7222 242 0.7224 0.5696 0.7224 0.8499
No log 6.7778 244 0.6695 0.5505 0.6695 0.8182
No log 6.8333 246 0.6662 0.5443 0.6662 0.8162
No log 6.8889 248 0.6454 0.5724 0.6454 0.8033
No log 6.9444 250 0.6673 0.5490 0.6673 0.8169
No log 7.0 252 0.6926 0.4478 0.6926 0.8322
No log 7.0556 254 0.7320 0.4562 0.7320 0.8556
No log 7.1111 256 0.6797 0.4063 0.6797 0.8244
No log 7.1667 258 0.6625 0.5149 0.6625 0.8139
No log 7.2222 260 0.6755 0.4337 0.6755 0.8219
No log 7.2778 262 0.7221 0.3922 0.7221 0.8497
No log 7.3333 264 0.8162 0.4562 0.8162 0.9035
No log 7.3889 266 0.8058 0.4180 0.8058 0.8976
No log 7.4444 268 0.7772 0.3682 0.7772 0.8816
No log 7.5 270 0.7773 0.3393 0.7773 0.8816
No log 7.5556 272 0.7770 0.3424 0.7770 0.8815
No log 7.6111 274 0.7705 0.4075 0.7705 0.8778
No log 7.6667 276 0.7845 0.4586 0.7845 0.8857
No log 7.7222 278 0.7674 0.4343 0.7674 0.8760
No log 7.7778 280 0.8043 0.4318 0.8043 0.8968
No log 7.8333 282 0.8432 0.4310 0.8432 0.9183
No log 7.8889 284 0.7971 0.4318 0.7971 0.8928
No log 7.9444 286 0.7644 0.4456 0.7644 0.8743
No log 8.0 288 0.8110 0.4318 0.8110 0.9006
No log 8.0556 290 0.8721 0.4666 0.8721 0.9339
No log 8.1111 292 0.7776 0.4935 0.7776 0.8818
No log 8.1667 294 0.6953 0.5522 0.6953 0.8339
No log 8.2222 296 0.7041 0.5569 0.7041 0.8391
No log 8.2778 298 0.7046 0.5419 0.7046 0.8394
No log 8.3333 300 0.7069 0.5575 0.7069 0.8408
No log 8.3889 302 0.7078 0.4975 0.7078 0.8413
No log 8.4444 304 0.7096 0.6057 0.7096 0.8424
No log 8.5 306 0.7313 0.5217 0.7313 0.8551
No log 8.5556 308 0.7830 0.4331 0.7830 0.8849
No log 8.6111 310 0.7655 0.4671 0.7655 0.8749
No log 8.6667 312 0.7160 0.5089 0.7160 0.8461
No log 8.7222 314 0.7180 0.5093 0.7180 0.8474
No log 8.7778 316 0.7056 0.5428 0.7056 0.8400
No log 8.8333 318 0.6965 0.5396 0.6965 0.8346
No log 8.8889 320 0.6724 0.6345 0.6724 0.8200
No log 8.9444 322 0.6754 0.5540 0.6754 0.8219
No log 9.0 324 0.6737 0.5429 0.6737 0.8208
No log 9.0556 326 0.6454 0.5724 0.6454 0.8034
No log 9.1111 328 0.6846 0.5561 0.6846 0.8274
No log 9.1667 330 0.7688 0.4450 0.7688 0.8768
No log 9.2222 332 0.7646 0.4686 0.7646 0.8744
No log 9.2778 334 0.6894 0.5188 0.6894 0.8303
No log 9.3333 336 0.7202 0.5383 0.7202 0.8487
No log 9.3889 338 0.7143 0.5686 0.7143 0.8451
No log 9.4444 340 0.6782 0.5395 0.6782 0.8235
No log 9.5 342 0.6511 0.5877 0.6511 0.8069
No log 9.5556 344 0.6601 0.5719 0.6601 0.8125
No log 9.6111 346 0.7428 0.5555 0.7428 0.8619
No log 9.6667 348 0.8160 0.5526 0.8160 0.9033
No log 9.7222 350 0.7765 0.5725 0.7765 0.8812
No log 9.7778 352 0.6689 0.6083 0.6689 0.8178
No log 9.8333 354 0.6520 0.5987 0.6520 0.8075
No log 9.8889 356 0.6490 0.6598 0.6490 0.8056
No log 9.9444 358 0.6559 0.6015 0.6559 0.8099
No log 10.0 360 0.6484 0.6589 0.6484 0.8052
No log 10.0556 362 0.6586 0.5711 0.6586 0.8115
No log 10.1111 364 0.6660 0.6048 0.6660 0.8161
No log 10.1667 366 0.7135 0.5645 0.7135 0.8447
No log 10.2222 368 0.7586 0.5036 0.7586 0.8710
No log 10.2778 370 0.7014 0.5565 0.7014 0.8375
No log 10.3333 372 0.6463 0.6499 0.6463 0.8040
No log 10.3889 374 0.6437 0.6578 0.6437 0.8023
No log 10.4444 376 0.6340 0.6610 0.6340 0.7963
No log 10.5 378 0.6496 0.6488 0.6496 0.8060
No log 10.5556 380 0.6573 0.6087 0.6573 0.8107
No log 10.6111 382 0.6563 0.5977 0.6563 0.8101
No log 10.6667 384 0.6376 0.6724 0.6376 0.7985
No log 10.7222 386 0.6487 0.6446 0.6487 0.8054
No log 10.7778 388 0.6410 0.6764 0.6410 0.8006
No log 10.8333 390 0.6325 0.6720 0.6325 0.7953
No log 10.8889 392 0.6523 0.5955 0.6523 0.8077
No log 10.9444 394 0.6680 0.5955 0.6680 0.8173
No log 11.0 396 0.6630 0.5961 0.6630 0.8142
No log 11.0556 398 0.6677 0.6102 0.6677 0.8172
No log 11.1111 400 0.6869 0.6003 0.6869 0.8288
No log 11.1667 402 0.6728 0.6065 0.6728 0.8202
No log 11.2222 404 0.6840 0.4750 0.6840 0.8270
No log 11.2778 406 0.7262 0.5245 0.7262 0.8522
No log 11.3333 408 0.6937 0.5232 0.6937 0.8329
No log 11.3889 410 0.6554 0.6380 0.6554 0.8096
No log 11.4444 412 0.6942 0.5429 0.6942 0.8332
No log 11.5 414 0.7538 0.5160 0.7538 0.8682
No log 11.5556 416 0.7588 0.5181 0.7588 0.8711
No log 11.6111 418 0.6724 0.5763 0.6724 0.8200
No log 11.6667 420 0.6381 0.6196 0.6381 0.7988
No log 11.7222 422 0.6646 0.5650 0.6646 0.8153
No log 11.7778 424 0.6808 0.5103 0.6808 0.8251
No log 11.8333 426 0.6621 0.5795 0.6621 0.8137
No log 11.8889 428 0.6509 0.6107 0.6509 0.8068
No log 11.9444 430 0.6924 0.5305 0.6924 0.8321
No log 12.0 432 0.7243 0.5342 0.7243 0.8511
No log 12.0556 434 0.6580 0.5895 0.6580 0.8112
No log 12.1111 436 0.6483 0.5737 0.6483 0.8052
No log 12.1667 438 0.7077 0.5348 0.7077 0.8412
No log 12.2222 440 0.6984 0.5348 0.6984 0.8357
No log 12.2778 442 0.6413 0.6225 0.6413 0.8008
No log 12.3333 444 0.6521 0.6451 0.6521 0.8075
No log 12.3889 446 0.6771 0.6151 0.6771 0.8229
No log 12.4444 448 0.6407 0.6012 0.6407 0.8004
No log 12.5 450 0.6468 0.5770 0.6468 0.8043
No log 12.5556 452 0.7290 0.5148 0.7290 0.8538
No log 12.6111 454 0.7356 0.5148 0.7356 0.8577
No log 12.6667 456 0.6984 0.5355 0.6984 0.8357
No log 12.7222 458 0.6875 0.5555 0.6875 0.8292
No log 12.7778 460 0.6869 0.5301 0.6869 0.8288
No log 12.8333 462 0.6662 0.5712 0.6662 0.8162
No log 12.8889 464 0.6552 0.6345 0.6552 0.8094
No log 12.9444 466 0.6527 0.6306 0.6527 0.8079
No log 13.0 468 0.6657 0.5686 0.6657 0.8159
No log 13.0556 470 0.7334 0.5236 0.7334 0.8564
No log 13.1111 472 0.6573 0.6009 0.6573 0.8107
No log 13.1667 474 0.6235 0.6451 0.6235 0.7896
No log 13.2222 476 0.6207 0.6451 0.6207 0.7878
No log 13.2778 478 0.6158 0.6065 0.6158 0.7847
No log 13.3333 480 0.6271 0.6087 0.6271 0.7919
No log 13.3889 482 0.6170 0.6087 0.6170 0.7855
No log 13.4444 484 0.6154 0.5988 0.6154 0.7845
No log 13.5 486 0.6012 0.6507 0.6012 0.7754
No log 13.5556 488 0.5889 0.6649 0.5889 0.7674
No log 13.6111 490 0.5926 0.6762 0.5926 0.7698
No log 13.6667 492 0.6099 0.6699 0.6099 0.7809
No log 13.7222 494 0.6423 0.5865 0.6423 0.8015
No log 13.7778 496 0.6325 0.6185 0.6325 0.7953
No log 13.8333 498 0.6421 0.6672 0.6421 0.8013
0.2599 13.8889 500 0.6779 0.6189 0.6779 0.8233
0.2599 13.9444 502 0.6640 0.6337 0.6640 0.8148
0.2599 14.0 504 0.6449 0.6364 0.6449 0.8031
0.2599 14.0556 506 0.6560 0.6185 0.6560 0.8099
0.2599 14.1111 508 0.6529 0.6164 0.6529 0.8080
0.2599 14.1667 510 0.6448 0.6610 0.6448 0.8030
0.2599 14.2222 512 0.6806 0.5850 0.6806 0.8250
0.2599 14.2778 514 0.7545 0.5958 0.7545 0.8686
0.2599 14.3333 516 0.7444 0.5660 0.7444 0.8628
0.2599 14.3889 518 0.6963 0.5850 0.6963 0.8345
0.2599 14.4444 520 0.6591 0.5963 0.6591 0.8118

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k11_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.