ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k5_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6914
  • Qwk: 0.6307
  • Mse: 0.6914
  • Rmse: 0.8315
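Qwk above is quadratic weighted kappa, the standard agreement metric for ordinal essay scores; Mse and Rmse are mean squared error and its square root. A minimal pure-Python sketch of how these metrics are typically computed (not the exact evaluation code used for this run):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    n = len(y_true)
    # Observed rating matrix and marginal histograms.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Note that Rmse is simply the square root of Mse, which is why Loss and Mse coincide in the results above (the model is trained with an MSE regression objective).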

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
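With the linear scheduler above, the learning rate decays from 2e-05 toward zero over the planned training steps. A minimal sketch of that schedule (the warmup parameter and the total step count are assumptions; the log below implies roughly 29 optimizer steps per epoch, i.e. about 2900 steps for 100 epochs):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step for a linear schedule:
    optional linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, with 2900 total steps the rate starts at 2e-05, halves by step 1450, and reaches zero at the final step.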

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0690 2 4.2037 0.0420 4.2037 2.0503
No log 0.1379 4 2.6316 -0.0325 2.6316 1.6222
No log 0.2069 6 1.5453 0.0372 1.5453 1.2431
No log 0.2759 8 1.2343 0.1691 1.2343 1.1110
No log 0.3448 10 1.1704 0.1307 1.1704 1.0819
No log 0.4138 12 1.1858 0.1671 1.1858 1.0889
No log 0.4828 14 1.4007 0.0905 1.4007 1.1835
No log 0.5517 16 1.8151 0.0905 1.8151 1.3472
No log 0.6207 18 1.4385 0.1084 1.4385 1.1994
No log 0.6897 20 1.1037 0.2402 1.1037 1.0506
No log 0.7586 22 1.0807 0.3086 1.0807 1.0396
No log 0.8276 24 1.2811 0.1875 1.2811 1.1319
No log 0.8966 26 1.1296 0.2432 1.1296 1.0628
No log 0.9655 28 1.0095 0.3970 1.0095 1.0047
No log 1.0345 30 1.0050 0.3583 1.0050 1.0025
No log 1.1034 32 0.9359 0.4498 0.9359 0.9674
No log 1.1724 34 0.8886 0.4695 0.8886 0.9427
No log 1.2414 36 0.8723 0.5299 0.8723 0.9340
No log 1.3103 38 0.8455 0.5299 0.8455 0.9195
No log 1.3793 40 0.8288 0.5214 0.8288 0.9104
No log 1.4483 42 0.8594 0.5146 0.8594 0.9270
No log 1.5172 44 0.8505 0.5515 0.8505 0.9222
No log 1.5862 46 0.9485 0.5332 0.9485 0.9739
No log 1.6552 48 0.9537 0.5078 0.9537 0.9766
No log 1.7241 50 0.9288 0.5392 0.9288 0.9637
No log 1.7931 52 1.0842 0.4977 1.0842 1.0412
No log 1.8621 54 1.2285 0.4737 1.2285 1.1084
No log 1.9310 56 1.0970 0.4824 1.0970 1.0474
No log 2.0 58 1.0357 0.5676 1.0357 1.0177
No log 2.0690 60 1.0101 0.5730 1.0101 1.0051
No log 2.1379 62 0.9783 0.5588 0.9783 0.9891
No log 2.2069 64 0.9403 0.5743 0.9403 0.9697
No log 2.2759 66 0.9229 0.5938 0.9229 0.9607
No log 2.3448 68 0.9203 0.5646 0.9203 0.9593
No log 2.4138 70 0.9398 0.4733 0.9398 0.9694
No log 2.4828 72 0.9019 0.5458 0.9019 0.9497
No log 2.5517 74 0.9301 0.4673 0.9301 0.9644
No log 2.6207 76 0.9209 0.5161 0.9209 0.9596
No log 2.6897 78 0.8794 0.4767 0.8794 0.9377
No log 2.7586 80 0.9041 0.4914 0.9041 0.9508
No log 2.8276 82 0.9114 0.4737 0.9114 0.9547
No log 2.8966 84 0.9302 0.4606 0.9302 0.9645
No log 2.9655 86 0.9001 0.4601 0.9001 0.9487
No log 3.0345 88 0.9412 0.4648 0.9412 0.9702
No log 3.1034 90 0.8636 0.4508 0.8636 0.9293
No log 3.1724 92 0.9203 0.5240 0.9203 0.9593
No log 3.2414 94 0.8510 0.5810 0.8510 0.9225
No log 3.3103 96 0.8248 0.4738 0.8248 0.9082
No log 3.3793 98 1.0355 0.4088 1.0355 1.0176
No log 3.4483 100 1.0409 0.3687 1.0409 1.0202
No log 3.5172 102 0.8792 0.5607 0.8792 0.9377
No log 3.5862 104 0.9619 0.4472 0.9619 0.9808
No log 3.6552 106 1.0609 0.4719 1.0609 1.0300
No log 3.7241 108 0.9787 0.5461 0.9787 0.9893
No log 3.7931 110 0.9512 0.5766 0.9512 0.9753
No log 3.8621 112 0.9738 0.5390 0.9738 0.9868
No log 3.9310 114 0.9376 0.5229 0.9376 0.9683
No log 4.0 116 1.0609 0.5040 1.0609 1.0300
No log 4.0690 118 1.1594 0.3811 1.1594 1.0768
No log 4.1379 120 1.0171 0.4960 1.0171 1.0085
No log 4.2069 122 0.8378 0.5160 0.8378 0.9153
No log 4.2759 124 0.8630 0.5306 0.8630 0.9290
No log 4.3448 126 0.7795 0.5632 0.7795 0.8829
No log 4.4138 128 0.8781 0.5818 0.8781 0.9371
No log 4.4828 130 0.9432 0.5844 0.9432 0.9712
No log 4.5517 132 0.8942 0.5722 0.8942 0.9456
No log 4.6207 134 0.7879 0.6162 0.7879 0.8876
No log 4.6897 136 0.7286 0.5916 0.7286 0.8536
No log 4.7586 138 0.7360 0.5729 0.7360 0.8579
No log 4.8276 140 0.8203 0.5517 0.8203 0.9057
No log 4.8966 142 0.9920 0.4990 0.9920 0.9960
No log 4.9655 144 0.8900 0.5769 0.8900 0.9434
No log 5.0345 146 0.7627 0.5885 0.7627 0.8733
No log 5.1034 148 0.7723 0.6054 0.7723 0.8788
No log 5.1724 150 0.8021 0.6163 0.8021 0.8956
No log 5.2414 152 0.8118 0.5648 0.8118 0.9010
No log 5.3103 154 0.8749 0.5268 0.8749 0.9354
No log 5.3793 156 0.8428 0.5029 0.8428 0.9180
No log 5.4483 158 0.7554 0.5866 0.7554 0.8692
No log 5.5172 160 0.7517 0.5342 0.7517 0.8670
No log 5.5862 162 0.7560 0.5361 0.7560 0.8695
No log 5.6552 164 0.7480 0.6006 0.7480 0.8648
No log 5.7241 166 0.7718 0.6019 0.7718 0.8785
No log 5.7931 168 0.8328 0.5430 0.8328 0.9126
No log 5.8621 170 0.7713 0.6010 0.7713 0.8782
No log 5.9310 172 0.7594 0.6244 0.7594 0.8715
No log 6.0 174 0.8138 0.5975 0.8138 0.9021
No log 6.0690 176 0.9491 0.5672 0.9491 0.9742
No log 6.1379 178 0.8795 0.5448 0.8795 0.9378
No log 6.2069 180 0.7208 0.6528 0.7208 0.8490
No log 6.2759 182 0.7723 0.4591 0.7723 0.8788
No log 6.3448 184 0.8473 0.5398 0.8473 0.9205
No log 6.4138 186 0.8634 0.5324 0.8634 0.9292
No log 6.4828 188 0.7160 0.5407 0.7160 0.8462
No log 6.5517 190 0.7561 0.6196 0.7561 0.8696
No log 6.6207 192 0.7868 0.6068 0.7868 0.8870
No log 6.6897 194 0.7707 0.6404 0.7707 0.8779
No log 6.7586 196 0.7487 0.6023 0.7487 0.8653
No log 6.8276 198 0.7441 0.6205 0.7441 0.8626
No log 6.8966 200 0.7801 0.5823 0.7801 0.8833
No log 6.9655 202 0.9013 0.5786 0.9013 0.9494
No log 7.0345 204 0.8661 0.6213 0.8661 0.9307
No log 7.1034 206 0.8050 0.5564 0.8050 0.8972
No log 7.1724 208 0.8481 0.5242 0.8481 0.9209
No log 7.2414 210 0.8235 0.5087 0.8235 0.9075
No log 7.3103 212 0.9009 0.5835 0.9009 0.9492
No log 7.3793 214 1.0845 0.5531 1.0845 1.0414
No log 7.4483 216 1.0588 0.5474 1.0588 1.0290
No log 7.5172 218 0.9313 0.5493 0.9313 0.9651
No log 7.5862 220 0.8532 0.4637 0.8532 0.9237
No log 7.6552 222 0.8458 0.4768 0.8458 0.9197
No log 7.7241 224 0.8415 0.4768 0.8415 0.9174
No log 7.7931 226 0.8437 0.5165 0.8437 0.9185
No log 7.8621 228 0.9391 0.5743 0.9391 0.9691
No log 7.9310 230 0.9716 0.5671 0.9716 0.9857
No log 8.0 232 0.8300 0.5733 0.8300 0.9110
No log 8.0690 234 0.7963 0.5125 0.7963 0.8923
No log 8.1379 236 0.9060 0.4063 0.9060 0.9519
No log 8.2069 238 0.8594 0.3584 0.8594 0.9270
No log 8.2759 240 0.7524 0.6049 0.7524 0.8674
No log 8.3448 242 0.8004 0.5733 0.8004 0.8947
No log 8.4138 244 0.7776 0.6052 0.7776 0.8818
No log 8.4828 246 0.7837 0.6085 0.7837 0.8852
No log 8.5517 248 0.7828 0.5785 0.7828 0.8848
No log 8.6207 250 0.7703 0.5992 0.7703 0.8777
No log 8.6897 252 0.8006 0.5926 0.8006 0.8947
No log 8.7586 254 0.7893 0.5855 0.7893 0.8884
No log 8.8276 256 0.7665 0.5751 0.7665 0.8755
No log 8.8966 258 0.7506 0.5890 0.7506 0.8664
No log 8.9655 260 0.7696 0.5754 0.7696 0.8773
No log 9.0345 262 0.8072 0.5839 0.8072 0.8985
No log 9.1034 264 0.8390 0.5823 0.8390 0.9160
No log 9.1724 266 0.7850 0.5543 0.7850 0.8860
No log 9.2414 268 0.7574 0.5606 0.7574 0.8703
No log 9.3103 270 0.7831 0.4393 0.7831 0.8849
No log 9.3793 272 0.7631 0.5291 0.7631 0.8735
No log 9.4483 274 0.7593 0.5729 0.7593 0.8714
No log 9.5172 276 0.7580 0.5580 0.7580 0.8706
No log 9.5862 278 0.7737 0.5791 0.7737 0.8796
No log 9.6552 280 0.7977 0.5797 0.7977 0.8931
No log 9.7241 282 0.8756 0.5814 0.8756 0.9357
No log 9.7931 284 0.8856 0.5814 0.8856 0.9411
No log 9.8621 286 0.8628 0.5791 0.8628 0.9289
No log 9.9310 288 0.8360 0.5692 0.8360 0.9143
No log 10.0 290 0.8386 0.4820 0.8386 0.9157
No log 10.0690 292 0.8227 0.4820 0.8227 0.9070
No log 10.1379 294 0.7981 0.4734 0.7981 0.8934
No log 10.2069 296 0.7924 0.5455 0.7924 0.8901
No log 10.2759 298 0.7946 0.6056 0.7946 0.8914
No log 10.3448 300 0.8038 0.6163 0.8038 0.8966
No log 10.4138 302 0.7965 0.5827 0.7965 0.8924
No log 10.4828 304 0.7906 0.5827 0.7906 0.8891
No log 10.5517 306 0.7679 0.5823 0.7679 0.8763
No log 10.6207 308 0.7386 0.5657 0.7386 0.8594
No log 10.6897 310 0.7252 0.5840 0.7252 0.8516
No log 10.7586 312 0.7160 0.5726 0.7160 0.8462
No log 10.8276 314 0.7118 0.5726 0.7118 0.8437
No log 10.8966 316 0.7143 0.5242 0.7143 0.8451
No log 10.9655 318 0.7205 0.5233 0.7205 0.8488
No log 11.0345 320 0.7237 0.5726 0.7237 0.8507
No log 11.1034 322 0.7427 0.5342 0.7427 0.8618
No log 11.1724 324 0.7940 0.5427 0.7940 0.8910
No log 11.2414 326 0.7360 0.4929 0.7360 0.8579
No log 11.3103 328 0.7220 0.6061 0.7220 0.8497
No log 11.3793 330 0.7193 0.5672 0.7193 0.8481
No log 11.4483 332 0.7162 0.5672 0.7162 0.8463
No log 11.5172 334 0.7270 0.5622 0.7270 0.8527
No log 11.5862 336 0.7531 0.6076 0.7531 0.8678
No log 11.6552 338 0.7673 0.6044 0.7673 0.8759
No log 11.7241 340 0.7271 0.6095 0.7271 0.8527
No log 11.7931 342 0.7317 0.5708 0.7317 0.8554
No log 11.8621 344 0.7469 0.6121 0.7469 0.8643
No log 11.9310 346 0.7323 0.6240 0.7323 0.8557
No log 12.0 348 0.7165 0.6240 0.7165 0.8465
No log 12.0690 350 0.7144 0.6240 0.7144 0.8452
No log 12.1379 352 0.7169 0.6240 0.7169 0.8467
No log 12.2069 354 0.7073 0.6189 0.7073 0.8410
No log 12.2759 356 0.7013 0.6429 0.7013 0.8375
No log 12.3448 358 0.7130 0.6422 0.7130 0.8444
No log 12.4138 360 0.7903 0.5793 0.7903 0.8890
No log 12.4828 362 0.8350 0.5988 0.8350 0.9138
No log 12.5517 364 0.7709 0.5818 0.7709 0.8780
No log 12.6207 366 0.7070 0.6215 0.7070 0.8408
No log 12.6897 368 0.6925 0.6107 0.6925 0.8322
No log 12.7586 370 0.7028 0.5923 0.7028 0.8383
No log 12.8276 372 0.7158 0.6084 0.7158 0.8460
No log 12.8966 374 0.7552 0.6098 0.7552 0.8690
No log 12.9655 376 0.7789 0.6162 0.7789 0.8826
No log 13.0345 378 0.7588 0.5783 0.7588 0.8711
No log 13.1034 380 0.7203 0.5709 0.7203 0.8487
No log 13.1724 382 0.7479 0.5870 0.7479 0.8648
No log 13.2414 384 0.7670 0.5618 0.7670 0.8758
No log 13.3103 386 0.7232 0.5932 0.7232 0.8504
No log 13.3793 388 0.7305 0.4960 0.7305 0.8547
No log 13.4483 390 0.7291 0.5163 0.7291 0.8539
No log 13.5172 392 0.7673 0.5841 0.7673 0.8759
No log 13.5862 394 0.8657 0.5306 0.8657 0.9304
No log 13.6552 396 0.9403 0.5268 0.9403 0.9697
No log 13.7241 398 0.8909 0.5268 0.8909 0.9439
No log 13.7931 400 0.7706 0.5928 0.7706 0.8778
No log 13.8621 402 0.7457 0.5107 0.7457 0.8635
No log 13.9310 404 0.7485 0.4932 0.7485 0.8652
No log 14.0 406 0.7306 0.6227 0.7306 0.8547
No log 14.0690 408 0.7915 0.6049 0.7915 0.8897
No log 14.1379 410 0.9548 0.6131 0.9548 0.9771
No log 14.2069 412 1.0315 0.5771 1.0315 1.0156
No log 14.2759 414 0.9298 0.6224 0.9298 0.9643
No log 14.3448 416 0.7854 0.5707 0.7854 0.8862
No log 14.4138 418 0.7372 0.6400 0.7372 0.8586
No log 14.4828 420 0.7595 0.6014 0.7595 0.8715
No log 14.5517 422 0.7359 0.6187 0.7359 0.8578
No log 14.6207 424 0.7204 0.6274 0.7204 0.8487
No log 14.6897 426 0.7425 0.6280 0.7425 0.8617
No log 14.7586 428 0.8153 0.5797 0.8153 0.9030
No log 14.8276 430 0.9304 0.5216 0.9304 0.9646
No log 14.8966 432 0.9627 0.5365 0.9627 0.9812
No log 14.9655 434 0.8785 0.5408 0.8785 0.9373
No log 15.0345 436 0.7827 0.5926 0.7827 0.8847
No log 15.1034 438 0.7419 0.6142 0.7419 0.8614
No log 15.1724 440 0.7371 0.6289 0.7371 0.8585
No log 15.2414 442 0.7326 0.6108 0.7326 0.8559
No log 15.3103 444 0.7385 0.6098 0.7385 0.8594
No log 15.3793 446 0.7419 0.6120 0.7419 0.8613
No log 15.4483 448 0.7387 0.5963 0.7387 0.8595
No log 15.5172 450 0.7296 0.5923 0.7296 0.8542
No log 15.5862 452 0.7271 0.5958 0.7271 0.8527
No log 15.6552 454 0.7316 0.5657 0.7316 0.8553
No log 15.7241 456 0.7313 0.5811 0.7313 0.8552
No log 15.7931 458 0.7452 0.5684 0.7452 0.8632
No log 15.8621 460 0.7957 0.5462 0.7957 0.8920
No log 15.9310 462 0.8068 0.4613 0.8068 0.8982
No log 16.0 464 0.7580 0.5684 0.7580 0.8706
No log 16.0690 466 0.7394 0.6192 0.7394 0.8599
No log 16.1379 468 0.7527 0.5819 0.7527 0.8676
No log 16.2069 470 0.7662 0.5745 0.7662 0.8754
No log 16.2759 472 0.7958 0.5876 0.7958 0.8921
No log 16.3448 474 0.8275 0.5887 0.8275 0.9097
No log 16.4138 476 0.8557 0.5847 0.8557 0.9251
No log 16.4828 478 0.8145 0.5899 0.8145 0.9025
No log 16.5517 480 0.7611 0.6371 0.7611 0.8724
No log 16.6207 482 0.7359 0.6086 0.7359 0.8578
No log 16.6897 484 0.7295 0.6141 0.7295 0.8541
No log 16.7586 486 0.7300 0.6108 0.7300 0.8544
No log 16.8276 488 0.7410 0.6197 0.7410 0.8608
No log 16.8966 490 0.7593 0.5956 0.7593 0.8714
No log 16.9655 492 0.7709 0.5988 0.7709 0.8780
No log 17.0345 494 0.7878 0.5658 0.7878 0.8876
No log 17.1034 496 0.7551 0.6182 0.7551 0.8690
No log 17.1724 498 0.7123 0.6447 0.7123 0.8440
0.2693 17.2414 500 0.7046 0.6380 0.7046 0.8394
0.2693 17.3103 502 0.7030 0.6021 0.7030 0.8384
0.2693 17.3793 504 0.7037 0.6167 0.7037 0.8389
0.2693 17.4483 506 0.7016 0.5916 0.7016 0.8376
0.2693 17.5172 508 0.7021 0.6002 0.7021 0.8379
0.2693 17.5862 510 0.6914 0.6307 0.6914 0.8315
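The evaluation results reported at the top of this card correspond to the last logged checkpoint (epoch 17.5862, step 510), which also has the lowest validation loss in the table. A minimal sketch of reading the log that way (rows are sampled from the table above):

```python
import math

# (epoch, step, validation_loss, qwk) rows sampled from the log above.
rows = [
    (0.0690, 2, 4.2037, 0.0420),
    (6.2069, 180, 0.7208, 0.6528),
    (12.6897, 368, 0.6925, 0.6107),
    (17.5862, 510, 0.6914, 0.6307),
]

# Pick the checkpoint with the lowest validation loss.
best = min(rows, key=lambda r: r[2])

# Rmse in the log is the square root of Mse (validation loss == Mse here,
# consistent with an MSE regression objective).
best_rmse = math.sqrt(best[2])
```

Here `best` is the step-510 row, and its derived RMSE (≈ 0.8315) matches the Rmse reported for the evaluation set.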

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1