ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7667
  • Qwk: 0.5838
  • Mse: 0.7667
  • Rmse: 0.8756
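Qwk is quadratic weighted kappa, the usual agreement metric for ordinal essay scores, and Rmse is simply the square root of Mse (0.8756 ≈ √0.7667). A minimal plain-Python sketch of both metric definitions (not the evaluation code actually used for this card):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: agreement between integer ratings,
    penalizing disagreements by squared distance between classes."""
    n = len(y_true)
    # Observed confusion matrix
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Quadratic weights: 0 on the diagonal, growing with (i - j)^2
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = sum(w[i][j] * obs[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * hist_t[i] * hist_p[j] / n
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error over paired ratings."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# RMSE is the square root of MSE, matching the card's 0.7667 -> 0.8756
print(round(math.sqrt(0.7667), 4))  # 0.8756
```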

Model description

More information needed

Intended uses & limitations

More information needed
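No usage details are published for this checkpoint. The sketch below shows one plausible way to load it for scoring; the model id comes from this card's title, the assumption that the head is a discrete classification head is unverified, and `logits_to_score` is a hypothetical helper, not part of the released model.

```python
def logits_to_score(batch_logits):
    """Map each row of logits to the index of its largest entry
    (assumes a classification head over discrete organization scores)."""
    return [max(range(len(row)), key=row.__getitem__) for row in batch_logits]

def load_and_score(texts):
    """Hedged sketch only: load the checkpoint and score Arabic essays.
    Requires transformers + torch; untested against this exact card."""
    from transformers import AutoTokenizer, AutoModelForSequenceClassification
    import torch

    model_id = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
                "FineTuningAraBERT_run2_AugV5_k10_task2_organization")
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    with torch.no_grad():
        enc = tok(texts, padding=True, truncation=True, return_tensors="pt")
        logits = model(**enc).logits
    return logits_to_score(logits.tolist())
```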

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
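These settings map onto a transformers Trainer configuration roughly as follows (a sketch only; the actual training script for this run is not published, and `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters above as TrainingArguments;
# output_dir is a placeholder, not the directory used for this run.
training_args = TrainingArguments(
    output_dir="arabert-task2-organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam betas/epsilon match the values listed in the card
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```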

Training results

Validation metrics were logged every two steps; the training loss was logged only every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0351 2 4.7350 0.0010 4.7350 2.1760
No log 0.0702 4 3.1306 -0.0274 3.1306 1.7694
No log 0.1053 6 1.8221 0.0629 1.8221 1.3499
No log 0.1404 8 1.2986 0.0818 1.2986 1.1396
No log 0.1754 10 1.1532 0.2513 1.1532 1.0739
No log 0.2105 12 1.2149 0.1649 1.2149 1.1022
No log 0.2456 14 1.2740 -0.0011 1.2740 1.1287
No log 0.2807 16 1.2609 -0.0199 1.2609 1.1229
No log 0.3158 18 1.2005 0.0420 1.2005 1.0957
No log 0.3509 20 1.2040 0.0883 1.2040 1.0973
No log 0.3860 22 1.2335 0.0697 1.2335 1.1106
No log 0.4211 24 1.2558 0.1469 1.2558 1.1206
No log 0.4561 26 1.6417 0.1524 1.6417 1.2813
No log 0.4912 28 1.5971 0.2540 1.5971 1.2638
No log 0.5263 30 1.0582 0.3607 1.0582 1.0287
No log 0.5614 32 1.2479 0.4242 1.2479 1.1171
No log 0.5965 34 1.1921 0.4342 1.1921 1.0918
No log 0.6316 36 0.9991 0.4311 0.9991 0.9995
No log 0.6667 38 0.9596 0.5053 0.9596 0.9796
No log 0.7018 40 0.8828 0.5039 0.8828 0.9396
No log 0.7368 42 0.8327 0.5335 0.8327 0.9125
No log 0.7719 44 0.8816 0.4401 0.8816 0.9389
No log 0.8070 46 0.9340 0.4235 0.9340 0.9665
No log 0.8421 48 0.7952 0.5335 0.7952 0.8917
No log 0.8772 50 0.8393 0.4823 0.8393 0.9161
No log 0.9123 52 0.8676 0.4752 0.8676 0.9314
No log 0.9474 54 0.8463 0.5053 0.8463 0.9200
No log 0.9825 56 0.8038 0.6328 0.8038 0.8966
No log 1.0175 58 0.7960 0.6580 0.7960 0.8922
No log 1.0526 60 0.8349 0.5593 0.8349 0.9137
No log 1.0877 62 0.8624 0.5870 0.8624 0.9287
No log 1.1228 64 0.8263 0.6618 0.8263 0.9090
No log 1.1579 66 0.8108 0.6411 0.8108 0.9004
No log 1.1930 68 0.8176 0.6411 0.8176 0.9042
No log 1.2281 70 0.7887 0.5811 0.7887 0.8881
No log 1.2632 72 0.7799 0.5811 0.7799 0.8831
No log 1.2982 74 0.7936 0.6355 0.7936 0.8908
No log 1.3333 76 0.8720 0.5340 0.8720 0.9338
No log 1.3684 78 0.8005 0.5819 0.8005 0.8947
No log 1.4035 80 0.7803 0.4701 0.7803 0.8834
No log 1.4386 82 0.7960 0.4839 0.7960 0.8922
No log 1.4737 84 0.8941 0.4273 0.8941 0.9456
No log 1.5088 86 0.8315 0.4297 0.8315 0.9119
No log 1.5439 88 0.8646 0.5396 0.8646 0.9298
No log 1.5789 90 0.9655 0.4986 0.9655 0.9826
No log 1.6140 92 0.9448 0.5411 0.9448 0.9720
No log 1.6491 94 0.9421 0.4935 0.9421 0.9706
No log 1.6842 96 0.9302 0.4694 0.9302 0.9645
No log 1.7193 98 0.9861 0.5559 0.9861 0.9930
No log 1.7544 100 0.9623 0.5490 0.9623 0.9810
No log 1.7895 102 0.9152 0.5854 0.9152 0.9567
No log 1.8246 104 1.0677 0.5016 1.0677 1.0333
No log 1.8596 106 1.2000 0.5108 1.2000 1.0955
No log 1.8947 108 1.1795 0.5202 1.1795 1.0861
No log 1.9298 110 0.9006 0.5162 0.9006 0.9490
No log 1.9649 112 0.9556 0.6067 0.9556 0.9775
No log 2.0 114 1.1490 0.5424 1.1490 1.0719
No log 2.0351 116 1.1890 0.4504 1.1890 1.0904
No log 2.0702 118 1.0700 0.4513 1.0700 1.0344
No log 2.1053 120 0.8445 0.5769 0.8445 0.9190
No log 2.1404 122 0.7516 0.4866 0.7516 0.8669
No log 2.1754 124 0.7720 0.5180 0.7720 0.8786
No log 2.2105 126 0.7356 0.5861 0.7356 0.8577
No log 2.2456 128 0.8067 0.5757 0.8067 0.8982
No log 2.2807 130 0.9884 0.5495 0.9884 0.9942
No log 2.3158 132 1.0161 0.5439 1.0161 1.0080
No log 2.3509 134 0.8809 0.5483 0.8809 0.9386
No log 2.3860 136 0.7737 0.5862 0.7737 0.8796
No log 2.4211 138 0.7712 0.5693 0.7712 0.8782
No log 2.4561 140 0.7757 0.5807 0.7757 0.8808
No log 2.4912 142 0.8432 0.5470 0.8432 0.9182
No log 2.5263 144 0.9171 0.5799 0.9171 0.9576
No log 2.5614 146 0.8664 0.6118 0.8664 0.9308
No log 2.5965 148 0.8458 0.6662 0.8458 0.9197
No log 2.6316 150 0.8891 0.6077 0.8891 0.9429
No log 2.6667 152 0.8431 0.6555 0.8431 0.9182
No log 2.7018 154 0.8996 0.5315 0.8996 0.9484
No log 2.7368 156 1.0478 0.5126 1.0478 1.0236
No log 2.7719 158 1.0936 0.5480 1.0936 1.0458
No log 2.8070 160 0.9585 0.5311 0.9585 0.9790
No log 2.8421 162 0.8390 0.6032 0.8390 0.9160
No log 2.8772 164 0.8279 0.4932 0.8279 0.9099
No log 2.9123 166 0.8219 0.4742 0.8219 0.9066
No log 2.9474 168 0.8025 0.5794 0.8025 0.8958
No log 2.9825 170 0.9218 0.5635 0.9218 0.9601
No log 3.0175 172 1.0915 0.5714 1.0915 1.0447
No log 3.0526 174 1.2176 0.4530 1.2176 1.1035
No log 3.0877 176 1.1729 0.4931 1.1729 1.0830
No log 3.1228 178 0.9399 0.5907 0.9399 0.9695
No log 3.1579 180 0.8286 0.5407 0.8286 0.9103
No log 3.1930 182 1.0030 0.3959 1.0030 1.0015
No log 3.2281 184 0.9837 0.4260 0.9837 0.9918
No log 3.2632 186 0.8139 0.4402 0.8139 0.9022
No log 3.2982 188 0.7407 0.6015 0.7407 0.8606
No log 3.3333 190 0.8822 0.5469 0.8822 0.9392
No log 3.3684 192 0.9694 0.5140 0.9694 0.9846
No log 3.4035 194 0.9151 0.5250 0.9151 0.9566
No log 3.4386 196 0.7633 0.4575 0.7633 0.8736
No log 3.4737 198 0.8005 0.4752 0.8005 0.8947
No log 3.5088 200 0.8901 0.4973 0.8901 0.9435
No log 3.5439 202 0.8330 0.4842 0.8330 0.9127
No log 3.5789 204 0.7486 0.5785 0.7486 0.8652
No log 3.6140 206 0.7786 0.6100 0.7786 0.8824
No log 3.6491 208 0.8504 0.5947 0.8504 0.9222
No log 3.6842 210 0.8395 0.5961 0.8395 0.9162
No log 3.7193 212 0.8457 0.5778 0.8457 0.9196
No log 3.7544 214 0.8818 0.5700 0.8818 0.9391
No log 3.7895 216 0.9587 0.5736 0.9587 0.9791
No log 3.8246 218 1.0483 0.5476 1.0483 1.0239
No log 3.8596 220 1.1014 0.4866 1.1014 1.0495
No log 3.8947 222 0.9794 0.5614 0.9794 0.9897
No log 3.9298 224 0.8391 0.5720 0.8391 0.9160
No log 3.9649 226 0.9145 0.4518 0.9145 0.9563
No log 4.0 228 0.9598 0.4962 0.9598 0.9797
No log 4.0351 230 0.8731 0.4792 0.8731 0.9344
No log 4.0702 232 0.8186 0.5802 0.8186 0.9047
No log 4.1053 234 0.9408 0.5241 0.9408 0.9700
No log 4.1404 236 0.9810 0.4760 0.9810 0.9904
No log 4.1754 238 0.9177 0.5658 0.9177 0.9580
No log 4.2105 240 0.8241 0.5646 0.8241 0.9078
No log 4.2456 242 0.7899 0.5667 0.7899 0.8887
No log 4.2807 244 0.7753 0.5679 0.7753 0.8805
No log 4.3158 246 0.7793 0.5391 0.7793 0.8828
No log 4.3509 248 0.7948 0.5279 0.7948 0.8915
No log 4.3860 250 0.8782 0.5350 0.8782 0.9371
No log 4.4211 252 0.9220 0.5670 0.9220 0.9602
No log 4.4561 254 0.8908 0.6 0.8908 0.9438
No log 4.4912 256 0.8280 0.5788 0.8280 0.9100
No log 4.5263 258 0.8353 0.5899 0.8353 0.9140
No log 4.5614 260 0.8013 0.5957 0.8013 0.8952
No log 4.5965 262 0.7521 0.5957 0.7521 0.8672
No log 4.6316 264 0.7202 0.6171 0.7202 0.8486
No log 4.6667 266 0.7071 0.6461 0.7071 0.8409
No log 4.7018 268 0.7439 0.6089 0.7439 0.8625
No log 4.7368 270 0.8456 0.5899 0.8456 0.9196
No log 4.7719 272 0.9226 0.5468 0.9226 0.9605
No log 4.8070 274 0.9215 0.5135 0.9215 0.9600
No log 4.8421 276 0.8673 0.56 0.8673 0.9313
No log 4.8772 278 0.8182 0.5560 0.8182 0.9045
No log 4.9123 280 0.7485 0.5756 0.7485 0.8652
No log 4.9474 282 0.7545 0.5928 0.7545 0.8686
No log 4.9825 284 0.7410 0.5716 0.7410 0.8608
No log 5.0175 286 0.7346 0.5738 0.7346 0.8571
No log 5.0526 288 0.7470 0.6167 0.7470 0.8643
No log 5.0877 290 0.7925 0.5759 0.7925 0.8902
No log 5.1228 292 0.9905 0.5471 0.9905 0.9952
No log 5.1579 294 1.0896 0.5192 1.0896 1.0438
No log 5.1930 296 0.9521 0.5286 0.9521 0.9758
No log 5.2281 298 0.8127 0.5702 0.8127 0.9015
No log 5.2632 300 0.7664 0.5569 0.7664 0.8754
No log 5.2982 302 0.7531 0.5569 0.7531 0.8678
No log 5.3333 304 0.7453 0.5690 0.7453 0.8633
No log 5.3684 306 0.7969 0.5898 0.7969 0.8927
No log 5.4035 308 0.8517 0.5763 0.8517 0.9229
No log 5.4386 310 0.9191 0.5536 0.9191 0.9587
No log 5.4737 312 0.8667 0.5788 0.8667 0.9310
No log 5.5088 314 0.7994 0.5254 0.7994 0.8941
No log 5.5439 316 0.7858 0.5335 0.7858 0.8864
No log 5.5789 318 0.7795 0.5382 0.7795 0.8829
No log 5.6140 320 0.7729 0.5382 0.7729 0.8791
No log 5.6491 322 0.7717 0.5335 0.7717 0.8785
No log 5.6842 324 0.7646 0.5647 0.7646 0.8744
No log 5.7193 326 0.8151 0.5385 0.8151 0.9028
No log 5.7544 328 0.9262 0.5797 0.9262 0.9624
No log 5.7895 330 0.9226 0.5801 0.9226 0.9605
No log 5.8246 332 0.8574 0.6 0.8574 0.9260
No log 5.8596 334 0.7421 0.6249 0.7421 0.8615
No log 5.8947 336 0.6793 0.5931 0.6793 0.8242
No log 5.9298 338 0.6826 0.6041 0.6826 0.8262
No log 5.9649 340 0.7102 0.6065 0.7102 0.8428
No log 6.0 342 0.8019 0.6305 0.8019 0.8955
No log 6.0351 344 0.9531 0.5380 0.9531 0.9763
No log 6.0702 346 0.9395 0.5847 0.9395 0.9693
No log 6.1053 348 0.9026 0.5743 0.9026 0.9500
No log 6.1404 350 0.8819 0.5542 0.8819 0.9391
No log 6.1754 352 0.8970 0.5346 0.8970 0.9471
No log 6.2105 354 0.9461 0.5426 0.9461 0.9727
No log 6.2456 356 0.9965 0.5878 0.9965 0.9983
No log 6.2807 358 0.9653 0.5793 0.9653 0.9825
No log 6.3158 360 0.8779 0.5683 0.8779 0.9370
No log 6.3509 362 0.8848 0.5517 0.8848 0.9406
No log 6.3860 364 0.9161 0.5797 0.9161 0.9571
No log 6.4211 366 0.9204 0.5793 0.9204 0.9594
No log 6.4561 368 0.9098 0.5793 0.9098 0.9538
No log 6.4912 370 0.9407 0.5945 0.9407 0.9699
No log 6.5263 372 0.9333 0.5818 0.9333 0.9661
No log 6.5614 374 0.9051 0.5660 0.9051 0.9514
No log 6.5965 376 0.9230 0.5823 0.9230 0.9607
No log 6.6316 378 0.8655 0.5168 0.8655 0.9303
No log 6.6667 380 0.8214 0.5545 0.8214 0.9063
No log 6.7018 382 0.8169 0.5521 0.8169 0.9038
No log 6.7368 384 0.8017 0.5202 0.8017 0.8954
No log 6.7719 386 0.8476 0.4615 0.8476 0.9207
No log 6.8070 388 0.8775 0.5216 0.8775 0.9367
No log 6.8421 390 0.8682 0.5150 0.8682 0.9318
No log 6.8772 392 0.7999 0.5157 0.7999 0.8944
No log 6.9123 394 0.7933 0.5141 0.7933 0.8907
No log 6.9474 396 0.7910 0.5141 0.7910 0.8894
No log 6.9825 398 0.7759 0.5151 0.7759 0.8809
No log 7.0175 400 0.7805 0.5476 0.7805 0.8835
No log 7.0526 402 0.8066 0.5141 0.8066 0.8981
No log 7.0877 404 0.7787 0.5473 0.7787 0.8824
No log 7.1228 406 0.7113 0.6189 0.7113 0.8434
No log 7.1579 408 0.6893 0.5884 0.6893 0.8303
No log 7.1930 410 0.6855 0.6002 0.6855 0.8279
No log 7.2281 412 0.6895 0.5996 0.6895 0.8303
No log 7.2632 414 0.7523 0.5769 0.7523 0.8673
No log 7.2982 416 0.8834 0.5779 0.8834 0.9399
No log 7.3333 418 0.9228 0.5485 0.9228 0.9606
No log 7.3684 420 0.9431 0.5485 0.9431 0.9711
No log 7.4035 422 0.9486 0.5094 0.9486 0.9740
No log 7.4386 424 0.9102 0.5408 0.9102 0.9540
No log 7.4737 426 0.8232 0.5962 0.8232 0.9073
No log 7.5088 428 0.7810 0.6066 0.7810 0.8838
No log 7.5439 430 0.7575 0.5838 0.7575 0.8704
No log 7.5789 432 0.7592 0.5868 0.7592 0.8713
No log 7.6140 434 0.7629 0.5424 0.7629 0.8735
No log 7.6491 436 0.7731 0.4889 0.7731 0.8793
No log 7.6842 438 0.7850 0.4889 0.7850 0.8860
No log 7.7193 440 0.7920 0.4876 0.7920 0.8900
No log 7.7544 442 0.8224 0.4840 0.8224 0.9068
No log 7.7895 444 0.8003 0.5625 0.8003 0.8946
No log 7.8246 446 0.7447 0.6499 0.7447 0.8629
No log 7.8596 448 0.7450 0.6479 0.7450 0.8632
No log 7.8947 450 0.7509 0.6404 0.7509 0.8666
No log 7.9298 452 0.7476 0.5086 0.7476 0.8646
No log 7.9649 454 0.7789 0.4954 0.7789 0.8825
No log 8.0 456 0.8138 0.4969 0.8138 0.9021
No log 8.0351 458 0.8347 0.5478 0.8347 0.9136
No log 8.0702 460 0.8358 0.5601 0.8358 0.9142
No log 8.1053 462 0.8219 0.5823 0.8219 0.9066
No log 8.1404 464 0.8316 0.5300 0.8316 0.9119
No log 8.1754 466 0.8420 0.5647 0.8420 0.9176
No log 8.2105 468 0.9262 0.5743 0.9262 0.9624
No log 8.2456 470 0.9456 0.5743 0.9456 0.9724
No log 8.2807 472 0.8923 0.5332 0.8923 0.9446
No log 8.3158 474 0.8118 0.5159 0.8118 0.9010
No log 8.3509 476 0.7743 0.5895 0.7743 0.8799
No log 8.3860 478 0.7636 0.6107 0.7636 0.8738
No log 8.4211 480 0.7662 0.5504 0.7662 0.8753
No log 8.4561 482 0.7904 0.5595 0.7904 0.8890
No log 8.4912 484 0.8056 0.5331 0.8056 0.8976
No log 8.5263 486 0.7842 0.5743 0.7842 0.8855
No log 8.5614 488 0.7681 0.5917 0.7681 0.8764
No log 8.5965 490 0.7710 0.5949 0.7710 0.8780
No log 8.6316 492 0.7658 0.6098 0.7658 0.8751
No log 8.6667 494 0.7652 0.6157 0.7652 0.8748
No log 8.7018 496 0.7707 0.5695 0.7707 0.8779
No log 8.7368 498 0.7999 0.5659 0.7999 0.8944
0.3412 8.7719 500 0.7973 0.5892 0.7973 0.8929
0.3412 8.8070 502 0.7917 0.5892 0.7917 0.8897
0.3412 8.8421 504 0.7861 0.5785 0.7861 0.8866
0.3412 8.8772 506 0.8007 0.5283 0.8007 0.8948
0.3412 8.9123 508 0.7972 0.5264 0.7972 0.8928
0.3412 8.9474 510 0.7667 0.5838 0.7667 0.8756
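Note that the best validation Qwk in this log (0.6662 at epoch 2.5965, step 148) is noticeably higher than the final reported checkpoint's 0.5838, so selecting a checkpoint by Qwk rather than taking the last row may matter. A minimal scan over a few rows excerpted from the table above:

```python
# (epoch, step, qwk) rows excerpted from the training-results table
rows = [
    (1.0175, 58, 0.6580),
    (2.5965, 148, 0.6662),
    (7.8246, 446, 0.6499),
    (8.9474, 510, 0.5838),  # final reported checkpoint
]

# Pick the checkpoint with the highest validation Qwk
best = max(rows, key=lambda r: r[2])
print(best)  # (2.5965, 148, 0.6662)
```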

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1