ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6271
  • Qwk (Quadratic Weighted Kappa): 0.5339
  • Mse (Mean Squared Error): 0.6271
  • Rmse (Root Mean Squared Error): 0.7919
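
These are regression-style metrics for an ordinal scoring task: the reported Loss and Mse coincide, which is consistent with a mean-squared-error training objective, and Rmse is the square root of Mse. The sketch below shows how such metrics are commonly computed; it assumes scikit-learn and is illustrative, not the card's original evaluation code.

```python
# Illustrative sketch (not from the original card): typical Qwk / Mse / Rmse
# computation for an ordinal essay-scoring task, assuming scikit-learn.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    # Quadratic Weighted Kappa expects discrete labels, so continuous
    # regression outputs are rounded to the nearest score first.
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```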

Model description

More information needed

Intended uses & limitations

More information needed
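
No usage example is provided by the card. A minimal loading sketch is given below; it assumes the checkpoint was saved with a standard single-output sequence-classification (regression) head, which is consistent with the MSE/RMSE evaluation metrics, but the actual task head may differ.

```python
# Illustrative sketch only; assumes a single-output regression head
# (AutoModelForSequenceClassification with num_labels=1).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task5_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "نص المقال العربي هنا"  # placeholder Arabic essay text
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    predicted_score = model(**inputs).logits.squeeze().item()
print(predicted_score)
```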

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
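
The hyperparameters above map directly onto Hugging Face TrainingArguments (the card lists Transformers 4.44.2 under Framework versions). The sketch below restates them in that form; it is illustrative, not the original training script, and the output path is a placeholder.

```python
# Illustrative sketch: the reported hyperparameters expressed as
# Hugging Face TrainingArguments. Not the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",                 # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```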

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0235 2 4.1445 -0.0020 4.1445 2.0358
No log 0.0471 4 2.2479 0.0790 2.2479 1.4993
No log 0.0706 6 1.5649 0.0143 1.5649 1.2510
No log 0.0941 8 1.0879 0.1711 1.0879 1.0430
No log 0.1176 10 1.0040 0.3130 1.0040 1.0020
No log 0.1412 12 1.0047 0.2643 1.0047 1.0023
No log 0.1647 14 1.0981 0.1851 1.0981 1.0479
No log 0.1882 16 1.1770 0.2075 1.1770 1.0849
No log 0.2118 18 0.8374 0.3590 0.8374 0.9151
No log 0.2353 20 0.7653 0.4192 0.7653 0.8748
No log 0.2588 22 0.7185 0.4304 0.7185 0.8477
No log 0.2824 24 1.1932 0.3561 1.1932 1.0923
No log 0.3059 26 1.8345 0.2324 1.8345 1.3544
No log 0.3294 28 1.4785 0.3329 1.4785 1.2159
No log 0.3529 30 0.7518 0.6357 0.7518 0.8671
No log 0.3765 32 0.6501 0.6365 0.6501 0.8063
No log 0.4 34 0.6344 0.6186 0.6344 0.7965
No log 0.4235 36 0.7981 0.5348 0.7981 0.8934
No log 0.4471 38 0.9656 0.4693 0.9656 0.9827
No log 0.4706 40 0.8436 0.5345 0.8436 0.9185
No log 0.4941 42 0.7330 0.5940 0.7330 0.8562
No log 0.5176 44 0.7906 0.5541 0.7906 0.8891
No log 0.5412 46 0.7715 0.5640 0.7715 0.8784
No log 0.5647 48 0.7559 0.6175 0.7559 0.8694
No log 0.5882 50 0.7426 0.6113 0.7426 0.8617
No log 0.6118 52 0.7196 0.6038 0.7196 0.8483
No log 0.6353 54 0.7580 0.5718 0.7580 0.8707
No log 0.6588 56 0.7378 0.5339 0.7378 0.8590
No log 0.6824 58 0.6326 0.6374 0.6326 0.7954
No log 0.7059 60 0.6344 0.6374 0.6344 0.7965
No log 0.7294 62 0.6595 0.6203 0.6595 0.8121
No log 0.7529 64 0.9634 0.5464 0.9634 0.9815
No log 0.7765 66 1.0746 0.4779 1.0746 1.0366
No log 0.8 68 0.7777 0.6645 0.7777 0.8819
No log 0.8235 70 0.6178 0.6230 0.6178 0.7860
No log 0.8471 72 0.6071 0.6493 0.6071 0.7792
No log 0.8706 74 0.6667 0.6795 0.6667 0.8165
No log 0.8941 76 0.6854 0.6221 0.6854 0.8279
No log 0.9176 78 0.7047 0.6048 0.7047 0.8395
No log 0.9412 80 0.6494 0.5524 0.6494 0.8058
No log 0.9647 82 0.6250 0.5726 0.6250 0.7905
No log 0.9882 84 0.6032 0.6780 0.6032 0.7767
No log 1.0118 86 0.6201 0.6667 0.6201 0.7874
No log 1.0353 88 0.6351 0.5999 0.6351 0.7969
No log 1.0588 90 0.6906 0.6175 0.6906 0.8310
No log 1.0824 92 0.9067 0.5772 0.9067 0.9522
No log 1.1059 94 0.9197 0.5457 0.9197 0.9590
No log 1.1294 96 0.7663 0.6612 0.7663 0.8754
No log 1.1529 98 0.8336 0.6166 0.8336 0.9130
No log 1.1765 100 0.7524 0.5884 0.7524 0.8674
No log 1.2 102 0.7185 0.6559 0.7185 0.8477
No log 1.2235 104 1.1409 0.4594 1.1409 1.0681
No log 1.2471 106 1.2753 0.4289 1.2753 1.1293
No log 1.2706 108 1.0246 0.5191 1.0246 1.0122
No log 1.2941 110 0.7460 0.6350 0.7460 0.8637
No log 1.3176 112 0.7713 0.6535 0.7713 0.8782
No log 1.3412 114 0.7919 0.6741 0.7919 0.8899
No log 1.3647 116 0.7326 0.6287 0.7326 0.8559
No log 1.3882 118 0.7023 0.5824 0.7023 0.8380
No log 1.4118 120 0.8496 0.5800 0.8496 0.9217
No log 1.4353 122 0.8197 0.5687 0.8197 0.9054
No log 1.4588 124 0.6830 0.5481 0.6830 0.8264
No log 1.4824 126 0.6486 0.6169 0.6486 0.8054
No log 1.5059 128 0.6399 0.6510 0.6399 0.7999
No log 1.5294 130 0.6991 0.5406 0.6991 0.8361
No log 1.5529 132 0.9188 0.5357 0.9188 0.9585
No log 1.5765 134 0.9676 0.5181 0.9676 0.9837
No log 1.6 136 0.7649 0.6219 0.7649 0.8746
No log 1.6235 138 0.6645 0.5937 0.6645 0.8151
No log 1.6471 140 0.7014 0.6676 0.7014 0.8375
No log 1.6706 142 0.6716 0.6840 0.6716 0.8195
No log 1.6941 144 0.6629 0.5951 0.6629 0.8142
No log 1.7176 146 0.6787 0.6043 0.6787 0.8238
No log 1.7412 148 0.6479 0.6260 0.6479 0.8049
No log 1.7647 150 0.6267 0.6508 0.6267 0.7917
No log 1.7882 152 0.6142 0.6804 0.6142 0.7837
No log 1.8118 154 0.6128 0.6564 0.6128 0.7828
No log 1.8353 156 0.6720 0.6019 0.6720 0.8198
No log 1.8588 158 0.7340 0.5812 0.7340 0.8567
No log 1.8824 160 0.6755 0.6091 0.6755 0.8219
No log 1.9059 162 0.6447 0.6790 0.6447 0.8029
No log 1.9294 164 0.7480 0.6029 0.7480 0.8649
No log 1.9529 166 0.7255 0.6013 0.7255 0.8518
No log 1.9765 168 0.6574 0.6479 0.6574 0.8108
No log 2.0 170 0.6886 0.6555 0.6886 0.8298
No log 2.0235 172 0.6956 0.6426 0.6956 0.8341
No log 2.0471 174 0.6380 0.6695 0.6380 0.7988
No log 2.0706 176 0.6728 0.6427 0.6728 0.8202
No log 2.0941 178 0.6796 0.6388 0.6796 0.8244
No log 2.1176 180 0.6237 0.6518 0.6237 0.7898
No log 2.1412 182 0.6714 0.6035 0.6714 0.8194
No log 2.1647 184 0.7415 0.5157 0.7415 0.8611
No log 2.1882 186 0.7191 0.5739 0.7191 0.8480
No log 2.2118 188 0.6364 0.6089 0.6364 0.7978
No log 2.2353 190 0.6116 0.6979 0.6116 0.7821
No log 2.2588 192 0.5980 0.6664 0.5980 0.7733
No log 2.2824 194 0.5824 0.6701 0.5824 0.7631
No log 2.3059 196 0.5707 0.6578 0.5707 0.7555
No log 2.3294 198 0.5890 0.6246 0.5890 0.7675
No log 2.3529 200 0.6042 0.6089 0.6042 0.7773
No log 2.3765 202 0.6000 0.5879 0.6000 0.7746
No log 2.4 204 0.5661 0.6762 0.5661 0.7524
No log 2.4235 206 0.5570 0.6988 0.5570 0.7463
No log 2.4471 208 0.5890 0.5966 0.5890 0.7675
No log 2.4706 210 0.7765 0.5832 0.7765 0.8812
No log 2.4941 212 0.8740 0.5840 0.8740 0.9349
No log 2.5176 214 0.7621 0.6281 0.7621 0.8730
No log 2.5412 216 0.6043 0.6144 0.6043 0.7774
No log 2.5647 218 0.6264 0.6511 0.6264 0.7914
No log 2.5882 220 0.6894 0.6502 0.6894 0.8303
No log 2.6118 222 0.6606 0.6397 0.6606 0.8128
No log 2.6353 224 0.6475 0.5301 0.6475 0.8047
No log 2.6588 226 0.7459 0.6013 0.7459 0.8637
No log 2.6824 228 0.8423 0.5832 0.8423 0.9178
No log 2.7059 230 0.8166 0.5924 0.8166 0.9037
No log 2.7294 232 0.7044 0.5847 0.7044 0.8393
No log 2.7529 234 0.6263 0.5851 0.6263 0.7914
No log 2.7765 236 0.6478 0.6573 0.6478 0.8048
No log 2.8 238 0.6407 0.6575 0.6407 0.8004
No log 2.8235 240 0.6071 0.6360 0.6071 0.7792
No log 2.8471 242 0.6400 0.4975 0.6400 0.8000
No log 2.8706 244 0.7048 0.5134 0.7048 0.8395
No log 2.8941 246 0.6495 0.5571 0.6495 0.8059
No log 2.9176 248 0.5960 0.6377 0.5960 0.7720
No log 2.9412 250 0.6098 0.6636 0.6098 0.7809
No log 2.9647 252 0.6207 0.6714 0.6207 0.7879
No log 2.9882 254 0.6292 0.6441 0.6292 0.7932
No log 3.0118 256 0.6727 0.6091 0.6727 0.8202
No log 3.0353 258 0.6685 0.5922 0.6685 0.8176
No log 3.0588 260 0.6667 0.6045 0.6667 0.8165
No log 3.0824 262 0.6567 0.6586 0.6567 0.8103
No log 3.1059 264 0.6523 0.6613 0.6523 0.8077
No log 3.1294 266 0.6501 0.6518 0.6501 0.8063
No log 3.1529 268 0.6486 0.6492 0.6486 0.8053
No log 3.1765 270 0.6289 0.6392 0.6289 0.7930
No log 3.2 272 0.6260 0.5950 0.6260 0.7912
No log 3.2235 274 0.6214 0.5879 0.6214 0.7883
No log 3.2471 276 0.6226 0.6154 0.6226 0.7890
No log 3.2706 278 0.6255 0.7153 0.6255 0.7909
No log 3.2941 280 0.6621 0.6221 0.6621 0.8137
No log 3.3176 282 0.6451 0.6350 0.6451 0.8032
No log 3.3412 284 0.6525 0.6426 0.6525 0.8078
No log 3.3647 286 0.7297 0.5648 0.7297 0.8542
No log 3.3882 288 0.7733 0.5658 0.7733 0.8794
No log 3.4118 290 0.6604 0.5583 0.6604 0.8126
No log 3.4353 292 0.5797 0.6230 0.5797 0.7614
No log 3.4588 294 0.6096 0.6291 0.6096 0.7808
No log 3.4824 296 0.6138 0.6575 0.6138 0.7835
No log 3.5059 298 0.5891 0.6545 0.5891 0.7676
No log 3.5294 300 0.6114 0.6354 0.6114 0.7819
No log 3.5529 302 0.6225 0.6269 0.6225 0.7890
No log 3.5765 304 0.6699 0.5617 0.6699 0.8185
No log 3.6 306 0.6541 0.6065 0.6541 0.8087
No log 3.6235 308 0.6110 0.6545 0.6110 0.7817
No log 3.6471 310 0.6125 0.6641 0.6125 0.7826
No log 3.6706 312 0.6238 0.6096 0.6238 0.7898
No log 3.6941 314 0.6921 0.5548 0.6921 0.8319
No log 3.7176 316 0.7626 0.5132 0.7626 0.8733
No log 3.7412 318 0.7107 0.5335 0.7107 0.8430
No log 3.7647 320 0.6370 0.5524 0.6370 0.7981
No log 3.7882 322 0.6284 0.6038 0.6284 0.7927
No log 3.8118 324 0.6536 0.5905 0.6536 0.8084
No log 3.8353 326 0.6610 0.5905 0.6610 0.8130
No log 3.8588 328 0.6478 0.5794 0.6478 0.8049
No log 3.8824 330 0.7330 0.5525 0.7330 0.8562
No log 3.9059 332 0.7625 0.5730 0.7625 0.8732
No log 3.9294 334 0.6919 0.5849 0.6919 0.8318
No log 3.9529 336 0.6539 0.4938 0.6539 0.8086
No log 3.9765 338 0.6870 0.5375 0.6870 0.8289
No log 4.0 340 0.7180 0.5858 0.7180 0.8473
No log 4.0235 342 0.6673 0.6177 0.6673 0.8169
No log 4.0471 344 0.6046 0.5955 0.6046 0.7776
No log 4.0706 346 0.6390 0.6173 0.6390 0.7993
No log 4.0941 348 0.7269 0.5924 0.7269 0.8526
No log 4.1176 350 0.7188 0.6092 0.7188 0.8478
No log 4.1412 352 0.7020 0.5992 0.7020 0.8379
No log 4.1647 354 0.6306 0.6721 0.6306 0.7941
No log 4.1882 356 0.6081 0.6737 0.6081 0.7798
No log 4.2118 358 0.6089 0.6896 0.6089 0.7803
No log 4.2353 360 0.6002 0.6896 0.6002 0.7747
No log 4.2588 362 0.5899 0.6561 0.5899 0.7681
No log 4.2824 364 0.6125 0.6386 0.6125 0.7826
No log 4.3059 366 0.6110 0.6386 0.6110 0.7817
No log 4.3294 368 0.6232 0.6603 0.6232 0.7894
No log 4.3529 370 0.6068 0.5839 0.6068 0.7790
No log 4.3765 372 0.6439 0.5215 0.6439 0.8024
No log 4.4 374 0.6537 0.5342 0.6537 0.8085
No log 4.4235 376 0.6199 0.5982 0.6199 0.7873
No log 4.4471 378 0.6160 0.6135 0.6160 0.7849
No log 4.4706 380 0.6258 0.6299 0.6258 0.7911
No log 4.4941 382 0.6871 0.6619 0.6871 0.8289
No log 4.5176 384 0.7156 0.6251 0.7156 0.8459
No log 4.5412 386 0.6747 0.5835 0.6747 0.8214
No log 4.5647 388 0.6205 0.6410 0.6205 0.7877
No log 4.5882 390 0.5929 0.6598 0.5929 0.7700
No log 4.6118 392 0.5907 0.6853 0.5907 0.7686
No log 4.6353 394 0.5987 0.6528 0.5987 0.7737
No log 4.6588 396 0.6086 0.6451 0.6086 0.7801
No log 4.6824 398 0.6086 0.6557 0.6086 0.7801
No log 4.7059 400 0.6126 0.6557 0.6126 0.7827
No log 4.7294 402 0.6102 0.6292 0.6102 0.7811
No log 4.7529 404 0.6324 0.5653 0.6324 0.7953
No log 4.7765 406 0.6145 0.5653 0.6145 0.7839
No log 4.8 408 0.5885 0.6830 0.5885 0.7671
No log 4.8235 410 0.5865 0.6748 0.5865 0.7658
No log 4.8471 412 0.5750 0.6748 0.5750 0.7583
No log 4.8706 414 0.5669 0.6561 0.5669 0.7529
No log 4.8941 416 0.6028 0.6701 0.6028 0.7764
No log 4.9176 418 0.6023 0.6701 0.6023 0.7761
No log 4.9412 420 0.5819 0.6097 0.5819 0.7628
No log 4.9647 422 0.5870 0.6230 0.5870 0.7662
No log 4.9882 424 0.6140 0.6117 0.6140 0.7836
No log 5.0118 426 0.6287 0.6198 0.6287 0.7929
No log 5.0353 428 0.6164 0.5796 0.6164 0.7851
No log 5.0588 430 0.6125 0.5562 0.6125 0.7826
No log 5.0824 432 0.6024 0.5542 0.6024 0.7761
No log 5.1059 434 0.5998 0.6444 0.5998 0.7745
No log 5.1294 436 0.6151 0.6891 0.6151 0.7843
No log 5.1529 438 0.6153 0.6934 0.6153 0.7844
No log 5.1765 440 0.6065 0.6761 0.6065 0.7788
No log 5.2 442 0.5923 0.6307 0.5923 0.7696
No log 5.2235 444 0.5970 0.6343 0.5970 0.7726
No log 5.2471 446 0.6003 0.5679 0.6003 0.7748
No log 5.2706 448 0.6252 0.5459 0.6252 0.7907
No log 5.2941 450 0.6037 0.5459 0.6037 0.7770
No log 5.3176 452 0.5879 0.6139 0.5879 0.7668
No log 5.3412 454 0.5925 0.6445 0.5925 0.7698
No log 5.3647 456 0.6326 0.5810 0.6326 0.7954
No log 5.3882 458 0.6325 0.5835 0.6325 0.7953
No log 5.4118 460 0.6120 0.6117 0.6120 0.7823
No log 5.4353 462 0.5912 0.6648 0.5912 0.7689
No log 5.4588 464 0.5909 0.6358 0.5909 0.7687
No log 5.4824 466 0.6049 0.6525 0.6049 0.7778
No log 5.5059 468 0.6247 0.6330 0.6247 0.7904
No log 5.5294 470 0.6399 0.6135 0.6399 0.8000
No log 5.5529 472 0.6219 0.6771 0.6219 0.7886
No log 5.5765 474 0.5914 0.6772 0.5914 0.7691
No log 5.6 476 0.6232 0.6570 0.6232 0.7894
No log 5.6235 478 0.7032 0.6387 0.7032 0.8385
No log 5.6471 480 0.7382 0.6466 0.7382 0.8592
No log 5.6706 482 0.6886 0.6636 0.6886 0.8298
No log 5.6941 484 0.6048 0.6644 0.6048 0.7777
No log 5.7176 486 0.5782 0.6846 0.5782 0.7604
No log 5.7412 488 0.6104 0.5748 0.6104 0.7813
No log 5.7647 490 0.6269 0.5548 0.6269 0.7917
No log 5.7882 492 0.6097 0.5542 0.6097 0.7808
No log 5.8118 494 0.5833 0.6648 0.5833 0.7637
No log 5.8353 496 0.5787 0.6745 0.5787 0.7607
No log 5.8588 498 0.5726 0.6962 0.5726 0.7567
0.2743 5.8824 500 0.5681 0.6772 0.5681 0.7537
0.2743 5.9059 502 0.6416 0.6565 0.6416 0.8010
0.2743 5.9294 504 0.7245 0.6248 0.7245 0.8512
0.2743 5.9529 506 0.7520 0.6076 0.7520 0.8672
0.2743 5.9765 508 0.6949 0.5690 0.6949 0.8336
0.2743 6.0 510 0.6271 0.5339 0.6271 0.7919

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1