ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k12_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02. The training dataset is not specified in this card. The model achieves the following results on the evaluation set:

  • Loss: 0.6736
  • Qwk: 0.5971
  • Mse: 0.6736
  • Rmse: 0.8208
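
Qwk above is the Quadratic Weighted Kappa, an agreement metric commonly used for ordinal scoring tasks such as essay grading. As a minimal pure-Python sketch of how it is computed (the labels below are illustrative, not taken from this model's evaluation set):

```python
def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Quadratic Weighted Kappa for integer labels in [0, num_classes)."""
    n = len(y_true)
    # Observed confusion matrix: rows = true label, cols = predicted label.
    observed = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms; the expected matrix is their outer product / n.
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(num_classes))
                 for j in range(num_classes)]
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            # Quadratic disagreement weight, 0 on the diagonal.
            w = ((i - j) ** 2) / ((num_classes - 1) ** 2)
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

# Illustrative labels only (not the model's actual predictions).
y_true = [0, 1, 2, 2, 3]
y_pred = [0, 1, 1, 2, 3]
qwk = quadratic_weighted_kappa(y_true, y_pred, num_classes=4)
```

A QWK of 1.0 means perfect agreement and 0.0 means chance-level agreement, so the 0.5971 reported here indicates moderate agreement between predicted and reference scores.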

Model description

More information needed

Intended uses & limitations

More information needed
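
In the absence of stated usage details, the checkpoint can presumably be loaded with the standard Transformers auto classes. A minimal sketch, assuming a sequence-classification head (the task head and label count are not documented in this card):

```python
def load(repo_id):
    """Download and load the fine-tuned tokenizer and model from the Hub.

    Assumes a sequence-classification head; this is not confirmed by the card.
    """
    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForSequenceClassification.from_pretrained(repo_id)
    return tokenizer, model

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k12_task5_organization"
# Calling load(repo) downloads the checkpoint from the Hugging Face Hub.
```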

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
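
The hyperparameters above can be expressed as a Transformers `TrainingArguments` config fragment. A sketch under stated assumptions: the output directory and evaluation strategy are illustrative, not taken from the original training script.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; output_dir and eval_strategy
# are assumptions, not documented in this card.
args = TrainingArguments(
    output_dir="out",                 # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",            # assumed from the per-step eval logs below
)
```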

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0333 2 3.8428 -0.0172 3.8428 1.9603
No log 0.0667 4 1.9405 0.0103 1.9405 1.3930
No log 0.1 6 1.1931 0.1086 1.1931 1.0923
No log 0.1333 8 1.1671 0.1125 1.1671 1.0803
No log 0.1667 10 1.1488 0.3014 1.1488 1.0718
No log 0.2 12 1.0444 0.3344 1.0444 1.0219
No log 0.2333 14 1.1067 0.2963 1.1067 1.0520
No log 0.2667 16 1.2664 0.3216 1.2664 1.1253
No log 0.3 18 0.8656 0.5565 0.8656 0.9304
No log 0.3333 20 0.7256 0.5787 0.7256 0.8518
No log 0.3667 22 0.7532 0.6206 0.7532 0.8679
No log 0.4 24 0.7092 0.6274 0.7092 0.8422
No log 0.4333 26 0.9955 0.5608 0.9955 0.9977
No log 0.4667 28 0.9179 0.5365 0.9179 0.9581
No log 0.5 30 0.6891 0.6319 0.6891 0.8301
No log 0.5333 32 0.7616 0.5680 0.7616 0.8727
No log 0.5667 34 0.8821 0.5410 0.8821 0.9392
No log 0.6 36 0.9564 0.4692 0.9564 0.9780
No log 0.6333 38 0.7757 0.5425 0.7757 0.8808
No log 0.6667 40 0.6957 0.6051 0.6957 0.8341
No log 0.7 42 0.8472 0.6067 0.8472 0.9204
No log 0.7333 44 0.9421 0.5825 0.9421 0.9706
No log 0.7667 46 0.8897 0.6129 0.8897 0.9432
No log 0.8 48 0.8053 0.6844 0.8053 0.8974
No log 0.8333 50 0.7313 0.6221 0.7313 0.8551
No log 0.8667 52 0.7635 0.6618 0.7635 0.8738
No log 0.9 54 0.7477 0.6317 0.7477 0.8647
No log 0.9333 56 0.6973 0.6142 0.6973 0.8350
No log 0.9667 58 0.7154 0.5828 0.7154 0.8458
No log 1.0 60 0.7250 0.5869 0.7250 0.8515
No log 1.0333 62 0.7018 0.5713 0.7018 0.8377
No log 1.0667 64 0.6896 0.6352 0.6896 0.8304
No log 1.1 66 0.6898 0.6520 0.6898 0.8305
No log 1.1333 68 0.6496 0.6578 0.6496 0.8060
No log 1.1667 70 0.6722 0.6473 0.6722 0.8199
No log 1.2 72 0.7030 0.6689 0.7030 0.8385
No log 1.2333 74 0.7165 0.6662 0.7165 0.8465
No log 1.2667 76 0.6671 0.6452 0.6671 0.8167
No log 1.3 78 0.6804 0.6294 0.6804 0.8249
No log 1.3333 80 0.6599 0.6215 0.6599 0.8124
No log 1.3667 82 0.6794 0.6087 0.6794 0.8243
No log 1.4 84 0.6916 0.6240 0.6916 0.8316
No log 1.4333 86 0.8802 0.6369 0.8802 0.9382
No log 1.4667 88 1.1230 0.4260 1.1230 1.0597
No log 1.5 90 0.9930 0.5466 0.9930 0.9965
No log 1.5333 92 0.7575 0.6296 0.7575 0.8704
No log 1.5667 94 0.7500 0.6106 0.7500 0.8661
No log 1.6 96 0.7578 0.6230 0.7578 0.8705
No log 1.6333 98 0.6727 0.6094 0.6727 0.8202
No log 1.6667 100 0.6717 0.6354 0.6717 0.8196
No log 1.7 102 0.6652 0.6345 0.6652 0.8156
No log 1.7333 104 0.6472 0.6205 0.6472 0.8045
No log 1.7667 106 0.6741 0.5914 0.6741 0.8210
No log 1.8 108 0.6797 0.6067 0.6797 0.8244
No log 1.8333 110 0.6639 0.6059 0.6639 0.8148
No log 1.8667 112 0.6520 0.6077 0.6520 0.8074
No log 1.9 114 0.6313 0.6235 0.6313 0.7945
No log 1.9333 116 0.6253 0.6822 0.6253 0.7908
No log 1.9667 118 0.6292 0.6822 0.6292 0.7932
No log 2.0 120 0.6366 0.6659 0.6366 0.7979
No log 2.0333 122 0.6373 0.6561 0.6373 0.7983
No log 2.0667 124 0.6281 0.7088 0.6281 0.7926
No log 2.1 126 0.6141 0.6822 0.6141 0.7836
No log 2.1333 128 0.6233 0.6518 0.6233 0.7895
No log 2.1667 130 0.6173 0.6822 0.6173 0.7857
No log 2.2 132 0.6736 0.6107 0.6736 0.8207
No log 2.2333 134 0.6620 0.6396 0.6620 0.8136
No log 2.2667 136 0.6453 0.6602 0.6453 0.8033
No log 2.3 138 0.6690 0.6324 0.6690 0.8179
No log 2.3333 140 0.6459 0.6570 0.6459 0.8037
No log 2.3667 142 0.6378 0.6805 0.6378 0.7986
No log 2.4 144 0.7424 0.5425 0.7424 0.8617
No log 2.4333 146 0.7104 0.5032 0.7104 0.8429
No log 2.4667 148 0.6215 0.6073 0.6215 0.7883
No log 2.5 150 0.6903 0.5446 0.6903 0.8308
No log 2.5333 152 0.7026 0.6321 0.7026 0.8382
No log 2.5667 154 0.6630 0.6468 0.6630 0.8143
No log 2.6 156 0.6954 0.6630 0.6954 0.8339
No log 2.6333 158 0.7406 0.6648 0.7406 0.8606
No log 2.6667 160 0.7314 0.7001 0.7314 0.8552
No log 2.7 162 0.6812 0.6170 0.6812 0.8253
No log 2.7333 164 0.6450 0.6430 0.6450 0.8031
No log 2.7667 166 0.6401 0.6814 0.6401 0.8001
No log 2.8 168 0.6462 0.6632 0.6462 0.8039
No log 2.8333 170 0.6406 0.6593 0.6406 0.8004
No log 2.8667 172 0.6416 0.6517 0.6416 0.8010
No log 2.9 174 0.6872 0.5908 0.6872 0.8290
No log 2.9333 176 0.6917 0.6098 0.6917 0.8317
No log 2.9667 178 0.6757 0.6749 0.6757 0.8220
No log 3.0 180 0.7001 0.6797 0.7001 0.8367
No log 3.0333 182 0.7187 0.6968 0.7187 0.8478
No log 3.0667 184 0.7177 0.6833 0.7177 0.8472
No log 3.1 186 0.6872 0.6705 0.6872 0.8290
No log 3.1333 188 0.6819 0.6554 0.6819 0.8258
No log 3.1667 190 0.7191 0.6083 0.7191 0.8480
No log 3.2 192 0.6714 0.6244 0.6714 0.8194
No log 3.2333 194 0.6589 0.6038 0.6589 0.8117
No log 3.2667 196 0.7320 0.5726 0.7320 0.8556
No log 3.3 198 0.7367 0.6269 0.7367 0.8583
No log 3.3333 200 0.7156 0.6180 0.7156 0.8459
No log 3.3667 202 0.6968 0.6335 0.6968 0.8348
No log 3.4 204 0.6549 0.5545 0.6549 0.8092
No log 3.4333 206 0.6644 0.5830 0.6644 0.8151
No log 3.4667 208 0.7830 0.5691 0.7830 0.8849
No log 3.5 210 0.8600 0.5676 0.8600 0.9273
No log 3.5333 212 0.7164 0.6490 0.7164 0.8464
No log 3.5667 214 0.6392 0.6288 0.6392 0.7995
No log 3.6 216 0.6855 0.6219 0.6855 0.8279
No log 3.6333 218 0.6805 0.5690 0.6805 0.8249
No log 3.6667 220 0.6707 0.5618 0.6707 0.8189
No log 3.7 222 0.6927 0.5650 0.6927 0.8323
No log 3.7333 224 0.6779 0.5759 0.6779 0.8233
No log 3.7667 226 0.6738 0.5261 0.6738 0.8209
No log 3.8 228 0.6715 0.4804 0.6715 0.8195
No log 3.8333 230 0.6722 0.6365 0.6722 0.8199
No log 3.8667 232 0.7002 0.6234 0.7002 0.8368
No log 3.9 234 0.6968 0.6234 0.6968 0.8348
No log 3.9333 236 0.6542 0.6178 0.6542 0.8088
No log 3.9667 238 0.6876 0.6602 0.6876 0.8292
No log 4.0 240 0.7510 0.5993 0.7510 0.8666
No log 4.0333 242 0.7351 0.6289 0.7351 0.8574
No log 4.0667 244 0.6618 0.6638 0.6618 0.8135
No log 4.1 246 0.6669 0.6012 0.6669 0.8166
No log 4.1333 248 0.7211 0.5560 0.7211 0.8491
No log 4.1667 250 0.7289 0.5560 0.7289 0.8538
No log 4.2 252 0.6630 0.5588 0.6630 0.8143
No log 4.2333 254 0.6299 0.6650 0.6299 0.7937
No log 4.2667 256 0.6479 0.6703 0.6479 0.8049
No log 4.3 258 0.6393 0.6602 0.6393 0.7996
No log 4.3333 260 0.6785 0.6718 0.6785 0.8237
No log 4.3667 262 0.7895 0.5632 0.7895 0.8886
No log 4.4 264 0.7846 0.5655 0.7846 0.8858
No log 4.4333 266 0.6805 0.6206 0.6805 0.8249
No log 4.4667 268 0.6317 0.6717 0.6317 0.7948
No log 4.5 270 0.6356 0.6853 0.6356 0.7972
No log 4.5333 272 0.6519 0.6729 0.6519 0.8074
No log 4.5667 274 0.6764 0.6055 0.6764 0.8225
No log 4.6 276 0.7138 0.5944 0.7138 0.8449
No log 4.6333 278 0.7103 0.5919 0.7103 0.8428
No log 4.6667 280 0.6805 0.6269 0.6805 0.8249
No log 4.7 282 0.6683 0.6602 0.6683 0.8175
No log 4.7333 284 0.6548 0.6748 0.6548 0.8092
No log 4.7667 286 0.6673 0.6709 0.6673 0.8169
No log 4.8 288 0.6872 0.7034 0.6872 0.8290
No log 4.8333 290 0.7056 0.6975 0.7056 0.8400
No log 4.8667 292 0.7483 0.6689 0.7483 0.8650
No log 4.9 294 0.7454 0.6632 0.7454 0.8634
No log 4.9333 296 0.6847 0.7219 0.6847 0.8275
No log 4.9667 298 0.6694 0.6578 0.6694 0.8182
No log 5.0 300 0.6829 0.6644 0.6829 0.8264
No log 5.0333 302 0.6668 0.6316 0.6668 0.8166
No log 5.0667 304 0.6631 0.6689 0.6631 0.8143
No log 5.1 306 0.6599 0.6813 0.6599 0.8124
No log 5.1333 308 0.6504 0.6909 0.6504 0.8065
No log 5.1667 310 0.6296 0.6909 0.6296 0.7935
No log 5.2 312 0.6185 0.7080 0.6185 0.7865
No log 5.2333 314 0.6380 0.6450 0.6380 0.7987
No log 5.2667 316 0.6359 0.6612 0.6359 0.7974
No log 5.3 318 0.6292 0.7026 0.6292 0.7932
No log 5.3333 320 0.6442 0.6995 0.6442 0.8026
No log 5.3667 322 0.6627 0.6951 0.6627 0.8141
No log 5.4 324 0.6639 0.6822 0.6639 0.8148
No log 5.4333 326 0.6734 0.6649 0.6734 0.8206
No log 5.4667 328 0.6831 0.6649 0.6831 0.8265
No log 5.5 330 0.6920 0.6610 0.6920 0.8319
No log 5.5333 332 0.6890 0.6572 0.6890 0.8301
No log 5.5667 334 0.6955 0.6247 0.6955 0.8339
No log 5.6 336 0.6508 0.6389 0.6508 0.8067
No log 5.6333 338 0.6123 0.6473 0.6123 0.7825
No log 5.6667 340 0.6476 0.6595 0.6476 0.8047
No log 5.7 342 0.6755 0.6026 0.6755 0.8219
No log 5.7333 344 0.6258 0.7336 0.6258 0.7911
No log 5.7667 346 0.6191 0.6888 0.6191 0.7868
No log 5.8 348 0.6268 0.6888 0.6268 0.7917
No log 5.8333 350 0.6151 0.7137 0.6151 0.7843
No log 5.8667 352 0.6265 0.6888 0.6265 0.7915
No log 5.9 354 0.6154 0.7063 0.6154 0.7845
No log 5.9333 356 0.5931 0.7012 0.5931 0.7701
No log 5.9667 358 0.5973 0.6923 0.5973 0.7728
No log 6.0 360 0.5918 0.6916 0.5918 0.7693
No log 6.0333 362 0.6150 0.7454 0.6150 0.7842
No log 6.0667 364 0.6476 0.6746 0.6476 0.8047
No log 6.1 366 0.6314 0.6684 0.6314 0.7946
No log 6.1333 368 0.5856 0.6680 0.5856 0.7652
No log 6.1667 370 0.5882 0.6659 0.5882 0.7670
No log 6.2 372 0.6085 0.6358 0.6085 0.7801
No log 6.2333 374 0.5748 0.6659 0.5748 0.7582
No log 6.2667 376 0.5660 0.6564 0.5660 0.7524
No log 6.3 378 0.6486 0.6538 0.6486 0.8054
No log 6.3333 380 0.6874 0.5964 0.6874 0.8291
No log 6.3667 382 0.6504 0.6177 0.6504 0.8065
No log 6.4 384 0.6133 0.6380 0.6133 0.7831
No log 6.4333 386 0.5975 0.6546 0.5975 0.7730
No log 6.4667 388 0.6005 0.6890 0.6005 0.7749
No log 6.5 390 0.6086 0.6850 0.6086 0.7801
No log 6.5333 392 0.6280 0.6642 0.6280 0.7924
No log 6.5667 394 0.6236 0.6642 0.6236 0.7897
No log 6.6 396 0.6194 0.6822 0.6194 0.7870
No log 6.6333 398 0.6054 0.6822 0.6054 0.7781
No log 6.6667 400 0.5834 0.6720 0.5834 0.7638
No log 6.7 402 0.5703 0.6846 0.5703 0.7552
No log 6.7333 404 0.5684 0.7278 0.5684 0.7539
No log 6.7667 406 0.5949 0.7148 0.5949 0.7713
No log 6.8 408 0.6204 0.6389 0.6204 0.7877
No log 6.8333 410 0.5921 0.6638 0.5921 0.7695
No log 6.8667 412 0.5885 0.6932 0.5885 0.7671
No log 6.9 414 0.6168 0.6528 0.6168 0.7853
No log 6.9333 416 0.6392 0.6528 0.6392 0.7995
No log 6.9667 418 0.6230 0.6932 0.6230 0.7893
No log 7.0 420 0.6220 0.6973 0.6220 0.7886
No log 7.0333 422 0.6465 0.6324 0.6465 0.8041
No log 7.0667 424 0.6280 0.6543 0.6280 0.7925
No log 7.1 426 0.6098 0.6940 0.6098 0.7809
No log 7.1333 428 0.6181 0.6575 0.6181 0.7862
No log 7.1667 430 0.6094 0.6575 0.6094 0.7806
No log 7.2 432 0.6030 0.6555 0.6030 0.7765
No log 7.2333 434 0.5952 0.6196 0.5952 0.7715
No log 7.2667 436 0.5846 0.6575 0.5846 0.7646
No log 7.3 438 0.5976 0.6672 0.5976 0.7730
No log 7.3333 440 0.6387 0.6758 0.6387 0.7992
No log 7.3667 442 0.6277 0.6758 0.6277 0.7923
No log 7.4 444 0.5851 0.6846 0.5851 0.7649
No log 7.4333 446 0.5895 0.6940 0.5895 0.7678
No log 7.4667 448 0.6069 0.6488 0.6069 0.7791
No log 7.5 450 0.6159 0.6383 0.6159 0.7848
No log 7.5333 452 0.5980 0.6762 0.5980 0.7733
No log 7.5667 454 0.5995 0.6672 0.5995 0.7743
No log 7.6 456 0.6109 0.6397 0.6109 0.7816
No log 7.6333 458 0.6103 0.6397 0.6103 0.7812
No log 7.6667 460 0.5950 0.6672 0.5950 0.7714
No log 7.7 462 0.6012 0.6619 0.6012 0.7754
No log 7.7333 464 0.6146 0.6237 0.6146 0.7840
No log 7.7667 466 0.6223 0.6526 0.6223 0.7889
No log 7.8 468 0.6327 0.6756 0.6327 0.7955
No log 7.8333 470 0.6485 0.6226 0.6485 0.8053
No log 7.8667 472 0.6659 0.6119 0.6659 0.8160
No log 7.9 474 0.6497 0.6712 0.6497 0.8061
No log 7.9333 476 0.6524 0.6339 0.6524 0.8077
No log 7.9667 478 0.6760 0.5865 0.6760 0.8222
No log 8.0 480 0.6626 0.6246 0.6626 0.8140
No log 8.0333 482 0.6451 0.6365 0.6451 0.8032
No log 8.0667 484 0.6769 0.6564 0.6769 0.8227
No log 8.1 486 0.6826 0.6247 0.6826 0.8262
No log 8.1333 488 0.6608 0.6753 0.6608 0.8129
No log 8.1667 490 0.6564 0.6433 0.6564 0.8102
No log 8.2 492 0.6478 0.6796 0.6478 0.8049
No log 8.2333 494 0.6433 0.6712 0.6433 0.8021
No log 8.2667 496 0.6613 0.6672 0.6613 0.8132
No log 8.3 498 0.6598 0.6672 0.6598 0.8123
0.241 8.3333 500 0.6511 0.6365 0.6511 0.8069
0.241 8.3667 502 0.6403 0.6297 0.6403 0.8002
0.241 8.4 504 0.6356 0.6297 0.6356 0.7973
0.241 8.4333 506 0.6324 0.6745 0.6324 0.7952
0.241 8.4667 508 0.6408 0.6745 0.6408 0.8005
0.241 8.5 510 0.6572 0.6704 0.6572 0.8107
0.241 8.5333 512 0.6688 0.6129 0.6688 0.8178
0.241 8.5667 514 0.6623 0.6275 0.6623 0.8138
0.241 8.6 516 0.6563 0.6105 0.6563 0.8101
0.241 8.6333 518 0.6631 0.5971 0.6631 0.8143
0.241 8.6667 520 0.6736 0.5971 0.6736 0.8208

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k12_task5_organization

  • Fine-tuned from aubmindlab/bert-base-arabertv02