ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k18_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02. The training dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.6955
  • Qwk: 0.5541
  • Mse: 0.6955
  • Rmse: 0.8339
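For reference, a minimal, self-contained sketch of how these evaluation metrics (quadratic weighted kappa, MSE, RMSE) can be computed from discrete score predictions. The labels and class count below are illustrative only, not taken from this model's evaluation set.

```python
import math

def mse(y_true, y_pred):
    """Mean squared error over paired score lists."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa (Qwk) for integer labels in [0, n_classes)."""
    n = len(y_true)
    # Observed agreement matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms, used to build the expected (chance) matrix.
    hist_t = [y_true.count(c) for c in range(n_classes)]
    hist_p = [y_pred.count(c) for c in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

# Illustrative labels only (not from this model's evaluation set).
y_true = [0, 1, 2, 2]
y_pred = [0, 2, 2, 1]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)  # ≈ 0.6364
rmse = math.sqrt(mse(y_true, y_pred))                        # ≈ 0.7071
```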

Model description

More information needed

Intended uses & limitations

More information needed
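No usage details are given, but the Qwk metric implies the target is an ordinal score. A common pattern for such models is to train a single-output regression head and then clip and round its continuous prediction to the integer score scale before computing Qwk. A sketch of that post-processing step; the 0–4 score range is an assumption for illustration, not the scale actually used by this model:

```python
def to_score(prediction: float, low: int = 0, high: int = 4) -> int:
    """Clip a continuous regression output to [low, high], then round to the
    nearest integer score. The 0-4 range is an assumed example, not the
    actual scale used by this model."""
    clipped = min(max(prediction, low), high)
    return int(round(clipped))

to_score(2.4)   # -> 2
to_score(5.2)   # -> 4 (clipped to the top of the assumed range)
to_score(-0.3)  # -> 0 (clipped to the bottom of the assumed range)
```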

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
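The linear lr_scheduler_type decays the learning rate from its initial value to zero over the total number of optimizer steps. A minimal sketch of that decay, assuming zero warmup steps (none are listed above); the step count used in the example is made up for illustration:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Learning rate after `step` optimizer steps under a linear decay
    schedule with no warmup: base_lr at step 0, 0.0 at total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# With e.g. 900 total steps (an illustrative count, not from this run):
linear_lr(0, 900)    # -> 2e-05 (initial learning rate)
linear_lr(450, 900)  # -> 1e-05 (halfway through training)
linear_lr(900, 900)  # -> 0.0   (fully decayed)
```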

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0222 | 2 | 4.1398 | 0.0130 | 4.1398 | 2.0346 |
| No log | 0.0444 | 4 | 2.3525 | 0.0458 | 2.3525 | 1.5338 |
| No log | 0.0667 | 6 | 1.5084 | -0.0457 | 1.5084 | 1.2282 |
| No log | 0.0889 | 8 | 1.3266 | -0.0695 | 1.3266 | 1.1518 |
| No log | 0.1111 | 10 | 2.0830 | 0.0709 | 2.0830 | 1.4433 |
| No log | 0.1333 | 12 | 2.4225 | 0.0618 | 2.4225 | 1.5564 |
| No log | 0.1556 | 14 | 1.7524 | 0.2137 | 1.7524 | 1.3238 |
| No log | 0.1778 | 16 | 0.8995 | 0.3776 | 0.8995 | 0.9484 |
| No log | 0.2 | 18 | 0.8415 | 0.3360 | 0.8415 | 0.9173 |
| No log | 0.2222 | 20 | 0.8698 | 0.2849 | 0.8698 | 0.9327 |
| No log | 0.2444 | 22 | 0.8543 | 0.3001 | 0.8543 | 0.9243 |
| No log | 0.2667 | 24 | 0.8116 | 0.3961 | 0.8116 | 0.9009 |
| No log | 0.2889 | 26 | 0.9272 | 0.3793 | 0.9272 | 0.9629 |
| No log | 0.3111 | 28 | 0.9571 | 0.3919 | 0.9571 | 0.9783 |
| No log | 0.3333 | 30 | 1.1281 | 0.375 | 1.1281 | 1.0621 |
| No log | 0.3556 | 32 | 1.3072 | 0.4079 | 1.3072 | 1.1433 |
| No log | 0.3778 | 34 | 0.9727 | 0.4053 | 0.9727 | 0.9863 |
| No log | 0.4 | 36 | 0.9531 | 0.4574 | 0.9531 | 0.9763 |
| No log | 0.4222 | 38 | 1.3637 | 0.3511 | 1.3637 | 1.1678 |
| No log | 0.4444 | 40 | 1.4513 | 0.3261 | 1.4513 | 1.2047 |
| No log | 0.4667 | 42 | 0.9842 | 0.5376 | 0.9842 | 0.9920 |
| No log | 0.4889 | 44 | 0.7493 | 0.5979 | 0.7493 | 0.8656 |
| No log | 0.5111 | 46 | 0.8381 | 0.6255 | 0.8381 | 0.9155 |
| No log | 0.5333 | 48 | 1.1957 | 0.4455 | 1.1957 | 1.0935 |
| No log | 0.5556 | 50 | 1.1459 | 0.4688 | 1.1459 | 1.0705 |
| No log | 0.5778 | 52 | 0.9064 | 0.6128 | 0.9064 | 0.9520 |
| No log | 0.6 | 54 | 0.7538 | 0.6602 | 0.7538 | 0.8682 |
| No log | 0.6222 | 56 | 0.7585 | 0.6384 | 0.7585 | 0.8709 |
| No log | 0.6444 | 58 | 0.7896 | 0.6369 | 0.7896 | 0.8886 |
| No log | 0.6667 | 60 | 0.7837 | 0.6188 | 0.7837 | 0.8853 |
| No log | 0.6889 | 62 | 0.7466 | 0.6412 | 0.7466 | 0.8641 |
| No log | 0.7111 | 64 | 0.7678 | 0.6202 | 0.7678 | 0.8763 |
| No log | 0.7333 | 66 | 0.8969 | 0.6322 | 0.8969 | 0.9471 |
| No log | 0.7556 | 68 | 0.7901 | 0.6264 | 0.7901 | 0.8889 |
| No log | 0.7778 | 70 | 0.8272 | 0.6516 | 0.8272 | 0.9095 |
| No log | 0.8 | 72 | 1.0201 | 0.5690 | 1.0201 | 1.0100 |
| No log | 0.8222 | 74 | 0.8797 | 0.6061 | 0.8797 | 0.9379 |
| No log | 0.8444 | 76 | 0.7088 | 0.5835 | 0.7088 | 0.8419 |
| No log | 0.8667 | 78 | 0.7069 | 0.6382 | 0.7069 | 0.8408 |
| No log | 0.8889 | 80 | 0.7209 | 0.6403 | 0.7209 | 0.8490 |
| No log | 0.9111 | 82 | 0.6601 | 0.6411 | 0.6601 | 0.8125 |
| No log | 0.9333 | 84 | 0.6473 | 0.6328 | 0.6473 | 0.8046 |
| No log | 0.9556 | 86 | 0.7222 | 0.6035 | 0.7222 | 0.8498 |
| No log | 0.9778 | 88 | 0.8216 | 0.5749 | 0.8216 | 0.9064 |
| No log | 1.0 | 90 | 0.7764 | 0.5564 | 0.7764 | 0.8811 |
| No log | 1.0222 | 92 | 0.8798 | 0.6370 | 0.8798 | 0.9380 |
| No log | 1.0444 | 94 | 1.0081 | 0.5873 | 1.0081 | 1.0040 |
| No log | 1.0667 | 96 | 0.8703 | 0.6075 | 0.8703 | 0.9329 |
| No log | 1.0889 | 98 | 0.8161 | 0.5620 | 0.8161 | 0.9034 |
| No log | 1.1111 | 100 | 1.0445 | 0.4643 | 1.0445 | 1.0220 |
| No log | 1.1333 | 102 | 0.9518 | 0.4905 | 0.9518 | 0.9756 |
| No log | 1.1556 | 104 | 0.8900 | 0.5350 | 0.8900 | 0.9434 |
| No log | 1.1778 | 106 | 0.7363 | 0.6063 | 0.7363 | 0.8581 |
| No log | 1.2 | 108 | 0.6991 | 0.6176 | 0.6991 | 0.8361 |
| No log | 1.2222 | 110 | 0.6769 | 0.6543 | 0.6769 | 0.8227 |
| No log | 1.2444 | 112 | 0.6478 | 0.6789 | 0.6478 | 0.8049 |
| No log | 1.2667 | 114 | 0.6286 | 0.6622 | 0.6286 | 0.7928 |
| No log | 1.2889 | 116 | 0.6189 | 0.6444 | 0.6189 | 0.7867 |
| No log | 1.3111 | 118 | 0.6170 | 0.6636 | 0.6170 | 0.7855 |
| No log | 1.3333 | 120 | 0.6370 | 0.6630 | 0.6370 | 0.7981 |
| No log | 1.3556 | 122 | 0.7052 | 0.6492 | 0.7052 | 0.8398 |
| No log | 1.3778 | 124 | 0.7420 | 0.6555 | 0.7420 | 0.8614 |
| No log | 1.4 | 126 | 0.7075 | 0.6706 | 0.7075 | 0.8411 |
| No log | 1.4222 | 128 | 0.6794 | 0.6568 | 0.6794 | 0.8242 |
| No log | 1.4444 | 130 | 0.7155 | 0.6396 | 0.7155 | 0.8459 |
| No log | 1.4667 | 132 | 0.7974 | 0.5536 | 0.7974 | 0.8930 |
| No log | 1.4889 | 134 | 0.7802 | 0.5359 | 0.7802 | 0.8833 |
| No log | 1.5111 | 136 | 0.6821 | 0.5830 | 0.6821 | 0.8259 |
| No log | 1.5333 | 138 | 0.7016 | 0.6148 | 0.7016 | 0.8376 |
| No log | 1.5556 | 140 | 0.8073 | 0.5574 | 0.8073 | 0.8985 |
| No log | 1.5778 | 142 | 1.0499 | 0.5505 | 1.0499 | 1.0247 |
| No log | 1.6 | 144 | 1.0556 | 0.5341 | 1.0556 | 1.0274 |
| No log | 1.6222 | 146 | 0.8334 | 0.5964 | 0.8334 | 0.9129 |
| No log | 1.6444 | 148 | 0.7358 | 0.6087 | 0.7358 | 0.8578 |
| No log | 1.6667 | 150 | 0.7060 | 0.6169 | 0.7060 | 0.8402 |
| No log | 1.6889 | 152 | 0.7040 | 0.6205 | 0.7040 | 0.8390 |
| No log | 1.7111 | 154 | 0.7093 | 0.6664 | 0.7093 | 0.8422 |
| No log | 1.7333 | 156 | 0.7376 | 0.6105 | 0.7376 | 0.8589 |
| No log | 1.7556 | 158 | 0.7629 | 0.5760 | 0.7629 | 0.8734 |
| No log | 1.7778 | 160 | 0.6822 | 0.6278 | 0.6822 | 0.8260 |
| No log | 1.8 | 162 | 0.6948 | 0.4874 | 0.6948 | 0.8336 |
| No log | 1.8222 | 164 | 0.7088 | 0.5737 | 0.7088 | 0.8419 |
| No log | 1.8444 | 166 | 0.7409 | 0.5999 | 0.7409 | 0.8608 |
| No log | 1.8667 | 168 | 0.7947 | 0.6134 | 0.7947 | 0.8914 |
| No log | 1.8889 | 170 | 0.8402 | 0.6336 | 0.8402 | 0.9166 |
| No log | 1.9111 | 172 | 0.9772 | 0.5455 | 0.9772 | 0.9885 |
| No log | 1.9333 | 174 | 0.9536 | 0.5155 | 0.9536 | 0.9765 |
| No log | 1.9556 | 176 | 0.7895 | 0.5772 | 0.7895 | 0.8885 |
| No log | 1.9778 | 178 | 0.7687 | 0.5709 | 0.7687 | 0.8768 |
| No log | 2.0 | 180 | 0.8661 | 0.5607 | 0.8661 | 0.9306 |
| No log | 2.0222 | 182 | 0.8385 | 0.5327 | 0.8385 | 0.9157 |
| No log | 2.0444 | 184 | 0.7522 | 0.5329 | 0.7522 | 0.8673 |
| No log | 2.0667 | 186 | 0.6930 | 0.6460 | 0.6930 | 0.8325 |
| No log | 2.0889 | 188 | 0.7108 | 0.5902 | 0.7108 | 0.8431 |
| No log | 2.1111 | 190 | 0.7402 | 0.6032 | 0.7402 | 0.8603 |
| No log | 2.1333 | 192 | 0.7540 | 0.6379 | 0.7540 | 0.8683 |
| No log | 2.1556 | 194 | 0.7703 | 0.6214 | 0.7703 | 0.8777 |
| No log | 2.1778 | 196 | 0.7653 | 0.5757 | 0.7653 | 0.8748 |
| No log | 2.2 | 198 | 0.7556 | 0.6168 | 0.7556 | 0.8693 |
| No log | 2.2222 | 200 | 0.7574 | 0.6388 | 0.7574 | 0.8703 |
| No log | 2.2444 | 202 | 0.8013 | 0.6131 | 0.8013 | 0.8952 |
| No log | 2.2667 | 204 | 0.7792 | 0.5988 | 0.7792 | 0.8827 |
| No log | 2.2889 | 206 | 0.7154 | 0.5534 | 0.7154 | 0.8458 |
| No log | 2.3111 | 208 | 0.7309 | 0.6044 | 0.7309 | 0.8550 |
| No log | 2.3333 | 210 | 0.7449 | 0.6044 | 0.7449 | 0.8631 |
| No log | 2.3556 | 212 | 0.7227 | 0.5140 | 0.7227 | 0.8501 |
| No log | 2.3778 | 214 | 0.7364 | 0.5911 | 0.7364 | 0.8582 |
| No log | 2.4 | 216 | 0.7531 | 0.6502 | 0.7531 | 0.8678 |
| No log | 2.4222 | 218 | 0.7599 | 0.6655 | 0.7599 | 0.8717 |
| No log | 2.4444 | 220 | 0.7347 | 0.6502 | 0.7347 | 0.8571 |
| No log | 2.4667 | 222 | 0.7116 | 0.5508 | 0.7116 | 0.8435 |
| No log | 2.4889 | 224 | 0.7082 | 0.5508 | 0.7082 | 0.8415 |
| No log | 2.5111 | 226 | 0.7132 | 0.6293 | 0.7132 | 0.8445 |
| No log | 2.5333 | 228 | 0.6740 | 0.6868 | 0.6740 | 0.8210 |
| No log | 2.5556 | 230 | 0.6374 | 0.6493 | 0.6374 | 0.7984 |
| No log | 2.5778 | 232 | 0.6255 | 0.7064 | 0.6255 | 0.7909 |
| No log | 2.6 | 234 | 0.5992 | 0.6813 | 0.5992 | 0.7741 |
| No log | 2.6222 | 236 | 0.5979 | 0.6536 | 0.5979 | 0.7732 |
| No log | 2.6444 | 238 | 0.5935 | 0.6709 | 0.5935 | 0.7704 |
| No log | 2.6667 | 240 | 0.5998 | 0.6709 | 0.5998 | 0.7744 |
| No log | 2.6889 | 242 | 0.6036 | 0.6553 | 0.6036 | 0.7769 |
| No log | 2.7111 | 244 | 0.6391 | 0.6846 | 0.6391 | 0.7994 |
| No log | 2.7333 | 246 | 0.6672 | 0.6368 | 0.6672 | 0.8168 |
| No log | 2.7556 | 248 | 0.7055 | 0.6070 | 0.7055 | 0.8399 |
| No log | 2.7778 | 250 | 0.7950 | 0.5621 | 0.7950 | 0.8917 |
| No log | 2.8 | 252 | 0.8469 | 0.5496 | 0.8469 | 0.9203 |
| No log | 2.8222 | 254 | 0.8259 | 0.5591 | 0.8259 | 0.9088 |
| No log | 2.8444 | 256 | 0.7961 | 0.6317 | 0.7961 | 0.8922 |
| No log | 2.8667 | 258 | 0.7341 | 0.6053 | 0.7341 | 0.8568 |
| No log | 2.8889 | 260 | 0.6814 | 0.6226 | 0.6814 | 0.8255 |
| No log | 2.9111 | 262 | 0.6586 | 0.6536 | 0.6586 | 0.8115 |
| No log | 2.9333 | 264 | 0.6423 | 0.6581 | 0.6423 | 0.8014 |
| No log | 2.9556 | 266 | 0.6378 | 0.6605 | 0.6378 | 0.7986 |
| No log | 2.9778 | 268 | 0.6261 | 0.6709 | 0.6261 | 0.7913 |
| No log | 3.0 | 270 | 0.6175 | 0.6545 | 0.6175 | 0.7858 |
| No log | 3.0222 | 272 | 0.6383 | 0.6626 | 0.6383 | 0.7989 |
| No log | 3.0444 | 274 | 0.6744 | 0.6018 | 0.6744 | 0.8212 |
| No log | 3.0667 | 276 | 0.6516 | 0.6128 | 0.6516 | 0.8072 |
| No log | 3.0889 | 278 | 0.6120 | 0.6345 | 0.6120 | 0.7823 |
| No log | 3.1111 | 280 | 0.6021 | 0.6470 | 0.6021 | 0.7760 |
| No log | 3.1333 | 282 | 0.6068 | 0.6572 | 0.6068 | 0.7790 |
| No log | 3.1556 | 284 | 0.6427 | 0.6709 | 0.6427 | 0.8017 |
| No log | 3.1778 | 286 | 0.6973 | 0.6535 | 0.6973 | 0.8351 |
| No log | 3.2 | 288 | 0.7003 | 0.6322 | 0.7003 | 0.8369 |
| No log | 3.2222 | 290 | 0.6896 | 0.5889 | 0.6896 | 0.8304 |
| No log | 3.2444 | 292 | 0.7025 | 0.5775 | 0.7025 | 0.8381 |
| No log | 3.2667 | 294 | 0.7002 | 0.6699 | 0.7002 | 0.8368 |
| No log | 3.2889 | 296 | 0.6841 | 0.6673 | 0.6841 | 0.8271 |
| No log | 3.3111 | 298 | 0.6798 | 0.6699 | 0.6798 | 0.8245 |
| No log | 3.3333 | 300 | 0.6938 | 0.5958 | 0.6938 | 0.8330 |
| No log | 3.3556 | 302 | 0.7510 | 0.6340 | 0.7510 | 0.8666 |
| No log | 3.3778 | 304 | 0.7070 | 0.5919 | 0.7070 | 0.8408 |
| No log | 3.4 | 306 | 0.6658 | 0.5969 | 0.6658 | 0.8160 |
| No log | 3.4222 | 308 | 0.6528 | 0.6464 | 0.6528 | 0.8079 |
| No log | 3.4444 | 310 | 0.6508 | 0.6298 | 0.6508 | 0.8068 |
| No log | 3.4667 | 312 | 0.6466 | 0.6549 | 0.6466 | 0.8041 |
| No log | 3.4889 | 314 | 0.6636 | 0.6623 | 0.6636 | 0.8146 |
| No log | 3.5111 | 316 | 0.6743 | 0.6599 | 0.6743 | 0.8212 |
| No log | 3.5333 | 318 | 0.6626 | 0.6503 | 0.6626 | 0.8140 |
| No log | 3.5556 | 320 | 0.6584 | 0.6503 | 0.6584 | 0.8114 |
| No log | 3.5778 | 322 | 0.6618 | 0.6637 | 0.6618 | 0.8135 |
| No log | 3.6 | 324 | 0.6459 | 0.6423 | 0.6459 | 0.8037 |
| No log | 3.6222 | 326 | 0.6114 | 0.6955 | 0.6114 | 0.7819 |
| No log | 3.6444 | 328 | 0.6075 | 0.6470 | 0.6075 | 0.7794 |
| No log | 3.6667 | 330 | 0.5978 | 0.7224 | 0.5978 | 0.7732 |
| No log | 3.6889 | 332 | 0.6220 | 0.6714 | 0.6220 | 0.7887 |
| No log | 3.7111 | 334 | 0.6524 | 0.6379 | 0.6524 | 0.8077 |
| No log | 3.7333 | 336 | 0.6828 | 0.6527 | 0.6828 | 0.8263 |
| No log | 3.7556 | 338 | 0.6846 | 0.6179 | 0.6846 | 0.8274 |
| No log | 3.7778 | 340 | 0.6597 | 0.6261 | 0.6597 | 0.8122 |
| No log | 3.8 | 342 | 0.6743 | 0.6781 | 0.6743 | 0.8211 |
| No log | 3.8222 | 344 | 0.7062 | 0.6148 | 0.7062 | 0.8404 |
| No log | 3.8444 | 346 | 0.7647 | 0.6061 | 0.7647 | 0.8745 |
| No log | 3.8667 | 348 | 0.7119 | 0.6328 | 0.7119 | 0.8438 |
| No log | 3.8889 | 350 | 0.6759 | 0.6789 | 0.6759 | 0.8222 |
| No log | 3.9111 | 352 | 0.7181 | 0.6129 | 0.7181 | 0.8474 |
| No log | 3.9333 | 354 | 0.7892 | 0.6294 | 0.7892 | 0.8883 |
| No log | 3.9556 | 356 | 0.7363 | 0.6197 | 0.7363 | 0.8581 |
| No log | 3.9778 | 358 | 0.6416 | 0.6347 | 0.6416 | 0.8010 |
| No log | 4.0 | 360 | 0.6252 | 0.6589 | 0.6252 | 0.7907 |
| No log | 4.0222 | 362 | 0.6336 | 0.6589 | 0.6336 | 0.7960 |
| No log | 4.0444 | 364 | 0.6398 | 0.6447 | 0.6398 | 0.7999 |
| No log | 4.0667 | 366 | 0.6549 | 0.6447 | 0.6549 | 0.8093 |
| No log | 4.0889 | 368 | 0.6615 | 0.6355 | 0.6615 | 0.8134 |
| No log | 4.1111 | 370 | 0.6598 | 0.6408 | 0.6598 | 0.8123 |
| No log | 4.1333 | 372 | 0.6493 | 0.6408 | 0.6493 | 0.8058 |
| No log | 4.1556 | 374 | 0.6406 | 0.6286 | 0.6406 | 0.8004 |
| No log | 4.1778 | 376 | 0.6409 | 0.4957 | 0.6409 | 0.8005 |
| No log | 4.2 | 378 | 0.6418 | 0.5819 | 0.6418 | 0.8011 |
| No log | 4.2222 | 380 | 0.6366 | 0.6167 | 0.6366 | 0.7979 |
| No log | 4.2444 | 382 | 0.6295 | 0.6399 | 0.6295 | 0.7934 |
| No log | 4.2667 | 384 | 0.6355 | 0.6484 | 0.6355 | 0.7972 |
| No log | 4.2889 | 386 | 0.6471 | 0.5772 | 0.6471 | 0.8044 |
| No log | 4.3111 | 388 | 0.6143 | 0.6060 | 0.6143 | 0.7838 |
| No log | 4.3333 | 390 | 0.6166 | 0.6129 | 0.6166 | 0.7852 |
| No log | 4.3556 | 392 | 0.6069 | 0.5726 | 0.6069 | 0.7791 |
| No log | 4.3778 | 394 | 0.6111 | 0.6650 | 0.6111 | 0.7817 |
| No log | 4.4 | 396 | 0.6445 | 0.6909 | 0.6445 | 0.8028 |
| No log | 4.4222 | 398 | 0.6656 | 0.6805 | 0.6656 | 0.8159 |
| No log | 4.4444 | 400 | 0.6511 | 0.6846 | 0.6511 | 0.8069 |
| No log | 4.4667 | 402 | 0.6306 | 0.6690 | 0.6306 | 0.7941 |
| No log | 4.4889 | 404 | 0.6290 | 0.6097 | 0.6290 | 0.7931 |
| No log | 4.5111 | 406 | 0.6142 | 0.6409 | 0.6142 | 0.7837 |
| No log | 4.5333 | 408 | 0.5995 | 0.6830 | 0.5995 | 0.7742 |
| No log | 4.5556 | 410 | 0.5949 | 0.6923 | 0.5949 | 0.7713 |
| No log | 4.5778 | 412 | 0.6073 | 0.7054 | 0.6073 | 0.7793 |
| No log | 4.6 | 414 | 0.6467 | 0.6472 | 0.6467 | 0.8042 |
| No log | 4.6222 | 416 | 0.6606 | 0.6446 | 0.6606 | 0.8128 |
| No log | 4.6444 | 418 | 0.6677 | 0.6476 | 0.6677 | 0.8172 |
| No log | 4.6667 | 420 | 0.7456 | 0.5841 | 0.7456 | 0.8635 |
| No log | 4.6889 | 422 | 0.7070 | 0.5344 | 0.7070 | 0.8409 |
| No log | 4.7111 | 424 | 0.6359 | 0.5809 | 0.6359 | 0.7974 |
| No log | 4.7333 | 426 | 0.6830 | 0.5763 | 0.6830 | 0.8264 |
| No log | 4.7556 | 428 | 0.8293 | 0.5266 | 0.8293 | 0.9107 |
| No log | 4.7778 | 430 | 0.8823 | 0.5604 | 0.8823 | 0.9393 |
| No log | 4.8 | 432 | 0.8024 | 0.6025 | 0.8024 | 0.8958 |
| No log | 4.8222 | 434 | 0.6827 | 0.6602 | 0.6827 | 0.8263 |
| No log | 4.8444 | 436 | 0.6588 | 0.5275 | 0.6588 | 0.8116 |
| No log | 4.8667 | 438 | 0.6567 | 0.5068 | 0.6567 | 0.8104 |
| No log | 4.8889 | 440 | 0.6353 | 0.5845 | 0.6353 | 0.7970 |
| No log | 4.9111 | 442 | 0.6492 | 0.6311 | 0.6492 | 0.8057 |
| No log | 4.9333 | 444 | 0.6618 | 0.6325 | 0.6618 | 0.8135 |
| No log | 4.9556 | 446 | 0.6406 | 0.5886 | 0.6406 | 0.8004 |
| No log | 4.9778 | 448 | 0.6651 | 0.5343 | 0.6651 | 0.8155 |
| No log | 5.0 | 450 | 0.6623 | 0.5228 | 0.6623 | 0.8138 |
| No log | 5.0222 | 452 | 0.6231 | 0.5856 | 0.6231 | 0.7894 |
| No log | 5.0444 | 454 | 0.6333 | 0.6120 | 0.6333 | 0.7958 |
| No log | 5.0667 | 456 | 0.6521 | 0.6733 | 0.6521 | 0.8075 |
| No log | 5.0889 | 458 | 0.6825 | 0.6326 | 0.6825 | 0.8261 |
| No log | 5.1111 | 460 | 0.7244 | 0.5987 | 0.7244 | 0.8511 |
| No log | 5.1333 | 462 | 0.7223 | 0.5730 | 0.7223 | 0.8499 |
| No log | 5.1556 | 464 | 0.6741 | 0.5707 | 0.6741 | 0.8210 |
| No log | 5.1778 | 466 | 0.6585 | 0.6051 | 0.6585 | 0.8115 |
| No log | 5.2 | 468 | 0.7177 | 0.6685 | 0.7177 | 0.8472 |
| No log | 5.2222 | 470 | 0.7885 | 0.5877 | 0.7885 | 0.8880 |
| No log | 5.2444 | 472 | 0.7754 | 0.6145 | 0.7754 | 0.8806 |
| No log | 5.2667 | 474 | 0.7041 | 0.5972 | 0.7041 | 0.8391 |
| No log | 5.2889 | 476 | 0.6521 | 0.5884 | 0.6521 | 0.8075 |
| No log | 5.3111 | 478 | 0.6536 | 0.5300 | 0.6536 | 0.8084 |
| No log | 5.3333 | 480 | 0.6679 | 0.5326 | 0.6679 | 0.8172 |
| No log | 5.3556 | 482 | 0.6477 | 0.6047 | 0.6477 | 0.8048 |
| No log | 5.3778 | 484 | 0.6176 | 0.6217 | 0.6176 | 0.7859 |
| No log | 5.4 | 486 | 0.6294 | 0.6732 | 0.6294 | 0.7933 |
| No log | 5.4222 | 488 | 0.6571 | 0.6684 | 0.6571 | 0.8106 |
| No log | 5.4444 | 490 | 0.6322 | 0.6449 | 0.6322 | 0.7951 |
| No log | 5.4667 | 492 | 0.6281 | 0.6486 | 0.6281 | 0.7925 |
| No log | 5.4889 | 494 | 0.6217 | 0.6266 | 0.6217 | 0.7885 |
| No log | 5.5111 | 496 | 0.6279 | 0.5759 | 0.6279 | 0.7924 |
| No log | 5.5333 | 498 | 0.6287 | 0.5759 | 0.6287 | 0.7929 |
| 0.2826 | 5.5556 | 500 | 0.6282 | 0.6498 | 0.6282 | 0.7926 |
| 0.2826 | 5.5778 | 502 | 0.6217 | 0.6575 | 0.6217 | 0.7884 |
| 0.2826 | 5.6 | 504 | 0.6158 | 0.6398 | 0.6158 | 0.7847 |
| 0.2826 | 5.6222 | 506 | 0.6317 | 0.5950 | 0.6317 | 0.7948 |
| 0.2826 | 5.6444 | 508 | 0.7153 | 0.5889 | 0.7153 | 0.8457 |
| 0.2826 | 5.6667 | 510 | 0.7640 | 0.5821 | 0.7640 | 0.8741 |
| 0.2826 | 5.6889 | 512 | 0.6955 | 0.5541 | 0.6955 | 0.8339 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree

MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k18_task5_organization is fine-tuned from aubmindlab/bert-base-arabertv02.