ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k18_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a hedged sketch of how these metrics can be computed follows the list):

  • Loss: 0.6397
  • Qwk: 0.7613
  • Mse: 0.6397
  • Rmse: 0.7998
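
The metrics above are standard for automated essay scoring. The snippet below is a minimal sketch of how Qwk (quadratic weighted kappa), Mse, and Rmse can be computed with scikit-learn; the score values and the integer label scale are placeholders, not data from this run.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical integer organization scores for a handful of eval essays.
y_true = np.array([3, 4, 2, 5, 4])
y_pred = np.array([3, 3, 2, 4, 4])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```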

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of an equivalent Trainer configuration follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
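
The sketch below reconstructs a transformers Trainer setup matching the hyperparameters listed above. It is an assumption-laden outline, not the original training script: the regression head (num_labels=1), the evaluation interval of 2 steps (inferred from the results table), and the dataset and metric placeholders are all guesses.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Single-output regression head is an assumption, consistent with the MSE/RMSE metrics.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=1)

training_args = TrainingArguments(
    output_dir="./results",          # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=2,                    # evaluation every 2 steps, as in the table below (assumed)
)

# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=train_dataset,   # placeholder: the training data is not published
#     eval_dataset=eval_dataset,     # placeholder
#     compute_metrics=compute_metrics,  # placeholder metric function (e.g. QWK/MSE/RMSE)
# )
# trainer.train()
```

The default Trainer optimizer (AdamW with betas=(0.9, 0.999) and epsilon=1e-08) matches the optimizer settings listed above.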

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0148 2 6.7271 0.0239 6.7271 2.5937
No log 0.0296 4 4.3700 0.0389 4.3700 2.0905
No log 0.0444 6 2.8876 0.0755 2.8876 1.6993
No log 0.0593 8 2.1178 0.1765 2.1178 1.4553
No log 0.0741 10 1.7177 0.1296 1.7177 1.3106
No log 0.0889 12 1.6225 0.2143 1.6225 1.2738
No log 0.1037 14 1.6543 0.1802 1.6543 1.2862
No log 0.1185 16 1.6449 0.3077 1.6449 1.2825
No log 0.1333 18 1.6746 0.1651 1.6746 1.2941
No log 0.1481 20 2.0586 0.2000 2.0586 1.4348
No log 0.1630 22 2.5926 0.0541 2.5926 1.6102
No log 0.1778 24 2.0437 0.2014 2.0437 1.4296
No log 0.1926 26 1.1723 0.5079 1.1723 1.0827
No log 0.2074 28 1.2314 0.4677 1.2314 1.1097
No log 0.2222 30 1.0487 0.5846 1.0487 1.0241
No log 0.2370 32 0.8613 0.6043 0.8613 0.9281
No log 0.2519 34 1.8842 0.3355 1.8842 1.3727
No log 0.2667 36 1.8586 0.3694 1.8586 1.3633
No log 0.2815 38 1.1714 0.5775 1.1714 1.0823
No log 0.2963 40 0.7191 0.7183 0.7191 0.8480
No log 0.3111 42 0.9573 0.625 0.9573 0.9784
No log 0.3259 44 0.8494 0.6939 0.8494 0.9216
No log 0.3407 46 0.6915 0.7285 0.6915 0.8315
No log 0.3556 48 1.5092 0.5595 1.5092 1.2285
No log 0.3704 50 2.1209 0.4278 2.1209 1.4563
No log 0.3852 52 1.6855 0.4824 1.6855 1.2983
No log 0.4 54 0.9614 0.6753 0.9614 0.9805
No log 0.4148 56 0.7874 0.7034 0.7874 0.8873
No log 0.4296 58 0.9232 0.6294 0.9232 0.9608
No log 0.4444 60 1.1422 0.5547 1.1422 1.0687
No log 0.4593 62 1.1119 0.5882 1.1119 1.0545
No log 0.4741 64 0.8764 0.7 0.8764 0.9361
No log 0.4889 66 0.8434 0.6713 0.8434 0.9183
No log 0.5037 68 0.8222 0.6849 0.8222 0.9068
No log 0.5185 70 0.7514 0.7034 0.7514 0.8668
No log 0.5333 72 0.7156 0.7273 0.7156 0.8460
No log 0.5481 74 0.7439 0.7083 0.7439 0.8625
No log 0.5630 76 0.7574 0.7133 0.7574 0.8703
No log 0.5778 78 0.8319 0.7020 0.8319 0.9121
No log 0.5926 80 0.9286 0.7051 0.9286 0.9636
No log 0.6074 82 0.7339 0.7805 0.7339 0.8567
No log 0.6222 84 0.6234 0.8045 0.6234 0.7896
No log 0.6370 86 0.6521 0.8223 0.6521 0.8075
No log 0.6519 88 0.7236 0.8043 0.7236 0.8506
No log 0.6667 90 0.7629 0.7425 0.7629 0.8735
No log 0.6815 92 0.6986 0.7564 0.6986 0.8358
No log 0.6963 94 0.6664 0.7517 0.6664 0.8163
No log 0.7111 96 0.7718 0.7042 0.7718 0.8785
No log 0.7259 98 0.8112 0.7042 0.8112 0.9006
No log 0.7407 100 0.7459 0.7234 0.7459 0.8636
No log 0.7556 102 0.7174 0.7042 0.7174 0.8470
No log 0.7704 104 0.7243 0.6933 0.7243 0.8511
No log 0.7852 106 0.7658 0.7152 0.7658 0.8751
No log 0.8 108 0.6699 0.7355 0.6699 0.8185
No log 0.8148 110 0.6290 0.7871 0.6290 0.7931
No log 0.8296 112 0.6897 0.7843 0.6897 0.8305
No log 0.8444 114 0.6845 0.7333 0.6845 0.8273
No log 0.8593 116 0.6865 0.7451 0.6865 0.8285
No log 0.8741 118 0.6919 0.7342 0.6919 0.8318
No log 0.8889 120 0.8047 0.7125 0.8047 0.8971
No log 0.9037 122 0.7048 0.7683 0.7048 0.8395
No log 0.9185 124 0.6062 0.7955 0.6062 0.7786
No log 0.9333 126 0.6162 0.8065 0.6162 0.7850
No log 0.9481 128 0.6352 0.8466 0.6352 0.7970
No log 0.9630 130 0.6890 0.7717 0.6890 0.8301
No log 0.9778 132 0.6442 0.7901 0.6442 0.8026
No log 0.9926 134 0.6979 0.7297 0.6979 0.8354
No log 1.0074 136 0.8125 0.7285 0.8125 0.9014
No log 1.0222 138 0.7418 0.72 0.7418 0.8613
No log 1.0370 140 0.7100 0.7075 0.7100 0.8426
No log 1.0519 142 0.9265 0.7101 0.9265 0.9625
No log 1.0667 144 1.0619 0.6667 1.0619 1.0305
No log 1.0815 146 0.9444 0.6818 0.9444 0.9718
No log 1.0963 148 0.6877 0.7826 0.6877 0.8293
No log 1.1111 150 0.7174 0.7673 0.7174 0.8470
No log 1.1259 152 0.8430 0.7333 0.8430 0.9181
No log 1.1407 154 0.7996 0.7248 0.7996 0.8942
No log 1.1556 156 0.6841 0.7582 0.6841 0.8271
No log 1.1704 158 0.6531 0.7613 0.6531 0.8081
No log 1.1852 160 0.6478 0.7925 0.6478 0.8048
No log 1.2 162 0.6765 0.7853 0.6765 0.8225
No log 1.2148 164 0.6927 0.7805 0.6927 0.8323
No log 1.2296 166 0.5844 0.8409 0.5844 0.7645
No log 1.2444 168 0.6290 0.7760 0.6290 0.7931
No log 1.2593 170 0.5714 0.8182 0.5714 0.7559
No log 1.2741 172 0.5117 0.8466 0.5117 0.7153
No log 1.2889 174 0.7114 0.7375 0.7114 0.8435
No log 1.3037 176 0.6649 0.7578 0.6649 0.8154
No log 1.3185 178 0.5821 0.8182 0.5821 0.7629
No log 1.3333 180 0.7983 0.7089 0.7983 0.8935
No log 1.3481 182 0.9079 0.6752 0.9079 0.9528
No log 1.3630 184 0.8708 0.6752 0.8708 0.9332
No log 1.3778 186 0.7324 0.6962 0.7324 0.8558
No log 1.3926 188 0.6438 0.7771 0.6438 0.8023
No log 1.4074 190 0.6281 0.7771 0.6281 0.7925
No log 1.4222 192 0.6450 0.7613 0.6450 0.8031
No log 1.4370 194 0.7393 0.6968 0.7393 0.8598
No log 1.4519 196 0.8633 0.6301 0.8633 0.9291
No log 1.4667 198 0.8297 0.6906 0.8297 0.9109
No log 1.4815 200 0.7589 0.7206 0.7589 0.8712
No log 1.4963 202 0.7274 0.7413 0.7274 0.8529
No log 1.5111 204 0.6620 0.7755 0.6620 0.8136
No log 1.5259 206 0.6106 0.8079 0.6106 0.7814
No log 1.5407 208 0.6768 0.7097 0.6768 0.8227
No log 1.5556 210 0.6674 0.7308 0.6674 0.8169
No log 1.5704 212 0.5848 0.8075 0.5848 0.7647
No log 1.5852 214 0.5816 0.7950 0.5816 0.7627
No log 1.6 216 0.6266 0.8 0.6266 0.7916
No log 1.6148 218 0.6860 0.7821 0.6860 0.8283
No log 1.6296 220 0.6666 0.7925 0.6666 0.8165
No log 1.6444 222 0.7043 0.7712 0.7043 0.8392
No log 1.6593 224 0.8172 0.7059 0.8172 0.9040
No log 1.6741 226 0.8394 0.7059 0.8394 0.9162
No log 1.6889 228 0.7918 0.7134 0.7918 0.8899
No log 1.7037 230 0.8368 0.7368 0.8368 0.9148
No log 1.7185 232 0.7968 0.7746 0.7968 0.8926
No log 1.7333 234 0.7093 0.7590 0.7093 0.8422
No log 1.7481 236 0.6561 0.7771 0.6561 0.8100
No log 1.7630 238 0.6533 0.7421 0.6533 0.8083
No log 1.7778 240 0.7167 0.7308 0.7167 0.8466
No log 1.7926 242 0.7556 0.7368 0.7556 0.8692
No log 1.8074 244 0.7243 0.7671 0.7243 0.8510
No log 1.8222 246 0.7160 0.7413 0.7160 0.8462
No log 1.8370 248 0.6262 0.7671 0.6262 0.7914
No log 1.8519 250 0.5566 0.8289 0.5566 0.7461
No log 1.8667 252 0.4925 0.825 0.4925 0.7018
No log 1.8815 254 0.4741 0.8521 0.4741 0.6886
No log 1.8963 256 0.4810 0.8344 0.4810 0.6936
No log 1.9111 258 0.5341 0.825 0.5341 0.7308
No log 1.9259 260 0.6274 0.8077 0.6274 0.7921
No log 1.9407 262 0.7459 0.7362 0.7459 0.8637
No log 1.9556 264 0.7008 0.7640 0.7008 0.8371
No log 1.9704 266 0.6029 0.7977 0.6029 0.7765
No log 1.9852 268 0.5661 0.7907 0.5661 0.7524
No log 2.0 270 0.6150 0.7607 0.6150 0.7842
No log 2.0148 272 0.5676 0.8125 0.5676 0.7534
No log 2.0296 274 0.5894 0.8129 0.5894 0.7677
No log 2.0444 276 0.6339 0.7763 0.6339 0.7962
No log 2.0593 278 0.6512 0.7919 0.6512 0.8070
No log 2.0741 280 0.7065 0.7260 0.7065 0.8405
No log 2.0889 282 0.6870 0.6887 0.6870 0.8288
No log 2.1037 284 0.6278 0.7485 0.6278 0.7924
No log 2.1185 286 0.5644 0.8471 0.5644 0.7513
No log 2.1333 288 0.5645 0.8284 0.5645 0.7513
No log 2.1481 290 0.5922 0.7927 0.5922 0.7695
No log 2.1630 292 0.6022 0.8208 0.6022 0.7760
No log 2.1778 294 0.7924 0.7459 0.7924 0.8902
No log 2.1926 296 1.1670 0.6413 1.1670 1.0803
No log 2.2074 298 1.2549 0.6111 1.2549 1.1202
No log 2.2222 300 1.0403 0.6456 1.0403 1.0200
No log 2.2370 302 0.8172 0.7 0.8172 0.9040
No log 2.2519 304 0.6761 0.7808 0.6761 0.8223
No log 2.2667 306 0.5961 0.8289 0.5961 0.7721
No log 2.2815 308 0.5588 0.8235 0.5588 0.7475
No log 2.2963 310 0.5766 0.7895 0.5766 0.7594
No log 2.3111 312 0.6232 0.7451 0.6232 0.7895
No log 2.3259 314 0.5618 0.7895 0.5618 0.7496
No log 2.3407 316 0.4722 0.8182 0.4722 0.6871
No log 2.3556 318 0.4370 0.8442 0.4370 0.6611
No log 2.3704 320 0.4742 0.8387 0.4742 0.6886
No log 2.3852 322 0.4743 0.8205 0.4743 0.6887
No log 2.4 324 0.4427 0.8535 0.4427 0.6653
No log 2.4148 326 0.4790 0.8125 0.4790 0.6921
No log 2.4296 328 0.5525 0.8050 0.5525 0.7433
No log 2.4444 330 0.5758 0.7975 0.5758 0.7588
No log 2.4593 332 0.5753 0.7895 0.5753 0.7585
No log 2.4741 334 0.5578 0.8133 0.5578 0.7469
No log 2.4889 336 0.5566 0.8133 0.5566 0.7461
No log 2.5037 338 0.5758 0.7975 0.5758 0.7588
No log 2.5185 340 0.6447 0.7590 0.6447 0.8029
No log 2.5333 342 0.6335 0.775 0.6335 0.7959
No log 2.5481 344 0.5931 0.7733 0.5931 0.7701
No log 2.5630 346 0.6375 0.75 0.6375 0.7984
No log 2.5778 348 0.6710 0.7448 0.6710 0.8192
No log 2.5926 350 0.7320 0.7133 0.7320 0.8556
No log 2.6074 352 0.7241 0.7133 0.7241 0.8510
No log 2.6222 354 0.6474 0.75 0.6474 0.8046
No log 2.6370 356 0.5774 0.8163 0.5774 0.7599
No log 2.6519 358 0.5735 0.8182 0.5735 0.7573
No log 2.6667 360 0.5780 0.7974 0.5780 0.7603
No log 2.6815 362 0.5369 0.8333 0.5369 0.7328
No log 2.6963 364 0.5480 0.8024 0.5480 0.7402
No log 2.7111 366 0.6246 0.8108 0.6246 0.7903
No log 2.7259 368 0.7953 0.7473 0.7953 0.8918
No log 2.7407 370 0.8965 0.6923 0.8965 0.9469
No log 2.7556 372 0.8257 0.6905 0.8257 0.9087
No log 2.7704 374 0.6877 0.7361 0.6877 0.8293
No log 2.7852 376 0.6135 0.7919 0.6135 0.7832
No log 2.8 378 0.6597 0.7724 0.6597 0.8122
No log 2.8148 380 0.6284 0.8027 0.6284 0.7927
No log 2.8296 382 0.5764 0.8 0.5764 0.7592
No log 2.8444 384 0.7225 0.7582 0.7225 0.8500
No log 2.8593 386 0.8982 0.7204 0.8982 0.9477
No log 2.8741 388 0.8896 0.7213 0.8896 0.9432
No log 2.8889 390 0.8029 0.6946 0.8029 0.8960
No log 2.9037 392 0.7336 0.7170 0.7336 0.8565
No log 2.9185 394 0.6803 0.7785 0.6803 0.8248
No log 2.9333 396 0.6507 0.8158 0.6507 0.8066
No log 2.9481 398 0.6477 0.8105 0.6477 0.8048
No log 2.9630 400 0.6384 0.8158 0.6384 0.7990
No log 2.9778 402 0.6309 0.8054 0.6309 0.7943
No log 2.9926 404 0.6892 0.7436 0.6892 0.8302
No log 3.0074 406 0.7073 0.7746 0.7073 0.8410
No log 3.0222 408 0.6234 0.8182 0.6234 0.7896
No log 3.0370 410 0.5166 0.8706 0.5166 0.7188
No log 3.0519 412 0.4806 0.8302 0.4806 0.6933
No log 3.0667 414 0.5060 0.8333 0.5060 0.7113
No log 3.0815 416 0.5515 0.8212 0.5515 0.7426
No log 3.0963 418 0.5906 0.8108 0.5906 0.7685
No log 3.1111 420 0.6545 0.7832 0.6545 0.8090
No log 3.1259 422 0.7358 0.7059 0.7358 0.8578
No log 3.1407 424 0.7440 0.7101 0.7440 0.8626
No log 3.1556 426 0.7074 0.7133 0.7074 0.8410
No log 3.1704 428 0.5932 0.8054 0.5932 0.7702
No log 3.1852 430 0.5112 0.8312 0.5112 0.7150
No log 3.2 432 0.4743 0.8366 0.4743 0.6887
No log 3.2148 434 0.4704 0.8312 0.4704 0.6858
No log 3.2296 436 0.4833 0.8312 0.4833 0.6952
No log 3.2444 438 0.5015 0.8182 0.5015 0.7082
No log 3.2593 440 0.5255 0.8235 0.5255 0.7249
No log 3.2741 442 0.5355 0.8354 0.5355 0.7317
No log 3.2889 444 0.5441 0.8302 0.5441 0.7376
No log 3.3037 446 0.5604 0.8 0.5604 0.7486
No log 3.3185 448 0.5530 0.8302 0.5530 0.7437
No log 3.3333 450 0.5665 0.8077 0.5665 0.7527
No log 3.3481 452 0.6539 0.7861 0.6539 0.8086
No log 3.3630 454 0.7594 0.7528 0.7594 0.8714
No log 3.3778 456 0.7986 0.7011 0.7986 0.8937
No log 3.3926 458 0.7500 0.7543 0.7500 0.8660
No log 3.4074 460 0.6367 0.7821 0.6367 0.7979
No log 3.4222 462 0.6164 0.7643 0.6164 0.7851
No log 3.4370 464 0.5460 0.8125 0.5460 0.7389
No log 3.4519 466 0.5014 0.8176 0.5014 0.7081
No log 3.4667 468 0.4693 0.8395 0.4693 0.6851
No log 3.4815 470 0.4673 0.8452 0.4673 0.6836
No log 3.4963 472 0.4907 0.8452 0.4907 0.7005
No log 3.5111 474 0.4742 0.8571 0.4742 0.6887
No log 3.5259 476 0.5073 0.8077 0.5073 0.7122
No log 3.5407 478 0.5788 0.7871 0.5788 0.7608
No log 3.5556 480 0.6774 0.7632 0.6774 0.8231
No log 3.5704 482 0.7130 0.7586 0.7130 0.8444
No log 3.5852 484 0.7270 0.7465 0.7270 0.8527
No log 3.6 486 0.6863 0.7619 0.6863 0.8284
No log 3.6148 488 0.6400 0.7561 0.6400 0.8000
No log 3.6296 490 0.6383 0.7765 0.6383 0.7989
No log 3.6444 492 0.5824 0.7805 0.5824 0.7632
No log 3.6593 494 0.5378 0.7792 0.5378 0.7333
No log 3.6741 496 0.5499 0.8077 0.5499 0.7415
No log 3.6889 498 0.5429 0.8077 0.5429 0.7368
0.4397 3.7037 500 0.5257 0.7871 0.5257 0.7251
0.4397 3.7185 502 0.5494 0.7922 0.5494 0.7412
0.4397 3.7333 504 0.5774 0.7843 0.5774 0.7599
0.4397 3.7481 506 0.6895 0.75 0.6895 0.8304
0.4397 3.7630 508 0.7472 0.7172 0.7472 0.8644
0.4397 3.7778 510 0.7127 0.7123 0.7127 0.8442
0.4397 3.7926 512 0.6397 0.7613 0.6397 0.7998

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
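
With the framework versions listed above installed, the checkpoint can be loaded as in the following minimal sketch. It assumes the head is a single-output regression scorer (consistent with the MSE/RMSE metrics above); if the head is multi-class instead, take the argmax of the logits.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k18_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Placeholder Arabic essay text ("the essay text goes here").
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
score = logits.squeeze().item()  # for a multi-class head, use logits.argmax(-1) instead
print(score)
```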