ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k16_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a hedged sketch of how these metrics can be reproduced follows the list):

  • Loss: 0.6063
  • Qwk: 0.6348
  • Mse: 0.6063
  • Rmse: 0.7787
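
The card does not include the evaluation code, but the reported Qwk, Mse and Rmse can in principle be reproduced with scikit-learn as sketched below. The variable names (y_true, y_pred), the example values, and the rounding of predictions before computing QWK are assumptions, not part of this card:

```python
# Sketch only: reproducing Qwk / Mse / Rmse from predictions.
# y_true / y_pred values are illustrative; the real evaluation code is not published here.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])            # gold organization scores (example)
y_pred = np.array([2.8, 2.1, 3.6, 1.4, 3.2])  # model outputs (example)

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
# QWK needs discrete labels, so continuous predictions are rounded first (assumption).
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```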

Model description

More information needed

Intended uses & limitations

More information needed
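
Although the card gives no usage details, the checkpoint can be loaded like any other transformers classification model. The sketch below assumes a sequence-classification head and leaves the interpretation of the output open, since the label/score scheme is not documented:

```python
# Sketch only: loading this checkpoint for inference.
# The head type and the meaning of the logits are assumptions, not documented in the card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k16_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # interpret according to the (undocumented) scoring scheme
```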

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged Trainer sketch reproducing them follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
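
A minimal sketch of a Trainer configuration mirroring these hyperparameters is given below. The dataset objects, output directory and num_labels=1 are placeholders or assumptions; the actual training script is not included in this card:

```python
# Sketch only: a Trainer setup mirroring the hyperparameters listed above.
# train_ds / eval_ds are placeholders and num_labels=1 is an assumption (regression-style scoring).
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)

args = TrainingArguments(
    output_dir="arabert-task5-organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

train_ds = None  # placeholder: tokenized training split (not provided in this card)
eval_ds = None   # placeholder: tokenized evaluation split (not provided in this card)

trainer = Trainer(model=model, args=args,
                  train_dataset=train_ds, eval_dataset=eval_ds,
                  tokenizer=tokenizer)
# trainer.train()  # uncomment once real datasets are supplied
```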

Training results

In the table below, validation metrics were computed every two training steps; "No log" in the Training Loss column means the running training loss had not yet been logged at that point (it is first reported at step 500).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.025 2 4.1556 0.0035 4.1556 2.0385
No log 0.05 4 2.2190 0.0203 2.2190 1.4896
No log 0.075 6 1.3883 -0.0180 1.3883 1.1783
No log 0.1 8 1.0683 0.2711 1.0683 1.0336
No log 0.125 10 1.0352 0.4205 1.0352 1.0175
No log 0.15 12 1.0448 0.2865 1.0448 1.0221
No log 0.175 14 1.0528 0.2161 1.0528 1.0260
No log 0.2 16 1.0382 0.1263 1.0382 1.0189
No log 0.225 18 1.0084 0.2365 1.0084 1.0042
No log 0.25 20 1.0114 0.2416 1.0114 1.0057
No log 0.275 22 1.0707 0.2640 1.0707 1.0347
No log 0.3 24 1.1175 0.3090 1.1175 1.0571
No log 0.325 26 1.0302 0.3069 1.0302 1.0150
No log 0.35 28 0.8929 0.3891 0.8929 0.9449
No log 0.375 30 0.9809 0.3860 0.9809 0.9904
No log 0.4 32 1.0953 0.2790 1.0953 1.0465
No log 0.425 34 0.9048 0.3972 0.9048 0.9512
No log 0.45 36 0.8112 0.4557 0.8112 0.9007
No log 0.475 38 0.9695 0.2864 0.9695 0.9846
No log 0.5 40 1.2148 0.3340 1.2148 1.1022
No log 0.525 42 1.1005 0.3296 1.1005 1.0491
No log 0.55 44 0.7862 0.4754 0.7862 0.8867
No log 0.575 46 0.8583 0.4665 0.8583 0.9264
No log 0.6 48 1.2013 0.3388 1.2013 1.0960
No log 0.625 50 1.3987 0.2682 1.3987 1.1826
No log 0.65 52 1.6462 0.2136 1.6462 1.2831
No log 0.675 54 1.5795 0.2923 1.5795 1.2568
No log 0.7 56 1.3293 0.3998 1.3293 1.1529
No log 0.725 58 1.2785 0.2510 1.2785 1.1307
No log 0.75 60 1.0521 0.4307 1.0521 1.0257
No log 0.775 62 0.8837 0.5363 0.8837 0.9400
No log 0.8 64 0.8636 0.5215 0.8636 0.9293
No log 0.825 66 0.8960 0.4983 0.8960 0.9466
No log 0.85 68 1.0048 0.4619 1.0048 1.0024
No log 0.875 70 1.0707 0.4063 1.0707 1.0347
No log 0.9 72 1.0519 0.4389 1.0519 1.0256
No log 0.925 74 1.0855 0.3998 1.0855 1.0419
No log 0.95 76 0.9323 0.4694 0.9323 0.9656
No log 0.975 78 0.8083 0.5032 0.8083 0.8991
No log 1.0 80 0.7585 0.4752 0.7585 0.8709
No log 1.025 82 0.7828 0.4847 0.7828 0.8847
No log 1.05 84 0.9438 0.4474 0.9438 0.9715
No log 1.075 86 1.1100 0.2632 1.1100 1.0536
No log 1.1 88 1.1058 0.3045 1.1058 1.0516
No log 1.125 90 1.0062 0.4130 1.0062 1.0031
No log 1.15 92 0.7599 0.5571 0.7599 0.8717
No log 1.175 94 0.6930 0.5833 0.6930 0.8324
No log 1.2 96 0.6731 0.6342 0.6731 0.8204
No log 1.225 98 0.7205 0.6113 0.7205 0.8488
No log 1.25 100 0.8902 0.5458 0.8902 0.9435
No log 1.275 102 0.9048 0.5222 0.9048 0.9512
No log 1.3 104 0.8044 0.5353 0.8044 0.8969
No log 1.325 106 0.7411 0.5554 0.7411 0.8609
No log 1.35 108 0.7744 0.5342 0.7744 0.8800
No log 1.375 110 0.9023 0.5110 0.9023 0.9499
No log 1.4 112 1.1121 0.4281 1.1121 1.0546
No log 1.425 114 1.0828 0.4295 1.0828 1.0406
No log 1.45 116 0.9450 0.4681 0.9450 0.9721
No log 1.475 118 0.7647 0.5046 0.7647 0.8745
No log 1.5 120 0.7354 0.5413 0.7354 0.8576
No log 1.525 122 0.8142 0.5458 0.8142 0.9023
No log 1.55 124 0.8284 0.5140 0.8284 0.9102
No log 1.575 126 0.7425 0.5728 0.7425 0.8617
No log 1.6 128 0.6671 0.6438 0.6671 0.8167
No log 1.625 130 0.6290 0.6438 0.6290 0.7931
No log 1.65 132 0.6268 0.6806 0.6268 0.7917
No log 1.675 134 0.6232 0.6965 0.6232 0.7894
No log 1.7 136 0.6015 0.6939 0.6015 0.7756
No log 1.725 138 0.5964 0.6921 0.5964 0.7723
No log 1.75 140 0.5883 0.7314 0.5883 0.7670
No log 1.775 142 0.5833 0.6911 0.5833 0.7638
No log 1.8 144 0.6086 0.6675 0.6086 0.7801
No log 1.825 146 0.5885 0.6796 0.5885 0.7672
No log 1.85 148 0.6142 0.6807 0.6142 0.7837
No log 1.875 150 0.7116 0.5266 0.7116 0.8436
No log 1.9 152 0.7433 0.5254 0.7433 0.8621
No log 1.925 154 0.8058 0.5254 0.8058 0.8977
No log 1.95 156 0.7743 0.5370 0.7743 0.8799
No log 1.975 158 0.7236 0.5912 0.7236 0.8506
No log 2.0 160 0.7078 0.5799 0.7078 0.8413
No log 2.025 162 0.6440 0.6035 0.6440 0.8025
No log 2.05 164 0.6431 0.6177 0.6431 0.8020
No log 2.075 166 0.6840 0.5964 0.6840 0.8270
No log 2.1 168 0.7873 0.5220 0.7873 0.8873
No log 2.125 170 0.9002 0.5179 0.9002 0.9488
No log 2.15 172 0.8117 0.5318 0.8117 0.9009
No log 2.175 174 0.7129 0.5651 0.7129 0.8444
No log 2.2 176 0.6786 0.5734 0.6786 0.8237
No log 2.225 178 0.6649 0.5862 0.6649 0.8154
No log 2.25 180 0.6733 0.6276 0.6733 0.8205
No log 2.275 182 0.6856 0.5855 0.6856 0.8280
No log 2.3 184 0.6852 0.6228 0.6852 0.8278
No log 2.325 186 0.6875 0.6228 0.6875 0.8292
No log 2.35 188 0.6797 0.6412 0.6797 0.8244
No log 2.375 190 0.7055 0.6502 0.7055 0.8399
No log 2.4 192 0.6928 0.6352 0.6928 0.8323
No log 2.425 194 0.7501 0.5963 0.7501 0.8661
No log 2.45 196 0.7779 0.5877 0.7779 0.8820
No log 2.475 198 0.7278 0.5697 0.7278 0.8531
No log 2.5 200 0.6380 0.6464 0.6380 0.7988
No log 2.525 202 0.6311 0.6983 0.6311 0.7944
No log 2.55 204 0.6317 0.7054 0.6317 0.7948
No log 2.575 206 0.6728 0.6620 0.6728 0.8202
No log 2.6 208 0.8481 0.5076 0.8481 0.9209
No log 2.625 210 0.8731 0.5 0.8731 0.9344
No log 2.65 212 0.8129 0.4946 0.8129 0.9016
No log 2.675 214 0.7742 0.5183 0.7742 0.8799
No log 2.7 216 0.7356 0.5397 0.7356 0.8577
No log 2.725 218 0.7008 0.6537 0.7008 0.8371
No log 2.75 220 0.6658 0.6628 0.6658 0.8160
No log 2.775 222 0.6799 0.6357 0.6799 0.8245
No log 2.8 224 0.7482 0.5958 0.7482 0.8650
No log 2.825 226 0.7589 0.5451 0.7589 0.8712
No log 2.85 228 0.7369 0.5482 0.7369 0.8585
No log 2.875 230 0.7064 0.4660 0.7064 0.8405
No log 2.9 232 0.7280 0.4547 0.7280 0.8532
No log 2.925 234 0.7637 0.5572 0.7637 0.8739
No log 2.95 236 0.8807 0.5 0.8807 0.9385
No log 2.975 238 0.8260 0.5 0.8260 0.9088
No log 3.0 240 0.7612 0.5602 0.7612 0.8725
No log 3.025 242 0.7298 0.5602 0.7298 0.8543
No log 3.05 244 0.7097 0.6137 0.7097 0.8424
No log 3.075 246 0.7519 0.5658 0.7519 0.8671
No log 3.1 248 0.8287 0.5486 0.8287 0.9103
No log 3.125 250 0.8533 0.5385 0.8533 0.9238
No log 3.15 252 0.8051 0.5425 0.8051 0.8973
No log 3.175 254 0.7461 0.5397 0.7461 0.8638
No log 3.2 256 0.7864 0.5675 0.7864 0.8868
No log 3.225 258 0.7941 0.5675 0.7941 0.8911
No log 3.25 260 0.8501 0.4894 0.8501 0.9220
No log 3.275 262 0.9373 0.4681 0.9373 0.9682
No log 3.3 264 0.9272 0.4503 0.9272 0.9629
No log 3.325 266 0.7445 0.6045 0.7445 0.8629
No log 3.35 268 0.6724 0.6209 0.6724 0.8200
No log 3.375 270 0.6681 0.5905 0.6681 0.8174
No log 3.4 272 0.6926 0.5975 0.6926 0.8322
No log 3.425 274 0.7322 0.5864 0.7322 0.8557
No log 3.45 276 0.7336 0.6045 0.7336 0.8565
No log 3.475 278 0.7013 0.5777 0.7013 0.8375
No log 3.5 280 0.6560 0.5657 0.6560 0.8099
No log 3.525 282 0.6469 0.5631 0.6469 0.8043
No log 3.55 284 0.6622 0.5422 0.6622 0.8137
No log 3.575 286 0.8135 0.5294 0.8135 0.9019
No log 3.6 288 0.9643 0.5339 0.9643 0.9820
No log 3.625 290 0.9790 0.5075 0.9790 0.9895
No log 3.65 292 0.7961 0.5210 0.7961 0.8923
No log 3.675 294 0.6942 0.5697 0.6942 0.8332
No log 3.7 296 0.6822 0.5516 0.6822 0.8259
No log 3.725 298 0.7211 0.5888 0.7211 0.8492
No log 3.75 300 0.8477 0.4994 0.8477 0.9207
No log 3.775 302 0.8452 0.5013 0.8452 0.9194
No log 3.8 304 0.8141 0.5041 0.8141 0.9023
No log 3.825 306 0.7848 0.5777 0.7848 0.8859
No log 3.85 308 0.7577 0.5777 0.7577 0.8704
No log 3.875 310 0.7528 0.5938 0.7528 0.8677
No log 3.9 312 0.6759 0.5588 0.6759 0.8221
No log 3.925 314 0.6354 0.6239 0.6354 0.7971
No log 3.95 316 0.6383 0.6046 0.6383 0.7989
No log 3.975 318 0.6506 0.5943 0.6506 0.8066
No log 4.0 320 0.6880 0.5331 0.6880 0.8295
No log 4.025 322 0.7763 0.4926 0.7763 0.8811
No log 4.05 324 1.0260 0.5168 1.0260 1.0129
No log 4.075 326 1.1306 0.4094 1.1306 1.0633
No log 4.1 328 0.9477 0.5 0.9477 0.9735
No log 4.125 330 0.7750 0.5718 0.7750 0.8803
No log 4.15 332 0.7066 0.5917 0.7066 0.8406
No log 4.175 334 0.7035 0.5517 0.7035 0.8388
No log 4.2 336 0.7150 0.5798 0.7150 0.8456
No log 4.225 338 0.7051 0.5798 0.7051 0.8397
No log 4.25 340 0.7183 0.5477 0.7183 0.8475
No log 4.275 342 0.6710 0.6014 0.6710 0.8191
No log 4.3 344 0.6596 0.5891 0.6596 0.8122
No log 4.325 346 0.6928 0.5763 0.6928 0.8323
No log 4.35 348 0.7987 0.5344 0.7987 0.8937
No log 4.375 350 0.8619 0.5625 0.8619 0.9284
No log 4.4 352 0.8029 0.5344 0.8029 0.8960
No log 4.425 354 0.6756 0.5975 0.6756 0.8220
No log 4.45 356 0.6340 0.6364 0.6340 0.7962
No log 4.475 358 0.6326 0.6401 0.6326 0.7953
No log 4.5 360 0.6346 0.6820 0.6346 0.7966
No log 4.525 362 0.6490 0.6444 0.6490 0.8056
No log 4.55 364 0.6394 0.6969 0.6394 0.7996
No log 4.575 366 0.6196 0.6976 0.6196 0.7872
No log 4.6 368 0.6293 0.6426 0.6293 0.7933
No log 4.625 370 0.6531 0.5066 0.6531 0.8082
No log 4.65 372 0.6668 0.4428 0.6668 0.8166
No log 4.675 374 0.6906 0.5113 0.6906 0.8310
No log 4.7 376 0.7409 0.5356 0.7409 0.8608
No log 4.725 378 0.7983 0.5331 0.7983 0.8935
No log 4.75 380 0.8252 0.5106 0.8252 0.9084
No log 4.775 382 0.7865 0.5266 0.7865 0.8868
No log 4.8 384 0.7565 0.5157 0.7565 0.8698
No log 4.825 386 0.6979 0.5798 0.6979 0.8354
No log 4.85 388 0.6647 0.5798 0.6647 0.8153
No log 4.875 390 0.6388 0.7101 0.6388 0.7993
No log 4.9 392 0.6212 0.7012 0.6212 0.7881
No log 4.925 394 0.6258 0.6630 0.6258 0.7911
No log 4.95 396 0.6348 0.6543 0.6348 0.7967
No log 4.975 398 0.6227 0.6995 0.6227 0.7891
No log 5.0 400 0.7156 0.5666 0.7156 0.8459
No log 5.025 402 0.8185 0.5331 0.8185 0.9047
No log 5.05 404 0.8719 0.4898 0.8719 0.9337
No log 5.075 406 0.9023 0.4898 0.9023 0.9499
No log 5.1 408 0.9027 0.4573 0.9027 0.9501
No log 5.125 410 0.8930 0.4573 0.8930 0.9450
No log 5.15 412 0.8961 0.4796 0.8961 0.9466
No log 5.175 414 0.8889 0.4796 0.8889 0.9428
No log 5.2 416 0.9171 0.4781 0.9171 0.9577
No log 5.225 418 0.9707 0.4668 0.9707 0.9853
No log 5.25 420 0.9091 0.5295 0.9091 0.9535
No log 5.275 422 0.8632 0.4681 0.8632 0.9291
No log 5.3 424 0.7910 0.5253 0.7910 0.8894
No log 5.325 426 0.7815 0.5157 0.7815 0.8840
No log 5.35 428 0.8061 0.5253 0.8061 0.8978
No log 5.375 430 0.8236 0.5253 0.8236 0.9075
No log 5.4 432 0.7738 0.5372 0.7738 0.8797
No log 5.425 434 0.7095 0.5509 0.7095 0.8423
No log 5.45 436 0.7486 0.5891 0.7486 0.8652
No log 5.475 438 0.7648 0.5526 0.7648 0.8746
No log 5.5 440 0.6686 0.6015 0.6686 0.8177
No log 5.525 442 0.6139 0.6301 0.6139 0.7835
No log 5.55 444 0.6068 0.6301 0.6068 0.7789
No log 5.575 446 0.6226 0.6228 0.6226 0.7890
No log 5.6 448 0.6672 0.6310 0.6672 0.8168
No log 5.625 450 0.6875 0.5385 0.6875 0.8292
No log 5.65 452 0.6685 0.5534 0.6685 0.8176
No log 5.675 454 0.6894 0.4966 0.6894 0.8303
No log 5.7 456 0.7292 0.5062 0.7292 0.8540
No log 5.725 458 0.7928 0.5332 0.7928 0.8904
No log 5.75 460 0.7484 0.5346 0.7484 0.8651
No log 5.775 462 0.6655 0.5651 0.6655 0.8158
No log 5.8 464 0.6204 0.5361 0.6204 0.7876
No log 5.825 466 0.6365 0.5103 0.6365 0.7978
No log 5.85 468 0.7272 0.5319 0.7272 0.8528
No log 5.875 470 0.8898 0.5391 0.8898 0.9433
No log 5.9 472 0.9222 0.5484 0.9222 0.9603
No log 5.925 474 0.8640 0.5484 0.8640 0.9295
No log 5.95 476 0.7342 0.5307 0.7342 0.8568
No log 5.975 478 0.6311 0.6188 0.6311 0.7944
No log 6.0 480 0.6146 0.6003 0.6146 0.7840
No log 6.025 482 0.6270 0.5898 0.6270 0.7918
No log 6.05 484 0.7006 0.5479 0.7006 0.8370
No log 6.075 486 0.7587 0.5360 0.7587 0.8710
No log 6.1 488 0.7755 0.5644 0.7755 0.8806
No log 6.125 490 0.7424 0.5756 0.7424 0.8616
No log 6.15 492 0.7121 0.5521 0.7121 0.8439
No log 6.175 494 0.6245 0.6099 0.6245 0.7903
No log 6.2 496 0.5524 0.6805 0.5524 0.7433
No log 6.225 498 0.5385 0.7090 0.5385 0.7338
0.2913 6.25 500 0.5578 0.7131 0.5578 0.7469
0.2913 6.275 502 0.5632 0.6881 0.5632 0.7505
0.2913 6.3 504 0.5591 0.7012 0.5591 0.7477
0.2913 6.325 506 0.5702 0.6672 0.5702 0.7551
0.2913 6.35 508 0.5610 0.7469 0.5610 0.7490
0.2913 6.375 510 0.5590 0.7136 0.5590 0.7477
0.2913 6.4 512 0.5596 0.7136 0.5596 0.7481
0.2913 6.425 514 0.5738 0.6944 0.5738 0.7575
0.2913 6.45 516 0.6086 0.6207 0.6086 0.7801
0.2913 6.475 518 0.6176 0.6073 0.6176 0.7859
0.2913 6.5 520 0.6294 0.5777 0.6294 0.7933
0.2913 6.525 522 0.6063 0.6348 0.6063 0.7787

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model weights

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree

  • MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k16_task5_organization (fine-tuned from aubmindlab/bert-base-arabertv02)