ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (listed as "None" in the auto-generated card). It achieves the following results on the evaluation set:

  • Loss: 0.6876
  • Qwk (quadratic weighted kappa): 0.3460
  • Mse: 0.6876
  • Rmse: 0.8292
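The metric set suggests ordinal essay scoring: the loss is the mean squared error (hence Loss and Mse coincide), Rmse is its square root, and Qwk measures agreement weighted by the squared distance between scores. As a sketch with toy labels (not the actual evaluation data), the three metrics can be computed with scikit-learn:

```python
import math

from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Toy ordinal labels standing in for gold and predicted organization scores.
y_true = [0, 1, 2, 2, 3, 1]
y_pred = [0, 1, 1, 2, 3, 0]

# Quadratic weighted kappa: chance-corrected agreement, penalizing
# disagreements by the squared distance between the two scores.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = math.sqrt(mse)  # the card's Rmse is just the square root of Mse
```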

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
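With lr_scheduler_type set to linear and no warmup steps reported, the learning rate decays linearly from 2e-05 to 0 over training. A minimal pure-Python sketch of that schedule (assuming zero warmup, which is the card's apparent configuration):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    # Linear decay from base_lr at step 0 down to 0 at total_steps,
    # matching lr_scheduler_type "linear" with no warmup.
    remaining = max(0.0, (total_steps - step) / total_steps)
    return base_lr * remaining

# The training log runs to step 510, so for example:
lr_start = linear_lr(0, 510)    # 2e-05
lr_mid = linear_lr(255, 510)    # 1e-05
lr_end = linear_lr(510, 510)    # 0.0
```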

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0299 2 2.5606 -0.0924 2.5606 1.6002
No log 0.0597 4 1.3181 -0.0471 1.3181 1.1481
No log 0.0896 6 1.3574 -0.1248 1.3574 1.1651
No log 0.1194 8 1.3474 -0.1628 1.3474 1.1608
No log 0.1493 10 1.3132 -0.0703 1.3132 1.1460
No log 0.1791 12 1.2851 -0.0730 1.2851 1.1336
No log 0.2090 14 1.1568 0.0283 1.1568 1.0755
No log 0.2388 16 1.1203 0.0493 1.1203 1.0584
No log 0.2687 18 1.1645 0.0263 1.1645 1.0791
No log 0.2985 20 1.1028 0.0741 1.1028 1.0501
No log 0.3284 22 1.0014 0.0539 1.0014 1.0007
No log 0.3582 24 0.9218 0.1459 0.9218 0.9601
No log 0.3881 26 0.9198 0.1332 0.9198 0.9590
No log 0.4179 28 0.9435 0.1254 0.9435 0.9713
No log 0.4478 30 0.9541 0.1217 0.9541 0.9768
No log 0.4776 32 0.9108 0.1293 0.9108 0.9543
No log 0.5075 34 0.9100 0.1293 0.9100 0.9540
No log 0.5373 36 0.9504 0.1534 0.9504 0.9749
No log 0.5672 38 0.9447 0.1566 0.9447 0.9720
No log 0.5970 40 0.9168 0.1773 0.9168 0.9575
No log 0.6269 42 0.9277 0.1404 0.9277 0.9632
No log 0.6567 44 0.9716 0.1244 0.9716 0.9857
No log 0.6866 46 1.0564 -0.0151 1.0564 1.0278
No log 0.7164 48 1.1242 0.0664 1.1242 1.0603
No log 0.7463 50 1.1251 0.0085 1.1251 1.0607
No log 0.7761 52 0.9995 0.0670 0.9995 0.9998
No log 0.8060 54 0.8971 0.0334 0.8971 0.9471
No log 0.8358 56 0.8695 -0.0143 0.8695 0.9325
No log 0.8657 58 0.9343 0.2359 0.9343 0.9666
No log 0.8955 60 0.9655 0.2691 0.9655 0.9826
No log 0.9254 62 0.9796 0.2328 0.9796 0.9898
No log 0.9552 64 1.0433 0.2076 1.0433 1.0214
No log 0.9851 66 0.9662 0.1328 0.9662 0.9830
No log 1.0149 68 0.8265 -0.0533 0.8265 0.9091
No log 1.0448 70 0.8367 0.0 0.8367 0.9147
No log 1.0746 72 0.8294 0.0 0.8294 0.9107
No log 1.1045 74 0.8020 -0.0079 0.8020 0.8955
No log 1.1343 76 0.8542 0.1183 0.8542 0.9242
No log 1.1642 78 0.9745 0.1575 0.9745 0.9872
No log 1.1940 80 0.9230 0.2045 0.9230 0.9608
No log 1.2239 82 0.8377 0.2285 0.8377 0.9152
No log 1.2537 84 0.7840 0.1699 0.7840 0.8855
No log 1.2836 86 0.7939 0.1598 0.7939 0.8910
No log 1.3134 88 0.8815 0.2063 0.8815 0.9389
No log 1.3433 90 1.0487 0.2589 1.0487 1.0241
No log 1.3731 92 1.1793 0.1057 1.1793 1.0859
No log 1.4030 94 1.0841 0.1463 1.0841 1.0412
No log 1.4328 96 0.9715 0.1914 0.9715 0.9856
No log 1.4627 98 0.9374 0.2171 0.9374 0.9682
No log 1.4925 100 0.9254 0.2754 0.9254 0.9620
No log 1.5224 102 0.8886 0.2149 0.8886 0.9426
No log 1.5522 104 0.8513 0.2172 0.8513 0.9227
No log 1.5821 106 0.8574 0.2203 0.8574 0.9260
No log 1.6119 108 0.8815 0.2661 0.8815 0.9389
No log 1.6418 110 0.8850 0.3068 0.8850 0.9407
No log 1.6716 112 0.8729 0.3068 0.8729 0.9343
No log 1.7015 114 0.8747 0.3292 0.8747 0.9353
No log 1.7313 116 0.9068 0.3503 0.9068 0.9523
No log 1.7612 118 0.9300 0.3417 0.9300 0.9644
No log 1.7910 120 0.9618 0.3043 0.9618 0.9807
No log 1.8209 122 0.9448 0.2564 0.9448 0.9720
No log 1.8507 124 0.9317 0.1946 0.9317 0.9653
No log 1.8806 126 0.9428 0.3069 0.9428 0.9710
No log 1.9104 128 0.9761 0.3100 0.9761 0.9880
No log 1.9403 130 0.9180 0.3219 0.9180 0.9581
No log 1.9701 132 0.8904 0.3095 0.8904 0.9436
No log 2.0 134 0.8527 0.2722 0.8527 0.9234
No log 2.0299 136 0.8487 0.2440 0.8487 0.9213
No log 2.0597 138 0.8545 0.2576 0.8545 0.9244
No log 2.0896 140 0.8387 0.2926 0.8387 0.9158
No log 2.1194 142 0.8317 0.2953 0.8317 0.9120
No log 2.1493 144 0.8327 0.3235 0.8327 0.9125
No log 2.1791 146 0.8303 0.3235 0.8303 0.9112
No log 2.2090 148 0.8221 0.3958 0.8221 0.9067
No log 2.2388 150 0.8038 0.3842 0.8038 0.8966
No log 2.2687 152 0.8128 0.4587 0.8128 0.9016
No log 2.2985 154 0.8081 0.4605 0.8081 0.8989
No log 2.3284 156 0.8287 0.4812 0.8287 0.9103
No log 2.3582 158 0.8034 0.4681 0.8034 0.8963
No log 2.3881 160 0.7734 0.4681 0.7734 0.8794
No log 2.4179 162 0.7438 0.4341 0.7438 0.8624
No log 2.4478 164 0.7318 0.3175 0.7318 0.8554
No log 2.4776 166 0.7231 0.3335 0.7231 0.8504
No log 2.5075 168 0.7148 0.3151 0.7148 0.8455
No log 2.5373 170 0.7367 0.4017 0.7367 0.8583
No log 2.5672 172 0.8217 0.3844 0.8217 0.9065
No log 2.5970 174 0.9014 0.4133 0.9014 0.9494
No log 2.6269 176 0.8810 0.4085 0.8810 0.9386
No log 2.6567 178 0.9748 0.3439 0.9748 0.9873
No log 2.6866 180 1.0105 0.3290 1.0105 1.0052
No log 2.7164 182 0.9656 0.3333 0.9656 0.9827
No log 2.7463 184 0.8749 0.3891 0.8749 0.9354
No log 2.7761 186 0.8472 0.3653 0.8472 0.9205
No log 2.8060 188 0.8143 0.3723 0.8143 0.9024
No log 2.8358 190 0.7430 0.3425 0.7430 0.8620
No log 2.8657 192 0.7253 0.3078 0.7253 0.8517
No log 2.8955 194 0.7569 0.4076 0.7569 0.8700
No log 2.9254 196 0.8860 0.4203 0.8860 0.9413
No log 2.9552 198 0.9313 0.3499 0.9313 0.9650
No log 2.9851 200 0.8926 0.3847 0.8926 0.9448
No log 3.0149 202 0.8261 0.3754 0.8261 0.9089
No log 3.0448 204 0.8420 0.3891 0.8420 0.9176
No log 3.0746 206 0.8024 0.3653 0.8024 0.8958
No log 3.1045 208 0.7984 0.3653 0.7984 0.8935
No log 3.1343 210 0.7825 0.3653 0.7825 0.8846
No log 3.1642 212 0.7816 0.3409 0.7816 0.8841
No log 3.1940 214 0.7921 0.3409 0.7921 0.8900
No log 3.2239 216 0.8420 0.3653 0.8420 0.9176
No log 3.2537 218 0.8740 0.3219 0.8740 0.9349
No log 3.2836 220 0.8287 0.3219 0.8287 0.9103
No log 3.3134 222 0.7498 0.3746 0.7498 0.8659
No log 3.3433 224 0.7296 0.3814 0.7296 0.8542
No log 3.3731 226 0.7497 0.3867 0.7497 0.8658
No log 3.4030 228 0.7527 0.3867 0.7527 0.8676
No log 3.4328 230 0.6977 0.3814 0.6977 0.8353
No log 3.4627 232 0.6812 0.4358 0.6812 0.8254
No log 3.4925 234 0.7284 0.3112 0.7284 0.8535
No log 3.5224 236 0.7380 0.3492 0.7380 0.8591
No log 3.5522 238 0.7360 0.4125 0.7360 0.8579
No log 3.5821 240 0.8520 0.3280 0.8520 0.9230
No log 3.6119 242 1.0654 0.3381 1.0654 1.0322
No log 3.6418 244 1.1409 0.3161 1.1409 1.0681
No log 3.6716 246 1.1266 0.2806 1.1266 1.0614
No log 3.7015 248 1.0009 0.2836 1.0009 1.0004
No log 3.7313 250 0.8492 0.0888 0.8492 0.9215
No log 3.7612 252 0.7942 -0.0143 0.7942 0.8912
No log 3.7910 254 0.7796 0.0361 0.7796 0.8829
No log 3.8209 256 0.7807 0.1495 0.7807 0.8835
No log 3.8507 258 0.7773 0.1598 0.7773 0.8817
No log 3.8806 260 0.7446 0.2027 0.7446 0.8629
No log 3.9104 262 0.7313 0.2345 0.7313 0.8552
No log 3.9403 264 0.7243 0.3673 0.7243 0.8510
No log 3.9701 266 0.8157 0.3319 0.8157 0.9031
No log 4.0 268 0.8431 0.3688 0.8431 0.9182
No log 4.0299 270 0.7730 0.3955 0.7730 0.8792
No log 4.0597 272 0.6657 0.4397 0.6657 0.8159
No log 4.0896 274 0.6564 0.4182 0.6564 0.8102
No log 4.1194 276 0.6778 0.3958 0.6778 0.8233
No log 4.1493 278 0.6388 0.4160 0.6388 0.7993
No log 4.1791 280 0.6368 0.4660 0.6368 0.7980
No log 4.2090 282 0.7783 0.3710 0.7783 0.8822
No log 4.2388 284 0.9269 0.3431 0.9269 0.9628
No log 4.2687 286 0.9434 0.3377 0.9434 0.9713
No log 4.2985 288 0.8478 0.3803 0.8478 0.9208
No log 4.3284 290 0.7288 0.4294 0.7288 0.8537
No log 4.3582 292 0.6820 0.3788 0.6820 0.8259
No log 4.3881 294 0.6598 0.3813 0.6598 0.8123
No log 4.4179 296 0.6567 0.4504 0.6567 0.8103
No log 4.4478 298 0.6530 0.4402 0.6530 0.8081
No log 4.4776 300 0.6519 0.4006 0.6519 0.8074
No log 4.5075 302 0.6580 0.4006 0.6580 0.8112
No log 4.5373 304 0.6550 0.3955 0.6550 0.8093
No log 4.5672 306 0.6482 0.4322 0.6482 0.8051
No log 4.5970 308 0.6190 0.4380 0.6190 0.7868
No log 4.6269 310 0.6181 0.3886 0.6181 0.7862
No log 4.6567 312 0.6275 0.3198 0.6275 0.7921
No log 4.6866 314 0.6440 0.4013 0.6440 0.8025
No log 4.7164 316 0.6785 0.4292 0.6785 0.8237
No log 4.7463 318 0.7532 0.4513 0.7532 0.8678
No log 4.7761 320 0.8270 0.3754 0.8270 0.9094
No log 4.8060 322 0.8433 0.3359 0.8433 0.9183
No log 4.8358 324 0.8140 0.3746 0.8140 0.9022
No log 4.8657 326 0.7994 0.4144 0.7994 0.8941
No log 4.8955 328 0.8128 0.3746 0.8128 0.9016
No log 4.9254 330 0.8074 0.3819 0.8074 0.8985
No log 4.9552 332 0.8184 0.3564 0.8184 0.9046
No log 4.9851 334 0.8765 0.3310 0.8765 0.9362
No log 5.0149 336 0.9053 0.3160 0.9053 0.9515
No log 5.0448 338 0.8650 0.3319 0.8650 0.9300
No log 5.0746 340 0.8244 0.1225 0.8244 0.9079
No log 5.1045 342 0.8364 -0.0149 0.8364 0.9146
No log 5.1343 344 0.8558 -0.0171 0.8558 0.9251
No log 5.1642 346 0.8982 0.1332 0.8982 0.9477
No log 5.1940 348 1.0614 0.2948 1.0614 1.0302
No log 5.2239 350 1.1987 0.2815 1.1987 1.0949
No log 5.2537 352 1.1284 0.2815 1.1284 1.0622
No log 5.2836 354 0.9131 0.3022 0.9131 0.9556
No log 5.3134 356 0.7388 0.4036 0.7388 0.8595
No log 5.3433 358 0.7036 0.3569 0.7036 0.8388
No log 5.3731 360 0.6963 0.3340 0.6963 0.8345
No log 5.4030 362 0.7116 0.3051 0.7116 0.8435
No log 5.4328 364 0.7176 0.2953 0.7176 0.8471
No log 5.4627 366 0.7002 0.3183 0.7002 0.8368
No log 5.4925 368 0.6897 0.2813 0.6897 0.8305
No log 5.5224 370 0.6749 0.2813 0.6749 0.8216
No log 5.5522 372 0.6678 0.2751 0.6678 0.8172
No log 5.5821 374 0.6666 0.4513 0.6666 0.8165
No log 5.6119 376 0.7442 0.4051 0.7442 0.8627
No log 5.6418 378 0.8220 0.3803 0.8220 0.9066
No log 5.6716 380 0.7998 0.3623 0.7998 0.8943
No log 5.7015 382 0.6911 0.4522 0.6911 0.8313
No log 5.7313 384 0.6045 0.4875 0.6045 0.7775
No log 5.7612 386 0.5955 0.4788 0.5955 0.7717
No log 5.7910 388 0.6028 0.4747 0.6028 0.7764
No log 5.8209 390 0.6548 0.5086 0.6548 0.8092
No log 5.8507 392 0.7410 0.4531 0.7410 0.8608
No log 5.8806 394 0.8697 0.3346 0.8697 0.9326
No log 5.9104 396 0.9176 0.3290 0.9176 0.9579
No log 5.9403 398 0.8136 0.3803 0.8136 0.9020
No log 5.9701 400 0.6660 0.5131 0.6660 0.8161
No log 6.0 402 0.6002 0.5104 0.6002 0.7747
No log 6.0299 404 0.6080 0.5104 0.6080 0.7798
No log 6.0597 406 0.6338 0.4769 0.6338 0.7961
No log 6.0896 408 0.7123 0.4531 0.7123 0.8440
No log 6.1194 410 0.7168 0.4684 0.7168 0.8466
No log 6.1493 412 0.6891 0.4684 0.6891 0.8301
No log 6.1791 414 0.7123 0.4684 0.7123 0.8440
No log 6.2090 416 0.6815 0.4916 0.6815 0.8255
No log 6.2388 418 0.6770 0.4484 0.6770 0.8228
No log 6.2687 420 0.7292 0.4684 0.7292 0.8539
No log 6.2985 422 0.7579 0.4464 0.7579 0.8706
No log 6.3284 424 0.7484 0.4451 0.7484 0.8651
No log 6.3582 426 0.7378 0.2907 0.7378 0.8590
No log 6.3881 428 0.7039 0.1760 0.7039 0.8390
No log 6.4179 430 0.6748 0.2360 0.6748 0.8214
No log 6.4478 432 0.6663 0.4336 0.6663 0.8163
No log 6.4776 434 0.7046 0.4644 0.7046 0.8394
No log 6.5075 436 0.6858 0.4644 0.6858 0.8282
No log 6.5373 438 0.6592 0.4660 0.6592 0.8119
No log 6.5672 440 0.6613 0.4265 0.6613 0.8132
No log 6.5970 442 0.6811 0.4205 0.6811 0.8253
No log 6.6269 444 0.6766 0.4149 0.6766 0.8225
No log 6.6567 446 0.6439 0.3460 0.6439 0.8024
No log 6.6866 448 0.6855 0.4769 0.6855 0.8280
No log 6.7164 450 0.8390 0.3740 0.8390 0.9160
No log 6.7463 452 0.9793 0.3105 0.9793 0.9896
No log 6.7761 454 1.0230 0.3214 1.0230 1.0114
No log 6.8060 456 1.0005 0.3324 1.0005 1.0003
No log 6.8358 458 0.9505 0.2881 0.9505 0.9749
No log 6.8657 460 0.8504 0.2784 0.8504 0.9222
No log 6.8955 462 0.7629 0.2227 0.7629 0.8735
No log 6.9254 464 0.7360 0.0344 0.7360 0.8579
No log 6.9552 466 0.7213 0.1176 0.7213 0.8493
No log 6.9851 468 0.6897 0.2193 0.6897 0.8305
No log 7.0149 470 0.6687 0.3183 0.6687 0.8177
No log 7.0448 472 0.6960 0.4186 0.6960 0.8343
No log 7.0746 474 0.7304 0.4389 0.7304 0.8546
No log 7.1045 476 0.7239 0.4606 0.7239 0.8508
No log 7.1343 478 0.6777 0.4513 0.6777 0.8232
No log 7.1642 480 0.6251 0.4473 0.6251 0.7906
No log 7.1940 482 0.6038 0.3625 0.6038 0.7770
No log 7.2239 484 0.6036 0.3625 0.6036 0.7769
No log 7.2537 486 0.6270 0.3640 0.6270 0.7918
No log 7.2836 488 0.7237 0.4247 0.7237 0.8507
No log 7.3134 490 0.8599 0.3333 0.8599 0.9273
No log 7.3433 492 0.9141 0.3439 0.9141 0.9561
No log 7.3731 494 0.9164 0.3439 0.9164 0.9573
No log 7.4030 496 0.8414 0.3100 0.8414 0.9173
No log 7.4328 498 0.7507 0.4072 0.7507 0.8664
0.3886 7.4627 500 0.7018 0.3814 0.7018 0.8378
0.3886 7.4925 502 0.6736 0.3267 0.6736 0.8208
0.3886 7.5224 504 0.6708 0.2943 0.6708 0.8190
0.3886 7.5522 506 0.6729 0.3280 0.6729 0.8203
0.3886 7.5821 508 0.6772 0.3702 0.6772 0.8229
0.3886 7.6119 510 0.6876 0.3460 0.6876 0.8292
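Validation Qwk peaks mid-training (0.5131 at step 400) and ends lower (0.3460 at step 510), so selecting the checkpoint by best Qwk rather than taking the final one may be preferable. A minimal sketch of scanning (step, Qwk) pairs excerpted from the log above:

```python
# (step, validation Qwk) pairs taken from the training log.
log = [(400, 0.5131), (402, 0.5104), (500, 0.3814), (510, 0.3460)]

# Pick the checkpoint whose validation Qwk is highest.
best_step, best_qwk = max(log, key=lambda row: row[1])
```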

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
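To reproduce this environment, the listed versions can be pinned with pip; the PyTorch line assumes the CUDA 11.8 wheel index, matching the "+cu118" build above:

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```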
Model size: 0.1B params (safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02 (one of 4,019 finetunes of the base model)