ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k5_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6899
  • Qwk: 0.5032
  • Mse: 0.6899
  • Rmse: 0.8306
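These metrics can be reproduced from model predictions with scikit-learn; a minimal sketch, where the score arrays are hypothetical stand-ins since the evaluation data is not published:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold and predicted organization scores (integer labels)
y_true = np.array([0, 1, 2, 2, 3, 1])
y_pred = np.array([0, 1, 1, 2, 3, 2])

# Quadratically weighted kappa (Qwk), as used for ordinal scoring tasks
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(qwk, mse, rmse)
```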

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
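The hyperparameters above map onto a `transformers.TrainingArguments` configuration roughly as follows; this is a sketch, and `output_dir` is a placeholder (Adam betas and epsilon are the library defaults, matching the values listed):

```python
from transformers import TrainingArguments

# Sketch of the training configuration listed above; output_dir is hypothetical.
args = TrainingArguments(
    output_dir="arabert-task5-organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```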

Training results

Training loss is only logged every 500 steps, so rows before step 500 show "No log". Columns: training loss, epoch, step, validation loss, QWK, MSE, RMSE.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0714 2 3.8281 -0.0106 3.8281 1.9566
No log 0.1429 4 1.8621 0.0123 1.8621 1.3646
No log 0.2143 6 1.2259 0.1576 1.2259 1.1072
No log 0.2857 8 1.0504 0.2991 1.0504 1.0249
No log 0.3571 10 1.0613 0.2441 1.0613 1.0302
No log 0.4286 12 1.1457 0.1821 1.1457 1.0704
No log 0.5 14 1.1714 0.1848 1.1714 1.0823
No log 0.5714 16 1.0734 0.2467 1.0734 1.0361
No log 0.6429 18 1.0366 0.1233 1.0366 1.0182
No log 0.7143 20 1.0719 0.0981 1.0719 1.0353
No log 0.7857 22 1.1265 0.1321 1.1265 1.0614
No log 0.8571 24 1.1238 0.1107 1.1238 1.0601
No log 0.9286 26 1.0749 0.1131 1.0749 1.0368
No log 1.0 28 1.2506 0.1711 1.2506 1.1183
No log 1.0714 30 1.7667 0.0057 1.7667 1.3292
No log 1.1429 32 1.7460 -0.0321 1.7460 1.3213
No log 1.2143 34 1.3730 0.0613 1.3730 1.1717
No log 1.2857 36 1.0829 0.2611 1.0829 1.0406
No log 1.3571 38 0.9910 0.2523 0.9910 0.9955
No log 1.4286 40 1.0842 0.2632 1.0842 1.0413
No log 1.5 42 1.0750 0.2569 1.0750 1.0368
No log 1.5714 44 1.1568 0.1698 1.1568 1.0755
No log 1.6429 46 1.5514 -0.0256 1.5514 1.2455
No log 1.7143 48 1.7968 -0.0508 1.7968 1.3404
No log 1.7857 50 1.6861 -0.0078 1.6861 1.2985
No log 1.8571 52 1.4247 -0.0148 1.4247 1.1936
No log 1.9286 54 1.1175 0.1618 1.1175 1.0571
No log 2.0 56 0.8767 0.4803 0.8767 0.9363
No log 2.0714 58 0.8822 0.2499 0.8822 0.9393
No log 2.1429 60 0.8756 0.2474 0.8756 0.9357
No log 2.2143 62 0.8307 0.3902 0.8307 0.9114
No log 2.2857 64 0.8417 0.5564 0.8417 0.9175
No log 2.3571 66 0.8686 0.3954 0.8686 0.9320
No log 2.4286 68 0.8318 0.4347 0.8318 0.9121
No log 2.5 70 0.7543 0.5288 0.7543 0.8685
No log 2.5714 72 0.7009 0.6014 0.7009 0.8372
No log 2.6429 74 0.6956 0.5319 0.6956 0.8340
No log 2.7143 76 0.7105 0.5630 0.7105 0.8429
No log 2.7857 78 0.7096 0.5491 0.7096 0.8424
No log 2.8571 80 0.6430 0.6349 0.6430 0.8019
No log 2.9286 82 0.7224 0.5098 0.7224 0.8499
No log 3.0 84 0.8416 0.3793 0.8416 0.9174
No log 3.0714 86 0.8019 0.4389 0.8019 0.8955
No log 3.1429 88 0.6795 0.5432 0.6795 0.8243
No log 3.2143 90 0.7131 0.5927 0.7131 0.8445
No log 3.2857 92 0.8921 0.4435 0.8921 0.9445
No log 3.3571 94 0.9213 0.4087 0.9213 0.9598
No log 3.4286 96 0.7709 0.5579 0.7709 0.8780
No log 3.5 98 0.6734 0.5634 0.6734 0.8206
No log 3.5714 100 0.6619 0.5475 0.6619 0.8136
No log 3.6429 102 0.6536 0.5785 0.6536 0.8084
No log 3.7143 104 0.7054 0.5048 0.7054 0.8399
No log 3.7857 106 0.7607 0.5220 0.7607 0.8722
No log 3.8571 108 0.7140 0.5864 0.7140 0.8450
No log 3.9286 110 0.6957 0.4944 0.6957 0.8341
No log 4.0 112 0.7336 0.3793 0.7336 0.8565
No log 4.0714 114 0.7084 0.4450 0.7084 0.8417
No log 4.1429 116 0.6742 0.5989 0.6742 0.8211
No log 4.2143 118 0.6631 0.5960 0.6631 0.8143
No log 4.2857 120 0.6597 0.5597 0.6597 0.8122
No log 4.3571 122 0.6727 0.5759 0.6727 0.8202
No log 4.4286 124 0.6772 0.5983 0.6772 0.8229
No log 4.5 126 0.6782 0.5610 0.6782 0.8235
No log 4.5714 128 0.7013 0.5305 0.7013 0.8374
No log 4.6429 130 0.7268 0.5039 0.7268 0.8525
No log 4.7143 132 0.6865 0.5810 0.6865 0.8286
No log 4.7857 134 0.6714 0.5747 0.6714 0.8194
No log 4.8571 136 0.6721 0.5747 0.6721 0.8198
No log 4.9286 138 0.6652 0.5735 0.6652 0.8156
No log 5.0 140 0.6657 0.5949 0.6657 0.8159
No log 5.0714 142 0.6910 0.5078 0.6910 0.8313
No log 5.1429 144 0.6987 0.5078 0.6987 0.8359
No log 5.2143 146 0.6782 0.4914 0.6782 0.8235
No log 5.2857 148 0.6768 0.5575 0.6768 0.8227
No log 5.3571 150 0.6506 0.6602 0.6506 0.8066
No log 5.4286 152 0.6534 0.6368 0.6534 0.8083
No log 5.5 154 0.6782 0.5862 0.6782 0.8235
No log 5.5714 156 0.6262 0.6733 0.6262 0.7914
No log 5.6429 158 0.6034 0.6796 0.6034 0.7768
No log 5.7143 160 0.6007 0.5983 0.6007 0.7750
No log 5.7857 162 0.6183 0.5679 0.6183 0.7863
No log 5.8571 164 0.6025 0.6597 0.6025 0.7762
No log 5.9286 166 0.6609 0.6071 0.6609 0.8130
No log 6.0 168 0.6005 0.6025 0.6005 0.7749
No log 6.0714 170 0.5419 0.6296 0.5419 0.7361
No log 6.1429 172 0.5810 0.6935 0.5810 0.7623
No log 6.2143 174 0.5583 0.6786 0.5583 0.7472
No log 6.2857 176 0.5616 0.6728 0.5616 0.7494
No log 6.3571 178 0.5749 0.6360 0.5749 0.7582
No log 6.4286 180 0.6095 0.6447 0.6095 0.7807
No log 6.5 182 0.7195 0.5377 0.7195 0.8482
No log 6.5714 184 0.6956 0.5924 0.6956 0.8340
No log 6.6429 186 0.6012 0.6249 0.6012 0.7754
No log 6.7143 188 0.6980 0.5647 0.6980 0.8355
No log 6.7857 190 0.6909 0.5842 0.6909 0.8312
No log 6.8571 192 0.6512 0.6675 0.6512 0.8070
No log 6.9286 194 0.8101 0.5179 0.8101 0.9000
No log 7.0 196 0.9329 0.4740 0.9329 0.9659
No log 7.0714 198 0.8520 0.4826 0.8520 0.9231
No log 7.1429 200 0.6294 0.6729 0.6294 0.7933
No log 7.2143 202 0.6190 0.6266 0.6190 0.7868
No log 7.2857 204 0.6407 0.5864 0.6407 0.8005
No log 7.3571 206 0.5968 0.6209 0.5968 0.7725
No log 7.4286 208 0.6147 0.5644 0.6147 0.7840
No log 7.5 210 0.6271 0.5759 0.6271 0.7919
No log 7.5714 212 0.6777 0.5728 0.6777 0.8232
No log 7.6429 214 0.7219 0.5725 0.7219 0.8496
No log 7.7143 216 0.6983 0.5842 0.6983 0.8356
No log 7.7857 218 0.6354 0.5944 0.6354 0.7971
No log 7.8571 220 0.6611 0.6508 0.6611 0.8131
No log 7.9286 222 0.6996 0.6654 0.6996 0.8364
No log 8.0 224 0.6671 0.6535 0.6671 0.8167
No log 8.0714 226 0.6584 0.5688 0.6584 0.8114
No log 8.1429 228 0.8005 0.5675 0.8005 0.8947
No log 8.2143 230 0.8793 0.4667 0.8793 0.9377
No log 8.2857 232 0.7990 0.5051 0.7990 0.8939
No log 8.3571 234 0.7228 0.5334 0.7228 0.8502
No log 8.4286 236 0.6953 0.5332 0.6953 0.8338
No log 8.5 238 0.7002 0.4829 0.7002 0.8368
No log 8.5714 240 0.6701 0.5859 0.6701 0.8186
No log 8.6429 242 0.6933 0.5634 0.6933 0.8327
No log 8.7143 244 0.7128 0.5833 0.7128 0.8443
No log 8.7857 246 0.7387 0.5595 0.7387 0.8595
No log 8.8571 248 0.6865 0.5565 0.6865 0.8286
No log 8.9286 250 0.6947 0.4675 0.6947 0.8335
No log 9.0 252 0.7266 0.5093 0.7266 0.8524
No log 9.0714 254 0.7493 0.4864 0.7493 0.8656
No log 9.1429 256 0.7063 0.5048 0.7063 0.8404
No log 9.2143 258 0.6788 0.5835 0.6788 0.8239
No log 9.2857 260 0.6584 0.5835 0.6584 0.8114
No log 9.3571 262 0.6588 0.5654 0.6588 0.8117
No log 9.4286 264 0.6381 0.6479 0.6381 0.7988
No log 9.5 266 0.6619 0.5895 0.6619 0.8136
No log 9.5714 268 0.6783 0.5817 0.6783 0.8236
No log 9.6429 270 0.6665 0.5895 0.6665 0.8164
No log 9.7143 272 0.6509 0.6229 0.6509 0.8068
No log 9.7857 274 0.6822 0.5774 0.6822 0.8259
No log 9.8571 276 0.7641 0.5756 0.7641 0.8741
No log 9.9286 278 0.7589 0.5566 0.7589 0.8712
No log 10.0 280 0.7206 0.5634 0.7206 0.8489
No log 10.0714 282 0.7097 0.5654 0.7097 0.8424
No log 10.1429 284 0.7096 0.5751 0.7096 0.8424
No log 10.2143 286 0.7194 0.5923 0.7194 0.8482
No log 10.2857 288 0.7787 0.5639 0.7787 0.8824
No log 10.3571 290 0.8035 0.5604 0.8035 0.8964
No log 10.4286 292 0.6983 0.5766 0.6983 0.8356
No log 10.5 294 0.6529 0.6288 0.6529 0.8080
No log 10.5714 296 0.6711 0.5817 0.6711 0.8192
No log 10.6429 298 0.6849 0.5841 0.6849 0.8276
No log 10.7143 300 0.6787 0.5684 0.6787 0.8238
No log 10.7857 302 0.6828 0.6704 0.6828 0.8263
No log 10.8571 304 0.6895 0.6092 0.6895 0.8304
No log 10.9286 306 0.6932 0.5909 0.6932 0.8326
No log 11.0 308 0.6931 0.6063 0.6931 0.8325
No log 11.0714 310 0.6796 0.6259 0.6796 0.8244
No log 11.1429 312 0.6783 0.6063 0.6783 0.8236
No log 11.2143 314 0.6855 0.5708 0.6855 0.8280
No log 11.2857 316 0.6741 0.6008 0.6741 0.8210
No log 11.3571 318 0.6561 0.6634 0.6561 0.8100
No log 11.4286 320 0.6490 0.6709 0.6490 0.8056
No log 11.5 322 0.6248 0.6813 0.6248 0.7904
No log 11.5714 324 0.6066 0.6709 0.6066 0.7789
No log 11.6429 326 0.6163 0.6894 0.6163 0.7850
No log 11.7143 328 0.5952 0.6880 0.5952 0.7715
No log 11.7857 330 0.5863 0.6978 0.5863 0.7657
No log 11.8571 332 0.5835 0.6796 0.5835 0.7638
No log 11.9286 334 0.6304 0.6249 0.6304 0.7940
No log 12.0 336 0.6455 0.6337 0.6454 0.8034
No log 12.0714 338 0.6654 0.6328 0.6654 0.8157
No log 12.1429 340 0.6765 0.6337 0.6765 0.8225
No log 12.2143 342 0.7130 0.6064 0.7130 0.8444
No log 12.2857 344 0.7346 0.5766 0.7346 0.8571
No log 12.3571 346 0.6797 0.6555 0.6797 0.8244
No log 12.4286 348 0.6642 0.5843 0.6642 0.8150
No log 12.5 350 0.7170 0.4888 0.7170 0.8468
No log 12.5714 352 0.7590 0.5131 0.7590 0.8712
No log 12.6429 354 0.7010 0.5355 0.7010 0.8373
No log 12.7143 356 0.6424 0.5666 0.6424 0.8015
No log 12.7857 358 0.6768 0.5546 0.6768 0.8227
No log 12.8571 360 0.7288 0.5183 0.7288 0.8537
No log 12.9286 362 0.7016 0.5577 0.7016 0.8376
No log 13.0 364 0.7095 0.4781 0.7095 0.8423
No log 13.0714 366 0.7234 0.5221 0.7234 0.8505
No log 13.1429 368 0.7700 0.4836 0.7700 0.8775
No log 13.2143 370 0.7901 0.5707 0.7901 0.8889
No log 13.2857 372 0.7307 0.5084 0.7307 0.8548
No log 13.3571 374 0.6827 0.5247 0.6827 0.8263
No log 13.4286 376 0.6767 0.5183 0.6767 0.8226
No log 13.5 378 0.6688 0.5422 0.6688 0.8178
No log 13.5714 380 0.7125 0.5413 0.7125 0.8441
No log 13.6429 382 0.7775 0.5134 0.7775 0.8818
No log 13.7143 384 0.7600 0.5707 0.7600 0.8718
No log 13.7857 386 0.7625 0.5707 0.7625 0.8732
No log 13.8571 388 0.7795 0.5686 0.7795 0.8829
No log 13.9286 390 0.7619 0.5076 0.7619 0.8729
No log 14.0 392 0.7369 0.5235 0.7369 0.8584
No log 14.0714 394 0.7359 0.4466 0.7359 0.8578
No log 14.1429 396 0.7298 0.4208 0.7298 0.8543
No log 14.2143 398 0.7254 0.4660 0.7254 0.8517
No log 14.2857 400 0.7519 0.5206 0.7519 0.8671
No log 14.3571 402 0.7776 0.4696 0.7776 0.8818
No log 14.4286 404 0.7538 0.4832 0.7538 0.8682
No log 14.5 406 0.7285 0.5666 0.7285 0.8535
No log 14.5714 408 0.7171 0.4660 0.7171 0.8468
No log 14.6429 410 0.7324 0.5455 0.7324 0.8558
No log 14.7143 412 0.7496 0.4821 0.7496 0.8658
No log 14.7857 414 0.7331 0.4581 0.7331 0.8562
No log 14.8571 416 0.6851 0.4867 0.6851 0.8277
No log 14.9286 418 0.6630 0.5536 0.6630 0.8143
No log 15.0 420 0.6579 0.5795 0.6579 0.8111
No log 15.0714 422 0.6758 0.5430 0.6758 0.8221
No log 15.1429 424 0.6617 0.5388 0.6617 0.8135
No log 15.2143 426 0.6590 0.6237 0.6590 0.8118
No log 15.2857 428 0.7311 0.6071 0.7311 0.8551
No log 15.3571 430 0.7866 0.5655 0.7866 0.8869
No log 15.4286 432 0.7493 0.5766 0.7493 0.8656
No log 15.5 434 0.6942 0.5516 0.6942 0.8332
No log 15.5714 436 0.6852 0.5425 0.6852 0.8277
No log 15.6429 438 0.6970 0.4843 0.6970 0.8348
No log 15.7143 440 0.7273 0.5181 0.7273 0.8528
No log 15.7857 442 0.7419 0.5370 0.7419 0.8613
No log 15.8571 444 0.7082 0.5410 0.7082 0.8416
No log 15.9286 446 0.6894 0.4828 0.6894 0.8303
No log 16.0 448 0.7151 0.4738 0.7151 0.8456
No log 16.0714 450 0.7010 0.5345 0.7010 0.8373
No log 16.1429 452 0.6605 0.6154 0.6605 0.8127
No log 16.2143 454 0.7071 0.5697 0.7071 0.8409
No log 16.2857 456 0.7744 0.5766 0.7744 0.8800
No log 16.3571 458 0.7282 0.5777 0.7282 0.8534
No log 16.4286 460 0.6707 0.5455 0.6707 0.8190
No log 16.5 462 0.7370 0.4857 0.7370 0.8585
No log 16.5714 464 0.8626 0.4522 0.8626 0.9288
No log 16.6429 466 0.8489 0.4505 0.8489 0.9213
No log 16.7143 468 0.7777 0.3896 0.7777 0.8819
No log 16.7857 470 0.7975 0.5516 0.7975 0.8930
No log 16.8571 472 0.8637 0.5019 0.8637 0.9294
No log 16.9286 474 0.8581 0.5242 0.8581 0.9264
No log 17.0 476 0.7841 0.5602 0.7841 0.8855
No log 17.0714 478 0.7427 0.4197 0.7427 0.8618
No log 17.1429 480 0.7283 0.4563 0.7283 0.8534
No log 17.2143 482 0.7047 0.5381 0.7047 0.8395
No log 17.2857 484 0.6887 0.5540 0.6887 0.8299
No log 17.3571 486 0.6751 0.5830 0.6751 0.8216
No log 17.4286 488 0.6471 0.6269 0.6471 0.8044
No log 17.5 490 0.6343 0.6045 0.6343 0.7964
No log 17.5714 492 0.6541 0.6109 0.6541 0.8088
No log 17.6429 494 0.6437 0.6109 0.6437 0.8023
No log 17.7143 496 0.6186 0.6337 0.6186 0.7865
No log 17.7857 498 0.6081 0.6186 0.6081 0.7798
0.3292 17.8571 500 0.6231 0.6733 0.6231 0.7893
0.3292 17.9286 502 0.6452 0.6317 0.6452 0.8032
0.3292 18.0 504 0.6597 0.5771 0.6597 0.8122
0.3292 18.0714 506 0.6793 0.5419 0.6793 0.8242
0.3292 18.1429 508 0.6932 0.4794 0.6932 0.8326
0.3292 18.2143 510 0.6899 0.5032 0.6899 0.8306
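Since the objective is mean-squared-error regression, the reported Loss equals the MSE and RMSE is its square root; the final row above is internally consistent:

```python
import math

# Final evaluation row (epoch 18.2143): Loss, Qwk, Mse, Rmse as reported above
loss, qwk, mse, rmse = 0.6899, 0.5032, 0.6899, 0.8306

assert loss == mse                        # loss is reported as the MSE itself
assert abs(math.sqrt(mse) - rmse) < 5e-4  # rmse = sqrt(mse), to printed precision
print("consistent")
```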

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k5_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.