ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k7_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6083
  • Qwk: 0.6054
  • Mse: 0.6083
  • Rmse: 0.7799
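These metrics are internally consistent: Rmse is just the square root of Mse (√0.6083 ≈ 0.7799), and Qwk is quadratic weighted kappa, which scores agreement between predicted and true ordinal labels while penalizing disagreements by the squared class distance. A minimal stdlib-only sketch of both (the function names are illustrative, not from this repo's code):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Agreement between two integer rating lists, with disagreements
    weighted by squared distance between classes (1.0 = perfect)."""
    n = len(y_true)
    # Observed co-occurrence counts
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected counts from the marginal histograms
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

def rmse_from_mse(mse):
    """RMSE is the square root of MSE."""
    return math.sqrt(mse)
```

For example, `rmse_from_mse(0.6083)` reproduces the reported 0.7799 (to four decimals), and identical rating lists give a kappa of 1.0.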

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
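With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 2e-05 to 0 over training. A minimal sketch of that schedule; the total of 3700 optimizer steps is an assumption inferred from the results table (epoch 14.0 at step 518 implies 37 steps/epoch, times 100 epochs):

```python
def linear_lr(step, base_lr=2e-05, total_steps=3700, warmup_steps=0):
    """HF-style linear schedule: optional linear warmup, then linear
    decay to zero by total_steps. total_steps=3700 is an inferred
    assumption, not a value stated in this card."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)
```

For example, the rate starts at 2e-05, is halved at the midpoint (step 1850), and reaches 0 at step 3700.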

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0541 2 4.1126 0.0176 4.1126 2.0280
No log 0.1081 4 2.3443 0.0110 2.3443 1.5311
No log 0.1622 6 1.8839 0.0741 1.8839 1.3725
No log 0.2162 8 1.3728 0.0408 1.3728 1.1717
No log 0.2703 10 1.0649 0.2711 1.0649 1.0319
No log 0.3243 12 0.9581 0.1962 0.9581 0.9788
No log 0.3784 14 0.9388 0.2288 0.9388 0.9689
No log 0.4324 16 0.9273 0.2035 0.9273 0.9630
No log 0.4865 18 0.9431 0.2569 0.9431 0.9711
No log 0.5405 20 1.1566 0.1602 1.1567 1.0755
No log 0.5946 22 1.3522 0.1500 1.3522 1.1629
No log 0.6486 24 1.2103 0.2592 1.2103 1.1001
No log 0.7027 26 0.9883 0.3890 0.9883 0.9941
No log 0.7568 28 0.9172 0.2015 0.9172 0.9577
No log 0.8108 30 0.9382 0.1761 0.9382 0.9686
No log 0.8649 32 0.9059 0.3133 0.9059 0.9518
No log 0.9189 34 0.9554 0.4024 0.9554 0.9775
No log 0.9730 36 1.2918 0.2170 1.2918 1.1366
No log 1.0270 38 2.0578 0.1240 2.0578 1.4345
No log 1.0811 40 1.6328 0.1564 1.6328 1.2778
No log 1.1351 42 1.2469 0.2592 1.2469 1.1166
No log 1.1892 44 1.0999 0.3088 1.0999 1.0488
No log 1.2432 46 0.9856 0.2711 0.9856 0.9928
No log 1.2973 48 0.9677 0.3407 0.9677 0.9837
No log 1.3514 50 0.9308 0.3557 0.9308 0.9648
No log 1.4054 52 0.9118 0.3737 0.9118 0.9549
No log 1.4595 54 0.9266 0.3852 0.9266 0.9626
No log 1.5135 56 0.9443 0.3852 0.9443 0.9718
No log 1.5676 58 0.9566 0.3570 0.9566 0.9781
No log 1.6216 60 0.8942 0.3852 0.8942 0.9456
No log 1.6757 62 0.8683 0.3517 0.8683 0.9318
No log 1.7297 64 0.9518 0.3457 0.9518 0.9756
No log 1.7838 66 0.9716 0.2217 0.9716 0.9857
No log 1.8378 68 0.9874 0.3478 0.9874 0.9937
No log 1.8919 70 0.9476 0.3478 0.9476 0.9734
No log 1.9459 72 0.8828 0.3836 0.8828 0.9396
No log 2.0 74 0.8365 0.4063 0.8365 0.9146
No log 2.0541 76 0.8487 0.2449 0.8487 0.9213
No log 2.1081 78 0.8251 0.2790 0.8251 0.9083
No log 2.1622 80 0.7918 0.3773 0.7918 0.8898
No log 2.2162 82 0.8348 0.4382 0.8348 0.9137
No log 2.2703 84 0.9281 0.4026 0.9281 0.9634
No log 2.3243 86 0.8556 0.4518 0.8556 0.9250
No log 2.3784 88 0.7675 0.4501 0.7675 0.8761
No log 2.4324 90 0.7507 0.5117 0.7507 0.8664
No log 2.4865 92 0.8604 0.4222 0.8604 0.9276
No log 2.5405 94 0.9191 0.3280 0.9191 0.9587
No log 2.5946 96 0.8956 0.4713 0.8956 0.9464
No log 2.6486 98 0.9678 0.3795 0.9678 0.9837
No log 2.7027 100 0.9264 0.3795 0.9264 0.9625
No log 2.7568 102 0.8134 0.4439 0.8134 0.9019
No log 2.8108 104 0.7699 0.5799 0.7699 0.8774
No log 2.8649 106 0.8122 0.4681 0.8122 0.9012
No log 2.9189 108 0.8114 0.4954 0.8114 0.9008
No log 2.9730 110 0.7817 0.4203 0.7817 0.8841
No log 3.0270 112 0.8007 0.4754 0.8007 0.8948
No log 3.0811 114 0.7940 0.5342 0.7940 0.8911
No log 3.1351 116 0.7800 0.4581 0.7800 0.8832
No log 3.1892 118 0.7925 0.4180 0.7925 0.8902
No log 3.2432 120 0.7948 0.4180 0.7948 0.8915
No log 3.2973 122 0.8199 0.3784 0.8199 0.9055
No log 3.3514 124 0.8427 0.3455 0.8427 0.9180
No log 3.4054 126 0.9146 0.4086 0.9146 0.9564
No log 3.4595 128 0.9253 0.4239 0.9253 0.9619
No log 3.5135 130 0.9019 0.4207 0.9019 0.9497
No log 3.5676 132 0.8764 0.4175 0.8764 0.9361
No log 3.6216 134 0.9012 0.4667 0.9012 0.9493
No log 3.6757 136 0.7985 0.4681 0.7985 0.8936
No log 3.7297 138 0.7622 0.4943 0.7622 0.8731
No log 3.7838 140 0.7906 0.4681 0.7906 0.8891
No log 3.8378 142 0.7552 0.5070 0.7552 0.8690
No log 3.8919 144 0.7929 0.5370 0.7929 0.8905
No log 3.9459 146 0.7926 0.5356 0.7926 0.8903
No log 4.0 148 0.6705 0.6025 0.6705 0.8188
No log 4.0541 150 0.6562 0.5594 0.6562 0.8100
No log 4.1081 152 0.6906 0.5582 0.6906 0.8310
No log 4.1622 154 0.8184 0.4946 0.8184 0.9047
No log 4.2162 156 0.8230 0.4712 0.8230 0.9072
No log 4.2703 158 0.6951 0.5498 0.6951 0.8337
No log 4.3243 160 0.6554 0.5759 0.6554 0.8096
No log 4.3784 162 0.6790 0.5735 0.6790 0.8240
No log 4.4324 164 0.7537 0.5079 0.7537 0.8682
No log 4.4865 166 0.7869 0.5521 0.7869 0.8871
No log 4.5405 168 0.7104 0.4782 0.7104 0.8428
No log 4.5946 170 0.6903 0.5406 0.6903 0.8308
No log 4.6486 172 0.6744 0.5287 0.6744 0.8212
No log 4.7027 174 0.7194 0.5847 0.7194 0.8482
No log 4.7568 176 0.7557 0.5921 0.7557 0.8693
No log 4.8108 178 0.7368 0.5040 0.7368 0.8584
No log 4.8649 180 0.7217 0.4932 0.7217 0.8495
No log 4.9189 182 0.7244 0.5847 0.7244 0.8511
No log 4.9730 184 0.8463 0.4787 0.8463 0.9199
No log 5.0270 186 0.9294 0.4449 0.9294 0.9641
No log 5.0811 188 0.8849 0.4560 0.8849 0.9407
No log 5.1351 190 0.8022 0.4711 0.8022 0.8957
No log 5.1892 192 0.7287 0.4968 0.7287 0.8536
No log 5.2432 194 0.6920 0.5304 0.6920 0.8319
No log 5.2973 196 0.7127 0.5626 0.7127 0.8442
No log 5.3514 198 0.7400 0.5708 0.7400 0.8602
No log 5.4054 200 0.7287 0.5740 0.7287 0.8537
No log 5.4595 202 0.6967 0.5393 0.6967 0.8347
No log 5.5135 204 0.6843 0.5179 0.6843 0.8272
No log 5.5676 206 0.6650 0.5855 0.6650 0.8155
No log 5.6216 208 0.7734 0.6010 0.7734 0.8794
No log 5.6757 210 0.7976 0.5810 0.7976 0.8931
No log 5.7297 212 0.8084 0.5618 0.8084 0.8991
No log 5.7838 214 0.6791 0.6169 0.6791 0.8240
No log 5.8378 216 0.6401 0.6259 0.6401 0.8001
No log 5.8919 218 0.6237 0.6073 0.6237 0.7898
No log 5.9459 220 0.6305 0.5582 0.6305 0.7941
No log 6.0 222 0.6428 0.6068 0.6428 0.8018
No log 6.0541 224 0.6678 0.6085 0.6678 0.8172
No log 6.1081 226 0.6966 0.5667 0.6966 0.8346
No log 6.1622 228 0.7411 0.5753 0.7411 0.8609
No log 6.2162 230 0.7158 0.6148 0.7158 0.8460
No log 6.2703 232 0.7411 0.6388 0.7411 0.8609
No log 6.3243 234 0.9254 0.4484 0.9254 0.9620
No log 6.3784 236 0.9760 0.4559 0.9760 0.9880
No log 6.4324 238 0.8305 0.4465 0.8305 0.9113
No log 6.4865 240 0.7043 0.5033 0.7043 0.8392
No log 6.5405 242 0.6969 0.4588 0.6969 0.8348
No log 6.5946 244 0.6899 0.5316 0.6899 0.8306
No log 6.6486 246 0.6744 0.5784 0.6744 0.8212
No log 6.7027 248 0.6430 0.5549 0.6430 0.8019
No log 6.7568 250 0.6437 0.5672 0.6437 0.8023
No log 6.8108 252 0.6961 0.5144 0.6961 0.8343
No log 6.8649 254 0.7607 0.4820 0.7607 0.8722
No log 6.9189 256 0.6880 0.5257 0.6880 0.8294
No log 6.9730 258 0.6156 0.5438 0.6156 0.7846
No log 7.0270 260 0.6573 0.5509 0.6573 0.8108
No log 7.0811 262 0.7016 0.5195 0.7016 0.8376
No log 7.1351 264 0.6752 0.5666 0.6752 0.8217
No log 7.1892 266 0.6471 0.6177 0.6471 0.8044
No log 7.2432 268 0.6275 0.6167 0.6275 0.7921
No log 7.2973 270 0.6451 0.5986 0.6451 0.8032
No log 7.3514 272 0.7567 0.5686 0.7567 0.8699
No log 7.4054 274 0.8171 0.5543 0.8171 0.9039
No log 7.4595 276 0.7061 0.5397 0.7061 0.8403
No log 7.5135 278 0.6059 0.6364 0.6059 0.7784
No log 7.5676 280 0.6113 0.6334 0.6113 0.7818
No log 7.6216 282 0.6351 0.5435 0.6351 0.7970
No log 7.6757 284 0.6386 0.5748 0.6386 0.7991
No log 7.7297 286 0.6445 0.6125 0.6445 0.8028
No log 7.7838 288 0.6732 0.5435 0.6732 0.8205
No log 7.8378 290 0.6881 0.5427 0.6881 0.8295
No log 7.8919 292 0.7008 0.5909 0.7008 0.8371
No log 7.9459 294 0.7474 0.6188 0.7474 0.8645
No log 8.0 296 0.8254 0.5934 0.8254 0.9085
No log 8.0541 298 0.8436 0.5560 0.8436 0.9185
No log 8.1081 300 0.7993 0.5872 0.7993 0.8940
No log 8.1622 302 0.7316 0.5773 0.7316 0.8553
No log 8.2162 304 0.7283 0.5716 0.7283 0.8534
No log 8.2703 306 0.7167 0.5599 0.7167 0.8466
No log 8.3243 308 0.7387 0.5640 0.7387 0.8595
No log 8.3784 310 0.8318 0.6266 0.8318 0.9120
No log 8.4324 312 0.8343 0.6434 0.8343 0.9134
No log 8.4865 314 0.7568 0.6188 0.7568 0.8699
No log 8.5405 316 0.7137 0.5654 0.7137 0.8448
No log 8.5946 318 0.7504 0.5412 0.7504 0.8662
No log 8.6486 320 0.7584 0.5520 0.7584 0.8709
No log 8.7027 322 0.6983 0.5323 0.6983 0.8356
No log 8.7568 324 0.6866 0.5451 0.6866 0.8286
No log 8.8108 326 0.7673 0.5395 0.7673 0.8760
No log 8.8649 328 0.8588 0.5020 0.8588 0.9267
No log 8.9189 330 0.8463 0.4334 0.8463 0.9199
No log 8.9730 332 0.7837 0.4089 0.7837 0.8853
No log 9.0270 334 0.7300 0.5490 0.7300 0.8544
No log 9.0811 336 0.6947 0.4947 0.6947 0.8335
No log 9.1351 338 0.6762 0.5409 0.6762 0.8223
No log 9.1892 340 0.6690 0.5835 0.6690 0.8179
No log 9.2432 342 0.7201 0.5927 0.7201 0.8486
No log 9.2973 344 0.7634 0.5844 0.7634 0.8737
No log 9.3514 346 0.7648 0.5714 0.7648 0.8746
No log 9.4054 348 0.7222 0.6061 0.7222 0.8498
No log 9.4595 350 0.7090 0.4968 0.7090 0.8420
No log 9.5135 352 0.7061 0.5346 0.7061 0.8403
No log 9.5676 354 0.7299 0.5117 0.7299 0.8543
No log 9.6216 356 0.7682 0.4832 0.7682 0.8765
No log 9.6757 358 0.7939 0.5231 0.7939 0.8910
No log 9.7297 360 0.7454 0.5482 0.7454 0.8633
No log 9.7838 362 0.6517 0.5669 0.6517 0.8073
No log 9.8378 364 0.6476 0.5971 0.6476 0.8048
No log 9.8919 366 0.6411 0.6195 0.6411 0.8007
No log 9.9459 368 0.6453 0.6066 0.6453 0.8033
No log 10.0 370 0.6991 0.5787 0.6991 0.8361
No log 10.0541 372 0.7358 0.5875 0.7358 0.8578
No log 10.1081 374 0.7148 0.5786 0.7148 0.8455
No log 10.1622 376 0.6796 0.5690 0.6796 0.8244
No log 10.2162 378 0.6748 0.5455 0.6748 0.8215
No log 10.2703 380 0.6674 0.5455 0.6674 0.8170
No log 10.3243 382 0.6694 0.5731 0.6694 0.8182
No log 10.3784 384 0.6708 0.6092 0.6708 0.8190
No log 10.4324 386 0.6535 0.6272 0.6535 0.8084
No log 10.4865 388 0.6520 0.6272 0.6520 0.8075
No log 10.5405 390 0.6553 0.6468 0.6553 0.8095
No log 10.5946 392 0.6879 0.6864 0.6879 0.8294
No log 10.6486 394 0.7678 0.6160 0.7678 0.8762
No log 10.7027 396 0.7948 0.5856 0.7948 0.8915
No log 10.7568 398 0.7412 0.6107 0.7412 0.8609
No log 10.8108 400 0.6767 0.5429 0.6767 0.8226
No log 10.8649 402 0.6661 0.5546 0.6661 0.8162
No log 10.9189 404 0.6611 0.5318 0.6611 0.8131
No log 10.9730 406 0.6683 0.5504 0.6683 0.8175
No log 11.0270 408 0.6617 0.5707 0.6617 0.8134
No log 11.0811 410 0.6364 0.5833 0.6364 0.7978
No log 11.1351 412 0.6136 0.5751 0.6136 0.7833
No log 11.1892 414 0.6393 0.5912 0.6393 0.7995
No log 11.2432 416 0.6774 0.6410 0.6774 0.8231
No log 11.2973 418 0.7011 0.5891 0.7011 0.8373
No log 11.3514 420 0.6666 0.5872 0.6666 0.8165
No log 11.4054 422 0.6453 0.5676 0.6453 0.8033
No log 11.4595 424 0.6317 0.5869 0.6317 0.7948
No log 11.5135 426 0.6073 0.6094 0.6073 0.7793
No log 11.5676 428 0.6472 0.6395 0.6472 0.8045
No log 11.6216 430 0.7431 0.5618 0.7431 0.8620
No log 11.6757 432 0.7787 0.5306 0.7787 0.8824
No log 11.7297 434 0.7510 0.5735 0.7510 0.8666
No log 11.7838 436 0.6501 0.5720 0.6501 0.8063
No log 11.8378 438 0.6346 0.5618 0.6346 0.7966
No log 11.8919 440 0.6345 0.5618 0.6345 0.7966
No log 11.9459 442 0.6433 0.6038 0.6433 0.8020
No log 12.0 444 0.6563 0.5797 0.6563 0.8101
No log 12.0541 446 0.7157 0.5998 0.7157 0.8460
No log 12.1081 448 0.7262 0.5666 0.7262 0.8522
No log 12.1622 450 0.6916 0.5810 0.6916 0.8316
No log 12.2162 452 0.6739 0.5894 0.6739 0.8209
No log 12.2703 454 0.7202 0.5279 0.7202 0.8487
No log 12.3243 456 0.7533 0.4558 0.7533 0.8679
No log 12.3784 458 0.7332 0.5074 0.7332 0.8563
No log 12.4324 460 0.7111 0.5463 0.7111 0.8433
No log 12.4865 462 0.6690 0.5835 0.6690 0.8179
No log 12.5405 464 0.6561 0.5835 0.6561 0.8100
No log 12.5946 466 0.7026 0.5383 0.7026 0.8382
No log 12.6486 468 0.8269 0.5198 0.8269 0.9094
No log 12.7027 470 0.8197 0.5198 0.8197 0.9054
No log 12.7568 472 0.7145 0.5560 0.7145 0.8453
No log 12.8108 474 0.6715 0.6547 0.6715 0.8195
No log 12.8649 476 0.6693 0.6122 0.6693 0.8181
No log 12.9189 478 0.7214 0.4940 0.7214 0.8493
No log 12.9730 480 0.8466 0.4430 0.8466 0.9201
No log 13.0270 482 0.9041 0.4332 0.9041 0.9508
No log 13.0811 484 0.8379 0.5320 0.8379 0.9154
No log 13.1351 486 0.7145 0.5718 0.7145 0.8453
No log 13.1892 488 0.6559 0.5858 0.6559 0.8099
No log 13.2432 490 0.6009 0.6217 0.6009 0.7752
No log 13.2973 492 0.5921 0.6217 0.5921 0.7695
No log 13.3514 494 0.6092 0.6102 0.6092 0.7805
No log 13.4054 496 0.6456 0.6319 0.6456 0.8035
No log 13.4595 498 0.7031 0.6238 0.7031 0.8385
0.3674 13.5135 500 0.6797 0.6238 0.6797 0.8244
0.3674 13.5676 502 0.6209 0.6678 0.6209 0.7880
0.3674 13.6216 504 0.5905 0.6154 0.5905 0.7685
0.3674 13.6757 506 0.5885 0.6272 0.5885 0.7671
0.3674 13.7297 508 0.6060 0.6102 0.6060 0.7784
0.3674 13.7838 510 0.7054 0.6548 0.7054 0.8399
0.3674 13.8378 512 0.7846 0.5417 0.7846 0.8858
0.3674 13.8919 514 0.7519 0.5734 0.7519 0.8671
0.3674 13.9459 516 0.6843 0.5938 0.6843 0.8272
0.3674 14.0 518 0.6386 0.6092 0.6386 0.7991
0.3674 14.0541 520 0.6083 0.6054 0.6083 0.7799

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k7_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02