ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7144
  • Qwk: 0.4691
  • Mse: 0.7144
  • Rmse: 0.8452
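
These metrics fit an ordinal regression setup: Qwk is quadratic weighted kappa, and Rmse is simply the square root of Mse (note that Loss equals Mse in every row of the log below, consistent with a mean-squared-error training objective). A minimal self-contained sketch of how the three relate, using made-up ordinal labels rather than the actual evaluation data:

```python
# Illustrative sketch of the reported metrics (Qwk, Mse, Rmse) on made-up
# ordinal labels -- NOT the actual evaluation data for this model.
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights (the Qwk column)."""
    n = len(y_true)
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Quadratic weights: penalty grows with squared distance between classes
    W = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # Expected matrix under independence of the marginals
    row = [sum(O[i]) for i in range(n_classes)]
    col = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[row[i] * col[j] / n for j in range(n_classes)] for i in range(n_classes)]
    num = sum(W[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(W[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1 - num / den

y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 1, 1, 3, 2, 0]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = math.sqrt(mse)  # Rmse is always sqrt(Mse), which is why the columns track
```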

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
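
With the linear scheduler, the learning rate decays from 2e-05 toward 0 over training. The log below implies 7 optimizer steps per epoch (step 500 falls at epoch ~71.43), i.e. 700 total steps over 100 epochs. A minimal sketch of the schedule under the assumption of zero warmup steps:

```python
# Minimal sketch of the linear decay schedule (assumes no warmup steps).
BASE_LR = 2e-05
STEPS_PER_EPOCH = 7              # inferred from the log: step 500 at epoch ~71.43
TOTAL_STEPS = 100 * STEPS_PER_EPOCH

def linear_lr(step: int) -> float:
    """Learning rate after `step` optimizer steps under linear decay to 0."""
    return BASE_LR * max(0.0, 1.0 - step / TOTAL_STEPS)

print(linear_lr(0))    # 2e-05 at the start of training
print(linear_lr(350))  # 1e-05 halfway through
```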

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.2857 2 7.6349 0.0051 7.6349 2.7631
No log 0.5714 4 6.4238 0.0142 6.4238 2.5345
No log 0.8571 6 4.9392 0.0448 4.9392 2.2224
No log 1.1429 8 3.4364 0.0 3.4364 1.8538
No log 1.4286 10 2.1667 0.1266 2.1667 1.4720
No log 1.7143 12 1.6118 0.0030 1.6118 1.2696
No log 2.0 14 1.2792 0.0380 1.2792 1.1310
No log 2.2857 16 1.1070 0.2663 1.1070 1.0521
No log 2.5714 18 1.0724 0.1418 1.0724 1.0356
No log 2.8571 20 1.0958 0.0478 1.0958 1.0468
No log 3.1429 22 1.0330 0.1516 1.0330 1.0164
No log 3.4286 24 1.0816 0.2588 1.0816 1.0400
No log 3.7143 26 1.1252 0.2150 1.1252 1.0608
No log 4.0 28 1.0623 0.2271 1.0623 1.0307
No log 4.2857 30 0.9752 0.3251 0.9752 0.9875
No log 4.5714 32 0.9120 0.2541 0.9120 0.9550
No log 4.8571 34 0.8968 0.3048 0.8968 0.9470
No log 5.1429 36 0.8831 0.3414 0.8831 0.9397
No log 5.4286 38 0.8567 0.3392 0.8567 0.9256
No log 5.7143 40 0.8457 0.3982 0.8457 0.9196
No log 6.0 42 0.8926 0.3686 0.8926 0.9448
No log 6.2857 44 0.9769 0.3149 0.9769 0.9884
No log 6.5714 46 1.0230 0.2635 1.0230 1.0114
No log 6.8571 48 0.8518 0.3667 0.8518 0.9229
No log 7.1429 50 0.9051 0.3667 0.9051 0.9514
No log 7.4286 52 0.7988 0.4373 0.7988 0.8937
No log 7.7143 54 0.7312 0.4923 0.7312 0.8551
No log 8.0 56 0.7266 0.5206 0.7266 0.8524
No log 8.2857 58 0.7361 0.5107 0.7361 0.8580
No log 8.5714 60 0.7378 0.4321 0.7378 0.8590
No log 8.8571 62 0.9032 0.4803 0.9032 0.9504
No log 9.1429 64 0.7973 0.4874 0.7973 0.8929
No log 9.4286 66 0.7123 0.5558 0.7123 0.8440
No log 9.7143 68 0.7239 0.5442 0.7239 0.8508
No log 10.0 70 0.7362 0.5740 0.7362 0.8580
No log 10.2857 72 0.6896 0.5467 0.6896 0.8304
No log 10.5714 74 0.7695 0.4838 0.7695 0.8772
No log 10.8571 76 0.7374 0.4461 0.7374 0.8587
No log 11.1429 78 0.6664 0.5112 0.6664 0.8163
No log 11.4286 80 0.8546 0.5920 0.8546 0.9244
No log 11.7143 82 0.7790 0.4907 0.7790 0.8826
No log 12.0 84 0.6955 0.5026 0.6955 0.8340
No log 12.2857 86 0.6783 0.5057 0.6783 0.8236
No log 12.5714 88 0.6583 0.4794 0.6583 0.8113
No log 12.8571 90 0.7132 0.4804 0.7132 0.8445
No log 13.1429 92 0.7860 0.4724 0.7860 0.8866
No log 13.4286 94 0.8239 0.4489 0.8239 0.9077
No log 13.7143 96 0.7915 0.4606 0.7915 0.8896
No log 14.0 98 0.7646 0.4656 0.7646 0.8744
No log 14.2857 100 0.7125 0.4611 0.7125 0.8441
No log 14.5714 102 0.6697 0.5964 0.6697 0.8184
No log 14.8571 104 0.6908 0.5919 0.6908 0.8311
No log 15.1429 106 0.7717 0.5658 0.7717 0.8785
No log 15.4286 108 0.7103 0.5798 0.7103 0.8428
No log 15.7143 110 0.7022 0.5250 0.7022 0.8380
No log 16.0 112 0.7218 0.5368 0.7218 0.8496
No log 16.2857 114 0.7592 0.6002 0.7592 0.8713
No log 16.5714 116 0.7519 0.5828 0.7519 0.8671
No log 16.8571 118 0.7629 0.5806 0.7629 0.8734
No log 17.1429 120 0.7971 0.5884 0.7971 0.8928
No log 17.4286 122 0.7689 0.5766 0.7689 0.8769
No log 17.7143 124 0.7893 0.4743 0.7893 0.8884
No log 18.0 126 0.7680 0.5252 0.7680 0.8763
No log 18.2857 128 0.7625 0.5925 0.7625 0.8732
No log 18.5714 130 0.7663 0.4948 0.7663 0.8754
No log 18.8571 132 0.7509 0.5352 0.7509 0.8665
No log 19.1429 134 0.7406 0.6104 0.7406 0.8606
No log 19.4286 136 0.7308 0.5262 0.7308 0.8549
No log 19.7143 138 0.7246 0.5263 0.7246 0.8512
No log 20.0 140 0.7269 0.5470 0.7269 0.8526
No log 20.2857 142 0.7363 0.5370 0.7363 0.8581
No log 20.5714 144 0.7136 0.5931 0.7136 0.8448
No log 20.8571 146 0.6882 0.6293 0.6882 0.8295
No log 21.1429 148 0.6641 0.6397 0.6641 0.8149
No log 21.4286 150 0.6631 0.6581 0.6631 0.8143
No log 21.7143 152 0.6577 0.5763 0.6577 0.8110
No log 22.0 154 0.7265 0.4998 0.7265 0.8524
No log 22.2857 156 0.7098 0.5312 0.7098 0.8425
No log 22.5714 158 0.6879 0.6157 0.6879 0.8294
No log 22.8571 160 0.7283 0.6189 0.7283 0.8534
No log 23.1429 162 0.6915 0.5475 0.6915 0.8316
No log 23.4286 164 0.7299 0.5102 0.7299 0.8543
No log 23.7143 166 0.7636 0.5335 0.7636 0.8739
No log 24.0 168 0.7028 0.5314 0.7028 0.8383
No log 24.2857 170 0.7000 0.5774 0.7000 0.8367
No log 24.5714 172 0.7166 0.5648 0.7166 0.8465
No log 24.8571 174 0.7152 0.5545 0.7152 0.8457
No log 25.1429 176 0.6997 0.5190 0.6997 0.8365
No log 25.4286 178 0.6949 0.5313 0.6949 0.8336
No log 25.7143 180 0.6887 0.4965 0.6887 0.8299
No log 26.0 182 0.6854 0.6138 0.6854 0.8279
No log 26.2857 184 0.6813 0.6138 0.6813 0.8254
No log 26.5714 186 0.6689 0.5178 0.6689 0.8179
No log 26.8571 188 0.6609 0.5774 0.6609 0.8129
No log 27.1429 190 0.6650 0.6073 0.6650 0.8155
No log 27.4286 192 0.6455 0.5368 0.6455 0.8034
No log 27.7143 194 0.6495 0.4968 0.6495 0.8059
No log 28.0 196 0.6630 0.5215 0.6630 0.8143
No log 28.2857 198 0.6733 0.5656 0.6733 0.8205
No log 28.5714 200 0.6652 0.5412 0.6652 0.8156
No log 28.8571 202 0.6834 0.5323 0.6834 0.8267
No log 29.1429 204 0.7106 0.5435 0.7106 0.8430
No log 29.4286 206 0.7053 0.5323 0.7053 0.8398
No log 29.7143 208 0.6706 0.5412 0.6706 0.8189
No log 30.0 210 0.6893 0.6275 0.6893 0.8302
No log 30.2857 212 0.6809 0.5944 0.6809 0.8252
No log 30.5714 214 0.6780 0.5415 0.6780 0.8234
No log 30.8571 216 0.6860 0.5412 0.6860 0.8283
No log 31.1429 218 0.6946 0.5274 0.6946 0.8334
No log 31.4286 220 0.6960 0.5131 0.6960 0.8343
No log 31.7143 222 0.6962 0.5131 0.6962 0.8344
No log 32.0 224 0.6852 0.5402 0.6852 0.8277
No log 32.2857 226 0.6751 0.5555 0.6751 0.8216
No log 32.5714 228 0.6682 0.5202 0.6682 0.8174
No log 32.8571 230 0.6608 0.5166 0.6608 0.8129
No log 33.1429 232 0.6784 0.5678 0.6784 0.8236
No log 33.4286 234 0.6902 0.6122 0.6902 0.8308
No log 33.7143 236 0.7070 0.6315 0.7070 0.8409
No log 34.0 238 0.6755 0.5582 0.6755 0.8219
No log 34.2857 240 0.6717 0.4936 0.6717 0.8195
No log 34.5714 242 0.6781 0.4927 0.6781 0.8235
No log 34.8571 244 0.6800 0.4611 0.6800 0.8246
No log 35.1429 246 0.7056 0.5098 0.7056 0.8400
No log 35.4286 248 0.7124 0.5770 0.7124 0.8441
No log 35.7143 250 0.6750 0.4856 0.6750 0.8216
No log 36.0 252 0.6860 0.6405 0.6860 0.8282
No log 36.2857 254 0.7130 0.5933 0.7130 0.8444
No log 36.5714 256 0.6911 0.5996 0.6911 0.8313
No log 36.8571 258 0.6811 0.5542 0.6811 0.8253
No log 37.1429 260 0.6965 0.4962 0.6965 0.8345
No log 37.4286 262 0.6941 0.5315 0.6941 0.8331
No log 37.7143 264 0.6912 0.5274 0.6912 0.8314
No log 38.0 266 0.7232 0.5986 0.7232 0.8504
No log 38.2857 268 0.7122 0.5024 0.7122 0.8439
No log 38.5714 270 0.7019 0.5052 0.7019 0.8378
No log 38.8571 272 0.7101 0.5076 0.7101 0.8427
No log 39.1429 274 0.7014 0.5081 0.7014 0.8375
No log 39.4286 276 0.6977 0.5606 0.6977 0.8353
No log 39.7143 278 0.7023 0.5399 0.7023 0.8380
No log 40.0 280 0.7044 0.5399 0.7044 0.8393
No log 40.2857 282 0.7081 0.5399 0.7081 0.8415
No log 40.5714 284 0.7073 0.5415 0.7073 0.8410
No log 40.8571 286 0.7088 0.4706 0.7088 0.8419
No log 41.1429 288 0.7092 0.5060 0.7092 0.8421
No log 41.4286 290 0.7104 0.5274 0.7104 0.8429
No log 41.7143 292 0.7091 0.5250 0.7091 0.8421
No log 42.0 294 0.7033 0.5466 0.7033 0.8387
No log 42.2857 296 0.6969 0.5466 0.6969 0.8348
No log 42.5714 298 0.6903 0.5563 0.6903 0.8308
No log 42.8571 300 0.6834 0.5249 0.6834 0.8267
No log 43.1429 302 0.6866 0.5261 0.6866 0.8286
No log 43.4286 304 0.6953 0.5044 0.6953 0.8339
No log 43.7143 306 0.6973 0.5368 0.6973 0.8350
No log 44.0 308 0.7044 0.5582 0.7044 0.8393
No log 44.2857 310 0.7045 0.5052 0.7045 0.8393
No log 44.5714 312 0.7172 0.4845 0.7172 0.8469
No log 44.8571 314 0.7078 0.4845 0.7078 0.8413
No log 45.1429 316 0.7163 0.5094 0.7163 0.8464
No log 45.4286 318 0.7053 0.5081 0.7053 0.8398
No log 45.7143 320 0.6894 0.6028 0.6894 0.8303
No log 46.0 322 0.6789 0.5881 0.6789 0.8240
No log 46.2857 324 0.6745 0.5485 0.6745 0.8213
No log 46.5714 326 0.6717 0.5275 0.6717 0.8196
No log 46.8571 328 0.6713 0.5287 0.6713 0.8193
No log 47.1429 330 0.6613 0.5275 0.6613 0.8132
No log 47.4286 332 0.6570 0.5275 0.6570 0.8105
No log 47.7143 334 0.6539 0.5249 0.6539 0.8086
No log 48.0 336 0.6542 0.5016 0.6542 0.8088
No log 48.2857 338 0.6552 0.5032 0.6552 0.8095
No log 48.5714 340 0.6594 0.4676 0.6594 0.8120
No log 48.8571 342 0.6702 0.4938 0.6702 0.8186
No log 49.1429 344 0.6756 0.4944 0.6756 0.8219
No log 49.4286 346 0.6715 0.5054 0.6715 0.8194
No log 49.7143 348 0.6662 0.4789 0.6662 0.8162
No log 50.0 350 0.6588 0.4918 0.6588 0.8117
No log 50.2857 352 0.6499 0.5163 0.6499 0.8062
No log 50.5714 354 0.6475 0.5163 0.6475 0.8047
No log 50.8571 356 0.6499 0.5152 0.6499 0.8061
No log 51.1429 358 0.6699 0.6291 0.6699 0.8184
No log 51.4286 360 0.6657 0.6167 0.6657 0.8159
No log 51.7143 362 0.6579 0.5275 0.6579 0.8111
No log 52.0 364 0.6896 0.5560 0.6896 0.8304
No log 52.2857 366 0.7109 0.5455 0.7109 0.8431
No log 52.5714 368 0.6913 0.5324 0.6913 0.8315
No log 52.8571 370 0.6722 0.5287 0.6722 0.8199
No log 53.1429 372 0.6747 0.5135 0.6747 0.8214
No log 53.4286 374 0.6773 0.5485 0.6773 0.8230
No log 53.7143 376 0.6823 0.5485 0.6823 0.8260
No log 54.0 378 0.6718 0.4676 0.6718 0.8197
No log 54.2857 380 0.6728 0.4804 0.6728 0.8202
No log 54.5714 382 0.6983 0.4968 0.6983 0.8356
No log 54.8571 384 0.7182 0.5463 0.7182 0.8475
No log 55.1429 386 0.6983 0.4968 0.6983 0.8356
No log 55.4286 388 0.6709 0.5060 0.6709 0.8191
No log 55.7143 390 0.6651 0.4898 0.6651 0.8155
No log 56.0 392 0.6797 0.5774 0.6797 0.8244
No log 56.2857 394 0.6763 0.5577 0.6763 0.8223
No log 56.5714 396 0.6649 0.5024 0.6649 0.8154
No log 56.8571 398 0.6815 0.4955 0.6815 0.8255
No log 57.1429 400 0.7194 0.4763 0.7194 0.8482
No log 57.4286 402 0.7158 0.4749 0.7158 0.8460
No log 57.7143 404 0.6972 0.4845 0.6972 0.8350
No log 58.0 406 0.6928 0.5169 0.6928 0.8324
No log 58.2857 408 0.6940 0.5500 0.6940 0.8331
No log 58.5714 410 0.7014 0.4842 0.7014 0.8375
No log 58.8571 412 0.7312 0.4761 0.7312 0.8551
No log 59.1429 414 0.7686 0.5350 0.7686 0.8767
No log 59.4286 416 0.7603 0.5119 0.7603 0.8719
No log 59.7143 418 0.7191 0.5094 0.7191 0.8480
No log 60.0 420 0.6909 0.5599 0.6909 0.8312
No log 60.2857 422 0.7196 0.5657 0.7196 0.8483
No log 60.5714 424 0.7434 0.5948 0.7434 0.8622
No log 60.8571 426 0.7298 0.5678 0.7298 0.8543
No log 61.1429 428 0.7105 0.5797 0.7105 0.8429
No log 61.4286 430 0.6999 0.5171 0.6999 0.8366
No log 61.7143 432 0.7071 0.4329 0.7071 0.8409
No log 62.0 434 0.7266 0.4736 0.7266 0.8524
No log 62.2857 436 0.7344 0.4860 0.7344 0.8570
No log 62.5714 438 0.7344 0.4860 0.7344 0.8570
No log 62.8571 440 0.7131 0.4594 0.7131 0.8444
No log 63.1429 442 0.7026 0.4947 0.7026 0.8382
No log 63.4286 444 0.6982 0.5054 0.6982 0.8356
No log 63.7143 446 0.6939 0.5171 0.6939 0.8330
No log 64.0 448 0.6925 0.5171 0.6925 0.8322
No log 64.2857 450 0.6912 0.5171 0.6912 0.8314
No log 64.5714 452 0.6886 0.4691 0.6886 0.8298
No log 64.8571 454 0.6849 0.4807 0.6849 0.8276
No log 65.1429 456 0.6806 0.5288 0.6806 0.8250
No log 65.4286 458 0.6782 0.5288 0.6782 0.8235
No log 65.7143 460 0.6760 0.5288 0.6760 0.8222
No log 66.0 462 0.6758 0.5156 0.6758 0.8221
No log 66.2857 464 0.6790 0.5391 0.6790 0.8240
No log 66.5714 466 0.6848 0.5597 0.6848 0.8275
No log 66.8571 468 0.6845 0.5618 0.6845 0.8274
No log 67.1429 470 0.6803 0.5391 0.6803 0.8248
No log 67.4286 472 0.6762 0.5522 0.6762 0.8223
No log 67.7143 474 0.6767 0.5536 0.6767 0.8226
No log 68.0 476 0.6839 0.5530 0.6839 0.8270
No log 68.2857 478 0.6915 0.4856 0.6915 0.8316
No log 68.5714 480 0.6930 0.4947 0.6930 0.8325
No log 68.8571 482 0.6961 0.5301 0.6961 0.8343
No log 69.1429 484 0.7021 0.5040 0.7021 0.8379
No log 69.4286 486 0.7065 0.5274 0.7065 0.8406
No log 69.7143 488 0.7080 0.5179 0.7080 0.8414
No log 70.0 490 0.7138 0.5192 0.7138 0.8449
No log 70.2857 492 0.7163 0.4866 0.7163 0.8463
No log 70.5714 494 0.7100 0.5190 0.7100 0.8426
No log 70.8571 496 0.7046 0.4918 0.7046 0.8394
No log 71.1429 498 0.7120 0.5125 0.7120 0.8438
0.3256 71.4286 500 0.7148 0.5125 0.7148 0.8454
0.3256 71.7143 502 0.7109 0.5125 0.7109 0.8431
0.3256 72.0 504 0.7062 0.5016 0.7062 0.8404
0.3256 72.2857 506 0.7033 0.4691 0.7033 0.8387
0.3256 72.5714 508 0.7078 0.5190 0.7078 0.8413
0.3256 72.8571 510 0.7140 0.5190 0.7140 0.8450
0.3256 73.1429 512 0.7162 0.4845 0.7162 0.8463
0.3256 73.4286 514 0.7051 0.5190 0.7051 0.8397
0.3256 73.7143 516 0.6973 0.4807 0.6973 0.8351
0.3256 74.0 518 0.6995 0.5156 0.6995 0.8364
0.3256 74.2857 520 0.7013 0.5146 0.7013 0.8374
0.3256 74.5714 522 0.7002 0.5040 0.7002 0.8368
0.3256 74.8571 524 0.7010 0.4947 0.7010 0.8373
0.3256 75.1429 526 0.7095 0.5190 0.7095 0.8423
0.3256 75.4286 528 0.7233 0.4845 0.7233 0.8505
0.3256 75.7143 530 0.7335 0.4866 0.7335 0.8564
0.3256 76.0 532 0.7347 0.4866 0.7347 0.8571
0.3256 76.2857 534 0.7231 0.5190 0.7231 0.8504
0.3256 76.5714 536 0.7142 0.5190 0.7142 0.8451
0.3256 76.8571 538 0.7058 0.4819 0.7058 0.8401
0.3256 77.1429 540 0.7039 0.5288 0.7039 0.8390
0.3256 77.4286 542 0.7014 0.5024 0.7014 0.8375
0.3256 77.7143 544 0.6964 0.5261 0.6964 0.8345
0.3256 78.0 546 0.6946 0.5287 0.6946 0.8334
0.3256 78.2857 548 0.6995 0.5190 0.6995 0.8363
0.3256 78.5714 550 0.7061 0.4856 0.7061 0.8403
0.3256 78.8571 552 0.7054 0.5190 0.7054 0.8399
0.3256 79.1429 554 0.6967 0.5179 0.6967 0.8347
0.3256 79.4286 556 0.6933 0.5287 0.6933 0.8326
0.3256 79.7143 558 0.6924 0.5287 0.6924 0.8321
0.3256 80.0 560 0.6941 0.5287 0.6941 0.8331
0.3256 80.2857 562 0.6937 0.5052 0.6937 0.8329
0.3256 80.5714 564 0.6963 0.5060 0.6963 0.8344
0.3256 80.8571 566 0.7034 0.5060 0.7034 0.8387
0.3256 81.1429 568 0.7115 0.4947 0.7115 0.8435
0.3256 81.4286 570 0.7144 0.4691 0.7144 0.8452
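
Note that validation Qwk peaks mid-training (0.6581 at epoch ~21.4) and drifts down afterwards, while the final checkpoint reports 0.4691; selecting the checkpoint by best Qwk rather than final epoch may be preferable. A sketch of that selection over a log like the one above, using a small excerpt of (epoch, step, validation loss, Qwk) rows copied from the table:

```python
# Pick the evaluation step with the highest Qwk from (epoch, step, loss, qwk)
# tuples; the rows below are a small excerpt from the training table above.
rows = [
    (8.0,      56, 0.7266, 0.5206),
    (21.4286, 150, 0.6631, 0.6581),
    (27.4286, 192, 0.6455, 0.5368),
    (81.4286, 570, 0.7144, 0.4691),
]
best = max(rows, key=lambda r: r[3])  # maximize Qwk, the 4th field
print(f"best Qwk {best[3]} at epoch {best[0]} (step {best[1]})")
```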

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
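
To reproduce this environment, the versions above can be pinned directly (the second line assumes a CUDA 11.8 setup, matching the `+cu118` build tag; adjust the index URL for a different CUDA version or CPU-only installs):

```shell
# Pin the exact framework versions listed above.
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
# The +cu118 build of PyTorch comes from the dedicated PyTorch wheel index.
pip install "torch==2.4.0+cu118" --index-url https://download.pytorch.org/whl/cu118
```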
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32
Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02