ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k8_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7549
  • Qwk (quadratic weighted kappa): 0.6906
  • Mse (mean squared error): 0.7549
  • Rmse (root mean squared error): 0.8689
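
Since the reported Loss and Mse coincide, the run presumably optimized a mean-squared-error (regression) objective, though the card does not state this. For reference, a minimal sketch of how these metrics are conventionally computed with scikit-learn; the arrays below are illustrative stand-ins, not the actual evaluation data:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative gold scores and predictions on an ordinal scale;
# the real evaluation data for this run is not published.
y_true = np.array([0, 1, 2, 3, 2, 1])
y_pred = np.array([0, 2, 2, 3, 1, 1])

mse = mean_squared_error(y_true, y_pred)
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {np.sqrt(mse):.4f}")
```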

Model description

More information needed

Intended uses & limitations

More information needed
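
Until fuller documentation is added, the checkpoint can be loaded like any other Hugging Face model. A minimal sketch follows; the single-output regression head is an assumption inferred from the Mse/Qwk metrics above, not something this card confirms:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = (
    "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
    "FineTuningAraBERT_run3_AugV5_k8_task1_organization"
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Placeholder essay text; the model expects Arabic input.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze()
print(score.item())
```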

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
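
These settings map one-to-one onto transformers' TrainingArguments; a sketch of an equivalent configuration is below. The output directory is a placeholder, and the surrounding Trainer setup and data pipeline are not part of this card:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="arabert-task1-organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```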

Training results

Each row below is one evaluation pass: the running training loss (logged only every 500 steps, hence "No log" until the first value, 0.4299, at step 500), the epoch, the global step, and the validation loss, Qwk, Mse, and Rmse at that point. A sketch of a metric hook that would produce these columns follows the table.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0333 2 6.4458 0.0311 6.4458 2.5389
No log 0.0667 4 5.2403 0.0213 5.2403 2.2892
No log 0.1 6 3.1891 0.0702 3.1891 1.7858
No log 0.1333 8 2.4713 0.0000 2.4713 1.5720
No log 0.1667 10 2.1078 0.1270 2.1078 1.4518
No log 0.2 12 2.2669 0.1353 2.2669 1.5056
No log 0.2333 14 1.7375 0.1651 1.7375 1.3181
No log 0.2667 16 1.7217 0.1165 1.7217 1.3121
No log 0.3 18 1.6632 0.1165 1.6632 1.2896
No log 0.3333 20 1.4983 0.1538 1.4983 1.2240
No log 0.3667 22 1.5327 0.1538 1.5327 1.2380
No log 0.4 24 1.4640 0.2037 1.4640 1.2100
No log 0.4333 26 1.3891 0.2807 1.3891 1.1786
No log 0.4667 28 1.2519 0.4603 1.2519 1.1189
No log 0.5 30 1.1146 0.5414 1.1146 1.0557
No log 0.5333 32 1.0152 0.5606 1.0152 1.0076
No log 0.5667 34 0.9852 0.6331 0.9852 0.9926
No log 0.6 36 1.0506 0.5909 1.0506 1.0250
No log 0.6333 38 1.0304 0.6015 1.0304 1.0151
No log 0.6667 40 1.2139 0.5649 1.2139 1.1018
No log 0.7 42 1.0824 0.5581 1.0824 1.0404
No log 0.7333 44 0.9995 0.5354 0.9995 0.9997
No log 0.7667 46 0.9822 0.6000 0.9822 0.9911
No log 0.8 48 0.9238 0.6107 0.9238 0.9611
No log 0.8333 50 1.0852 0.5354 1.0852 1.0418
No log 0.8667 52 1.1410 0.5079 1.1410 1.0682
No log 0.9 54 1.1844 0.4960 1.1844 1.0883
No log 0.9333 56 0.9069 0.6377 0.9069 0.9523
No log 0.9667 58 0.9927 0.5821 0.9927 0.9963
No log 1.0 60 1.2821 0.4640 1.2821 1.1323
No log 1.0333 62 1.1852 0.5000 1.1852 1.0887
No log 1.0667 64 1.0493 0.5397 1.0493 1.0244
No log 1.1 66 0.7859 0.6957 0.7859 0.8865
No log 1.1333 68 0.6933 0.7083 0.6933 0.8327
No log 1.1667 70 0.8755 0.6269 0.8755 0.9357
No log 1.2 72 1.3061 0.5286 1.3061 1.1428
No log 1.2333 74 1.5384 0.4400 1.5384 1.2403
No log 1.2667 76 1.5402 0.4965 1.5402 1.2410
No log 1.3 78 1.5310 0.4416 1.5310 1.2373
No log 1.3333 80 1.3297 0.5535 1.3297 1.1531
No log 1.3667 82 0.9150 0.6620 0.9150 0.9566
No log 1.4 84 0.6771 0.7162 0.6771 0.8228
No log 1.4333 86 0.7701 0.7083 0.7701 0.8776
No log 1.4667 88 1.0184 0.5414 1.0184 1.0091
No log 1.5 90 1.1107 0.5373 1.1107 1.0539
No log 1.5333 92 1.0611 0.5734 1.0611 1.0301
No log 1.5667 94 1.0411 0.5793 1.0411 1.0204
No log 1.6 96 0.9478 0.6968 0.9478 0.9736
No log 1.6333 98 0.8286 0.7067 0.8286 0.9103
No log 1.6667 100 0.9265 0.6883 0.9265 0.9625
No log 1.7 102 1.0383 0.6234 1.0383 1.0190
No log 1.7333 104 1.2574 0.5750 1.2574 1.1213
No log 1.7667 106 1.3693 0.5976 1.3693 1.1702
No log 1.8 108 1.2319 0.6108 1.2319 1.1099
No log 1.8333 110 1.0019 0.6329 1.0019 1.0009
No log 1.8667 112 0.8941 0.6933 0.8941 0.9456
No log 1.9 114 0.9290 0.6622 0.9290 0.9638
No log 1.9333 116 0.8796 0.6241 0.8796 0.9379
No log 1.9667 118 0.7723 0.7397 0.7723 0.8788
No log 2.0 120 0.7733 0.7222 0.7733 0.8794
No log 2.0333 122 0.7493 0.7042 0.7493 0.8656
No log 2.0667 124 0.8216 0.6269 0.8216 0.9064
No log 2.1 126 1.0967 0.5850 1.0967 1.0472
No log 2.1333 128 1.2454 0.5638 1.2454 1.1160
No log 2.1667 130 1.0402 0.5972 1.0402 1.0199
No log 2.2 132 0.8929 0.6667 0.8929 0.9449
No log 2.2333 134 0.9187 0.6522 0.9187 0.9585
No log 2.2667 136 1.0705 0.5816 1.0705 1.0347
No log 2.3 138 1.1664 0.5571 1.1664 1.0800
No log 2.3333 140 1.0201 0.5755 1.0201 1.0100
No log 2.3667 142 0.8454 0.6713 0.8454 0.9194
No log 2.4 144 0.7091 0.7564 0.7091 0.8421
No log 2.4333 146 0.7028 0.7879 0.7028 0.8384
No log 2.4667 148 0.7961 0.7683 0.7961 0.8923
No log 2.5 150 0.8723 0.7000 0.8723 0.9340
No log 2.5333 152 0.8473 0.6842 0.8473 0.9205
No log 2.5667 154 0.7070 0.7785 0.7070 0.8408
No log 2.6 156 0.7259 0.6806 0.7259 0.8520
No log 2.6333 158 0.7198 0.6944 0.7198 0.8484
No log 2.6667 160 0.7696 0.7143 0.7696 0.8773
No log 2.7 162 0.8727 0.6797 0.8727 0.9342
No log 2.7333 164 0.8357 0.7205 0.8357 0.9142
No log 2.7667 166 0.6597 0.7853 0.6597 0.8122
No log 2.8 168 0.6506 0.7821 0.6506 0.8066
No log 2.8333 170 0.7528 0.7237 0.7528 0.8677
No log 2.8667 172 0.7659 0.7101 0.7659 0.8751
No log 2.9 174 0.8590 0.6519 0.8590 0.9268
No log 2.9333 176 0.8515 0.6619 0.8515 0.9228
No log 2.9667 178 0.9234 0.6667 0.9234 0.9609
No log 3.0 180 1.0109 0.6211 1.0109 1.0054
No log 3.0333 182 1.1100 0.5977 1.1100 1.0535
No log 3.0667 184 1.0887 0.6433 1.0887 1.0434
No log 3.1 186 0.9764 0.6550 0.9764 0.9882
No log 3.1333 188 0.9420 0.6628 0.9420 0.9706
No log 3.1667 190 0.8169 0.7000 0.8169 0.9038
No log 3.2 192 0.7395 0.6944 0.7395 0.8600
No log 3.2333 194 0.7557 0.6795 0.7557 0.8693
No log 3.2667 196 0.8872 0.7030 0.8872 0.9419
No log 3.3 198 0.9802 0.6747 0.9802 0.9901
No log 3.3333 200 0.8407 0.6792 0.8407 0.9169
No log 3.3667 202 0.7158 0.7162 0.7158 0.8460
No log 3.4 204 0.7807 0.6809 0.7807 0.8836
No log 3.4333 206 0.8442 0.6939 0.8442 0.9188
No log 3.4667 208 0.7105 0.7517 0.7105 0.8429
No log 3.5 210 0.5947 0.7838 0.5947 0.7711
No log 3.5333 212 0.5929 0.8027 0.5929 0.7700
No log 3.5667 214 0.6467 0.7571 0.6467 0.8042
No log 3.6 216 0.7937 0.6711 0.7937 0.8909
No log 3.6333 218 1.0730 0.6228 1.0730 1.0359
No log 3.6667 220 1.2208 0.5697 1.2208 1.1049
No log 3.7 222 1.0573 0.5695 1.0573 1.0282
No log 3.7333 224 0.8248 0.6763 0.8248 0.9082
No log 3.7667 226 0.7552 0.7259 0.7552 0.8690
No log 3.8 228 0.6919 0.7338 0.6919 0.8318
No log 3.8333 230 0.7512 0.7067 0.7512 0.8667
No log 3.8667 232 0.8442 0.6486 0.8442 0.9188
No log 3.9 234 1.0302 0.5811 1.0302 1.0150
No log 3.9333 236 0.9678 0.5850 0.9678 0.9837
No log 3.9667 238 0.7216 0.7143 0.7216 0.8495
No log 4.0 240 0.6571 0.7832 0.6571 0.8106
No log 4.0333 242 0.6608 0.7500 0.6608 0.8129
No log 4.0667 244 0.8385 0.6712 0.8385 0.9157
No log 4.1 246 1.1961 0.5912 1.1961 1.0937
No log 4.1333 248 1.2173 0.5789 1.2173 1.1033
No log 4.1667 250 0.9778 0.5890 0.9778 0.9888
No log 4.2 252 0.7972 0.6861 0.7972 0.8928
No log 4.2333 254 0.6960 0.7571 0.6960 0.8343
No log 4.2667 256 0.6460 0.7483 0.6460 0.8037
No log 4.3 258 0.7015 0.7600 0.7015 0.8376
No log 4.3333 260 0.8154 0.6842 0.8154 0.9030
No log 4.3667 262 0.9145 0.6099 0.9145 0.9563
No log 4.4 264 0.8518 0.6861 0.8518 0.9229
No log 4.4333 266 0.7543 0.7413 0.7543 0.8685
No log 4.4667 268 0.7347 0.7123 0.7347 0.8571
No log 4.5 270 0.8918 0.6624 0.8918 0.9443
No log 4.5333 272 0.9547 0.6536 0.9547 0.9771
No log 4.5667 274 0.9852 0.6225 0.9852 0.9926
No log 4.6 276 0.9098 0.6395 0.9098 0.9538
No log 4.6333 278 0.9010 0.6395 0.9010 0.9492
No log 4.6667 280 0.8640 0.6803 0.8640 0.9295
No log 4.7 282 0.7759 0.7123 0.7759 0.8808
No log 4.7333 284 0.8376 0.6892 0.8376 0.9152
No log 4.7667 286 1.1018 0.5605 1.1018 1.0497
No log 4.8 288 1.2085 0.5696 1.2085 1.0993
No log 4.8333 290 1.0398 0.6000 1.0398 1.0197
No log 4.8667 292 0.8677 0.7015 0.8677 0.9315
No log 4.9 294 0.7983 0.7164 0.7983 0.8935
No log 4.9333 296 0.7517 0.7482 0.7517 0.8670
No log 4.9667 298 0.7466 0.7482 0.7466 0.8640
No log 5.0 300 0.9160 0.6259 0.9160 0.9571
No log 5.0333 302 1.1475 0.5605 1.1475 1.0712
No log 5.0667 304 1.1489 0.5605 1.1489 1.0719
No log 5.1 306 0.9561 0.5931 0.9561 0.9778
No log 5.1333 308 0.7536 0.7391 0.7536 0.8681
No log 5.1667 310 0.6982 0.7445 0.6982 0.8356
No log 5.2 312 0.7386 0.7286 0.7386 0.8594
No log 5.2333 314 0.8191 0.6901 0.8191 0.9050
No log 5.2667 316 0.8858 0.6040 0.8858 0.9412
No log 5.3 318 0.9075 0.6242 0.9075 0.9527
No log 5.3333 320 0.8716 0.6323 0.8716 0.9336
No log 5.3667 322 0.8503 0.6667 0.8503 0.9221
No log 5.4 324 0.8202 0.6853 0.8202 0.9057
No log 5.4333 326 0.8065 0.7172 0.8065 0.8981
No log 5.4667 328 0.7706 0.7483 0.7706 0.8779
No log 5.5 330 0.7601 0.7417 0.7601 0.8718
No log 5.5333 332 0.8126 0.7027 0.8126 0.9015
No log 5.5667 334 0.8992 0.6395 0.8992 0.9483
No log 5.6 336 0.9252 0.6712 0.9252 0.9619
No log 5.6333 338 0.9172 0.6429 0.9172 0.9577
No log 5.6667 340 0.9091 0.6069 0.9091 0.9535
No log 5.7 342 0.8997 0.6194 0.8997 0.9485
No log 5.7333 344 0.7518 0.7261 0.7518 0.8671
No log 5.7667 346 0.6611 0.7712 0.6611 0.8131
No log 5.8 348 0.6426 0.7925 0.6426 0.8016
No log 5.8333 350 0.6661 0.8024 0.6661 0.8161
No log 5.8667 352 0.6798 0.7805 0.6798 0.8245
No log 5.9 354 0.6559 0.7673 0.6559 0.8099
No log 5.9333 356 0.6834 0.7742 0.6834 0.8267
No log 5.9667 358 0.7274 0.7383 0.7274 0.8529
No log 6.0 360 0.7818 0.7206 0.7818 0.8842
No log 6.0333 362 0.8537 0.6718 0.8537 0.9239
No log 6.0667 364 0.9244 0.6562 0.9244 0.9615
No log 6.1 366 0.9904 0.6357 0.9904 0.9952
No log 6.1333 368 0.9587 0.6165 0.9587 0.9792
No log 6.1667 370 0.9167 0.6412 0.9167 0.9574
No log 6.2 372 0.8885 0.6412 0.8885 0.9426
No log 6.2333 374 0.8530 0.6519 0.8530 0.9236
No log 6.2667 376 0.8130 0.6667 0.8130 0.9017
No log 6.3 378 0.8351 0.6577 0.8351 0.9138
No log 6.3333 380 0.7981 0.6897 0.7981 0.8934
No log 6.3667 382 0.7203 0.7417 0.7203 0.8487
No log 6.4 384 0.7146 0.7600 0.7146 0.8454
No log 6.4333 386 0.6955 0.7413 0.6955 0.8340
No log 6.4667 388 0.7111 0.7397 0.7111 0.8433
No log 6.5 390 0.7717 0.6849 0.7717 0.8785
No log 6.5333 392 0.8176 0.6497 0.8176 0.9042
No log 6.5667 394 0.7291 0.7421 0.7291 0.8539
No log 6.6 396 0.6460 0.7763 0.6460 0.8037
No log 6.6333 398 0.6500 0.7763 0.6500 0.8062
No log 6.6667 400 0.6608 0.7682 0.6608 0.8129
No log 6.7 402 0.6514 0.7671 0.6514 0.8071
No log 6.7333 404 0.6583 0.7550 0.6583 0.8114
No log 6.7667 406 0.7582 0.7484 0.7582 0.8707
No log 6.8 408 0.8526 0.6621 0.8526 0.9234
No log 6.8333 410 0.8364 0.6667 0.8364 0.9145
No log 6.8667 412 0.8102 0.6565 0.8102 0.9001
No log 6.9 414 0.8239 0.6565 0.8239 0.9077
No log 6.9333 416 0.8676 0.6715 0.8676 0.9314
No log 6.9667 418 0.9089 0.5926 0.9089 0.9534
No log 7.0 420 0.9597 0.5986 0.9597 0.9796
No log 7.0333 422 0.8779 0.6418 0.8779 0.9370
No log 7.0667 424 0.7761 0.6849 0.7761 0.8809
No log 7.1 426 0.6970 0.7248 0.6970 0.8349
No log 7.1333 428 0.6910 0.7484 0.6910 0.8312
No log 7.1667 430 0.7610 0.6892 0.7610 0.8724
No log 7.2 432 0.8638 0.6748 0.8638 0.9294
No log 7.2333 434 1.0731 0.6316 1.0731 1.0359
No log 7.2667 436 1.1077 0.6190 1.1077 1.0525
No log 7.3 438 0.9503 0.5946 0.9503 0.9748
No log 7.3333 440 0.8501 0.6757 0.8501 0.9220
No log 7.3667 442 0.7060 0.7347 0.7060 0.8402
No log 7.4 444 0.6221 0.7613 0.6221 0.7887
No log 7.4333 446 0.6377 0.7722 0.6377 0.7986
No log 7.4667 448 0.7275 0.7453 0.7275 0.8529
No log 7.5 450 0.8774 0.6667 0.8774 0.9367
No log 7.5333 452 0.9613 0.6708 0.9613 0.9805
No log 7.5667 454 0.8673 0.6752 0.8673 0.9313
No log 7.6 456 0.7547 0.6711 0.7547 0.8687
No log 7.6333 458 0.6779 0.7324 0.6779 0.8233
No log 7.6667 460 0.6461 0.7273 0.6461 0.8038
No log 7.7 462 0.6753 0.7448 0.6753 0.8218
No log 7.7333 464 0.7349 0.7261 0.7349 0.8572
No log 7.7667 466 0.7740 0.7160 0.7740 0.8798
No log 7.8 468 0.6798 0.7531 0.6798 0.8245
No log 7.8333 470 0.6001 0.7925 0.6001 0.7747
No log 7.8667 472 0.6148 0.7925 0.6148 0.7841
No log 7.9 474 0.6643 0.7532 0.6643 0.8150
No log 7.9333 476 0.7813 0.6939 0.7813 0.8839
No log 7.9667 478 0.8203 0.6806 0.8203 0.9057
No log 8.0 480 0.7791 0.6812 0.7791 0.8827
No log 8.0333 482 0.7338 0.6901 0.7338 0.8566
No log 8.0667 484 0.7290 0.7152 0.7290 0.8538
No log 8.1 486 0.7625 0.7044 0.7625 0.8732
No log 8.1333 488 0.8576 0.6710 0.8576 0.9260
No log 8.1667 490 0.8798 0.6443 0.8798 0.9380
No log 8.2 492 0.8741 0.6277 0.8741 0.9349
No log 8.2333 494 0.8935 0.6286 0.8935 0.9452
No log 8.2667 496 0.7912 0.6667 0.7912 0.8895
No log 8.3 498 0.6747 0.7297 0.6747 0.8214
0.4299 8.3333 500 0.6621 0.7403 0.6621 0.8137
0.4299 8.3667 502 0.7121 0.7329 0.7121 0.8439
0.4299 8.4 504 0.7788 0.7574 0.7788 0.8825
0.4299 8.4333 506 0.9085 0.6707 0.9085 0.9531
0.4299 8.4667 508 0.9072 0.6536 0.9072 0.9525
0.4299 8.5 510 0.8310 0.6370 0.8310 0.9116
0.4299 8.5333 512 0.7610 0.7246 0.7610 0.8724
0.4299 8.5667 514 0.7549 0.6906 0.7549 0.8689
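
A compute_metrics hook of roughly this shape, passed to transformers.Trainer, would emit the Qwk/Mse/Rmse columns above. This is a sketch: the original training script is not published, and the rounding step assumes integer gold scores.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    # eval_pred is the (predictions, label_ids) pair Trainer passes in.
    preds, labels = eval_pred
    preds = np.asarray(preds).squeeze()
    labels = np.asarray(labels).squeeze()
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),  # assumes integer gold scores
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```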

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k8_task1_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02