ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7494
  • QWK (quadratic weighted kappa): 0.1311
  • MSE: 0.7494
  • RMSE: 0.8657
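
QWK measures agreement between predicted and true ordinal scores, penalising disagreements by the squared distance between classes, so a value of 0.1311 indicates agreement only slightly above chance. Note also that RMSE is simply the square root of MSE (√0.7494 ≈ 0.8657), which is why those two rows track each other exactly. A minimal pure-Python sketch of the metric (not the training script's actual implementation):

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: 1 - (weighted observed disagreement /
    weighted expected disagreement under independent marginals)."""
    n = len(y_true)
    # Observed confusion matrix
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Marginal histograms for the expected matrix
    hist_t = Counter(y_true)
    hist_p = Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic weight
            e = hist_t[i] * hist_p[j] / n             # expected count
            num += w * obs[i][j]
            den += w * e
    return 1.0 - num / den

# Perfect agreement yields QWK = 1.0
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # 1.0
```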

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
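
The listed hyperparameters correspond to the following `transformers.TrainingArguments` setup. This is a hypothetical reconstruction (the actual training script, dataset, and output path are not published; `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; the real training script is not released.
args = TrainingArguments(
    output_dir="arabert-task3-organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```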

Training results

The training-loss column reads "No log" up to step 498 because the trainer logs training loss only every 500 steps; the first logged value (0.3466) appears at step 500.

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0227 2 4.0863 0.0 4.0863 2.0215
No log 0.0455 4 2.0392 -0.0713 2.0392 1.4280
No log 0.0682 6 1.0566 -0.0435 1.0566 1.0279
No log 0.0909 8 1.2879 0.0119 1.2879 1.1348
No log 0.1136 10 0.8865 -0.0122 0.8865 0.9416
No log 0.1364 12 0.6814 0.0506 0.6814 0.8255
No log 0.1591 14 0.6958 0.0555 0.6958 0.8342
No log 0.1818 16 0.7131 0.0334 0.7131 0.8444
No log 0.2045 18 0.9230 0.0676 0.9230 0.9607
No log 0.2273 20 1.5022 0.0948 1.5022 1.2256
No log 0.25 22 0.8622 0.0476 0.8622 0.9285
No log 0.2727 24 0.7854 -0.0131 0.7854 0.8863
No log 0.2955 26 0.8031 -0.1239 0.8031 0.8962
No log 0.3182 28 0.7929 -0.0725 0.7929 0.8904
No log 0.3409 30 0.7639 0.0857 0.7639 0.8740
No log 0.3636 32 1.5901 0.1017 1.5901 1.2610
No log 0.3864 34 2.2862 0.0612 2.2862 1.5120
No log 0.4091 36 1.3343 0.0868 1.3343 1.1551
No log 0.4318 38 0.6708 0.0909 0.6708 0.8190
No log 0.4545 40 0.7894 -0.0939 0.7894 0.8885
No log 0.4773 42 0.7741 -0.1001 0.7741 0.8799
No log 0.5 44 0.7035 0.0318 0.7035 0.8387
No log 0.5227 46 1.2314 0.1026 1.2314 1.1097
No log 0.5455 48 1.5465 0.0635 1.5465 1.2436
No log 0.5682 50 1.1429 0.0772 1.1429 1.0691
No log 0.5909 52 0.7190 0.1691 0.7190 0.8479
No log 0.6136 54 0.7734 0.0944 0.7734 0.8794
No log 0.6364 56 0.8063 0.0518 0.8063 0.8979
No log 0.6591 58 0.8472 0.1783 0.8472 0.9204
No log 0.6818 60 1.0559 0.0472 1.0559 1.0276
No log 0.7045 62 1.3027 0.0778 1.3027 1.1413
No log 0.7273 64 1.1785 0.1388 1.1785 1.0856
No log 0.75 66 1.1533 0.2563 1.1533 1.0739
No log 0.7727 68 1.1262 0.2192 1.1262 1.0612
No log 0.7955 70 1.0928 0.2208 1.0928 1.0454
No log 0.8182 72 1.0419 0.1401 1.0419 1.0208
No log 0.8409 74 0.9828 0.1352 0.9828 0.9914
No log 0.8636 76 0.8971 0.2099 0.8971 0.9471
No log 0.8864 78 0.8659 0.0262 0.8659 0.9306
No log 0.9091 80 0.7824 0.1525 0.7824 0.8845
No log 0.9318 82 0.7287 -0.0030 0.7287 0.8536
No log 0.9545 84 0.7465 0.1740 0.7465 0.8640
No log 0.9773 86 0.7995 0.1475 0.7995 0.8942
No log 1.0 88 0.9773 -0.0085 0.9773 0.9886
No log 1.0227 90 0.9228 0.1495 0.9228 0.9607
No log 1.0455 92 0.8550 0.2030 0.8550 0.9246
No log 1.0682 94 0.9199 0.1489 0.9199 0.9591
No log 1.0909 96 0.9714 0.1149 0.9714 0.9856
No log 1.1136 98 0.9181 0.2252 0.9181 0.9582
No log 1.1364 100 1.0763 0.1507 1.0763 1.0374
No log 1.1591 102 0.9930 0.0950 0.9930 0.9965
No log 1.1818 104 0.8035 0.1617 0.8035 0.8964
No log 1.2045 106 0.7500 0.0846 0.7500 0.8660
No log 1.2273 108 0.7176 -0.0027 0.7176 0.8471
No log 1.25 110 0.7132 -0.0086 0.7132 0.8445
No log 1.2727 112 0.7436 -0.0506 0.7436 0.8623
No log 1.2955 114 0.7945 0.0660 0.7945 0.8914
No log 1.3182 116 1.0621 -0.0142 1.0621 1.0306
No log 1.3409 118 0.9674 -0.0327 0.9674 0.9835
No log 1.3636 120 0.9884 0.0426 0.9884 0.9942
No log 1.3864 122 1.1141 0.0558 1.1141 1.0555
No log 1.4091 124 0.8830 0.0549 0.8830 0.9397
No log 1.4318 126 0.8636 0.0611 0.8636 0.9293
No log 1.4545 128 0.8335 0.0205 0.8335 0.9130
No log 1.4773 130 0.8192 0.1187 0.8192 0.9051
No log 1.5 132 0.8182 0.1139 0.8182 0.9045
No log 1.5227 134 0.8261 0.0679 0.8261 0.9089
No log 1.5455 136 0.8538 0.2118 0.8538 0.9240
No log 1.5682 138 0.9130 0.2039 0.9130 0.9555
No log 1.5909 140 0.8949 0.2203 0.8949 0.9460
No log 1.6136 142 0.9880 0.0752 0.9880 0.9940
No log 1.6364 144 0.8172 0.0898 0.8172 0.9040
No log 1.6591 146 0.8220 0.1004 0.8220 0.9066
No log 1.6818 148 0.8288 0.0905 0.8288 0.9104
No log 1.7045 150 0.9531 0.0741 0.9531 0.9763
No log 1.7273 152 0.8874 0.0919 0.8874 0.9420
No log 1.75 154 0.8743 0.0927 0.8743 0.9350
No log 1.7727 156 0.8498 0.0597 0.8498 0.9218
No log 1.7955 158 0.8957 0.1252 0.8957 0.9464
No log 1.8182 160 1.1757 0.1076 1.1757 1.0843
No log 1.8409 162 1.4126 0.0608 1.4126 1.1885
No log 1.8636 164 1.1115 0.0802 1.1115 1.0543
No log 1.8864 166 0.8273 0.0341 0.8273 0.9096
No log 1.9091 168 1.0587 0.0824 1.0587 1.0289
No log 1.9318 170 1.0510 0.0799 1.0510 1.0252
No log 1.9545 172 0.8871 0.0344 0.8871 0.9418
No log 1.9773 174 0.9392 0.1544 0.9392 0.9691
No log 2.0 176 1.0586 0.0957 1.0586 1.0289
No log 2.0227 178 0.9837 0.1208 0.9837 0.9918
No log 2.0455 180 0.9412 0.2596 0.9412 0.9702
No log 2.0682 182 0.9487 0.0430 0.9487 0.9740
No log 2.0909 184 0.8878 0.0688 0.8878 0.9423
No log 2.1136 186 1.0080 0.1235 1.0080 1.0040
No log 2.1364 188 1.2458 0.0994 1.2458 1.1161
No log 2.1591 190 1.1961 0.1228 1.1961 1.0937
No log 2.1818 192 1.0071 0.0639 1.0071 1.0035
No log 2.2045 194 1.2055 0.0634 1.2055 1.0980
No log 2.2273 196 1.0132 0.0529 1.0132 1.0066
No log 2.25 198 0.9441 0.1832 0.9441 0.9716
No log 2.2727 200 1.0728 0.1265 1.0728 1.0358
No log 2.2955 202 0.9103 0.0659 0.9103 0.9541
No log 2.3182 204 0.8434 0.1440 0.8434 0.9184
No log 2.3409 206 0.8605 0.1742 0.8605 0.9276
No log 2.3636 208 0.9108 0.1065 0.9108 0.9544
No log 2.3864 210 0.8260 0.0308 0.8260 0.9088
No log 2.4091 212 0.8853 0.0643 0.8853 0.9409
No log 2.4318 214 0.8558 0.0248 0.8558 0.9251
No log 2.4545 216 0.7633 0.0918 0.7633 0.8736
No log 2.4773 218 0.7298 0.0863 0.7298 0.8543
No log 2.5 220 0.7330 0.1311 0.7330 0.8561
No log 2.5227 222 0.8036 0.1742 0.8036 0.8964
No log 2.5455 224 0.7417 0.1311 0.7417 0.8612
No log 2.5682 226 0.8323 0.0682 0.8323 0.9123
No log 2.5909 228 0.8846 0.1479 0.8846 0.9405
No log 2.6136 230 0.7711 0.0058 0.7711 0.8781
No log 2.6364 232 0.7320 0.1311 0.7320 0.8556
No log 2.6591 234 0.7304 0.1311 0.7304 0.8547
No log 2.6818 236 0.7349 0.0869 0.7349 0.8572
No log 2.7045 238 0.8040 -0.0217 0.8040 0.8967
No log 2.7273 240 0.8242 0.1294 0.8242 0.9078
No log 2.75 242 0.8298 0.2031 0.8298 0.9110
No log 2.7727 244 0.7507 0.0834 0.7507 0.8664
No log 2.7955 246 0.7506 0.0749 0.7506 0.8664
No log 2.8182 248 0.8102 0.0393 0.8102 0.9001
No log 2.8409 250 0.8495 0.1996 0.8495 0.9217
No log 2.8636 252 0.7914 0.0376 0.7914 0.8896
No log 2.8864 254 0.8114 0.0341 0.8114 0.9008
No log 2.9091 256 0.8977 0.1635 0.8977 0.9475
No log 2.9318 258 1.1136 0.1290 1.1136 1.0553
No log 2.9545 260 1.0095 0.0985 1.0095 1.0047
No log 2.9773 262 0.8465 0.0376 0.8465 0.9201
No log 3.0 264 0.8652 0.2311 0.8652 0.9302
No log 3.0227 266 0.8048 0.1146 0.8048 0.8971
No log 3.0455 268 0.7848 -0.0056 0.7848 0.8859
No log 3.0682 270 0.8026 0.0 0.8026 0.8959
No log 3.0909 272 0.7701 0.1304 0.7701 0.8775
No log 3.1136 274 0.8720 0.1609 0.8720 0.9338
No log 3.1364 276 1.0214 0.1461 1.0214 1.0106
No log 3.1591 278 0.8982 0.1571 0.8982 0.9478
No log 3.1818 280 0.7431 0.1362 0.7431 0.8620
No log 3.2045 282 0.8640 0.0364 0.8640 0.9295
No log 3.2273 284 0.9011 -0.0211 0.9011 0.9493
No log 3.25 286 0.8075 0.0637 0.8075 0.8986
No log 3.2727 288 0.7857 0.0902 0.7857 0.8864
No log 3.2955 290 0.7886 0.1659 0.7886 0.8880
No log 3.3182 292 0.8583 0.1727 0.8583 0.9264
No log 3.3409 294 0.8792 0.1609 0.8792 0.9376
No log 3.3636 296 0.8323 0.0327 0.8323 0.9123
No log 3.3864 298 0.9513 0.1274 0.9513 0.9753
No log 3.4091 300 0.9324 0.1559 0.9324 0.9656
No log 3.4318 302 0.8406 0.0327 0.8406 0.9169
No log 3.4545 304 0.9824 0.1379 0.9824 0.9912
No log 3.4773 306 1.1854 0.0368 1.1854 1.0888
No log 3.5 308 1.0253 0.1604 1.0253 1.0126
No log 3.5227 310 0.7917 0.1146 0.7917 0.8898
No log 3.5455 312 0.8473 0.0460 0.8473 0.9205
No log 3.5682 314 0.9791 0.0646 0.9791 0.9895
No log 3.5909 316 0.9446 0.0651 0.9446 0.9719
No log 3.6136 318 0.8147 0.0828 0.8147 0.9026
No log 3.6364 320 0.8530 0.0562 0.8530 0.9236
No log 3.6591 322 0.9222 0.0692 0.9222 0.9603
No log 3.6818 324 0.8249 0.0999 0.8249 0.9082
No log 3.7045 326 0.7087 0.1371 0.7087 0.8419
No log 3.7273 328 0.7215 0.0918 0.7215 0.8494
No log 3.75 330 0.7975 -0.0228 0.7975 0.8930
No log 3.7727 332 0.8231 -0.0598 0.8231 0.9072
No log 3.7955 334 0.8111 0.0474 0.8111 0.9006
No log 3.8182 336 0.7684 0.0412 0.7684 0.8766
No log 3.8409 338 0.7185 0.1371 0.7185 0.8476
No log 3.8636 340 0.6802 0.1371 0.6802 0.8247
No log 3.8864 342 0.6658 0.1902 0.6658 0.8159
No log 3.9091 344 0.6873 0.1902 0.6873 0.8290
No log 3.9318 346 0.7722 0.1199 0.7722 0.8788
No log 3.9545 348 0.9029 0.0842 0.9029 0.9502
No log 3.9773 350 0.9247 0.0594 0.9247 0.9616
No log 4.0 352 1.0517 0.0950 1.0517 1.0255
No log 4.0227 354 1.1417 0.1514 1.1417 1.0685
No log 4.0455 356 1.1756 0.1775 1.1756 1.0843
No log 4.0682 358 1.0066 0.1500 1.0066 1.0033
No log 4.0909 360 0.8149 0.0791 0.8149 0.9027
No log 4.1136 362 0.8199 0.1001 0.8199 0.9055
No log 4.1364 364 0.7802 0.1675 0.7802 0.8833
No log 4.1591 366 0.7755 0.1675 0.7755 0.8806
No log 4.1818 368 0.7797 0.1196 0.7797 0.8830
No log 4.2045 370 0.7956 0.0741 0.7956 0.8920
No log 4.2273 372 0.8245 0.0791 0.8245 0.9080
No log 4.25 374 0.8127 0.0679 0.8127 0.9015
No log 4.2727 376 0.8433 0.1308 0.8433 0.9183
No log 4.2955 378 0.8450 0.0968 0.8450 0.9193
No log 4.3182 380 0.8771 -0.1306 0.8771 0.9365
No log 4.3409 382 0.9126 0.0025 0.9126 0.9553
No log 4.3636 384 0.8234 0.0027 0.8234 0.9074
No log 4.3864 386 0.7914 0.0709 0.7914 0.8896
No log 4.4091 388 0.7885 0.0791 0.7885 0.8880
No log 4.4318 390 0.7953 0.0709 0.7953 0.8918
No log 4.4545 392 0.8134 0.0 0.8134 0.9019
No log 4.4773 394 0.8799 0.0962 0.8799 0.9380
No log 4.5 396 0.8647 0.0580 0.8647 0.9299
No log 4.5227 398 0.8366 0.0376 0.8366 0.9147
No log 4.5455 400 0.8343 0.0709 0.8343 0.9134
No log 4.5682 402 0.8308 0.0376 0.8308 0.9115
No log 4.5909 404 0.8910 0.0609 0.8910 0.9440
No log 4.6136 406 0.9782 0.1271 0.9782 0.9890
No log 4.6364 408 0.9522 0.1271 0.9522 0.9758
No log 4.6591 410 0.8390 0.0898 0.8390 0.9160
No log 4.6818 412 0.8387 0.1324 0.8387 0.9158
No log 4.7045 414 0.8419 0.0917 0.8419 0.9175
No log 4.7273 416 0.7749 0.1146 0.7749 0.8803
No log 4.75 418 0.7565 0.1146 0.7565 0.8698
No log 4.7727 420 0.7219 0.1371 0.7219 0.8496
No log 4.7955 422 0.7364 0.0914 0.7364 0.8581
No log 4.8182 424 0.7928 0.0874 0.7928 0.8904
No log 4.8409 426 0.8365 0.0444 0.8365 0.9146
No log 4.8636 428 0.8876 -0.0271 0.8876 0.9421
No log 4.8864 430 1.0206 0.0820 1.0206 1.0102
No log 4.9091 432 1.1282 0.0673 1.1282 1.0622
No log 4.9318 434 0.9385 0.1604 0.9385 0.9688
No log 4.9545 436 0.8207 0.0410 0.8207 0.9059
No log 4.9773 438 0.8737 -0.0014 0.8737 0.9347
No log 5.0 440 0.8505 0.0311 0.8505 0.9222
No log 5.0227 442 0.7695 0.0106 0.7695 0.8772
No log 5.0455 444 0.7120 0.1828 0.7120 0.8438
No log 5.0682 446 0.7899 0.0999 0.7899 0.8888
No log 5.0909 448 0.9382 0.0250 0.9382 0.9686
No log 5.1136 450 0.9107 0.0684 0.9107 0.9543
No log 5.1364 452 0.7993 0.0999 0.7993 0.8941
No log 5.1591 454 0.7767 0.0840 0.7767 0.8813
No log 5.1818 456 0.7803 0.0410 0.7803 0.8834
No log 5.2045 458 0.7512 0.0834 0.7512 0.8667
No log 5.2273 460 0.7258 0.1828 0.7258 0.8519
No log 5.25 462 0.7521 0.1627 0.7521 0.8672
No log 5.2727 464 0.7988 0.0999 0.7988 0.8937
No log 5.2955 466 0.7990 0.1817 0.7990 0.8939
No log 5.3182 468 0.7422 0.1691 0.7422 0.8615
No log 5.3409 470 0.7231 0.1902 0.7231 0.8504
No log 5.3636 472 0.7724 -0.0387 0.7724 0.8789
No log 5.3864 474 0.7876 -0.0280 0.7876 0.8875
No log 5.4091 476 0.7324 0.0454 0.7324 0.8558
No log 5.4318 478 0.7361 0.2258 0.7361 0.8580
No log 5.4545 480 0.8320 0.1701 0.8320 0.9122
No log 5.4773 482 0.8413 0.1235 0.8413 0.9172
No log 5.5 484 0.7847 0.1249 0.7847 0.8858
No log 5.5227 486 0.7974 0.0376 0.7974 0.8930
No log 5.5455 488 0.7781 0.0376 0.7781 0.8821
No log 5.5682 490 0.7880 -0.0446 0.7880 0.8877
No log 5.5909 492 0.7744 -0.0469 0.7744 0.8800
No log 5.6136 494 0.7428 0.0454 0.7428 0.8618
No log 5.6364 496 0.7349 0.1254 0.7349 0.8573
No log 5.6591 498 0.7753 0.1675 0.7753 0.8805
0.3466 5.6818 500 0.7638 0.1675 0.7638 0.8740
0.3466 5.7045 502 0.7687 0.1254 0.7687 0.8767
0.3466 5.7273 504 0.7963 0.0376 0.7963 0.8923
0.3466 5.75 506 0.7945 0.0828 0.7945 0.8913
0.3466 5.7727 508 0.7681 0.1371 0.7681 0.8764
0.3466 5.7955 510 0.7494 0.1311 0.7494 0.8657

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
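
The checkpoint can presumably be loaded with the standard `transformers` auto classes. This is a sketch assuming the model carries a sequence-classification head for essay scoring; the card does not document the head type or label mapping, so verify against the checkpoint's config:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
        "FineTuningAraBERT_run1_AugV5_k17_task3_organization")

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Score an Arabic essay (placeholder text; truncation guards against long inputs)
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
outputs = model(**inputs)
print(outputs.logits)
```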
Model size: 0.1B parameters (F32, stored as Safetensors)
Model tree: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task3_organization, finetuned from aubmindlab/bert-base-arabertv02.