ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k3_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9007
  • Qwk: 0.0949
  • Mse: 0.9007
  • Rmse: 0.9490
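
These metrics (quadratic weighted kappa, mean squared error, root mean squared error) can be reproduced from integer score predictions; a minimal sketch, assuming scikit-learn and using made-up labels rather than this model's actual evaluation data:

```python
# Hedged sketch of the reported metrics; the label arrays are
# illustrative placeholders, not the model's evaluation set.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 1, 0])  # hypothetical gold scores
y_pred = np.array([0, 2, 2, 1, 1, 0])  # hypothetical predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # note Rmse is simply the square root of Mse
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```

This also explains why Loss and Mse coincide above: the model is trained with an MSE objective, so the evaluation loss is the MSE.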

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
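
A minimal sketch of how these hyperparameters map onto transformers.TrainingArguments; the output_dir is a placeholder, and the Adam betas/epsilon shown are the library defaults, matching the values listed above:

```python
# Sketch only: dataset and Trainer wiring are omitted;
# output_dir is a hypothetical path.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```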

Training results

("No log" in the Training Loss column marks evaluation steps that occurred before the first training-loss logging interval.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.125 2 3.6600 -0.0157 3.6600 1.9131
No log 0.25 4 1.8107 0.0560 1.8107 1.3456
No log 0.375 6 1.3195 -0.0452 1.3195 1.1487
No log 0.5 8 1.0291 -0.1274 1.0291 1.0145
No log 0.625 10 0.8793 -0.0753 0.8793 0.9377
No log 0.75 12 0.9812 -0.1695 0.9812 0.9906
No log 0.875 14 1.2323 0.0355 1.2323 1.1101
No log 1.0 16 1.1265 -0.0486 1.1265 1.0614
No log 1.125 18 1.0086 -0.1628 1.0086 1.0043
No log 1.25 20 1.0219 0.0405 1.0219 1.0109
No log 1.375 22 1.0047 0.1310 1.0047 1.0023
No log 1.5 24 1.2906 0.0727 1.2906 1.1361
No log 1.625 26 0.8711 0.0687 0.8711 0.9333
No log 1.75 28 0.9374 0.1077 0.9374 0.9682
No log 1.875 30 0.8568 0.0922 0.8568 0.9256
No log 2.0 32 0.9325 0.0267 0.9325 0.9657
No log 2.125 34 0.8832 0.1145 0.8832 0.9398
No log 2.25 36 0.8348 0.1172 0.8348 0.9137
No log 2.375 38 0.8728 0.0406 0.8728 0.9343
No log 2.5 40 0.9477 0.0785 0.9477 0.9735
No log 2.625 42 0.9829 0.0365 0.9829 0.9914
No log 2.75 44 1.0494 -0.0348 1.0494 1.0244
No log 2.875 46 1.0574 0.1273 1.0574 1.0283
No log 3.0 48 1.1226 0.1277 1.1226 1.0595
No log 3.125 50 1.0631 0.0883 1.0631 1.0311
No log 3.25 52 1.1578 -0.0384 1.1578 1.0760
No log 3.375 54 1.0896 -0.0463 1.0896 1.0438
No log 3.5 56 0.9978 -0.0247 0.9978 0.9989
No log 3.625 58 0.9417 0.1183 0.9417 0.9704
No log 3.75 60 0.9285 0.0768 0.9285 0.9636
No log 3.875 62 1.2236 0.0746 1.2236 1.1062
No log 4.0 64 0.9850 0.1511 0.9850 0.9925
No log 4.125 66 1.1201 0.1737 1.1201 1.0583
No log 4.25 68 1.0139 0.1961 1.0139 1.0069
No log 4.375 70 1.1555 0.0401 1.1555 1.0749
No log 4.5 72 1.2734 0.1326 1.2734 1.1284
No log 4.625 74 0.9655 0.1969 0.9655 0.9826
No log 4.75 76 1.5171 0.0745 1.5171 1.2317
No log 4.875 78 1.3711 0.0166 1.3711 1.1709
No log 5.0 80 0.9276 0.1733 0.9276 0.9631
No log 5.125 82 0.9640 -0.0179 0.9640 0.9818
No log 5.25 84 0.9672 0.0935 0.9672 0.9835
No log 5.375 86 1.2096 0.0006 1.2096 1.0998
No log 5.5 88 1.1071 0.0344 1.1071 1.0522
No log 5.625 90 0.8453 0.1221 0.8453 0.9194
No log 5.75 92 0.9205 0.0580 0.9205 0.9594
No log 5.875 94 0.8252 0.2063 0.8252 0.9084
No log 6.0 96 0.9463 0.0576 0.9463 0.9728
No log 6.125 98 1.2855 0.0876 1.2855 1.1338
No log 6.25 100 1.0036 0.0547 1.0036 1.0018
No log 6.375 102 0.8383 0.1673 0.8383 0.9156
No log 6.5 104 0.8825 0.1519 0.8825 0.9394
No log 6.625 106 0.8593 0.1417 0.8593 0.9270
No log 6.75 108 0.8636 0.0497 0.8636 0.9293
No log 6.875 110 0.7825 0.1962 0.7825 0.8846
No log 7.0 112 0.9347 0.0754 0.9347 0.9668
No log 7.125 114 0.9882 0.0418 0.9882 0.9941
No log 7.25 116 0.8479 0.1133 0.8479 0.9208
No log 7.375 118 1.0111 -0.0007 1.0111 1.0055
No log 7.5 120 0.9588 -0.0007 0.9588 0.9792
No log 7.625 122 0.8847 -0.0345 0.8847 0.9406
No log 7.75 124 0.7919 0.1006 0.7919 0.8899
No log 7.875 126 0.7304 0.0828 0.7304 0.8546
No log 8.0 128 0.6926 0.2326 0.6926 0.8322
No log 8.125 130 0.6833 0.3453 0.6833 0.8266
No log 8.25 132 0.6874 0.3827 0.6874 0.8291
No log 8.375 134 0.7653 0.1434 0.7653 0.8748
No log 8.5 136 0.7527 0.1778 0.7527 0.8676
No log 8.625 138 0.7624 0.2606 0.7624 0.8732
No log 8.75 140 0.7725 0.0146 0.7725 0.8789
No log 8.875 142 0.7379 0.2831 0.7379 0.8590
No log 9.0 144 0.7975 0.1426 0.7975 0.8930
No log 9.125 146 0.7939 0.2070 0.7939 0.8910
No log 9.25 148 0.9195 0.0217 0.9195 0.9589
No log 9.375 150 0.8670 -0.0054 0.8670 0.9311
No log 9.5 152 0.8230 0.2677 0.8230 0.9072
No log 9.625 154 0.8223 0.1179 0.8223 0.9068
No log 9.75 156 0.8106 0.1224 0.8106 0.9003
No log 9.875 158 0.8309 0.2570 0.8309 0.9115
No log 10.0 160 0.7832 0.1529 0.7832 0.8850
No log 10.125 162 0.7726 0.1604 0.7726 0.8790
No log 10.25 164 0.8171 0.1867 0.8171 0.9039
No log 10.375 166 0.9543 0.1914 0.9543 0.9769
No log 10.5 168 0.9009 0.1379 0.9009 0.9492
No log 10.625 170 0.8335 0.1509 0.8335 0.9129
No log 10.75 172 0.7892 0.1224 0.7892 0.8884
No log 10.875 174 0.8872 0.0659 0.8872 0.9419
No log 11.0 176 0.9617 0.0169 0.9617 0.9807
No log 11.125 178 0.8001 0.0118 0.8001 0.8945
No log 11.25 180 0.7803 0.1646 0.7803 0.8833
No log 11.375 182 0.8100 0.1591 0.8100 0.9000
No log 11.5 184 0.8870 0.0065 0.8870 0.9418
No log 11.625 186 0.8361 0.1050 0.8361 0.9144
No log 11.75 188 0.8497 0.1050 0.8497 0.9218
No log 11.875 190 0.8615 0.1051 0.8615 0.9282
No log 12.0 192 0.8613 0.1867 0.8613 0.9280
No log 12.125 194 0.8580 0.1400 0.8580 0.9263
No log 12.25 196 0.8522 0.1423 0.8522 0.9231
No log 12.375 198 0.8142 0.1863 0.8142 0.9023
No log 12.5 200 0.8338 0.0837 0.8338 0.9131
No log 12.625 202 0.7939 0.0152 0.7939 0.8910
No log 12.75 204 0.7607 0.2405 0.7607 0.8722
No log 12.875 206 0.7357 0.2627 0.7357 0.8577
No log 13.0 208 0.7559 0.2627 0.7559 0.8694
No log 13.125 210 0.7444 0.2239 0.7444 0.8628
No log 13.25 212 0.7118 0.0814 0.7118 0.8437
No log 13.375 214 0.6845 0.0914 0.6845 0.8273
No log 13.5 216 0.6905 0.0857 0.6905 0.8310
No log 13.625 218 0.7556 0.0670 0.7556 0.8693
No log 13.75 220 0.9115 0.0684 0.9115 0.9547
No log 13.875 222 0.7935 0.0549 0.7935 0.8908
No log 14.0 224 0.7091 0.0414 0.7091 0.8421
No log 14.125 226 0.9068 0.1123 0.9068 0.9523
No log 14.25 228 0.9997 0.0048 0.9997 0.9998
No log 14.375 230 0.8872 0.1345 0.8872 0.9419
No log 14.5 232 0.8846 0.1624 0.8846 0.9405
No log 14.625 234 0.9730 -0.0146 0.9730 0.9864
No log 14.75 236 0.9515 0.0481 0.9515 0.9755
No log 14.875 238 0.8874 0.0169 0.8874 0.9420
No log 15.0 240 0.7826 0.0831 0.7826 0.8847
No log 15.125 242 0.6853 0.0814 0.6853 0.8278
No log 15.25 244 0.6916 0.2403 0.6916 0.8316
No log 15.375 246 0.6911 0.0768 0.6911 0.8313
No log 15.5 248 0.7596 0.0913 0.7596 0.8716
No log 15.625 250 0.8855 0.0241 0.8855 0.9410
No log 15.75 252 0.8890 0.0241 0.8890 0.9429
No log 15.875 254 0.8117 0.2751 0.8117 0.9010
No log 16.0 256 0.7866 0.1687 0.7866 0.8869
No log 16.125 258 0.7906 0.1315 0.7906 0.8892
No log 16.25 260 0.7537 0.1778 0.7537 0.8682
No log 16.375 262 0.7933 0.0909 0.7933 0.8907
No log 16.5 264 0.8430 0.1106 0.8430 0.9181
No log 16.625 266 0.7829 0.0071 0.7829 0.8848
No log 16.75 268 0.7415 0.0628 0.7415 0.8611
No log 16.875 270 0.7225 0.0670 0.7225 0.8500
No log 17.0 272 0.7426 0.0670 0.7426 0.8617
No log 17.125 274 0.7102 0.0282 0.7102 0.8428
No log 17.25 276 0.7127 0.0355 0.7127 0.8442
No log 17.375 278 0.7332 0.1998 0.7332 0.8562
No log 17.5 280 0.7171 0.1453 0.7171 0.8468
No log 17.625 282 0.7236 0.0814 0.7236 0.8507
No log 17.75 284 0.7545 -0.0228 0.7545 0.8686
No log 17.875 286 0.7463 -0.0228 0.7463 0.8639
No log 18.0 288 0.7339 0.0355 0.7339 0.8567
No log 18.125 290 0.7583 0.2446 0.7583 0.8708
No log 18.25 292 0.7759 0.2019 0.7759 0.8809
No log 18.375 294 0.7140 0.1976 0.7140 0.8450
No log 18.5 296 0.8435 0.0016 0.8435 0.9184
No log 18.625 298 1.3157 0.0998 1.3157 1.1470
No log 18.75 300 1.4449 0.1041 1.4449 1.2021
No log 18.875 302 1.1488 0.0586 1.1488 1.0718
No log 19.0 304 0.8299 0.0424 0.8299 0.9110
No log 19.125 306 0.7364 0.2283 0.7364 0.8581
No log 19.25 308 0.7415 0.1906 0.7415 0.8611
No log 19.375 310 0.7471 0.1080 0.7471 0.8643
No log 19.5 312 0.6889 0.1976 0.6889 0.8300
No log 19.625 314 0.7091 0.0714 0.7091 0.8421
No log 19.75 316 0.7565 0.0099 0.7565 0.8698
No log 19.875 318 0.7294 0.0714 0.7294 0.8541
No log 20.0 320 0.7741 0.0799 0.7741 0.8798
No log 20.125 322 0.7901 0.0799 0.7901 0.8889
No log 20.25 324 0.8026 0.0456 0.8026 0.8959
No log 20.375 326 0.8431 -0.0008 0.8431 0.9182
No log 20.5 328 0.8190 -0.0008 0.8190 0.9050
No log 20.625 330 0.7377 0.1249 0.7377 0.8589
No log 20.75 332 0.7047 0.1856 0.7047 0.8395
No log 20.875 334 0.7175 0.1259 0.7175 0.8470
No log 21.0 336 0.8372 0.0676 0.8372 0.9150
No log 21.125 338 0.8783 -0.0143 0.8783 0.9372
No log 21.25 340 0.8469 0.0676 0.8469 0.9203
No log 21.375 342 0.7603 0.0152 0.7603 0.8720
No log 21.5 344 0.7535 0.2627 0.7535 0.8680
No log 21.625 346 0.7828 0.1744 0.7828 0.8848
No log 21.75 348 0.8111 0.2405 0.8111 0.9006
No log 21.875 350 0.8285 0.0923 0.8285 0.9102
No log 22.0 352 0.7943 0.2827 0.7943 0.8912
No log 22.125 354 0.7693 0.1744 0.7693 0.8771
No log 22.25 356 0.7514 0.2707 0.7514 0.8669
No log 22.375 358 0.7353 0.2239 0.7353 0.8575
No log 22.5 360 0.7554 0.0680 0.7554 0.8691
No log 22.625 362 0.7774 0.0562 0.7774 0.8817
No log 22.75 364 0.7462 0.0680 0.7462 0.8638
No log 22.875 366 0.7149 0.1311 0.7149 0.8455
No log 23.0 368 0.7099 0.1371 0.7099 0.8425
No log 23.125 370 0.7060 0.0814 0.7060 0.8402
No log 23.25 372 0.7214 0.0768 0.7214 0.8493
No log 23.375 374 0.7664 0.0913 0.7664 0.8755
No log 23.5 376 0.7761 0.1660 0.7761 0.8810
No log 23.625 378 0.7359 -0.0274 0.7359 0.8579
No log 23.75 380 0.7060 0.1371 0.7060 0.8402
No log 23.875 382 0.7199 0.3163 0.7199 0.8485
No log 24.0 384 0.7165 0.1371 0.7165 0.8465
No log 24.125 386 0.7558 0.0214 0.7558 0.8693
No log 24.25 388 0.8424 0.0837 0.8424 0.9178
No log 24.375 390 0.8125 0.0504 0.8125 0.9014
No log 24.5 392 0.7654 0.2138 0.7654 0.8749
No log 24.625 394 0.7367 0.1835 0.7367 0.8583
No log 24.75 396 0.7069 0.1371 0.7069 0.8408
No log 24.875 398 0.7627 0.1495 0.7627 0.8733
No log 25.0 400 0.8384 0.0793 0.8384 0.9156
No log 25.125 402 0.7925 0.1387 0.7925 0.8902
No log 25.25 404 0.6896 0.1828 0.6896 0.8304
No log 25.375 406 0.6786 0.0436 0.6786 0.8238
No log 25.5 408 0.6893 0.0436 0.6893 0.8302
No log 25.625 410 0.6861 0.1371 0.6861 0.8283
No log 25.75 412 0.7478 0.1506 0.7478 0.8648
No log 25.875 414 0.9213 0.1186 0.9213 0.9598
No log 26.0 416 0.9350 0.1149 0.9350 0.9669
No log 26.125 418 0.7971 0.1193 0.7971 0.8928
No log 26.25 420 0.7016 0.1311 0.7016 0.8376
No log 26.375 422 0.7069 0.2667 0.7069 0.8408
No log 26.5 424 0.7423 0.1080 0.7423 0.8616
No log 26.625 426 0.6961 0.1501 0.6961 0.8343
No log 26.75 428 0.6656 0.1371 0.6656 0.8159
No log 26.875 430 0.7500 0.0909 0.7500 0.8660
No log 27.0 432 0.8369 0.1453 0.8369 0.9148
No log 27.125 434 0.8534 0.1024 0.8534 0.9238
No log 27.25 436 0.7836 0.1605 0.7836 0.8852
No log 27.375 438 0.6900 0.2258 0.6900 0.8307
No log 27.5 440 0.6534 0.1444 0.6534 0.8084
No log 27.625 442 0.6519 0.1444 0.6519 0.8074
No log 27.75 444 0.6650 0.1371 0.6650 0.8155
No log 27.875 446 0.7193 0.1565 0.7193 0.8481
No log 28.0 448 0.8231 0.1879 0.8231 0.9072
No log 28.125 450 0.8627 0.2069 0.8627 0.9288
No log 28.25 452 0.8117 0.1542 0.8117 0.9009
No log 28.375 454 0.7422 0.0562 0.7422 0.8615
No log 28.5 456 0.7297 0.0562 0.7297 0.8542
No log 28.625 458 0.7571 0.1387 0.7571 0.8701
No log 28.75 460 0.8347 0.1024 0.8347 0.9136
No log 28.875 462 0.8803 0.0134 0.8803 0.9382
No log 29.0 464 0.8120 0.1149 0.8120 0.9011
No log 29.125 466 0.7767 0.0831 0.7767 0.8813
No log 29.25 468 0.8092 0.1149 0.8092 0.8996
No log 29.375 470 0.8545 0.1106 0.8545 0.9244
No log 29.5 472 0.8091 0.1593 0.8091 0.8995
No log 29.625 474 0.7737 0.0956 0.7737 0.8796
No log 29.75 476 0.7401 0.1612 0.7401 0.8603
No log 29.875 478 0.7153 0.1612 0.7153 0.8458
No log 30.0 480 0.7183 0.2105 0.7183 0.8475
No log 30.125 482 0.7287 0.1565 0.7287 0.8537
No log 30.25 484 0.7078 0.2105 0.7078 0.8413
No log 30.375 486 0.6869 0.1740 0.6869 0.8288
No log 30.5 488 0.7136 0.1612 0.7136 0.8448
No log 30.625 490 0.7833 0.1605 0.7833 0.8850
No log 30.75 492 0.7866 0.1605 0.7866 0.8869
No log 30.875 494 0.7251 0.1096 0.7251 0.8515
No log 31.0 496 0.7020 0.1675 0.7020 0.8378
No log 31.125 498 0.7035 0.1146 0.7035 0.8387
0.2277 31.25 500 0.7454 0.1395 0.7454 0.8634
0.2277 31.375 502 0.8023 0.2015 0.8023 0.8957
0.2277 31.5 504 0.7926 0.2015 0.7926 0.8903
0.2277 31.625 506 0.7219 0.1449 0.7219 0.8496
0.2277 31.75 508 0.6859 0.1249 0.6859 0.8282
0.2277 31.875 510 0.7013 0.2166 0.7013 0.8374
0.2277 32.0 512 0.7212 0.0732 0.7212 0.8492
0.2277 32.125 514 0.7372 0.0639 0.7372 0.8586
0.2277 32.25 516 0.7890 0.1605 0.7890 0.8883
0.2277 32.375 518 0.9458 0.0747 0.9458 0.9725
0.2277 32.5 520 0.9927 0.0391 0.9927 0.9964
0.2277 32.625 522 0.8753 0.1354 0.8753 0.9356
0.2277 32.75 524 0.7228 0.1097 0.7228 0.8502
0.2277 32.875 526 0.6767 0.1902 0.6767 0.8226
0.2277 33.0 528 0.6789 0.1902 0.6789 0.8240
0.2277 33.125 530 0.6863 0.1902 0.6863 0.8284
0.2277 33.25 532 0.7159 0.2180 0.7159 0.8461
0.2277 33.375 534 0.7641 0.0476 0.7641 0.8741
0.2277 33.5 536 0.8223 0.0826 0.8223 0.9068
0.2277 33.625 538 0.7905 0.0476 0.7905 0.8891
0.2277 33.75 540 0.7380 0.1691 0.7380 0.8591
0.2277 33.875 542 0.7118 0.1354 0.7118 0.8437
0.2277 34.0 544 0.7251 0.2294 0.7251 0.8515
0.2277 34.125 546 0.7213 0.2263 0.7213 0.8493
0.2277 34.25 548 0.7165 0.1354 0.7165 0.8464
0.2277 34.375 550 0.7520 0.1199 0.7520 0.8672
0.2277 34.5 552 0.8161 0.0041 0.8161 0.9034
0.2277 34.625 554 0.9050 0.0949 0.9050 0.9513
0.2277 34.75 556 0.9007 0.0949 0.9007 0.9490
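
Note that the final checkpoint (epoch 34.75, validation loss 0.9007) is not the best by validation loss; the table bottoms out at epoch 27.625 (0.6519). A small sketch of scanning such logged rows for the best checkpoint, using a few rows copied from the table above:

```python
# Pick the checkpoint with the lowest validation loss from logged
# (epoch, val_loss, qwk) rows; rows copied from the table above.
rows = [
    (8.125, 0.6833, 0.3453),
    (27.625, 0.6519, 0.1444),
    (34.75, 0.9007, 0.0949),
]
best_epoch, best_loss, best_qwk = min(rows, key=lambda r: r[1])
print(best_epoch, best_loss)  # → 27.625 0.6519
```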

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Downloads last month: 4
  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k3_task3_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.