ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k15_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8296
  • QWK (quadratic weighted kappa): 0.6260
  • MSE: 0.8296
  • RMSE: 0.9108
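Note that the reported loss equals the MSE, which suggests a regression-style scoring head. For reference, a minimal pure-Python sketch of how these metrics are computed (assuming integer score labels; the function names are illustrative, not from this repository):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the QWK column above)."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Quadratic disagreement weights: (i - j)^2 / (N - 1)^2
    W = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # Expected matrix from the marginal histograms
    n = len(y_true)
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(W[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(W[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root (the MSE/RMSE columns above)."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

QWK is 1.0 for perfect agreement, 0.0 for chance-level agreement, and negative when predictions disagree more than chance would.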

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
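These settings correspond directly to fields of transformers.TrainingArguments. A sketch of the mapping (field names follow the Transformers 4.44 API; the Adam betas and epsilon map onto the adam_* arguments, and output_dir plus the Trainer wiring are omitted):

```python
# Hyperparameters from the list above, written as the TrainingArguments
# keyword arguments they correspond to (Transformers 4.44 API).
training_args_kwargs = {
    "learning_rate": 2e-5,             # learning_rate
    "per_device_train_batch_size": 8,  # train_batch_size
    "per_device_eval_batch_size": 8,   # eval_batch_size
    "seed": 42,                        # seed
    "adam_beta1": 0.9,                 # optimizer: Adam betas[0]
    "adam_beta2": 0.999,               # optimizer: Adam betas[1]
    "adam_epsilon": 1e-8,              # optimizer: Adam epsilon
    "lr_scheduler_type": "linear",     # lr_scheduler_type
    "num_train_epochs": 100,           # num_epochs
}

# Usage (requires transformers to be installed):
# from transformers import TrainingArguments
# args = TrainingArguments(output_dir="out", **training_args_kwargs)
```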

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0179 2 6.9429 0.0 6.9429 2.6349
No log 0.0357 4 4.5169 0.0519 4.5169 2.1253
No log 0.0536 6 3.4474 -0.0549 3.4474 1.8567
No log 0.0714 8 2.3436 0.0602 2.3436 1.5309
No log 0.0893 10 1.9425 0.1121 1.9425 1.3937
No log 0.1071 12 1.8384 0.1333 1.8384 1.3559
No log 0.125 14 1.9453 0.0192 1.9453 1.3947
No log 0.1429 16 2.2308 -0.0862 2.2308 1.4936
No log 0.1607 18 2.3700 -0.0984 2.3700 1.5395
No log 0.1786 20 2.4430 -0.0299 2.4430 1.5630
No log 0.1964 22 2.2415 0.0000 2.2415 1.4972
No log 0.2143 24 1.6551 0.1802 1.6551 1.2865
No log 0.2321 26 1.5804 0.2523 1.5804 1.2571
No log 0.25 28 1.6258 0.2857 1.6258 1.2751
No log 0.2679 30 1.7131 0.2034 1.7131 1.3088
No log 0.2857 32 1.9843 0.1034 1.9843 1.4087
No log 0.3036 34 2.5493 -0.0288 2.5493 1.5967
No log 0.3214 36 2.0888 0.0976 2.0888 1.4453
No log 0.3393 38 1.6100 0.2783 1.6100 1.2689
No log 0.3571 40 1.5371 0.4754 1.5371 1.2398
No log 0.375 42 1.5007 0.2703 1.5007 1.2250
No log 0.3929 44 1.6440 0.4 1.6440 1.2822
No log 0.4107 46 1.4485 0.2909 1.4485 1.2035
No log 0.4286 48 1.4728 0.2222 1.4728 1.2136
No log 0.4464 50 1.5556 0.2222 1.5556 1.2473
No log 0.4643 52 1.5996 0.2545 1.5996 1.2647
No log 0.4821 54 1.5045 0.2385 1.5045 1.2266
No log 0.5 56 1.4243 0.2385 1.4243 1.1934
No log 0.5179 58 1.4388 0.3220 1.4388 1.1995
No log 0.5357 60 1.4057 0.5263 1.4057 1.1856
No log 0.5536 62 1.4650 0.4681 1.4650 1.2104
No log 0.5714 64 1.5351 0.3857 1.5351 1.2390
No log 0.5893 66 1.5541 0.4058 1.5541 1.2466
No log 0.6071 68 1.6081 0.3333 1.6081 1.2681
No log 0.625 70 1.6182 0.3333 1.6182 1.2721
No log 0.6429 72 1.5748 0.4444 1.5748 1.2549
No log 0.6607 74 1.5665 0.4000 1.5665 1.2516
No log 0.6786 76 1.6327 0.2857 1.6327 1.2778
No log 0.6964 78 1.3602 0.4333 1.3602 1.1663
No log 0.7143 80 1.2194 0.5238 1.2194 1.1042
No log 0.7321 82 1.3255 0.4844 1.3255 1.1513
No log 0.75 84 1.4212 0.4148 1.4212 1.1922
No log 0.7679 86 1.3962 0.4627 1.3962 1.1816
No log 0.7857 88 1.6813 0.3380 1.6813 1.2967
No log 0.8036 90 1.8591 0.2535 1.8591 1.3635
No log 0.8214 92 1.8339 0.2370 1.8339 1.3542
No log 0.8393 94 1.7588 0.2992 1.7588 1.3262
No log 0.8571 96 1.6164 0.3710 1.6164 1.2714
No log 0.875 98 1.5214 0.3840 1.5214 1.2334
No log 0.8929 100 1.4291 0.4242 1.4291 1.1955
No log 0.9107 102 1.2912 0.4375 1.2912 1.1363
No log 0.9286 104 1.1525 0.5547 1.1525 1.0735
No log 0.9464 106 1.0067 0.6345 1.0067 1.0034
No log 0.9643 108 0.9919 0.6438 0.9919 0.9960
No log 0.9821 110 1.2364 0.4762 1.2364 1.1120
No log 1.0 112 1.4260 0.4127 1.4260 1.1942
No log 1.0179 114 1.5148 0.4211 1.5148 1.2308
No log 1.0357 116 1.4625 0.4493 1.4625 1.2093
No log 1.0536 118 1.3034 0.5652 1.3034 1.1417
No log 1.0714 120 1.1848 0.5865 1.1848 1.0885
No log 1.0893 122 1.1486 0.5564 1.1486 1.0717
No log 1.1071 124 1.0540 0.5630 1.0540 1.0266
No log 1.125 126 1.0380 0.5985 1.0380 1.0188
No log 1.1429 128 1.0085 0.5985 1.0085 1.0043
No log 1.1607 130 0.9845 0.5882 0.9845 0.9922
No log 1.1786 132 1.0770 0.6176 1.0770 1.0378
No log 1.1964 134 1.1899 0.5496 1.1899 1.0908
No log 1.2143 136 1.2233 0.5564 1.2233 1.1060
No log 1.2321 138 1.2502 0.5612 1.2502 1.1181
No log 1.25 140 1.2726 0.5429 1.2726 1.1281
No log 1.2679 142 1.2163 0.5652 1.2163 1.1029
No log 1.2857 144 1.3339 0.5379 1.3339 1.1550
No log 1.3036 146 1.4968 0.4672 1.4968 1.2234
No log 1.3214 148 1.5758 0.4265 1.5758 1.2553
No log 1.3393 150 1.6458 0.3165 1.6458 1.2829
No log 1.3571 152 1.6404 0.4684 1.6404 1.2808
No log 1.375 154 1.5734 0.5227 1.5734 1.2543
No log 1.3929 156 1.2871 0.6289 1.2871 1.1345
No log 1.4107 158 0.9474 0.6187 0.9474 0.9733
No log 1.4286 160 0.8603 0.6029 0.8603 0.9275
No log 1.4464 162 0.8788 0.6222 0.8788 0.9374
No log 1.4643 164 0.9008 0.5581 0.9008 0.9491
No log 1.4821 166 0.9928 0.5312 0.9928 0.9964
No log 1.5 168 1.1082 0.5455 1.1082 1.0527
No log 1.5179 170 1.2313 0.5674 1.2313 1.1097
No log 1.5357 172 1.4099 0.5714 1.4099 1.1874
No log 1.5536 174 1.5551 0.5085 1.5551 1.2470
No log 1.5714 176 1.6276 0.5027 1.6276 1.2758
No log 1.5893 178 1.4887 0.5198 1.4887 1.2201
No log 1.6071 180 1.2955 0.5625 1.2955 1.1382
No log 1.625 182 1.1375 0.6056 1.1375 1.0666
No log 1.6429 184 1.0627 0.6176 1.0627 1.0309
No log 1.6607 186 1.0408 0.6094 1.0408 1.0202
No log 1.6786 188 1.0640 0.5938 1.0640 1.0315
No log 1.6964 190 1.0939 0.5827 1.0939 1.0459
No log 1.7143 192 1.1360 0.6061 1.1360 1.0658
No log 1.7321 194 1.1780 0.6074 1.1780 1.0853
No log 1.75 196 1.2122 0.5857 1.2122 1.1010
No log 1.7679 198 1.1781 0.6154 1.1781 1.0854
No log 1.7857 200 1.0969 0.6232 1.0969 1.0473
No log 1.8036 202 1.0750 0.6232 1.0750 1.0368
No log 1.8214 204 1.0726 0.6531 1.0726 1.0357
No log 1.8393 206 1.0161 0.6573 1.0161 1.0080
No log 1.8571 208 0.9423 0.6901 0.9423 0.9707
No log 1.875 210 0.8840 0.6618 0.8840 0.9402
No log 1.8929 212 0.8141 0.6667 0.8141 0.9023
No log 1.9107 214 0.8146 0.6260 0.8146 0.9025
No log 1.9286 216 0.8393 0.6094 0.8393 0.9161
No log 1.9464 218 0.8292 0.6515 0.8292 0.9106
No log 1.9643 220 0.9547 0.6065 0.9547 0.9771
No log 1.9821 222 1.3654 0.6298 1.3654 1.1685
No log 2.0 224 1.6423 0.5464 1.6423 1.2815
No log 2.0179 226 1.5474 0.5348 1.5474 1.2439
No log 2.0357 228 1.3031 0.5904 1.3031 1.1416
No log 2.0536 230 1.0618 0.6543 1.0618 1.0305
No log 2.0714 232 0.8585 0.6846 0.8585 0.9265
No log 2.0893 234 0.7995 0.6761 0.7995 0.8941
No log 2.1071 236 0.8039 0.6993 0.8039 0.8966
No log 2.125 238 0.8602 0.6621 0.8602 0.9275
No log 2.1429 240 1.0212 0.6389 1.0212 1.0105
No log 2.1607 242 1.1431 0.5957 1.1431 1.0692
No log 2.1786 244 1.2033 0.4928 1.2033 1.0970
No log 2.1964 246 1.1811 0.5231 1.1811 1.0868
No log 2.2143 248 1.1861 0.5414 1.1861 1.0891
No log 2.2321 250 1.2648 0.5526 1.2648 1.1246
No log 2.25 252 1.2762 0.5802 1.2762 1.1297
No log 2.2679 254 1.2255 0.6467 1.2255 1.1070
No log 2.2857 256 1.1951 0.6467 1.1951 1.0932
No log 2.3036 258 1.0811 0.6389 1.0811 1.0397
No log 2.3214 260 1.0335 0.6187 1.0335 1.0166
No log 2.3393 262 1.0601 0.6029 1.0601 1.0296
No log 2.3571 264 1.1150 0.6029 1.1150 1.0559
No log 2.375 266 1.1353 0.5926 1.1353 1.0655
No log 2.3929 268 1.1274 0.6074 1.1274 1.0618
No log 2.4107 270 1.1383 0.6069 1.1383 1.0669
No log 2.4286 272 1.1863 0.6424 1.1863 1.0892
No log 2.4464 274 1.2010 0.6509 1.2010 1.0959
No log 2.4643 276 1.0996 0.6707 1.0996 1.0486
No log 2.4821 278 0.9327 0.6584 0.9327 0.9658
No log 2.5 280 0.8311 0.6389 0.8311 0.9116
No log 2.5179 282 0.8157 0.6618 0.8157 0.9032
No log 2.5357 284 0.8567 0.6531 0.8567 0.9256
No log 2.5536 286 1.0287 0.6358 1.0287 1.0143
No log 2.5714 288 1.2016 0.5570 1.2016 1.0962
No log 2.5893 290 1.2314 0.5570 1.2314 1.1097
No log 2.6071 292 1.1159 0.6275 1.1159 1.0564
No log 2.625 294 0.9457 0.6232 0.9457 0.9725
No log 2.6429 296 0.8348 0.6462 0.8348 0.9137
No log 2.6607 298 0.8154 0.6508 0.8154 0.9030
No log 2.6786 300 0.8552 0.6418 0.8552 0.9248
No log 2.6964 302 0.9343 0.6395 0.9343 0.9666
No log 2.7143 304 0.9598 0.6536 0.9598 0.9797
No log 2.7321 306 1.0113 0.6443 1.0113 1.0057
No log 2.75 308 0.9965 0.6536 0.9965 0.9983
No log 2.7679 310 0.9283 0.6755 0.9283 0.9635
No log 2.7857 312 0.8563 0.6294 0.8563 0.9254
No log 2.8036 314 0.8440 0.6294 0.8440 0.9187
No log 2.8214 316 0.8477 0.6933 0.8477 0.9207
No log 2.8393 318 0.8877 0.7 0.8877 0.9422
No log 2.8571 320 0.9529 0.6748 0.9529 0.9762
No log 2.875 322 1.0037 0.6826 1.0037 1.0019
No log 2.8929 324 0.9585 0.6790 0.9585 0.9790
No log 2.9107 326 0.9464 0.6846 0.9464 0.9728
No log 2.9286 328 0.9398 0.6667 0.9398 0.9694
No log 2.9464 330 0.8726 0.6471 0.8726 0.9341
No log 2.9643 332 0.8563 0.6308 0.8563 0.9254
No log 2.9821 334 0.8723 0.6142 0.8723 0.9340
No log 3.0 336 0.9342 0.6412 0.9342 0.9665
No log 3.0179 338 1.0630 0.6301 1.0630 1.0310
No log 3.0357 340 1.2652 0.5896 1.2652 1.1248
No log 3.0536 342 1.2034 0.5614 1.2034 1.0970
No log 3.0714 344 1.0119 0.6667 1.0119 1.0059
No log 3.0893 346 0.8746 0.6620 0.8746 0.9352
No log 3.1071 348 0.8227 0.7153 0.8227 0.9071
No log 3.125 350 0.8397 0.7015 0.8397 0.9164
No log 3.1429 352 0.9136 0.6165 0.9136 0.9558
No log 3.1607 354 1.0001 0.6029 1.0001 1.0001
No log 3.1786 356 1.0404 0.5846 1.0404 1.0200
No log 3.1964 358 1.0746 0.6015 1.0746 1.0366
No log 3.2143 360 1.1008 0.5630 1.1008 1.0492
No log 3.2321 362 1.0705 0.5778 1.0705 1.0346
No log 3.25 364 1.0340 0.5496 1.0340 1.0169
No log 3.2679 366 1.0219 0.5692 1.0219 1.0109
No log 3.2857 368 0.9916 0.5692 0.9916 0.9958
No log 3.3036 370 0.9888 0.6142 0.9888 0.9944
No log 3.3214 372 0.9868 0.6142 0.9868 0.9934
No log 3.3393 374 1.0032 0.5891 1.0032 1.0016
No log 3.3571 376 1.0380 0.6111 1.0380 1.0188
No log 3.375 378 1.1281 0.6013 1.1281 1.0621
No log 3.3929 380 1.1171 0.6184 1.1171 1.0569
No log 3.4107 382 1.0638 0.6174 1.0638 1.0314
No log 3.4286 384 1.0192 0.6069 1.0192 1.0096
No log 3.4464 386 0.9570 0.6294 0.9570 0.9783
No log 3.4643 388 0.8817 0.6412 0.8817 0.9390
No log 3.4821 390 0.8449 0.6515 0.8449 0.9192
No log 3.5 392 0.7944 0.6565 0.7944 0.8913
No log 3.5179 394 0.7766 0.6522 0.7766 0.8812
No log 3.5357 396 0.7637 0.6667 0.7637 0.8739
No log 3.5536 398 0.7791 0.6993 0.7791 0.8827
No log 3.5714 400 0.8139 0.6857 0.8139 0.9022
No log 3.5893 402 0.8579 0.6912 0.8579 0.9262
No log 3.6071 404 0.8843 0.6260 0.8843 0.9404
No log 3.625 406 0.9294 0.6383 0.9294 0.9641
No log 3.6429 408 0.8911 0.6573 0.8911 0.9440
No log 3.6607 410 0.8574 0.6806 0.8574 0.9259
No log 3.6786 412 0.8545 0.6757 0.8545 0.9244
No log 3.6964 414 0.8479 0.6883 0.8479 0.9208
No log 3.7143 416 0.8088 0.6879 0.8088 0.8993
No log 3.7321 418 0.7414 0.7190 0.7414 0.8611
No log 3.75 420 0.7614 0.6901 0.7614 0.8726
No log 3.7679 422 0.7949 0.6912 0.7949 0.8916
No log 3.7857 424 0.8509 0.6107 0.8509 0.9224
No log 3.8036 426 0.9425 0.6429 0.9425 0.9708
No log 3.8214 428 1.0760 0.5931 1.0760 1.0373
No log 3.8393 430 1.0764 0.6081 1.0764 1.0375
No log 3.8571 432 0.9856 0.625 0.9856 0.9928
No log 3.875 434 0.8379 0.6479 0.8379 0.9154
No log 3.8929 436 0.7742 0.6906 0.7742 0.8799
No log 3.9107 438 0.7191 0.7483 0.7191 0.8480
No log 3.9286 440 0.7131 0.7285 0.7131 0.8444
No log 3.9464 442 0.8217 0.7179 0.8217 0.9065
No log 3.9643 444 1.0579 0.6554 1.0579 1.0286
No log 3.9821 446 1.1019 0.6517 1.1019 1.0497
No log 4.0 448 0.8871 0.6957 0.8871 0.9419
No log 4.0179 450 0.7017 0.7248 0.7017 0.8377
No log 4.0357 452 0.6880 0.7285 0.6880 0.8294
No log 4.0536 454 0.7150 0.6950 0.7150 0.8456
No log 4.0714 456 0.8008 0.6993 0.8008 0.8949
No log 4.0893 458 0.9544 0.6364 0.9544 0.9769
No log 4.1071 460 1.0276 0.6133 1.0276 1.0137
No log 4.125 462 1.0225 0.6040 1.0225 1.0112
No log 4.1429 464 0.9567 0.6133 0.9567 0.9781
No log 4.1607 466 0.7840 0.6806 0.7840 0.8854
No log 4.1786 468 0.6907 0.7417 0.6907 0.8311
No log 4.1964 470 0.6594 0.7613 0.6594 0.8120
No log 4.2143 472 0.6607 0.7613 0.6607 0.8128
No log 4.2321 474 0.6820 0.7417 0.6820 0.8258
No log 4.25 476 0.7624 0.6887 0.7624 0.8731
No log 4.2679 478 0.8097 0.6993 0.8097 0.8998
No log 4.2857 480 0.8454 0.6957 0.8454 0.9194
No log 4.3036 482 0.8843 0.6412 0.8843 0.9404
No log 4.3214 484 0.9762 0.6569 0.9762 0.9880
No log 4.3393 486 1.0307 0.6241 1.0307 1.0152
No log 4.3571 488 0.9878 0.6165 0.9878 0.9939
No log 4.375 490 0.9197 0.6202 0.9197 0.9590
No log 4.3929 492 0.8821 0.6202 0.8821 0.9392
No log 4.4107 494 0.8763 0.6143 0.8763 0.9361
No log 4.4286 496 0.9076 0.7215 0.9076 0.9527
No log 4.4464 498 0.8436 0.7205 0.8436 0.9185
0.4569 4.4643 500 0.7194 0.7020 0.7194 0.8482
0.4569 4.4821 502 0.6490 0.7237 0.6490 0.8056
0.4569 4.5 504 0.6487 0.7333 0.6487 0.8054
0.4569 4.5179 506 0.6848 0.7403 0.6848 0.8276
0.4569 4.5357 508 0.7857 0.7067 0.7857 0.8864
0.4569 4.5536 510 0.9340 0.6918 0.9340 0.9664
0.4569 4.5714 512 1.0133 0.6584 1.0133 1.0066
0.4569 4.5893 514 1.0130 0.6581 1.0130 1.0065
0.4569 4.6071 516 0.9121 0.6621 0.9121 0.9551
0.4569 4.625 518 0.8296 0.6260 0.8296 0.9108

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
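To reproduce this environment, the listed versions can be pinned at install time. A sketch, assuming a CUDA 11.8 setup (the extra index URL is PyTorch's cu118 wheel index):

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0+cu118" --extra-index-url https://download.pytorch.org/whl/cu118
```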

Model size

  • 0.1B params (F32, Safetensors)

Model tree

  • This model: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k15_task1_organization
  • Base model: aubmindlab/bert-base-arabertv02