ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8182
  • Qwk: 0.6212
  • Mse: 0.8182
  • Rmse: 0.9045
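The reported metrics are quadratic weighted kappa (Qwk), mean squared error (Mse), and root mean squared error (Rmse). A minimal sketch of how these can be computed with scikit-learn; the toy labels below are illustrative, not drawn from this model's evaluation set:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative ordinal labels (e.g. essay organization scores), not real eval data.
y_true = np.array([0, 1, 2, 3, 2, 1])
y_pred = np.array([0, 1, 2, 2, 2, 0])

# Quadratic weighted kappa penalizes disagreements by squared distance.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
```

Note that when predictions and labels are ordinal scores, RMSE is always the square root of MSE, which is why the card's Loss and Mse values coincide (the model appears to be trained with an MSE-style regression objective).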

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
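The `linear` scheduler decays the learning rate from its initial value to zero over the course of training (the Trainer default applies no warmup unless configured). A minimal stdlib sketch of that decay; `total_steps` is an assumed value for illustration, not taken from this run:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-5) -> float:
    """Linearly decay the learning rate from base_lr at step 0 to 0 at total_steps."""
    remaining = max(0, total_steps - step)
    return base_lr * remaining / total_steps

# Example: halfway through an assumed 1000-step run, the LR is half of 2e-05.
total_steps = 1000
midpoint_lr = linear_lr(500, total_steps)
```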

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0179 2 6.6258 0.0308 6.6258 2.5741
No log 0.0357 4 4.2740 0.0675 4.2740 2.0674
No log 0.0536 6 2.7076 0.0633 2.7076 1.6455
No log 0.0714 8 2.0200 0.2167 2.0200 1.4213
No log 0.0893 10 1.6931 0.1905 1.6931 1.3012
No log 0.1071 12 1.6579 0.1538 1.6579 1.2876
No log 0.125 14 1.6127 0.1509 1.6127 1.2699
No log 0.1429 16 1.6423 0.2523 1.6423 1.2815
No log 0.1607 18 1.7477 0.1714 1.7477 1.3220
No log 0.1786 20 1.8383 0.1538 1.8383 1.3558
No log 0.1964 22 1.8428 0.1524 1.8428 1.3575
No log 0.2143 24 1.5081 0.2407 1.5081 1.2281
No log 0.2321 26 1.3394 0.4202 1.3394 1.1573
No log 0.25 28 1.8098 0.3121 1.8098 1.3453
No log 0.2679 30 1.4423 0.3065 1.4423 1.2010
No log 0.2857 32 1.1889 0.3571 1.1889 1.0904
No log 0.3036 34 1.4221 0.4035 1.4221 1.1925
No log 0.3214 36 1.6234 0.1982 1.6234 1.2741
No log 0.3393 38 1.5523 0.2301 1.5523 1.2459
No log 0.3571 40 1.4346 0.2883 1.4346 1.1978
No log 0.375 42 1.3003 0.375 1.3003 1.1403
No log 0.3929 44 1.2445 0.375 1.2445 1.1156
No log 0.4107 46 1.2469 0.3243 1.2469 1.1166
No log 0.4286 48 1.1340 0.4444 1.1340 1.0649
No log 0.4464 50 1.0893 0.5210 1.0893 1.0437
No log 0.4643 52 1.1819 0.5410 1.1819 1.0872
No log 0.4821 54 1.1564 0.5000 1.1564 1.0754
No log 0.5 56 1.1548 0.4715 1.1548 1.0746
No log 0.5179 58 1.1286 0.4538 1.1286 1.0623
No log 0.5357 60 1.1630 0.4957 1.1630 1.0784
No log 0.5536 62 1.3549 0.4034 1.3549 1.1640
No log 0.5714 64 1.3165 0.3740 1.3165 1.1474
No log 0.5893 66 1.2945 0.5469 1.2945 1.1378
No log 0.6071 68 1.3422 0.5271 1.3422 1.1585
No log 0.625 70 1.3302 0.4677 1.3302 1.1533
No log 0.6429 72 1.2053 0.5246 1.2053 1.0979
No log 0.6607 74 1.1506 0.5366 1.1506 1.0727
No log 0.6786 76 1.0938 0.5669 1.0938 1.0459
No log 0.6964 78 1.0646 0.5397 1.0646 1.0318
No log 0.7143 80 0.9635 0.5512 0.9635 0.9816
No log 0.7321 82 1.0422 0.5606 1.0422 1.0209
No log 0.75 84 1.2908 0.5263 1.2908 1.1361
No log 0.7679 86 1.5022 0.3065 1.5022 1.2257
No log 0.7857 88 1.6010 0.2927 1.6010 1.2653
No log 0.8036 90 1.3392 0.4122 1.3392 1.1572
No log 0.8214 92 1.0603 0.5802 1.0603 1.0297
No log 0.8393 94 1.0525 0.5816 1.0525 1.0259
No log 0.8571 96 1.1501 0.5960 1.1501 1.0724
No log 0.875 98 1.1607 0.5906 1.1607 1.0774
No log 0.8929 100 1.1639 0.5906 1.1639 1.0788
No log 0.9107 102 1.1919 0.5578 1.1919 1.0917
No log 0.9286 104 1.1815 0.6069 1.1815 1.0870
No log 0.9464 106 1.1424 0.6099 1.1424 1.0688
No log 0.9643 108 1.0776 0.6176 1.0776 1.0381
No log 0.9821 110 1.0804 0.6087 1.0804 1.0394
No log 1.0 112 1.0537 0.6143 1.0537 1.0265
No log 1.0179 114 1.0192 0.6761 1.0192 1.0096
No log 1.0357 116 1.1055 0.6543 1.1055 1.0514
No log 1.0536 118 1.2916 0.5714 1.2916 1.1365
No log 1.0714 120 1.1218 0.6707 1.1218 1.0591
No log 1.0893 122 1.1349 0.6391 1.1349 1.0653
No log 1.1071 124 1.0874 0.6667 1.0874 1.0428
No log 1.125 126 0.8713 0.6914 0.8713 0.9334
No log 1.1429 128 0.7391 0.7711 0.7391 0.8597
No log 1.1607 130 0.7561 0.7152 0.7561 0.8695
No log 1.1786 132 0.9350 0.6225 0.9350 0.9670
No log 1.1964 134 1.1673 0.6125 1.1673 1.0804
No log 1.2143 136 1.1223 0.6164 1.1223 1.0594
No log 1.2321 138 0.9561 0.6490 0.9561 0.9778
No log 1.25 140 0.7448 0.7 0.7448 0.8630
No log 1.2679 142 0.7841 0.7007 0.7841 0.8855
No log 1.2857 144 0.7948 0.6618 0.7948 0.8915
No log 1.3036 146 0.8974 0.6525 0.8974 0.9473
No log 1.3214 148 1.3020 0.5868 1.3020 1.1411
No log 1.3393 150 1.7102 0.5514 1.7102 1.3077
No log 1.3571 152 1.5865 0.5532 1.5865 1.2596
No log 1.375 154 1.2617 0.5576 1.2617 1.1233
No log 1.3929 156 1.0955 0.6242 1.0955 1.0467
No log 1.4107 158 1.0650 0.6443 1.0650 1.0320
No log 1.4286 160 0.9513 0.6331 0.9513 0.9754
No log 1.4464 162 0.9364 0.6143 0.9364 0.9677
No log 1.4643 164 1.0715 0.6154 1.0715 1.0352
No log 1.4821 166 1.3038 0.5811 1.3038 1.1418
No log 1.5 168 1.3500 0.5867 1.3500 1.1619
No log 1.5179 170 1.2756 0.5676 1.2756 1.1294
No log 1.5357 172 1.1459 0.6027 1.1459 1.0705
No log 1.5536 174 1.0991 0.6174 1.0991 1.0484
No log 1.5714 176 1.1064 0.5957 1.1064 1.0518
No log 1.5893 178 1.2100 0.5649 1.2100 1.1000
No log 1.6071 180 1.2501 0.5156 1.2501 1.1181
No log 1.625 182 1.0543 0.5802 1.0543 1.0268
No log 1.6429 184 0.9255 0.6029 0.9255 0.9620
No log 1.6607 186 0.8216 0.6620 0.8216 0.9064
No log 1.6786 188 0.7114 0.7299 0.7114 0.8435
No log 1.6964 190 0.7505 0.6667 0.7505 0.8663
No log 1.7143 192 0.8645 0.6324 0.8645 0.9298
No log 1.7321 194 0.9740 0.6277 0.9740 0.9869
No log 1.75 196 1.0637 0.5909 1.0637 1.0314
No log 1.7679 198 1.0914 0.6056 1.0914 1.0447
No log 1.7857 200 1.0427 0.6081 1.0427 1.0211
No log 1.8036 202 0.9060 0.6475 0.9060 0.9518
No log 1.8214 204 0.7991 0.6806 0.7991 0.8940
No log 1.8393 206 0.8525 0.6761 0.8525 0.9233
No log 1.8571 208 1.1233 0.6174 1.1233 1.0598
No log 1.875 210 1.2277 0.5828 1.2277 1.1080
No log 1.8929 212 1.1755 0.5793 1.1755 1.0842
No log 1.9107 214 1.0556 0.5802 1.0556 1.0274
No log 1.9286 216 1.0123 0.5970 1.0123 1.0061
No log 1.9464 218 0.9607 0.6232 0.9607 0.9802
No log 1.9643 220 0.9079 0.6143 0.9079 0.9528
No log 1.9821 222 0.9247 0.6197 0.9247 0.9616
No log 2.0 224 0.8926 0.6241 0.8926 0.9448
No log 2.0179 226 0.9154 0.6345 0.9154 0.9567
No log 2.0357 228 0.9062 0.6434 0.9062 0.9519
No log 2.0536 230 0.9813 0.6541 0.9813 0.9906
No log 2.0714 232 1.1406 0.6467 1.1406 1.0680
No log 2.0893 234 1.1350 0.6467 1.1350 1.0654
No log 2.1071 236 0.9496 0.6351 0.9496 0.9745
No log 2.125 238 0.8637 0.6667 0.8637 0.9294
No log 2.1429 240 0.8344 0.6712 0.8344 0.9134
No log 2.1607 242 0.8616 0.6069 0.8616 0.9282
No log 2.1786 244 0.9215 0.6667 0.9215 0.9599
No log 2.1964 246 1.0555 0.6543 1.0555 1.0274
No log 2.2143 248 1.3042 0.6552 1.3042 1.1420
No log 2.2321 250 1.5620 0.5475 1.5620 1.2498
No log 2.25 252 1.4253 0.6071 1.4253 1.1939
No log 2.2679 254 1.1940 0.6194 1.1940 1.0927
No log 2.2857 256 1.0116 0.6029 1.0116 1.0058
No log 2.3036 258 1.0142 0.6131 1.0142 1.0071
No log 2.3214 260 1.1573 0.6316 1.1573 1.0758
No log 2.3393 262 1.2222 0.6154 1.2222 1.1055
No log 2.3571 264 1.1294 0.6027 1.1294 1.0627
No log 2.375 266 1.0056 0.6197 1.0056 1.0028
No log 2.3929 268 0.8468 0.6165 0.8468 0.9202
No log 2.4107 270 0.8208 0.6615 0.8208 0.9060
No log 2.4286 272 0.8854 0.6324 0.8854 0.9409
No log 2.4464 274 1.0719 0.5972 1.0719 1.0353
No log 2.4643 276 1.1552 0.6122 1.1552 1.0748
No log 2.4821 278 0.9857 0.6483 0.9857 0.9928
No log 2.5 280 0.8139 0.6466 0.8139 0.9021
No log 2.5179 282 0.7761 0.6853 0.7761 0.8810
No log 2.5357 284 0.8525 0.6752 0.8525 0.9233
No log 2.5536 286 0.9141 0.6914 0.9141 0.9561
No log 2.5714 288 1.0099 0.6788 1.0099 1.0049
No log 2.5893 290 1.0600 0.6585 1.0600 1.0296
No log 2.6071 292 0.9897 0.6351 0.9897 0.9948
No log 2.625 294 0.8793 0.6617 0.8793 0.9377
No log 2.6429 296 0.8238 0.6767 0.8238 0.9077
No log 2.6607 298 0.8448 0.6806 0.8448 0.9191
No log 2.6786 300 0.9737 0.6988 0.9737 0.9868
No log 2.6964 302 1.0735 0.6746 1.0735 1.0361
No log 2.7143 304 0.9875 0.6753 0.9875 0.9937
No log 2.7321 306 0.8481 0.6383 0.8481 0.9209
No log 2.75 308 0.7892 0.7134 0.7892 0.8883
No log 2.7679 310 0.8277 0.7117 0.8277 0.9098
No log 2.7857 312 0.7946 0.7044 0.7946 0.8914
No log 2.8036 314 0.7020 0.7355 0.7020 0.8379
No log 2.8214 316 0.6780 0.7338 0.6780 0.8234
No log 2.8393 318 0.7307 0.7206 0.7307 0.8548
No log 2.8571 320 0.7645 0.7111 0.7645 0.8744
No log 2.875 322 0.8300 0.6715 0.8300 0.9110
No log 2.8929 324 0.9368 0.6905 0.9368 0.9679
No log 2.9107 326 0.9907 0.6788 0.9907 0.9953
No log 2.9286 328 0.8883 0.6712 0.8883 0.9425
No log 2.9464 330 0.8060 0.6713 0.8060 0.8978
No log 2.9643 332 0.7526 0.6993 0.7526 0.8675
No log 2.9821 334 0.7524 0.7403 0.7524 0.8674
No log 3.0 336 0.8437 0.6792 0.8437 0.9186
No log 3.0179 338 0.8732 0.6797 0.8732 0.9345
No log 3.0357 340 0.8358 0.6712 0.8358 0.9142
No log 3.0536 342 0.7573 0.6815 0.7573 0.8702
No log 3.0714 344 0.7007 0.7015 0.7007 0.8371
No log 3.0893 346 0.7193 0.7111 0.7193 0.8481
No log 3.1071 348 0.8310 0.6377 0.8310 0.9116
No log 3.125 350 0.9882 0.6014 0.9882 0.9941
No log 3.1429 352 1.0701 0.6267 1.0701 1.0344
No log 3.1607 354 1.0457 0.6014 1.0457 1.0226
No log 3.1786 356 0.8805 0.6364 0.8805 0.9384
No log 3.1964 358 0.8388 0.6462 0.8388 0.9159
No log 3.2143 360 0.8382 0.6912 0.8382 0.9155
No log 3.2321 362 0.9676 0.6369 0.9676 0.9837
No log 3.25 364 1.0037 0.6335 1.0037 1.0018
No log 3.2679 366 0.9375 0.6579 0.9375 0.9682
No log 3.2857 368 0.8134 0.6715 0.8134 0.9019
No log 3.3036 370 0.8138 0.6202 0.8138 0.9021
No log 3.3214 372 0.8256 0.6142 0.8256 0.9086
No log 3.3393 374 0.8613 0.6569 0.8613 0.9280
No log 3.3571 376 0.8794 0.6714 0.8794 0.9377
No log 3.375 378 0.8127 0.6714 0.8127 0.9015
No log 3.3929 380 0.8285 0.6887 0.8285 0.9102
No log 3.4107 382 0.7994 0.6939 0.7994 0.8941
No log 3.4286 384 0.7819 0.6522 0.7819 0.8842
No log 3.4464 386 0.8407 0.6412 0.8407 0.9169
No log 3.4643 388 0.8948 0.6475 0.8948 0.9460
No log 3.4821 390 0.9512 0.6383 0.9512 0.9753
No log 3.5 392 0.9182 0.6519 0.9182 0.9582
No log 3.5179 394 0.8705 0.625 0.8705 0.9330
No log 3.5357 396 0.8844 0.6412 0.8844 0.9404
No log 3.5536 398 0.9825 0.6623 0.9825 0.9912
No log 3.5714 400 1.2207 0.6061 1.2207 1.1048
No log 3.5893 402 1.3184 0.6391 1.3184 1.1482
No log 3.6071 404 1.1486 0.6471 1.1486 1.0717
No log 3.625 406 0.8903 0.6533 0.8903 0.9436
No log 3.6429 408 0.7716 0.6866 0.7716 0.8784
No log 3.6607 410 0.7263 0.7050 0.7263 0.8523
No log 3.6786 412 0.7365 0.7237 0.7365 0.8582
No log 3.6964 414 0.9060 0.6988 0.9060 0.9519
No log 3.7143 416 0.9987 0.6786 0.9987 0.9994
No log 3.7321 418 0.9374 0.6988 0.9374 0.9682
No log 3.75 420 0.8498 0.6752 0.8498 0.9218
No log 3.7679 422 0.7519 0.6944 0.7519 0.8671
No log 3.7857 424 0.7517 0.6277 0.7517 0.8670
No log 3.8036 426 0.7988 0.6479 0.7988 0.8938
No log 3.8214 428 0.9349 0.6711 0.9349 0.9669
No log 3.8393 430 1.0140 0.6623 1.0140 1.0070
No log 3.8571 432 0.9982 0.6623 0.9982 0.9991
No log 3.875 434 0.8810 0.6434 0.8810 0.9386
No log 3.8929 436 0.8645 0.6479 0.8645 0.9298
No log 3.9107 438 0.8937 0.6667 0.8937 0.9453
No log 3.9286 440 0.9329 0.6582 0.9329 0.9659
No log 3.9464 442 0.9633 0.6541 0.9633 0.9815
No log 3.9643 444 1.0766 0.6463 1.0766 1.0376
No log 3.9821 446 1.1431 0.6460 1.1431 1.0692
No log 4.0 448 1.0636 0.6309 1.0636 1.0313
No log 4.0179 450 0.9321 0.6029 0.9321 0.9654
No log 4.0357 452 0.8291 0.6412 0.8291 0.9105
No log 4.0536 454 0.7495 0.6870 0.7495 0.8658
No log 4.0714 456 0.7277 0.6618 0.7277 0.8530
No log 4.0893 458 0.7107 0.6950 0.7107 0.8430
No log 4.1071 460 0.7796 0.7195 0.7796 0.8829
No log 4.125 462 0.7837 0.7195 0.7837 0.8853
No log 4.1429 464 0.7526 0.6846 0.7526 0.8675
No log 4.1607 466 0.6895 0.7183 0.6895 0.8304
No log 4.1786 468 0.6988 0.6765 0.6988 0.8359
No log 4.1964 470 0.7166 0.6567 0.7166 0.8465
No log 4.2143 472 0.7285 0.6765 0.7285 0.8535
No log 4.2321 474 0.8069 0.6712 0.8069 0.8983
No log 4.25 476 0.8888 0.6490 0.8888 0.9428
No log 4.2679 478 0.8980 0.6275 0.8980 0.9476
No log 4.2857 480 0.8396 0.6533 0.8396 0.9163
No log 4.3036 482 0.7615 0.6950 0.7615 0.8727
No log 4.3214 484 0.7939 0.6761 0.7939 0.8910
No log 4.3393 486 0.8405 0.6667 0.8405 0.9168
No log 4.3571 488 0.8511 0.6667 0.8511 0.9225
No log 4.375 490 0.8459 0.6667 0.8459 0.9197
No log 4.3929 492 0.7507 0.7123 0.7507 0.8664
No log 4.4107 494 0.7082 0.7143 0.7082 0.8415
No log 4.4286 496 0.7314 0.6716 0.7314 0.8552
No log 4.4464 498 0.7693 0.6866 0.7693 0.8771
0.4686 4.4643 500 0.8720 0.6423 0.8720 0.9338
0.4686 4.4821 502 0.9016 0.6176 0.9016 0.9495
0.4686 4.5 504 0.9102 0.6383 0.9102 0.9540
0.4686 4.5179 506 0.8052 0.6857 0.8052 0.8973
0.4686 4.5357 508 0.7368 0.7092 0.7368 0.8584
0.4686 4.5536 510 0.7554 0.7067 0.7554 0.8691
0.4686 4.5714 512 0.7081 0.6993 0.7081 0.8415
0.4686 4.5893 514 0.7561 0.6522 0.7561 0.8696
0.4686 4.6071 516 0.8474 0.6667 0.8474 0.9206
0.4686 4.625 518 0.9781 0.6494 0.9781 0.9890
0.4686 4.6429 520 0.9849 0.6358 0.9849 0.9924
0.4686 4.6607 522 0.9386 0.6241 0.9386 0.9688
0.4686 4.6786 524 0.8230 0.6324 0.8230 0.9072
0.4686 4.6964 526 0.7998 0.6617 0.7998 0.8943
0.4686 4.7143 528 0.7623 0.6963 0.7623 0.8731
0.4686 4.7321 530 0.7744 0.6667 0.7744 0.8800
0.4686 4.75 532 0.8465 0.6301 0.8465 0.9200
0.4686 4.7679 534 0.8865 0.6395 0.8865 0.9416
0.4686 4.7857 536 1.0471 0.6460 1.0471 1.0233
0.4686 4.8036 538 1.1829 0.6310 1.1829 1.0876
0.4686 4.8214 540 1.1754 0.6296 1.1754 1.0841
0.4686 4.8393 542 1.1390 0.6296 1.1390 1.0672
0.4686 4.8571 544 0.9818 0.6351 0.9818 0.9908
0.4686 4.875 546 0.8182 0.6212 0.8182 0.9045
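The evaluation results reported at the top of this card correspond to the final logged step (epoch 4.875), not the best one: the highest validation Qwk in the log is 0.7711 at epoch 1.1429. A small sketch for selecting the best checkpoint from such a log; the list holds a few illustrative (epoch, qwk) pairs copied from the table above:

```python
# Pick the row with the highest validation QWK from logged evaluation rows.
# These (epoch, qwk) pairs are a sample of the table above, not the full log.
rows = [
    (0.0179, 0.0308),
    (1.1429, 0.7711),
    (2.8214, 0.7338),
    (4.8750, 0.6212),
]
best_epoch, best_qwk = max(rows, key=lambda r: r[1])
```

With `load_best_model_at_end` and a suitable `metric_for_best_model` in the Trainer configuration, this selection can be automated instead of done post hoc.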

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)
