ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k6_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9054
  • Qwk: 0.6143
  • Mse: 0.9054
  • Rmse: 0.9515
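
Qwk above is Cohen's quadratic weighted kappa, the standard agreement metric for ordinal scoring tasks such as essay grading. A minimal pure-Python sketch of the metric (the function name and the toy labels are illustrative, not from this card):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights; labels are ints in [0, n_classes)."""
    n = len(y_true)
    # Observed agreement matrix.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms; the expected matrix is their outer product / n.
    hist_true = [sum(observed[i][j] for j in range(n_classes)) for i in range(n_classes)]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic distance penalty
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # perfect agreement -> 1.0
```

A score of 1.0 means perfect agreement, 0.0 chance-level agreement; the final Qwk of 0.6143 indicates moderate-to-substantial agreement with the reference scores.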

Model description

More information needed

Intended uses & limitations

More information needed
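
Although the intended use is not documented, the checkpoint can presumably be loaded with the standard transformers API. A sketch, assuming the model emits a single regression score for the organization quality of an Arabic essay (the example text is illustrative):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k6_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("نص المقال هنا", return_tensors="pt")  # "essay text here"
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```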

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
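
These settings map directly onto the Hugging Face Trainer. A minimal configuration sketch, not a runnable recipe: train_ds / eval_ds are hypothetical placeholders, and num_labels=1 (a regression head, hence the MSE validation loss) is an assumption not stated on this card:

```python
from transformers import (AutoModelForSequenceClassification, Trainer,
                          TrainingArguments)

# Single-output head so the Trainer uses an MSE objective (assumption).
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", num_labels=1)

# Hyperparameters copied from the list above.
args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

trainer = Trainer(model=model, args=args,
                  train_dataset=train_ds, eval_dataset=eval_ds)
trainer.train()
```

Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer, so it needs no explicit argument here.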

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0714 2 6.8529 0.0057 6.8529 2.6178
No log 0.1429 4 4.3834 0.1083 4.3834 2.0937
No log 0.2143 6 3.2400 0.0 3.2400 1.8000
No log 0.2857 8 2.2916 0.1630 2.2916 1.5138
No log 0.3571 10 2.0072 0.1550 2.0072 1.4168
No log 0.4286 12 1.8683 0.2188 1.8683 1.3669
No log 0.5 14 1.8939 0.2963 1.8939 1.3762
No log 0.5714 16 1.9877 0.2714 1.9877 1.4099
No log 0.6429 18 2.0542 0.2222 2.0542 1.4332
No log 0.7143 20 1.8651 0.3008 1.8651 1.3657
No log 0.7857 22 1.8214 0.3235 1.8214 1.3496
No log 0.8571 24 2.2459 0.1961 2.2459 1.4986
No log 0.9286 26 2.5274 0.1605 2.5274 1.5898
No log 1.0 28 2.3436 0.1806 2.3436 1.5309
No log 1.0714 30 2.1259 0.1268 2.1259 1.4580
No log 1.1429 32 1.8398 0.3308 1.8398 1.3564
No log 1.2143 34 1.8253 0.3111 1.8253 1.3510
No log 1.2857 36 1.7857 0.3333 1.7857 1.3363
No log 1.3571 38 1.8737 0.2647 1.8737 1.3688
No log 1.4286 40 1.8982 0.2667 1.8982 1.3778
No log 1.5 42 2.0072 0.1958 2.0072 1.4168
No log 1.5714 44 2.6374 0.2614 2.6374 1.6240
No log 1.6429 46 3.2349 0.1845 3.2349 1.7986
No log 1.7143 48 3.0333 0.2010 3.0333 1.7417
No log 1.7857 50 2.2737 0.2857 2.2737 1.5079
No log 1.8571 52 1.9045 0.3776 1.9045 1.3800
No log 1.9286 54 1.5998 0.3623 1.5998 1.2648
No log 2.0 56 1.4688 0.3636 1.4688 1.2119
No log 2.0714 58 1.4342 0.3622 1.4342 1.1976
No log 2.1429 60 1.4744 0.3906 1.4744 1.2142
No log 2.2143 62 1.6276 0.3504 1.6276 1.2758
No log 2.2857 64 2.2990 0.2963 2.2990 1.5163
No log 2.3571 66 2.5876 0.2690 2.5876 1.6086
No log 2.4286 68 2.0548 0.2632 2.0548 1.4335
No log 2.5 70 1.7406 0.3425 1.7406 1.3193
No log 2.5714 72 1.5802 0.4571 1.5802 1.2571
No log 2.6429 74 1.4587 0.4965 1.4587 1.2078
No log 2.7143 76 1.4363 0.5143 1.4363 1.1984
No log 2.7857 78 1.4181 0.4507 1.4181 1.1908
No log 2.8571 80 1.5473 0.4085 1.5473 1.2439
No log 2.9286 82 1.5344 0.4255 1.5344 1.2387
No log 3.0 84 2.0751 0.3529 2.0751 1.4405
No log 3.0714 86 2.1024 0.3636 2.1024 1.4500
No log 3.1429 88 1.4878 0.4503 1.4878 1.2197
No log 3.2143 90 1.1606 0.5857 1.1606 1.0773
No log 3.2857 92 1.4432 0.5036 1.4432 1.2013
No log 3.3571 94 1.7013 0.4324 1.7013 1.3043
No log 3.4286 96 1.9587 0.3537 1.9587 1.3995
No log 3.5 98 1.8892 0.3727 1.8892 1.3745
No log 3.5714 100 1.5960 0.4400 1.5960 1.2633
No log 3.6429 102 1.5410 0.4730 1.5410 1.2414
No log 3.7143 104 1.4769 0.5034 1.4769 1.2153
No log 3.7857 106 1.3611 0.5278 1.3611 1.1667
No log 3.8571 108 1.4668 0.5229 1.4668 1.2111
No log 3.9286 110 1.6054 0.5063 1.6054 1.2670
No log 4.0 112 1.5097 0.5432 1.5097 1.2287
No log 4.0714 114 1.4355 0.5584 1.4355 1.1981
No log 4.1429 116 1.2406 0.5915 1.2406 1.1138
No log 4.2143 118 1.1268 0.5775 1.1268 1.0615
No log 4.2857 120 1.3762 0.5590 1.3762 1.1731
No log 4.3571 122 1.9932 0.3956 1.9932 1.4118
No log 4.4286 124 2.6174 0.3774 2.6174 1.6178
No log 4.5 126 2.7099 0.3568 2.7099 1.6462
No log 4.5714 128 1.9294 0.4646 1.9294 1.3890
No log 4.6429 130 1.6179 0.5376 1.6179 1.2720
No log 4.7143 132 1.1197 0.6125 1.1197 1.0582
No log 4.7857 134 0.8733 0.6803 0.8733 0.9345
No log 4.8571 136 0.8523 0.6619 0.8523 0.9232
No log 4.9286 138 0.8863 0.6812 0.8863 0.9414
No log 5.0 140 0.8831 0.6812 0.8831 0.9397
No log 5.0714 142 0.9202 0.6187 0.9202 0.9593
No log 5.1429 144 1.0570 0.6216 1.0570 1.0281
No log 5.2143 146 1.0299 0.6069 1.0299 1.0149
No log 5.2857 148 0.9211 0.6286 0.9211 0.9597
No log 5.3571 150 0.9171 0.6483 0.9171 0.9577
No log 5.4286 152 0.9674 0.6364 0.9674 0.9836
No log 5.5 154 1.0240 0.6460 1.0240 1.0119
No log 5.5714 156 0.9640 0.6538 0.9640 0.9818
No log 5.6429 158 0.9912 0.6625 0.9912 0.9956
No log 5.7143 160 1.0134 0.6329 1.0134 1.0067
No log 5.7857 162 0.9992 0.6709 0.9992 0.9996
No log 5.8571 164 0.9643 0.6835 0.9643 0.9820
No log 5.9286 166 0.9385 0.7020 0.9385 0.9688
No log 6.0 168 0.9316 0.6759 0.9316 0.9652
No log 6.0714 170 0.9319 0.6812 0.9319 0.9653
No log 6.1429 172 0.9240 0.7042 0.9240 0.9612
No log 6.2143 174 0.9349 0.6533 0.9349 0.9669
No log 6.2857 176 1.0525 0.6667 1.0525 1.0259
No log 6.3571 178 1.1849 0.6310 1.1849 1.0885
No log 6.4286 180 1.0792 0.6391 1.0792 1.0389
No log 6.5 182 0.8695 0.6853 0.8695 0.9325
No log 6.5714 184 0.7912 0.7361 0.7912 0.8895
No log 6.6429 186 0.9076 0.6475 0.9076 0.9527
No log 6.7143 188 1.0462 0.6765 1.0462 1.0229
No log 6.7857 190 1.0909 0.4724 1.0909 1.0445
No log 6.8571 192 0.9915 0.5344 0.9915 0.9957
No log 6.9286 194 0.9256 0.6479 0.9256 0.9621
No log 7.0 196 0.8671 0.6438 0.8671 0.9312
No log 7.0714 198 0.8603 0.6928 0.8603 0.9276
No log 7.1429 200 0.8858 0.7020 0.8858 0.9411
No log 7.2143 202 1.0171 0.6375 1.0171 1.0085
No log 7.2857 204 1.2457 0.5988 1.2457 1.1161
No log 7.3571 206 1.5380 0.5562 1.5380 1.2402
No log 7.4286 208 1.6093 0.5263 1.6093 1.2686
No log 7.5 210 1.3656 0.5629 1.3656 1.1686
No log 7.5714 212 1.0536 0.6038 1.0536 1.0264
No log 7.6429 214 0.9322 0.6667 0.9322 0.9655
No log 7.7143 216 0.9329 0.6621 0.9329 0.9658
No log 7.7857 218 1.0043 0.6234 1.0043 1.0021
No log 7.8571 220 1.3174 0.5988 1.3174 1.1478
No log 7.9286 222 1.4838 0.5829 1.4838 1.2181
No log 8.0 224 1.2722 0.5988 1.2722 1.1279
No log 8.0714 226 0.9609 0.6294 0.9609 0.9803
No log 8.1429 228 0.8437 0.6618 0.8437 0.9185
No log 8.2143 230 0.8628 0.6957 0.8628 0.9288
No log 8.2857 232 0.8679 0.6957 0.8679 0.9316
No log 8.3571 234 0.8617 0.6861 0.8617 0.9283
No log 8.4286 236 0.8673 0.6528 0.8673 0.9313
No log 8.5 238 0.9459 0.6216 0.9459 0.9726
No log 8.5714 240 0.9603 0.6358 0.9603 0.9800
No log 8.6429 242 0.9363 0.6282 0.9363 0.9676
No log 8.7143 244 0.9271 0.6456 0.9271 0.9628
No log 8.7857 246 0.9033 0.6410 0.9033 0.9504
No log 8.8571 248 0.8709 0.6846 0.8709 0.9332
No log 8.9286 250 0.8766 0.6622 0.8766 0.9363
No log 9.0 252 0.9096 0.6358 0.9096 0.9537
No log 9.0714 254 0.9248 0.6667 0.9248 0.9617
No log 9.1429 256 0.9019 0.6667 0.9019 0.9497
No log 9.2143 258 0.9201 0.6536 0.9201 0.9592
No log 9.2857 260 0.8874 0.6434 0.8874 0.9420
No log 9.3571 262 0.9030 0.6812 0.9030 0.9503
No log 9.4286 264 1.0065 0.5672 1.0065 1.0032
No log 9.5 266 1.0159 0.5414 1.0159 1.0079
No log 9.5714 268 0.9244 0.5865 0.9244 0.9614
No log 9.6429 270 0.8359 0.6716 0.8359 0.9143
No log 9.7143 272 0.8329 0.6715 0.8329 0.9126
No log 9.7857 274 0.8532 0.6716 0.8532 0.9237
No log 9.8571 276 0.9340 0.6571 0.9340 0.9664
No log 9.9286 278 1.0571 0.5793 1.0571 1.0281
No log 10.0 280 1.0870 0.5890 1.0870 1.0426
No log 10.0714 282 1.0018 0.6043 1.0018 1.0009
No log 10.1429 284 0.8892 0.6857 0.8892 0.9430
No log 10.2143 286 0.8415 0.6815 0.8415 0.9173
No log 10.2857 288 0.8578 0.6715 0.8578 0.9262
No log 10.3571 290 0.8403 0.6912 0.8403 0.9167
No log 10.4286 292 0.8415 0.6861 0.8415 0.9173
No log 10.5 294 1.0129 0.6443 1.0129 1.0065
No log 10.5714 296 1.1991 0.5974 1.1991 1.0950
No log 10.6429 298 1.1783 0.5921 1.1783 1.0855
No log 10.7143 300 0.9905 0.6351 0.9905 0.9953
No log 10.7857 302 0.8457 0.6515 0.8457 0.9196
No log 10.8571 304 0.8310 0.7015 0.8310 0.9116
No log 10.9286 306 0.8150 0.6567 0.8150 0.9028
No log 11.0 308 0.8066 0.6853 0.8066 0.8981
No log 11.0714 310 0.9217 0.6994 0.9217 0.9600
No log 11.1429 312 1.0945 0.6347 1.0945 1.0462
No log 11.2143 314 1.0607 0.6506 1.0607 1.0299
No log 11.2857 316 0.9163 0.6875 0.9163 0.9572
No log 11.3571 318 0.8520 0.7237 0.8520 0.9231
No log 11.4286 320 0.8398 0.7050 0.8398 0.9164
No log 11.5 322 0.8367 0.7059 0.8367 0.9147
No log 11.5714 324 0.8339 0.6567 0.8339 0.9132
No log 11.6429 326 0.8421 0.6466 0.8421 0.9177
No log 11.7143 328 0.8446 0.7164 0.8446 0.9190
No log 11.7857 330 0.8694 0.6471 0.8694 0.9324
No log 11.8571 332 0.9207 0.5957 0.9207 0.9595
No log 11.9286 334 0.9851 0.6216 0.9851 0.9925
No log 12.0 336 1.0083 0.6358 1.0083 1.0041
No log 12.0714 338 0.9118 0.6443 0.9118 0.9549
No log 12.1429 340 0.8604 0.6933 0.8604 0.9276
No log 12.2143 342 0.8927 0.7013 0.8927 0.9448
No log 12.2857 344 0.9509 0.6335 0.9509 0.9751
No log 12.3571 346 1.1461 0.6182 1.1461 1.0706
No log 12.4286 348 1.2718 0.5939 1.2718 1.1278
No log 12.5 350 1.3228 0.5939 1.3228 1.1501
No log 12.5714 352 1.2008 0.6182 1.2008 1.0958
No log 12.6429 354 0.9530 0.6143 0.9530 0.9762
No log 12.7143 356 0.8354 0.6618 0.8354 0.9140
No log 12.7857 358 0.8132 0.6618 0.8132 0.9018
No log 12.8571 360 0.8291 0.6853 0.8291 0.9105
No log 12.9286 362 0.8500 0.7020 0.8500 0.9220
No log 13.0 364 0.8343 0.7237 0.8343 0.9134
No log 13.0714 366 0.7951 0.7114 0.7951 0.8917
No log 13.1429 368 0.7521 0.7162 0.7521 0.8672
No log 13.2143 370 0.7268 0.7467 0.7268 0.8525
No log 13.2857 372 0.7265 0.7432 0.7265 0.8523
No log 13.3571 374 0.7618 0.7403 0.7618 0.8728
No log 13.4286 376 0.7814 0.7403 0.7814 0.8840
No log 13.5 378 0.7768 0.6986 0.7768 0.8813
No log 13.5714 380 0.7920 0.6944 0.7920 0.8899
No log 13.6429 382 0.8119 0.7114 0.8119 0.9011
No log 13.7143 384 0.8213 0.72 0.8213 0.9062
No log 13.7857 386 0.8253 0.72 0.8253 0.9085
No log 13.8571 388 0.8196 0.72 0.8196 0.9053
No log 13.9286 390 0.8201 0.75 0.8201 0.9056
No log 14.0 392 0.8480 0.7485 0.8480 0.9209
No log 14.0714 394 0.9063 0.7030 0.9063 0.9520
No log 14.1429 396 1.0166 0.6587 1.0166 1.0083
No log 14.2143 398 0.9859 0.6747 0.9859 0.9929
No log 14.2857 400 0.8913 0.7215 0.8913 0.9441
No log 14.3571 402 0.8183 0.6939 0.8183 0.9046
No log 14.4286 404 0.7698 0.7222 0.7698 0.8774
No log 14.5 406 0.7681 0.6861 0.7681 0.8764
No log 14.5714 408 0.8109 0.6986 0.8109 0.9005
No log 14.6429 410 0.9077 0.6797 0.9077 0.9527
No log 14.7143 412 0.9156 0.6797 0.9156 0.9569
No log 14.7857 414 0.8553 0.7034 0.8553 0.9248
No log 14.8571 416 0.8008 0.6471 0.8008 0.8949
No log 14.9286 418 0.8034 0.6667 0.8034 0.8963
No log 15.0 420 0.8270 0.6260 0.8270 0.9094
No log 15.0714 422 0.8627 0.6107 0.8627 0.9288
No log 15.1429 424 0.8802 0.6154 0.8802 0.9382
No log 15.2143 426 0.8868 0.6364 0.8868 0.9417
No log 15.2857 428 0.8922 0.6260 0.8922 0.9445
No log 15.3571 430 0.9110 0.6212 0.9110 0.9545
No log 15.4286 432 0.9538 0.5385 0.9538 0.9766
No log 15.5 434 0.9902 0.5271 0.9902 0.9951
No log 15.5714 436 0.9708 0.5735 0.9708 0.9853
No log 15.6429 438 0.9151 0.6620 0.9151 0.9566
No log 15.7143 440 0.8571 0.6806 0.8571 0.9258
No log 15.7857 442 0.8561 0.6980 0.8561 0.9252
No log 15.8571 444 0.9683 0.6790 0.9683 0.9840
No log 15.9286 446 1.1532 0.6467 1.1532 1.0739
No log 16.0 448 1.3223 0.5814 1.3223 1.1499
No log 16.0714 450 1.2431 0.6182 1.2431 1.1149
No log 16.1429 452 0.9598 0.6490 0.9598 0.9797
No log 16.2143 454 0.7790 0.6950 0.7790 0.8826
No log 16.2857 456 0.7634 0.6866 0.7634 0.8737
No log 16.3571 458 0.7746 0.6917 0.7746 0.8801
No log 16.4286 460 0.7999 0.6316 0.7999 0.8944
No log 16.5 462 0.8353 0.6331 0.8353 0.9140
No log 16.5714 464 0.8659 0.6187 0.8659 0.9305
No log 16.6429 466 0.8871 0.6187 0.8871 0.9419
No log 16.7143 468 0.8346 0.6429 0.8346 0.9136
No log 16.7857 470 0.7864 0.6901 0.7864 0.8868
No log 16.8571 472 0.7593 0.7413 0.7593 0.8714
No log 16.9286 474 0.7613 0.7324 0.7613 0.8725
No log 17.0 476 0.7492 0.7324 0.7492 0.8656
No log 17.0714 478 0.7356 0.7153 0.7356 0.8577
No log 17.1429 480 0.7274 0.7338 0.7274 0.8529
No log 17.2143 482 0.7161 0.7313 0.7161 0.8463
No log 17.2857 484 0.7082 0.7465 0.7082 0.8415
No log 17.3571 486 0.7200 0.7324 0.7200 0.8485
No log 17.4286 488 0.7332 0.7222 0.7332 0.8563
No log 17.5 490 0.7433 0.7361 0.7433 0.8621
No log 17.5714 492 0.7545 0.7246 0.7545 0.8686
No log 17.6429 494 0.7591 0.7246 0.7591 0.8712
No log 17.7143 496 0.7592 0.7143 0.7592 0.8713
No log 17.7857 498 0.7673 0.6853 0.7673 0.8760
0.4018 17.8571 500 0.8069 0.7059 0.8069 0.8983
0.4018 17.9286 502 0.8320 0.7097 0.8320 0.9121
0.4018 18.0 504 0.8104 0.7097 0.8104 0.9002
0.4018 18.0714 506 0.7705 0.7097 0.7705 0.8778
0.4018 18.1429 508 0.7758 0.7097 0.7758 0.8808
0.4018 18.2143 510 0.8083 0.7097 0.8083 0.8991
0.4018 18.2857 512 0.7844 0.6806 0.7844 0.8857
0.4018 18.3571 514 0.7435 0.7376 0.7435 0.8623
0.4018 18.4286 516 0.7590 0.7206 0.7590 0.8712
0.4018 18.5 518 0.7671 0.7313 0.7671 0.8758
0.4018 18.5714 520 0.7801 0.7206 0.7801 0.8833
0.4018 18.6429 522 0.7721 0.7338 0.7721 0.8787
0.4018 18.7143 524 0.7506 0.7483 0.7506 0.8664
0.4018 18.7857 526 0.7367 0.7582 0.7367 0.8583
0.4018 18.8571 528 0.7219 0.7792 0.7219 0.8497
0.4018 18.9286 530 0.6951 0.7792 0.6951 0.8338
0.4018 19.0 532 0.6729 0.7568 0.6729 0.8203
0.4018 19.0714 534 0.7002 0.7724 0.7002 0.8368
0.4018 19.1429 536 0.7481 0.7467 0.7481 0.8649
0.4018 19.2143 538 0.7813 0.75 0.7813 0.8839
0.4018 19.2857 540 0.8305 0.7468 0.8305 0.9113
0.4018 19.3571 542 0.9268 0.6957 0.9268 0.9627
0.4018 19.4286 544 0.9991 0.6626 0.9991 0.9996
0.4018 19.5 546 0.9060 0.7037 0.9060 0.9518
0.4018 19.5714 548 0.7656 0.7226 0.7656 0.8750
0.4018 19.6429 550 0.7222 0.7347 0.7222 0.8498
0.4018 19.7143 552 0.7277 0.7234 0.7277 0.8530
0.4018 19.7857 554 0.7365 0.7286 0.7365 0.8582
0.4018 19.8571 556 0.7582 0.7 0.7582 0.8707
0.4018 19.9286 558 0.8122 0.6620 0.8122 0.9012
0.4018 20.0 560 0.8740 0.6286 0.8740 0.9349
0.4018 20.0714 562 0.9350 0.6241 0.9350 0.9670
0.4018 20.1429 564 0.9054 0.6143 0.9054 0.9515

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k6_task1_organization

Finetuned from aubmindlab/bert-base-arabertv02.