ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k7_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7622
  • Qwk: 0.5539
  • Mse: 0.7622
  • Rmse: 0.8730
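Qwk is the quadratic weighted kappa, a standard agreement metric for ordinal labels such as essay-organization scores; RMSE is the square root of the MSE. As a minimal, dependency-free sketch (the label values below are illustrative, not from this model's evaluation set), the two metrics can be computed like this:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights over integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms for the expected (chance) matrix
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic weight
            expected = hist_true[i] * hist_pred[j] / n
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Perfect agreement gives a kappa of 1.0, chance-level agreement 0.0, and systematic disagreement a negative value.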

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0833 2 4.0853 -0.0077 4.0853 2.0212
No log 0.1667 4 2.2430 0.0628 2.2430 1.4977
No log 0.25 6 1.5719 -0.0180 1.5719 1.2537
No log 0.3333 8 1.3303 0.0380 1.3303 1.1534
No log 0.4167 10 1.1478 0.2030 1.1478 1.0714
No log 0.5 12 1.2256 0.0170 1.2256 1.1071
No log 0.5833 14 1.6320 0.0778 1.6320 1.2775
No log 0.6667 16 1.8982 0.1271 1.8982 1.3777
No log 0.75 18 1.9210 0.1152 1.9210 1.3860
No log 0.8333 20 1.3855 0.0226 1.3855 1.1771
No log 0.9167 22 1.0623 0.2366 1.0623 1.0307
No log 1.0 24 1.0424 0.1881 1.0424 1.0210
No log 1.0833 26 1.0667 0.1727 1.0667 1.0328
No log 1.1667 28 1.0645 0.1532 1.0645 1.0317
No log 1.25 30 1.1201 0.2611 1.1201 1.0584
No log 1.3333 32 1.2437 0.0116 1.2437 1.1152
No log 1.4167 34 1.3203 0.0 1.3203 1.1490
No log 1.5 36 1.4404 0.0399 1.4404 1.2002
No log 1.5833 38 1.4465 0.1136 1.4465 1.2027
No log 1.6667 40 1.3855 0.1488 1.3855 1.1771
No log 1.75 42 1.2939 0.1716 1.2939 1.1375
No log 1.8333 44 1.1204 0.2315 1.1204 1.0585
No log 1.9167 46 0.9724 0.3105 0.9724 0.9861
No log 2.0 48 0.9491 0.2967 0.9491 0.9742
No log 2.0833 50 1.0155 0.3348 1.0155 1.0077
No log 2.1667 52 0.9914 0.2885 0.9914 0.9957
No log 2.25 54 1.0741 0.2749 1.0741 1.0364
No log 2.3333 56 1.1407 0.2979 1.1407 1.0680
No log 2.4167 58 0.9429 0.2934 0.9429 0.9710
No log 2.5 60 0.8263 0.4590 0.8263 0.9090
No log 2.5833 62 0.8194 0.4610 0.8194 0.9052
No log 2.6667 64 0.8705 0.4728 0.8705 0.9330
No log 2.75 66 0.8150 0.5212 0.8150 0.9028
No log 2.8333 68 0.8362 0.4336 0.8362 0.9144
No log 2.9167 70 0.9131 0.4940 0.9131 0.9556
No log 3.0 72 0.9625 0.5126 0.9625 0.9811
No log 3.0833 74 0.9512 0.4918 0.9512 0.9753
No log 3.1667 76 0.8965 0.4663 0.8965 0.9468
No log 3.25 78 0.9329 0.3836 0.9329 0.9659
No log 3.3333 80 1.0013 0.3617 1.0013 1.0007
No log 3.4167 82 0.9442 0.3908 0.9442 0.9717
No log 3.5 84 0.8984 0.4460 0.8984 0.9479
No log 3.5833 86 0.9016 0.4613 0.9016 0.9495
No log 3.6667 88 0.9423 0.4159 0.9423 0.9707
No log 3.75 90 1.0267 0.3886 1.0267 1.0133
No log 3.8333 92 0.9698 0.4345 0.9698 0.9848
No log 3.9167 94 0.8560 0.4938 0.8560 0.9252
No log 4.0 96 0.9151 0.4708 0.9151 0.9566
No log 4.0833 98 0.9441 0.3879 0.9441 0.9716
No log 4.1667 100 0.8478 0.4547 0.8478 0.9207
No log 4.25 102 0.9446 0.3786 0.9446 0.9719
No log 4.3333 104 1.1067 0.4267 1.1067 1.0520
No log 4.4167 106 1.0797 0.4008 1.0797 1.0391
No log 4.5 108 0.9223 0.3611 0.9223 0.9604
No log 4.5833 110 0.9641 0.4620 0.9641 0.9819
No log 4.6667 112 1.0922 0.3174 1.0922 1.0451
No log 4.75 114 1.1005 0.3243 1.1005 1.0490
No log 4.8333 116 0.9300 0.4588 0.9300 0.9644
No log 4.9167 118 0.8522 0.4138 0.8522 0.9231
No log 5.0 120 0.9624 0.3694 0.9624 0.9810
No log 5.0833 122 1.0648 0.3694 1.0648 1.0319
No log 5.1667 124 0.9976 0.4483 0.9976 0.9988
No log 5.25 126 0.8762 0.3918 0.8762 0.9361
No log 5.3333 128 0.8758 0.4770 0.8758 0.9359
No log 5.4167 130 0.9541 0.4241 0.9541 0.9768
No log 5.5 132 0.9052 0.4962 0.9052 0.9514
No log 5.5833 134 0.8774 0.4604 0.8774 0.9367
No log 5.6667 136 0.8942 0.5169 0.8942 0.9456
No log 5.75 138 0.8349 0.4829 0.8349 0.9137
No log 5.8333 140 0.7957 0.5204 0.7957 0.8920
No log 5.9167 142 0.7709 0.5303 0.7709 0.8780
No log 6.0 144 0.7794 0.5303 0.7794 0.8829
No log 6.0833 146 0.8016 0.5632 0.8016 0.8953
No log 6.1667 148 0.8651 0.4565 0.8651 0.9301
No log 6.25 150 0.8843 0.4781 0.8843 0.9404
No log 6.3333 152 0.8516 0.4985 0.8516 0.9228
No log 6.4167 154 0.7645 0.5152 0.7645 0.8744
No log 6.5 156 0.7280 0.5503 0.7280 0.8533
No log 6.5833 158 0.7318 0.5060 0.7318 0.8555
No log 6.6667 160 0.7420 0.4608 0.7420 0.8614
No log 6.75 162 0.6924 0.5644 0.6924 0.8321
No log 6.8333 164 0.6865 0.5835 0.6865 0.8286
No log 6.9167 166 0.6807 0.5597 0.6807 0.8250
No log 7.0 168 0.6719 0.5635 0.6719 0.8197
No log 7.0833 170 0.6979 0.5217 0.6979 0.8354
No log 7.1667 172 0.6596 0.5934 0.6596 0.8122
No log 7.25 174 0.6993 0.6035 0.6993 0.8363
No log 7.3333 176 0.6912 0.6063 0.6912 0.8314
No log 7.4167 178 0.6732 0.6154 0.6732 0.8205
No log 7.5 180 0.8391 0.4171 0.8391 0.9160
No log 7.5833 182 0.8757 0.4171 0.8757 0.9358
No log 7.6667 184 0.7196 0.6032 0.7196 0.8483
No log 7.75 186 0.6865 0.5913 0.6865 0.8286
No log 7.8333 188 0.6897 0.5891 0.6897 0.8305
No log 7.9167 190 0.7001 0.6259 0.7001 0.8367
No log 8.0 192 0.8686 0.4096 0.8686 0.9320
No log 8.0833 194 0.9200 0.4258 0.9200 0.9592
No log 8.1667 196 0.7763 0.5828 0.7763 0.8811
No log 8.25 198 0.7211 0.5884 0.7211 0.8492
No log 8.3333 200 0.7310 0.6085 0.7310 0.8550
No log 8.4167 202 0.7920 0.5418 0.7920 0.8900
No log 8.5 204 0.8073 0.5513 0.8073 0.8985
No log 8.5833 206 0.7592 0.5898 0.7592 0.8713
No log 8.6667 208 0.7343 0.5558 0.7343 0.8569
No log 8.75 210 0.7388 0.5433 0.7388 0.8595
No log 8.8333 212 0.7356 0.5433 0.7356 0.8577
No log 8.9167 214 0.6934 0.5577 0.6934 0.8327
No log 9.0 216 0.6781 0.6186 0.6781 0.8235
No log 9.0833 218 0.6733 0.6349 0.6733 0.8205
No log 9.1667 220 0.6725 0.6215 0.6725 0.8201
No log 9.25 222 0.7117 0.6107 0.7117 0.8436
No log 9.3333 224 0.6808 0.5941 0.6808 0.8251
No log 9.4167 226 0.6734 0.6103 0.6734 0.8206
No log 9.5 228 0.6726 0.5931 0.6726 0.8201
No log 9.5833 230 0.6678 0.5945 0.6678 0.8172
No log 9.6667 232 0.6853 0.6121 0.6853 0.8278
No log 9.75 234 0.7715 0.5852 0.7715 0.8783
No log 9.8333 236 0.7537 0.6148 0.7537 0.8682
No log 9.9167 238 0.6959 0.6005 0.6959 0.8342
No log 10.0 240 0.7334 0.6035 0.7334 0.8564
No log 10.0833 242 0.8333 0.4834 0.8333 0.9129
No log 10.1667 244 0.8986 0.4829 0.8986 0.9479
No log 10.25 246 0.8335 0.5141 0.8335 0.9130
No log 10.3333 248 0.7746 0.5954 0.7746 0.8801
No log 10.4167 250 0.7535 0.5635 0.7535 0.8681
No log 10.5 252 0.7431 0.5050 0.7431 0.8620
No log 10.5833 254 0.7593 0.5074 0.7593 0.8714
No log 10.6667 256 0.7837 0.5152 0.7837 0.8852
No log 10.75 258 0.7827 0.5436 0.7827 0.8847
No log 10.8333 260 0.7961 0.5253 0.7961 0.8923
No log 10.9167 262 0.7878 0.5649 0.7878 0.8876
No log 11.0 264 0.7828 0.6019 0.7828 0.8848
No log 11.0833 266 0.7931 0.5667 0.7931 0.8906
No log 11.1667 268 0.7996 0.5695 0.7996 0.8942
No log 11.25 270 0.7641 0.5158 0.7641 0.8741
No log 11.3333 272 0.7485 0.4770 0.7485 0.8652
No log 11.4167 274 0.7641 0.4981 0.7641 0.8741
No log 11.5 276 0.7448 0.4981 0.7448 0.8630
No log 11.5833 278 0.7231 0.4770 0.7231 0.8503
No log 11.6667 280 0.7323 0.5671 0.7323 0.8557
No log 11.75 282 0.7517 0.5673 0.7517 0.8670
No log 11.8333 284 0.7505 0.6186 0.7505 0.8663
No log 11.9167 286 0.7365 0.5773 0.7365 0.8582
No log 12.0 288 0.7338 0.5443 0.7338 0.8566
No log 12.0833 290 0.7389 0.5450 0.7389 0.8596
No log 12.1667 292 0.7174 0.5343 0.7174 0.8470
No log 12.25 294 0.7088 0.5796 0.7088 0.8419
No log 12.3333 296 0.7220 0.5909 0.7220 0.8497
No log 12.4167 298 0.7307 0.5898 0.7307 0.8548
No log 12.5 300 0.7433 0.5969 0.7433 0.8622
No log 12.5833 302 0.7356 0.6104 0.7356 0.8577
No log 12.6667 304 0.7396 0.5969 0.7396 0.8600
No log 12.75 306 0.7357 0.5985 0.7357 0.8578
No log 12.8333 308 0.7284 0.5690 0.7284 0.8535
No log 12.9167 310 0.7321 0.5343 0.7321 0.8556
No log 13.0 312 0.7438 0.5690 0.7438 0.8624
No log 13.0833 314 0.7559 0.5651 0.7559 0.8695
No log 13.1667 316 0.7694 0.6051 0.7694 0.8772
No log 13.25 318 0.7502 0.6051 0.7502 0.8661
No log 13.3333 320 0.7252 0.5902 0.7252 0.8516
No log 13.4167 322 0.7119 0.5690 0.7119 0.8438
No log 13.5 324 0.7006 0.5690 0.7006 0.8370
No log 13.5833 326 0.6873 0.5809 0.6873 0.8291
No log 13.6667 328 0.6969 0.5797 0.6969 0.8348
No log 13.75 330 0.6944 0.5797 0.6944 0.8333
No log 13.8333 332 0.6990 0.5891 0.6990 0.8361
No log 13.9167 334 0.7050 0.5902 0.7050 0.8396
No log 14.0 336 0.7375 0.5678 0.7375 0.8588
No log 14.0833 338 0.7971 0.5505 0.7971 0.8928
No log 14.1667 340 0.8010 0.5317 0.8010 0.8950
No log 14.25 342 0.7778 0.5463 0.7778 0.8819
No log 14.3333 344 0.7781 0.5463 0.7781 0.8821
No log 14.4167 346 0.7784 0.5463 0.7784 0.8823
No log 14.5 348 0.8009 0.5657 0.8009 0.8949
No log 14.5833 350 0.7993 0.5678 0.7993 0.8940
No log 14.6667 352 0.7951 0.5680 0.7951 0.8917
No log 14.75 354 0.8250 0.5400 0.8250 0.9083
No log 14.8333 356 0.8427 0.5311 0.8427 0.9180
No log 14.9167 358 0.7902 0.5989 0.7902 0.8889
No log 15.0 360 0.7537 0.5329 0.7537 0.8681
No log 15.0833 362 0.8077 0.5473 0.8077 0.8987
No log 15.1667 364 0.9110 0.5328 0.9110 0.9544
No log 15.25 366 0.9138 0.5328 0.9138 0.9559
No log 15.3333 368 0.8212 0.5378 0.8212 0.9062
No log 15.4167 370 0.7482 0.5797 0.7482 0.8650
No log 15.5 372 0.7484 0.5587 0.7484 0.8651
No log 15.5833 374 0.7724 0.6005 0.7724 0.8789
No log 15.6667 376 0.7732 0.5443 0.7732 0.8793
No log 15.75 378 0.7882 0.5416 0.7882 0.8878
No log 15.8333 380 0.8398 0.5374 0.8398 0.9164
No log 15.9167 382 0.9163 0.5023 0.9163 0.9572
No log 16.0 384 0.9213 0.4820 0.9213 0.9599
No log 16.0833 386 0.8622 0.5365 0.8622 0.9285
No log 16.1667 388 0.8105 0.5469 0.8105 0.9003
No log 16.25 390 0.7501 0.6044 0.7501 0.8661
No log 16.3333 392 0.7570 0.5806 0.7570 0.8700
No log 16.4167 394 0.7549 0.5895 0.7549 0.8688
No log 16.5 396 0.7360 0.5921 0.7360 0.8579
No log 16.5833 398 0.7337 0.5102 0.7337 0.8565
No log 16.6667 400 0.7356 0.5111 0.7356 0.8577
No log 16.75 402 0.7372 0.5224 0.7372 0.8586
No log 16.8333 404 0.7414 0.5224 0.7414 0.8610
No log 16.9167 406 0.7495 0.5557 0.7495 0.8657
No log 17.0 408 0.7733 0.6166 0.7733 0.8794
No log 17.0833 410 0.8241 0.6184 0.8241 0.9078
No log 17.1667 412 0.7946 0.6525 0.7946 0.8914
No log 17.25 414 0.7465 0.5815 0.7465 0.8640
No log 17.3333 416 0.7258 0.5528 0.7258 0.8520
No log 17.4167 418 0.7087 0.5528 0.7087 0.8418
No log 17.5 420 0.6910 0.5797 0.6910 0.8312
No log 17.5833 422 0.7017 0.5797 0.7017 0.8377
No log 17.6667 424 0.7104 0.5797 0.7104 0.8428
No log 17.75 426 0.7115 0.5917 0.7115 0.8435
No log 17.8333 428 0.7106 0.5797 0.7106 0.8430
No log 17.9167 430 0.7216 0.4982 0.7216 0.8495
No log 18.0 432 0.7690 0.5137 0.7690 0.8769
No log 18.0833 434 0.8112 0.5344 0.8112 0.9007
No log 18.1667 436 0.7932 0.4926 0.7932 0.8906
No log 18.25 438 0.7587 0.5333 0.7587 0.8710
No log 18.3333 440 0.7298 0.5577 0.7298 0.8543
No log 18.4167 442 0.7189 0.5797 0.7189 0.8479
No log 18.5 444 0.7157 0.5552 0.7157 0.8460
No log 18.5833 446 0.7193 0.5545 0.7193 0.8481
No log 18.6667 448 0.7504 0.5536 0.7504 0.8662
No log 18.75 450 0.7739 0.5314 0.7739 0.8797
No log 18.8333 452 0.7579 0.5044 0.7579 0.8706
No log 18.9167 454 0.7518 0.4976 0.7518 0.8671
No log 19.0 456 0.7668 0.4737 0.7668 0.8757
No log 19.0833 458 0.7495 0.4867 0.7495 0.8657
No log 19.1667 460 0.7126 0.5990 0.7126 0.8442
No log 19.25 462 0.7239 0.5898 0.7239 0.8508
No log 19.3333 464 0.7228 0.5796 0.7228 0.8502
No log 19.4167 466 0.7127 0.5582 0.7127 0.8442
No log 19.5 468 0.7484 0.5774 0.7484 0.8651
No log 19.5833 470 0.8203 0.4723 0.8203 0.9057
No log 19.6667 472 0.9005 0.5417 0.9005 0.9490
No log 19.75 474 0.8864 0.5265 0.8864 0.9415
No log 19.8333 476 0.8329 0.5170 0.8329 0.9126
No log 19.9167 478 0.7927 0.4843 0.7927 0.8903
No log 20.0 480 0.7471 0.5678 0.7471 0.8643
No log 20.0833 482 0.7126 0.5354 0.7126 0.8442
No log 20.1667 484 0.7032 0.5466 0.7032 0.8386
No log 20.25 486 0.7084 0.5545 0.7084 0.8417
No log 20.3333 488 0.7104 0.5742 0.7104 0.8428
No log 20.4167 490 0.7439 0.5517 0.7439 0.8625
No log 20.5 492 0.7760 0.4962 0.7759 0.8809
No log 20.5833 494 0.7778 0.5654 0.7778 0.8819
No log 20.6667 496 0.7554 0.4982 0.7554 0.8692
No log 20.75 498 0.7362 0.4871 0.7362 0.8580
0.2749 20.8333 500 0.7275 0.5752 0.7275 0.8530
0.2749 20.9167 502 0.7914 0.5466 0.7914 0.8896
0.2749 21.0 504 0.8523 0.5386 0.8523 0.9232
0.2749 21.0833 506 0.8618 0.5386 0.8618 0.9283
0.2749 21.1667 508 0.8372 0.4593 0.8372 0.9150
0.2749 21.25 510 0.7890 0.5838 0.7890 0.8883
0.2749 21.3333 512 0.7610 0.4876 0.7610 0.8723
0.2749 21.4167 514 0.7595 0.5203 0.7595 0.8715
0.2749 21.5 516 0.7511 0.5203 0.7511 0.8667
0.2749 21.5833 518 0.7495 0.5528 0.7495 0.8658
0.2749 21.6667 520 0.7622 0.5539 0.7622 0.8730
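"No log" in the Training Loss column means the training loss was only recorded at the logging interval; the first logged value (0.2749) appears at step 500. Note also that the final checkpoint's Qwk (0.5539) is well below the best validation Qwk reached earlier in training (0.6525 at step 412), so selecting a checkpoint by validation Qwk rather than taking the last one may be preferable. A short sketch of that selection, using a few rows transcribed from the table above:

```python
# (epoch, step, validation_loss, qwk) rows transcribed from the log above
rows = [
    (7.4167, 178, 0.6732, 0.6154),
    (9.0833, 218, 0.6733, 0.6349),
    (17.1667, 412, 0.7946, 0.6525),
    (21.6667, 520, 0.7622, 0.5539),  # final (reported) checkpoint
]
# Pick the row with the highest validation Qwk
best = max(rows, key=lambda r: r[3])
print(best)  # the step-412 row
```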

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree

Fine-tuned from aubmindlab/bert-base-arabertv02 as MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k7_task5_organization.