ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9033
  • Qwk: 0.3020
  • Mse: 0.9033
  • Rmse: 0.9504
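
Here Qwk is the quadratic weighted kappa between predicted and gold organization scores, and Rmse is the square root of Mse. A minimal sketch of how these metrics can be reproduced, assuming scikit-learn and rounded regression outputs (the arrays below are placeholders, not data from this run):

```python
# Sketch: computing QWK, MSE, and RMSE for ordinal score predictions.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 2])            # placeholder reference scores
y_pred = np.array([2.2, 2.8, 1.4, 3.6, 2.1])  # placeholder model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# QWK needs discrete labels, so round the regression outputs first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"MSE={mse:.4f} RMSE={rmse:.4f} QWK={qwk:.4f}")
```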

Model description

More information needed

Intended uses & limitations

More information needed
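
Pending more details, a minimal inference sketch is given below. The repo id is the one this card is published under on the Hub; treating the head as a single-logit regression head is an assumption inferred from the MSE/RMSE metrics above, not confirmed by the card.

```python
# Sketch: loading the checkpoint for inference (regression head assumed).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

text = "..."  # an Arabic essay to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```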

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
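
In Trainer terms, these settings correspond roughly to the TrainingArguments below (a sketch: the output directory is a hypothetical placeholder, and Adam with the listed betas and epsilon is the Trainer default optimizer, so it needs no explicit argument):

```python
# Sketch: the reported hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task2-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```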

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0541 2 4.6028 -0.0020 4.6028 2.1454
No log 0.1081 4 2.6751 -0.0233 2.6751 1.6356
No log 0.1622 6 2.0758 0.0790 2.0758 1.4408
No log 0.2162 8 1.5391 0.0372 1.5391 1.2406
No log 0.2703 10 1.4275 0.0372 1.4275 1.1948
No log 0.3243 12 1.5702 0.0724 1.5702 1.2531
No log 0.3784 14 1.8326 0.0724 1.8326 1.3537
No log 0.4324 16 1.6497 0.0724 1.6497 1.2844
No log 0.4865 18 1.3910 -0.0066 1.3910 1.1794
No log 0.5405 20 1.3889 0.1195 1.3889 1.1785
No log 0.5946 22 1.6540 0.0889 1.6540 1.2861
No log 0.6486 24 2.3634 0.1592 2.3634 1.5373
No log 0.7027 26 2.2501 0.1243 2.2501 1.5000
No log 0.7568 28 1.7910 0.1459 1.7910 1.3383
No log 0.8108 30 1.4742 0.0682 1.4742 1.2142
No log 0.8649 32 1.2948 0.1974 1.2948 1.1379
No log 0.9189 34 1.2107 0.2402 1.2107 1.1003
No log 0.9730 36 1.2730 0.1882 1.2730 1.1283
No log 1.0270 38 1.4683 0.0455 1.4683 1.2117
No log 1.0811 40 1.5491 0.0145 1.5491 1.2446
No log 1.1351 42 1.4829 0.0145 1.4829 1.2177
No log 1.1892 44 1.3035 0.1537 1.3035 1.1417
No log 1.2432 46 1.2738 0.1899 1.2738 1.1286
No log 1.2973 48 1.3391 0.1986 1.3391 1.1572
No log 1.3514 50 1.3376 0.1449 1.3376 1.1565
No log 1.4054 52 1.6990 0.1851 1.6990 1.3035
No log 1.4595 54 2.2019 0.2157 2.2019 1.4839
No log 1.5135 56 2.1916 0.2095 2.1916 1.4804
No log 1.5676 58 1.6614 0.1922 1.6614 1.2890
No log 1.6216 60 1.0673 0.3635 1.0673 1.0331
No log 1.6757 62 0.9685 0.3678 0.9685 0.9841
No log 1.7297 64 0.9444 0.3510 0.9444 0.9718
No log 1.7838 66 1.2733 0.2115 1.2733 1.1284
No log 1.8378 68 1.9495 0.2528 1.9495 1.3963
No log 1.8919 70 2.1388 0.1604 2.1388 1.4625
No log 1.9459 72 2.1601 0.1831 2.1601 1.4697
No log 2.0 74 1.8944 0.2472 1.8944 1.3764
No log 2.0541 76 1.6351 0.2756 1.6351 1.2787
No log 2.1081 78 1.2950 0.2037 1.2950 1.1380
No log 2.1622 80 1.2128 0.2706 1.2128 1.1013
No log 2.2162 82 1.1561 0.2791 1.1561 1.0752
No log 2.2703 84 1.1540 0.2589 1.1540 1.0742
No log 2.3243 86 1.0464 0.3380 1.0464 1.0229
No log 2.3784 88 0.9191 0.3925 0.9191 0.9587
No log 2.4324 90 0.8130 0.5287 0.8130 0.9017
No log 2.4865 92 0.7532 0.5093 0.7532 0.8679
No log 2.5405 94 0.7720 0.4852 0.7720 0.8786
No log 2.5946 96 0.7224 0.5606 0.7224 0.8499
No log 2.6486 98 0.7477 0.5025 0.7477 0.8647
No log 2.7027 100 0.8155 0.5174 0.8155 0.9030
No log 2.7568 102 0.8051 0.5339 0.8051 0.8973
No log 2.8108 104 0.7461 0.5522 0.7461 0.8638
No log 2.8649 106 0.8902 0.5774 0.8902 0.9435
No log 2.9189 108 0.9795 0.5868 0.9795 0.9897
No log 2.9730 110 1.1695 0.4433 1.1695 1.0815
No log 3.0270 112 0.8327 0.6026 0.8327 0.9125
No log 3.0811 114 0.7171 0.6609 0.7171 0.8468
No log 3.1351 116 0.8080 0.5107 0.8080 0.8989
No log 3.1892 118 0.6849 0.5902 0.6849 0.8276
No log 3.2432 120 1.0033 0.5233 1.0033 1.0017
No log 3.2973 122 1.1141 0.4879 1.1141 1.0555
No log 3.3514 124 1.2039 0.3806 1.2039 1.0972
No log 3.4054 126 0.8564 0.5630 0.8564 0.9254
No log 3.4595 128 0.7464 0.4261 0.7464 0.8640
No log 3.5135 130 0.7894 0.4908 0.7894 0.8885
No log 3.5676 132 0.7811 0.4116 0.7811 0.8838
No log 3.6216 134 0.8928 0.4116 0.8928 0.9449
No log 3.6757 136 0.9250 0.4116 0.9250 0.9618
No log 3.7297 138 0.8454 0.3646 0.8454 0.9195
No log 3.7838 140 0.8095 0.4591 0.8095 0.8997
No log 3.8378 142 0.8165 0.4591 0.8165 0.9036
No log 3.8919 144 0.8330 0.4794 0.8330 0.9127
No log 3.9459 146 0.8834 0.4341 0.8834 0.9399
No log 4.0 148 0.8568 0.4500 0.8568 0.9257
No log 4.0541 150 0.8928 0.4212 0.8928 0.9449
No log 4.1081 152 0.8832 0.5069 0.8832 0.9398
No log 4.1622 154 0.8392 0.4699 0.8392 0.9161
No log 4.2162 156 0.8687 0.4337 0.8687 0.9320
No log 4.2703 158 0.8590 0.4104 0.8590 0.9268
No log 4.3243 160 0.8961 0.3777 0.8961 0.9466
No log 4.3784 162 0.9056 0.3505 0.9056 0.9516
No log 4.4324 164 0.8312 0.3796 0.8312 0.9117
No log 4.4865 166 0.8649 0.3700 0.8649 0.9300
No log 4.5405 168 1.0101 0.3459 1.0101 1.0050
No log 4.5946 170 0.9576 0.3721 0.9576 0.9786
No log 4.6486 172 0.7944 0.3728 0.7944 0.8913
No log 4.7027 174 1.0409 0.4748 1.0409 1.0202
No log 4.7568 176 1.0072 0.4612 1.0072 1.0036
No log 4.8108 178 0.8086 0.4848 0.8086 0.8992
No log 4.8649 180 0.8427 0.4291 0.8427 0.9180
No log 4.9189 182 0.8566 0.4378 0.8566 0.9255
No log 4.9730 184 0.7938 0.5517 0.7938 0.8910
No log 5.0270 186 0.7685 0.5245 0.7685 0.8766
No log 5.0811 188 0.8573 0.4755 0.8573 0.9259
No log 5.1351 190 0.7581 0.5072 0.7581 0.8707
No log 5.1892 192 0.8357 0.4700 0.8357 0.9142
No log 5.2432 194 1.1849 0.4266 1.1849 1.0885
No log 5.2973 196 1.1904 0.4154 1.1904 1.0911
No log 5.3514 198 0.9183 0.4572 0.9183 0.9583
No log 5.4054 200 0.7431 0.5244 0.7431 0.8620
No log 5.4595 202 0.7646 0.4828 0.7646 0.8744
No log 5.5135 204 0.7860 0.4752 0.7860 0.8866
No log 5.5676 206 0.7914 0.4598 0.7914 0.8896
No log 5.6216 208 0.7834 0.4465 0.7834 0.8851
No log 5.6757 210 0.8394 0.5007 0.8394 0.9162
No log 5.7297 212 0.8177 0.4500 0.8177 0.9043
No log 5.7838 214 0.7840 0.4860 0.7840 0.8854
No log 5.8378 216 0.8368 0.4765 0.8368 0.9148
No log 5.8919 218 0.8499 0.4402 0.8499 0.9219
No log 5.9459 220 0.7921 0.4498 0.7921 0.8900
No log 6.0 222 0.7950 0.4498 0.7950 0.8916
No log 6.0541 224 0.7943 0.4119 0.7943 0.8912
No log 6.1081 226 0.8154 0.4476 0.8154 0.9030
No log 6.1622 228 0.8191 0.4700 0.8191 0.9051
No log 6.2162 230 0.7916 0.4813 0.7916 0.8897
No log 6.2703 232 0.7876 0.4860 0.7876 0.8874
No log 6.3243 234 0.7882 0.5351 0.7882 0.8878
No log 6.3784 236 0.7844 0.5552 0.7844 0.8857
No log 6.4324 238 0.7797 0.5479 0.7797 0.8830
No log 6.4865 240 0.7927 0.5173 0.7927 0.8903
No log 6.5405 242 0.7776 0.5450 0.7776 0.8818
No log 6.5946 244 0.7626 0.5205 0.7626 0.8733
No log 6.6486 246 0.7756 0.5740 0.7756 0.8807
No log 6.7027 248 0.7504 0.4993 0.7504 0.8663
No log 6.7568 250 0.7513 0.5025 0.7513 0.8668
No log 6.8108 252 0.7641 0.5393 0.7641 0.8741
No log 6.8649 254 0.7594 0.4847 0.7594 0.8715
No log 6.9189 256 0.7654 0.4476 0.7654 0.8749
No log 6.9730 258 0.7525 0.4912 0.7525 0.8675
No log 7.0270 260 0.8084 0.5339 0.8084 0.8991
No log 7.0811 262 0.8391 0.4654 0.8391 0.9160
No log 7.1351 264 0.7848 0.4471 0.7848 0.8859
No log 7.1892 266 0.7718 0.4242 0.7718 0.8785
No log 7.2432 268 0.8461 0.3915 0.8461 0.9199
No log 7.2973 270 0.8177 0.3854 0.8177 0.9043
No log 7.3514 272 0.7742 0.4012 0.7742 0.8799
No log 7.4054 274 0.7591 0.4260 0.7591 0.8713
No log 7.4595 276 0.7633 0.5080 0.7633 0.8736
No log 7.5135 278 0.7431 0.4931 0.7431 0.8620
No log 7.5676 280 0.7582 0.4799 0.7582 0.8707
No log 7.6216 282 0.7220 0.5274 0.7220 0.8497
No log 7.6757 284 0.7292 0.4977 0.7292 0.8539
No log 7.7297 286 0.7801 0.5524 0.7801 0.8832
No log 7.7838 288 0.7848 0.4632 0.7848 0.8859
No log 7.8378 290 0.7627 0.4661 0.7627 0.8733
No log 7.8919 292 0.7516 0.5076 0.7516 0.8670
No log 7.9459 294 0.7410 0.5312 0.7410 0.8608
No log 8.0 296 0.7681 0.5763 0.7681 0.8764
No log 8.0541 298 0.8082 0.5576 0.8082 0.8990
No log 8.1081 300 0.7734 0.4696 0.7734 0.8794
No log 8.1622 302 0.8226 0.4343 0.8226 0.9070
No log 8.2162 304 0.9171 0.3989 0.9171 0.9577
No log 8.2703 306 0.8870 0.4028 0.8870 0.9418
No log 8.3243 308 0.8213 0.3822 0.8213 0.9063
No log 8.3784 310 0.8833 0.5085 0.8833 0.9399
No log 8.4324 312 0.9929 0.4988 0.9929 0.9964
No log 8.4865 314 0.9426 0.5161 0.9426 0.9709
No log 8.5405 316 0.8173 0.5267 0.8173 0.9040
No log 8.5946 318 0.7806 0.5450 0.7806 0.8835
No log 8.6486 320 0.7921 0.5270 0.7921 0.8900
No log 8.7027 322 0.7915 0.5270 0.7915 0.8896
No log 8.7568 324 0.7764 0.4563 0.7764 0.8812
No log 8.8108 326 0.7749 0.4236 0.7749 0.8803
No log 8.8649 328 0.7643 0.4197 0.7643 0.8742
No log 8.9189 330 0.7700 0.4284 0.7700 0.8775
No log 8.9730 332 0.7965 0.5362 0.7965 0.8925
No log 9.0270 334 0.8377 0.4261 0.8377 0.9153
No log 9.0811 336 0.7710 0.5292 0.7710 0.8780
No log 9.1351 338 0.7483 0.4652 0.7483 0.8651
No log 9.1892 340 0.7664 0.3756 0.7664 0.8754
No log 9.2432 342 0.7789 0.3762 0.7789 0.8826
No log 9.2973 344 0.8445 0.3961 0.8445 0.9190
No log 9.3514 346 0.9036 0.3250 0.9036 0.9506
No log 9.4054 348 0.8544 0.3278 0.8544 0.9244
No log 9.4595 350 0.8442 0.3704 0.8442 0.9188
No log 9.5135 352 0.9243 0.4191 0.9243 0.9614
No log 9.5676 354 0.8840 0.4250 0.8840 0.9402
No log 9.6216 356 0.7861 0.3660 0.7861 0.8866
No log 9.6757 358 0.8204 0.5362 0.8204 0.9057
No log 9.7297 360 0.8507 0.4998 0.8507 0.9223
No log 9.7838 362 0.7948 0.5292 0.7948 0.8915
No log 9.8378 364 0.7770 0.3762 0.7770 0.8815
No log 9.8919 366 0.8249 0.4513 0.8249 0.9082
No log 9.9459 368 0.8383 0.4513 0.8383 0.9156
No log 10.0 370 0.8166 0.3762 0.8166 0.9036
No log 10.0541 372 0.8455 0.3840 0.8455 0.9195
No log 10.1081 374 0.8567 0.3840 0.8567 0.9256
No log 10.1622 376 0.8186 0.3719 0.8186 0.9048
No log 10.2162 378 0.8268 0.4513 0.8268 0.9093
No log 10.2703 380 0.8109 0.4513 0.8109 0.9005
No log 10.3243 382 0.7641 0.4806 0.7641 0.8742
No log 10.3784 384 0.7890 0.5421 0.7890 0.8883
No log 10.4324 386 0.8289 0.5029 0.8289 0.9104
No log 10.4865 388 0.8062 0.4948 0.8062 0.8979
No log 10.5405 390 0.7759 0.4158 0.7759 0.8809
No log 10.5946 392 0.8406 0.4943 0.8406 0.9169
No log 10.6486 394 0.8852 0.4638 0.8852 0.9409
No log 10.7027 396 0.8488 0.4413 0.8488 0.9213
No log 10.7568 398 0.8421 0.3615 0.8421 0.9177
No log 10.8108 400 0.8899 0.3431 0.8899 0.9433
No log 10.8649 402 0.9522 0.3124 0.9522 0.9758
No log 10.9189 404 1.0084 0.2791 1.0084 1.0042
No log 10.9730 406 0.9823 0.3202 0.9823 0.9911
No log 11.0270 408 0.9799 0.3202 0.9799 0.9899
No log 11.0811 410 0.8841 0.3551 0.8841 0.9403
No log 11.1351 412 0.7958 0.3629 0.7958 0.8921
No log 11.1892 414 0.7645 0.3974 0.7645 0.8744
No log 11.2432 416 0.7477 0.4019 0.7477 0.8647
No log 11.2973 418 0.7680 0.5329 0.7680 0.8764
No log 11.3514 420 0.8031 0.5595 0.8031 0.8961
No log 11.4054 422 0.8617 0.4666 0.8617 0.9283
No log 11.4595 424 0.8468 0.3896 0.8468 0.9202
No log 11.5135 426 0.8003 0.4119 0.8003 0.8946
No log 11.5676 428 0.7937 0.4507 0.7937 0.8909
No log 11.6216 430 0.7955 0.4507 0.7955 0.8919
No log 11.6757 432 0.7851 0.4297 0.7851 0.8861
No log 11.7297 434 0.8133 0.4100 0.8133 0.9018
No log 11.7838 436 0.8809 0.5210 0.8809 0.9385
No log 11.8378 438 0.8972 0.4743 0.8972 0.9472
No log 11.8919 440 0.8461 0.5250 0.8461 0.9199
No log 11.9459 442 0.8132 0.5045 0.8132 0.9018
No log 12.0 444 0.7955 0.4583 0.7955 0.8919
No log 12.0541 446 0.8178 0.3879 0.8178 0.9043
No log 12.1081 448 0.8960 0.3478 0.8960 0.9466
No log 12.1622 450 0.9178 0.3478 0.9178 0.9580
No log 12.2162 452 0.8559 0.3374 0.8559 0.9251
No log 12.2703 454 0.7989 0.4334 0.7989 0.8938
No log 12.3243 456 0.8399 0.5068 0.8399 0.9165
No log 12.3784 458 0.9033 0.4989 0.9033 0.9504
No log 12.4324 460 0.9318 0.4796 0.9318 0.9653
No log 12.4865 462 0.9185 0.4755 0.9185 0.9584
No log 12.5405 464 0.8382 0.4542 0.8382 0.9155
No log 12.5946 466 0.8194 0.4321 0.8194 0.9052
No log 12.6486 468 0.8364 0.4519 0.8364 0.9145
No log 12.7027 470 0.8289 0.4423 0.8289 0.9105
No log 12.7568 472 0.8448 0.4549 0.8448 0.9191
No log 12.8108 474 0.9849 0.4419 0.9849 0.9924
No log 12.8649 476 1.2560 0.3200 1.2560 1.1207
No log 12.9189 478 1.3477 0.3609 1.3477 1.1609
No log 12.9730 480 1.2332 0.3058 1.2332 1.1105
No log 13.0270 482 1.1676 0.3848 1.1676 1.0805
No log 13.0811 484 0.9796 0.4717 0.9796 0.9897
No log 13.1351 486 0.8128 0.4197 0.8128 0.9015
No log 13.1892 488 0.8353 0.3641 0.8353 0.9139
No log 13.2432 490 0.8813 0.3700 0.8813 0.9388
No log 13.2973 492 0.8629 0.3374 0.8629 0.9289
No log 13.3514 494 0.8455 0.3663 0.8455 0.9195
No log 13.4054 496 0.8870 0.4175 0.8870 0.9418
No log 13.4595 498 0.9340 0.4115 0.9340 0.9664
0.3156 13.5135 500 0.9251 0.4115 0.9251 0.9618
0.3156 13.5676 502 0.8848 0.3832 0.8848 0.9406
0.3156 13.6216 504 0.8565 0.3866 0.8565 0.9255
0.3156 13.6757 506 0.8670 0.3734 0.8670 0.9311
0.3156 13.7297 508 0.8871 0.3174 0.8871 0.9418
0.3156 13.7838 510 0.9033 0.3020 0.9033 0.9504

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1