ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k15_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7667
  • Qwk: 0.4359
  • Mse: 0.7667
  • Rmse: 0.8756
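The evaluation metrics above can be recomputed from model predictions and gold labels. A minimal pure-Python sketch (the integer rating range and function names are illustrative assumptions, not taken from the training code):

```python
import math

def quadratic_weighted_kappa(rater_a, rater_b, min_rating, max_rating):
    """Cohen's kappa with quadratic weights (QWK) for integer ratings."""
    n = max_rating - min_rating + 1
    total = len(rater_a)
    # Observed confusion matrix between the two rating sequences
    observed = [[0] * n for _ in range(n)]
    for a, b in zip(rater_a, rater_b):
        observed[a - min_rating][b - min_rating] += 1
    # Marginal histograms -> expected counts under independence
    hist_a = [sum(row) for row in observed]
    hist_b = [sum(col) for col in zip(*observed)]
    numerator = denominator = 0.0
    for i in range(n):
        for j in range(n):
            weight = (i - j) ** 2 / (n - 1) ** 2  # quadratic penalty
            expected = hist_a[i] * hist_b[j] / total
            numerator += weight * observed[i][j]
            denominator += weight * expected
    return 1.0 - numerator / denominator

def mse(preds, labels):
    return sum((p - l) ** 2 for p, l in zip(preds, labels)) / len(preds)

def rmse(preds, labels):
    return math.sqrt(mse(preds, labels))
```

Note that Rmse is simply the square root of Mse, which is why the reported pair 0.7667 / 0.8756 is internally consistent.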

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
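The hyperparameters above map directly onto a transformers TrainingArguments object. A sketch under that assumption (the output path is a placeholder, and the original training script is not available, so this only mirrors the listed values):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; the Adam betas/epsilon shown
# are also the transformers defaults, matching the card.
training_args = TrainingArguments(
    output_dir="./results",          # placeholder path, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```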

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
("No log" means no training loss had been recorded yet at that step; the first logged training loss appears at step 500.)
No log 0.0435 2 4.6091 0.0010 4.6091 2.1469
No log 0.0870 4 2.5046 -0.0076 2.5046 1.5826
No log 0.1304 6 1.6178 0.0372 1.6178 1.2719
No log 0.1739 8 1.2649 0.2319 1.2649 1.1247
No log 0.2174 10 1.2956 -0.0783 1.2956 1.1383
No log 0.2609 12 1.4054 -0.0661 1.4054 1.1855
No log 0.3043 14 1.6061 -0.0757 1.6061 1.2673
No log 0.3478 16 2.3200 0.0638 2.3200 1.5231
No log 0.3913 18 2.1449 0.0405 2.1449 1.4645
No log 0.4348 20 1.5844 0.0169 1.5844 1.2587
No log 0.4783 22 1.3037 0.0926 1.3037 1.1418
No log 0.5217 24 1.3202 0.0408 1.3202 1.1490
No log 0.5652 26 1.2212 0.2314 1.2212 1.1051
No log 0.6087 28 1.2069 0.1144 1.2069 1.0986
No log 0.6522 30 1.2332 0.1211 1.2332 1.1105
No log 0.6957 32 1.2574 -0.0011 1.2574 1.1214
No log 0.7391 34 1.2789 0.1171 1.2789 1.1309
No log 0.7826 36 1.2205 0.0460 1.2205 1.1048
No log 0.8261 38 1.2457 0.0075 1.2457 1.1161
No log 0.8696 40 1.2141 0.0153 1.2141 1.1019
No log 0.9130 42 1.1329 0.3462 1.1329 1.0644
No log 0.9565 44 1.2337 0.1045 1.2337 1.1107
No log 1.0 46 1.1986 0.1959 1.1986 1.0948
No log 1.0435 48 1.1030 0.4157 1.1030 1.0502
No log 1.0870 50 1.0740 0.4157 1.0740 1.0363
No log 1.1304 52 1.1056 0.2786 1.1056 1.0515
No log 1.1739 54 1.0958 0.3839 1.0958 1.0468
No log 1.2174 56 0.9699 0.3744 0.9699 0.9848
No log 1.2609 58 1.1867 0.2799 1.1867 1.0894
No log 1.3043 60 1.5525 -0.0541 1.5525 1.2460
No log 1.3478 62 1.4384 0.0077 1.4384 1.1993
No log 1.3913 64 1.1896 0.0527 1.1896 1.0907
No log 1.4348 66 0.9832 0.1752 0.9832 0.9915
No log 1.4783 68 0.9675 0.4990 0.9675 0.9836
No log 1.5217 70 1.0623 0.3276 1.0623 1.0307
No log 1.5652 72 1.1095 0.3480 1.1095 1.0533
No log 1.6087 74 0.9404 0.4527 0.9404 0.9697
No log 1.6522 76 0.8744 0.3583 0.8744 0.9351
No log 1.6957 78 0.9648 0.2175 0.9648 0.9823
No log 1.7391 80 0.9420 0.3436 0.9420 0.9706
No log 1.7826 82 0.8096 0.4979 0.8096 0.8998
No log 1.8261 84 0.8418 0.5197 0.8418 0.9175
No log 1.8696 86 0.8206 0.5197 0.8206 0.9058
No log 1.9130 88 0.8109 0.4512 0.8109 0.9005
No log 1.9565 90 0.9972 0.3816 0.9972 0.9986
No log 2.0 92 1.0702 0.2857 1.0702 1.0345
No log 2.0435 94 0.8957 0.3493 0.8957 0.9464
No log 2.0870 96 0.8297 0.5301 0.8297 0.9109
No log 2.1304 98 0.8537 0.5437 0.8537 0.9239
No log 2.1739 100 0.8315 0.5515 0.8315 0.9119
No log 2.2174 102 0.7900 0.4563 0.7900 0.8888
No log 2.2609 104 0.7609 0.4563 0.7609 0.8723
No log 2.3043 106 0.8190 0.4202 0.8190 0.9050
No log 2.3478 108 0.9599 0.4222 0.9599 0.9797
No log 2.3913 110 1.0603 0.4060 1.0603 1.0297
No log 2.4348 112 0.8214 0.5264 0.8214 0.9063
No log 2.4783 114 0.7369 0.5958 0.7369 0.8584
No log 2.5217 116 0.7552 0.5992 0.7552 0.8690
No log 2.5652 118 0.9156 0.5094 0.9156 0.9569
No log 2.6087 120 1.3982 0.3634 1.3982 1.1825
No log 2.6522 122 1.5822 0.2838 1.5822 1.2579
No log 2.6957 124 1.2367 0.4723 1.2367 1.1121
No log 2.7391 126 0.8691 0.5102 0.8691 0.9322
No log 2.7826 128 0.8735 0.4712 0.8735 0.9346
No log 2.8261 130 0.8250 0.4420 0.8250 0.9083
No log 2.8696 132 0.8220 0.4143 0.8220 0.9067
No log 2.9130 134 0.9210 0.4703 0.9210 0.9597
No log 2.9565 136 0.9251 0.4703 0.9251 0.9618
No log 3.0 138 0.8116 0.4792 0.8116 0.9009
No log 3.0435 140 0.8287 0.4505 0.8287 0.9104
No log 3.0870 142 0.9791 0.4570 0.9791 0.9895
No log 3.1304 144 0.9339 0.5 0.9339 0.9664
No log 3.1739 146 0.7931 0.5744 0.7931 0.8906
No log 3.2174 148 0.7772 0.5532 0.7772 0.8816
No log 3.2609 150 0.7727 0.5455 0.7727 0.8790
No log 3.3043 152 0.8189 0.5058 0.8189 0.9049
No log 3.3478 154 0.9375 0.5124 0.9375 0.9683
No log 3.3913 156 0.9905 0.5238 0.9905 0.9952
No log 3.4348 158 0.8367 0.5374 0.8367 0.9147
No log 3.4783 160 0.7563 0.5552 0.7563 0.8696
No log 3.5217 162 0.7766 0.5451 0.7766 0.8812
No log 3.5652 164 1.0294 0.4767 1.0294 1.0146
No log 3.6087 166 1.2447 0.4593 1.2447 1.1157
No log 3.6522 168 1.1448 0.4590 1.1448 1.0700
No log 3.6957 170 0.9503 0.4771 0.9503 0.9748
No log 3.7391 172 0.8480 0.4764 0.8480 0.9209
No log 3.7826 174 0.9413 0.4862 0.9413 0.9702
No log 3.8261 176 1.0706 0.4767 1.0706 1.0347
No log 3.8696 178 1.0576 0.5041 1.0576 1.0284
No log 3.9130 180 1.0927 0.5206 1.0927 1.0453
No log 3.9565 182 1.1858 0.5167 1.1858 1.0889
No log 4.0 184 0.9850 0.4663 0.9850 0.9925
No log 4.0435 186 0.8347 0.4493 0.8347 0.9136
No log 4.0870 188 0.8691 0.4331 0.8691 0.9322
No log 4.1304 190 1.0774 0.4845 1.0774 1.0380
No log 4.1739 192 1.1317 0.4731 1.1317 1.0638
No log 4.2174 194 0.9003 0.4914 0.9003 0.9489
No log 4.2609 196 0.7777 0.5094 0.7777 0.8819
No log 4.3043 198 0.7618 0.5012 0.7618 0.8728
No log 4.3478 200 0.8395 0.4293 0.8395 0.9163
No log 4.3913 202 1.2038 0.4613 1.2038 1.0972
No log 4.4348 204 1.4733 0.3620 1.4733 1.2138
No log 4.4783 206 1.3668 0.4129 1.3667 1.1691
No log 4.5217 208 1.0777 0.4453 1.0777 1.0381
No log 4.5652 210 0.8153 0.4690 0.8153 0.9029
No log 4.6087 212 0.7534 0.4656 0.7534 0.8680
No log 4.6522 214 0.7559 0.4603 0.7559 0.8694
No log 4.6957 216 0.9261 0.5280 0.9261 0.9623
No log 4.7391 218 1.2772 0.4231 1.2772 1.1301
No log 4.7826 220 1.4582 0.3964 1.4582 1.2076
No log 4.8261 222 1.3179 0.4131 1.3179 1.1480
No log 4.8696 224 1.0256 0.4625 1.0256 1.0127
No log 4.9130 226 0.8017 0.4202 0.8017 0.8954
No log 4.9565 228 0.7516 0.5144 0.7516 0.8670
No log 5.0 230 0.7558 0.4681 0.7558 0.8694
No log 5.0435 232 0.7815 0.4006 0.7815 0.8840
No log 5.0870 234 0.9090 0.4751 0.9090 0.9534
No log 5.1304 236 0.9085 0.5224 0.9085 0.9531
No log 5.1739 238 0.7780 0.5352 0.7780 0.8821
No log 5.2174 240 0.7889 0.5427 0.7889 0.8882
No log 5.2609 242 0.8160 0.5427 0.8160 0.9033
No log 5.3043 244 0.7934 0.5427 0.7934 0.8907
No log 5.3478 246 0.7548 0.4879 0.7548 0.8688
No log 5.3913 248 0.7766 0.4377 0.7766 0.8813
No log 5.4348 250 0.7988 0.3902 0.7988 0.8938
No log 5.4783 252 0.8004 0.4142 0.8004 0.8947
No log 5.5217 254 0.7590 0.4181 0.7590 0.8712
No log 5.5652 256 0.7467 0.4385 0.7467 0.8641
No log 5.6087 258 0.7449 0.4560 0.7449 0.8631
No log 5.6522 260 0.7274 0.4898 0.7274 0.8529
No log 5.6957 262 0.7492 0.5744 0.7492 0.8655
No log 5.7391 264 0.7625 0.5112 0.7625 0.8732
No log 5.7826 266 0.7830 0.4042 0.7830 0.8849
No log 5.8261 268 0.8520 0.4351 0.8520 0.9230
No log 5.8696 270 0.8391 0.3590 0.8391 0.9160
No log 5.9130 272 0.7877 0.4571 0.7877 0.8875
No log 5.9565 274 0.7926 0.4571 0.7926 0.8903
No log 6.0 276 0.8199 0.4343 0.8199 0.9055
No log 6.0435 278 0.8175 0.4611 0.8175 0.9042
No log 6.0870 280 0.8131 0.3738 0.8131 0.9017
No log 6.1304 282 0.8148 0.3983 0.8148 0.9026
No log 6.1739 284 0.8567 0.3464 0.8567 0.9256
No log 6.2174 286 1.0004 0.4610 1.0004 1.0002
No log 6.2609 288 1.0829 0.4186 1.0829 1.0406
No log 6.3043 290 1.0064 0.4186 1.0064 1.0032
No log 6.3478 292 0.9017 0.3945 0.9017 0.9496
No log 6.3913 294 0.8101 0.3661 0.8101 0.9001
No log 6.4348 296 0.7925 0.4929 0.7925 0.8902
No log 6.4783 298 0.7931 0.4290 0.7931 0.8906
No log 6.5217 300 0.8638 0.4546 0.8638 0.9294
No log 6.5652 302 0.9460 0.4946 0.9460 0.9726
No log 6.6087 304 1.0085 0.4883 1.0085 1.0042
No log 6.6522 306 0.9389 0.4781 0.9389 0.9690
No log 6.6957 308 0.9114 0.4165 0.9114 0.9547
No log 6.7391 310 0.9058 0.3369 0.9058 0.9517
No log 6.7826 312 0.8668 0.3352 0.8668 0.9310
No log 6.8261 314 0.8120 0.3117 0.8120 0.9011
No log 6.8696 316 0.8381 0.4639 0.8381 0.9155
No log 6.9130 318 0.9609 0.5605 0.9609 0.9803
No log 6.9565 320 0.9592 0.5605 0.9592 0.9794
No log 7.0 322 0.7656 0.5359 0.7656 0.8750
No log 7.0435 324 0.6725 0.5291 0.6725 0.8200
No log 7.0870 326 0.7141 0.5862 0.7141 0.8450
No log 7.1304 328 0.7242 0.5779 0.7242 0.8510
No log 7.1739 330 0.7007 0.5578 0.7007 0.8371
No log 7.2174 332 0.7555 0.4754 0.7555 0.8692
No log 7.2609 334 0.7858 0.5353 0.7858 0.8865
No log 7.3043 336 0.7547 0.5136 0.7547 0.8688
No log 7.3478 338 0.7226 0.4965 0.7226 0.8501
No log 7.3913 340 0.7235 0.5155 0.7235 0.8506
No log 7.4348 342 0.7468 0.4996 0.7468 0.8642
No log 7.4783 344 0.8828 0.4958 0.8828 0.9396
No log 7.5217 346 1.0144 0.5159 1.0144 1.0072
No log 7.5652 348 1.1163 0.4396 1.1163 1.0566
No log 7.6087 350 0.9951 0.5029 0.9951 0.9975
No log 7.6522 352 0.8573 0.3953 0.8573 0.9259
No log 7.6957 354 0.8474 0.4159 0.8474 0.9205
No log 7.7391 356 0.8844 0.3943 0.8844 0.9404
No log 7.7826 358 0.9347 0.3854 0.9347 0.9668
No log 7.8261 360 0.9775 0.4418 0.9775 0.9887
No log 7.8696 362 1.0250 0.4975 1.0250 1.0124
No log 7.9130 364 1.1209 0.4924 1.1209 1.0587
No log 7.9565 366 1.0505 0.5134 1.0505 1.0249
No log 8.0 368 0.8934 0.4479 0.8934 0.9452
No log 8.0435 370 0.8198 0.3463 0.8198 0.9054
No log 8.0870 372 0.8233 0.3663 0.8233 0.9073
No log 8.1304 374 0.8639 0.4565 0.8639 0.9295
No log 8.1739 376 0.9046 0.5 0.9046 0.9511
No log 8.2174 378 0.9199 0.5071 0.9199 0.9591
No log 8.2609 380 0.9115 0.4291 0.9115 0.9547
No log 8.3043 382 0.8297 0.3596 0.8297 0.9109
No log 8.3478 384 0.8408 0.4305 0.8408 0.9169
No log 8.3913 386 0.8730 0.5037 0.8730 0.9343
No log 8.4348 388 0.8139 0.4139 0.8139 0.9022
No log 8.4783 390 0.8302 0.4143 0.8302 0.9112
No log 8.5217 392 0.9009 0.4773 0.9009 0.9492
No log 8.5652 394 0.9462 0.4825 0.9462 0.9727
No log 8.6087 396 0.8942 0.5065 0.8942 0.9456
No log 8.6522 398 0.8054 0.5229 0.8054 0.8975
No log 8.6957 400 0.7812 0.5422 0.7812 0.8839
No log 8.7391 402 0.8073 0.5212 0.8073 0.8985
No log 8.7826 404 0.9501 0.4894 0.9501 0.9747
No log 8.8261 406 0.9845 0.5040 0.9845 0.9922
No log 8.8696 408 0.9030 0.5716 0.9030 0.9502
No log 8.9130 410 0.8516 0.5801 0.8516 0.9228
No log 8.9565 412 0.9068 0.5779 0.9068 0.9523
No log 9.0 414 1.0627 0.5300 1.0627 1.0309
No log 9.0435 416 1.0494 0.5300 1.0494 1.0244
No log 9.0870 418 0.9051 0.4130 0.9051 0.9514
No log 9.1304 420 0.7949 0.3217 0.7949 0.8916
No log 9.1739 422 0.7994 0.3070 0.7994 0.8941
No log 9.2174 424 0.8461 0.3217 0.8461 0.9198
No log 9.2609 426 0.9020 0.4639 0.9020 0.9497
No log 9.3043 428 0.9750 0.5488 0.9750 0.9874
No log 9.3478 430 0.9987 0.5293 0.9987 0.9994
No log 9.3913 432 0.9225 0.5 0.9225 0.9605
No log 9.4348 434 0.8427 0.5189 0.8427 0.9180
No log 9.4783 436 0.8074 0.5719 0.8074 0.8986
No log 9.5217 438 0.7950 0.5172 0.7950 0.8916
No log 9.5652 440 0.9162 0.5980 0.9162 0.9572
No log 9.6087 442 1.1090 0.5198 1.1090 1.0531
No log 9.6522 444 1.2066 0.5143 1.2066 1.0984
No log 9.6957 446 1.1607 0.4978 1.1607 1.0773
No log 9.7391 448 0.9872 0.5086 0.9872 0.9936
No log 9.7826 450 0.8491 0.4623 0.8491 0.9215
No log 9.8261 452 0.8028 0.4796 0.8028 0.8960
No log 9.8696 454 0.8450 0.5876 0.8450 0.9192
No log 9.9130 456 0.8748 0.5258 0.8748 0.9353
No log 9.9565 458 0.9200 0.5083 0.9200 0.9592
No log 10.0 460 0.9969 0.5015 0.9969 0.9984
No log 10.0435 462 0.9531 0.3958 0.9531 0.9763
No log 10.0870 464 0.9005 0.4240 0.9005 0.9489
No log 10.1304 466 0.8440 0.3728 0.8440 0.9187
No log 10.1739 468 0.8335 0.3873 0.8335 0.9130
No log 10.2174 470 0.8410 0.3873 0.8410 0.9171
No log 10.2609 472 0.8836 0.3147 0.8836 0.9400
No log 10.3043 474 0.9084 0.3289 0.9084 0.9531
No log 10.3478 476 0.9016 0.3794 0.9016 0.9495
No log 10.3913 478 0.9051 0.4463 0.9051 0.9514
No log 10.4348 480 0.8431 0.5182 0.8431 0.9182
No log 10.4783 482 0.7800 0.4820 0.7800 0.8832
No log 10.5217 484 0.7703 0.4902 0.7703 0.8777
No log 10.5652 486 0.8264 0.5164 0.8264 0.9091
No log 10.6087 488 0.9579 0.5174 0.9579 0.9787
No log 10.6522 490 1.1086 0.5094 1.1086 1.0529
No log 10.6957 492 1.0918 0.5094 1.0918 1.0449
No log 10.7391 494 0.9545 0.5190 0.9545 0.9770
No log 10.7826 496 0.8087 0.5353 0.8087 0.8993
No log 10.8261 498 0.7448 0.5380 0.7448 0.8630
0.3798 10.8696 500 0.7400 0.5380 0.7400 0.8602
0.3798 10.9130 502 0.7311 0.6041 0.7311 0.8551
0.3798 10.9565 504 0.7450 0.5815 0.7450 0.8631
0.3798 11.0 506 0.7570 0.5815 0.7570 0.8700
0.3798 11.0435 508 0.7381 0.4980 0.7381 0.8591
0.3798 11.0870 510 0.7378 0.4995 0.7378 0.8590
0.3798 11.1304 512 0.7214 0.5274 0.7214 0.8493
0.3798 11.1739 514 0.7238 0.5595 0.7238 0.8508
0.3798 11.2174 516 0.7711 0.5519 0.7711 0.8781
0.3798 11.2609 518 0.8295 0.5015 0.8295 0.9108
0.3798 11.3043 520 0.8101 0.4165 0.8101 0.9001
0.3798 11.3478 522 0.7657 0.4359 0.7657 0.8751
0.3798 11.3913 524 0.7612 0.4737 0.7612 0.8724
0.3798 11.4348 526 0.7667 0.4359 0.7667 0.8756

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
