ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k11_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8661
  • Qwk: 0.5202
  • Mse: 0.8661
  • Rmse: 0.9306
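Qwk here is Cohen's quadratic weighted kappa between predicted and gold scores, and Rmse is the square root of Mse (√0.8661 ≈ 0.9306). A minimal sketch of the metric — a hand-rolled implementation for illustration, not the exact evaluation code used for this run:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer score labels."""
    # Observed confusion matrix.
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic disagreement weights: 0 on the diagonal, 1 at max distance.
    w = np.array([[(i - j) ** 2 for j in range(n_classes)]
                  for i in range(n_classes)], dtype=float)
    w /= (n_classes - 1) ** 2
    # Expected matrix under independent marginals, scaled to the same total.
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (w * O).sum() / (w * E).sum()
```

Perfect agreement yields kappa = 1; systematic maximal disagreement yields −1.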

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
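With no warmup listed, the linear scheduler decays the learning rate from 2e-05 to zero over the 570 optimizer steps logged below. A rough sketch of that schedule (mirroring the behavior of transformers' `get_linear_schedule_with_warmup`; the function name `linear_lr` is mine):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule."""
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Halfway through this run's 570 steps, the LR is half of 2e-05.
print(linear_lr(285, 570))
```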

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0351 2 3.9022 0.0151 3.9022 1.9754
No log 0.0702 4 2.1316 0.0909 2.1316 1.4600
No log 0.1053 6 0.9571 0.1207 0.9571 0.9783
No log 0.1404 8 0.7224 0.2327 0.7224 0.8499
No log 0.1754 10 0.6560 0.3203 0.6560 0.8099
No log 0.2105 12 0.7122 0.2333 0.7122 0.8439
No log 0.2456 14 0.7358 0.1288 0.7358 0.8578
No log 0.2807 16 0.6871 0.1372 0.6871 0.8289
No log 0.3158 18 0.6552 0.1920 0.6552 0.8094
No log 0.3509 20 0.6468 0.3217 0.6468 0.8043
No log 0.3860 22 0.5817 0.2147 0.5817 0.7627
No log 0.4211 24 0.5748 0.2851 0.5748 0.7582
No log 0.4561 26 0.5851 0.2767 0.5851 0.7649
No log 0.4912 28 0.5897 0.3187 0.5897 0.7679
No log 0.5263 30 0.6281 0.3269 0.6281 0.7926
No log 0.5614 32 0.6777 0.4332 0.6777 0.8232
No log 0.5965 34 0.6857 0.4357 0.6857 0.8281
No log 0.6316 36 0.6328 0.4237 0.6328 0.7955
No log 0.6667 38 0.5947 0.4560 0.5947 0.7712
No log 0.7018 40 0.5729 0.3998 0.5729 0.7569
No log 0.7368 42 0.5738 0.4688 0.5738 0.7575
No log 0.7719 44 0.5956 0.4983 0.5956 0.7718
No log 0.8070 46 0.6297 0.4714 0.6297 0.7935
No log 0.8421 48 0.6773 0.4802 0.6773 0.8230
No log 0.8772 50 0.8673 0.4181 0.8673 0.9313
No log 0.9123 52 0.9414 0.3771 0.9414 0.9702
No log 0.9474 54 0.7754 0.4504 0.7754 0.8806
No log 0.9825 56 0.6410 0.4637 0.6410 0.8006
No log 1.0175 58 0.6617 0.4752 0.6617 0.8135
No log 1.0526 60 0.7548 0.4149 0.7548 0.8688
No log 1.0877 62 0.8719 0.3715 0.8719 0.9337
No log 1.1228 64 0.9231 0.3664 0.9231 0.9608
No log 1.1579 66 0.9483 0.3622 0.9483 0.9738
No log 1.1930 68 0.9653 0.3727 0.9653 0.9825
No log 1.2281 70 1.0745 0.3688 1.0745 1.0366
No log 1.2632 72 1.0240 0.3823 1.0240 1.0119
No log 1.2982 74 0.9592 0.4106 0.9592 0.9794
No log 1.3333 76 1.0345 0.4351 1.0345 1.0171
No log 1.3684 78 1.0907 0.4227 1.0907 1.0444
No log 1.4035 80 1.1962 0.4447 1.1962 1.0937
No log 1.4386 82 1.2962 0.3994 1.2962 1.1385
No log 1.4737 84 1.3831 0.3897 1.3831 1.1760
No log 1.5088 86 1.1082 0.4559 1.1082 1.0527
No log 1.5439 88 0.9848 0.4934 0.9848 0.9924
No log 1.5789 90 0.9413 0.4188 0.9413 0.9702
No log 1.6140 92 0.9456 0.4354 0.9456 0.9724
No log 1.6491 94 1.0308 0.4770 1.0308 1.0153
No log 1.6842 96 1.3183 0.4024 1.3183 1.1482
No log 1.7193 98 1.2033 0.4448 1.2033 1.0969
No log 1.7544 100 0.9472 0.4817 0.9472 0.9733
No log 1.7895 102 0.8666 0.5144 0.8666 0.9309
No log 1.8246 104 0.8135 0.4610 0.8135 0.9020
No log 1.8596 106 0.9193 0.4624 0.9193 0.9588
No log 1.8947 108 0.9776 0.4922 0.9776 0.9887
No log 1.9298 110 0.8383 0.4123 0.8383 0.9156
No log 1.9649 112 0.8671 0.4091 0.8671 0.9312
No log 2.0 114 0.9174 0.4223 0.9174 0.9578
No log 2.0351 116 1.0549 0.4616 1.0549 1.0271
No log 2.0702 118 1.0494 0.4271 1.0494 1.0244
No log 2.1053 120 0.9814 0.4826 0.9814 0.9907
No log 2.1404 122 1.0562 0.4466 1.0562 1.0277
No log 2.1754 124 1.1706 0.4720 1.1706 1.0819
No log 2.2105 126 1.2760 0.4340 1.2760 1.1296
No log 2.2456 128 1.2838 0.4397 1.2838 1.1331
No log 2.2807 130 1.1342 0.4607 1.1342 1.0650
No log 2.3158 132 1.0750 0.4865 1.0750 1.0368
No log 2.3509 134 1.1185 0.4696 1.1185 1.0576
No log 2.3860 136 1.2417 0.4379 1.2417 1.1143
No log 2.4211 138 1.1047 0.4223 1.1047 1.0511
No log 2.4561 140 0.9906 0.4688 0.9906 0.9953
No log 2.4912 142 1.0098 0.4426 1.0098 1.0049
No log 2.5263 144 1.3287 0.4239 1.3287 1.1527
No log 2.5614 146 1.5117 0.3824 1.5117 1.2295
No log 2.5965 148 1.5693 0.3787 1.5693 1.2527
No log 2.6316 150 1.2137 0.4440 1.2137 1.1017
No log 2.6667 152 0.8149 0.5003 0.8149 0.9027
No log 2.7018 154 0.7545 0.5240 0.7545 0.8686
No log 2.7368 156 0.8017 0.5090 0.8017 0.8954
No log 2.7719 158 1.0694 0.4756 1.0694 1.0341
No log 2.8070 160 1.2204 0.4114 1.2204 1.1047
No log 2.8421 162 1.1340 0.4425 1.1340 1.0649
No log 2.8772 164 0.9738 0.4441 0.9738 0.9868
No log 2.9123 166 0.7992 0.4693 0.7992 0.8940
No log 2.9474 168 0.7896 0.4734 0.7896 0.8886
No log 2.9825 170 0.9195 0.4286 0.9195 0.9589
No log 3.0175 172 1.0190 0.4298 1.0190 1.0095
No log 3.0526 174 0.9813 0.4543 0.9813 0.9906
No log 3.0877 176 0.9170 0.4400 0.9170 0.9576
No log 3.1228 178 0.9634 0.4425 0.9634 0.9815
No log 3.1579 180 1.0769 0.4384 1.0769 1.0377
No log 3.1930 182 1.1696 0.4425 1.1696 1.0815
No log 3.2281 184 1.2425 0.4413 1.2425 1.1147
No log 3.2632 186 1.3469 0.4414 1.3469 1.1606
No log 3.2982 188 1.3278 0.4339 1.3278 1.1523
No log 3.3333 190 1.3026 0.4327 1.3026 1.1413
No log 3.3684 192 1.2650 0.4018 1.2650 1.1247
No log 3.4035 194 1.1128 0.4364 1.1128 1.0549
No log 3.4386 196 0.9322 0.4669 0.9322 0.9655
No log 3.4737 198 0.8476 0.4935 0.8476 0.9206
No log 3.5088 200 0.8368 0.5015 0.8368 0.9148
No log 3.5439 202 0.9280 0.4583 0.9280 0.9633
No log 3.5789 204 0.9853 0.4583 0.9853 0.9926
No log 3.6140 206 1.0635 0.4597 1.0635 1.0313
No log 3.6491 208 1.0141 0.4598 1.0141 1.0070
No log 3.6842 210 1.0807 0.4584 1.0807 1.0396
No log 3.7193 212 1.2637 0.4366 1.2637 1.1241
No log 3.7544 214 1.3691 0.3795 1.3691 1.1701
No log 3.7895 216 1.2432 0.4063 1.2432 1.1150
No log 3.8246 218 1.2623 0.3973 1.2623 1.1235
No log 3.8596 220 1.1390 0.4141 1.1390 1.0672
No log 3.8947 222 1.0346 0.4514 1.0346 1.0172
No log 3.9298 224 1.0667 0.4600 1.0667 1.0328
No log 3.9649 226 1.1548 0.4600 1.1548 1.0746
No log 4.0 228 1.1895 0.4690 1.1895 1.0906
No log 4.0351 230 1.0836 0.4904 1.0836 1.0410
No log 4.0702 232 0.9828 0.5030 0.9828 0.9914
No log 4.1053 234 0.8905 0.5127 0.8905 0.9437
No log 4.1404 236 0.8692 0.5228 0.8692 0.9323
No log 4.1754 238 0.8407 0.5266 0.8407 0.9169
No log 4.2105 240 0.8564 0.5168 0.8564 0.9254
No log 4.2456 242 0.9269 0.5221 0.9269 0.9628
No log 4.2807 244 0.9239 0.5320 0.9239 0.9612
No log 4.3158 246 0.8967 0.5132 0.8967 0.9469
No log 4.3509 248 0.9219 0.4689 0.9219 0.9601
No log 4.3860 250 0.9737 0.4433 0.9737 0.9868
No log 4.4211 252 1.0063 0.4492 1.0063 1.0032
No log 4.4561 254 1.0161 0.4494 1.0161 1.0080
No log 4.4912 256 1.0107 0.4495 1.0107 1.0053
No log 4.5263 258 0.9725 0.4495 0.9725 0.9861
No log 4.5614 260 0.9282 0.4826 0.9282 0.9635
No log 4.5965 262 0.9049 0.4566 0.9049 0.9513
No log 4.6316 264 0.8790 0.4543 0.8790 0.9375
No log 4.6667 266 0.8713 0.4765 0.8713 0.9334
No log 4.7018 268 0.8571 0.4711 0.8571 0.9258
No log 4.7368 270 0.8836 0.4629 0.8836 0.9400
No log 4.7719 272 0.8604 0.4833 0.8604 0.9276
No log 4.8070 274 0.8185 0.5306 0.8185 0.9047
No log 4.8421 276 0.8514 0.4977 0.8514 0.9227
No log 4.8772 278 0.9889 0.4674 0.9889 0.9944
No log 4.9123 280 1.2408 0.4250 1.2408 1.1139
No log 4.9474 282 1.2537 0.4034 1.2537 1.1197
No log 4.9825 284 1.0671 0.4367 1.0671 1.0330
No log 5.0175 286 0.8266 0.5154 0.8266 0.9092
No log 5.0526 288 0.7488 0.5008 0.7488 0.8653
No log 5.0877 290 0.7450 0.4942 0.7450 0.8631
No log 5.1228 292 0.8094 0.5442 0.8094 0.8997
No log 5.1579 294 0.9357 0.5174 0.9357 0.9673
No log 5.1930 296 1.0084 0.4940 1.0084 1.0042
No log 5.2281 298 0.9642 0.4960 0.9642 0.9819
No log 5.2632 300 0.8270 0.5090 0.8270 0.9094
No log 5.2982 302 0.7617 0.5084 0.7617 0.8727
No log 5.3333 304 0.7646 0.5175 0.7646 0.8744
No log 5.3684 306 0.8084 0.4863 0.8084 0.8991
No log 5.4035 308 0.9704 0.4939 0.9704 0.9851
No log 5.4386 310 1.1021 0.4815 1.1021 1.0498
No log 5.4737 312 1.0787 0.4868 1.0787 1.0386
No log 5.5088 314 0.9473 0.5019 0.9473 0.9733
No log 5.5439 316 0.8408 0.5300 0.8408 0.9170
No log 5.5789 318 0.8265 0.5268 0.8265 0.9091
No log 5.6140 320 0.8858 0.5557 0.8858 0.9412
No log 5.6491 322 1.0472 0.4766 1.0472 1.0233
No log 5.6842 324 1.1938 0.4504 1.1938 1.0926
No log 5.7193 326 1.1674 0.4652 1.1674 1.0805
No log 5.7544 328 1.0510 0.4723 1.0510 1.0252
No log 5.7895 330 0.9914 0.4903 0.9914 0.9957
No log 5.8246 332 1.0556 0.4929 1.0556 1.0274
No log 5.8596 334 1.2451 0.4498 1.2451 1.1159
No log 5.8947 336 1.3396 0.4353 1.3396 1.1574
No log 5.9298 338 1.2730 0.4354 1.2730 1.1283
No log 5.9649 340 1.0614 0.4545 1.0614 1.0302
No log 6.0 342 0.8855 0.5120 0.8855 0.9410
No log 6.0351 344 0.7773 0.5828 0.7773 0.8817
No log 6.0702 346 0.7425 0.5935 0.7425 0.8617
No log 6.1053 348 0.7411 0.5797 0.7411 0.8609
No log 6.1404 350 0.8172 0.4984 0.8172 0.9040
No log 6.1754 352 0.9149 0.4770 0.9149 0.9565
No log 6.2105 354 0.9812 0.4656 0.9812 0.9906
No log 6.2456 356 0.9395 0.4822 0.9395 0.9693
No log 6.2807 358 0.9477 0.4826 0.9477 0.9735
No log 6.3158 360 0.9316 0.4883 0.9316 0.9652
No log 6.3509 362 0.9380 0.4883 0.9380 0.9685
No log 6.3860 364 0.9093 0.4833 0.9093 0.9536
No log 6.4211 366 0.8492 0.5349 0.8492 0.9215
No log 6.4561 368 0.8101 0.5570 0.8101 0.9000
No log 6.4912 370 0.8173 0.5570 0.8173 0.9041
No log 6.5263 372 0.7990 0.5652 0.7990 0.8939
No log 6.5614 374 0.7846 0.5681 0.7846 0.8858
No log 6.5965 376 0.7939 0.5681 0.7939 0.8910
No log 6.6316 378 0.7936 0.5681 0.7936 0.8909
No log 6.6667 380 0.8280 0.5504 0.8280 0.9099
No log 6.7018 382 0.8461 0.5291 0.8461 0.9198
No log 6.7368 384 0.9042 0.4979 0.9042 0.9509
No log 6.7719 386 1.0144 0.4634 1.0144 1.0072
No log 6.8070 388 1.0977 0.4683 1.0977 1.0477
No log 6.8421 390 1.1381 0.4730 1.1381 1.0668
No log 6.8772 392 1.1165 0.4834 1.1165 1.0567
No log 6.9123 394 1.1318 0.4828 1.1318 1.0639
No log 6.9474 396 1.0732 0.4860 1.0732 1.0360
No log 6.9825 398 1.0099 0.5030 1.0099 1.0049
No log 7.0175 400 1.0154 0.4918 1.0154 1.0077
No log 7.0526 402 0.9696 0.4918 0.9696 0.9847
No log 7.0877 404 0.9472 0.4974 0.9472 0.9732
No log 7.1228 406 0.9624 0.4798 0.9624 0.9810
No log 7.1579 408 0.9890 0.4684 0.9890 0.9945
No log 7.1930 410 0.9883 0.4684 0.9883 0.9941
No log 7.2281 412 0.9537 0.4882 0.9537 0.9766
No log 7.2632 414 0.8894 0.5374 0.8894 0.9431
No log 7.2982 416 0.8453 0.5348 0.8453 0.9194
No log 7.3333 418 0.8458 0.5082 0.8458 0.9197
No log 7.3684 420 0.8771 0.5426 0.8771 0.9365
No log 7.4035 422 0.8842 0.5195 0.8842 0.9403
No log 7.4386 424 0.9085 0.5145 0.9085 0.9532
No log 7.4737 426 0.9097 0.5298 0.9097 0.9538
No log 7.5088 428 0.8820 0.5142 0.8820 0.9392
No log 7.5439 430 0.8324 0.5283 0.8324 0.9124
No log 7.5789 432 0.8054 0.5357 0.8054 0.8975
No log 7.6140 434 0.7892 0.5610 0.7892 0.8884
No log 7.6491 436 0.7885 0.5623 0.7885 0.8880
No log 7.6842 438 0.8052 0.5709 0.8052 0.8974
No log 7.7193 440 0.8619 0.5267 0.8619 0.9284
No log 7.7544 442 0.9638 0.5018 0.9638 0.9817
No log 7.7895 444 1.0778 0.4918 1.0778 1.0382
No log 7.8246 446 1.1630 0.4559 1.1630 1.0784
No log 7.8596 448 1.1913 0.4451 1.1913 1.0915
No log 7.8947 450 1.1458 0.4495 1.1458 1.0704
No log 7.9298 452 1.0712 0.4752 1.0712 1.0350
No log 7.9649 454 1.0546 0.4752 1.0546 1.0269
No log 8.0 456 1.0179 0.4880 1.0179 1.0089
No log 8.0351 458 0.9772 0.4880 0.9772 0.9886
No log 8.0702 460 0.9854 0.4888 0.9854 0.9927
No log 8.1053 462 0.9859 0.4946 0.9859 0.9929
No log 8.1404 464 1.0273 0.4833 1.0273 1.0136
No log 8.1754 466 1.0666 0.4775 1.0666 1.0328
No log 8.2105 468 1.0642 0.4775 1.0642 1.0316
No log 8.2456 470 1.0317 0.4892 1.0317 1.0157
No log 8.2807 472 0.9797 0.4974 0.9797 0.9898
No log 8.3158 474 0.9168 0.5147 0.9168 0.9575
No log 8.3509 476 0.8543 0.5435 0.8543 0.9243
No log 8.3860 478 0.8243 0.5819 0.8243 0.9079
No log 8.4211 480 0.8227 0.5560 0.8227 0.9070
No log 8.4561 482 0.8321 0.5658 0.8321 0.9122
No log 8.4912 484 0.8499 0.5398 0.8499 0.9219
No log 8.5263 486 0.8689 0.5325 0.8689 0.9321
No log 8.5614 488 0.8789 0.5240 0.8789 0.9375
No log 8.5965 490 0.8773 0.5188 0.8773 0.9367
No log 8.6316 492 0.8723 0.5188 0.8723 0.9340
No log 8.6667 494 0.8651 0.5188 0.8651 0.9301
No log 8.7018 496 0.8551 0.5125 0.8551 0.9247
No log 8.7368 498 0.8547 0.5125 0.8547 0.9245
0.4433 8.7719 500 0.8481 0.5079 0.8481 0.9209
0.4433 8.8070 502 0.8272 0.5142 0.8272 0.9095
0.4433 8.8421 504 0.8198 0.5269 0.8198 0.9054
0.4433 8.8772 506 0.8124 0.5420 0.8124 0.9013
0.4433 8.9123 508 0.8126 0.5484 0.8126 0.9014
0.4433 8.9474 510 0.8191 0.5673 0.8191 0.9050
0.4433 8.9825 512 0.8312 0.5622 0.8312 0.9117
0.4433 9.0175 514 0.8482 0.5428 0.8482 0.9210
0.4433 9.0526 516 0.8688 0.5370 0.8688 0.9321
0.4433 9.0877 518 0.8856 0.5370 0.8856 0.9411
0.4433 9.1228 520 0.9001 0.5406 0.9001 0.9488
0.4433 9.1579 522 0.9159 0.5347 0.9159 0.9570
0.4433 9.1930 524 0.9170 0.5347 0.9170 0.9576
0.4433 9.2281 526 0.9111 0.5347 0.9111 0.9545
0.4433 9.2632 528 0.9085 0.5360 0.9085 0.9532
0.4433 9.2982 530 0.8984 0.5314 0.8984 0.9478
0.4433 9.3333 532 0.8862 0.5323 0.8862 0.9414
0.4433 9.3684 534 0.8770 0.5323 0.8770 0.9365
0.4433 9.4035 536 0.8715 0.5370 0.8715 0.9336
0.4433 9.4386 538 0.8672 0.5417 0.8672 0.9313
0.4433 9.4737 540 0.8686 0.5499 0.8686 0.9320
0.4433 9.5088 542 0.8665 0.5499 0.8665 0.9309
0.4433 9.5439 544 0.8641 0.5439 0.8641 0.9296
0.4433 9.5789 546 0.8581 0.5439 0.8581 0.9264
0.4433 9.6140 548 0.8551 0.5400 0.8551 0.9247
0.4433 9.6491 550 0.8531 0.5167 0.8531 0.9236
0.4433 9.6842 552 0.8526 0.5167 0.8526 0.9234
0.4433 9.7193 554 0.8531 0.5167 0.8531 0.9236
0.4433 9.7544 556 0.8546 0.5167 0.8546 0.9244
0.4433 9.7895 558 0.8565 0.5167 0.8565 0.9255
0.4433 9.8246 560 0.8591 0.5202 0.8591 0.9269
0.4433 9.8596 562 0.8600 0.5202 0.8600 0.9273
0.4433 9.8947 564 0.8624 0.5202 0.8624 0.9287
0.4433 9.9298 566 0.8645 0.5202 0.8645 0.9298
0.4433 9.9649 568 0.8657 0.5202 0.8657 0.9304
0.4433 10.0 570 0.8661 0.5202 0.8661 0.9306
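Note that the best validation Qwk in the log above (0.5935 at epoch ~6.07) is higher than the final-epoch value reported at the top; if intermediate checkpoints were saved, one could select the best by scanning the eval log. A hypothetical sketch using a few rows copied from the table:

```python
# (epoch, validation Qwk) pairs sampled from the training log above.
eval_log = [
    (0.0351, 0.0151),
    (6.0702, 0.5935),
    (6.1053, 0.5797),
    (10.0, 0.5202),
]

# Pick the epoch whose checkpoint scored the highest Qwk.
best_epoch, best_qwk = max(eval_log, key=lambda row: row[1])
print(best_epoch, best_qwk)
```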

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1