ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k20_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7980
  • QWK: 0.3304
  • MSE: 0.7980
  • RMSE: 0.8933
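QWK here is quadratic weighted kappa, and RMSE is the square root of the MSE (which is why the Loss and MSE values coincide for a squared-error objective). A minimal sketch of how these metrics can be computed; the helper name and the toy labels are illustrative, not taken from the training code:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    # Observed confusion matrix
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic disagreement weights: (i - j)^2 scaled to [0, 1]
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected matrix under independence of the marginals
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()

# Toy example (labels are placeholders, not the actual eval set)
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 2, 2, 1, 1]
qwk = quadratic_weighted_kappa(y_true, y_pred, 3)
rmse = np.sqrt(np.mean((np.array(y_true) - np.array(y_pred)) ** 2))
```

`sklearn.metrics.cohen_kappa_score(..., weights="quadratic")` computes the same quantity if scikit-learn is available.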

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
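The hyperparameters above map directly onto a standard Hugging Face `Trainer` configuration. A hedged sketch (the `output_dir` is a placeholder, and the model/dataset wiring is omitted since the card does not specify them):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above.
args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```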

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0392 2 4.3004 0.0102 4.3004 2.0737
No log 0.0784 4 2.7013 -0.0067 2.7013 1.6435
No log 0.1176 6 1.4134 -0.0256 1.4134 1.1889
No log 0.1569 8 1.1496 0.1779 1.1496 1.0722
No log 0.1961 10 1.2976 0.2639 1.2976 1.1391
No log 0.2353 12 1.1068 0.1071 1.1068 1.0521
No log 0.2745 14 1.0840 0.0919 1.0840 1.0411
No log 0.3137 16 1.1313 0.0574 1.1313 1.0636
No log 0.3529 18 1.1905 -0.0187 1.1905 1.0911
No log 0.3922 20 1.2648 0.0850 1.2648 1.1246
No log 0.4314 22 1.1711 0.1525 1.1711 1.0822
No log 0.4706 24 1.1266 0.1379 1.1266 1.0614
No log 0.5098 26 1.0402 0.3003 1.0402 1.0199
No log 0.5490 28 1.0334 0.2758 1.0334 1.0166
No log 0.5882 30 1.0807 0.1471 1.0807 1.0396
No log 0.6275 32 1.1333 0.1618 1.1333 1.0646
No log 0.6667 34 1.1508 0.1764 1.1508 1.0727
No log 0.7059 36 1.0651 0.2781 1.0651 1.0320
No log 0.7451 38 1.0078 0.3162 1.0078 1.0039
No log 0.7843 40 0.9898 0.2594 0.9898 0.9949
No log 0.8235 42 0.9866 0.2008 0.9866 0.9933
No log 0.8627 44 1.1143 0.0482 1.1143 1.0556
No log 0.9020 46 1.2296 -0.0833 1.2296 1.1089
No log 0.9412 48 1.2073 0.0924 1.2073 1.0988
No log 0.9804 50 1.1164 0.0385 1.1164 1.0566
No log 1.0196 52 1.1390 0.1770 1.1390 1.0672
No log 1.0588 54 1.2091 0.1052 1.2091 1.0996
No log 1.0980 56 1.1621 0.1473 1.1621 1.0780
No log 1.1373 58 1.0572 0.1446 1.0572 1.0282
No log 1.1765 60 0.9689 0.3733 0.9689 0.9843
No log 1.2157 62 0.9548 0.3229 0.9548 0.9771
No log 1.2549 64 1.0863 0.2271 1.0863 1.0422
No log 1.2941 66 1.2223 0.1744 1.2223 1.1056
No log 1.3333 68 1.3921 0.1449 1.3921 1.1799
No log 1.3725 70 1.4599 0.1004 1.4599 1.2083
No log 1.4118 72 1.2242 0.0338 1.2242 1.1064
No log 1.4510 74 1.0879 0.1653 1.0879 1.0430
No log 1.4902 76 1.0549 0.1680 1.0549 1.0271
No log 1.5294 78 1.1193 0.1801 1.1193 1.0580
No log 1.5686 80 1.2948 0.2167 1.2948 1.1379
No log 1.6078 82 1.4017 0.1500 1.4017 1.1839
No log 1.6471 84 1.3813 -0.0710 1.3813 1.1753
No log 1.6863 86 1.4027 -0.0400 1.4027 1.1844
No log 1.7255 88 1.3046 -0.0154 1.3046 1.1422
No log 1.7647 90 1.1795 -0.0120 1.1795 1.0860
No log 1.8039 92 1.0125 0.2015 1.0125 1.0062
No log 1.8431 94 0.9516 0.2341 0.9516 0.9755
No log 1.8824 96 0.9266 0.2865 0.9266 0.9626
No log 1.9216 98 0.9249 0.2088 0.9249 0.9617
No log 1.9608 100 0.9544 0.2128 0.9544 0.9770
No log 2.0 102 0.9560 0.2005 0.9560 0.9777
No log 2.0392 104 0.9412 0.2128 0.9412 0.9702
No log 2.0784 106 0.9439 0.3119 0.9439 0.9716
No log 2.1176 108 0.9352 0.3492 0.9352 0.9671
No log 2.1569 110 0.9142 0.3531 0.9142 0.9561
No log 2.1961 112 0.8977 0.3027 0.8977 0.9475
No log 2.2353 114 0.8873 0.3454 0.8873 0.9420
No log 2.2745 116 0.9289 0.3202 0.9289 0.9638
No log 2.3137 118 1.0609 0.3163 1.0609 1.0300
No log 2.3529 120 1.1561 0.2888 1.1561 1.0752
No log 2.3922 122 1.1411 0.2631 1.1411 1.0682
No log 2.4314 124 1.0414 0.2704 1.0414 1.0205
No log 2.4706 126 1.0234 0.2704 1.0234 1.0116
No log 2.5098 128 0.9433 0.3374 0.9433 0.9712
No log 2.5490 130 0.8689 0.3840 0.8689 0.9321
No log 2.5882 132 0.8644 0.3797 0.8644 0.9297
No log 2.6275 134 0.8820 0.3797 0.8820 0.9392
No log 2.6667 136 0.9010 0.3559 0.9010 0.9492
No log 2.7059 138 0.9400 0.4257 0.9400 0.9695
No log 2.7451 140 0.9148 0.5173 0.9148 0.9565
No log 2.7843 142 0.8692 0.5210 0.8692 0.9323
No log 2.8235 144 0.8388 0.4553 0.8388 0.9158
No log 2.8627 146 0.8404 0.4434 0.8404 0.9167
No log 2.9020 148 0.8275 0.4932 0.8275 0.9097
No log 2.9412 150 0.7980 0.5163 0.7980 0.8933
No log 2.9804 152 0.7952 0.5602 0.7952 0.8917
No log 3.0196 154 0.8448 0.4519 0.8448 0.9191
No log 3.0588 156 0.8916 0.4760 0.8916 0.9443
No log 3.0980 158 0.9315 0.3976 0.9315 0.9651
No log 3.1373 160 0.9507 0.3196 0.9507 0.9750
No log 3.1765 162 0.9082 0.4241 0.9082 0.9530
No log 3.2157 164 0.8368 0.4503 0.8368 0.9148
No log 3.2549 166 0.8172 0.4759 0.8172 0.9040
No log 3.2941 168 0.8568 0.4012 0.8568 0.9256
No log 3.3333 170 0.8881 0.3779 0.8881 0.9424
No log 3.3725 172 0.9027 0.4296 0.9027 0.9501
No log 3.4118 174 0.9161 0.4042 0.9161 0.9571
No log 3.4510 176 0.8761 0.4857 0.8761 0.9360
No log 3.4902 178 0.8314 0.4867 0.8314 0.9118
No log 3.5294 180 0.8049 0.5052 0.8049 0.8972
No log 3.5686 182 0.9443 0.4353 0.9443 0.9718
No log 3.6078 184 0.8843 0.4240 0.8843 0.9404
No log 3.6471 186 0.7444 0.4707 0.7444 0.8628
No log 3.6863 188 0.7598 0.5146 0.7598 0.8717
No log 3.7255 190 0.8458 0.4824 0.8458 0.9197
No log 3.7647 192 0.9901 0.3584 0.9901 0.9951
No log 3.8039 194 1.0161 0.3584 1.0161 1.0080
No log 3.8431 196 0.9619 0.4054 0.9619 0.9808
No log 3.8824 198 0.8748 0.3939 0.8748 0.9353
No log 3.9216 200 0.7719 0.4524 0.7719 0.8786
No log 3.9608 202 0.7258 0.4524 0.7258 0.8519
No log 4.0 204 0.7160 0.4269 0.7160 0.8462
No log 4.0392 206 0.7396 0.4524 0.7396 0.8600
No log 4.0784 208 0.7346 0.5010 0.7346 0.8571
No log 4.1176 210 0.7375 0.5010 0.7375 0.8588
No log 4.1569 212 0.7121 0.4908 0.7121 0.8439
No log 4.1961 214 0.7195 0.5098 0.7195 0.8482
No log 4.2353 216 0.7020 0.5844 0.7020 0.8379
No log 4.2745 218 0.7106 0.5516 0.7106 0.8430
No log 4.3137 220 0.6883 0.5010 0.6883 0.8296
No log 4.3529 222 0.7250 0.4405 0.7250 0.8515
No log 4.3922 224 0.7711 0.4929 0.7711 0.8781
No log 4.4314 226 0.7633 0.5048 0.7633 0.8737
No log 4.4706 228 0.7326 0.4524 0.7326 0.8559
No log 4.5098 230 0.7041 0.5010 0.7041 0.8391
No log 4.5490 232 0.7228 0.5094 0.7228 0.8502
No log 4.5882 234 0.8783 0.5729 0.8783 0.9372
No log 4.6275 236 0.8525 0.5536 0.8525 0.9233
No log 4.6667 238 0.7698 0.4991 0.7698 0.8774
No log 4.7059 240 0.7856 0.5357 0.7856 0.8864
No log 4.7451 242 0.8780 0.5232 0.8780 0.9370
No log 4.7843 244 0.9690 0.4181 0.9690 0.9844
No log 4.8235 246 0.9939 0.3662 0.9939 0.9969
No log 4.8627 248 0.8994 0.4250 0.8994 0.9484
No log 4.9020 250 0.8453 0.4012 0.8453 0.9194
No log 4.9412 252 0.8324 0.3933 0.8324 0.9124
No log 4.9804 254 0.8293 0.4321 0.8293 0.9107
No log 5.0196 256 0.8453 0.4338 0.8453 0.9194
No log 5.0588 258 0.8258 0.4321 0.8258 0.9087
No log 5.0980 260 0.8067 0.4313 0.8067 0.8981
No log 5.1373 262 0.8186 0.4141 0.8186 0.9048
No log 5.1765 264 0.8706 0.4948 0.8706 0.9330
No log 5.2157 266 0.9502 0.5012 0.9502 0.9748
No log 5.2549 268 1.0224 0.3863 1.0224 1.0111
No log 5.2941 270 0.9830 0.4249 0.9830 0.9915
No log 5.3333 272 0.9434 0.3880 0.9434 0.9713
No log 5.3725 274 0.9018 0.4192 0.9018 0.9497
No log 5.4118 276 0.8620 0.3976 0.8620 0.9284
No log 5.4510 278 0.8393 0.3859 0.8393 0.9161
No log 5.4902 280 0.8438 0.3859 0.8438 0.9186
No log 5.5294 282 0.8618 0.3977 0.8618 0.9283
No log 5.5686 284 0.9032 0.4368 0.9032 0.9503
No log 5.6078 286 0.9836 0.3436 0.9836 0.9917
No log 5.6471 288 1.0427 0.3699 1.0427 1.0211
No log 5.6863 290 1.0079 0.4255 1.0079 1.0040
No log 5.7255 292 0.9089 0.4352 0.9089 0.9534
No log 5.7647 294 0.8571 0.4048 0.8571 0.9258
No log 5.8039 296 0.8538 0.4313 0.8538 0.9240
No log 5.8431 298 0.8454 0.4313 0.8454 0.9194
No log 5.8824 300 0.8648 0.4012 0.8648 0.9299
No log 5.9216 302 0.9690 0.4135 0.9690 0.9844
No log 5.9608 304 1.0323 0.3506 1.0323 1.0160
No log 6.0 306 1.0127 0.4697 1.0127 1.0063
No log 6.0392 308 0.9915 0.4444 0.9915 0.9957
No log 6.0784 310 0.9247 0.3577 0.9247 0.9616
No log 6.1176 312 0.8737 0.3369 0.8737 0.9347
No log 6.1569 314 0.8425 0.4045 0.8425 0.9179
No log 6.1961 316 0.8877 0.4898 0.8877 0.9422
No log 6.2353 318 0.9302 0.4724 0.9302 0.9645
No log 6.2745 320 0.9140 0.5011 0.9140 0.9560
No log 6.3137 322 0.8862 0.4785 0.8862 0.9414
No log 6.3529 324 0.8628 0.4530 0.8628 0.9288
No log 6.3922 326 0.8612 0.4632 0.8612 0.9280
No log 6.4314 328 0.8981 0.4089 0.8981 0.9477
No log 6.4706 330 0.9319 0.3502 0.9319 0.9653
No log 6.5098 332 0.9818 0.3066 0.9818 0.9909
No log 6.5490 334 1.0302 0.4334 1.0302 1.0150
No log 6.5882 336 1.0720 0.3575 1.0720 1.0354
No log 6.6275 338 0.9815 0.4601 0.9815 0.9907
No log 6.6667 340 0.8796 0.5477 0.8796 0.9379
No log 6.7059 342 0.8255 0.4944 0.8255 0.9086
No log 6.7451 344 0.8672 0.4455 0.8672 0.9312
No log 6.7843 346 0.9492 0.4196 0.9492 0.9743
No log 6.8235 348 1.0416 0.4994 1.0416 1.0206
No log 6.8627 350 0.9680 0.5083 0.9680 0.9839
No log 6.9020 352 0.7617 0.4984 0.7617 0.8727
No log 6.9412 354 0.7739 0.4534 0.7739 0.8797
No log 6.9804 356 0.9150 0.3424 0.9150 0.9566
No log 7.0196 358 1.0627 0.2907 1.0627 1.0309
No log 7.0588 360 1.1439 0.2355 1.1439 1.0695
No log 7.0980 362 1.2028 0.3741 1.2028 1.0967
No log 7.1373 364 1.2669 0.3087 1.2669 1.1256
No log 7.1765 366 1.2553 0.3227 1.2553 1.1204
No log 7.2157 368 1.1264 0.3800 1.1264 1.0613
No log 7.2549 370 0.9375 0.2610 0.9375 0.9683
No log 7.2941 372 0.8338 0.4075 0.8338 0.9131
No log 7.3333 374 0.8290 0.3414 0.8290 0.9105
No log 7.3725 376 0.8540 0.4110 0.8540 0.9241
No log 7.4118 378 0.8850 0.3958 0.8850 0.9408
No log 7.4510 380 0.8716 0.4450 0.8716 0.9336
No log 7.4902 382 0.8351 0.4455 0.8351 0.9139
No log 7.5294 384 0.7689 0.4995 0.7689 0.8769
No log 7.5686 386 0.7468 0.5016 0.7468 0.8642
No log 7.6078 388 0.7415 0.4581 0.7415 0.8611
No log 7.6471 390 0.7705 0.4596 0.7705 0.8778
No log 7.6863 392 0.7921 0.4354 0.7921 0.8900
No log 7.7255 394 0.8038 0.4482 0.8038 0.8966
No log 7.7647 396 0.7854 0.4486 0.7854 0.8862
No log 7.8039 398 0.7600 0.4598 0.7600 0.8718
No log 7.8431 400 0.7524 0.4903 0.7524 0.8674
No log 7.8824 402 0.7269 0.4930 0.7269 0.8526
No log 7.9216 404 0.7265 0.5530 0.7265 0.8524
No log 7.9608 406 0.7867 0.5220 0.7867 0.8870
No log 8.0 408 0.7909 0.5220 0.7909 0.8893
No log 8.0392 410 0.7640 0.5102 0.7640 0.8741
No log 8.0784 412 0.7426 0.5287 0.7426 0.8617
No log 8.1176 414 0.7357 0.4952 0.7357 0.8577
No log 8.1569 416 0.7528 0.5342 0.7528 0.8677
No log 8.1961 418 0.7873 0.5500 0.7873 0.8873
No log 8.2353 420 0.8055 0.5571 0.8055 0.8975
No log 8.2745 422 0.8048 0.5473 0.8048 0.8971
No log 8.3137 424 0.7731 0.5342 0.7731 0.8793
No log 8.3529 426 0.7491 0.4922 0.7491 0.8655
No log 8.3922 428 0.7453 0.4930 0.7453 0.8633
No log 8.4314 430 0.7432 0.5055 0.7432 0.8621
No log 8.4706 432 0.7364 0.5055 0.7364 0.8582
No log 8.5098 434 0.7377 0.5580 0.7377 0.8589
No log 8.5490 436 0.7389 0.5596 0.7389 0.8596
No log 8.5882 438 0.7280 0.5580 0.7280 0.8532
No log 8.6275 440 0.7411 0.5248 0.7411 0.8609
No log 8.6667 442 0.7748 0.4995 0.7748 0.8802
No log 8.7059 444 0.7827 0.4519 0.7827 0.8847
No log 8.7451 446 0.7656 0.4772 0.7656 0.8750
No log 8.7843 448 0.7518 0.5030 0.7518 0.8670
No log 8.8235 450 0.7330 0.4719 0.7330 0.8561
No log 8.8627 452 0.7270 0.4719 0.7270 0.8527
No log 8.9020 454 0.7267 0.4719 0.7267 0.8525
No log 8.9412 456 0.7297 0.4813 0.7297 0.8542
No log 8.9804 458 0.7501 0.5343 0.7501 0.8661
No log 9.0196 460 0.7923 0.5622 0.7923 0.8901
No log 9.0588 462 0.8284 0.5041 0.8284 0.9102
No log 9.0980 464 0.8828 0.5106 0.8828 0.9396
No log 9.1373 466 0.9408 0.4249 0.9408 0.9700
No log 9.1765 468 0.8933 0.4667 0.8933 0.9451
No log 9.2157 470 0.8614 0.5007 0.8614 0.9281
No log 9.2549 472 0.8235 0.4455 0.8235 0.9075
No log 9.2941 474 0.8075 0.3822 0.8075 0.8986
No log 9.3333 476 0.8157 0.3454 0.8157 0.9032
No log 9.3725 478 0.8364 0.3454 0.8364 0.9145
No log 9.4118 480 0.8494 0.3668 0.8494 0.9216
No log 9.4510 482 0.8847 0.3531 0.8847 0.9406
No log 9.4902 484 0.9142 0.3531 0.9142 0.9562
No log 9.5294 486 0.9153 0.3531 0.9153 0.9567
No log 9.5686 488 0.9268 0.3394 0.9268 0.9627
No log 9.6078 490 0.9510 0.3003 0.9510 0.9752
No log 9.6471 492 0.9742 0.3003 0.9742 0.9870
No log 9.6863 494 0.9864 0.3539 0.9864 0.9932
No log 9.7255 496 0.9574 0.3672 0.9574 0.9785
No log 9.7647 498 0.9368 0.3672 0.9368 0.9679
0.3337 9.8039 500 0.8963 0.3842 0.8963 0.9467
0.3337 9.8431 502 0.8937 0.3824 0.8937 0.9454
0.3337 9.8824 504 0.8895 0.3824 0.8895 0.9431
0.3337 9.9216 506 0.8852 0.4089 0.8852 0.9409
0.3337 9.9608 508 0.8655 0.3647 0.8655 0.9303
0.3337 10.0 510 0.8504 0.3418 0.8504 0.9222
0.3337 10.0392 512 0.8335 0.3304 0.8335 0.9130
0.3337 10.0784 514 0.8133 0.3304 0.8133 0.9018
0.3337 10.1176 516 0.7980 0.3304 0.7980 0.8933

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
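With these framework versions, the checkpoint can be loaded as a standard sequence-classification model. A hedged usage sketch; the head type and how to interpret the logits are assumptions based on the regression-style metrics (MSE/RMSE/QWK), not stated in the card:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k20_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص عربي للتقييم", return_tensors="pt")  # placeholder Arabic input
outputs = model(**inputs)
scores = outputs.logits  # interpretation depends on the (unstated) task head
```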
Safetensors: 0.1B params, tensor type F32

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k20_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02