ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k7_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8791
  • QWK (quadratic weighted kappa): 0.4534
  • MSE: 0.8791
  • RMSE: 0.9376
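QWK here is Cohen's kappa with quadratic weights, a standard agreement metric for ordinal scoring tasks, and RMSE is simply the square root of the MSE (hence 0.9376 ≈ √0.8791). A minimal pure-Python sketch on hypothetical labels (the real evaluation data is not part of this card):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (QWK) for ordinal labels 0..n_classes-1."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    # Marginal label counts for the expected (chance) agreement
    row = [sum(O[i]) for i in range(n_classes)]
    col = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * O[i][j]
            den += w * row[i] * col[j] / n
    return 1.0 - num / den

# Toy scores on a 0-4 scale (hypothetical, not the model's eval set)
y_true = [0, 1, 2, 3, 4, 2, 1]
y_pred = [0, 2, 2, 3, 3, 2, 0]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=5)

# RMSE is the square root of MSE, which is why the card's
# Loss/MSE (0.8791) and RMSE (0.9376) are mutually consistent.
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = math.sqrt(mse)
print(qwk, rmse)
```

Note that Loss equals MSE in the results above, consistent with the model being trained as a regression head under MSE loss.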

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
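With the linear scheduler, the learning rate decays from 2e-05 to zero over the scheduled training steps. A sketch of that decay, assuming zero warmup steps (none is listed) and roughly 23 optimizer steps per epoch (a hypothetical count implied by the training table, where step 2 lands at epoch 0.0870):

```python
# Linear learning-rate decay, as used by the `linear` scheduler above.
# Assumes zero warmup steps; 23 steps/epoch is inferred, not stated.
BASE_LR = 2e-05
NUM_EPOCHS = 100
STEPS_PER_EPOCH = 23

def linear_lr(step, total_steps, base_lr=BASE_LR):
    """Learning rate at a given optimizer step under linear decay to zero."""
    remaining = max(0, total_steps - step)
    return base_lr * remaining / total_steps

total = STEPS_PER_EPOCH * NUM_EPOCHS  # 2300 scheduled steps
print(linear_lr(0, total))            # 2e-05 at the start
print(linear_lr(total // 2, total))   # 1e-05 half-way through
print(linear_lr(total, total))        # 0.0 at the end
```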

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0870 2 4.8180 -0.0104 4.8180 2.1950
No log 0.1739 4 2.5644 0.0332 2.5644 1.6014
No log 0.2609 6 1.6466 0.0372 1.6466 1.2832
No log 0.3478 8 1.6210 0.0451 1.6210 1.2732
No log 0.4348 10 1.7016 -0.0798 1.7016 1.3044
No log 0.5217 12 1.5404 -0.0577 1.5404 1.2411
No log 0.6087 14 1.3550 0.0464 1.3550 1.1640
No log 0.6957 16 1.2937 0.1142 1.2937 1.1374
No log 0.7826 18 1.3089 0.0020 1.3090 1.1441
No log 0.8696 20 1.3557 -0.0974 1.3557 1.1643
No log 0.9565 22 1.3392 0.0038 1.3392 1.1572
No log 1.0435 24 1.3298 0.0339 1.3298 1.1532
No log 1.1304 26 1.4213 0.0317 1.4213 1.1922
No log 1.2174 28 1.4356 0.0317 1.4356 1.1982
No log 1.3043 30 1.3268 0.0488 1.3268 1.1519
No log 1.3913 32 1.1804 0.2986 1.1804 1.0865
No log 1.4783 34 1.0980 0.2155 1.0980 1.0478
No log 1.5652 36 1.0860 0.2624 1.0860 1.0421
No log 1.6522 38 1.1138 0.1814 1.1138 1.0553
No log 1.7391 40 1.1083 0.2290 1.1083 1.0527
No log 1.8261 42 1.0568 0.3094 1.0568 1.0280
No log 1.9130 44 1.0109 0.3190 1.0109 1.0055
No log 2.0 46 1.0194 0.3806 1.0194 1.0096
No log 2.0870 48 1.0958 0.4415 1.0958 1.0468
No log 2.1739 50 1.3668 0.2576 1.3668 1.1691
No log 2.2609 52 1.2130 0.3354 1.2130 1.1013
No log 2.3478 54 0.9226 0.4400 0.9226 0.9605
No log 2.4348 56 0.9144 0.5058 0.9144 0.9562
No log 2.5217 58 0.9604 0.4685 0.9604 0.9800
No log 2.6087 60 0.9112 0.5756 0.9112 0.9546
No log 2.6957 62 0.9581 0.4260 0.9581 0.9788
No log 2.7826 64 0.9878 0.4649 0.9878 0.9939
No log 2.8696 66 1.2928 0.3503 1.2928 1.1370
No log 2.9565 68 1.5461 0.2172 1.5461 1.2434
No log 3.0435 70 1.4246 0.2399 1.4246 1.1936
No log 3.1304 72 1.0969 0.3954 1.0969 1.0474
No log 3.2174 74 1.0487 0.4035 1.0487 1.0240
No log 3.3043 76 1.2058 0.3913 1.2058 1.0981
No log 3.3913 78 1.1346 0.3872 1.1346 1.0652
No log 3.4783 80 0.9809 0.4233 0.9809 0.9904
No log 3.5652 82 1.0107 0.3491 1.0107 1.0053
No log 3.6522 84 1.0703 0.4497 1.0703 1.0345
No log 3.7391 86 1.0664 0.4414 1.0664 1.0327
No log 3.8261 88 0.9899 0.4894 0.9899 0.9949
No log 3.9130 90 1.0433 0.4607 1.0433 1.0214
No log 4.0 92 1.2793 0.4245 1.2793 1.1311
No log 4.0870 94 1.1476 0.4921 1.1476 1.0712
No log 4.1739 96 0.9871 0.4591 0.9871 0.9935
No log 4.2609 98 0.9573 0.5324 0.9573 0.9784
No log 4.3478 100 0.9663 0.5012 0.9663 0.9830
No log 4.4348 102 0.9492 0.4530 0.9492 0.9743
No log 4.5217 104 1.0936 0.4260 1.0936 1.0458
No log 4.6087 106 1.0554 0.4313 1.0554 1.0273
No log 4.6957 108 0.9838 0.4128 0.9838 0.9919
No log 4.7826 110 1.2075 0.3525 1.2075 1.0988
No log 4.8696 112 1.2502 0.3631 1.2502 1.1181
No log 4.9565 114 1.1197 0.4219 1.1197 1.0581
No log 5.0435 116 1.0236 0.3927 1.0236 1.0117
No log 5.1304 118 1.0205 0.4014 1.0205 1.0102
No log 5.2174 120 1.1082 0.4551 1.1082 1.0527
No log 5.3043 122 1.3467 0.3772 1.3467 1.1605
No log 5.3913 124 1.3288 0.3772 1.3288 1.1527
No log 5.4783 126 1.0765 0.4485 1.0765 1.0375
No log 5.5652 128 1.0288 0.3993 1.0288 1.0143
No log 5.6522 130 1.0703 0.4372 1.0703 1.0345
No log 5.7391 132 1.0316 0.4078 1.0316 1.0157
No log 5.8261 134 1.0382 0.3872 1.0382 1.0189
No log 5.9130 136 1.0493 0.4286 1.0493 1.0244
No log 6.0 138 1.1129 0.4655 1.1129 1.0549
No log 6.0870 140 1.1349 0.4136 1.1349 1.0653
No log 6.1739 142 1.1365 0.4516 1.1365 1.0661
No log 6.2609 144 0.9873 0.5114 0.9873 0.9936
No log 6.3478 146 0.9661 0.4737 0.9661 0.9829
No log 6.4348 148 0.9585 0.4690 0.9585 0.9790
No log 6.5217 150 0.9777 0.4794 0.9777 0.9888
No log 6.6087 152 0.9722 0.4764 0.9722 0.9860
No log 6.6957 154 0.9461 0.3973 0.9461 0.9727
No log 6.7826 156 0.9944 0.3728 0.9944 0.9972
No log 6.8696 158 0.9655 0.3688 0.9655 0.9826
No log 6.9565 160 0.9696 0.4722 0.9696 0.9847
No log 7.0435 162 1.1088 0.4186 1.1088 1.0530
No log 7.1304 164 1.1272 0.4300 1.1272 1.0617
No log 7.2174 166 0.9688 0.4552 0.9688 0.9843
No log 7.3043 168 0.9598 0.5216 0.9598 0.9797
No log 7.3913 170 0.9673 0.4932 0.9673 0.9835
No log 7.4783 172 0.9298 0.4998 0.9298 0.9643
No log 7.5652 174 1.0557 0.3601 1.0557 1.0275
No log 7.6522 176 1.1939 0.3917 1.1939 1.0927
No log 7.7391 178 1.1055 0.3697 1.1055 1.0514
No log 7.8261 180 0.9524 0.4681 0.9524 0.9759
No log 7.9130 182 0.9398 0.4681 0.9398 0.9694
No log 8.0 184 1.0492 0.3601 1.0492 1.0243
No log 8.0870 186 1.1011 0.3827 1.1011 1.0493
No log 8.1739 188 1.1307 0.3429 1.1307 1.0634
No log 8.2609 190 0.9923 0.4339 0.9923 0.9961
No log 8.3478 192 0.9066 0.5418 0.9066 0.9522
No log 8.4348 194 0.9102 0.5391 0.9102 0.9540
No log 8.5217 196 0.9373 0.5098 0.9373 0.9682
No log 8.6087 198 0.9139 0.4871 0.9139 0.9560
No log 8.6957 200 0.9318 0.3969 0.9318 0.9653
No log 8.7826 202 0.9787 0.4774 0.9787 0.9893
No log 8.8696 204 0.9572 0.4326 0.9572 0.9784
No log 8.9565 206 0.9719 0.4718 0.9719 0.9858
No log 9.0435 208 1.0019 0.4898 1.0019 1.0010
No log 9.1304 210 1.1195 0.4284 1.1195 1.0581
No log 9.2174 212 1.0744 0.4942 1.0744 1.0365
No log 9.3043 214 1.0346 0.4718 1.0346 1.0171
No log 9.3913 216 1.0111 0.4727 1.0111 1.0055
No log 9.4783 218 1.0186 0.4724 1.0186 1.0093
No log 9.5652 220 1.0566 0.4220 1.0566 1.0279
No log 9.6522 222 1.0060 0.3838 1.0060 1.0030
No log 9.7391 224 0.9353 0.4454 0.9353 0.9671
No log 9.8261 226 0.9078 0.3948 0.9078 0.9528
No log 9.9130 228 0.9015 0.4920 0.9015 0.9495
No log 10.0 230 1.0052 0.4811 1.0052 1.0026
No log 10.0870 232 1.1177 0.4521 1.1177 1.0572
No log 10.1739 234 1.0251 0.5098 1.0251 1.0125
No log 10.2609 236 0.9153 0.5305 0.9153 0.9567
No log 10.3478 238 0.9113 0.5098 0.9113 0.9546
No log 10.4348 240 1.0287 0.4563 1.0287 1.0143
No log 10.5217 242 1.0514 0.4418 1.0514 1.0254
No log 10.6087 244 0.9703 0.5 0.9703 0.9850
No log 10.6957 246 0.9083 0.4808 0.9083 0.9531
No log 10.7826 248 0.9040 0.4499 0.9040 0.9508
No log 10.8696 250 0.9105 0.4894 0.9105 0.9542
No log 10.9565 252 1.0265 0.4741 1.0265 1.0132
No log 11.0435 254 1.0404 0.4771 1.0404 1.0200
No log 11.1304 256 1.0000 0.5227 1.0000 1.0000
No log 11.2174 258 0.9507 0.4823 0.9507 0.9751
No log 11.3043 260 0.9482 0.4835 0.9482 0.9738
No log 11.3913 262 1.0421 0.4458 1.0421 1.0208
No log 11.4783 264 1.1453 0.3491 1.1453 1.0702
No log 11.5652 266 1.0900 0.3918 1.0900 1.0440
No log 11.6522 268 1.0798 0.3567 1.0798 1.0391
No log 11.7391 270 1.1378 0.3629 1.1378 1.0667
No log 11.8261 272 1.1274 0.3357 1.1274 1.0618
No log 11.9130 274 1.0587 0.3596 1.0587 1.0290
No log 12.0 276 1.0386 0.3409 1.0386 1.0191
No log 12.0870 278 0.9810 0.4648 0.9810 0.9905
No log 12.1739 280 0.9782 0.4816 0.9782 0.9890
No log 12.2609 282 0.9982 0.4726 0.9982 0.9991
No log 12.3478 284 1.0606 0.4167 1.0606 1.0299
No log 12.4348 286 1.2308 0.3921 1.2308 1.1094
No log 12.5217 288 1.2197 0.3719 1.2197 1.1044
No log 12.6087 290 1.1132 0.3325 1.1132 1.0551
No log 12.6957 292 1.1261 0.3503 1.1261 1.0612
No log 12.7826 294 1.1652 0.3947 1.1652 1.0795
No log 12.8696 296 1.0862 0.3629 1.0862 1.0422
No log 12.9565 298 0.9812 0.3886 0.9812 0.9906
No log 13.0435 300 0.9516 0.4369 0.9516 0.9755
No log 13.1304 302 0.9675 0.4734 0.9675 0.9836
No log 13.2174 304 1.0426 0.4521 1.0426 1.0211
No log 13.3043 306 1.0619 0.4813 1.0619 1.0305
No log 13.3913 308 1.0146 0.4279 1.0146 1.0073
No log 13.4783 310 0.9415 0.5411 0.9415 0.9703
No log 13.5652 312 0.9232 0.4877 0.9232 0.9609
No log 13.6522 314 0.9654 0.5022 0.9654 0.9825
No log 13.7391 316 0.9707 0.4854 0.9707 0.9853
No log 13.8261 318 0.9186 0.4829 0.9186 0.9584
No log 13.9130 320 1.0089 0.4715 1.0089 1.0045
No log 14.0 322 1.0980 0.3503 1.0980 1.0479
No log 14.0870 324 1.0358 0.4368 1.0358 1.0177
No log 14.1739 326 0.9487 0.4401 0.9487 0.9740
No log 14.2609 328 0.9473 0.4593 0.9473 0.9733
No log 14.3478 330 0.9498 0.4053 0.9498 0.9746
No log 14.4348 332 0.9579 0.4110 0.9579 0.9787
No log 14.5217 334 0.9862 0.4533 0.9862 0.9931
No log 14.6087 336 0.9961 0.4715 0.9961 0.9980
No log 14.6957 338 0.9964 0.4630 0.9964 0.9982
No log 14.7826 340 0.9409 0.4726 0.9409 0.9700
No log 14.8696 342 0.9158 0.4964 0.9158 0.9570
No log 14.9565 344 0.9176 0.4125 0.9176 0.9579
No log 15.0435 346 0.9479 0.3643 0.9479 0.9736
No log 15.1304 348 1.0017 0.3237 1.0017 1.0009
No log 15.2174 350 1.0502 0.3367 1.0502 1.0248
No log 15.3043 352 1.0697 0.3474 1.0697 1.0343
No log 15.3913 354 1.1043 0.3726 1.1043 1.0509
No log 15.4783 356 1.0083 0.3958 1.0083 1.0042
No log 15.5652 358 0.9560 0.4016 0.9560 0.9777
No log 15.6522 360 0.9272 0.4196 0.9272 0.9629
No log 15.7391 362 0.9205 0.4937 0.9205 0.9594
No log 15.8261 364 1.0556 0.4186 1.0556 1.0274
No log 15.9130 366 1.1760 0.3388 1.1760 1.0844
No log 16.0 368 1.0817 0.4383 1.0817 1.0401
No log 16.0870 370 0.8979 0.5245 0.8979 0.9476
No log 16.1739 372 0.8628 0.5026 0.8628 0.9289
No log 16.2609 374 0.8584 0.4560 0.8584 0.9265
No log 16.3478 376 0.8920 0.5406 0.8920 0.9445
No log 16.4348 378 0.9590 0.5086 0.9590 0.9793
No log 16.5217 380 1.0484 0.4390 1.0484 1.0239
No log 16.6087 382 0.9866 0.4655 0.9866 0.9933
No log 16.6957 384 0.8926 0.4737 0.8926 0.9448
No log 16.7826 386 0.8820 0.4864 0.8820 0.9392
No log 16.8696 388 0.9283 0.5114 0.9283 0.9635
No log 16.9565 390 0.9948 0.4655 0.9948 0.9974
No log 17.0435 392 1.0547 0.4186 1.0547 1.0270
No log 17.1304 394 0.9726 0.4426 0.9726 0.9862
No log 17.2174 396 0.8943 0.4998 0.8943 0.9457
No log 17.3043 398 0.8616 0.4424 0.8616 0.9282
No log 17.3913 400 0.9279 0.4027 0.9279 0.9633
No log 17.4783 402 0.9181 0.4281 0.9181 0.9582
No log 17.5652 404 0.8511 0.5260 0.8511 0.9225
No log 17.6522 406 0.9094 0.5041 0.9094 0.9536
No log 17.7391 408 1.1360 0.4073 1.1360 1.0658
No log 17.8261 410 1.2300 0.3772 1.2300 1.1091
No log 17.9130 412 1.1139 0.4073 1.1139 1.0554
No log 18.0 414 0.9238 0.5515 0.9238 0.9612
No log 18.0870 416 0.8368 0.4514 0.8368 0.9148
No log 18.1739 418 0.8725 0.4752 0.8725 0.9341
No log 18.2609 420 0.8744 0.4991 0.8744 0.9351
No log 18.3478 422 0.8420 0.5343 0.8420 0.9176
No log 18.4348 424 0.8990 0.4836 0.8990 0.9481
No log 18.5217 426 1.0914 0.4976 1.0914 1.0447
No log 18.6087 428 1.2082 0.5273 1.2082 1.0992
No log 18.6957 430 1.1310 0.5290 1.1310 1.0635
No log 18.7826 432 0.9560 0.4862 0.9560 0.9777
No log 18.8696 434 0.8491 0.5318 0.8491 0.9214
No log 18.9565 436 0.8277 0.5131 0.8277 0.9098
No log 19.0435 438 0.8507 0.5374 0.8507 0.9223
No log 19.1304 440 0.8546 0.5560 0.8546 0.9245
No log 19.2174 442 0.8684 0.5536 0.8684 0.9319
No log 19.3043 444 0.8498 0.5560 0.8498 0.9219
No log 19.3913 446 0.8114 0.5470 0.8114 0.9008
No log 19.4783 448 0.7957 0.5868 0.7957 0.8920
No log 19.5652 450 0.7951 0.5749 0.7951 0.8917
No log 19.6522 452 0.8139 0.5562 0.8139 0.9022
No log 19.7391 454 0.8099 0.5344 0.8099 0.9000
No log 19.8261 456 0.8210 0.4923 0.8210 0.9061
No log 19.9130 458 0.8548 0.5333 0.8548 0.9245
No log 20.0 460 0.8586 0.5073 0.8586 0.9266
No log 20.0870 462 0.8513 0.4676 0.8513 0.9227
No log 20.1739 464 0.8758 0.5119 0.8758 0.9359
No log 20.2609 466 0.9589 0.4902 0.9589 0.9792
No log 20.3478 468 1.0961 0.3907 1.0961 1.0469
No log 20.4348 470 1.0813 0.4022 1.0813 1.0399
No log 20.5217 472 0.9854 0.4726 0.9854 0.9927
No log 20.6087 474 0.9267 0.4636 0.9267 0.9626
No log 20.6957 476 0.9322 0.4256 0.9322 0.9655
No log 20.7826 478 0.9485 0.4736 0.9485 0.9739
No log 20.8696 480 1.0266 0.4 1.0266 1.0132
No log 20.9565 482 1.1899 0.3548 1.1899 1.0908
No log 21.0435 484 1.2184 0.3548 1.2184 1.1038
No log 21.1304 486 1.1171 0.4272 1.1171 1.0570
No log 21.2174 488 1.0163 0.3606 1.0163 1.0081
No log 21.3043 490 0.9599 0.4087 0.9599 0.9798
No log 21.3913 492 0.9628 0.4604 0.9628 0.9812
No log 21.4783 494 0.9932 0.4724 0.9932 0.9966
No log 21.5652 496 0.9932 0.4942 0.9932 0.9966
No log 21.6522 498 0.9402 0.5291 0.9402 0.9696
0.3332 21.7391 500 0.9156 0.5291 0.9156 0.9569
0.3332 21.8261 502 0.8780 0.5057 0.8780 0.9370
0.3332 21.9130 504 0.8780 0.4719 0.8780 0.9370
0.3332 22.0 506 0.8873 0.4719 0.8873 0.9420
0.3332 22.0870 508 0.8586 0.4771 0.8586 0.9266
0.3332 22.1739 510 0.8558 0.5504 0.8558 0.9251
0.3332 22.2609 512 0.8675 0.5311 0.8675 0.9314
0.3332 22.3478 514 0.8716 0.5291 0.8716 0.9336
0.3332 22.4348 516 0.8681 0.5291 0.8681 0.9317
0.3332 22.5217 518 0.8616 0.5331 0.8616 0.9282
0.3332 22.6087 520 0.9006 0.5070 0.9006 0.9490
0.3332 22.6957 522 0.9441 0.4734 0.9441 0.9716
0.3332 22.7826 524 0.9775 0.4874 0.9775 0.9887
0.3332 22.8696 526 1.0346 0.4958 1.0346 1.0172
0.3332 22.9565 528 0.9626 0.5385 0.9626 0.9811
0.3332 23.0435 530 0.9088 0.5235 0.9088 0.9533
0.3332 23.1304 532 0.8784 0.5253 0.8784 0.9372
0.3332 23.2174 534 0.8613 0.5501 0.8613 0.9281
0.3332 23.3043 536 0.8592 0.4676 0.8592 0.9269
0.3332 23.3913 538 0.8678 0.4593 0.8678 0.9316
0.3332 23.4783 540 0.8715 0.4746 0.8715 0.9336
0.3332 23.5652 542 0.9012 0.5610 0.9012 0.9493
0.3332 23.6522 544 0.9303 0.4940 0.9303 0.9645
0.3332 23.7391 546 0.9166 0.4694 0.9166 0.9574
0.3332 23.8261 548 0.8973 0.4840 0.8973 0.9472
0.3332 23.9130 550 0.8919 0.4840 0.8919 0.9444
0.3332 24.0 552 0.8791 0.4534 0.8791 0.9376
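In the table above, the Epoch column is just Step divided by the number of optimizer steps per epoch: step 2 at epoch 0.0870 implies roughly 23 steps per epoch (so about 184 training examples at batch size 8, though the card does not state the dataset size). The log also stops at epoch 24.0 (step 552) despite num_epochs being 100, which suggests early stopping, although the card does not say so. A quick consistency check:

```python
# Relate the Step and Epoch columns of the training log.
# STEPS_PER_EPOCH = 23 is inferred from the table (step 2 -> epoch 0.0870).
STEPS_PER_EPOCH = 23

def epoch_at(step, steps_per_epoch=STEPS_PER_EPOCH):
    """Epoch value logged at a given optimizer step, to 4 decimals."""
    return round(step / steps_per_epoch, 4)

print(epoch_at(2))    # 0.087 -- matches the first logged row
print(epoch_at(46))   # 2.0
print(epoch_at(552))  # 24.0  -- matches the last logged row
```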

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model weights are available in Safetensors format (~0.1B parameters, F32 tensors).

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k7_task2_organization

Finetuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of that base model).