ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k17_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9743
  • QWK: 0.3920
  • MSE: 0.9743
  • RMSE: 0.9871
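As a reminder of how these three metrics relate, the sketch below computes MSE, RMSE (the square root of MSE, which is why the two columns track each other in the tables here), and quadratic weighted kappa (QWK) in pure Python. The `y_true`/`y_pred` values are illustrative placeholders, not this model's actual predictions.

```python
import math
from collections import Counter

def mse(y_true, y_pred):
    # Mean squared error between integer labels and predictions.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    # QWK = 1 - (observed disagreement) / (expected disagreement),
    # with disagreement weighted by squared label distance.
    n = len(y_true)
    obs = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    ct, cp = Counter(y_true), Counter(y_pred)
    # Expected disagreement under independence of the two label marginals.
    exp = sum(ct[i] * cp[j] * (i - j) ** 2
              for i in range(n_classes) for j in range(n_classes)) / (n * n)
    return 1.0 - obs / exp

y_true = [0, 1, 2, 2, 3, 1]  # placeholder gold labels
y_pred = [0, 1, 1, 2, 3, 2]  # placeholder predictions

m = mse(y_true, y_pred)
r = math.sqrt(m)  # RMSE is simply sqrt(MSE), matching the paired columns below
k = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
```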

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
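The hyperparameters above can be restated as a Trainer configuration. This is a hedged reconstruction: the original training script is not published, so the argument names and `output_dir` below are assumptions based on the standard `transformers.TrainingArguments` API.

```python
from transformers import TrainingArguments

# Assumed mapping of the listed hyperparameters onto TrainingArguments;
# Adam betas=(0.9, 0.999) and epsilon=1e-8 are the optimizer defaults.
args = TrainingArguments(
    output_dir="out",                 # placeholder, not from the model card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```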

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0385 2 4.7691 -0.0075 4.7691 2.1838
No log 0.0769 4 3.0979 -0.0348 3.0979 1.7601
No log 0.1154 6 2.0107 0.0062 2.0107 1.4180
No log 0.1538 8 1.8700 0.0062 1.8700 1.3675
No log 0.1923 10 1.4947 0.0227 1.4947 1.2226
No log 0.2308 12 1.2681 0.1257 1.2681 1.1261
No log 0.2692 14 1.1478 0.1935 1.1478 1.0714
No log 0.3077 16 1.0866 0.2729 1.0866 1.0424
No log 0.3462 18 1.1966 0.2126 1.1966 1.0939
No log 0.3846 20 1.1426 0.2521 1.1426 1.0689
No log 0.4231 22 1.2714 0.1347 1.2714 1.1276
No log 0.4615 24 1.5768 0.0169 1.5768 1.2557
No log 0.5 26 1.8955 0.0145 1.8955 1.3768
No log 0.5385 28 1.7122 0.0 1.7122 1.3085
No log 0.5769 30 1.5312 0.0 1.5312 1.2374
No log 0.6154 32 1.4690 0.0512 1.4690 1.2120
No log 0.6538 34 1.2118 0.1532 1.2118 1.1008
No log 0.6923 36 1.0950 0.2569 1.0950 1.0464
No log 0.7308 38 1.0687 0.2569 1.0687 1.0338
No log 0.7692 40 1.1164 0.1595 1.1164 1.0566
No log 0.8077 42 1.6018 0.2532 1.6018 1.2656
No log 0.8462 44 1.6702 0.1614 1.6702 1.2923
No log 0.8846 46 1.4338 0.2720 1.4338 1.1974
No log 0.9231 48 1.1505 0.1675 1.1505 1.0726
No log 0.9615 50 1.1338 0.3195 1.1338 1.0648
No log 1.0 52 1.1578 0.1918 1.1578 1.0760
No log 1.0385 54 1.1525 0.1918 1.1525 1.0735
No log 1.0769 56 1.1333 0.2782 1.1333 1.0645
No log 1.1154 58 1.1352 0.3663 1.1352 1.0655
No log 1.1538 60 1.1241 0.3189 1.1241 1.0602
No log 1.1923 62 1.1302 0.2935 1.1302 1.0631
No log 1.2308 64 1.2176 0.1495 1.2176 1.1034
No log 1.2692 66 1.2661 0.1495 1.2661 1.1252
No log 1.3077 68 1.1969 0.2402 1.1969 1.0940
No log 1.3462 70 1.0724 0.3902 1.0724 1.0356
No log 1.3846 72 1.0439 0.3708 1.0439 1.0217
No log 1.4231 74 1.0255 0.3145 1.0255 1.0127
No log 1.4615 76 1.0690 0.2935 1.0690 1.0339
No log 1.5 78 1.2119 0.2067 1.2119 1.1009
No log 1.5385 80 1.1682 0.2161 1.1682 1.0809
No log 1.5769 82 1.0167 0.3263 1.0167 1.0083
No log 1.6154 84 1.1183 0.1911 1.1183 1.0575
No log 1.6538 86 1.1695 0.1766 1.1695 1.0815
No log 1.6923 88 1.1088 0.3243 1.1088 1.0530
No log 1.7308 90 1.0467 0.3486 1.0467 1.0231
No log 1.7692 92 1.3672 0.1621 1.3672 1.1693
No log 1.8077 94 1.5792 0.1907 1.5792 1.2567
No log 1.8462 96 1.5500 0.2211 1.5500 1.2450
No log 1.8846 98 1.2763 0.2826 1.2763 1.1297
No log 1.9231 100 1.0002 0.3616 1.0002 1.0001
No log 1.9615 102 1.0212 0.2019 1.0212 1.0106
No log 2.0 104 1.0442 0.2019 1.0442 1.0219
No log 2.0385 106 1.0312 0.2317 1.0312 1.0155
No log 2.0769 108 0.9691 0.4244 0.9691 0.9844
No log 2.1154 110 1.0314 0.3534 1.0314 1.0156
No log 2.1538 112 1.0691 0.3959 1.0691 1.0340
No log 2.1923 114 0.9357 0.4159 0.9357 0.9673
No log 2.2308 116 0.9774 0.4236 0.9774 0.9887
No log 2.2692 118 0.9712 0.4035 0.9712 0.9855
No log 2.3077 120 1.0679 0.3747 1.0679 1.0334
No log 2.3462 122 1.1215 0.3743 1.1215 1.0590
No log 2.3846 124 1.0154 0.4038 1.0154 1.0077
No log 2.4231 126 0.9573 0.5025 0.9573 0.9784
No log 2.4615 128 0.9877 0.5024 0.9877 0.9938
No log 2.5 130 1.0713 0.4819 1.0713 1.0350
No log 2.5385 132 1.0592 0.4819 1.0592 1.0292
No log 2.5769 134 1.0277 0.5239 1.0277 1.0137
No log 2.6154 136 0.9777 0.5151 0.9777 0.9888
No log 2.6538 138 0.9448 0.5287 0.9448 0.9720
No log 2.6923 140 0.9190 0.5239 0.9190 0.9587
No log 2.7308 142 0.8948 0.4685 0.8948 0.9460
No log 2.7692 144 0.9293 0.5022 0.9293 0.9640
No log 2.8077 146 0.8963 0.4396 0.8963 0.9467
No log 2.8462 148 1.0667 0.4372 1.0667 1.0328
No log 2.8846 150 1.0547 0.3823 1.0547 1.0270
No log 2.9231 152 0.9877 0.4066 0.9877 0.9938
No log 2.9615 154 1.1946 0.3339 1.1946 1.0930
No log 3.0 156 1.1827 0.2876 1.1827 1.0875
No log 3.0385 158 1.0395 0.4253 1.0395 1.0196
No log 3.0769 160 1.2351 0.2799 1.2351 1.1113
No log 3.1154 162 1.3041 0.2461 1.3041 1.1420
No log 3.1538 164 1.1423 0.2712 1.1423 1.0688
No log 3.1923 166 1.0598 0.3387 1.0598 1.0295
No log 3.2308 168 1.0556 0.3566 1.0556 1.0274
No log 3.2692 170 1.0327 0.3973 1.0327 1.0162
No log 3.3077 172 0.9856 0.3804 0.9856 0.9928
No log 3.3462 174 0.9908 0.3874 0.9908 0.9954
No log 3.3846 176 0.9738 0.3365 0.9738 0.9868
No log 3.4231 178 1.0388 0.2572 1.0388 1.0192
No log 3.4615 180 1.0068 0.2180 1.0068 1.0034
No log 3.5 182 0.9880 0.3705 0.9880 0.9940
No log 3.5385 184 1.0677 0.3572 1.0677 1.0333
No log 3.5769 186 1.0911 0.3533 1.0911 1.0446
No log 3.6154 188 1.1181 0.4270 1.1181 1.0574
No log 3.6538 190 1.1263 0.4374 1.1263 1.0613
No log 3.6923 192 1.2599 0.1876 1.2599 1.1225
No log 3.7308 194 1.2044 0.1775 1.2044 1.0974
No log 3.7692 196 1.0447 0.4690 1.0447 1.0221
No log 3.8077 198 1.1602 0.3539 1.1602 1.0771
No log 3.8462 200 1.2398 0.2634 1.2398 1.1134
No log 3.8846 202 1.0991 0.1755 1.0991 1.0484
No log 3.9231 204 0.9900 0.3174 0.9900 0.9950
No log 3.9615 206 0.9862 0.3951 0.9862 0.9931
No log 4.0 208 0.9677 0.3848 0.9677 0.9837
No log 4.0385 210 0.9228 0.3838 0.9228 0.9606
No log 4.0769 212 0.9574 0.3308 0.9574 0.9785
No log 4.1154 214 1.0789 0.3607 1.0789 1.0387
No log 4.1538 216 1.0468 0.3739 1.0468 1.0231
No log 4.1923 218 0.9109 0.4470 0.9109 0.9544
No log 4.2308 220 0.9024 0.4256 0.9024 0.9499
No log 4.2692 222 0.9038 0.4256 0.9038 0.9507
No log 4.3077 224 0.9143 0.4340 0.9143 0.9562
No log 4.3462 226 1.0064 0.3530 1.0064 1.0032
No log 4.3846 228 1.0312 0.3625 1.0312 1.0155
No log 4.4231 230 0.9632 0.3464 0.9632 0.9814
No log 4.4615 232 0.9293 0.3478 0.9293 0.9640
No log 4.5 234 0.9032 0.4159 0.9032 0.9504
No log 4.5385 236 0.8933 0.3869 0.8933 0.9451
No log 4.5769 238 0.9048 0.3773 0.9048 0.9512
No log 4.6154 240 0.9270 0.3813 0.9270 0.9628
No log 4.6538 242 0.9962 0.3723 0.9962 0.9981
No log 4.6923 244 0.9722 0.3572 0.9722 0.9860
No log 4.7308 246 0.9010 0.5102 0.9010 0.9492
No log 4.7692 248 0.9876 0.4829 0.9876 0.9938
No log 4.8077 250 0.9807 0.4829 0.9807 0.9903
No log 4.8462 252 0.9284 0.4690 0.9284 0.9635
No log 4.8846 254 0.9314 0.5040 0.9314 0.9651
No log 4.9231 256 0.9295 0.4626 0.9295 0.9641
No log 4.9615 258 0.9323 0.3885 0.9323 0.9656
No log 5.0 260 0.9472 0.3764 0.9472 0.9733
No log 5.0385 262 0.9787 0.3874 0.9787 0.9893
No log 5.0769 264 0.9983 0.3160 0.9983 0.9991
No log 5.1154 266 1.0056 0.3019 1.0056 1.0028
No log 5.1538 268 1.0285 0.4069 1.0285 1.0142
No log 5.1923 270 1.0735 0.3770 1.0735 1.0361
No log 5.2308 272 1.1247 0.3967 1.1247 1.0605
No log 5.2692 274 1.0380 0.4129 1.0380 1.0188
No log 5.3077 276 0.9453 0.4240 0.9453 0.9723
No log 5.3462 278 0.9283 0.4159 0.9283 0.9635
No log 5.3846 280 0.9201 0.4218 0.9201 0.9592
No log 5.4231 282 0.9371 0.4107 0.9371 0.9681
No log 5.4615 284 0.9692 0.3936 0.9692 0.9845
No log 5.5 286 0.9456 0.4107 0.9456 0.9724
No log 5.5385 288 0.9247 0.3819 0.9247 0.9616
No log 5.5769 290 0.9344 0.4548 0.9344 0.9666
No log 5.6154 292 0.9603 0.3771 0.9603 0.9799
No log 5.6538 294 0.9764 0.3513 0.9764 0.9881
No log 5.6923 296 0.9667 0.4002 0.9667 0.9832
No log 5.7308 298 1.0046 0.4069 1.0046 1.0023
No log 5.7692 300 1.0138 0.4454 1.0138 1.0069
No log 5.8077 302 1.0424 0.4022 1.0424 1.0210
No log 5.8462 304 1.0225 0.4113 1.0225 1.0112
No log 5.8846 306 1.0587 0.4201 1.0587 1.0289
No log 5.9231 308 1.1395 0.3647 1.1395 1.0675
No log 5.9615 310 1.2230 0.3421 1.2230 1.1059
No log 6.0 312 1.1924 0.3169 1.1924 1.0920
No log 6.0385 314 1.1224 0.3478 1.1224 1.0594
No log 6.0769 316 0.9568 0.4202 0.9568 0.9782
No log 6.1154 318 0.8540 0.4548 0.8540 0.9241
No log 6.1538 320 0.8592 0.4646 0.8592 0.9269
No log 6.1923 322 0.9689 0.3707 0.9689 0.9843
No log 6.2308 324 1.0591 0.3864 1.0591 1.0291
No log 6.2692 326 1.0616 0.3864 1.0616 1.0304
No log 6.3077 328 0.9778 0.4223 0.9778 0.9888
No log 6.3462 330 0.8731 0.4450 0.8731 0.9344
No log 6.3846 332 0.8822 0.4019 0.8822 0.9393
No log 6.4231 334 0.8915 0.4356 0.8915 0.9442
No log 6.4615 336 0.9412 0.4373 0.9412 0.9701
No log 6.5 338 1.0246 0.4067 1.0246 1.0122
No log 6.5385 340 1.0736 0.3697 1.0736 1.0361
No log 6.5769 342 1.0645 0.3217 1.0645 1.0317
No log 6.6154 344 1.0672 0.2871 1.0672 1.0331
No log 6.6538 346 1.0800 0.2871 1.0800 1.0393
No log 6.6923 348 1.1194 0.2673 1.1194 1.0580
No log 6.7308 350 1.0879 0.2673 1.0879 1.0430
No log 6.7692 352 1.0228 0.3018 1.0228 1.0113
No log 6.8077 354 0.9936 0.4349 0.9936 0.9968
No log 6.8462 356 0.9867 0.4568 0.9867 0.9933
No log 6.8846 358 0.9800 0.5176 0.9800 0.9899
No log 6.9231 360 0.9628 0.5176 0.9628 0.9812
No log 6.9615 362 0.9413 0.4196 0.9413 0.9702
No log 7.0 364 0.9304 0.3126 0.9304 0.9646
No log 7.0385 366 0.9288 0.2714 0.9288 0.9637
No log 7.0769 368 0.8874 0.3678 0.8874 0.9420
No log 7.1154 370 0.8598 0.4381 0.8598 0.9273
No log 7.1538 372 0.8907 0.4889 0.8907 0.9438
No log 7.1923 374 0.8970 0.4889 0.8970 0.9471
No log 7.2308 376 0.8612 0.5098 0.8612 0.9280
No log 7.2692 378 0.8517 0.4496 0.8517 0.9229
No log 7.3077 380 0.8598 0.4667 0.8598 0.9273
No log 7.3462 382 0.8411 0.4407 0.8411 0.9171
No log 7.3846 384 0.8396 0.5082 0.8396 0.9163
No log 7.4231 386 0.8895 0.5339 0.8895 0.9431
No log 7.4615 388 0.9508 0.4781 0.9508 0.9751
No log 7.5 390 1.0488 0.4587 1.0488 1.0241
No log 7.5385 392 1.0232 0.4513 1.0232 1.0115
No log 7.5769 394 1.0049 0.4513 1.0049 1.0025
No log 7.6154 396 0.9369 0.4914 0.9369 0.9680
No log 7.6538 398 0.8593 0.5202 0.8593 0.9270
No log 7.6923 400 0.8330 0.5229 0.8330 0.9127
No log 7.7308 402 0.8315 0.5318 0.8315 0.9119
No log 7.7692 404 0.8610 0.4416 0.8610 0.9279
No log 7.8077 406 0.8350 0.4454 0.8350 0.9138
No log 7.8462 408 0.9117 0.4201 0.9117 0.9548
No log 7.8846 410 0.9350 0.3572 0.9350 0.9670
No log 7.9231 412 0.8811 0.3237 0.8811 0.9387
No log 7.9615 414 0.8437 0.4681 0.8437 0.9185
No log 8.0 416 0.8532 0.4181 0.8532 0.9237
No log 8.0385 418 0.8781 0.4050 0.8781 0.9371
No log 8.0769 420 0.9349 0.4833 0.9349 0.9669
No log 8.1154 422 1.0325 0.4558 1.0325 1.0161
No log 8.1538 424 1.1495 0.4101 1.1495 1.0722
No log 8.1923 426 1.2082 0.2961 1.2082 1.0992
No log 8.2308 428 1.1291 0.3596 1.1291 1.0626
No log 8.2692 430 1.0389 0.2857 1.0389 1.0193
No log 8.3077 432 1.0158 0.3369 1.0158 1.0079
No log 8.3462 434 0.9871 0.3506 0.9871 0.9935
No log 8.3846 436 0.9791 0.3747 0.9791 0.9895
No log 8.4231 438 1.0444 0.4663 1.0444 1.0219
No log 8.4615 440 1.1566 0.3989 1.1566 1.0754
No log 8.5 442 1.1444 0.4217 1.1444 1.0697
No log 8.5385 444 0.9907 0.4811 0.9907 0.9953
No log 8.5769 446 0.9339 0.4663 0.9339 0.9664
No log 8.6154 448 0.9381 0.4663 0.9381 0.9686
No log 8.6538 450 0.9852 0.3979 0.9852 0.9926
No log 8.6923 452 1.0491 0.3377 1.0491 1.0243
No log 8.7308 454 1.0523 0.2878 1.0523 1.0258
No log 8.7692 456 0.9981 0.3019 0.9981 0.9991
No log 8.8077 458 0.9723 0.3396 0.9723 0.9860
No log 8.8462 460 0.9664 0.4238 0.9664 0.9831
No log 8.8846 462 0.9884 0.4328 0.9884 0.9942
No log 8.9231 464 1.0037 0.4754 1.0037 1.0018
No log 8.9615 466 0.9916 0.4754 0.9916 0.9958
No log 9.0 468 0.9674 0.4894 0.9674 0.9836
No log 9.0385 470 0.9765 0.4983 0.9765 0.9882
No log 9.0769 472 1.0358 0.4715 1.0358 1.0178
No log 9.1154 474 1.0863 0.3892 1.0863 1.0422
No log 9.1538 476 1.0550 0.4313 1.0550 1.0272
No log 9.1923 478 0.9767 0.3122 0.9767 0.9883
No log 9.2308 480 0.9786 0.2432 0.9786 0.9892
No log 9.2692 482 0.9817 0.2486 0.9817 0.9908
No log 9.3077 484 0.9691 0.2893 0.9691 0.9844
No log 9.3462 486 0.9862 0.2621 0.9862 0.9931
No log 9.3846 488 1.0363 0.2475 1.0363 1.0180
No log 9.4231 490 1.1062 0.3578 1.1062 1.0518
No log 9.4615 492 1.1024 0.3807 1.1024 1.0499
No log 9.5 494 1.0345 0.4345 1.0345 1.0171
No log 9.5385 496 0.8851 0.5086 0.8851 0.9408
No log 9.5769 498 0.8235 0.4718 0.8235 0.9075
0.422 9.6154 500 0.8323 0.4494 0.8323 0.9123
0.422 9.6538 502 0.8298 0.4064 0.8298 0.9110
0.422 9.6923 504 0.8639 0.4724 0.8639 0.9295
0.422 9.7308 506 0.8901 0.4434 0.8901 0.9434
0.422 9.7692 508 0.8621 0.4181 0.8621 0.9285
0.422 9.8077 510 0.8779 0.3855 0.8779 0.9369
0.422 9.8462 512 0.9607 0.3839 0.9607 0.9802
0.422 9.8846 514 0.9743 0.3920 0.9743 0.9871

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
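A minimal loading sketch for this checkpoint, assuming the versions above. `AutoModelForSequenceClassification` is an assumption about the saved head (the QWK/MSE metrics suggest an ordinal scoring task); running `load()` requires network access to the Hugging Face Hub.

```python
repo_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k17_task2_organization"

def load(repo: str = repo_id):
    # Imported lazily so the sketch only needs transformers when actually run.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForSequenceClassification.from_pretrained(repo)
    return tokenizer, model
```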
Model size: ~0.1B parameters (F32, safetensors)