ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k13_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0238
  • QWK (quadratic weighted kappa): 0.3676
  • MSE: 1.0238
  • RMSE: 1.0118
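These metrics can be reproduced with standard implementations; below is a minimal sketch using scikit-learn. The card does not state which implementation was used, and the labels here are made up for illustration:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold labels and predictions, for illustration only.
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 1, 2, 2]

# Quadratic weighted kappa, as reported in the card's "QWK" column.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```

Note that in the results above Loss equals MSE, consistent with a mean-squared-error training objective, and RMSE is simply its square root.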

Model description

More information needed

Intended uses & limitations

More information needed
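No usage details are provided, but the checkpoint can presumably be loaded with the standard transformers API. A minimal inference sketch, assuming the model loads as a sequence-classification (scoring) head; the head type and the meaning of the output scores are assumptions, not stated in the card:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id taken from the model card; head type is an assumption.
repo = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k13_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Score a sample Arabic sentence (illustrative input).
inputs = tokenizer("نص عربي للتقييم", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```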

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0488 2 4.6810 0.0010 4.6810 2.1636
No log 0.0976 4 2.8211 -0.0233 2.8211 1.6796
No log 0.1463 6 1.9576 0.0062 1.9576 1.3991
No log 0.1951 8 1.4039 0.0393 1.4039 1.1849
No log 0.2439 10 1.4997 0.0037 1.4997 1.2246
No log 0.2927 12 1.5257 -0.0277 1.5257 1.2352
No log 0.3415 14 1.2734 0.1203 1.2734 1.1285
No log 0.3902 16 1.2566 0.1009 1.2566 1.1210
No log 0.4390 18 1.2323 0.1649 1.2323 1.1101
No log 0.4878 20 1.1994 0.2083 1.1994 1.0952
No log 0.5366 22 1.2102 0.1593 1.2102 1.1001
No log 0.5854 24 1.1975 0.2520 1.1975 1.0943
No log 0.6341 26 1.1697 0.1176 1.1697 1.0815
No log 0.6829 28 1.1800 0.1585 1.1800 1.0863
No log 0.7317 30 1.2564 0.0446 1.2564 1.1209
No log 0.7805 32 1.2053 0.1042 1.2053 1.0979
No log 0.8293 34 1.1440 0.1417 1.1440 1.0696
No log 0.8780 36 1.2890 0.0050 1.2890 1.1354
No log 0.9268 38 1.4968 0.0254 1.4968 1.2235
No log 0.9756 40 1.5965 0.0 1.5965 1.2635
No log 1.0244 42 1.5294 0.0 1.5294 1.2367
No log 1.0732 44 1.4084 0.0403 1.4084 1.1868
No log 1.1220 46 1.3651 0.0104 1.3651 1.1684
No log 1.1707 48 1.3585 0.0275 1.3585 1.1655
No log 1.2195 50 1.2897 0.1865 1.2897 1.1356
No log 1.2683 52 1.2565 0.1507 1.2565 1.1209
No log 1.3171 54 1.2142 0.2478 1.2142 1.1019
No log 1.3659 56 1.2215 0.3038 1.2215 1.1052
No log 1.4146 58 1.1742 0.2155 1.1742 1.0836
No log 1.4634 60 1.1696 0.1042 1.1696 1.0815
No log 1.5122 62 1.1720 0.1711 1.1720 1.0826
No log 1.5610 64 1.1676 0.2360 1.1676 1.0805
No log 1.6098 66 1.1724 0.1693 1.1724 1.0828
No log 1.6585 68 1.1830 0.1654 1.1830 1.0876
No log 1.7073 70 1.1430 0.3041 1.1430 1.0691
No log 1.7561 72 1.1393 0.3041 1.1393 1.0674
No log 1.8049 74 1.1072 0.2520 1.1072 1.0522
No log 1.8537 76 1.1331 0.2569 1.1331 1.0645
No log 1.9024 78 1.1403 0.1979 1.1403 1.0679
No log 1.9512 80 1.1307 0.1752 1.1307 1.0633
No log 2.0 82 1.1247 0.1857 1.1247 1.0605
No log 2.0488 84 1.1520 0.1585 1.1520 1.0733
No log 2.0976 86 1.1888 0.1076 1.1888 1.0903
No log 2.1463 88 1.2366 0.0442 1.2366 1.1120
No log 2.1951 90 1.2436 0.1637 1.2436 1.1151
No log 2.2439 92 1.3021 0.1313 1.3021 1.1411
No log 2.2927 94 1.6031 0.2644 1.6031 1.2661
No log 2.3415 96 1.6921 0.2393 1.6921 1.3008
No log 2.3902 98 1.3970 0.3500 1.3970 1.1819
No log 2.4390 100 1.0936 0.3710 1.0936 1.0458
No log 2.4878 102 1.1174 0.2539 1.1174 1.0571
No log 2.5366 104 1.0289 0.3639 1.0289 1.0144
No log 2.5854 106 1.0121 0.2871 1.0121 1.0060
No log 2.6341 108 1.0471 0.4625 1.0471 1.0233
No log 2.6829 110 0.9471 0.3697 0.9471 0.9732
No log 2.7317 112 0.9089 0.4512 0.9089 0.9534
No log 2.7805 114 0.9555 0.4513 0.9555 0.9775
No log 2.8293 116 1.0229 0.4166 1.0229 1.0114
No log 2.8780 118 0.9622 0.3256 0.9622 0.9809
No log 2.9268 120 0.9696 0.2995 0.9696 0.9847
No log 2.9756 122 1.0085 0.3869 1.0085 1.0043
No log 3.0244 124 1.1325 0.3188 1.1325 1.0642
No log 3.0732 126 1.0123 0.3773 1.0123 1.0062
No log 3.1220 128 0.9800 0.3371 0.9800 0.9899
No log 3.1707 130 0.9819 0.3445 0.9819 0.9909
No log 3.2195 132 1.2147 0.3574 1.2147 1.1021
No log 3.2683 134 1.2498 0.3042 1.2498 1.1179
No log 3.3171 136 0.9471 0.3849 0.9471 0.9732
No log 3.3659 138 1.1496 0.4368 1.1496 1.0722
No log 3.4146 140 1.1163 0.4166 1.1163 1.0566
No log 3.4634 142 0.9184 0.3756 0.9184 0.9583
No log 3.5122 144 0.9866 0.4755 0.9866 0.9933
No log 3.5610 146 0.9698 0.4808 0.9698 0.9848
No log 3.6098 148 0.9362 0.3927 0.9362 0.9676
No log 3.6585 150 0.9388 0.3196 0.9388 0.9689
No log 3.7073 152 0.9355 0.3139 0.9355 0.9672
No log 3.7561 154 0.9536 0.4867 0.9536 0.9765
No log 3.8049 156 1.3531 0.3039 1.3531 1.1632
No log 3.8537 158 1.5001 0.2374 1.5001 1.2248
No log 3.9024 160 1.1035 0.4689 1.1035 1.0505
No log 3.9512 162 0.9930 0.3200 0.9930 0.9965
No log 4.0 164 1.1729 0.3404 1.1729 1.0830
No log 4.0488 166 1.1254 0.4076 1.1254 1.0609
No log 4.0976 168 0.9927 0.3088 0.9927 0.9963
No log 4.1463 170 0.9871 0.4352 0.9871 0.9935
No log 4.1951 172 0.9904 0.3424 0.9904 0.9952
No log 4.2439 174 1.0153 0.3380 1.0153 1.0076
No log 4.2927 176 1.0098 0.3522 1.0098 1.0049
No log 4.3415 178 1.0401 0.2947 1.0401 1.0199
No log 4.3902 180 1.0294 0.3093 1.0294 1.0146
No log 4.4390 182 1.0209 0.3139 1.0209 1.0104
No log 4.4878 184 1.0212 0.3639 1.0212 1.0105
No log 4.5366 186 1.0167 0.3535 1.0167 1.0083
No log 4.5854 188 1.1199 0.3478 1.1199 1.0583
No log 4.6341 190 1.1673 0.4219 1.1673 1.0804
No log 4.6829 192 1.3065 0.3263 1.3065 1.1430
No log 4.7317 194 1.2770 0.3448 1.2770 1.1300
No log 4.7805 196 1.1408 0.4552 1.1408 1.0681
No log 4.8293 198 1.0284 0.3707 1.0284 1.0141
No log 4.8780 200 0.9181 0.3380 0.9181 0.9582
No log 4.9268 202 0.8998 0.3596 0.8998 0.9486
No log 4.9756 204 0.9002 0.4297 0.9002 0.9488
No log 5.0244 206 0.8696 0.3230 0.8696 0.9326
No log 5.0732 208 0.8752 0.2898 0.8752 0.9355
No log 5.1220 210 0.8660 0.3230 0.8660 0.9306
No log 5.1707 212 0.8744 0.3089 0.8744 0.9351
No log 5.2195 214 0.8921 0.2898 0.8921 0.9445
No log 5.2683 216 0.8931 0.3185 0.8931 0.9450
No log 5.3171 218 0.9215 0.2947 0.9215 0.9600
No log 5.3659 220 0.9387 0.2947 0.9387 0.9689
No log 5.4146 222 0.9209 0.3185 0.9209 0.9596
No log 5.4634 224 1.0329 0.4214 1.0329 1.0163
No log 5.5122 226 1.2525 0.3271 1.2525 1.1191
No log 5.5610 228 1.2120 0.3025 1.2120 1.1009
No log 5.6098 230 0.9970 0.4434 0.9970 0.9985
No log 5.6585 232 0.9599 0.3565 0.9599 0.9797
No log 5.7073 234 0.9666 0.3396 0.9666 0.9832
No log 5.7561 236 0.9934 0.3697 0.9934 0.9967
No log 5.8049 238 0.9842 0.3449 0.9842 0.9921
No log 5.8537 240 1.0115 0.3560 1.0115 1.0057
No log 5.9024 242 1.0000 0.3115 1.0000 1.0000
No log 5.9512 244 0.9862 0.4023 0.9862 0.9931
No log 6.0 246 0.9829 0.3780 0.9829 0.9914
No log 6.0488 248 1.0108 0.3400 1.0108 1.0054
No log 6.0976 250 1.1576 0.4098 1.1576 1.0759
No log 6.1463 252 1.1428 0.4094 1.1428 1.0690
No log 6.1951 254 1.0377 0.3022 1.0377 1.0187
No log 6.2439 256 1.0105 0.2970 1.0105 1.0053
No log 6.2927 258 1.0047 0.2993 1.0047 1.0023
No log 6.3415 260 1.0061 0.3237 1.0061 1.0030
No log 6.3902 262 0.9965 0.3090 0.9965 0.9983
No log 6.4390 264 1.0154 0.3115 1.0154 1.0077
No log 6.4878 266 1.0403 0.3046 1.0403 1.0200
No log 6.5366 268 0.9869 0.3093 0.9869 0.9934
No log 6.5854 270 0.9861 0.3814 0.9861 0.9930
No log 6.6341 272 1.0019 0.4235 1.0019 1.0009
No log 6.6829 274 0.9573 0.3943 0.9573 0.9784
No log 6.7317 276 1.0226 0.3113 1.0226 1.0112
No log 6.7805 278 1.0508 0.3556 1.0508 1.0251
No log 6.8293 280 0.9973 0.2997 0.9973 0.9986
No log 6.8780 282 0.9810 0.3535 0.9810 0.9905
No log 6.9268 284 0.9831 0.3396 0.9831 0.9915
No log 6.9756 286 1.0235 0.3113 1.0235 1.0117
No log 7.0244 288 1.1191 0.3515 1.1191 1.0579
No log 7.0732 290 1.0885 0.2791 1.0885 1.0433
No log 7.1220 292 1.0183 0.3113 1.0183 1.0091
No log 7.1707 294 0.9875 0.2850 0.9875 0.9937
No log 7.2195 296 1.0255 0.3113 1.0255 1.0127
No log 7.2683 298 1.0215 0.3113 1.0215 1.0107
No log 7.3171 300 0.9514 0.3282 0.9514 0.9754
No log 7.3659 302 0.9825 0.3571 0.9825 0.9912
No log 7.4146 304 1.0346 0.3837 1.0346 1.0171
No log 7.4634 306 0.9950 0.3855 0.9950 0.9975
No log 7.5122 308 0.9553 0.3539 0.9553 0.9774
No log 7.5610 310 1.0443 0.2821 1.0443 1.0219
No log 7.6098 312 1.1636 0.3023 1.1636 1.0787
No log 7.6585 314 1.2592 0.3658 1.2592 1.1221
No log 7.7073 316 1.1887 0.3771 1.1887 1.0903
No log 7.7561 318 0.9908 0.3115 0.9908 0.9954
No log 7.8049 320 0.9586 0.3974 0.9586 0.9791
No log 7.8537 322 0.9642 0.4016 0.9642 0.9820
No log 7.9024 324 0.9504 0.3424 0.9504 0.9749
No log 7.9512 326 1.0115 0.3848 1.0115 1.0057
No log 8.0 328 1.0113 0.3725 1.0113 1.0056
No log 8.0488 330 0.9340 0.3771 0.9340 0.9665
No log 8.0976 332 0.9893 0.3614 0.9893 0.9946
No log 8.1463 334 1.0934 0.3508 1.0934 1.0457
No log 8.1951 336 1.0419 0.2702 1.0419 1.0207
No log 8.2439 338 0.9465 0.3873 0.9465 0.9729
No log 8.2927 340 0.9692 0.2973 0.9692 0.9845
No log 8.3415 342 1.0726 0.3237 1.0726 1.0357
No log 8.3902 344 1.1273 0.3117 1.1273 1.0618
No log 8.4390 346 1.0855 0.3488 1.0855 1.0419
No log 8.4878 348 0.9832 0.3165 0.9832 0.9916
No log 8.5366 350 0.9325 0.3908 0.9325 0.9657
No log 8.5854 352 0.9304 0.3804 0.9304 0.9646
No log 8.6341 354 0.9489 0.4603 0.9489 0.9741
No log 8.6829 356 0.9479 0.4377 0.9479 0.9736
No log 8.7317 358 0.9520 0.4104 0.9520 0.9757
No log 8.7805 360 0.9371 0.4181 0.9371 0.9680
No log 8.8293 362 0.9606 0.4233 0.9606 0.9801
No log 8.8780 364 0.9815 0.3688 0.9815 0.9907
No log 8.9268 366 0.9478 0.4023 0.9478 0.9735
No log 8.9756 368 0.9727 0.3405 0.9727 0.9863
No log 9.0244 370 1.1393 0.3427 1.1393 1.0674
No log 9.0732 372 1.2072 0.3537 1.2072 1.0987
No log 9.1220 374 1.1324 0.3413 1.1324 1.0641
No log 9.1707 376 1.0493 0.3625 1.0493 1.0243
No log 9.2195 378 0.9751 0.3256 0.9751 0.9875
No log 9.2683 380 0.9719 0.3535 0.9719 0.9858
No log 9.3171 382 0.9798 0.3062 0.9798 0.9899
No log 9.3659 384 0.9651 0.3117 0.9651 0.9824
No log 9.4146 386 0.9811 0.3308 0.9811 0.9905
No log 9.4634 388 1.0307 0.3996 1.0307 1.0152
No log 9.5122 390 1.0449 0.4 1.0449 1.0222
No log 9.5610 392 0.9960 0.3762 0.9960 0.9980
No log 9.6098 394 0.9369 0.3069 0.9369 0.9679
No log 9.6585 396 0.9042 0.3453 0.9042 0.9509
No log 9.7073 398 0.9047 0.4016 0.9047 0.9511
No log 9.7561 400 0.8858 0.3453 0.8858 0.9412
No log 9.8049 402 0.8909 0.3632 0.8909 0.9439
No log 9.8537 404 0.8896 0.3535 0.8896 0.9432
No log 9.9024 406 0.8805 0.3943 0.8805 0.9384
No log 9.9512 408 0.8956 0.4626 0.8956 0.9464
No log 10.0 410 0.9821 0.4805 0.9821 0.9910
No log 10.0488 412 0.9988 0.4191 0.9988 0.9994
No log 10.0976 414 0.9220 0.4234 0.9220 0.9602
No log 10.1463 416 0.8990 0.3747 0.8990 0.9482
No log 10.1951 418 0.9061 0.4240 0.9061 0.9519
No log 10.2439 420 0.9238 0.3747 0.9238 0.9612
No log 10.2927 422 0.9593 0.3950 0.9593 0.9794
No log 10.3415 424 1.0281 0.4339 1.0281 1.0140
No log 10.3902 426 1.0292 0.3686 1.0292 1.0145
No log 10.4390 428 1.0214 0.3421 1.0214 1.0106
No log 10.4878 430 1.0350 0.3144 1.0350 1.0173
No log 10.5366 432 1.0236 0.3021 1.0236 1.0117
No log 10.5854 434 0.9918 0.2741 0.9918 0.9959
No log 10.6341 436 0.9836 0.2741 0.9836 0.9917
No log 10.6829 438 1.0063 0.2721 1.0063 1.0031
No log 10.7317 440 1.0065 0.2721 1.0065 1.0032
No log 10.7805 442 1.0100 0.3144 1.0100 1.0050
No log 10.8293 444 0.9758 0.3045 0.9758 0.9879
No log 10.8780 446 0.9862 0.3645 0.9862 0.9931
No log 10.9268 448 0.9821 0.3434 0.9821 0.9910
No log 10.9756 450 0.9649 0.3069 0.9649 0.9823
No log 11.0244 452 0.9791 0.3169 0.9791 0.9895
No log 11.0732 454 1.0188 0.2782 1.0188 1.0093
No log 11.1220 456 1.1379 0.3274 1.1379 1.0667
No log 11.1707 458 1.1920 0.3488 1.1920 1.0918
No log 11.2195 460 1.1232 0.3274 1.1232 1.0598
No log 11.2683 462 1.0028 0.3289 1.0028 1.0014
No log 11.3171 464 0.9795 0.3069 0.9795 0.9897
No log 11.3659 466 0.9695 0.2920 0.9695 0.9846
No log 11.4146 468 1.0055 0.2973 1.0055 1.0028
No log 11.4634 470 1.1078 0.3755 1.1078 1.0525
No log 11.5122 472 1.2189 0.3982 1.2189 1.1040
No log 11.5610 474 1.3015 0.4036 1.3015 1.1408
No log 11.6098 476 1.2222 0.3621 1.2222 1.1055
No log 11.6585 478 1.0469 0.3876 1.0469 1.0232
No log 11.7073 480 0.9181 0.3282 0.9181 0.9582
No log 11.7561 482 0.8957 0.3738 0.8957 0.9464
No log 11.8049 484 0.8905 0.3987 0.8905 0.9436
No log 11.8537 486 0.9176 0.3351 0.9176 0.9579
No log 11.9024 488 1.0402 0.4647 1.0402 1.0199
No log 11.9512 490 1.0372 0.4563 1.0372 1.0184
No log 12.0 492 0.9627 0.4323 0.9627 0.9812
No log 12.0488 494 0.9178 0.3654 0.9178 0.9580
No log 12.0976 496 0.8904 0.4256 0.8904 0.9436
No log 12.1463 498 0.8879 0.4514 0.8879 0.9423
0.3964 12.1951 500 0.8931 0.4749 0.8931 0.9450
0.3964 12.2439 502 0.8784 0.4346 0.8784 0.9372
0.3964 12.2927 504 0.9105 0.3902 0.9105 0.9542
0.3964 12.3415 506 0.9441 0.3830 0.9441 0.9716
0.3964 12.3902 508 1.0054 0.3667 1.0054 1.0027
0.3964 12.4390 510 1.0238 0.3676 1.0238 1.0118

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model tree: MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k13_task2_organization, fine-tuned from aubmindlab/bert-base-arabertv02.