ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k9_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (recorded as "None" in the training config). It achieves the following results on the evaluation set:

  • Loss: 0.9923
  • QWK (Quadratic Weighted Kappa): 0.2998
  • MSE: 0.9923
  • RMSE: 0.9962
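The training script is not published, so as an illustrative sketch only, here is how these metrics can be computed with plain NumPy from integer gold scores and continuous model outputs. QWK compares discrete labels, so the continuous predictions are rounded and clipped to the gold label range first; the example data is made up.

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred):
    """Quadratic Weighted Kappa between two integer label arrays."""
    y_true = np.asarray(y_true, dtype=int)
    y_pred = np.asarray(y_pred, dtype=int)
    n = int(max(y_true.max(), y_pred.max())) + 1
    # Observed confusion matrix.
    observed = np.zeros((n, n))
    np.add.at(observed, (y_true, y_pred), 1)
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance.
    idx = np.arange(n)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n - 1) ** 2
    # Expected matrix under independence of the two label distributions.
    expected = np.outer(np.bincount(y_true, minlength=n),
                        np.bincount(y_pred, minlength=n)) / len(y_true)
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

def compute_metrics(gold, preds):
    gold = np.asarray(gold, dtype=float)
    preds = np.asarray(preds, dtype=float)
    mse = float(np.mean((gold - preds) ** 2))
    # Round/clip continuous predictions to valid integer scores for QWK.
    rounded = np.clip(np.rint(preds), gold.min(), gold.max()).astype(int)
    return {"mse": mse,
            "rmse": mse ** 0.5,
            "qwk": quadratic_weighted_kappa(gold.astype(int), rounded)}

scores = compute_metrics([0, 1, 2, 3, 4], [0.2, 1.1, 1.9, 2.8, 4.3])
```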

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
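The hyperparameters above map directly onto Hugging Face Trainer keyword arguments. A minimal sketch, assuming the standard `TrainingArguments` names (only the values come from this card; the output directory in the usage note is a placeholder):

```python
# Hyperparameters from the card, expressed as transformers.TrainingArguments
# keyword arguments. This is a reconstruction, not the authors' script.
training_kwargs = dict(
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,      # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,   # and epsilon=1e-08
)
```

With transformers installed, these can be passed straight through, e.g. `TrainingArguments(output_dir="out", **training_kwargs)`.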

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0690 2 4.9319 -0.0020 4.9319 2.2208
No log 0.1379 4 2.9552 -0.0104 2.9552 1.7191
No log 0.2069 6 1.8788 0.0118 1.8788 1.3707
No log 0.2759 8 2.0897 0.0023 2.0897 1.4456
No log 0.3448 10 1.6834 0.0082 1.6834 1.2975
No log 0.4138 12 1.2975 0.0627 1.2975 1.1391
No log 0.4828 14 1.3244 -0.0521 1.3244 1.1508
No log 0.5517 16 1.3400 -0.1279 1.3400 1.1576
No log 0.6207 18 1.3332 -0.1049 1.3332 1.1547
No log 0.6897 20 1.3924 0.0038 1.3924 1.1800
No log 0.7586 22 1.3509 -0.0114 1.3509 1.1623
No log 0.8276 24 1.3297 -0.0165 1.3297 1.1531
No log 0.8966 26 1.1935 0.1935 1.1935 1.0925
No log 0.9655 28 1.1857 0.1076 1.1857 1.0889
No log 1.0345 30 1.1737 0.1076 1.1737 1.0834
No log 1.1034 32 1.1504 0.1247 1.1504 1.0726
No log 1.1724 34 1.0958 0.1649 1.0958 1.0468
No log 1.2414 36 1.1902 0.1413 1.1902 1.0910
No log 1.3103 38 1.5200 0.0662 1.5200 1.2329
No log 1.3793 40 1.5360 0.0339 1.5360 1.2393
No log 1.4483 42 1.4311 0.0426 1.4311 1.1963
No log 1.5172 44 1.4611 0.0254 1.4611 1.2088
No log 1.5862 46 1.3032 0.1565 1.3032 1.1416
No log 1.6552 48 1.1674 0.1300 1.1674 1.0805
No log 1.7241 50 1.0935 0.1935 1.0935 1.0457
No log 1.7931 52 1.0896 0.1343 1.0896 1.0438
No log 1.8621 54 1.1066 0.1210 1.1066 1.0519
No log 1.9310 56 1.1626 0.2040 1.1626 1.0782
No log 2.0 58 1.2472 0.1320 1.2472 1.1168
No log 2.0690 60 1.2856 0.1379 1.2856 1.1339
No log 2.1379 62 1.2561 0.1821 1.2561 1.1208
No log 2.2069 64 1.0723 0.2622 1.0723 1.0355
No log 2.2759 66 0.9633 0.3787 0.9633 0.9815
No log 2.3448 68 1.0430 0.2650 1.0430 1.0213
No log 2.4138 70 1.1570 0.2457 1.1570 1.0756
No log 2.4828 72 1.0913 0.2115 1.0913 1.0447
No log 2.5517 74 0.9929 0.3206 0.9929 0.9965
No log 2.6207 76 0.9750 0.4617 0.9750 0.9874
No log 2.6897 78 1.1551 0.2191 1.1551 1.0747
No log 2.7586 80 1.2681 0.2457 1.2681 1.1261
No log 2.8276 82 1.1457 0.3508 1.1457 1.0704
No log 2.8966 84 0.9410 0.4635 0.9410 0.9701
No log 2.9655 86 0.9065 0.4107 0.9065 0.9521
No log 3.0345 88 1.0125 0.3676 1.0125 1.0062
No log 3.1034 90 1.0613 0.4101 1.0613 1.0302
No log 3.1724 92 0.9878 0.3876 0.9878 0.9939
No log 3.2414 94 0.9221 0.4037 0.9221 0.9603
No log 3.3103 96 0.9151 0.4293 0.9151 0.9566
No log 3.3793 98 0.9581 0.3491 0.9581 0.9788
No log 3.4483 100 1.0206 0.3528 1.0206 1.0102
No log 3.5172 102 1.0693 0.4052 1.0693 1.0341
No log 3.5862 104 1.1368 0.3560 1.1368 1.0662
No log 3.6552 106 1.0615 0.3564 1.0615 1.0303
No log 3.7241 108 1.0093 0.3548 1.0093 1.0047
No log 3.7931 110 1.0113 0.3811 1.0113 1.0056
No log 3.8621 112 1.0291 0.3300 1.0291 1.0144
No log 3.9310 114 1.0299 0.4039 1.0299 1.0149
No log 4.0 116 1.0257 0.4491 1.0257 1.0128
No log 4.0690 118 0.9916 0.3539 0.9916 0.9958
No log 4.1379 120 0.9742 0.2843 0.9742 0.9870
No log 4.2069 122 0.9740 0.3039 0.9740 0.9869
No log 4.2759 124 0.9892 0.3184 0.9892 0.9946
No log 4.3448 126 0.9603 0.3237 0.9603 0.9800
No log 4.4138 128 0.9719 0.3493 0.9719 0.9859
No log 4.4828 130 1.0242 0.2949 1.0242 1.0120
No log 4.5517 132 0.9959 0.3097 0.9959 0.9979
No log 4.6207 134 0.9422 0.3196 0.9422 0.9707
No log 4.6897 136 1.0016 0.3762 1.0016 1.0008
No log 4.7586 138 1.1018 0.3891 1.1018 1.0497
No log 4.8276 140 1.2654 0.2939 1.2654 1.1249
No log 4.8966 142 1.1425 0.2777 1.1425 1.0689
No log 4.9655 144 1.0474 0.3554 1.0474 1.0234
No log 5.0345 146 1.0954 0.3268 1.0954 1.0466
No log 5.1034 148 1.1451 0.4093 1.1451 1.0701
No log 5.1724 150 1.0190 0.3144 1.0190 1.0094
No log 5.2414 152 0.9481 0.3493 0.9481 0.9737
No log 5.3103 154 1.0336 0.2567 1.0336 1.0166
No log 5.3793 156 1.0601 0.2815 1.0601 1.0296
No log 5.4483 158 1.0554 0.2618 1.0554 1.0273
No log 5.5172 160 1.0206 0.3250 1.0206 1.0102
No log 5.5862 162 0.9711 0.2624 0.9711 0.9855
No log 5.6552 164 0.9536 0.3278 0.9536 0.9765
No log 5.7241 166 0.9987 0.3098 0.9987 0.9993
No log 5.7931 168 1.0349 0.2739 1.0349 1.0173
No log 5.8621 170 0.9856 0.3250 0.9856 0.9928
No log 5.9310 172 0.9434 0.2835 0.9434 0.9713
No log 6.0 174 0.9339 0.3318 0.9339 0.9664
No log 6.0690 176 0.9204 0.3318 0.9204 0.9594
No log 6.1379 178 0.9363 0.3493 0.9363 0.9676
No log 6.2069 180 1.0193 0.3636 1.0193 1.0096
No log 6.2759 182 0.9703 0.4025 0.9703 0.9850
No log 6.3448 184 0.8810 0.3744 0.8810 0.9386
No log 6.4138 186 0.9335 0.3806 0.9335 0.9662
No log 6.4828 188 1.0527 0.3875 1.0527 1.0260
No log 6.5517 190 1.0445 0.4630 1.0445 1.0220
No log 6.6207 192 0.9532 0.4606 0.9532 0.9763
No log 6.6897 194 0.8938 0.3437 0.8938 0.9454
No log 6.7586 196 1.0396 0.4412 1.0396 1.0196
No log 6.8276 198 1.1749 0.4168 1.1749 1.0839
No log 6.8966 200 1.0900 0.4607 1.0900 1.0440
No log 6.9655 202 0.9591 0.4277 0.9591 0.9793
No log 7.0345 204 0.9581 0.3838 0.9581 0.9788
No log 7.1034 206 0.9775 0.4042 0.9775 0.9887
No log 7.1724 208 1.0839 0.2949 1.0839 1.0411
No log 7.2414 210 1.2119 0.2781 1.2119 1.1008
No log 7.3103 212 1.2126 0.2527 1.2126 1.1012
No log 7.3793 214 1.1486 0.2602 1.1486 1.0717
No log 7.4483 216 1.0199 0.2972 1.0199 1.0099
No log 7.5172 218 0.9494 0.3119 0.9494 0.9744
No log 7.5862 220 0.9385 0.3583 0.9385 0.9688
No log 7.6552 222 0.9477 0.3437 0.9477 0.9735
No log 7.7241 224 1.0101 0.3720 1.0101 1.0050
No log 7.7931 226 1.0728 0.4295 1.0728 1.0358
No log 7.8621 228 1.0329 0.3970 1.0329 1.0163
No log 7.9310 230 1.0267 0.4202 1.0267 1.0133
No log 8.0 232 0.9566 0.3771 0.9566 0.9780
No log 8.0690 234 0.9518 0.4454 0.9518 0.9756
No log 8.1379 236 0.9103 0.3914 0.9103 0.9541
No log 8.2069 238 0.9192 0.4104 0.9192 0.9587
No log 8.2759 240 0.9192 0.4104 0.9192 0.9588
No log 8.3448 242 0.9338 0.4065 0.9338 0.9664
No log 8.4138 244 0.9270 0.4104 0.9270 0.9628
No log 8.4828 246 0.8745 0.4278 0.8745 0.9351
No log 8.5517 248 0.8689 0.4220 0.8689 0.9322
No log 8.6207 250 0.8758 0.3830 0.8758 0.9358
No log 8.6897 252 0.8645 0.3539 0.8645 0.9298
No log 8.7586 254 0.8736 0.3508 0.8736 0.9347
No log 8.8276 256 0.8499 0.3641 0.8499 0.9219
No log 8.8966 258 0.8462 0.4077 0.8462 0.9199
No log 8.9655 260 0.8928 0.4016 0.8928 0.9449
No log 9.0345 262 0.9006 0.4299 0.9006 0.9490
No log 9.1034 264 0.8854 0.3779 0.8854 0.9409
No log 9.1724 266 0.8907 0.3100 0.8907 0.9438
No log 9.2414 268 0.9300 0.2995 0.9300 0.9643
No log 9.3103 270 0.9438 0.2878 0.9438 0.9715
No log 9.3793 272 0.9445 0.2975 0.9445 0.9718
No log 9.4483 274 0.9609 0.3730 0.9609 0.9802
No log 9.5172 276 0.9310 0.2975 0.9310 0.9649
No log 9.5862 278 0.9210 0.3217 0.9210 0.9597
No log 9.6552 280 0.9299 0.3711 0.9299 0.9643
No log 9.7241 282 0.8430 0.3147 0.8430 0.9182
No log 9.7931 284 0.8245 0.3045 0.8245 0.9080
No log 9.8621 286 0.8081 0.3345 0.8081 0.8989
No log 9.9310 288 0.8548 0.3926 0.8548 0.9245
No log 10.0 290 0.8767 0.4826 0.8767 0.9363
No log 10.0690 292 0.8118 0.3753 0.8118 0.9010
No log 10.1379 294 0.8026 0.3493 0.8026 0.8959
No log 10.2069 296 0.8249 0.3609 0.8249 0.9083
No log 10.2759 298 0.8945 0.3418 0.8945 0.9458
No log 10.3448 300 0.9226 0.3372 0.9226 0.9605
No log 10.4138 302 0.8649 0.3074 0.8649 0.9300
No log 10.4828 304 0.7914 0.3583 0.7914 0.8896
No log 10.5517 306 0.7686 0.4819 0.7686 0.8767
No log 10.6207 308 0.7519 0.5327 0.7519 0.8671
No log 10.6897 310 0.8018 0.3966 0.8018 0.8954
No log 10.7586 312 0.8779 0.5090 0.8779 0.9370
No log 10.8276 314 0.9075 0.4484 0.9075 0.9526
No log 10.8966 316 1.0141 0.4730 1.0141 1.0070
No log 10.9655 318 1.0310 0.3699 1.0310 1.0154
No log 11.0345 320 0.9724 0.2821 0.9724 0.9861
No log 11.1034 322 0.9010 0.3609 0.9010 0.9492
No log 11.1724 324 0.9041 0.3753 0.9041 0.9508
No log 11.2414 326 0.9588 0.3464 0.9588 0.9792
No log 11.3103 328 1.0081 0.3488 1.0081 1.0040
No log 11.3793 330 1.0904 0.3718 1.0904 1.0442
No log 11.4483 332 1.1091 0.3718 1.1091 1.0531
No log 11.5172 334 1.0909 0.3709 1.0909 1.0445
No log 11.5862 336 1.0158 0.3223 1.0158 1.0079
No log 11.6552 338 1.0025 0.2945 1.0025 1.0012
No log 11.7241 340 0.9809 0.2551 0.9809 0.9904
No log 11.7931 342 0.9918 0.2945 0.9918 0.9959
No log 11.8621 344 1.0022 0.3024 1.0022 1.0011
No log 11.9310 346 0.9430 0.2812 0.9430 0.9711
No log 12.0 348 0.8908 0.3201 0.8908 0.9438
No log 12.0690 350 0.8866 0.3510 0.8866 0.9416
No log 12.1379 352 0.8807 0.3201 0.8807 0.9385
No log 12.2069 354 0.9363 0.3023 0.9363 0.9676
No log 12.2759 356 0.9880 0.3577 0.9880 0.9940
No log 12.3448 358 0.9453 0.3577 0.9453 0.9723
No log 12.4138 360 0.8480 0.3124 0.8480 0.9209
No log 12.4828 362 0.8311 0.4012 0.8311 0.9116
No log 12.5517 364 0.8480 0.4337 0.8480 0.9209
No log 12.6207 366 0.8290 0.4012 0.8290 0.9105
No log 12.6897 368 0.8173 0.3710 0.8173 0.9041
No log 12.7586 370 0.8953 0.4223 0.8953 0.9462
No log 12.8276 372 0.9234 0.5045 0.9234 0.9609
No log 12.8966 374 0.8544 0.4031 0.8544 0.9244
No log 12.9655 376 0.8438 0.3190 0.8438 0.9186
No log 13.0345 378 0.8931 0.2733 0.8931 0.9451
No log 13.1034 380 0.8759 0.2887 0.8759 0.9359
No log 13.1724 382 0.8725 0.3401 0.8725 0.9341
No log 13.2414 384 0.9639 0.4224 0.9639 0.9818
No log 13.3103 386 1.0290 0.4984 1.0290 1.0144
No log 13.3793 388 0.9793 0.4449 0.9793 0.9896
No log 13.4483 390 0.9533 0.4224 0.9533 0.9764
No log 13.5172 392 0.9370 0.3223 0.9370 0.9680
No log 13.5862 394 0.9336 0.3372 0.9336 0.9662
No log 13.6552 396 0.9322 0.3372 0.9322 0.9655
No log 13.7241 398 0.9888 0.3196 0.9888 0.9944
No log 13.7931 400 1.0278 0.2927 1.0278 1.0138
No log 13.8621 402 1.0342 0.2927 1.0342 1.0169
No log 13.9310 404 1.0021 0.3503 1.0021 1.0011
No log 14.0 406 0.9350 0.3289 0.9350 0.9669
No log 14.0690 408 0.9166 0.3551 0.9166 0.9574
No log 14.1379 410 0.9216 0.3270 0.9216 0.9600
No log 14.2069 412 0.9207 0.3270 0.9207 0.9595
No log 14.2759 414 0.9408 0.3533 0.9408 0.9700
No log 14.3448 416 0.9530 0.3489 0.9530 0.9762
No log 14.4138 418 0.9407 0.3503 0.9407 0.9699
No log 14.4828 420 0.9245 0.3389 0.9245 0.9615
No log 14.5517 422 0.8926 0.3609 0.8926 0.9448
No log 14.6207 424 0.9153 0.3547 0.9153 0.9567
No log 14.6897 426 0.9866 0.3762 0.9866 0.9933
No log 14.7586 428 0.9822 0.3503 0.9822 0.9911
No log 14.8276 430 0.9282 0.3687 0.9282 0.9634
No log 14.8966 432 0.9201 0.3478 0.9201 0.9592
No log 14.9655 434 0.9354 0.3547 0.9354 0.9672
No log 15.0345 436 1.0168 0.3721 1.0168 1.0084
No log 15.1034 438 1.0760 0.3072 1.0760 1.0373
No log 15.1724 440 1.0222 0.2702 1.0222 1.0110
No log 15.2414 442 0.9719 0.2812 0.9719 0.9859
No log 15.3103 444 0.9781 0.2812 0.9781 0.9890
No log 15.3793 446 0.9935 0.2655 0.9935 0.9967
No log 15.4483 448 1.0185 0.1582 1.0185 1.0092
No log 15.5172 450 1.0128 0.2322 1.0128 1.0064
No log 15.5862 452 0.9813 0.3989 0.9813 0.9906
No log 15.6552 454 0.9746 0.4357 0.9746 0.9872
No log 15.7241 456 0.9308 0.4614 0.9308 0.9648
No log 15.7931 458 0.9383 0.4598 0.9383 0.9687
No log 15.8621 460 0.9779 0.4186 0.9779 0.9889
No log 15.9310 462 0.9709 0.4186 0.9709 0.9854
No log 16.0 464 1.0078 0.3341 1.0078 1.0039
No log 16.0690 466 1.0485 0.2980 1.0485 1.0240
No log 16.1379 468 1.0478 0.2141 1.0478 1.0236
No log 16.2069 470 1.0437 0.2141 1.0437 1.0216
No log 16.2759 472 1.0224 0.2141 1.0224 1.0111
No log 16.3448 474 1.0090 0.2291 1.0090 1.0045
No log 16.4138 476 1.0087 0.3958 1.0087 1.0043
No log 16.4828 478 0.9510 0.4966 0.9510 0.9752
No log 16.5517 480 0.9093 0.4966 0.9093 0.9536
No log 16.6207 482 0.8518 0.4526 0.8518 0.9229
No log 16.6897 484 0.8262 0.3961 0.8262 0.9090
No log 16.7586 486 0.8425 0.4203 0.8425 0.9179
No log 16.8276 488 0.8739 0.4302 0.8739 0.9349
No log 16.8966 490 0.8670 0.3710 0.8670 0.9311
No log 16.9655 492 0.8393 0.3345 0.8393 0.9162
No log 17.0345 494 0.8402 0.4221 0.8402 0.9166
No log 17.1034 496 0.8356 0.3629 0.8356 0.9141
No log 17.1724 498 0.8577 0.3345 0.8577 0.9261
0.3428 17.2414 500 0.9277 0.4028 0.9277 0.9632
0.3428 17.3103 502 0.9445 0.4351 0.9445 0.9719
0.3428 17.3793 504 0.9147 0.3794 0.9147 0.9564
0.3428 17.4483 506 0.8771 0.3632 0.8771 0.9365
0.3428 17.5172 508 0.8870 0.3632 0.8870 0.9418
0.3428 17.5862 510 0.9526 0.3389 0.9526 0.9760
0.3428 17.6552 512 0.9923 0.2998 0.9923 0.9962
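Two columns in the table above are redundant by construction: the validation loss equals the MSE in every row (suggesting an MSE training objective), and the RMSE is simply the square root of the MSE. A quick check against the final row (step 512):

```python
import math

# Rmse = sqrt(Mse); final row reports Mse = 0.9923 and Rmse = 0.9962.
mse = 0.9923
rmse = math.sqrt(mse)
# agrees with the reported 0.9962 up to rounding of the logged Mse
```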

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (F32 tensors, Safetensors format)

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k9_task2_organization

Finetuned from aubmindlab/bert-base-arabertv02 (4023 finetunes) → this model