ArabicNewSplits5_FineTuningAraBERT_run2_AugV5_k7_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9023
  • Qwk (quadratic weighted kappa): 0.4440
  • Mse: 0.9023
  • Rmse: 0.9499
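The reported Loss equals the Mse, which suggests (though the card does not state it) a regression-style head trained under mean-squared-error loss; Rmse is then just the square root of Mse. A quick sanity check of the reported numbers:

```python
import math

# Reported eval MSE; the eval loss is identical, which is consistent
# with a regression head trained under mean-squared-error loss.
mse = 0.9023
rmse = math.sqrt(mse)
print(round(rmse, 4))  # matches the reported Rmse of 0.9499
```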

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
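These settings match a standard Hugging Face Trainer setup. A hedged sketch of how they might be expressed; the output directory is a placeholder, since the actual training script is not published:

```python
from transformers import TrainingArguments

# Sketch only: reproduces the hyperparameters listed above.
# "output_dir" is a placeholder, not taken from the card.
training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # Adam settings as reported in the card (these are also the defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```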

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0435 2 3.9079 -0.0004 3.9079 1.9768
No log 0.0870 4 1.9242 0.0594 1.9242 1.3871
No log 0.1304 6 1.2709 0.0228 1.2709 1.1273
No log 0.1739 8 1.2861 -0.0119 1.2861 1.1341
No log 0.2174 10 1.1131 0.0168 1.1131 1.0550
No log 0.2609 12 1.4534 0.0062 1.4534 1.2056
No log 0.3043 14 1.6130 0.0044 1.6130 1.2701
No log 0.3478 16 2.4023 -0.1074 2.4023 1.5499
No log 0.3913 18 1.8580 -0.0546 1.8580 1.3631
No log 0.4348 20 0.9865 0.0516 0.9865 0.9932
No log 0.4783 22 0.8442 0.1102 0.8442 0.9188
No log 0.5217 24 0.8920 0.1234 0.8920 0.9445
No log 0.5652 26 1.1489 0.0296 1.1489 1.0719
No log 0.6087 28 1.4322 0.0402 1.4322 1.1968
No log 0.6522 30 1.6283 0.0902 1.6283 1.2761
No log 0.6957 32 1.5586 0.1385 1.5586 1.2485
No log 0.7391 34 1.5629 0.1372 1.5629 1.2502
No log 0.7826 36 1.2189 0.1109 1.2189 1.1040
No log 0.8261 38 1.0270 0.1448 1.0270 1.0134
No log 0.8696 40 0.8282 0.2404 0.8282 0.9101
No log 0.9130 42 0.7782 0.2521 0.7782 0.8822
No log 0.9565 44 0.8506 0.1422 0.8506 0.9223
No log 1.0 46 0.8544 0.1353 0.8544 0.9244
No log 1.0435 48 1.0831 0.1313 1.0831 1.0407
No log 1.0870 50 1.3611 0.1630 1.3611 1.1667
No log 1.1304 52 1.2755 0.1520 1.2755 1.1294
No log 1.1739 54 1.1232 0.1418 1.1232 1.0598
No log 1.2174 56 0.8161 0.2172 0.8161 0.9034
No log 1.2609 58 0.6614 0.3878 0.6614 0.8133
No log 1.3043 60 0.6973 0.3385 0.6973 0.8350
No log 1.3478 62 0.8539 0.2356 0.8539 0.9241
No log 1.3913 64 1.3033 0.1422 1.3033 1.1416
No log 1.4348 66 1.7136 0.1300 1.7136 1.3090
No log 1.4783 68 1.8812 0.1340 1.8812 1.3716
No log 1.5217 70 1.7973 0.1527 1.7973 1.3406
No log 1.5652 72 1.6219 0.1759 1.6219 1.2735
No log 1.6087 74 1.3708 0.2307 1.3708 1.1708
No log 1.6522 76 1.1696 0.2611 1.1696 1.0815
No log 1.6957 78 1.0988 0.2190 1.0988 1.0482
No log 1.7391 80 1.0166 0.1501 1.0166 1.0083
No log 1.7826 82 0.9347 0.1740 0.9347 0.9668
No log 1.8261 84 0.8102 0.2098 0.8102 0.9001
No log 1.8696 86 0.6475 0.3649 0.6475 0.8047
No log 1.9130 88 0.6288 0.3873 0.6288 0.7930
No log 1.9565 90 0.6476 0.3816 0.6476 0.8048
No log 2.0 92 0.6383 0.3662 0.6383 0.7989
No log 2.0435 94 0.6490 0.4339 0.6490 0.8056
No log 2.0870 96 0.7540 0.3120 0.7540 0.8683
No log 2.1304 98 1.2334 0.2751 1.2334 1.1106
No log 2.1739 100 1.8931 0.1617 1.8931 1.3759
No log 2.2174 102 2.0105 0.1422 2.0105 1.4179
No log 2.2609 104 1.7890 0.1548 1.7890 1.3375
No log 2.3043 106 1.4359 0.2108 1.4359 1.1983
No log 2.3478 108 1.1187 0.2157 1.1186 1.0577
No log 2.3913 110 0.9231 0.2274 0.9231 0.9608
No log 2.4348 112 0.8098 0.2795 0.8098 0.8999
No log 2.4783 114 0.7739 0.2539 0.7739 0.8797
No log 2.5217 116 0.8155 0.2321 0.8155 0.9030
No log 2.5652 118 0.8088 0.2396 0.8088 0.8993
No log 2.6087 120 0.8334 0.2321 0.8334 0.9129
No log 2.6522 122 0.9650 0.1850 0.9650 0.9823
No log 2.6957 124 1.2026 0.1909 1.2026 1.0966
No log 2.7391 126 1.3129 0.1836 1.3129 1.1458
No log 2.7826 128 1.3669 0.2575 1.3669 1.1692
No log 2.8261 130 1.5475 0.1910 1.5475 1.2440
No log 2.8696 132 1.5569 0.2320 1.5569 1.2478
No log 2.9130 134 1.4281 0.2456 1.4281 1.1950
No log 2.9565 136 1.1143 0.3038 1.1143 1.0556
No log 3.0 138 0.9047 0.3329 0.9047 0.9512
No log 3.0435 140 0.8251 0.3174 0.8251 0.9084
No log 3.0870 142 0.8625 0.2989 0.8625 0.9287
No log 3.1304 144 0.8583 0.3105 0.8583 0.9264
No log 3.1739 146 0.7861 0.3030 0.7861 0.8866
No log 3.2174 148 0.7356 0.3519 0.7356 0.8577
No log 3.2609 150 0.7366 0.3847 0.7366 0.8582
No log 3.3043 152 0.7807 0.3400 0.7807 0.8836
No log 3.3478 154 0.9662 0.3307 0.9662 0.9829
No log 3.3913 156 1.1603 0.3030 1.1603 1.0772
No log 3.4348 158 1.3806 0.2920 1.3806 1.1750
No log 3.4783 160 1.3918 0.3078 1.3918 1.1797
No log 3.5217 162 1.0968 0.3481 1.0968 1.0473
No log 3.5652 164 0.8933 0.4544 0.8933 0.9451
No log 3.6087 166 0.7933 0.4587 0.7933 0.8907
No log 3.6522 168 0.7701 0.4633 0.7701 0.8776
No log 3.6957 170 0.8325 0.4123 0.8325 0.9124
No log 3.7391 172 0.9009 0.3546 0.9009 0.9491
No log 3.7826 174 0.8361 0.3927 0.8361 0.9144
No log 3.8261 176 0.8029 0.4041 0.8029 0.8961
No log 3.8696 178 0.8180 0.4041 0.8180 0.9044
No log 3.9130 180 0.8450 0.4084 0.8450 0.9193
No log 3.9565 182 0.8451 0.4069 0.8451 0.9193
No log 4.0 184 0.8272 0.4134 0.8272 0.9095
No log 4.0435 186 0.7724 0.3964 0.7724 0.8789
No log 4.0870 188 0.7705 0.4200 0.7705 0.8778
No log 4.1304 190 0.8787 0.4080 0.8787 0.9374
No log 4.1739 192 1.1282 0.3286 1.1282 1.0622
No log 4.2174 194 1.3327 0.3408 1.3327 1.1544
No log 4.2609 196 1.3284 0.3083 1.3284 1.1526
No log 4.3043 198 1.1368 0.3027 1.1368 1.0662
No log 4.3478 200 0.8900 0.3300 0.8900 0.9434
No log 4.3913 202 0.8180 0.4555 0.8180 0.9044
No log 4.4348 204 0.8515 0.3995 0.8515 0.9228
No log 4.4783 206 0.9749 0.3360 0.9749 0.9874
No log 4.5217 208 1.0753 0.3440 1.0753 1.0370
No log 4.5652 210 1.0831 0.3440 1.0832 1.0407
No log 4.6087 212 1.1549 0.3077 1.1549 1.0746
No log 4.6522 214 1.0527 0.3566 1.0527 1.0260
No log 4.6957 216 0.9976 0.3115 0.9976 0.9988
No log 4.7391 218 0.9392 0.3214 0.9392 0.9691
No log 4.7826 220 0.8411 0.3805 0.8411 0.9171
No log 4.8261 222 0.8005 0.4165 0.8005 0.8947
No log 4.8696 224 0.8203 0.4188 0.8203 0.9057
No log 4.9130 226 0.9162 0.4015 0.9162 0.9572
No log 4.9565 228 1.1330 0.3069 1.1330 1.0644
No log 5.0 230 1.3984 0.3229 1.3984 1.1825
No log 5.0435 232 1.4372 0.2936 1.4372 1.1988
No log 5.0870 234 1.2678 0.2968 1.2678 1.1260
No log 5.1304 236 1.0262 0.3443 1.0262 1.0130
No log 5.1739 238 0.8861 0.4211 0.8861 0.9413
No log 5.2174 240 0.8444 0.4492 0.8444 0.9189
No log 5.2609 242 0.8459 0.4342 0.8459 0.9197
No log 5.3043 244 0.8727 0.4041 0.8727 0.9342
No log 5.3478 246 0.9462 0.3840 0.9462 0.9727
No log 5.3913 248 1.0028 0.3608 1.0028 1.0014
No log 5.4348 250 1.0884 0.3214 1.0884 1.0433
No log 5.4783 252 1.2792 0.3058 1.2792 1.1310
No log 5.5217 254 1.4927 0.2626 1.4927 1.2217
No log 5.5652 256 1.5789 0.2214 1.5789 1.2565
No log 5.6087 258 1.4557 0.2787 1.4557 1.2065
No log 5.6522 260 1.2564 0.3471 1.2564 1.1209
No log 5.6957 262 1.0604 0.3760 1.0604 1.0298
No log 5.7391 264 0.9792 0.4079 0.9792 0.9896
No log 5.7826 266 0.9927 0.4067 0.9927 0.9964
No log 5.8261 268 1.1060 0.3608 1.1060 1.0516
No log 5.8696 270 1.2268 0.3260 1.2268 1.1076
No log 5.9130 272 1.3037 0.2777 1.3037 1.1418
No log 5.9565 274 1.3850 0.2873 1.3850 1.1768
No log 6.0 276 1.3423 0.2924 1.3423 1.1586
No log 6.0435 278 1.2125 0.2908 1.2125 1.1011
No log 6.0870 280 1.0644 0.3334 1.0644 1.0317
No log 6.1304 282 0.9438 0.3890 0.9438 0.9715
No log 6.1739 284 0.9113 0.4056 0.9113 0.9546
No log 6.2174 286 0.9105 0.4521 0.9105 0.9542
No log 6.2609 288 0.9541 0.4253 0.9541 0.9768
No log 6.3043 290 0.9779 0.3878 0.9779 0.9889
No log 6.3478 292 0.9739 0.3560 0.9739 0.9869
No log 6.3913 294 0.9616 0.3960 0.9616 0.9806
No log 6.4348 296 0.9769 0.3813 0.9769 0.9884
No log 6.4783 298 1.0307 0.3654 1.0307 1.0152
No log 6.5217 300 1.1203 0.3363 1.1203 1.0584
No log 6.5652 302 1.1422 0.3297 1.1422 1.0687
No log 6.6087 304 1.0665 0.3614 1.0665 1.0327
No log 6.6522 306 0.9699 0.3578 0.9699 0.9848
No log 6.6957 308 0.8834 0.4014 0.8834 0.9399
No log 6.7391 310 0.8356 0.4150 0.8356 0.9141
No log 6.7826 312 0.8356 0.4015 0.8356 0.9141
No log 6.8261 314 0.8818 0.4075 0.8818 0.9390
No log 6.8696 316 0.9639 0.4136 0.9639 0.9818
No log 6.9130 318 0.9977 0.3935 0.9977 0.9988
No log 6.9565 320 0.9764 0.3923 0.9764 0.9882
No log 7.0 322 0.9180 0.4220 0.9180 0.9581
No log 7.0435 324 0.8857 0.3954 0.8857 0.9411
No log 7.0870 326 0.8893 0.4024 0.8893 0.9430
No log 7.1304 328 0.9000 0.4113 0.9000 0.9487
No log 7.1739 330 0.9101 0.4011 0.9101 0.9540
No log 7.2174 332 0.9275 0.4030 0.9275 0.9631
No log 7.2609 334 0.9095 0.4011 0.9095 0.9537
No log 7.3043 336 0.9114 0.4030 0.9114 0.9546
No log 7.3478 338 0.9055 0.4030 0.9055 0.9516
No log 7.3913 340 0.9110 0.4149 0.9110 0.9545
No log 7.4348 342 0.9301 0.4044 0.9301 0.9644
No log 7.4783 344 0.9352 0.4080 0.9352 0.9670
No log 7.5217 346 0.9207 0.3931 0.9207 0.9595
No log 7.5652 348 0.8940 0.4219 0.8940 0.9455
No log 7.6087 350 0.8585 0.3964 0.8585 0.9266
No log 7.6522 352 0.8290 0.4081 0.8290 0.9105
No log 7.6957 354 0.8229 0.4725 0.8229 0.9071
No log 7.7391 356 0.8384 0.4461 0.8384 0.9157
No log 7.7826 358 0.8599 0.4626 0.8599 0.9273
No log 7.8261 360 0.8952 0.4410 0.8952 0.9462
No log 7.8696 362 0.9196 0.4072 0.9196 0.9590
No log 7.9130 364 0.9418 0.3970 0.9418 0.9705
No log 7.9565 366 0.9409 0.3970 0.9409 0.9700
No log 8.0 368 0.9257 0.4072 0.9257 0.9621
No log 8.0435 370 0.9158 0.4072 0.9158 0.9569
No log 8.0870 372 0.9243 0.4072 0.9243 0.9614
No log 8.1304 374 0.9186 0.4072 0.9186 0.9584
No log 8.1739 376 0.9072 0.4198 0.9072 0.9525
No log 8.2174 378 0.9157 0.4198 0.9157 0.9569
No log 8.2609 380 0.9392 0.3970 0.9392 0.9691
No log 8.3043 382 0.9706 0.4009 0.9706 0.9852
No log 8.3478 384 0.9674 0.4009 0.9674 0.9836
No log 8.3913 386 0.9682 0.4009 0.9682 0.9840
No log 8.4348 388 0.9532 0.3989 0.9532 0.9763
No log 8.4783 390 0.9449 0.4072 0.9449 0.9720
No log 8.5217 392 0.9424 0.4072 0.9424 0.9708
No log 8.5652 394 0.9292 0.4198 0.9292 0.9640
No log 8.6087 396 0.9310 0.4198 0.9310 0.9649
No log 8.6522 398 0.9275 0.4372 0.9275 0.9630
No log 8.6957 400 0.9156 0.4425 0.9156 0.9569
No log 8.7391 402 0.9085 0.4425 0.9085 0.9532
No log 8.7826 404 0.9046 0.4372 0.9046 0.9511
No log 8.8261 406 0.9002 0.4372 0.9002 0.9488
No log 8.8696 408 0.8966 0.4074 0.8966 0.9469
No log 8.9130 410 0.8948 0.3838 0.8948 0.9459
No log 8.9565 412 0.9018 0.3806 0.9018 0.9496
No log 9.0 414 0.9204 0.3902 0.9204 0.9594
No log 9.0435 416 0.9462 0.3870 0.9462 0.9727
No log 9.0870 418 0.9714 0.4170 0.9714 0.9856
No log 9.1304 420 0.9763 0.4170 0.9763 0.9881
No log 9.1739 422 0.9681 0.4204 0.9681 0.9839
No log 9.2174 424 0.9525 0.4187 0.9525 0.9759
No log 9.2609 426 0.9343 0.3976 0.9343 0.9666
No log 9.3043 428 0.9149 0.3795 0.9149 0.9565
No log 9.3478 430 0.9004 0.4355 0.9004 0.9489
No log 9.3913 432 0.8917 0.4336 0.8917 0.9443
No log 9.4348 434 0.8863 0.4319 0.8863 0.9414
No log 9.4783 436 0.8840 0.4247 0.8840 0.9402
No log 9.5217 438 0.8834 0.4175 0.8834 0.9399
No log 9.5652 440 0.8860 0.4175 0.8860 0.9413
No log 9.6087 442 0.8883 0.4175 0.8883 0.9425
No log 9.6522 444 0.8891 0.4175 0.8891 0.9429
No log 9.6957 446 0.8905 0.4247 0.8905 0.9436
No log 9.7391 448 0.8914 0.4319 0.8914 0.9442
No log 9.7826 450 0.8938 0.4319 0.8938 0.9454
No log 9.8261 452 0.8968 0.4336 0.8968 0.9470
No log 9.8696 454 0.8984 0.4336 0.8984 0.9478
No log 9.9130 456 0.9004 0.4353 0.9004 0.9489
No log 9.9565 458 0.9018 0.4440 0.9018 0.9496
No log 10.0 460 0.9023 0.4440 0.9023 0.9499

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1