ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k8_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4813
  • Qwk (quadratic weighted kappa): 0.4545
  • Mse: 0.4813
  • Rmse: 0.6938
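
The checkpoint can be loaded with the standard transformers Auto classes. The snippet below is a minimal usage sketch, not part of the original card: it assumes the fine-tuned model carries a sequence-classification/scoring head (consistent with the single Loss/Mse/Qwk scores reported above), and the Arabic input sentence is a hypothetical example.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Repository id as listed for this card on the Hub.
model_id = "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k8_task3_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Assumption: the checkpoint was saved with a sequence-classification head.
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("هذا نص تجريبي للتقييم", return_tensors="pt")  # hypothetical input
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # raw score(s) for the organization task
```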

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
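
These settings map directly onto the Hugging Face TrainingArguments API. The following is a minimal reproduction sketch under that assumption; the output directory is a placeholder, and the datasets and metric function are omitted because the card does not describe them.

```python
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base)

# Hyperparameters exactly as listed above.
args = TrainingArguments(
    output_dir="arabert_task3_organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)

trainer = Trainer(
    model=model,
    args=args,
    # train_dataset=..., eval_dataset=...,  # not specified in the card
)
# trainer.train()
```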

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|:----:|
| No log | 0.0476 | 2 | 3.0608 | -0.0112 | 3.0608 | 1.7495 |
| No log | 0.0952 | 4 | 1.2825 | -0.0070 | 1.2825 | 1.1325 |
| No log | 0.1429 | 6 | 0.9058 | 0.0769 | 0.9058 | 0.9518 |
| No log | 0.1905 | 8 | 0.8389 | 0.1131 | 0.8389 | 0.9159 |
| No log | 0.2381 | 10 | 0.6705 | -0.0256 | 0.6705 | 0.8188 |
| No log | 0.2857 | 12 | 1.4945 | 0.0 | 1.4945 | 1.2225 |
| No log | 0.3333 | 14 | 1.1870 | 0.0817 | 1.1870 | 1.0895 |
| No log | 0.3810 | 16 | 0.8663 | 0.1351 | 0.8663 | 0.9308 |
| No log | 0.4286 | 18 | 0.7227 | 0.0417 | 0.7227 | 0.8501 |
| No log | 0.4762 | 20 | 0.7328 | 0.0417 | 0.7328 | 0.8560 |
| No log | 0.5238 | 22 | 0.8508 | 0.1619 | 0.8508 | 0.9224 |
| No log | 0.5714 | 24 | 0.6457 | 0.0769 | 0.6457 | 0.8036 |
| No log | 0.6190 | 26 | 0.8047 | 0.2315 | 0.8047 | 0.8970 |
| No log | 0.6667 | 28 | 0.7444 | 0.2258 | 0.7444 | 0.8628 |
| No log | 0.7143 | 30 | 0.8682 | 0.1765 | 0.8682 | 0.9318 |
| No log | 0.7619 | 32 | 1.2886 | 0.1094 | 1.2886 | 1.1351 |
| No log | 0.8095 | 34 | 0.7802 | 0.0707 | 0.7802 | 0.8833 |
| No log | 0.8571 | 36 | 0.7889 | 0.2410 | 0.7889 | 0.8882 |
| No log | 0.9048 | 38 | 0.8417 | 0.2780 | 0.8417 | 0.9174 |
| No log | 0.9524 | 40 | 0.6372 | 0.0769 | 0.6372 | 0.7982 |
| No log | 1.0 | 42 | 0.6183 | 0.0850 | 0.6183 | 0.7863 |
| No log | 1.0476 | 44 | 1.3291 | 0.0 | 1.3291 | 1.1529 |
| No log | 1.0952 | 46 | 1.7838 | 0.0390 | 1.7838 | 1.3356 |
| No log | 1.1429 | 48 | 1.4535 | 0.0 | 1.4535 | 1.2056 |
| No log | 1.1905 | 50 | 0.9481 | 0.0 | 0.9481 | 0.9737 |
| No log | 1.2381 | 52 | 0.6802 | 0.0850 | 0.6802 | 0.8247 |
| No log | 1.2857 | 54 | 0.5853 | 0.0 | 0.5853 | 0.7650 |
| No log | 1.3333 | 56 | 0.6151 | 0.1111 | 0.6151 | 0.7843 |
| No log | 1.3810 | 58 | 0.8263 | 0.1644 | 0.8263 | 0.9090 |
| No log | 1.4286 | 60 | 0.6113 | 0.2184 | 0.6113 | 0.7818 |
| No log | 1.4762 | 62 | 0.6429 | 0.2000 | 0.6429 | 0.8018 |
| No log | 1.5238 | 64 | 0.6185 | 0.2516 | 0.6185 | 0.7865 |
| No log | 1.5714 | 66 | 0.5958 | 0.2941 | 0.5958 | 0.7719 |
| No log | 1.6190 | 68 | 0.6094 | 0.1724 | 0.6094 | 0.7806 |
| No log | 1.6667 | 70 | 1.3126 | 0.0857 | 1.3126 | 1.1457 |
| No log | 1.7143 | 72 | 1.6269 | 0.0380 | 1.6269 | 1.2755 |
| No log | 1.7619 | 74 | 0.9244 | 0.2137 | 0.9244 | 0.9615 |
| No log | 1.8095 | 76 | 0.4985 | 0.2821 | 0.4985 | 0.7060 |
| No log | 1.8571 | 78 | 0.7540 | 0.2372 | 0.7540 | 0.8683 |
| No log | 1.9048 | 80 | 0.5163 | 0.3488 | 0.5163 | 0.7185 |
| No log | 1.9524 | 82 | 0.6165 | 0.2727 | 0.6165 | 0.7852 |
| No log | 2.0 | 84 | 0.6690 | 0.2637 | 0.6690 | 0.8180 |
| No log | 2.0476 | 86 | 0.5065 | 0.3636 | 0.5065 | 0.7117 |
| No log | 2.0952 | 88 | 0.4636 | 0.4413 | 0.4636 | 0.6809 |
| No log | 2.1429 | 90 | 0.4881 | 0.2683 | 0.4881 | 0.6986 |
| No log | 2.1905 | 92 | 0.5127 | 0.3508 | 0.5127 | 0.7160 |
| No log | 2.2381 | 94 | 0.5786 | 0.3498 | 0.5786 | 0.7607 |
| No log | 2.2857 | 96 | 0.5386 | 0.3299 | 0.5386 | 0.7339 |
| No log | 2.3333 | 98 | 0.6222 | 0.3398 | 0.6222 | 0.7888 |
| No log | 2.3810 | 100 | 0.5069 | 0.4229 | 0.5069 | 0.7120 |
| No log | 2.4286 | 102 | 0.5024 | 0.4518 | 0.5024 | 0.7088 |
| No log | 2.4762 | 104 | 0.5252 | 0.4518 | 0.5252 | 0.7247 |
| No log | 2.5238 | 106 | 0.6129 | 0.4909 | 0.6129 | 0.7829 |
| No log | 2.5714 | 108 | 1.1946 | 0.25 | 1.1946 | 1.0930 |
| No log | 2.6190 | 110 | 1.1599 | 0.2704 | 1.1599 | 1.0770 |
| No log | 2.6667 | 112 | 0.5771 | 0.4930 | 0.5771 | 0.7597 |
| No log | 2.7143 | 114 | 0.5319 | 0.5233 | 0.5319 | 0.7293 |
| No log | 2.7619 | 116 | 0.5553 | 0.4346 | 0.5553 | 0.7452 |
| No log | 2.8095 | 118 | 0.7697 | 0.2960 | 0.7697 | 0.8773 |
| No log | 2.8571 | 120 | 0.5560 | 0.4400 | 0.5560 | 0.7456 |
| No log | 2.9048 | 122 | 0.5742 | 0.4400 | 0.5742 | 0.7578 |
| No log | 2.9524 | 124 | 0.7676 | 0.3548 | 0.7676 | 0.8762 |
| No log | 3.0 | 126 | 0.6031 | 0.3939 | 0.6031 | 0.7766 |
| No log | 3.0476 | 128 | 0.5336 | 0.4709 | 0.5336 | 0.7305 |
| No log | 3.0952 | 130 | 0.5356 | 0.4839 | 0.5356 | 0.7318 |
| No log | 3.1429 | 132 | 0.8486 | 0.2424 | 0.8486 | 0.9212 |
| No log | 3.1905 | 134 | 0.9739 | 0.1888 | 0.9739 | 0.9869 |
| No log | 3.2381 | 136 | 0.7020 | 0.1848 | 0.7020 | 0.8379 |
| No log | 3.2857 | 138 | 0.5684 | 0.4709 | 0.5684 | 0.7539 |
| No log | 3.3333 | 140 | 0.6155 | 0.3591 | 0.6155 | 0.7845 |
| No log | 3.3810 | 142 | 0.6666 | 0.2871 | 0.6666 | 0.8165 |
| No log | 3.4286 | 144 | 0.6074 | 0.4639 | 0.6074 | 0.7793 |
| No log | 3.4762 | 146 | 0.6893 | 0.3171 | 0.6893 | 0.8302 |
| No log | 3.5238 | 148 | 1.0359 | 0.1292 | 1.0359 | 1.0178 |
| No log | 3.5714 | 150 | 0.8307 | 0.0796 | 0.8307 | 0.9114 |
| No log | 3.6190 | 152 | 0.6898 | 0.2621 | 0.6898 | 0.8305 |
| No log | 3.6667 | 154 | 0.6222 | 0.4175 | 0.6222 | 0.7888 |
| No log | 3.7143 | 156 | 0.6151 | 0.4286 | 0.6151 | 0.7843 |
| No log | 3.7619 | 158 | 0.6939 | 0.2174 | 0.6939 | 0.8330 |
| No log | 3.8095 | 160 | 0.7785 | 0.1193 | 0.7785 | 0.8823 |
| No log | 3.8571 | 162 | 0.6077 | 0.2000 | 0.6077 | 0.7795 |
| No log | 3.9048 | 164 | 0.6309 | 0.2821 | 0.6309 | 0.7943 |
| No log | 3.9524 | 166 | 0.7600 | 0.2696 | 0.7600 | 0.8718 |
| No log | 4.0 | 168 | 0.6183 | 0.2917 | 0.6183 | 0.7864 |
| No log | 4.0476 | 170 | 0.7304 | 0.2222 | 0.7304 | 0.8546 |
| No log | 4.0952 | 172 | 1.0198 | 0.0988 | 1.0198 | 1.0099 |
| No log | 4.1429 | 174 | 0.8930 | 0.1261 | 0.8930 | 0.9450 |
| No log | 4.1905 | 176 | 0.6049 | 0.2771 | 0.6049 | 0.7778 |
| No log | 4.2381 | 178 | 0.5810 | 0.3439 | 0.5810 | 0.7623 |
| No log | 4.2857 | 180 | 0.5767 | 0.2688 | 0.5767 | 0.7594 |
| No log | 4.3333 | 182 | 0.5625 | 0.3258 | 0.5625 | 0.7500 |
| No log | 4.3810 | 184 | 0.9309 | 0.2000 | 0.9309 | 0.9649 |
| No log | 4.4286 | 186 | 1.1773 | 0.1438 | 1.1773 | 1.0850 |
| No log | 4.4762 | 188 | 0.9416 | 0.2000 | 0.9416 | 0.9704 |
| No log | 4.5238 | 190 | 0.5638 | 0.3224 | 0.5638 | 0.7509 |
| No log | 4.5714 | 192 | 0.5152 | 0.3913 | 0.5152 | 0.7178 |
| No log | 4.6190 | 194 | 0.5355 | 0.4105 | 0.5355 | 0.7318 |
| No log | 4.6667 | 196 | 0.5196 | 0.4167 | 0.5196 | 0.7208 |
| No log | 4.7143 | 198 | 0.6555 | 0.3143 | 0.6555 | 0.8096 |
| No log | 4.7619 | 200 | 0.6173 | 0.3200 | 0.6173 | 0.7857 |
| No log | 4.8095 | 202 | 0.5260 | 0.4051 | 0.5260 | 0.7252 |
| No log | 4.8571 | 204 | 0.5418 | 0.4171 | 0.5418 | 0.7361 |
| No log | 4.9048 | 206 | 0.5338 | 0.4051 | 0.5338 | 0.7306 |
| No log | 4.9524 | 208 | 0.5701 | 0.3778 | 0.5701 | 0.7550 |
| No log | 5.0 | 210 | 0.7022 | 0.1923 | 0.7022 | 0.8380 |
| No log | 5.0476 | 212 | 0.6641 | 0.2332 | 0.6641 | 0.8149 |
| No log | 5.0952 | 214 | 0.5800 | 0.3023 | 0.5800 | 0.7616 |
| No log | 5.1429 | 216 | 0.5700 | 0.4346 | 0.5700 | 0.7550 |
| No log | 5.1905 | 218 | 0.6219 | 0.2161 | 0.6219 | 0.7886 |
| No log | 5.2381 | 220 | 0.5585 | 0.4468 | 0.5585 | 0.7473 |
| No log | 5.2857 | 222 | 0.5453 | 0.3412 | 0.5453 | 0.7384 |
| No log | 5.3333 | 224 | 0.5601 | 0.3054 | 0.5601 | 0.7484 |
| No log | 5.3810 | 226 | 0.6312 | 0.2332 | 0.6312 | 0.7945 |
| No log | 5.4286 | 228 | 0.5592 | 0.3103 | 0.5592 | 0.7478 |
| No log | 5.4762 | 230 | 0.5012 | 0.2393 | 0.5012 | 0.7080 |
| No log | 5.5238 | 232 | 0.4901 | 0.4012 | 0.4901 | 0.7001 |
| No log | 5.5714 | 234 | 0.4970 | 0.2832 | 0.4970 | 0.7050 |
| No log | 5.6190 | 236 | 0.4824 | 0.3520 | 0.4824 | 0.6946 |
| No log | 5.6667 | 238 | 0.4904 | 0.3520 | 0.4904 | 0.7003 |
| No log | 5.7143 | 240 | 0.5514 | 0.2421 | 0.5514 | 0.7425 |
| No log | 5.7619 | 242 | 0.5244 | 0.2965 | 0.5244 | 0.7241 |
| No log | 5.8095 | 244 | 0.4921 | 0.4975 | 0.4921 | 0.7015 |
| No log | 5.8571 | 246 | 0.4962 | 0.4286 | 0.4962 | 0.7044 |
| No log | 5.9048 | 248 | 0.5779 | 0.2487 | 0.5779 | 0.7602 |
| No log | 5.9524 | 250 | 0.6015 | 0.2965 | 0.6015 | 0.7756 |
| No log | 6.0 | 252 | 0.5603 | 0.25 | 0.5603 | 0.7485 |
| No log | 6.0476 | 254 | 0.5043 | 0.3478 | 0.5043 | 0.7101 |
| No log | 6.0952 | 256 | 0.4915 | 0.3684 | 0.4915 | 0.7010 |
| No log | 6.1429 | 258 | 0.4939 | 0.2994 | 0.4939 | 0.7028 |
| No log | 6.1905 | 260 | 0.5069 | 0.3520 | 0.5069 | 0.7119 |
| No log | 6.2381 | 262 | 0.5182 | 0.3708 | 0.5182 | 0.7198 |
| No log | 6.2857 | 264 | 0.4995 | 0.3407 | 0.4995 | 0.7067 |
| No log | 6.3333 | 266 | 0.4967 | 0.4043 | 0.4967 | 0.7048 |
| No log | 6.3810 | 268 | 0.5087 | 0.2917 | 0.5087 | 0.7132 |
| No log | 6.4286 | 270 | 0.6362 | 0.2919 | 0.6362 | 0.7976 |
| No log | 6.4762 | 272 | 0.8543 | 0.3410 | 0.8543 | 0.9243 |
| No log | 6.5238 | 274 | 0.8244 | 0.3333 | 0.8244 | 0.9080 |
| No log | 6.5714 | 276 | 0.6392 | 0.2919 | 0.6392 | 0.7995 |
| No log | 6.6190 | 278 | 0.5112 | 0.3131 | 0.5112 | 0.7150 |
| No log | 6.6667 | 280 | 0.4999 | 0.4286 | 0.4999 | 0.7070 |
| No log | 6.7143 | 282 | 0.5196 | 0.2889 | 0.5196 | 0.7208 |
| No log | 6.7619 | 284 | 0.5773 | 0.2593 | 0.5773 | 0.7598 |
| No log | 6.8095 | 286 | 0.5910 | 0.2258 | 0.5910 | 0.7688 |
| No log | 6.8571 | 288 | 0.5901 | 0.2258 | 0.5901 | 0.7682 |
| No log | 6.9048 | 290 | 0.5623 | 0.2184 | 0.5623 | 0.7499 |
| No log | 6.9524 | 292 | 0.5190 | 0.3023 | 0.5190 | 0.7204 |
| No log | 7.0 | 294 | 0.5099 | 0.3810 | 0.5099 | 0.7141 |
| No log | 7.0476 | 296 | 0.5250 | 0.3023 | 0.5250 | 0.7246 |
| No log | 7.0952 | 298 | 0.5258 | 0.3708 | 0.5258 | 0.7251 |
| No log | 7.1429 | 300 | 0.5059 | 0.3149 | 0.5059 | 0.7112 |
| No log | 7.1905 | 302 | 0.5053 | 0.3478 | 0.5053 | 0.7108 |
| No log | 7.2381 | 304 | 0.5139 | 0.3708 | 0.5139 | 0.7169 |
| No log | 7.2857 | 306 | 0.5457 | 0.2542 | 0.5457 | 0.7387 |
| No log | 7.3333 | 308 | 0.5637 | 0.2513 | 0.5637 | 0.7508 |
| No log | 7.3810 | 310 | 0.5703 | 0.25 | 0.5703 | 0.7552 |
| No log | 7.4286 | 312 | 0.5167 | 0.3548 | 0.5167 | 0.7188 |
| No log | 7.4762 | 314 | 0.4953 | 0.4157 | 0.4953 | 0.7038 |
| No log | 7.5238 | 316 | 0.4984 | 0.4947 | 0.4984 | 0.7060 |
| No log | 7.5714 | 318 | 0.4964 | 0.4105 | 0.4964 | 0.7045 |
| No log | 7.6190 | 320 | 0.4975 | 0.4105 | 0.4975 | 0.7054 |
| No log | 7.6667 | 322 | 0.5105 | 0.3016 | 0.5105 | 0.7145 |
| No log | 7.7143 | 324 | 0.5652 | 0.2893 | 0.5652 | 0.7518 |
| No log | 7.7619 | 326 | 0.5766 | 0.2850 | 0.5766 | 0.7593 |
| No log | 7.8095 | 328 | 0.5866 | 0.2850 | 0.5866 | 0.7659 |
| No log | 7.8571 | 330 | 0.5746 | 0.2871 | 0.5746 | 0.7580 |
| No log | 7.9048 | 332 | 0.5318 | 0.3297 | 0.5318 | 0.7293 |
| No log | 7.9524 | 334 | 0.5101 | 0.3978 | 0.5101 | 0.7142 |
| No log | 8.0 | 336 | 0.4899 | 0.4105 | 0.4899 | 0.6999 |
| No log | 8.0476 | 338 | 0.4893 | 0.3913 | 0.4893 | 0.6995 |
| No log | 8.0952 | 340 | 0.5074 | 0.2889 | 0.5074 | 0.7123 |
| No log | 8.1429 | 342 | 0.5567 | 0.3016 | 0.5567 | 0.7461 |
| No log | 8.1905 | 344 | 0.5971 | 0.2941 | 0.5971 | 0.7727 |
| No log | 8.2381 | 346 | 0.5983 | 0.2941 | 0.5983 | 0.7735 |
| No log | 8.2857 | 348 | 0.5563 | 0.3016 | 0.5563 | 0.7459 |
| No log | 8.3333 | 350 | 0.5264 | 0.2967 | 0.5264 | 0.7255 |
| No log | 8.3810 | 352 | 0.4993 | 0.3978 | 0.4993 | 0.7066 |
| No log | 8.4286 | 354 | 0.4873 | 0.3797 | 0.4873 | 0.6981 |
| No log | 8.4762 | 356 | 0.4887 | 0.3797 | 0.4887 | 0.6991 |
| No log | 8.5238 | 358 | 0.4947 | 0.3149 | 0.4947 | 0.7033 |
| No log | 8.5714 | 360 | 0.5117 | 0.2889 | 0.5117 | 0.7153 |
| No log | 8.6190 | 362 | 0.5205 | 0.2889 | 0.5205 | 0.7215 |
| No log | 8.6667 | 364 | 0.5082 | 0.2889 | 0.5082 | 0.7129 |
| No log | 8.7143 | 366 | 0.4923 | 0.3182 | 0.4923 | 0.7016 |
| No log | 8.7619 | 368 | 0.4863 | 0.3778 | 0.4863 | 0.6974 |
| No log | 8.8095 | 370 | 0.4873 | 0.3898 | 0.4873 | 0.6981 |
| No log | 8.8571 | 372 | 0.4891 | 0.3898 | 0.4891 | 0.6994 |
| No log | 8.9048 | 374 | 0.4895 | 0.3898 | 0.4895 | 0.6996 |
| No log | 8.9524 | 376 | 0.4876 | 0.3898 | 0.4876 | 0.6983 |
| No log | 9.0 | 378 | 0.4854 | 0.3898 | 0.4854 | 0.6967 |
| No log | 9.0476 | 380 | 0.4818 | 0.4667 | 0.4818 | 0.6941 |
| No log | 9.0952 | 382 | 0.4812 | 0.4839 | 0.4812 | 0.6937 |
| No log | 9.1429 | 384 | 0.4852 | 0.3846 | 0.4852 | 0.6966 |
| No log | 9.1905 | 386 | 0.4935 | 0.3224 | 0.4935 | 0.7025 |
| No log | 9.2381 | 388 | 0.5049 | 0.3224 | 0.5049 | 0.7105 |
| No log | 9.2857 | 390 | 0.5071 | 0.3224 | 0.5071 | 0.7121 |
| No log | 9.3333 | 392 | 0.5162 | 0.2865 | 0.5162 | 0.7185 |
| No log | 9.3810 | 394 | 0.5289 | 0.2865 | 0.5289 | 0.7273 |
| No log | 9.4286 | 396 | 0.5276 | 0.2865 | 0.5276 | 0.7263 |
| No log | 9.4762 | 398 | 0.5205 | 0.2865 | 0.5205 | 0.7215 |
| No log | 9.5238 | 400 | 0.5143 | 0.2865 | 0.5143 | 0.7172 |
| No log | 9.5714 | 402 | 0.5115 | 0.2865 | 0.5115 | 0.7152 |
| No log | 9.6190 | 404 | 0.5037 | 0.3224 | 0.5037 | 0.7097 |
| No log | 9.6667 | 406 | 0.4983 | 0.3224 | 0.4983 | 0.7059 |
| No log | 9.7143 | 408 | 0.4970 | 0.3224 | 0.4970 | 0.7050 |
| No log | 9.7619 | 410 | 0.4918 | 0.3224 | 0.4918 | 0.7013 |
| No log | 9.8095 | 412 | 0.4882 | 0.3224 | 0.4882 | 0.6987 |
| No log | 9.8571 | 414 | 0.4860 | 0.3224 | 0.4860 | 0.6972 |
| No log | 9.9048 | 416 | 0.4840 | 0.3258 | 0.4840 | 0.6957 |
| No log | 9.9524 | 418 | 0.4821 | 0.4033 | 0.4821 | 0.6943 |
| No log | 10.0 | 420 | 0.4813 | 0.4545 | 0.4813 | 0.6938 |
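
In the table, Validation Loss equals Mse in every row, consistent with a mean-squared-error (regression) training objective; Rmse is its square root, and Qwk is quadratic weighted kappa (Cohen's kappa with quadratic weights) over discrete scores. Below is a minimal sketch of how these columns can be computed, assuming integer gold scores and real-valued model predictions that are rounded before the kappa computation.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_columns(y_true, y_pred):
    """Compute the Qwk/Mse/Rmse columns (sketch; assumes integer gold scores)."""
    mse = mean_squared_error(y_true, y_pred)
    qwk = cohen_kappa_score(
        np.asarray(y_true, dtype=int),
        np.rint(y_pred).astype(int),  # assumption: predictions rounded to score levels
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Hypothetical example:
print(eval_columns([0, 1, 2, 3], [0.2, 1.1, 1.8, 2.6]))
```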

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1