ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k6_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (see the metric sketch after the list):

  • Loss: 0.9405
  • Qwk: 0.5089
  • Mse: 0.9405
  • Rmse: 0.9698
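
For reference, Qwk here is quadratic weighted Cohen's kappa and Rmse is the square root of Mse. Below is a minimal sketch of how these metrics can be reproduced with scikit-learn; the label arrays are placeholders, not the actual evaluation data:

```python
# Sketch: computing the evaluation metrics above with scikit-learn.
# y_true / y_pred are placeholders standing in for the real evaluation
# labels and model predictions, which are not part of this card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 4, 2, 5, 3])  # placeholder gold scores
y_pred = np.array([3, 3, 2, 4, 4])  # placeholder predicted scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```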

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (expressed as a TrainingArguments sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
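
A minimal sketch of these hyperparameters expressed as transformers.TrainingArguments; only the values listed above come from this card, while output_dir and any surrounding Trainer wiring are placeholders:

```python
# Sketch: the hyperparameters above as transformers.TrainingArguments.
# output_dir is a placeholder; everything else mirrors the list in this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # placeholder path
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```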

Training results

The training-loss column reads "No log" because, over the 330 optimizer steps of this run, training loss was logged less frequently than evaluation was performed (every 2 steps).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0606 2 4.0888 -0.0169 4.0888 2.0221
No log 0.1212 4 2.1197 0.0305 2.1197 1.4559
No log 0.1818 6 1.2076 0.0520 1.2076 1.0989
No log 0.2424 8 0.8125 0.0608 0.8125 0.9014
No log 0.3030 10 0.6724 0.2630 0.6724 0.8200
No log 0.3636 12 0.6457 0.2333 0.6457 0.8035
No log 0.4242 14 0.6338 0.2990 0.6338 0.7961
No log 0.4848 16 0.6415 0.2584 0.6415 0.8009
No log 0.5455 18 0.6145 0.3080 0.6145 0.7839
No log 0.6061 20 0.8421 0.3049 0.8421 0.9177
No log 0.6667 22 0.9275 0.3483 0.9275 0.9631
No log 0.7273 24 0.7728 0.3342 0.7728 0.8791
No log 0.7879 26 0.6362 0.2725 0.6362 0.7976
No log 0.8485 28 0.5744 0.3316 0.5744 0.7579
No log 0.9091 30 0.5522 0.3611 0.5522 0.7431
No log 0.9697 32 0.5411 0.4388 0.5411 0.7356
No log 1.0303 34 0.5514 0.4250 0.5514 0.7426
No log 1.0909 36 0.5896 0.3907 0.5896 0.7679
No log 1.1515 38 0.6093 0.4334 0.6093 0.7806
No log 1.2121 40 0.5933 0.4494 0.5933 0.7703
No log 1.2727 42 0.5963 0.4390 0.5963 0.7722
No log 1.3333 44 0.6083 0.4680 0.6083 0.7799
No log 1.3939 46 0.6363 0.5043 0.6363 0.7977
No log 1.4545 48 0.6500 0.4568 0.6500 0.8062
No log 1.5152 50 0.6761 0.4894 0.6761 0.8223
No log 1.5758 52 0.7015 0.5227 0.7015 0.8376
No log 1.6364 54 0.7200 0.5034 0.7200 0.8485
No log 1.6970 56 0.7029 0.5442 0.7029 0.8384
No log 1.7576 58 0.8238 0.4951 0.8238 0.9077
No log 1.8182 60 1.2185 0.4034 1.2185 1.1039
No log 1.8788 62 1.3755 0.3739 1.3755 1.1728
No log 1.9394 64 0.9272 0.4584 0.9272 0.9629
No log 2.0 66 0.6530 0.5749 0.6530 0.8081
No log 2.0606 68 0.6576 0.5667 0.6576 0.8109
No log 2.1212 70 0.9464 0.4532 0.9464 0.9728
No log 2.1818 72 1.7678 0.3165 1.7678 1.3296
No log 2.2424 74 2.0035 0.2700 2.0035 1.4155
No log 2.3030 76 1.5794 0.3607 1.5794 1.2567
No log 2.3636 78 0.9566 0.4586 0.9566 0.9781
No log 2.4242 80 0.7282 0.5367 0.7282 0.8533
No log 2.4848 82 0.7449 0.5427 0.7449 0.8631
No log 2.5455 84 0.9095 0.4969 0.9095 0.9537
No log 2.6061 86 1.1316 0.4828 1.1316 1.0638
No log 2.6667 88 1.0308 0.4860 1.0308 1.0153
No log 2.7273 90 0.8666 0.5297 0.8666 0.9309
No log 2.7879 92 0.8621 0.5286 0.8621 0.9285
No log 2.8485 94 1.0027 0.4865 1.0027 1.0013
No log 2.9091 96 1.1457 0.4513 1.1457 1.0704
No log 2.9697 98 1.1150 0.4817 1.1150 1.0559
No log 3.0303 100 0.9501 0.5267 0.9501 0.9748
No log 3.0909 102 0.8593 0.5393 0.8593 0.9270
No log 3.1515 104 0.8691 0.5145 0.8691 0.9322
No log 3.2121 106 0.7926 0.5402 0.7926 0.8903
No log 3.2727 108 0.8991 0.5114 0.8991 0.9482
No log 3.3333 110 1.0857 0.4592 1.0857 1.0420
No log 3.3939 112 1.0098 0.4789 1.0098 1.0049
No log 3.4545 114 0.8313 0.5213 0.8313 0.9117
No log 3.5152 116 0.8999 0.5057 0.8999 0.9486
No log 3.5758 118 1.1716 0.4687 1.1716 1.0824
No log 3.6364 120 1.5070 0.3895 1.5070 1.2276
No log 3.6970 122 1.4270 0.4077 1.4270 1.1946
No log 3.7576 124 1.0935 0.5109 1.0935 1.0457
No log 3.8182 126 0.9572 0.4944 0.9572 0.9784
No log 3.8788 128 0.9726 0.5194 0.9726 0.9862
No log 3.9394 130 0.9400 0.5279 0.9400 0.9695
No log 4.0 132 0.8391 0.5437 0.8391 0.9160
No log 4.0606 134 0.8132 0.5211 0.8132 0.9018
No log 4.1212 136 0.8040 0.5059 0.8040 0.8967
No log 4.1818 138 0.8767 0.5206 0.8767 0.9363
No log 4.2424 140 1.0744 0.5143 1.0744 1.0365
No log 4.3030 142 1.1758 0.4671 1.1758 1.0843
No log 4.3636 144 1.0645 0.4779 1.0645 1.0318
No log 4.4242 146 0.9991 0.4892 0.9991 0.9995
No log 4.4848 148 1.0141 0.4838 1.0141 1.0070
No log 4.5455 150 1.1280 0.4806 1.1280 1.0621
No log 4.6061 152 1.4540 0.4033 1.4540 1.2058
No log 4.6667 154 1.6327 0.3854 1.6327 1.2778
No log 4.7273 156 1.4520 0.3916 1.4520 1.2050
No log 4.7879 158 1.1291 0.4531 1.1291 1.0626
No log 4.8485 160 0.9116 0.4855 0.9116 0.9548
No log 4.9091 162 0.8826 0.5299 0.8826 0.9395
No log 4.9697 164 0.9277 0.5431 0.9277 0.9632
No log 5.0303 166 0.9089 0.5334 0.9089 0.9534
No log 5.0909 168 0.9150 0.4692 0.9150 0.9566
No log 5.1515 170 0.9650 0.4696 0.9650 0.9823
No log 5.2121 172 0.9099 0.4717 0.9099 0.9539
No log 5.2727 174 0.8966 0.4741 0.8966 0.9469
No log 5.3333 176 0.9218 0.4768 0.9218 0.9601
No log 5.3939 178 1.0144 0.4867 1.0144 1.0072
No log 5.4545 180 0.9917 0.4624 0.9917 0.9959
No log 5.5152 182 0.9363 0.4892 0.9363 0.9676
No log 5.5758 184 0.9555 0.4823 0.9555 0.9775
No log 5.6364 186 0.9971 0.5125 0.9971 0.9985
No log 5.6970 188 1.0824 0.4752 1.0824 1.0404
No log 5.7576 190 1.1558 0.4669 1.1558 1.0751
No log 5.8182 192 1.2129 0.4723 1.2129 1.1013
No log 5.8788 194 1.2185 0.4347 1.2185 1.1038
No log 5.9394 196 1.0985 0.4709 1.0985 1.0481
No log 6.0 198 0.9874 0.4918 0.9874 0.9937
No log 6.0606 200 0.9197 0.4980 0.9197 0.9590
No log 6.1212 202 0.8278 0.5073 0.8278 0.9098
No log 6.1818 204 0.8211 0.4956 0.8211 0.9062
No log 6.2424 206 0.8714 0.4991 0.8714 0.9335
No log 6.3030 208 0.9545 0.5009 0.9545 0.9770
No log 6.3636 210 1.0105 0.4938 1.0105 1.0052
No log 6.4242 212 1.0976 0.4740 1.0976 1.0477
No log 6.4848 214 1.0450 0.4883 1.0450 1.0223
No log 6.5455 216 1.0157 0.5054 1.0157 1.0078
No log 6.6061 218 0.9583 0.4938 0.9583 0.9789
No log 6.6667 220 0.9670 0.5130 0.9670 0.9834
No log 6.7273 222 0.9839 0.5070 0.9839 0.9919
No log 6.7879 224 1.0756 0.4619 1.0756 1.0371
No log 6.8485 226 1.0975 0.4561 1.0975 1.0476
No log 6.9091 228 1.0688 0.4449 1.0688 1.0338
No log 6.9697 230 1.0049 0.4908 1.0049 1.0025
No log 7.0303 232 1.0105 0.4872 1.0105 1.0053
No log 7.0909 234 1.0260 0.4924 1.0260 1.0129
No log 7.1515 236 1.0511 0.4929 1.0511 1.0252
No log 7.2121 238 1.0361 0.4889 1.0361 1.0179
No log 7.2727 240 0.9872 0.5036 0.9872 0.9936
No log 7.3333 242 0.8882 0.5071 0.8882 0.9425
No log 7.3939 244 0.8432 0.5079 0.8432 0.9183
No log 7.4545 246 0.8799 0.5065 0.8799 0.9380
No log 7.5152 248 0.9436 0.4879 0.9436 0.9714
No log 7.5758 250 0.9642 0.5002 0.9642 0.9820
No log 7.6364 252 0.9535 0.5187 0.9535 0.9765
No log 7.6970 254 0.9558 0.5116 0.9558 0.9777
No log 7.7576 256 0.9859 0.5025 0.9859 0.9929
No log 7.8182 258 1.0300 0.4991 1.0300 1.0149
No log 7.8788 260 1.0732 0.5091 1.0732 1.0360
No log 7.9394 262 1.1230 0.4771 1.1230 1.0597
No log 8.0 264 1.1741 0.4844 1.1741 1.0836
No log 8.0606 266 1.1571 0.4841 1.1571 1.0757
No log 8.1212 268 1.0813 0.4922 1.0813 1.0399
No log 8.1818 270 1.0072 0.5028 1.0072 1.0036
No log 8.2424 272 0.9336 0.5043 0.9336 0.9662
No log 8.3030 274 0.9061 0.4847 0.9061 0.9519
No log 8.3636 276 0.8741 0.4998 0.8741 0.9349
No log 8.4242 278 0.8557 0.4964 0.8557 0.9251
No log 8.4848 280 0.8560 0.5012 0.8560 0.9252
No log 8.5455 282 0.8927 0.4922 0.8927 0.9448
No log 8.6061 284 0.9634 0.5043 0.9634 0.9815
No log 8.6667 286 1.0272 0.4994 1.0272 1.0135
No log 8.7273 288 1.1001 0.4948 1.1001 1.0489
No log 8.7879 290 1.1212 0.4992 1.1212 1.0588
No log 8.8485 292 1.0927 0.4948 1.0927 1.0453
No log 8.9091 294 1.0266 0.5089 1.0266 1.0132
No log 8.9697 296 0.9604 0.5244 0.9604 0.9800
No log 9.0303 298 0.9299 0.5244 0.9299 0.9643
No log 9.0909 300 0.8889 0.5116 0.8889 0.9428
No log 9.1515 302 0.8644 0.5125 0.8644 0.9297
No log 9.2121 304 0.8469 0.4910 0.8469 0.9203
No log 9.2727 306 0.8357 0.4910 0.8357 0.9142
No log 9.3333 308 0.8388 0.4910 0.8388 0.9159
No log 9.3939 310 0.8538 0.5134 0.8538 0.9240
No log 9.4545 312 0.8784 0.5116 0.8784 0.9372
No log 9.5152 314 0.8935 0.5116 0.8935 0.9452
No log 9.5758 316 0.9100 0.5116 0.9100 0.9539
No log 9.6364 318 0.9178 0.5097 0.9178 0.9580
No log 9.6970 320 0.9238 0.5097 0.9238 0.9612
No log 9.7576 322 0.9294 0.5097 0.9294 0.9640
No log 9.8182 324 0.9354 0.5089 0.9354 0.9671
No log 9.8788 326 0.9383 0.5089 0.9383 0.9686
No log 9.9394 328 0.9395 0.5089 0.9395 0.9693
No log 10.0 330 0.9405 0.5089 0.9405 0.9698
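
Validation Qwk peaks at 0.5749 around epoch 2.0 and settles at 0.5089 by the final epoch, so the last checkpoint is not the strongest one. If re-training, the Trainer can retain the best checkpoint automatically; a minimal sketch assuming a hypothetical compute_metrics function that returns a "qwk" key (not part of this card):

```python
# Sketch: keeping the best-Qwk checkpoint instead of the last one.
# Assumes a compute_metrics function that reports "qwk" (hypothetical);
# combine these with the hyperparameters listed earlier in the card.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./results",        # placeholder path
    eval_strategy="steps",         # evaluate every 2 steps, as in the table
    eval_steps=2,
    save_strategy="steps",
    save_steps=2,                  # must align with eval_steps
    load_best_model_at_end=True,   # restore the best checkpoint when done
    metric_for_best_model="qwk",
    greater_is_better=True,
)
```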

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
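
To try the checkpoint, it can be loaded with the standard transformers API. A hedged sketch: the head type is an assumption (the Mse/Rmse metrics suggest a single regression-style output, but the card does not say), and the input sentence is a placeholder:

```python
# Sketch: loading this checkpoint for inference with transformers.
# The sequence-classification head is an assumption based on the
# Mse/Rmse metrics; the input text is a placeholder.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k6_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt")  # placeholder Arabic text
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # raw score(s) from the model head
```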