ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9862
  • Qwk: 0.4234
  • Mse: 0.9862
  • Rmse: 0.9931
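These metrics can be recomputed from model predictions and gold labels. A minimal sketch using scikit-learn (the helper name `compute_metrics` and the dummy scores are illustrative, not part of the original run):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true, y_pred):
    """QWK, MSE, and RMSE for ordinal score predictions."""
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    rmse = float(np.sqrt(mse))
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Dummy example: one prediction off by one point
m = compute_metrics([0, 1, 2, 3, 4], [0, 1, 2, 2, 4])
```

Note that Qwk here is the quadratic weighted kappa, which is why Mse equals the reported Loss: the model is evaluated as an ordinal regression task.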

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
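The linear scheduler decays the learning rate from 2e-05 toward zero over training. A small sketch of that schedule (the 430 total optimizer steps are inferred from the results table, where step 430 corresponds to epoch 10.0; zero warmup is an assumption):

```python
def linear_lr(step, base_lr=2e-05, total_steps=430, warmup_steps=0):
    """Learning rate under a linear-decay schedule with optional warmup.

    Assumes 430 total optimizer steps (43 per epoch x 10 epochs,
    matching the Step/Epoch columns in the results table below).
    """
    if warmup_steps and step < warmup_steps:
        # Linear ramp-up during warmup
        return base_lr * step / warmup_steps
    # Linear decay from base_lr at the end of warmup to 0 at total_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

In the actual run this schedule is applied internally by the Trainer; the sketch only illustrates the decay shape (e.g. the rate is halved to 1e-05 at the midpoint, step 215).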

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0465 2 3.9811 0.0288 3.9811 1.9953
No log 0.0930 4 2.1979 0.0193 2.1979 1.4825
No log 0.1395 6 1.3523 0.0528 1.3523 1.1629
No log 0.1860 8 0.8366 -0.0270 0.8366 0.9147
No log 0.2326 10 0.7129 0.1667 0.7129 0.8443
No log 0.2791 12 0.7662 0.1819 0.7662 0.8753
No log 0.3256 14 0.8084 0.1442 0.8084 0.8991
No log 0.3721 16 0.7843 0.1208 0.7843 0.8856
No log 0.4186 18 0.7537 0.1314 0.7537 0.8682
No log 0.4651 20 0.7773 0.0111 0.7773 0.8816
No log 0.5116 22 0.7650 0.0634 0.7650 0.8747
No log 0.5581 24 0.7312 0.1202 0.7312 0.8551
No log 0.6047 26 0.6664 0.2080 0.6664 0.8164
No log 0.6512 28 0.6485 0.2291 0.6485 0.8053
No log 0.6977 30 0.6460 0.1732 0.6460 0.8037
No log 0.7442 32 0.6356 0.2247 0.6356 0.7973
No log 0.7907 34 0.6073 0.3120 0.6073 0.7793
No log 0.8372 36 0.5950 0.2634 0.5950 0.7713
No log 0.8837 38 0.6065 0.2888 0.6065 0.7788
No log 0.9302 40 0.6436 0.3131 0.6436 0.8022
No log 0.9767 42 0.6547 0.2406 0.6547 0.8092
No log 1.0233 44 0.6434 0.2370 0.6434 0.8022
No log 1.0698 46 0.6045 0.2981 0.6045 0.7775
No log 1.1163 48 0.5631 0.2903 0.5631 0.7504
No log 1.1628 50 0.5708 0.2819 0.5708 0.7555
No log 1.2093 52 0.6039 0.3376 0.6039 0.7771
No log 1.2558 54 0.6362 0.3762 0.6362 0.7976
No log 1.3023 56 0.6957 0.4046 0.6957 0.8341
No log 1.3488 58 0.9014 0.3999 0.9014 0.9494
No log 1.3953 60 1.0357 0.3931 1.0357 1.0177
No log 1.4419 62 1.1420 0.3591 1.1420 1.0686
No log 1.4884 64 1.0695 0.3131 1.0695 1.0342
No log 1.5349 66 0.8573 0.3831 0.8573 0.9259
No log 1.5814 68 0.8595 0.3806 0.8595 0.9271
No log 1.6279 70 0.9108 0.3786 0.9108 0.9543
No log 1.6744 72 0.8619 0.4284 0.8619 0.9284
No log 1.7209 74 0.8194 0.4108 0.8194 0.9052
No log 1.7674 76 0.8655 0.4207 0.8655 0.9303
No log 1.8140 78 0.9058 0.4111 0.9058 0.9517
No log 1.8605 80 1.0869 0.3785 1.0869 1.0425
No log 1.9070 82 1.0739 0.3986 1.0739 1.0363
No log 1.9535 84 0.9139 0.4085 0.9139 0.9560
No log 2.0 86 0.7450 0.5070 0.7450 0.8631
No log 2.0465 88 0.7118 0.5040 0.7118 0.8437
No log 2.0930 90 0.7894 0.4208 0.7894 0.8885
No log 2.1395 92 0.7441 0.4725 0.7441 0.8626
No log 2.1860 94 0.7193 0.5112 0.7193 0.8481
No log 2.2326 96 0.6548 0.5374 0.6548 0.8092
No log 2.2791 98 0.6723 0.4981 0.6723 0.8199
No log 2.3256 100 0.7205 0.5157 0.7205 0.8488
No log 2.3721 102 0.9068 0.4658 0.9068 0.9523
No log 2.4186 104 1.0965 0.4053 1.0965 1.0471
No log 2.4651 106 1.0329 0.4298 1.0329 1.0163
No log 2.5116 108 0.8722 0.4761 0.8722 0.9339
No log 2.5581 110 0.7961 0.5080 0.7961 0.8923
No log 2.6047 112 0.8177 0.4287 0.8177 0.9043
No log 2.6512 114 0.8636 0.3838 0.8636 0.9293
No log 2.6977 116 0.8150 0.4591 0.8150 0.9028
No log 2.7442 118 0.8007 0.5137 0.8007 0.8948
No log 2.7907 120 0.9518 0.4201 0.9518 0.9756
No log 2.8372 122 1.0747 0.3741 1.0747 1.0367
No log 2.8837 124 1.2822 0.3965 1.2822 1.1323
No log 2.9302 126 1.3416 0.3955 1.3416 1.1583
No log 2.9767 128 1.4514 0.3926 1.4514 1.2047
No log 3.0233 130 1.4700 0.3959 1.4700 1.2124
No log 3.0698 132 1.2595 0.3864 1.2595 1.1223
No log 3.1163 134 1.0354 0.4658 1.0354 1.0175
No log 3.1628 136 0.9774 0.4780 0.9774 0.9886
No log 3.2093 138 0.9725 0.4437 0.9725 0.9861
No log 3.2558 140 0.9069 0.4462 0.9069 0.9523
No log 3.3023 142 0.8225 0.4627 0.8225 0.9069
No log 3.3488 144 0.8174 0.4677 0.8174 0.9041
No log 3.3953 146 0.9456 0.4160 0.9456 0.9724
No log 3.4419 148 1.0507 0.3980 1.0507 1.0250
No log 3.4884 150 0.9772 0.4041 0.9772 0.9885
No log 3.5349 152 0.9549 0.4073 0.9549 0.9772
No log 3.5814 154 1.0482 0.4013 1.0482 1.0238
No log 3.6279 156 1.0520 0.4150 1.0520 1.0256
No log 3.6744 158 0.9572 0.4330 0.9572 0.9784
No log 3.7209 160 0.9217 0.4034 0.9217 0.9601
No log 3.7674 162 0.9322 0.3902 0.9322 0.9655
No log 3.8140 164 0.9500 0.3923 0.9500 0.9747
No log 3.8605 166 1.0662 0.4320 1.0662 1.0326
No log 3.9070 168 1.0724 0.4054 1.0724 1.0356
No log 3.9535 170 0.9399 0.4007 0.9399 0.9695
No log 4.0 172 0.7982 0.4604 0.7982 0.8934
No log 4.0465 174 0.7520 0.5312 0.7520 0.8672
No log 4.0930 176 0.7238 0.4753 0.7238 0.8508
No log 4.1395 178 0.6993 0.5214 0.6993 0.8362
No log 4.1860 180 0.8268 0.4865 0.8268 0.9093
No log 4.2326 182 1.0827 0.4291 1.0827 1.0405
No log 4.2791 184 1.2057 0.4196 1.2057 1.0981
No log 4.3256 186 1.1453 0.4290 1.1453 1.0702
No log 4.3721 188 1.0208 0.4263 1.0208 1.0104
No log 4.4186 190 0.8585 0.4579 0.8585 0.9265
No log 4.4651 192 0.7791 0.4979 0.7791 0.8827
No log 4.5116 194 0.7144 0.4859 0.7144 0.8452
No log 4.5581 196 0.7131 0.5223 0.7131 0.8445
No log 4.6047 198 0.7448 0.5036 0.7448 0.8630
No log 4.6512 200 0.8906 0.4678 0.8906 0.9437
No log 4.6977 202 1.2381 0.3714 1.2381 1.1127
No log 4.7442 204 1.5560 0.3385 1.5560 1.2474
No log 4.7907 206 1.5886 0.3267 1.5886 1.2604
No log 4.8372 208 1.4180 0.3667 1.4180 1.1908
No log 4.8837 210 1.2298 0.3994 1.2298 1.1090
No log 4.9302 212 0.9244 0.4559 0.9244 0.9614
No log 4.9767 214 0.7719 0.4568 0.7719 0.8786
No log 5.0233 216 0.7139 0.5225 0.7139 0.8450
No log 5.0698 218 0.6499 0.5464 0.6499 0.8062
No log 5.1163 220 0.6379 0.5496 0.6379 0.7987
No log 5.1628 222 0.6967 0.5169 0.6967 0.8347
No log 5.2093 224 0.8879 0.4501 0.8879 0.9423
No log 5.2558 226 1.1294 0.4304 1.1294 1.0627
No log 5.3023 228 1.2237 0.4316 1.2237 1.1062
No log 5.3488 230 1.1644 0.4526 1.1644 1.0791
No log 5.3953 232 1.0205 0.4644 1.0205 1.0102
No log 5.4419 234 0.9001 0.4832 0.9001 0.9487
No log 5.4884 236 0.8893 0.4618 0.8893 0.9430
No log 5.5349 238 0.9305 0.4264 0.9305 0.9646
No log 5.5814 240 0.9860 0.4500 0.9860 0.9930
No log 5.6279 242 0.9967 0.4498 0.9967 0.9983
No log 5.6744 244 1.0516 0.4457 1.0516 1.0255
No log 5.7209 246 1.0666 0.4619 1.0666 1.0328
No log 5.7674 248 1.1082 0.4250 1.1082 1.0527
No log 5.8140 250 1.1948 0.4154 1.1948 1.0930
No log 5.8605 252 1.2555 0.4154 1.2555 1.1205
No log 5.9070 254 1.2610 0.3965 1.2610 1.1229
No log 5.9535 256 1.1543 0.4250 1.1543 1.0744
No log 6.0 258 1.0155 0.4424 1.0155 1.0077
No log 6.0465 260 0.9028 0.4578 0.9028 0.9502
No log 6.0930 262 0.8522 0.4713 0.8522 0.9231
No log 6.1395 264 0.8701 0.4342 0.8701 0.9328
No log 6.1860 266 0.9148 0.4458 0.9148 0.9565
No log 6.2326 268 0.9395 0.4145 0.9395 0.9693
No log 6.2791 270 0.9555 0.4435 0.9555 0.9775
No log 6.3256 272 0.9421 0.4505 0.9421 0.9706
No log 6.3721 274 0.8796 0.4555 0.8796 0.9379
No log 6.4186 276 0.8619 0.4538 0.8619 0.9284
No log 6.4651 278 0.9040 0.4215 0.9040 0.9508
No log 6.5116 280 1.0282 0.3999 1.0282 1.0140
No log 6.5581 282 1.1277 0.4265 1.1277 1.0619
No log 6.6047 284 1.1224 0.4094 1.1224 1.0594
No log 6.6512 286 1.0650 0.4193 1.0650 1.0320
No log 6.6977 288 0.9899 0.4098 0.9899 0.9950
No log 6.7442 290 0.9665 0.4250 0.9665 0.9831
No log 6.7907 292 0.9819 0.4480 0.9819 0.9909
No log 6.8372 294 1.0418 0.4187 1.0418 1.0207
No log 6.8837 296 1.1358 0.4063 1.1358 1.0657
No log 6.9302 298 1.2986 0.4034 1.2986 1.1396
No log 6.9767 300 1.3656 0.3940 1.3656 1.1686
No log 7.0233 302 1.3891 0.3807 1.3891 1.1786
No log 7.0698 304 1.3191 0.3913 1.3191 1.1485
No log 7.1163 306 1.2061 0.3976 1.2061 1.0982
No log 7.1628 308 1.0762 0.4339 1.0762 1.0374
No log 7.2093 310 0.9729 0.4451 0.9729 0.9863
No log 7.2558 312 0.9226 0.4232 0.9226 0.9605
No log 7.3023 314 0.9409 0.4300 0.9409 0.9700
No log 7.3488 316 0.9928 0.4333 0.9928 0.9964
No log 7.3953 318 1.0183 0.4345 1.0183 1.0091
No log 7.4419 320 1.0075 0.4162 1.0075 1.0038
No log 7.4884 322 0.9876 0.4250 0.9876 0.9938
No log 7.5349 324 0.9853 0.4281 0.9853 0.9926
No log 7.5814 326 0.9994 0.4234 0.9994 0.9997
No log 7.6279 328 1.0500 0.4366 1.0500 1.0247
No log 7.6744 330 1.0899 0.4042 1.0899 1.0440
No log 7.7209 332 1.1176 0.4146 1.1176 1.0572
No log 7.7674 334 1.1105 0.4146 1.1105 1.0538
No log 7.8140 336 1.1222 0.4087 1.1222 1.0593
No log 7.8605 338 1.1119 0.4087 1.1119 1.0545
No log 7.9070 340 1.0510 0.4082 1.0510 1.0252
No log 7.9535 342 0.9618 0.4679 0.9618 0.9807
No log 8.0 344 0.9250 0.4624 0.9250 0.9618
No log 8.0465 346 0.9277 0.4624 0.9277 0.9631
No log 8.0930 348 0.9803 0.4340 0.9803 0.9901
No log 8.1395 350 1.0122 0.4292 1.0122 1.0061
No log 8.1860 352 1.0088 0.4292 1.0088 1.0044
No log 8.2326 354 0.9893 0.4263 0.9893 0.9946
No log 8.2791 356 0.9969 0.4263 0.9969 0.9984
No log 8.3256 358 1.0246 0.4263 1.0246 1.0122
No log 8.3721 360 1.0273 0.4307 1.0273 1.0135
No log 8.4186 362 1.0420 0.4145 1.0420 1.0208
No log 8.4651 364 1.0572 0.4145 1.0572 1.0282
No log 8.5116 366 1.1081 0.4146 1.1081 1.0527
No log 8.5581 368 1.1617 0.4087 1.1617 1.0778
No log 8.6047 370 1.1882 0.4005 1.1882 1.0900
No log 8.6512 372 1.1817 0.4162 1.1817 1.0870
No log 8.6977 374 1.1411 0.4087 1.1411 1.0682
No log 8.7442 376 1.1045 0.4146 1.1045 1.0509
No log 8.7907 378 1.0826 0.4145 1.0826 1.0405
No log 8.8372 380 1.0722 0.4145 1.0722 1.0354
No log 8.8837 382 1.0486 0.4145 1.0486 1.0240
No log 8.9302 384 1.0164 0.4401 1.0164 1.0082
No log 8.9767 386 1.0091 0.4462 1.0091 1.0045
No log 9.0233 388 1.0150 0.4354 1.0150 1.0074
No log 9.0698 390 1.0137 0.4263 1.0137 1.0068
No log 9.1163 392 1.0283 0.4263 1.0283 1.0140
No log 9.1628 394 1.0334 0.4115 1.0334 1.0165
No log 9.2093 396 1.0396 0.4175 1.0396 1.0196
No log 9.2558 398 1.0396 0.4175 1.0396 1.0196
No log 9.3023 400 1.0357 0.4174 1.0357 1.0177
No log 9.3488 402 1.0295 0.4174 1.0295 1.0146
No log 9.3953 404 1.0136 0.4174 1.0136 1.0068
No log 9.4419 406 0.9978 0.4174 0.9978 0.9989
No log 9.4884 408 0.9925 0.4175 0.9925 0.9962
No log 9.5349 410 0.9922 0.4175 0.9922 0.9961
No log 9.5814 412 0.9951 0.4175 0.9951 0.9976
No log 9.6279 414 0.9999 0.4175 0.9999 0.9999
No log 9.6744 416 0.9977 0.4175 0.9977 0.9989
No log 9.7209 418 0.9898 0.4175 0.9898 0.9949
No log 9.7674 420 0.9835 0.4234 0.9835 0.9917
No log 9.8140 422 0.9824 0.4234 0.9824 0.9912
No log 9.8605 424 0.9837 0.4234 0.9837 0.9918
No log 9.9070 426 0.9839 0.4234 0.9839 0.9919
No log 9.9535 428 0.9856 0.4234 0.9856 0.9928
No log 10.0 430 0.9862 0.4234 0.9862 0.9931

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (F32, safetensors)
