ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are computed follows the list):

  • Loss: 0.9868
  • Qwk (quadratic weighted kappa): 0.4234
  • Mse: 0.9868
  • Rmse: 0.9934
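
For reference, these metrics can be reproduced with scikit-learn. A minimal sketch, where `y_true` and `y_pred` are hypothetical placeholder arrays of gold and predicted scores on the same integer scale, not data from this model's evaluation set:

```python
# Sketch of the Qwk / Mse / Rmse computation; y_true and y_pred are
# hypothetical placeholders, not this model's actual evaluation data.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 1, 4, 2, 5])  # placeholder gold scores
y_pred = np.array([3, 2, 4, 2, 4])  # placeholder predicted scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```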

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent Trainer setup is sketched after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
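
These settings map directly onto `TrainingArguments` in the Transformers `Trainer` API. Below is a minimal sketch of an equivalent setup; the toy two-example dataset and the single-output regression head (`num_labels=1`, suggested by the Mse/Rmse metrics) are assumptions, since the card does not document the data or the head type:

```python
# Sketch of a Trainer setup matching the hyperparameters above.
# The toy dataset and num_labels=1 are assumptions, not from the card.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv02")
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", num_labels=1)  # assumed regression head

# Placeholder data; replace with the real (undocumented) train/eval splits.
raw = Dataset.from_dict({"text": ["نص تجريبي أول", "نص تجريبي ثان"],
                         "label": [3.0, 4.0]})
ds = raw.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=128),
             batched=True)

args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)

trainer = Trainer(model=model, args=args, train_dataset=ds, eval_dataset=ds,
                  tokenizer=tokenizer)
trainer.train()
```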

Training results

The training loss column reads "No log" throughout because the run's 430 optimization steps never reached the Trainer's default logging interval (500 steps), so no training loss was recorded.

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0465 2 3.9811 0.0288 3.9811 1.9953
No log 0.0930 4 2.1979 0.0193 2.1979 1.4825
No log 0.1395 6 1.3523 0.0528 1.3523 1.1629
No log 0.1860 8 0.8366 -0.0270 0.8366 0.9147
No log 0.2326 10 0.7129 0.1667 0.7129 0.8443
No log 0.2791 12 0.7662 0.1819 0.7662 0.8753
No log 0.3256 14 0.8084 0.1442 0.8084 0.8991
No log 0.3721 16 0.7843 0.1208 0.7843 0.8856
No log 0.4186 18 0.7537 0.1314 0.7537 0.8682
No log 0.4651 20 0.7773 0.0111 0.7773 0.8816
No log 0.5116 22 0.7650 0.0634 0.7650 0.8747
No log 0.5581 24 0.7312 0.1202 0.7312 0.8551
No log 0.6047 26 0.6664 0.2080 0.6664 0.8164
No log 0.6512 28 0.6485 0.2291 0.6485 0.8053
No log 0.6977 30 0.6460 0.1732 0.6460 0.8037
No log 0.7442 32 0.6357 0.2247 0.6357 0.7973
No log 0.7907 34 0.6073 0.3120 0.6073 0.7793
No log 0.8372 36 0.5950 0.2634 0.5950 0.7713
No log 0.8837 38 0.6065 0.2888 0.6065 0.7788
No log 0.9302 40 0.6436 0.3131 0.6436 0.8022
No log 0.9767 42 0.6547 0.2406 0.6547 0.8092
No log 1.0233 44 0.6434 0.2370 0.6434 0.8021
No log 1.0698 46 0.6045 0.2981 0.6045 0.7775
No log 1.1163 48 0.5631 0.2903 0.5631 0.7504
No log 1.1628 50 0.5709 0.2819 0.5709 0.7555
No log 1.2093 52 0.6039 0.3376 0.6039 0.7771
No log 1.2558 54 0.6362 0.3762 0.6362 0.7976
No log 1.3023 56 0.6957 0.4046 0.6957 0.8341
No log 1.3488 58 0.9013 0.3999 0.9013 0.9494
No log 1.3953 60 1.0357 0.3931 1.0357 1.0177
No log 1.4419 62 1.1420 0.3591 1.1420 1.0686
No log 1.4884 64 1.0695 0.3131 1.0695 1.0342
No log 1.5349 66 0.8573 0.3831 0.8573 0.9259
No log 1.5814 68 0.8595 0.3806 0.8595 0.9271
No log 1.6279 70 0.9107 0.3786 0.9107 0.9543
No log 1.6744 72 0.8618 0.4284 0.8618 0.9283
No log 1.7209 74 0.8193 0.4108 0.8193 0.9052
No log 1.7674 76 0.8655 0.4207 0.8655 0.9303
No log 1.8140 78 0.9057 0.4111 0.9057 0.9517
No log 1.8605 80 1.0868 0.3785 1.0868 1.0425
No log 1.9070 82 1.0739 0.3986 1.0739 1.0363
No log 1.9535 84 0.9139 0.4085 0.9139 0.9560
No log 2.0 86 0.7451 0.5070 0.7451 0.8632
No log 2.0465 88 0.7118 0.5040 0.7118 0.8437
No log 2.0930 90 0.7895 0.4208 0.7895 0.8885
No log 2.1395 92 0.7440 0.4725 0.7440 0.8626
No log 2.1860 94 0.7192 0.5112 0.7192 0.8480
No log 2.2326 96 0.6548 0.5374 0.6548 0.8092
No log 2.2791 98 0.6723 0.4981 0.6723 0.8199
No log 2.3256 100 0.7205 0.5157 0.7205 0.8488
No log 2.3721 102 0.9068 0.4658 0.9068 0.9523
No log 2.4186 104 1.0966 0.4053 1.0966 1.0472
No log 2.4651 106 1.0330 0.4298 1.0330 1.0164
No log 2.5116 108 0.8722 0.4761 0.8722 0.9339
No log 2.5581 110 0.7961 0.5080 0.7961 0.8922
No log 2.6047 112 0.8177 0.4287 0.8177 0.9043
No log 2.6512 114 0.8636 0.3838 0.8636 0.9293
No log 2.6977 116 0.8150 0.4591 0.8150 0.9028
No log 2.7442 118 0.8005 0.5137 0.8005 0.8947
No log 2.7907 120 0.9516 0.4250 0.9516 0.9755
No log 2.8372 122 1.0746 0.3741 1.0746 1.0366
No log 2.8837 124 1.2824 0.3965 1.2824 1.1324
No log 2.9302 126 1.3421 0.3955 1.3421 1.1585
No log 2.9767 128 1.4520 0.3926 1.4520 1.2050
No log 3.0233 130 1.4706 0.3959 1.4706 1.2127
No log 3.0698 132 1.2599 0.3864 1.2599 1.1224
No log 3.1163 134 1.0355 0.4658 1.0355 1.0176
No log 3.1628 136 0.9773 0.4780 0.9773 0.9886
No log 3.2093 138 0.9722 0.4437 0.9722 0.9860
No log 3.2558 140 0.9067 0.4462 0.9067 0.9522
No log 3.3023 142 0.8223 0.4627 0.8223 0.9068
No log 3.3488 144 0.8174 0.4677 0.8174 0.9041
No log 3.3953 146 0.9457 0.4160 0.9457 0.9725
No log 3.4419 148 1.0508 0.3980 1.0508 1.0251
No log 3.4884 150 0.9773 0.4041 0.9773 0.9886
No log 3.5349 152 0.9548 0.4073 0.9548 0.9772
No log 3.5814 154 1.0482 0.4013 1.0482 1.0238
No log 3.6279 156 1.0522 0.4150 1.0522 1.0257
No log 3.6744 158 0.9575 0.4330 0.9575 0.9785
No log 3.7209 160 0.9220 0.4034 0.9220 0.9602
No log 3.7674 162 0.9323 0.3902 0.9323 0.9656
No log 3.8140 164 0.9499 0.3923 0.9499 0.9746
No log 3.8605 166 1.0659 0.4379 1.0659 1.0324
No log 3.9070 168 1.0720 0.4054 1.0720 1.0354
No log 3.9535 170 0.9394 0.4007 0.9394 0.9693
No log 4.0 172 0.7980 0.4604 0.7980 0.8933
No log 4.0465 174 0.7521 0.5312 0.7521 0.8672
No log 4.0930 176 0.7238 0.4753 0.7238 0.8508
No log 4.1395 178 0.6995 0.5214 0.6995 0.8364
No log 4.1860 180 0.8270 0.4865 0.8270 0.9094
No log 4.2326 182 1.0829 0.4291 1.0829 1.0406
No log 4.2791 184 1.2059 0.4196 1.2059 1.0982
No log 4.3256 186 1.1456 0.4290 1.1456 1.0703
No log 4.3721 188 1.0212 0.4263 1.0212 1.0106
No log 4.4186 190 0.8589 0.4579 0.8589 0.9268
No log 4.4651 192 0.7794 0.4979 0.7794 0.8828
No log 4.5116 194 0.7145 0.5 0.7145 0.8453
No log 4.5581 196 0.7131 0.5223 0.7131 0.8445
No log 4.6047 198 0.7446 0.5036 0.7446 0.8629
No log 4.6512 200 0.8901 0.4678 0.8901 0.9435
No log 4.6977 202 1.2374 0.3714 1.2374 1.1124
No log 4.7442 204 1.5555 0.3385 1.5555 1.2472
No log 4.7907 206 1.5883 0.3267 1.5883 1.2603
No log 4.8372 208 1.4176 0.3667 1.4176 1.1906
No log 4.8837 210 1.2295 0.3994 1.2295 1.1088
No log 4.9302 212 0.9242 0.4559 0.9242 0.9614
No log 4.9767 214 0.7721 0.4568 0.7721 0.8787
No log 5.0233 216 0.7142 0.5225 0.7142 0.8451
No log 5.0698 218 0.6502 0.5464 0.6502 0.8063
No log 5.1163 220 0.6382 0.5496 0.6382 0.7988
No log 5.1628 222 0.6971 0.5169 0.6971 0.8349
No log 5.2093 224 0.8882 0.4501 0.8882 0.9425
No log 5.2558 226 1.1295 0.4304 1.1295 1.0628
No log 5.3023 228 1.2236 0.4316 1.2236 1.1062
No log 5.3488 230 1.1642 0.4526 1.1642 1.0790
No log 5.3953 232 1.0202 0.4644 1.0202 1.0101
No log 5.4419 234 0.8999 0.4832 0.8999 0.9487
No log 5.4884 236 0.8892 0.4618 0.8892 0.9430
No log 5.5349 238 0.9302 0.4264 0.9302 0.9645
No log 5.5814 240 0.9855 0.4500 0.9855 0.9927
No log 5.6279 242 0.9961 0.4498 0.9961 0.9981
No log 5.6744 244 1.0513 0.4457 1.0513 1.0253
No log 5.7209 246 1.0669 0.4619 1.0669 1.0329
No log 5.7674 248 1.1087 0.4250 1.1087 1.0530
No log 5.8140 250 1.1949 0.4154 1.1949 1.0931
No log 5.8605 252 1.2557 0.4154 1.2557 1.1206
No log 5.9070 254 1.2619 0.3965 1.2619 1.1233
No log 5.9535 256 1.1558 0.4250 1.1558 1.0751
No log 6.0 258 1.0170 0.4483 1.0170 1.0085
No log 6.0465 260 0.9039 0.4578 0.9039 0.9507
No log 6.0930 262 0.8528 0.4713 0.8528 0.9235
No log 6.1395 264 0.8705 0.4341 0.8705 0.9330
No log 6.1860 266 0.9149 0.4456 0.9149 0.9565
No log 6.2326 268 0.9394 0.4145 0.9394 0.9692
No log 6.2791 270 0.9551 0.4435 0.9551 0.9773
No log 6.3256 272 0.9415 0.4505 0.9415 0.9703
No log 6.3721 274 0.8793 0.4555 0.8793 0.9377
No log 6.4186 276 0.8619 0.4538 0.8619 0.9284
No log 6.4651 278 0.9044 0.4215 0.9044 0.9510
No log 6.5116 280 1.0289 0.3999 1.0289 1.0144
No log 6.5581 282 1.1286 0.4265 1.1286 1.0623
No log 6.6047 284 1.1232 0.4094 1.1232 1.0598
No log 6.6512 286 1.0655 0.4193 1.0655 1.0323
No log 6.6977 288 0.9901 0.4098 0.9901 0.9951
No log 6.7442 290 0.9666 0.4295 0.9666 0.9832
No log 6.7907 292 0.9821 0.4480 0.9821 0.9910
No log 6.8372 294 1.0422 0.4187 1.0422 1.0209
No log 6.8837 296 1.1367 0.4063 1.1367 1.0662
No log 6.9302 298 1.3000 0.4034 1.3000 1.1402
No log 6.9767 300 1.3671 0.3940 1.3671 1.1692
No log 7.0233 302 1.3880 0.3807 1.3880 1.1781
No log 7.0698 304 1.3160 0.3913 1.3160 1.1472
No log 7.1163 306 1.2022 0.3976 1.2022 1.0964
No log 7.1628 308 1.0726 0.4339 1.0726 1.0356
No log 7.2093 310 0.9709 0.4392 0.9709 0.9853
No log 7.2558 312 0.9224 0.4232 0.9224 0.9604
No log 7.3023 314 0.9418 0.4300 0.9418 0.9705
No log 7.3488 316 0.9944 0.4333 0.9944 0.9972
No log 7.3953 318 1.0204 0.4284 1.0204 1.0101
No log 7.4419 320 1.0094 0.4162 1.0094 1.0047
No log 7.4884 322 0.9885 0.4250 0.9885 0.9942
No log 7.5349 324 0.9852 0.4281 0.9852 0.9926
No log 7.5814 326 0.9988 0.4234 0.9988 0.9994
No log 7.6279 328 1.0493 0.4366 1.0493 1.0244
No log 7.6744 330 1.0901 0.4042 1.0901 1.0441
No log 7.7209 332 1.1185 0.4162 1.1185 1.0576
No log 7.7674 334 1.1117 0.4146 1.1117 1.0544
No log 7.8140 336 1.1227 0.4087 1.1227 1.0596
No log 7.8605 338 1.1114 0.4087 1.1114 1.0542
No log 7.9070 340 1.0498 0.4082 1.0498 1.0246
No log 7.9535 342 0.9602 0.4679 0.9602 0.9799
No log 8.0 344 0.9233 0.4624 0.9233 0.9609
No log 8.0465 346 0.9260 0.4624 0.9260 0.9623
No log 8.0930 348 0.9786 0.4340 0.9786 0.9892
No log 8.1395 350 1.0108 0.4292 1.0108 1.0054
No log 8.1860 352 1.0079 0.4307 1.0079 1.0039
No log 8.2326 354 0.9887 0.4263 0.9887 0.9944
No log 8.2791 356 0.9966 0.4263 0.9966 0.9983
No log 8.3256 358 1.0243 0.4263 1.0243 1.0121
No log 8.3721 360 1.0267 0.4307 1.0267 1.0133
No log 8.4186 362 1.0413 0.4145 1.0413 1.0204
No log 8.4651 364 1.0561 0.4145 1.0561 1.0277
No log 8.5116 366 1.1066 0.4146 1.1066 1.0519
No log 8.5581 368 1.1602 0.4104 1.1602 1.0771
No log 8.6047 370 1.1870 0.4005 1.1870 1.0895
No log 8.6512 372 1.1816 0.4162 1.1816 1.0870
No log 8.6977 374 1.1418 0.4146 1.1418 1.0686
No log 8.7442 376 1.1056 0.4146 1.1056 1.0515
No log 8.7907 378 1.0837 0.4145 1.0837 1.0410
No log 8.8372 380 1.0731 0.4145 1.0731 1.0359
No log 8.8837 382 1.0494 0.4145 1.0494 1.0244
No log 8.9302 384 1.0172 0.4401 1.0172 1.0086
No log 8.9767 386 1.0099 0.4462 1.0099 1.0049
No log 9.0233 388 1.0158 0.4354 1.0158 1.0079
No log 9.0698 390 1.0147 0.4263 1.0147 1.0073
No log 9.1163 392 1.0293 0.4263 1.0293 1.0146
No log 9.1628 394 1.0346 0.4115 1.0346 1.0171
No log 9.2093 396 1.0409 0.4175 1.0409 1.0202
No log 9.2558 398 1.0410 0.4175 1.0410 1.0203
No log 9.3023 400 1.0372 0.4174 1.0372 1.0184
No log 9.3488 402 1.0310 0.4174 1.0310 1.0154
No log 9.3953 404 1.0152 0.4174 1.0152 1.0076
No log 9.4419 406 0.9992 0.4174 0.9992 0.9996
No log 9.4884 408 0.9938 0.4175 0.9938 0.9969
No log 9.5349 410 0.9933 0.4175 0.9933 0.9967
No log 9.5814 412 0.9961 0.4175 0.9961 0.9980
No log 9.6279 414 1.0008 0.4175 1.0008 1.0004
No log 9.6744 416 0.9985 0.4175 0.9985 0.9993
No log 9.7209 418 0.9906 0.4175 0.9906 0.9953
No log 9.7674 420 0.9842 0.4234 0.9842 0.9921
No log 9.8140 422 0.9830 0.4234 0.9830 0.9915
No log 9.8605 424 0.9844 0.4234 0.9844 0.9922
No log 9.9070 426 0.9845 0.4234 0.9845 0.9922
No log 9.9535 428 0.9862 0.4234 0.9862 0.9931
No log 10.0 430 0.9868 0.4234 0.9868 0.9934

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
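
The checkpoint can be loaded directly with the versions listed above. A minimal inference sketch; treating the head as a single-output regressor is an assumption inferred from the Mse/Rmse metrics, and the input sentence is a placeholder:

```python
# Sketch of direct inference with this checkpoint; the single-logit
# regression head is an assumption, and the input text is a placeholder.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = ("MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_"
        "FineTuningAraBERT_run2_AugV5_k8_task2_organization")
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("نص عربي تجريبي", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print("predicted score:", logits.squeeze().item())
```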