ArabicNewSplits5_FineTuningAraBERT_run1_AugV5_k9_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0807
  • Qwk (quadratic weighted kappa): 0.6023
  • Mse (mean squared error): 1.0807
  • Rmse (root mean squared error): 1.0396
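The reported Mse and Rmse are consistent (Rmse = √Mse), and Qwk is quadratic weighted kappa, the standard agreement metric for ordinal scoring tasks like essay trait scoring. A minimal, dependency-free sketch of both metrics; the `qwk` helper below is illustrative and not the exact evaluation code behind this card:

```python
import math

def qwk(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: 1 minus the ratio of weighted observed
    disagreement to weighted expected disagreement, with quadratic
    weights ((i - j) / (n_classes - 1)) ** 2."""
    n_items = len(y_true)
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = ((i - j) ** 2) / ((n_classes - 1) ** 2)
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n_items
    return 1.0 - num / den

# Rmse is just the square root of Mse, matching the card's 1.0807 -> 1.0396.
rmse = math.sqrt(1.0807)
print(round(rmse, 4))  # -> 1.0396
```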

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
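With these settings the run performs 44 optimizer steps per epoch and 440 in total (matching the Step column of the training log). A linear scheduler decays the learning rate from its base value to zero over those steps. A minimal sketch of the resulting schedule, assuming the Trainer default of zero warmup steps; `linear_lr` is an illustrative helper, not API from the training code:

```python
def linear_lr(step, total_steps=440, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step: linear warmup (none by
    default here) followed by linear decay to zero at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
    return base_lr * remaining

print(linear_lr(0))    # -> 2e-05 (start of training)
print(linear_lr(220))  # -> 1e-05 (halfway)
print(linear_lr(440))  # -> 0.0   (end of training)
```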

Training results

Training loss is shown as "No log" throughout because the Trainer's logging interval (larger than the 440 total training steps at its default setting) was never reached.

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0455 2 2.3782 -0.0139 2.3782 1.5421
No log 0.0909 4 1.6288 0.1180 1.6288 1.2763
No log 0.1364 6 1.8303 -0.0135 1.8303 1.3529
No log 0.1818 8 1.5821 0.0776 1.5821 1.2578
No log 0.2273 10 1.4205 0.1093 1.4205 1.1919
No log 0.2727 12 1.3729 0.1057 1.3729 1.1717
No log 0.3182 14 1.3706 0.1266 1.3706 1.1707
No log 0.3636 16 1.3677 0.1236 1.3677 1.1695
No log 0.4091 18 1.3631 0.1236 1.3631 1.1675
No log 0.4545 20 1.3076 0.1528 1.3076 1.1435
No log 0.5 22 1.2840 0.1318 1.2840 1.1331
No log 0.5455 24 1.2730 0.1480 1.2730 1.1283
No log 0.5909 26 1.2397 0.1689 1.2397 1.1134
No log 0.6364 28 1.1984 0.2400 1.1984 1.0947
No log 0.6818 30 1.2203 0.3586 1.2203 1.1047
No log 0.7273 32 1.1729 0.2703 1.1729 1.0830
No log 0.7727 34 1.1824 0.2600 1.1824 1.0874
No log 0.8182 36 1.1950 0.2977 1.1950 1.0931
No log 0.8636 38 1.1636 0.3254 1.1636 1.0787
No log 0.9091 40 1.1712 0.3638 1.1712 1.0822
No log 0.9545 42 1.2118 0.3366 1.2118 1.1008
No log 1.0 44 1.1488 0.3920 1.1488 1.0718
No log 1.0455 46 1.0694 0.4301 1.0694 1.0341
No log 1.0909 48 1.0819 0.4887 1.0819 1.0402
No log 1.1364 50 1.0861 0.5033 1.0861 1.0422
No log 1.1818 52 1.0110 0.5028 1.0110 1.0055
No log 1.2273 54 1.0184 0.4387 1.0184 1.0092
No log 1.2727 56 1.0209 0.4722 1.0209 1.0104
No log 1.3182 58 1.0343 0.4342 1.0343 1.0170
No log 1.3636 60 1.0062 0.4665 1.0062 1.0031
No log 1.4091 62 0.9834 0.4546 0.9834 0.9916
No log 1.4545 64 0.9666 0.5143 0.9666 0.9832
No log 1.5 66 0.9431 0.5433 0.9431 0.9711
No log 1.5455 68 0.9029 0.5691 0.9029 0.9502
No log 1.5909 70 0.9086 0.5353 0.9086 0.9532
No log 1.6364 72 1.0067 0.5331 1.0067 1.0033
No log 1.6818 74 1.1783 0.4758 1.1783 1.0855
No log 1.7273 76 1.2182 0.4818 1.2182 1.1037
No log 1.7727 78 1.0649 0.4638 1.0649 1.0320
No log 1.8182 80 0.9183 0.5470 0.9183 0.9583
No log 1.8636 82 0.8417 0.5777 0.8417 0.9175
No log 1.9091 84 0.8389 0.6186 0.8389 0.9159
No log 1.9545 86 0.8741 0.5682 0.8741 0.9349
No log 2.0 88 1.0944 0.4767 1.0944 1.0462
No log 2.0455 90 1.2999 0.4952 1.2999 1.1401
No log 2.0909 92 1.2248 0.4971 1.2248 1.1067
No log 2.1364 94 0.9591 0.5751 0.9591 0.9794
No log 2.1818 96 0.9043 0.6323 0.9043 0.9510
No log 2.2273 98 1.0323 0.5761 1.0323 1.0160
No log 2.2727 100 1.2539 0.5699 1.2539 1.1198
No log 2.3182 102 1.3286 0.5003 1.3286 1.1527
No log 2.3636 104 1.1887 0.5974 1.1887 1.0903
No log 2.4091 106 1.0535 0.6351 1.0535 1.0264
No log 2.4545 108 1.0216 0.6366 1.0216 1.0107
No log 2.5 110 1.0532 0.6122 1.0532 1.0263
No log 2.5455 112 1.0227 0.6122 1.0227 1.0113
No log 2.5909 114 1.0320 0.6025 1.0320 1.0159
No log 2.6364 116 1.0099 0.6360 1.0099 1.0049
No log 2.6818 118 1.0099 0.6503 1.0099 1.0049
No log 2.7273 120 1.0973 0.6326 1.0973 1.0475
No log 2.7727 122 1.2503 0.5616 1.2503 1.1182
No log 2.8182 124 1.1955 0.6105 1.1955 1.0934
No log 2.8636 126 1.0957 0.6136 1.0957 1.0467
No log 2.9091 128 1.1733 0.5988 1.1733 1.0832
No log 2.9545 130 1.3187 0.5365 1.3187 1.1484
No log 3.0 132 1.3368 0.5237 1.3368 1.1562
No log 3.0455 134 1.3046 0.5441 1.3046 1.1422
No log 3.0909 136 1.1978 0.5879 1.1978 1.0944
No log 3.1364 138 1.0664 0.6197 1.0664 1.0327
No log 3.1818 140 0.9574 0.6221 0.9574 0.9785
No log 3.2273 142 0.9029 0.6101 0.9029 0.9502
No log 3.2727 144 0.9462 0.6438 0.9462 0.9727
No log 3.3182 146 1.1016 0.6076 1.1016 1.0496
No log 3.3636 148 1.3854 0.5151 1.3854 1.1770
No log 3.4091 150 1.7277 0.4833 1.7277 1.3144
No log 3.4545 152 1.7263 0.4712 1.7263 1.3139
No log 3.5 154 1.4541 0.5300 1.4541 1.2059
No log 3.5455 156 1.1505 0.6179 1.1505 1.0726
No log 3.5909 158 1.1121 0.6140 1.1121 1.0546
No log 3.6364 160 1.2577 0.5836 1.2577 1.1215
No log 3.6818 162 1.4410 0.5232 1.4410 1.2004
No log 3.7273 164 1.4621 0.5156 1.4621 1.2092
No log 3.7727 166 1.3953 0.5284 1.3953 1.1812
No log 3.8182 168 1.2942 0.5820 1.2942 1.1376
No log 3.8636 170 1.1317 0.6450 1.1317 1.0638
No log 3.9091 172 1.1562 0.6423 1.1562 1.0753
No log 3.9545 174 1.3228 0.5921 1.3228 1.1501
No log 4.0 176 1.6097 0.5373 1.6097 1.2687
No log 4.0455 178 1.7505 0.5287 1.7505 1.3231
No log 4.0909 180 1.6314 0.5341 1.6314 1.2772
No log 4.1364 182 1.3482 0.5856 1.3482 1.1611
No log 4.1818 184 1.0397 0.6179 1.0397 1.0197
No log 4.2273 186 0.9170 0.6186 0.9170 0.9576
No log 4.2727 188 0.9014 0.6359 0.9014 0.9494
No log 4.3182 190 0.9558 0.6313 0.9558 0.9777
No log 4.3636 192 1.0485 0.6016 1.0485 1.0240
No log 4.4091 194 1.0619 0.5745 1.0619 1.0305
No log 4.4545 196 1.1737 0.5619 1.1737 1.0834
No log 4.5 198 1.2747 0.5403 1.2747 1.1290
No log 4.5455 200 1.2166 0.5652 1.2166 1.1030
No log 4.5909 202 1.0982 0.6026 1.0982 1.0479
No log 4.6364 204 0.9346 0.6320 0.9346 0.9667
No log 4.6818 206 0.9160 0.6394 0.9160 0.9571
No log 4.7273 208 1.0170 0.6340 1.0170 1.0085
No log 4.7727 210 1.2562 0.5804 1.2562 1.1208
No log 4.8182 212 1.5491 0.5277 1.5491 1.2446
No log 4.8636 214 1.5737 0.5261 1.5737 1.2545
No log 4.9091 216 1.3874 0.5543 1.3874 1.1779
No log 4.9545 218 1.0914 0.6046 1.0914 1.0447
No log 5.0 220 0.9669 0.5915 0.9669 0.9833
No log 5.0455 222 0.8982 0.6220 0.8982 0.9477
No log 5.0909 224 0.9317 0.6160 0.9317 0.9653
No log 5.1364 226 1.0869 0.6185 1.0869 1.0426
No log 5.1818 228 1.2900 0.5785 1.2900 1.1358
No log 5.2273 230 1.2820 0.5892 1.2820 1.1323
No log 5.2727 232 1.2607 0.5814 1.2607 1.1228
No log 5.3182 234 1.2403 0.5745 1.2403 1.1137
No log 5.3636 236 1.1039 0.5959 1.1039 1.0507
No log 5.4091 238 1.0527 0.6052 1.0527 1.0260
No log 5.4545 240 1.1265 0.5799 1.1265 1.0614
No log 5.5 242 1.1566 0.5756 1.1566 1.0754
No log 5.5455 244 1.1445 0.5787 1.1445 1.0698
No log 5.5909 246 1.0894 0.5828 1.0894 1.0438
No log 5.6364 248 0.9980 0.6342 0.9980 0.9990
No log 5.6818 250 0.9964 0.6457 0.9964 0.9982
No log 5.7273 252 1.0755 0.6327 1.0755 1.0371
No log 5.7727 254 1.2582 0.5964 1.2582 1.1217
No log 5.8182 256 1.5165 0.5429 1.5165 1.2315
No log 5.8636 258 1.6139 0.5161 1.6139 1.2704
No log 5.9091 260 1.5518 0.5256 1.5518 1.2457
No log 5.9545 262 1.3467 0.5192 1.3467 1.1605
No log 6.0 264 1.1907 0.5487 1.1907 1.0912
No log 6.0455 266 1.1207 0.5479 1.1207 1.0586
No log 6.0909 268 1.1447 0.5208 1.1447 1.0699
No log 6.1364 270 1.2493 0.5103 1.2493 1.1177
No log 6.1818 272 1.3053 0.5236 1.3053 1.1425
No log 6.2273 274 1.3318 0.5232 1.3318 1.1540
No log 6.2727 276 1.2773 0.5382 1.2773 1.1302
No log 6.3182 278 1.1458 0.5954 1.1458 1.0704
No log 6.3636 280 1.0361 0.5912 1.0361 1.0179
No log 6.4091 282 1.0376 0.5905 1.0376 1.0186
No log 6.4545 284 1.1162 0.5787 1.1162 1.0565
No log 6.5 286 1.2217 0.5335 1.2217 1.1053
No log 6.5455 288 1.3058 0.5385 1.3058 1.1427
No log 6.5909 290 1.3228 0.5232 1.3228 1.1501
No log 6.6364 292 1.3353 0.5191 1.3353 1.1555
No log 6.6818 294 1.2755 0.5285 1.2755 1.1294
No log 6.7273 296 1.1878 0.5300 1.1878 1.0899
No log 6.7727 298 1.0888 0.5961 1.0888 1.0434
No log 6.8182 300 1.0601 0.5882 1.0601 1.0296
No log 6.8636 302 1.1008 0.5995 1.1008 1.0492
No log 6.9091 304 1.1837 0.5954 1.1837 1.0880
No log 6.9545 306 1.2372 0.5868 1.2372 1.1123
No log 7.0 308 1.2452 0.5814 1.2452 1.1159
No log 7.0455 310 1.1766 0.6060 1.1766 1.0847
No log 7.0909 312 1.1622 0.5858 1.1622 1.0781
No log 7.1364 314 1.1139 0.5910 1.1139 1.0554
No log 7.1818 316 1.0510 0.5878 1.0510 1.0252
No log 7.2273 318 1.0640 0.5942 1.0640 1.0315
No log 7.2727 320 1.0540 0.5942 1.0540 1.0267
No log 7.3182 322 1.0898 0.6003 1.0898 1.0439
No log 7.3636 324 1.1260 0.5980 1.1260 1.0611
No log 7.4091 326 1.1939 0.5714 1.1939 1.0927
No log 7.4545 328 1.2273 0.5505 1.2273 1.1078
No log 7.5 330 1.1886 0.5561 1.1886 1.0902
No log 7.5455 332 1.1004 0.5928 1.1004 1.0490
No log 7.5909 334 1.0136 0.6159 1.0136 1.0068
No log 7.6364 336 0.9678 0.6273 0.9678 0.9837
No log 7.6818 338 0.9818 0.6361 0.9818 0.9908
No log 7.7273 340 1.0402 0.6077 1.0402 1.0199
No log 7.7727 342 1.1016 0.5817 1.1016 1.0496
No log 7.8182 344 1.1610 0.5652 1.1610 1.0775
No log 7.8636 346 1.2086 0.5416 1.2086 1.0994
No log 7.9091 348 1.1915 0.5392 1.1915 1.0915
No log 7.9545 350 1.1430 0.5704 1.1430 1.0691
No log 8.0 352 1.0824 0.5962 1.0824 1.0404
No log 8.0455 354 1.0143 0.6142 1.0143 1.0071
No log 8.0909 356 0.9624 0.6459 0.9624 0.9810
No log 8.1364 358 0.9714 0.6251 0.9714 0.9856
No log 8.1818 360 0.9649 0.6429 0.9649 0.9823
No log 8.2273 362 0.9781 0.6284 0.9781 0.9890
No log 8.2727 364 1.0260 0.6076 1.0260 1.0129
No log 8.3182 366 1.0617 0.5994 1.0617 1.0304
No log 8.3636 368 1.0656 0.5869 1.0656 1.0323
No log 8.4091 370 1.0802 0.5869 1.0802 1.0393
No log 8.4545 372 1.1107 0.5858 1.1107 1.0539
No log 8.5 374 1.1528 0.5858 1.1528 1.0737
No log 8.5455 376 1.1560 0.5858 1.1560 1.0752
No log 8.5909 378 1.1238 0.5858 1.1238 1.0601
No log 8.6364 380 1.0874 0.6011 1.0874 1.0428
No log 8.6818 382 1.0795 0.6064 1.0795 1.0390
No log 8.7273 384 1.0968 0.6052 1.0968 1.0473
No log 8.7727 386 1.1374 0.6011 1.1374 1.0665
No log 8.8182 388 1.1766 0.5735 1.1766 1.0847
No log 8.8636 390 1.1930 0.5715 1.1930 1.0922
No log 8.9091 392 1.1985 0.5715 1.1985 1.0947
No log 8.9545 394 1.1758 0.5695 1.1758 1.0843
No log 9.0 396 1.1443 0.5815 1.1443 1.0697
No log 9.0455 398 1.1134 0.5968 1.1134 1.0552
No log 9.0909 400 1.0889 0.6092 1.0889 1.0435
No log 9.1364 402 1.0926 0.6080 1.0926 1.0453
No log 9.1818 404 1.0951 0.5980 1.0951 1.0465
No log 9.2273 406 1.0890 0.6092 1.0890 1.0436
No log 9.2727 408 1.0909 0.6092 1.0909 1.0445
No log 9.3182 410 1.1031 0.6011 1.1031 1.0503
No log 9.3636 412 1.1054 0.6011 1.1054 1.0514
No log 9.4091 414 1.1104 0.6011 1.1104 1.0538
No log 9.4545 416 1.1083 0.6011 1.1083 1.0528
No log 9.5 418 1.1050 0.6011 1.1050 1.0512
No log 9.5455 420 1.0983 0.6023 1.0983 1.0480
No log 9.5909 422 1.0863 0.6023 1.0863 1.0423
No log 9.6364 424 1.0713 0.6023 1.0713 1.0350
No log 9.6818 426 1.0658 0.6064 1.0658 1.0324
No log 9.7273 428 1.0648 0.6064 1.0648 1.0319
No log 9.7727 430 1.0657 0.6132 1.0657 1.0323
No log 9.8182 432 1.0704 0.6132 1.0704 1.0346
No log 9.8636 434 1.0744 0.6132 1.0744 1.0366
No log 9.9091 436 1.0795 0.6064 1.0795 1.0390
No log 9.9545 438 1.0808 0.6023 1.0808 1.0396
No log 10.0 440 1.0807 0.6023 1.0807 1.0396
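Validation Qwk peaks mid-training (0.6503 at epoch 2.68, step 118) and drifts down to 0.6023 by epoch 10, so selecting the checkpoint with the best validation Qwk rather than the final one would be worthwhile. A small sketch using a few rows copied from the log above:

```python
# (epoch, validation Qwk) pairs copied from a few rows of the log above.
log = [
    (0.0455, -0.0139),
    (1.0, 0.3920),
    (2.6818, 0.6503),
    (5.6818, 0.6457),
    (8.0909, 0.6459),
    (10.0, 0.6023),
]

# Pick the checkpoint with the highest validation Qwk.
best_epoch, best_qwk = max(log, key=lambda row: row[1])
print(best_epoch, best_qwk)  # -> 2.6818 0.6503
```

In a Trainer-based setup this post-hoc selection is usually replaced by `load_best_model_at_end=True` together with `metric_for_best_model` set to the Qwk metric's reported name (an assumption here, since the card does not show the training script).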

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1