ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset field of this auto-generated card was not filled in). It achieves the following results on the evaluation set:

  • Loss: 0.8020
  • Qwk (quadratic weighted kappa): 0.6716
  • Mse: 0.8020
  • Rmse: 0.8956
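
Qwk is the quadratic weighted kappa between predicted and gold scores, and Rmse is simply the square root of Mse (note that Loss equals Mse, consistent with a mean-squared-error training objective). A self-contained sketch of both metrics, assuming integer score labels; the function names are illustrative, not taken from the training code:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, num_labels):
    """Cohen's kappa with quadratic weights over integer score labels."""
    n = len(y_true)
    # Observed confusion matrix.
    obs = [[0.0] * num_labels for _ in range(num_labels)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Expected matrix from the marginal label frequencies.
    tc, pc = Counter(y_true), Counter(y_pred)
    exp = [[tc[i] * pc[j] / n for j in range(num_labels)]
           for i in range(num_labels)]
    # Quadratic disagreement weights: 0 on the diagonal, largest in the corners.
    w = [[(i - j) ** 2 / (num_labels - 1) ** 2 for j in range(num_labels)]
         for i in range(num_labels)]
    disagreement = sum(w[i][j] * obs[i][j]
                       for i in range(num_labels) for j in range(num_labels))
    chance = sum(w[i][j] * exp[i][j]
                 for i in range(num_labels) for j in range(num_labels))
    return 1.0 - disagreement / chance

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Perfect agreement gives a kappa of 1.0, chance-level agreement gives 0, and worse-than-chance agreement goes negative (as in a few early epochs of the table below).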

Model description

More information needed

Intended uses & limitations

More information needed
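
Although the card does not state intended uses, the checkpoint can be loaded with the standard transformers sequence-classification API. A minimal inference sketch (the input string is a placeholder, not an example from the training data):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = ("MayBashendy/"
           "ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task1_organization")

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Placeholder input; the model expects Arabic essay text.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # organization score(s) for task 1
```

Since the reported Loss equals the Mse, the head is most likely a single-output regression head, in which case `logits` holds one predicted organization score; this is inferred from the metrics, not stated in the card.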

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
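
These settings map directly onto the Hugging Face Trainer API. A sketch of the equivalent TrainingArguments; `output_dir` and any warmup, logging, or checkpointing settings are assumptions, as the card does not list them:

```python
from transformers import TrainingArguments

# Hyperparameters as listed above; output_dir is an assumption.
training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```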

Training results

"No log" in the Training Loss column means no training-loss value had been logged yet at that point; the first logged value appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0417 2 7.1475 0.0057 7.1475 2.6735
No log 0.0833 4 4.5387 0.0500 4.5387 2.1304
No log 0.125 6 3.7582 -0.0729 3.7582 1.9386
No log 0.1667 8 2.7595 0.0667 2.7595 1.6612
No log 0.2083 10 1.9346 0.0855 1.9346 1.3909
No log 0.25 12 1.8941 0.1754 1.8941 1.3763
No log 0.2917 14 2.0940 0.1311 2.0940 1.4471
No log 0.3333 16 1.9429 0.1913 1.9429 1.3939
No log 0.375 18 1.6877 0.0943 1.6877 1.2991
No log 0.4167 20 1.6567 0.1495 1.6567 1.2871
No log 0.4583 22 1.8104 0.2143 1.8104 1.3455
No log 0.5 24 1.7755 0.2632 1.7755 1.3325
No log 0.5417 26 1.9248 0.3231 1.9248 1.3874
No log 0.5833 28 2.2889 0.0828 2.2889 1.5129
No log 0.625 30 2.1792 0.1690 2.1792 1.4762
No log 0.6667 32 1.9094 0.2985 1.9094 1.3818
No log 0.7083 34 1.7320 0.3833 1.7320 1.3160
No log 0.75 36 1.7472 0.3810 1.7472 1.3218
No log 0.7917 38 1.6219 0.4065 1.6219 1.2735
No log 0.8333 40 1.6215 0.4252 1.6215 1.2734
No log 0.875 42 1.6521 0.4219 1.6521 1.2853
No log 0.9167 44 1.7211 0.3817 1.7211 1.3119
No log 0.9583 46 1.9497 0.2609 1.9497 1.3963
No log 1.0 48 2.2094 0.1733 2.2094 1.4864
No log 1.0417 50 2.3753 0.1429 2.3753 1.5412
No log 1.0833 52 2.2589 0.1733 2.2589 1.5030
No log 1.125 54 1.7853 0.3741 1.7853 1.3361
No log 1.1667 56 1.3593 0.4516 1.3593 1.1659
No log 1.2083 58 1.2124 0.4959 1.2124 1.1011
No log 1.25 60 1.1729 0.5082 1.1729 1.0830
No log 1.2917 62 1.2108 0.5041 1.2108 1.1003
No log 1.3333 64 1.3913 0.3876 1.3913 1.1796
No log 1.375 66 1.3991 0.3788 1.3991 1.1828
No log 1.4167 68 1.3059 0.5113 1.3059 1.1427
No log 1.4583 70 1.1977 0.4921 1.1977 1.0944
No log 1.5 72 1.1296 0.4706 1.1296 1.0628
No log 1.5417 74 1.1220 0.5124 1.1220 1.0593
No log 1.5833 76 1.1386 0.5 1.1386 1.0671
No log 1.625 78 1.2571 0.48 1.2571 1.1212
No log 1.6667 80 1.7051 0.3308 1.7051 1.3058
No log 1.7083 82 1.5068 0.3492 1.5068 1.2275
No log 1.75 84 1.3492 0.4463 1.3492 1.1615
No log 1.7917 86 1.4552 0.4160 1.4552 1.2063
No log 1.8333 88 1.4136 0.4320 1.4136 1.1890
No log 1.875 90 1.2731 0.4228 1.2731 1.1283
No log 1.9167 92 1.2191 0.4724 1.2191 1.1041
No log 1.9583 94 1.1858 0.4480 1.1858 1.0890
No log 2.0 96 1.0245 0.4576 1.0245 1.0122
No log 2.0417 98 1.0158 0.4538 1.0158 1.0079
No log 2.0833 100 1.0519 0.5366 1.0519 1.0256
No log 2.125 102 1.2391 0.5522 1.2391 1.1132
No log 2.1667 104 1.3330 0.5217 1.3330 1.1545
No log 2.2083 106 1.2862 0.5075 1.2862 1.1341
No log 2.25 108 1.2394 0.5 1.2394 1.1133
No log 2.2917 110 1.0682 0.4715 1.0682 1.0336
No log 2.3333 112 1.0672 0.4839 1.0672 1.0330
No log 2.375 114 1.0713 0.5238 1.0713 1.0350
No log 2.4167 116 1.0957 0.528 1.0957 1.0468
No log 2.4583 118 1.1114 0.512 1.1114 1.0542
No log 2.5 120 1.1795 0.4355 1.1795 1.0861
No log 2.5417 122 1.3321 0.4848 1.3321 1.1542
No log 2.5833 124 1.2781 0.4531 1.2781 1.1305
No log 2.625 126 1.1176 0.4724 1.1176 1.0572
No log 2.6667 128 1.0868 0.5385 1.0868 1.0425
No log 2.7083 130 1.1198 0.4961 1.1198 1.0582
No log 2.75 132 1.1447 0.5038 1.1447 1.0699
No log 2.7917 134 1.2781 0.5113 1.2781 1.1305
No log 2.8333 136 1.3161 0.4361 1.3161 1.1472
No log 2.875 138 1.4624 0.4672 1.4624 1.2093
No log 2.9167 140 1.4755 0.4823 1.4755 1.2147
No log 2.9583 142 1.2729 0.5191 1.2729 1.1282
No log 3.0 144 1.0645 0.5385 1.0645 1.0318
No log 3.0417 146 1.0647 0.5397 1.0647 1.0318
No log 3.0833 148 1.3210 0.4375 1.3210 1.1494
No log 3.125 150 1.8014 0.3134 1.8014 1.3421
No log 3.1667 152 1.7042 0.3433 1.7042 1.3055
No log 3.2083 154 1.2605 0.4252 1.2605 1.1227
No log 3.25 156 1.0463 0.5581 1.0463 1.0229
No log 3.2917 158 1.0298 0.5692 1.0298 1.0148
No log 3.3333 160 1.0421 0.5354 1.0421 1.0208
No log 3.375 162 1.1108 0.5238 1.1108 1.0539
No log 3.4167 164 1.3262 0.5152 1.3262 1.1516
No log 3.4583 166 1.5974 0.4218 1.5974 1.2639
No log 3.5 168 1.6062 0.4487 1.6062 1.2674
No log 3.5417 170 1.4840 0.4507 1.4840 1.2182
No log 3.5833 172 1.3711 0.4493 1.3711 1.1709
No log 3.625 174 1.5443 0.3623 1.5443 1.2427
No log 3.6667 176 1.6465 0.3741 1.6465 1.2832
No log 3.7083 178 1.8321 0.3165 1.8321 1.3535
No log 3.75 180 1.6930 0.3212 1.6930 1.3011
No log 3.7917 182 1.6185 0.3212 1.6185 1.2722
No log 3.8333 184 1.3379 0.4627 1.3379 1.1567
No log 3.875 186 1.1822 0.5469 1.1822 1.0873
No log 3.9167 188 1.1465 0.528 1.1465 1.0708
No log 3.9583 190 1.1991 0.4885 1.1991 1.0950
No log 4.0 192 1.4158 0.4571 1.4158 1.1899
No log 4.0417 194 1.5754 0.4658 1.5754 1.2551
No log 4.0833 196 1.7055 0.4359 1.7055 1.3059
No log 4.125 198 1.3111 0.4722 1.3111 1.1450
No log 4.1667 200 0.9901 0.5926 0.9901 0.9950
No log 4.2083 202 0.9748 0.6324 0.9748 0.9873
No log 4.25 204 1.1694 0.5038 1.1694 1.0814
No log 4.2917 206 1.1761 0.5038 1.1761 1.0845
No log 4.3333 208 0.9501 0.6519 0.9501 0.9747
No log 4.375 210 0.8746 0.6286 0.8746 0.9352
No log 4.4167 212 0.9481 0.625 0.9481 0.9737
No log 4.4583 214 0.9076 0.6331 0.9076 0.9527
No log 4.5 216 0.8859 0.6765 0.8859 0.9412
No log 4.5417 218 0.9920 0.5926 0.9920 0.9960
No log 4.5833 220 1.1178 0.5455 1.1178 1.0573
No log 4.625 222 1.2028 0.4965 1.2028 1.0967
No log 4.6667 224 1.0371 0.6286 1.0371 1.0184
No log 4.7083 226 0.9016 0.6286 0.9016 0.9495
No log 4.75 228 0.9495 0.6377 0.9495 0.9744
No log 4.7917 230 0.9904 0.6423 0.9904 0.9952
No log 4.8333 232 0.9565 0.6475 0.9565 0.9780
No log 4.875 234 0.9486 0.5354 0.9486 0.9740
No log 4.9167 236 1.0847 0.5303 1.0847 1.0415
No log 4.9583 238 1.1789 0.5522 1.1789 1.0858
No log 5.0 240 1.1450 0.5401 1.1450 1.0700
No log 5.0417 242 1.0267 0.5821 1.0267 1.0133
No log 5.0833 244 0.9350 0.5581 0.9350 0.9670
No log 5.125 246 0.9180 0.5846 0.9180 0.9581
No log 5.1667 248 0.9529 0.6119 0.9529 0.9762
No log 5.2083 250 1.0031 0.5606 1.0031 1.0015
No log 5.25 252 1.0270 0.5846 1.0270 1.0134
No log 5.2917 254 1.0205 0.5736 1.0205 1.0102
No log 5.3333 256 0.9793 0.5692 0.9793 0.9896
No log 5.375 258 1.0338 0.5669 1.0338 1.0168
No log 5.4167 260 1.0885 0.5736 1.0885 1.0433
No log 5.4583 262 1.0381 0.6324 1.0381 1.0189
No log 5.5 264 0.9983 0.5802 0.9983 0.9991
No log 5.5417 266 0.9803 0.6061 0.9803 0.9901
No log 5.5833 268 0.9669 0.6222 0.9669 0.9833
No log 5.625 270 0.9409 0.6423 0.9409 0.9700
No log 5.6667 272 0.9504 0.6143 0.9504 0.9749
No log 5.7083 274 0.9298 0.6331 0.9298 0.9642
No log 5.75 276 0.9224 0.6479 0.9224 0.9604
No log 5.7917 278 0.9210 0.6429 0.9210 0.9597
No log 5.8333 280 1.0602 0.5850 1.0602 1.0297
No log 5.875 282 1.2182 0.5333 1.2182 1.1037
No log 5.9167 284 1.2053 0.5442 1.2053 1.0979
No log 5.9583 286 1.0630 0.5672 1.0630 1.0310
No log 6.0 288 1.0003 0.5161 1.0003 1.0001
No log 6.0417 290 1.0029 0.5161 1.0029 1.0015
No log 6.0833 292 0.9824 0.512 0.9824 0.9912
No log 6.125 294 1.0289 0.5564 1.0289 1.0143
No log 6.1667 296 1.0837 0.5882 1.0837 1.0410
No log 6.2083 298 1.0317 0.5821 1.0317 1.0157
No log 6.25 300 0.9389 0.5736 0.9389 0.9690
No log 6.2917 302 0.9421 0.6471 0.9421 0.9706
No log 6.3333 304 0.9498 0.6812 0.9498 0.9746
No log 6.375 306 0.9504 0.6119 0.9504 0.9749
No log 6.4167 308 0.9431 0.6423 0.9431 0.9711
No log 6.4583 310 0.9361 0.6107 0.9361 0.9675
No log 6.5 312 0.9337 0.6000 0.9337 0.9663
No log 6.5417 314 0.9130 0.5938 0.9130 0.9555
No log 6.5833 316 0.9111 0.6154 0.9111 0.9545
No log 6.625 318 0.9506 0.5581 0.9506 0.9750
No log 6.6667 320 1.1066 0.5781 1.1066 1.0519
No log 6.7083 322 1.2252 0.4697 1.2252 1.1069
No log 6.75 324 1.1779 0.5455 1.1779 1.0853
No log 6.7917 326 1.0100 0.5846 1.0100 1.0050
No log 6.8333 328 0.9697 0.5538 0.9697 0.9847
No log 6.875 330 1.0443 0.5649 1.0443 1.0219
No log 6.9167 332 1.1239 0.5426 1.1239 1.0601
No log 6.9583 334 1.0675 0.5538 1.0675 1.0332
No log 7.0 336 0.9835 0.5692 0.9835 0.9917
No log 7.0417 338 0.9737 0.5736 0.9737 0.9868
No log 7.0833 340 0.9925 0.6165 0.9925 0.9963
No log 7.125 342 0.9899 0.5781 0.9899 0.9949
No log 7.1667 344 0.9716 0.5781 0.9716 0.9857
No log 7.2083 346 0.9403 0.5891 0.9403 0.9697
No log 7.25 348 0.9527 0.5846 0.9527 0.9761
No log 7.2917 350 0.9431 0.5538 0.9431 0.9712
No log 7.3333 352 0.8686 0.6107 0.8686 0.9320
No log 7.375 354 0.8382 0.6107 0.8382 0.9155
No log 7.4167 356 0.8053 0.6107 0.8053 0.8974
No log 7.4583 358 0.8205 0.6 0.8205 0.9058
No log 7.5 360 0.8303 0.5736 0.8303 0.9112
No log 7.5417 362 0.8673 0.5469 0.8673 0.9313
No log 7.5833 364 0.8757 0.5469 0.8757 0.9358
No log 7.625 366 0.8919 0.5397 0.8919 0.9444
No log 7.6667 368 0.8479 0.5781 0.8479 0.9208
No log 7.7083 370 0.8046 0.6515 0.8046 0.8970
No log 7.75 372 0.7988 0.6316 0.7988 0.8937
No log 7.7917 374 0.7906 0.6667 0.7906 0.8892
No log 7.8333 376 0.7650 0.7143 0.7650 0.8746
No log 7.875 378 0.7527 0.7260 0.7527 0.8676
No log 7.9167 380 0.7593 0.7397 0.7593 0.8714
No log 7.9583 382 0.7937 0.7042 0.7937 0.8909
No log 8.0 384 0.8249 0.6906 0.8249 0.9082
No log 8.0417 386 0.8179 0.7059 0.8179 0.9044
No log 8.0833 388 0.8194 0.6715 0.8194 0.9052
No log 8.125 390 0.8248 0.6667 0.8248 0.9082
No log 8.1667 392 0.8125 0.7153 0.8125 0.9014
No log 8.2083 394 0.7892 0.7153 0.7892 0.8884
No log 8.25 396 0.7856 0.6957 0.7856 0.8864
No log 8.2917 398 0.7902 0.6957 0.7902 0.8890
No log 8.3333 400 0.8042 0.6765 0.8042 0.8968
No log 8.375 402 0.7897 0.7007 0.7897 0.8886
No log 8.4167 404 0.8011 0.6912 0.8011 0.8951
No log 8.4583 406 0.8475 0.6165 0.8475 0.9206
No log 8.5 408 0.9175 0.6269 0.9175 0.9579
No log 8.5417 410 0.9465 0.6107 0.9465 0.9729
No log 8.5833 412 0.9042 0.6 0.9042 0.9509
No log 8.625 414 0.9159 0.6212 0.9159 0.9570
No log 8.6667 416 0.9994 0.5882 0.9994 0.9997
No log 8.7083 418 1.0306 0.5797 1.0306 1.0152
No log 8.75 420 0.9958 0.5899 0.9958 0.9979
No log 8.7917 422 0.8805 0.6131 0.8805 0.9384
No log 8.8333 424 0.7864 0.6567 0.7864 0.8868
No log 8.875 426 0.7636 0.7101 0.7636 0.8738
No log 8.9167 428 0.8121 0.6567 0.8121 0.9012
No log 8.9583 430 0.9851 0.6131 0.9851 0.9925
No log 9.0 432 1.2736 0.6282 1.2736 1.1285
No log 9.0417 434 1.3338 0.5385 1.3338 1.1549
No log 9.0833 436 1.1024 0.5714 1.1024 1.0500
No log 9.125 438 0.9021 0.6316 0.9021 0.9498
No log 9.1667 440 0.8778 0.6 0.8778 0.9369
No log 9.2083 442 0.8606 0.6767 0.8606 0.9277
No log 9.25 444 0.8119 0.6815 0.8119 0.9010
No log 9.2917 446 0.7896 0.6912 0.7896 0.8886
No log 9.3333 448 0.7783 0.6912 0.7783 0.8822
No log 9.375 450 0.7588 0.6716 0.7588 0.8711
No log 9.4167 452 0.7508 0.6716 0.7508 0.8665
No log 9.4583 454 0.7617 0.6617 0.7617 0.8727
No log 9.5 456 0.8128 0.6765 0.8128 0.9015
No log 9.5417 458 0.8206 0.6667 0.8206 0.9059
No log 9.5833 460 0.9060 0.6165 0.9060 0.9518
No log 9.625 462 0.9583 0.6525 0.9583 0.9789
No log 9.6667 464 0.8824 0.6475 0.8824 0.9394
No log 9.7083 466 0.7621 0.6866 0.7621 0.8730
No log 9.75 468 0.7432 0.6567 0.7432 0.8621
No log 9.7917 470 0.7623 0.6567 0.7623 0.8731
No log 9.8333 472 0.8109 0.6269 0.8109 0.9005
No log 9.875 474 0.8838 0.6715 0.8838 0.9401
No log 9.9167 476 1.0444 0.5942 1.0444 1.0219
No log 9.9583 478 1.2162 0.5634 1.2162 1.1028
No log 10.0 480 1.3808 0.4865 1.3808 1.1751
No log 10.0417 482 1.2546 0.5442 1.2546 1.1201
No log 10.0833 484 0.9457 0.6286 0.9457 0.9725
No log 10.125 486 0.8187 0.6364 0.8187 0.9048
No log 10.1667 488 0.8197 0.6866 0.8197 0.9054
No log 10.2083 490 0.8250 0.6815 0.8250 0.9083
No log 10.25 492 0.8818 0.6618 0.8818 0.9391
No log 10.2917 494 0.9350 0.5649 0.9350 0.9670
No log 10.3333 496 0.9033 0.6569 0.9033 0.9504
No log 10.375 498 0.8597 0.6667 0.8597 0.9272
0.4275 10.4167 500 0.8182 0.7007 0.8182 0.9045
0.4275 10.4583 502 0.7990 0.7007 0.7990 0.8939
0.4275 10.5 504 0.7903 0.7153 0.7903 0.8890
0.4275 10.5417 506 0.7894 0.6667 0.7894 0.8885
0.4275 10.5833 508 0.8128 0.6519 0.8128 0.9016
0.4275 10.625 510 0.8696 0.6197 0.8696 0.9325
0.4275 10.6667 512 0.8596 0.6074 0.8596 0.9271
0.4275 10.7083 514 0.8020 0.6716 0.8020 0.8956

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task1_organization

This model was fine-tuned from aubmindlab/bert-base-arabertv02.