ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (≈0.1B parameters, stored as F32 safetensors). The fine-tuning dataset is not documented in this card. The model achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.7333
  • Qwk: 0.5050
  • Mse: 0.7333
  • Rmse: 0.8563
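
Here, Qwk is the quadratic weighted Cohen's kappa and Rmse is the square root of Mse; the loss equals Mse exactly, which suggests the model was trained as a regressor with an MSE objective. Below is a minimal sketch of computing all three metrics with scikit-learn; rounding predictions to integer labels before the kappa is an assumption, since the card does not show its evaluation code.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true, y_pred):
    """Quadratic weighted kappa, MSE, and RMSE, as reported above."""
    mse = mean_squared_error(y_true, y_pred)
    return {
        # cohen_kappa_score needs discrete labels, so continuous
        # predictions are rounded here (an assumption about this run).
        "qwk": cohen_kappa_score(
            np.rint(y_true).astype(int),
            np.rint(y_pred).astype(int),
            weights="quadratic",
        ),
        "mse": mse,
        "rmse": float(np.sqrt(mse)),
    }

print(compute_metrics([0, 1, 2, 3], [0.2, 1.1, 1.8, 2.4]))
```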

Model description

More information needed

Intended uses & limitations

More information needed
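
Although the intended use is undocumented, the base model and the regression-style metrics point to scoring Arabic text (the repository name suggests scoring the "organization" trait of essays). A minimal inference sketch follows, assuming the checkpoint loads as a single-logit sequence-classification head; num_labels=1 is an assumption, not something this card states.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k8_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# "An Arabic text to score" -- any essay-like input would do here.
inputs = tokenizer("نص عربي للتقييم", return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # assumes a single logit
print(score)
```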

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
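
A sketch of how these settings map onto transformers.TrainingArguments; only the values listed above come from the card, while the output directory and the evaluation cadence are assumptions (the results table suggests evaluation every 2 steps).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="steps",         # assumption: matches the eval-every-2-steps log
    eval_steps=2,
)
```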

Training results

The training-loss column reads "No log" because the run's 420 optimizer steps never reached the Trainer's logging interval (likely the default logging_steps=500). Validation QWK peaks at 0.5619 around epoch 3.05 (step 128) before settling at 0.5050 by epoch 10.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0476 2 4.0530 -0.0134 4.0530 2.0132
No log 0.0952 4 2.2566 0.0168 2.2566 1.5022
No log 0.1429 6 1.5437 -0.1013 1.5437 1.2425
No log 0.1905 8 1.6951 -0.1075 1.6951 1.3020
No log 0.2381 10 1.0047 -0.1179 1.0047 1.0024
No log 0.2857 12 0.8236 0.1404 0.8236 0.9075
No log 0.3333 14 0.8243 0.1494 0.8243 0.9079
No log 0.3810 16 0.7273 0.2507 0.7273 0.8528
No log 0.4286 18 0.9289 0.0851 0.9289 0.9638
No log 0.4762 20 1.0314 0.0906 1.0314 1.0156
No log 0.5238 22 1.0063 0.1506 1.0063 1.0031
No log 0.5714 24 1.1691 0.1296 1.1691 1.0812
No log 0.6190 26 1.1169 0.1666 1.1169 1.0568
No log 0.6667 28 1.0342 0.1676 1.0342 1.0170
No log 0.7143 30 0.7359 0.2694 0.7359 0.8579
No log 0.7619 32 0.6567 0.3765 0.6567 0.8104
No log 0.8095 34 0.7604 0.2617 0.7604 0.8720
No log 0.8571 36 0.7325 0.2459 0.7325 0.8559
No log 0.9048 38 0.6542 0.3814 0.6542 0.8088
No log 0.9524 40 0.7276 0.4742 0.7276 0.8530
No log 1.0 42 0.9583 0.3667 0.9583 0.9789
No log 1.0476 44 1.4383 0.2676 1.4383 1.1993
No log 1.0952 46 1.7608 0.1835 1.7608 1.3269
No log 1.1429 48 1.4151 0.2559 1.4151 1.1896
No log 1.1905 50 0.9635 0.3091 0.9635 0.9816
No log 1.2381 52 0.7456 0.3839 0.7456 0.8635
No log 1.2857 54 0.6123 0.3851 0.6123 0.7825
No log 1.3333 56 0.6045 0.4017 0.6045 0.7775
No log 1.3810 58 0.6019 0.3958 0.6019 0.7758
No log 1.4286 60 0.6632 0.3807 0.6632 0.8144
No log 1.4762 62 0.6473 0.3648 0.6473 0.8045
No log 1.5238 64 0.5722 0.4459 0.5722 0.7564
No log 1.5714 66 0.5398 0.4683 0.5398 0.7347
No log 1.6190 68 0.5614 0.4153 0.5614 0.7492
No log 1.6667 70 0.6844 0.3719 0.6844 0.8273
No log 1.7143 72 0.7752 0.3916 0.7752 0.8805
No log 1.7619 74 0.8403 0.4304 0.8403 0.9167
No log 1.8095 76 1.2116 0.2934 1.2116 1.1007
No log 1.8571 78 1.4967 0.2846 1.4967 1.2234
No log 1.9048 80 1.3875 0.2965 1.3875 1.1779
No log 1.9524 82 0.9330 0.4052 0.9330 0.9659
No log 2.0 84 0.6366 0.4397 0.6366 0.7979
No log 2.0476 86 0.5004 0.4368 0.5004 0.7074
No log 2.0952 88 0.5179 0.4219 0.5179 0.7196
No log 2.1429 90 0.5264 0.4742 0.5264 0.7255
No log 2.1905 92 0.5944 0.5059 0.5944 0.7710
No log 2.2381 94 0.8085 0.4239 0.8085 0.8992
No log 2.2857 96 1.3489 0.2777 1.3489 1.1614
No log 2.3333 98 1.7684 0.1703 1.7684 1.3298
No log 2.3810 100 1.7220 0.1630 1.7220 1.3122
No log 2.4286 102 1.4728 0.2149 1.4728 1.2136
No log 2.4762 104 1.0247 0.3664 1.0247 1.0123
No log 2.5238 106 0.6782 0.4218 0.6782 0.8236
No log 2.5714 108 0.5852 0.4849 0.5852 0.7650
No log 2.6190 110 0.5697 0.4494 0.5697 0.7548
No log 2.6667 112 0.5819 0.4653 0.5819 0.7628
No log 2.7143 114 0.6149 0.4816 0.6149 0.7842
No log 2.7619 116 0.7112 0.4879 0.7112 0.8433
No log 2.8095 118 0.8072 0.4645 0.8072 0.8984
No log 2.8571 120 0.7991 0.4812 0.7991 0.8939
No log 2.9048 122 0.6677 0.4830 0.6677 0.8171
No log 2.9524 124 0.6082 0.5409 0.6082 0.7799
No log 3.0 126 0.6101 0.5519 0.6101 0.7811
No log 3.0476 128 0.6133 0.5619 0.6133 0.7832
No log 3.0952 130 0.6946 0.5514 0.6946 0.8334
No log 3.1429 132 0.8302 0.4914 0.8302 0.9111
No log 3.1905 134 0.8502 0.5101 0.8502 0.9221
No log 3.2381 136 0.7648 0.5334 0.7648 0.8745
No log 3.2857 138 0.7191 0.5255 0.7191 0.8480
No log 3.3333 140 0.7234 0.5407 0.7234 0.8505
No log 3.3810 142 0.7708 0.5091 0.7708 0.8779
No log 3.4286 144 0.8311 0.5134 0.8311 0.9117
No log 3.4762 146 0.8745 0.4744 0.8745 0.9352
No log 3.5238 148 0.8372 0.5110 0.8372 0.9150
No log 3.5714 150 0.8142 0.5425 0.8142 0.9023
No log 3.6190 152 0.8437 0.5165 0.8437 0.9185
No log 3.6667 154 0.8698 0.5448 0.8698 0.9326
No log 3.7143 156 0.8760 0.5346 0.8760 0.9359
No log 3.7619 158 0.8187 0.5327 0.8187 0.9048
No log 3.8095 160 0.7908 0.5346 0.7908 0.8893
No log 3.8571 162 0.7690 0.5496 0.7690 0.8769
No log 3.9048 164 0.7875 0.4982 0.7875 0.8874
No log 3.9524 166 0.9156 0.5153 0.9156 0.9569
No log 4.0 168 0.9449 0.5018 0.9449 0.9721
No log 4.0476 170 0.8266 0.5068 0.8266 0.9092
No log 4.0952 172 0.7500 0.4955 0.7500 0.8660
No log 4.1429 174 0.7229 0.5101 0.7229 0.8502
No log 4.1905 176 0.7275 0.5195 0.7275 0.8529
No log 4.2381 178 0.7409 0.5082 0.7409 0.8608
No log 4.2857 180 0.7827 0.4867 0.7827 0.8847
No log 4.3333 182 0.8266 0.4944 0.8266 0.9092
No log 4.3810 184 0.8010 0.5089 0.8010 0.8950
No log 4.4286 186 0.7618 0.4848 0.7618 0.8728
No log 4.4762 188 0.7732 0.5298 0.7732 0.8793
No log 4.5238 190 0.7943 0.4963 0.7943 0.8912
No log 4.5714 192 0.7912 0.4896 0.7912 0.8895
No log 4.6190 194 0.7775 0.4926 0.7775 0.8818
No log 4.6667 196 0.7953 0.5102 0.7953 0.8918
No log 4.7143 198 0.8889 0.4817 0.8889 0.9428
No log 4.7619 200 0.9067 0.4643 0.9067 0.9522
No log 4.8095 202 0.8092 0.4968 0.8092 0.8995
No log 4.8571 204 0.7342 0.5045 0.7342 0.8568
No log 4.9048 206 0.7293 0.4729 0.7293 0.8540
No log 4.9524 208 0.7325 0.4657 0.7325 0.8558
No log 5.0 210 0.7342 0.5071 0.7342 0.8569
No log 5.0476 212 0.7509 0.4651 0.7509 0.8666
No log 5.0952 214 0.7730 0.4533 0.7730 0.8792
No log 5.1429 216 0.7528 0.4442 0.7528 0.8677
No log 5.1905 218 0.7219 0.4985 0.7219 0.8496
No log 5.2381 220 0.6899 0.5400 0.6899 0.8306
No log 5.2857 222 0.6732 0.5336 0.6732 0.8205
No log 5.3333 224 0.6673 0.4970 0.6673 0.8169
No log 5.3810 226 0.6874 0.4776 0.6874 0.8291
No log 5.4286 228 0.7651 0.4145 0.7651 0.8747
No log 5.4762 230 0.8333 0.4106 0.8333 0.9129
No log 5.5238 232 0.8336 0.4106 0.8336 0.9130
No log 5.5714 234 0.7874 0.4106 0.7874 0.8874
No log 5.6190 236 0.7423 0.4814 0.7423 0.8616
No log 5.6667 238 0.7322 0.4688 0.7322 0.8557
No log 5.7143 240 0.7134 0.4795 0.7134 0.8446
No log 5.7619 242 0.7241 0.4806 0.7241 0.8509
No log 5.8095 244 0.7494 0.4514 0.7494 0.8657
No log 5.8571 246 0.7663 0.4589 0.7663 0.8754
No log 5.9048 248 0.7560 0.5259 0.7560 0.8695
No log 5.9524 250 0.7542 0.5106 0.7542 0.8685
No log 6.0 252 0.7449 0.4795 0.7449 0.8631
No log 6.0476 254 0.7418 0.4648 0.7418 0.8613
No log 6.0952 256 0.7307 0.4648 0.7308 0.8548
No log 6.1429 258 0.7178 0.4860 0.7178 0.8472
No log 6.1905 260 0.7193 0.4694 0.7193 0.8481
No log 6.2381 262 0.7309 0.5054 0.7309 0.8549
No log 6.2857 264 0.7496 0.5144 0.7496 0.8658
No log 6.3333 266 0.7686 0.4921 0.7686 0.8767
No log 6.3810 268 0.7739 0.4778 0.7739 0.8797
No log 6.4286 270 0.7739 0.4767 0.7739 0.8797
No log 6.4762 272 0.7838 0.4767 0.7838 0.8853
No log 6.5238 274 0.7991 0.4521 0.7991 0.8939
No log 6.5714 276 0.7929 0.4778 0.7929 0.8904
No log 6.6190 278 0.7801 0.4879 0.7801 0.8832
No log 6.6667 280 0.7680 0.4879 0.7680 0.8763
No log 6.7143 282 0.7506 0.4886 0.7506 0.8664
No log 6.7619 284 0.7414 0.4727 0.7414 0.8610
No log 6.8095 286 0.7304 0.4863 0.7304 0.8546
No log 6.8571 288 0.7300 0.4916 0.7300 0.8544
No log 6.9048 290 0.7364 0.5094 0.7364 0.8581
No log 6.9524 292 0.7524 0.5094 0.7524 0.8674
No log 7.0 294 0.7617 0.5094 0.7617 0.8728
No log 7.0476 296 0.7601 0.5094 0.7601 0.8718
No log 7.0952 298 0.7491 0.5079 0.7491 0.8655
No log 7.1429 300 0.7389 0.5187 0.7389 0.8596
No log 7.1905 302 0.7320 0.5326 0.7320 0.8556
No log 7.2381 304 0.7335 0.5466 0.7335 0.8565
No log 7.2857 306 0.7278 0.5466 0.7278 0.8531
No log 7.3333 308 0.7274 0.5236 0.7274 0.8529
No log 7.3810 310 0.7362 0.5227 0.7362 0.8580
No log 7.4286 312 0.7507 0.4836 0.7507 0.8664
No log 7.4762 314 0.7307 0.5148 0.7307 0.8548
No log 7.5238 316 0.6990 0.5405 0.6990 0.8361
No log 7.5714 318 0.6866 0.4763 0.6866 0.8286
No log 7.6190 320 0.6898 0.5041 0.6898 0.8306
No log 7.6667 322 0.6907 0.5041 0.6907 0.8311
No log 7.7143 324 0.6919 0.4856 0.6919 0.8318
No log 7.7619 326 0.6934 0.4944 0.6934 0.8327
No log 7.8095 328 0.6950 0.4944 0.6950 0.8336
No log 7.8571 330 0.6999 0.4844 0.6999 0.8366
No log 7.9048 332 0.7031 0.4920 0.7031 0.8385
No log 7.9524 334 0.6997 0.4944 0.6997 0.8365
No log 8.0 336 0.7044 0.4968 0.7044 0.8393
No log 8.0476 338 0.7113 0.4952 0.7113 0.8434
No log 8.0952 340 0.7116 0.4803 0.7116 0.8436
No log 8.1429 342 0.7087 0.4803 0.7087 0.8418
No log 8.1905 344 0.7004 0.4791 0.7004 0.8369
No log 8.2381 346 0.6905 0.4791 0.6905 0.8310
No log 8.2857 348 0.6859 0.4791 0.6859 0.8282
No log 8.3333 350 0.6848 0.4959 0.6848 0.8275
No log 8.3810 352 0.6894 0.5269 0.6894 0.8303
No log 8.4286 354 0.7000 0.5211 0.7000 0.8367
No log 8.4762 356 0.7156 0.5211 0.7156 0.8459
No log 8.5238 358 0.7316 0.5448 0.7316 0.8553
No log 8.5714 360 0.7429 0.5525 0.7429 0.8619
No log 8.6190 362 0.7488 0.5525 0.7488 0.8654
No log 8.6667 364 0.7501 0.5237 0.7501 0.8661
No log 8.7143 366 0.7498 0.5081 0.7498 0.8659
No log 8.7619 368 0.7488 0.5065 0.7488 0.8653
No log 8.8095 370 0.7530 0.4970 0.7530 0.8677
No log 8.8571 372 0.7563 0.4963 0.7563 0.8696
No log 8.9048 374 0.7555 0.4978 0.7555 0.8692
No log 8.9524 376 0.7524 0.5120 0.7524 0.8674
No log 9.0 378 0.7507 0.5088 0.7507 0.8664
No log 9.0476 380 0.7591 0.5193 0.7591 0.8713
No log 9.0952 382 0.7649 0.5250 0.7649 0.8746
No log 9.1429 384 0.7686 0.5135 0.7686 0.8767
No log 9.1905 386 0.7746 0.4854 0.7746 0.8801
No log 9.2381 388 0.7819 0.4764 0.7819 0.8842
No log 9.2857 390 0.7786 0.4741 0.7786 0.8824
No log 9.3333 392 0.7684 0.5072 0.7684 0.8766
No log 9.3810 394 0.7581 0.5128 0.7581 0.8707
No log 9.4286 396 0.7493 0.5112 0.7493 0.8656
No log 9.4762 398 0.7404 0.5326 0.7404 0.8605
No log 9.5238 400 0.7353 0.5310 0.7353 0.8575
No log 9.5714 402 0.7319 0.5306 0.7319 0.8555
No log 9.6190 404 0.7307 0.5405 0.7307 0.8548
No log 9.6667 406 0.7304 0.5059 0.7304 0.8547
No log 9.7143 408 0.7310 0.5050 0.7310 0.8550
No log 9.7619 410 0.7319 0.5050 0.7319 0.8555
No log 9.8095 412 0.7321 0.5050 0.7321 0.8556
No log 9.8571 414 0.7325 0.5050 0.7325 0.8558
No log 9.9048 416 0.7330 0.5050 0.7330 0.8562
No log 9.9524 418 0.7332 0.5050 0.7332 0.8563
No log 10.0 420 0.7333 0.5050 0.7333 0.8563
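
Since the best QWK occurs mid-run rather than at the final step, it can be useful to recover the best evaluation from the Trainer's saved state. A sketch, assuming the run kept trainer_state.json in its output directory and logged the metric under an "eval_qwk" key (both are assumptions):

```python
import json

# trainer_state.json is written by transformers.Trainer alongside checkpoints;
# the path and the "eval_qwk" key are assumptions about this run's logging.
with open("arabert_task2_organization/trainer_state.json") as f:
    state = json.load(f)

evals = [e for e in state["log_history"] if "eval_qwk" in e]
best = max(evals, key=lambda e: e["eval_qwk"])
print(f"best QWK {best['eval_qwk']:.4f} at step {best['step']} (epoch {best['epoch']:.2f})")
```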

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
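
A quick sanity check that a local environment matches these versions (assuming all four packages are installed; the card lists the CUDA 11.8 build of PyTorch):

```python
import datasets
import tokenizers
import torch
import transformers

# Versions reported in this card; nearby versions may work but are untested here.
expected = {
    transformers: "4.44.2",
    torch: "2.4.0",      # card lists 2.4.0+cu118
    datasets: "2.21.0",
    tokenizers: "0.19.1",
}
for module, version in expected.items():
    print(f"{module.__name__}: found {module.__version__}, card lists {version}")
```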