ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k5_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5843
  • QWK: 0.7035
  • MSE: 0.5843
  • RMSE: 0.7644
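The reported Loss equals the MSE, which suggests a regression objective trained with mean-squared-error loss; the RMSE is then simply the square root of the MSE. A quick consistency check on the numbers above (pure Python):

```python
import math

# Evaluation metrics reported in this card
mse = 0.5843
rmse = 0.7644

# RMSE is the square root of MSE; the card's numbers should agree to 4 decimals
assert round(math.sqrt(mse), 4) == rmse
print(round(math.sqrt(mse), 4))  # 0.7644
```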

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
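With lr_scheduler_type: linear, the learning rate presumably decays linearly from 2e-05 toward 0 over the total number of optimizer steps (the evaluation table below implies about 15 steps per epoch). A minimal sketch of such a schedule; this is an assumption about the exact schedule used, and warmup_steps is taken as 0 since the card lists none:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# With no warmup, the rate starts at base_lr and reaches 0 at the final step
print(linear_lr(0, 1500))     # 2e-05
print(linear_lr(750, 1500))   # 1e-05
print(linear_lr(1500, 1500))  # 0.0
```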

Training results

Evaluation was run every 2 steps; the training loss is only reported at logging intervals (first logged at step 500), so earlier rows show "No log".

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.1333 2 4.0121 0.0069 4.0121 2.0030
No log 0.2667 4 2.1405 0.0450 2.1405 1.4630
No log 0.4 6 1.4755 0.0408 1.4755 1.2147
No log 0.5333 8 1.0516 0.2734 1.0516 1.0255
No log 0.6667 10 1.0824 0.1643 1.0824 1.0404
No log 0.8 12 1.2759 -0.1333 1.2759 1.1295
No log 0.9333 14 1.4281 -0.0777 1.4281 1.1951
No log 1.0667 16 1.5161 -0.0709 1.5161 1.2313
No log 1.2 18 1.2867 -0.0394 1.2867 1.1343
No log 1.3333 20 1.1995 0.0967 1.1995 1.0952
No log 1.4667 22 1.2540 -0.0032 1.2540 1.1198
No log 1.6 24 1.1729 0.0820 1.1729 1.0830
No log 1.7333 26 1.0244 0.3326 1.0244 1.0121
No log 1.8667 28 1.0345 0.2697 1.0345 1.0171
No log 2.0 30 1.0688 0.2236 1.0688 1.0338
No log 2.1333 32 1.0751 0.2390 1.0751 1.0369
No log 2.2667 34 1.0323 0.2416 1.0323 1.0160
No log 2.4 36 1.0861 0.1436 1.0861 1.0422
No log 2.5333 38 1.0908 0.1436 1.0908 1.0444
No log 2.6667 40 0.9866 0.2416 0.9866 0.9933
No log 2.8 42 0.9687 0.2265 0.9687 0.9842
No log 2.9333 44 0.9994 0.3372 0.9994 0.9997
No log 3.0667 46 1.0022 0.3687 1.0022 1.0011
No log 3.2 48 1.1401 0.3863 1.1401 1.0678
No log 3.3333 50 1.2943 0.2993 1.2943 1.1377
No log 3.4667 52 1.1018 0.3502 1.1018 1.0497
No log 3.6 54 0.8783 0.4494 0.8783 0.9372
No log 3.7333 56 1.0112 0.3449 1.0112 1.0056
No log 3.8667 58 1.3693 0.0840 1.3693 1.1702
No log 4.0 60 1.4293 0.1379 1.4293 1.1956
No log 4.1333 62 1.1872 0.2857 1.1872 1.0896
No log 4.2667 64 0.8918 0.4265 0.8918 0.9443
No log 4.4 66 0.7505 0.5010 0.7505 0.8663
No log 4.5333 68 0.7444 0.5375 0.7444 0.8628
No log 4.6667 70 0.7857 0.5199 0.7857 0.8864
No log 4.8 72 0.9551 0.3624 0.9551 0.9773
No log 4.9333 74 0.8291 0.5586 0.8291 0.9105
No log 5.0667 76 0.7306 0.4850 0.7306 0.8547
No log 5.2 78 0.8555 0.5230 0.8555 0.9249
No log 5.3333 80 0.7358 0.5163 0.7358 0.8578
No log 5.4667 82 0.6619 0.5235 0.6619 0.8135
No log 5.6 84 0.7039 0.5910 0.7039 0.8390
No log 5.7333 86 0.6963 0.4708 0.6963 0.8345
No log 5.8667 88 0.6581 0.4995 0.6581 0.8112
No log 6.0 90 0.7346 0.5598 0.7346 0.8571
No log 6.1333 92 0.6787 0.5860 0.6787 0.8238
No log 6.2667 94 0.6124 0.6095 0.6124 0.7826
No log 6.4 96 0.5998 0.6306 0.5998 0.7745
No log 6.5333 98 0.6586 0.6573 0.6586 0.8115
No log 6.6667 100 0.6830 0.6395 0.6830 0.8264
No log 6.8 102 0.6050 0.7001 0.6050 0.7778
No log 6.9333 104 0.7239 0.5259 0.7239 0.8508
No log 7.0667 106 1.0518 0.4929 1.0518 1.0255
No log 7.2 108 0.8993 0.5283 0.8993 0.9483
No log 7.3333 110 0.6204 0.5876 0.6204 0.7877
No log 7.4667 112 0.6083 0.6695 0.6083 0.7799
No log 7.6 114 0.5755 0.7074 0.5755 0.7586
No log 7.7333 116 0.6042 0.6234 0.6042 0.7773
No log 7.8667 118 0.7455 0.5664 0.7455 0.8634
No log 8.0 120 0.7481 0.5821 0.7481 0.8649
No log 8.1333 122 0.6146 0.6386 0.6146 0.7839
No log 8.2667 124 0.6060 0.6345 0.6060 0.7785
No log 8.4 126 0.6243 0.6237 0.6243 0.7901
No log 8.5333 128 0.8178 0.5157 0.8178 0.9043
No log 8.6667 130 0.9112 0.4961 0.9112 0.9546
No log 8.8 132 0.7592 0.4767 0.7592 0.8713
No log 8.9333 134 0.6838 0.5438 0.6838 0.8269
No log 9.0667 136 0.6783 0.5327 0.6783 0.8236
No log 9.2 138 0.8247 0.5804 0.8247 0.9082
No log 9.3333 140 0.8650 0.5581 0.8650 0.9300
No log 9.4667 142 0.6710 0.5325 0.6710 0.8191
No log 9.6 144 0.6290 0.6055 0.6290 0.7931
No log 9.7333 146 0.6243 0.6055 0.6243 0.7901
No log 9.8667 148 0.6314 0.6234 0.6314 0.7946
No log 10.0 150 0.7902 0.6057 0.7902 0.8889
No log 10.1333 152 0.7442 0.6074 0.7442 0.8627
No log 10.2667 154 0.5947 0.6374 0.5947 0.7712
No log 10.4 156 0.7069 0.6291 0.7069 0.8408
No log 10.5333 158 0.7443 0.5745 0.7443 0.8627
No log 10.6667 160 0.6057 0.6605 0.6057 0.7782
No log 10.8 162 0.6188 0.5794 0.6188 0.7867
No log 10.9333 164 0.7522 0.5905 0.7522 0.8673
No log 11.0667 166 0.6684 0.6120 0.6684 0.8176
No log 11.2 168 0.5639 0.6667 0.5639 0.7510
No log 11.3333 170 0.5806 0.6623 0.5806 0.7620
No log 11.4667 172 0.5695 0.6335 0.5695 0.7547
No log 11.6 174 0.5696 0.6750 0.5696 0.7547
No log 11.7333 176 0.5664 0.6409 0.5664 0.7526
No log 11.8667 178 0.5594 0.6699 0.5594 0.7480
No log 12.0 180 0.5711 0.6497 0.5711 0.7557
No log 12.1333 182 0.5713 0.6497 0.5713 0.7558
No log 12.2667 184 0.5513 0.6374 0.5513 0.7425
No log 12.4 186 0.5841 0.5865 0.5841 0.7643
No log 12.5333 188 0.6037 0.6050 0.6037 0.7770
No log 12.6667 190 0.5787 0.5926 0.5787 0.7607
No log 12.8 192 0.5803 0.6051 0.5803 0.7618
No log 12.9333 194 0.5538 0.6460 0.5538 0.7442
No log 13.0667 196 0.5158 0.7409 0.5158 0.7182
No log 13.2 198 0.5541 0.6485 0.5541 0.7444
No log 13.3333 200 0.4967 0.7165 0.4967 0.7048
No log 13.4667 202 0.4675 0.7110 0.4675 0.6837
No log 13.6 204 0.4704 0.7158 0.4704 0.6858
No log 13.7333 206 0.4760 0.6748 0.4760 0.6900
No log 13.8667 208 0.5319 0.6572 0.5319 0.7293
No log 14.0 210 0.5976 0.5873 0.5976 0.7730
No log 14.1333 212 0.5760 0.6302 0.5760 0.7589
No log 14.2667 214 0.5844 0.6322 0.5844 0.7644
No log 14.4 216 0.6023 0.6435 0.6023 0.7761
No log 14.5333 218 0.5957 0.5759 0.5957 0.7718
No log 14.6667 220 0.6940 0.5598 0.6940 0.8330
No log 14.8 222 0.7062 0.5590 0.7062 0.8404
No log 14.9333 224 0.6085 0.5876 0.6085 0.7801
No log 15.0667 226 0.5902 0.5817 0.5902 0.7682
No log 15.2 228 0.5768 0.6207 0.5768 0.7595
No log 15.3333 230 0.5803 0.6018 0.5803 0.7618
No log 15.4667 232 0.6198 0.6125 0.6198 0.7873
No log 15.6 234 0.7469 0.5257 0.7469 0.8642
No log 15.7333 236 0.7517 0.5039 0.7517 0.8670
No log 15.8667 238 0.8336 0.5053 0.8336 0.9130
No log 16.0 240 1.0249 0.4228 1.0249 1.0124
No log 16.1333 242 0.9458 0.3973 0.9458 0.9725
No log 16.2667 244 0.6976 0.4917 0.6976 0.8352
No log 16.4 246 0.6109 0.6057 0.6109 0.7816
No log 16.5333 248 0.5962 0.5735 0.5962 0.7721
No log 16.6667 250 0.5981 0.5373 0.5981 0.7734
No log 16.8 252 0.6043 0.5886 0.6043 0.7774
No log 16.9333 254 0.6016 0.6187 0.6016 0.7757
No log 17.0667 256 0.6480 0.5970 0.6480 0.8050
No log 17.2 258 0.6572 0.5970 0.6572 0.8107
No log 17.3333 260 0.6225 0.6499 0.6225 0.7890
No log 17.4667 262 0.6024 0.6435 0.6024 0.7761
No log 17.6 264 0.6130 0.6076 0.6130 0.7830
No log 17.7333 266 0.6115 0.5690 0.6115 0.7820
No log 17.8667 268 0.6008 0.6105 0.6008 0.7751
No log 18.0 270 0.5919 0.6549 0.5919 0.7694
No log 18.1333 272 0.6023 0.6415 0.6023 0.7760
No log 18.2667 274 0.6009 0.6301 0.6009 0.7752
No log 18.4 276 0.5767 0.6425 0.5767 0.7594
No log 18.5333 278 0.5512 0.6036 0.5512 0.7425
No log 18.6667 280 0.5351 0.5879 0.5351 0.7315
No log 18.8 282 0.5073 0.7001 0.5073 0.7123
No log 18.9333 284 0.5048 0.7001 0.5048 0.7105
No log 19.0667 286 0.4977 0.6962 0.4977 0.7055
No log 19.2 288 0.4966 0.7001 0.4966 0.7047
No log 19.3333 290 0.4993 0.7001 0.4993 0.7066
No log 19.4667 292 0.5023 0.7001 0.5023 0.7087
No log 19.6 294 0.5168 0.6993 0.5168 0.7189
No log 19.7333 296 0.5108 0.7001 0.5108 0.7147
No log 19.8667 298 0.5359 0.6278 0.5359 0.7321
No log 20.0 300 0.5578 0.6248 0.5578 0.7468
No log 20.1333 302 0.5287 0.6269 0.5287 0.7271
No log 20.2667 304 0.5508 0.6807 0.5508 0.7421
No log 20.4 306 0.5853 0.6807 0.5853 0.7650
No log 20.5333 308 0.5524 0.6620 0.5524 0.7432
No log 20.6667 310 0.5668 0.5831 0.5668 0.7528
No log 20.8 312 0.6130 0.5709 0.6130 0.7830
No log 20.9333 314 0.5937 0.5895 0.5937 0.7705
No log 21.0667 316 0.5110 0.6838 0.5110 0.7149
No log 21.2 318 0.4833 0.7059 0.4833 0.6952
No log 21.3333 320 0.6087 0.6442 0.6087 0.7802
No log 21.4667 322 0.6396 0.6004 0.6396 0.7997
No log 21.6 324 0.5822 0.6611 0.5822 0.7630
No log 21.7333 326 0.5334 0.5886 0.5334 0.7304
No log 21.8667 328 0.5411 0.5704 0.5411 0.7356
No log 22.0 330 0.5516 0.5588 0.5516 0.7427
No log 22.1333 332 0.5425 0.6129 0.5425 0.7365
No log 22.2667 334 0.5411 0.6407 0.5411 0.7356
No log 22.4 336 0.5882 0.6255 0.5882 0.7669
No log 22.5333 338 0.6006 0.6255 0.6006 0.7750
No log 22.6667 340 0.5952 0.6407 0.5952 0.7715
No log 22.8 342 0.5813 0.5972 0.5813 0.7624
No log 22.9333 344 0.5715 0.6296 0.5715 0.7560
No log 23.0667 346 0.5724 0.5301 0.5724 0.7566
No log 23.2 348 0.5643 0.5759 0.5643 0.7512
No log 23.3333 350 0.5524 0.6296 0.5524 0.7432
No log 23.4667 352 0.5421 0.6407 0.5421 0.7363
No log 23.6 354 0.5437 0.6507 0.5437 0.7374
No log 23.7333 356 0.5530 0.6400 0.5530 0.7436
No log 23.8667 358 0.5533 0.6507 0.5533 0.7439
No log 24.0 360 0.5766 0.5905 0.5766 0.7594
No log 24.1333 362 0.5814 0.5694 0.5814 0.7625
No log 24.2667 364 0.5682 0.6118 0.5682 0.7538
No log 24.4 366 0.5837 0.6143 0.5837 0.7640
No log 24.5333 368 0.5964 0.5964 0.5964 0.7723
No log 24.6667 370 0.5933 0.6082 0.5933 0.7702
No log 24.8 372 0.5811 0.6082 0.5811 0.7623
No log 24.9333 374 0.5902 0.6295 0.5902 0.7682
No log 25.0667 376 0.5542 0.6422 0.5542 0.7445
No log 25.2 378 0.5365 0.6788 0.5365 0.7325
No log 25.3333 380 0.5396 0.6175 0.5396 0.7346
No log 25.4667 382 0.5500 0.6475 0.5500 0.7417
No log 25.6 384 0.5699 0.5966 0.5699 0.7549
No log 25.7333 386 0.5918 0.6367 0.5918 0.7693
No log 25.8667 388 0.6342 0.6244 0.6342 0.7964
No log 26.0 390 0.6000 0.6244 0.6000 0.7746
No log 26.1333 392 0.5391 0.6460 0.5391 0.7343
No log 26.2667 394 0.5233 0.6006 0.5233 0.7234
No log 26.4 396 0.5177 0.6555 0.5177 0.7195
No log 26.5333 398 0.5147 0.7109 0.5147 0.7174
No log 26.6667 400 0.5462 0.6569 0.5462 0.7390
No log 26.8 402 0.6227 0.6121 0.6227 0.7891
No log 26.9333 404 0.6220 0.6121 0.6220 0.7887
No log 27.0667 406 0.5964 0.6367 0.5964 0.7723
No log 27.2 408 0.5839 0.5996 0.5839 0.7642
No log 27.3333 410 0.5730 0.6008 0.5730 0.7570
No log 27.4667 412 0.5622 0.6844 0.5622 0.7498
No log 27.6 414 0.5774 0.5981 0.5774 0.7598
No log 27.7333 416 0.6104 0.6220 0.6104 0.7813
No log 27.8667 418 0.5689 0.6367 0.5689 0.7543
No log 28.0 420 0.5314 0.6706 0.5314 0.7290
No log 28.1333 422 0.5415 0.6117 0.5415 0.7359
No log 28.2667 424 0.5538 0.6117 0.5538 0.7442
No log 28.4 426 0.5908 0.6367 0.5908 0.7687
No log 28.5333 428 0.7034 0.5154 0.7034 0.8387
No log 28.6667 430 0.7738 0.5307 0.7738 0.8796
No log 28.8 432 0.7364 0.5420 0.7364 0.8582
No log 28.9333 434 0.6155 0.6244 0.6155 0.7846
No log 29.0667 436 0.5489 0.6397 0.5489 0.7409
No log 29.2 438 0.5556 0.5848 0.5556 0.7454
No log 29.3333 440 0.5506 0.6078 0.5506 0.7421
No log 29.4667 442 0.5262 0.5859 0.5262 0.7254
No log 29.6 444 0.5357 0.7035 0.5357 0.7319
No log 29.7333 446 0.5790 0.6305 0.5790 0.7609
No log 29.8667 448 0.6506 0.5154 0.6506 0.8066
No log 30.0 450 0.6460 0.5154 0.6460 0.8037
No log 30.1333 452 0.5800 0.6646 0.5800 0.7616
No log 30.2667 454 0.5306 0.7153 0.5306 0.7284
No log 30.4 456 0.5116 0.6470 0.5116 0.7152
No log 30.5333 458 0.5077 0.6636 0.5077 0.7125
No log 30.6667 460 0.5132 0.6606 0.5132 0.7164
No log 30.8 462 0.5296 0.6721 0.5296 0.7277
No log 30.9333 464 0.5291 0.6325 0.5291 0.7274
No log 31.0667 466 0.5370 0.6154 0.5370 0.7328
No log 31.2 468 0.5600 0.6078 0.5600 0.7483
No log 31.3333 470 0.5837 0.5118 0.5837 0.7640
No log 31.4667 472 0.5672 0.5977 0.5672 0.7531
No log 31.6 474 0.5517 0.5939 0.5517 0.7427
No log 31.7333 476 0.5436 0.6282 0.5436 0.7373
No log 31.8667 478 0.5383 0.6602 0.5383 0.7337
No log 32.0 480 0.5289 0.6602 0.5289 0.7273
No log 32.1333 482 0.5204 0.6602 0.5204 0.7214
No log 32.2667 484 0.5145 0.6708 0.5145 0.7173
No log 32.4 486 0.5140 0.6699 0.5140 0.7170
No log 32.5333 488 0.5223 0.6824 0.5223 0.7227
No log 32.6667 490 0.5634 0.6925 0.5634 0.7506
No log 32.8 492 0.6244 0.6099 0.6244 0.7902
No log 32.9333 494 0.6687 0.6160 0.6687 0.8177
No log 33.0667 496 0.6709 0.6160 0.6709 0.8191
No log 33.2 498 0.6208 0.6015 0.6208 0.7879
0.2685 33.3333 500 0.5653 0.7043 0.5653 0.7518
0.2685 33.4667 502 0.5683 0.5690 0.5683 0.7539
0.2685 33.6 504 0.5808 0.5210 0.5808 0.7621
0.2685 33.7333 506 0.5677 0.5455 0.5677 0.7535
0.2685 33.8667 508 0.5604 0.6518 0.5604 0.7486
0.2685 34.0 510 0.5843 0.7035 0.5843 0.7644
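QWK in the tables above is quadratic weighted kappa, a chance-corrected agreement measure for ordinal labels. A minimal pure-Python sketch of the standard computation (not necessarily the exact implementation used for this run):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, for ordinal ratings in 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    obs = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Expected matrix from the marginal histograms, scaled to n total
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    exp = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)] for i in range(n_classes)]
    # Quadratic disagreement weights: penalty grows with squared distance
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)] for i in range(n_classes)]
    num = sum(w[i][j] * obs[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * exp[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

print(quadratic_weighted_kappa([0, 1, 2, 1], [0, 1, 2, 1], 3))  # 1.0 (perfect agreement)
print(quadratic_weighted_kappa([0, 2], [2, 0], 3))              # -1.0 (maximal disagreement)
```

scikit-learn's `cohen_kappa_score(..., weights="quadratic")` computes the same quantity and can serve as a cross-check.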

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

0.1B params (Safetensors, F32)
Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k5_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02