ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k18_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6037
  • QWK (quadratic weighted kappa): 0.4513
  • MSE: 0.6037
  • RMSE: 0.7770
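QWK (quadratic weighted kappa) scores ordinal agreement, penalizing each error by the squared distance between the predicted and true labels, while RMSE is simply the square root of MSE (the two columns track each other exactly in the results below). A minimal sketch of how these metrics can be computed; the label values and class count are illustrative assumptions, not taken from this card's evaluation code:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights for ordinal labels 0..n_classes-1."""
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Quadratic disagreement weights: (i - j)^2, scaled into [0, 1].
    weights = np.array([[(i - j) ** 2 for j in range(n_classes)]
                        for i in range(n_classes)], dtype=float)
    weights /= (n_classes - 1) ** 2
    # Expected counts under chance agreement (outer product of marginals).
    expected = np.outer(observed.sum(axis=1),
                        observed.sum(axis=0)) / observed.sum()
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Hypothetical 3-class example.
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 1, 2, 1]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)
mse = float(np.mean((np.array(y_true) - np.array(y_pred)) ** 2))
rmse = mse ** 0.5  # RMSE is the square root of MSE
```

Note that the Loss and MSE columns in the table below are identical in every row, consistent with an MSE regression objective.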

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
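These settings map directly onto a transformers TrainingArguments configuration. A sketch under assumptions: the output directory and any evaluation/saving strategy are placeholders, since the card does not state them:

```python
from transformers import TrainingArguments

# Hyperparameters as listed above; output_dir is a placeholder.
# The Adam betas/epsilon listed above are the optimizer defaults.
args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```

The table below stops near epoch 13 (step 584) despite num_epochs being 100, so the run was presumably stopped before the configured epoch count was reached.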

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0444 2 2.6703 -0.0262 2.6703 1.6341
No log 0.0889 4 1.4414 0.0511 1.4414 1.2006
No log 0.1333 6 1.2104 -0.1993 1.2104 1.1002
No log 0.1778 8 1.0885 -0.1095 1.0885 1.0433
No log 0.2222 10 1.1405 -0.2088 1.1405 1.0679
No log 0.2667 12 1.3054 -0.2026 1.3054 1.1425
No log 0.3111 14 1.1953 -0.2191 1.1953 1.0933
No log 0.3556 16 0.9715 0.0469 0.9715 0.9856
No log 0.4 18 0.8487 0.1184 0.8487 0.9212
No log 0.4444 20 0.7605 0.1139 0.7605 0.8720
No log 0.4889 22 0.7263 0.1863 0.7263 0.8522
No log 0.5333 24 0.7119 0.1863 0.7119 0.8437
No log 0.5778 26 0.7245 0.2206 0.7245 0.8512
No log 0.6222 28 0.7335 0.0846 0.7335 0.8565
No log 0.6667 30 0.7293 0.0481 0.7293 0.8540
No log 0.7111 32 0.7412 0.0 0.7412 0.8609
No log 0.7556 34 0.7349 0.0840 0.7349 0.8572
No log 0.8 36 0.7266 0.1617 0.7266 0.8524
No log 0.8444 38 0.7234 0.2270 0.7234 0.8506
No log 0.8889 40 0.7434 0.2206 0.7434 0.8622
No log 0.9333 42 0.7752 0.1807 0.7752 0.8804
No log 0.9778 44 0.7901 0.2046 0.7901 0.8889
No log 1.0222 46 0.8162 0.0851 0.8162 0.9034
No log 1.0667 48 0.8215 0.0053 0.8215 0.9064
No log 1.1111 50 0.8064 0.0509 0.8064 0.8980
No log 1.1556 52 0.7774 0.0053 0.7774 0.8817
No log 1.2 54 0.7464 0.1282 0.7464 0.8639
No log 1.2444 56 0.7439 0.1699 0.7439 0.8625
No log 1.2889 58 0.7539 0.1737 0.7539 0.8683
No log 1.3333 60 0.8049 0.1558 0.8049 0.8972
No log 1.3778 62 0.8533 0.1166 0.8533 0.9238
No log 1.4222 64 0.8114 0.2092 0.8114 0.9008
No log 1.4667 66 0.7791 0.3324 0.7791 0.8826
No log 1.5111 68 0.8180 0.2227 0.8180 0.9044
No log 1.5556 70 0.8138 0.2685 0.8138 0.9021
No log 1.6 72 0.7776 0.2685 0.7776 0.8818
No log 1.6444 74 0.7592 0.2685 0.7592 0.8713
No log 1.6889 76 0.7596 0.2652 0.7596 0.8716
No log 1.7333 78 0.7489 0.2652 0.7489 0.8654
No log 1.7778 80 0.7588 0.3594 0.7588 0.8711
No log 1.8222 82 0.7486 0.3594 0.7486 0.8652
No log 1.8667 84 0.7252 0.3594 0.7252 0.8516
No log 1.9111 86 0.7007 0.2285 0.7007 0.8371
No log 1.9556 88 0.7318 0.3868 0.7318 0.8555
No log 2.0 90 0.8333 0.3819 0.8333 0.9128
No log 2.0444 92 0.7887 0.3894 0.7887 0.8881
No log 2.0889 94 0.7136 0.4052 0.7136 0.8448
No log 2.1333 96 0.7067 0.4052 0.7067 0.8407
No log 2.1778 98 0.7615 0.3894 0.7615 0.8726
No log 2.2222 100 0.6923 0.4437 0.6923 0.8320
No log 2.2667 102 0.6813 0.3713 0.6813 0.8254
No log 2.3111 104 0.6886 0.3452 0.6886 0.8298
No log 2.3556 106 0.8212 0.3891 0.8212 0.9062
No log 2.4 108 0.9486 0.2626 0.9486 0.9740
No log 2.4444 110 0.8642 0.3076 0.8642 0.9296
No log 2.4889 112 0.7254 0.3937 0.7254 0.8517
No log 2.5333 114 0.7038 0.4234 0.7038 0.8389
No log 2.5778 116 0.8219 0.3731 0.8219 0.9066
No log 2.6222 118 1.0356 0.2968 1.0356 1.0176
No log 2.6667 120 0.9390 0.3503 0.9390 0.9690
No log 2.7111 122 0.9551 0.3608 0.9551 0.9773
No log 2.7556 124 0.8228 0.3747 0.8228 0.9071
No log 2.8 126 0.6297 0.4639 0.6297 0.7935
No log 2.8444 128 0.6943 0.2624 0.6943 0.8333
No log 2.8889 130 0.6678 0.3265 0.6678 0.8172
No log 2.9333 132 0.6432 0.4451 0.6432 0.8020
No log 2.9778 134 0.9343 0.3523 0.9343 0.9666
No log 3.0222 136 1.0360 0.2754 1.0360 1.0179
No log 3.0667 138 0.8590 0.3727 0.8590 0.9268
No log 3.1111 140 0.6918 0.4898 0.6918 0.8318
No log 3.1556 142 0.6503 0.4562 0.6503 0.8064
No log 3.2 144 0.6433 0.3836 0.6433 0.8021
No log 3.2444 146 0.6139 0.4222 0.6139 0.7835
No log 3.2889 148 0.6493 0.4582 0.6493 0.8058
No log 3.3333 150 0.6650 0.4089 0.6650 0.8155
No log 3.3778 152 0.6546 0.3590 0.6546 0.8090
No log 3.4222 154 0.7763 0.4562 0.7763 0.8811
No log 3.4667 156 0.7758 0.4562 0.7758 0.8808
No log 3.5111 158 0.6525 0.3843 0.6525 0.8078
No log 3.5556 160 0.5937 0.3862 0.5937 0.7705
No log 3.6 162 0.6190 0.4205 0.6190 0.7868
No log 3.6444 164 0.5840 0.3915 0.5840 0.7642
No log 3.6889 166 0.7053 0.4089 0.7053 0.8398
No log 3.7333 168 0.9325 0.3868 0.9325 0.9657
No log 3.7778 170 0.8300 0.4217 0.8300 0.9110
No log 3.8222 172 0.6592 0.3891 0.6592 0.8119
No log 3.8667 174 0.6305 0.3990 0.6305 0.7940
No log 3.9111 176 0.6649 0.3817 0.6649 0.8154
No log 3.9556 178 0.6905 0.4294 0.6905 0.8310
No log 4.0 180 0.6909 0.4190 0.6909 0.8312
No log 4.0444 182 0.7256 0.3918 0.7256 0.8518
No log 4.0889 184 0.7434 0.3891 0.7434 0.8622
No log 4.1333 186 0.7324 0.4190 0.7324 0.8558
No log 4.1778 188 0.6350 0.3408 0.6350 0.7969
No log 4.2222 190 0.5998 0.3703 0.5998 0.7745
No log 4.2667 192 0.5894 0.4847 0.5894 0.7677
No log 4.3111 194 0.6210 0.5034 0.6210 0.7881
No log 4.3556 196 0.6712 0.5149 0.6712 0.8193
No log 4.4 198 0.7210 0.4979 0.7210 0.8491
No log 4.4444 200 0.6578 0.4864 0.6578 0.8110
No log 4.4889 202 0.6609 0.4827 0.6609 0.8130
No log 4.5333 204 0.6021 0.4875 0.6021 0.7760
No log 4.5778 206 0.5778 0.5289 0.5778 0.7601
No log 4.6222 208 0.6270 0.5219 0.6270 0.7919
No log 4.6667 210 0.6284 0.5131 0.6284 0.7927
No log 4.7111 212 0.5660 0.5133 0.5660 0.7523
No log 4.7556 214 0.5562 0.5272 0.5562 0.7458
No log 4.8 216 0.5271 0.5640 0.5271 0.7260
No log 4.8444 218 0.5235 0.4678 0.5235 0.7235
No log 4.8889 220 0.5294 0.4990 0.5294 0.7276
No log 4.9333 222 0.5753 0.4607 0.5753 0.7585
No log 4.9778 224 0.5226 0.4425 0.5226 0.7229
No log 5.0222 226 0.6413 0.4482 0.6413 0.8008
No log 5.0667 228 0.8767 0.3945 0.8767 0.9363
No log 5.1111 230 0.8616 0.4003 0.8616 0.9282
No log 5.1556 232 0.6677 0.4438 0.6677 0.8171
No log 5.2 234 0.5562 0.4576 0.5562 0.7458
No log 5.2444 236 0.5444 0.4701 0.5444 0.7378
No log 5.2889 238 0.5730 0.4413 0.5730 0.7570
No log 5.3333 240 0.6723 0.4801 0.6723 0.8199
No log 5.3778 242 0.7806 0.4432 0.7806 0.8835
No log 5.4222 244 0.7152 0.5068 0.7152 0.8457
No log 5.4667 246 0.6139 0.4051 0.6139 0.7835
No log 5.5111 248 0.5917 0.4575 0.5917 0.7692
No log 5.5556 250 0.6013 0.4659 0.6013 0.7754
No log 5.6 252 0.6563 0.5059 0.6563 0.8101
No log 5.6444 254 0.8132 0.5103 0.8132 0.9018
No log 5.6889 256 0.8622 0.4091 0.8622 0.9286
No log 5.7333 258 0.7158 0.5310 0.7158 0.8460
No log 5.7778 260 0.5976 0.4875 0.5976 0.7731
No log 5.8222 262 0.5745 0.4337 0.5745 0.7580
No log 5.8667 264 0.5736 0.3915 0.5736 0.7574
No log 5.9111 266 0.5761 0.3813 0.5761 0.7590
No log 5.9556 268 0.6040 0.4413 0.6040 0.7772
No log 6.0 270 0.6296 0.4602 0.6296 0.7935
No log 6.0444 272 0.6033 0.4336 0.6033 0.7767
No log 6.0889 274 0.5999 0.3966 0.5999 0.7746
No log 6.1333 276 0.6617 0.5123 0.6617 0.8134
No log 6.1778 278 0.7781 0.4921 0.7781 0.8821
No log 6.2222 280 0.7161 0.5325 0.7161 0.8462
No log 6.2667 282 0.6187 0.4889 0.6187 0.7866
No log 6.3111 284 0.6148 0.2960 0.6148 0.7841
No log 6.3556 286 0.6354 0.4189 0.6354 0.7971
No log 6.4 288 0.6024 0.3350 0.6024 0.7761
No log 6.4444 290 0.6867 0.4587 0.6867 0.8287
No log 6.4889 292 0.7974 0.4511 0.7974 0.8930
No log 6.5333 294 0.7774 0.4853 0.7774 0.8817
No log 6.5778 296 0.6835 0.4666 0.6835 0.8267
No log 6.6222 298 0.6675 0.4665 0.6675 0.8170
No log 6.6667 300 0.6704 0.4741 0.6704 0.8188
No log 6.7111 302 0.6118 0.4562 0.6118 0.7821
No log 6.7556 304 0.5898 0.4699 0.5898 0.7680
No log 6.8 306 0.5958 0.4624 0.5958 0.7719
No log 6.8444 308 0.6394 0.4389 0.6394 0.7996
No log 6.8889 310 0.7001 0.4887 0.7001 0.8367
No log 6.9333 312 0.7263 0.5068 0.7263 0.8522
No log 6.9778 314 0.6193 0.4389 0.6193 0.7870
No log 7.0222 316 0.5473 0.3649 0.5473 0.7398
No log 7.0667 318 0.5404 0.3649 0.5404 0.7351
No log 7.1111 320 0.5708 0.3814 0.5708 0.7555
No log 7.1556 322 0.6742 0.4707 0.6742 0.8211
No log 7.2 324 0.7239 0.4650 0.7239 0.8508
No log 7.2444 326 0.6701 0.5085 0.6701 0.8186
No log 7.2889 328 0.5780 0.4740 0.5780 0.7603
No log 7.3333 330 0.5603 0.5505 0.5603 0.7485
No log 7.3778 332 0.5748 0.5378 0.5748 0.7582
No log 7.4222 334 0.5739 0.5412 0.5739 0.7576
No log 7.4667 336 0.6281 0.5195 0.6281 0.7925
No log 7.5111 338 0.6240 0.5195 0.6240 0.7899
No log 7.5556 340 0.5732 0.5442 0.5732 0.7571
No log 7.6 342 0.5392 0.5218 0.5392 0.7343
No log 7.6444 344 0.5381 0.4617 0.5381 0.7335
No log 7.6889 346 0.5438 0.4700 0.5438 0.7374
No log 7.7333 348 0.5666 0.4704 0.5666 0.7527
No log 7.7778 350 0.5712 0.4618 0.5712 0.7558
No log 7.8222 352 0.6060 0.5042 0.6060 0.7785
No log 7.8667 354 0.5969 0.4764 0.5969 0.7726
No log 7.9111 356 0.5568 0.4358 0.5568 0.7462
No log 7.9556 358 0.5688 0.4928 0.5688 0.7542
No log 8.0 360 0.5831 0.5151 0.5831 0.7636
No log 8.0444 362 0.5914 0.4813 0.5914 0.7690
No log 8.0889 364 0.6427 0.5323 0.6427 0.8017
No log 8.1333 366 0.6908 0.5339 0.6908 0.8312
No log 8.1778 368 0.6346 0.4997 0.6346 0.7966
No log 8.2222 370 0.5895 0.5081 0.5895 0.7678
No log 8.2667 372 0.5632 0.5104 0.5632 0.7505
No log 8.3111 374 0.5668 0.4618 0.5668 0.7529
No log 8.3556 376 0.5726 0.4618 0.5726 0.7567
No log 8.4 378 0.5950 0.4855 0.5950 0.7713
No log 8.4444 380 0.5809 0.4597 0.5809 0.7622
No log 8.4889 382 0.5686 0.4875 0.5686 0.7541
No log 8.5333 384 0.5788 0.4855 0.5788 0.7608
No log 8.5778 386 0.6251 0.4491 0.6251 0.7907
No log 8.6222 388 0.6780 0.4808 0.6780 0.8234
No log 8.6667 390 0.6850 0.4808 0.6850 0.8276
No log 8.7111 392 0.6495 0.4808 0.6495 0.8059
No log 8.7556 394 0.5930 0.4576 0.5930 0.7701
No log 8.8 396 0.6070 0.4576 0.6070 0.7791
No log 8.8444 398 0.6282 0.4815 0.6282 0.7926
No log 8.8889 400 0.6252 0.4330 0.6252 0.7907
No log 8.9333 402 0.5755 0.4437 0.5755 0.7586
No log 8.9778 404 0.5529 0.3788 0.5529 0.7436
No log 9.0222 406 0.5591 0.4534 0.5591 0.7478
No log 9.0667 408 0.5736 0.4764 0.5736 0.7574
No log 9.1111 410 0.5525 0.5016 0.5525 0.7433
No log 9.1556 412 0.5197 0.4875 0.5197 0.7209
No log 9.2 414 0.5392 0.4769 0.5392 0.7343
No log 9.2444 416 0.5834 0.5258 0.5834 0.7638
No log 9.2889 418 0.5356 0.5327 0.5356 0.7319
No log 9.3333 420 0.4962 0.5159 0.4962 0.7044
No log 9.3778 422 0.5106 0.4660 0.5106 0.7146
No log 9.4222 424 0.5436 0.4352 0.5436 0.7373
No log 9.4667 426 0.6306 0.4224 0.6306 0.7941
No log 9.5111 428 0.6486 0.4144 0.6486 0.8053
No log 9.5556 430 0.6064 0.4329 0.6064 0.7787
No log 9.6 432 0.5158 0.4437 0.5158 0.7182
No log 9.6444 434 0.4962 0.4569 0.4962 0.7044
No log 9.6889 436 0.4924 0.5208 0.4924 0.7017
No log 9.7333 438 0.4929 0.5326 0.4929 0.7021
No log 9.7778 440 0.5071 0.4534 0.5071 0.7121
No log 9.8222 442 0.5961 0.4808 0.5961 0.7720
No log 9.8667 444 0.6489 0.4801 0.6489 0.8055
No log 9.9111 446 0.5880 0.4749 0.5880 0.7668
No log 9.9556 448 0.5277 0.4726 0.5277 0.7264
No log 10.0 450 0.5291 0.4569 0.5291 0.7274
No log 10.0444 452 0.5204 0.4637 0.5204 0.7214
No log 10.0889 454 0.5231 0.5034 0.5231 0.7233
No log 10.1333 456 0.5796 0.5042 0.5796 0.7613
No log 10.1778 458 0.5835 0.5042 0.5835 0.7639
No log 10.2222 460 0.5399 0.5498 0.5399 0.7348
No log 10.2667 462 0.5156 0.5698 0.5156 0.7180
No log 10.3111 464 0.5054 0.5076 0.5054 0.7109
No log 10.3556 466 0.5120 0.5939 0.5120 0.7155
No log 10.4 468 0.5499 0.5048 0.5499 0.7415
No log 10.4444 470 0.5720 0.4959 0.5720 0.7563
No log 10.4889 472 0.5780 0.4959 0.5780 0.7603
No log 10.5333 474 0.5795 0.5201 0.5795 0.7612
No log 10.5778 476 0.5715 0.5435 0.5715 0.7560
No log 10.6222 478 0.5517 0.4855 0.5517 0.7428
No log 10.6667 480 0.5252 0.4535 0.5252 0.7247
No log 10.7111 482 0.5232 0.4555 0.5232 0.7233
No log 10.7556 484 0.5401 0.5141 0.5401 0.7349
No log 10.8 486 0.6136 0.4723 0.6136 0.7833
No log 10.8444 488 0.6437 0.4296 0.6437 0.8023
No log 10.8889 490 0.6032 0.4741 0.6032 0.7766
No log 10.9333 492 0.5374 0.5411 0.5374 0.7331
No log 10.9778 494 0.5236 0.6377 0.5236 0.7236
No log 11.0222 496 0.5225 0.5611 0.5225 0.7229
No log 11.0667 498 0.5204 0.4923 0.5204 0.7214
0.3539 11.1111 500 0.5214 0.5268 0.5214 0.7221
0.3539 11.1556 502 0.5280 0.5022 0.5280 0.7267
0.3539 11.2 504 0.5268 0.5800 0.5268 0.7258
0.3539 11.2444 506 0.5120 0.5734 0.5120 0.7155
0.3539 11.2889 508 0.5219 0.5592 0.5219 0.7224
0.3539 11.3333 510 0.5242 0.5524 0.5242 0.7240
0.3539 11.3778 512 0.5101 0.5890 0.5101 0.7142
0.3539 11.4222 514 0.5153 0.5979 0.5153 0.7178
0.3539 11.4667 516 0.5385 0.6018 0.5385 0.7339
0.3539 11.5111 518 0.5895 0.5275 0.5895 0.7678
0.3539 11.5556 520 0.5685 0.5528 0.5685 0.7540
0.3539 11.6 522 0.5276 0.5768 0.5276 0.7263
0.3539 11.6444 524 0.5292 0.5768 0.5292 0.7275
0.3539 11.6889 526 0.5398 0.6807 0.5398 0.7347
0.3539 11.7333 528 0.5285 0.6346 0.5285 0.7270
0.3539 11.7778 530 0.5152 0.5768 0.5152 0.7178
0.3539 11.8222 532 0.5206 0.5768 0.5206 0.7215
0.3539 11.8667 534 0.5445 0.4985 0.5445 0.7379
0.3539 11.9111 536 0.5718 0.4704 0.5718 0.7562
0.3539 11.9556 538 0.6293 0.5190 0.6293 0.7933
0.3539 12.0 540 0.6452 0.5310 0.6452 0.8033
0.3539 12.0444 542 0.6876 0.5090 0.6876 0.8292
0.3539 12.0889 544 0.6746 0.5310 0.6746 0.8214
0.3539 12.1333 546 0.6367 0.5845 0.6367 0.7979
0.3539 12.1778 548 0.5958 0.5957 0.5958 0.7719
0.3539 12.2222 550 0.5508 0.5488 0.5508 0.7421
0.3539 12.2667 552 0.5456 0.4838 0.5456 0.7387
0.3539 12.3111 554 0.5373 0.4847 0.5373 0.7330
0.3539 12.3556 556 0.5565 0.4808 0.5565 0.7460
0.3539 12.4 558 0.5787 0.5016 0.5787 0.7607
0.3539 12.4444 560 0.5779 0.5468 0.5779 0.7602
0.3539 12.4889 562 0.5806 0.5468 0.5806 0.7620
0.3539 12.5333 564 0.6131 0.5624 0.6131 0.7830
0.3539 12.5778 566 0.5860 0.5624 0.5860 0.7655
0.3539 12.6222 568 0.5596 0.5909 0.5596 0.7481
0.3539 12.6667 570 0.5560 0.5909 0.5560 0.7457
0.3539 12.7111 572 0.5718 0.5817 0.5718 0.7562
0.3539 12.7556 574 0.6265 0.5362 0.6265 0.7915
0.3539 12.8 576 0.6322 0.5131 0.6322 0.7951
0.3539 12.8444 578 0.5873 0.4911 0.5873 0.7664
0.3539 12.8889 580 0.5675 0.4769 0.5675 0.7533
0.3539 12.9333 582 0.5688 0.4769 0.5688 0.7542
0.3539 12.9778 584 0.6037 0.4513 0.6037 0.7770

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k18_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02