ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5520
  • Qwk (quadratic weighted kappa): 0.4933
  • Mse (mean squared error): 0.5520
  • Rmse (root mean squared error): 0.7429
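The reported Qwk/Mse/Rmse can be recomputed from gold and predicted scores with a short pure-Python sketch (the function names are mine, not from the card; scikit-learn's `cohen_kappa_score(weights="quadratic")` computes the same kappa):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    # Observed confusion matrix.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal histograms, used for the expected matrix under independence.
    hist_t = [sum(1 for t in y_true if t == i) for i in range(n_classes)]
    hist_p = [sum(1 for p in y_pred if p == i) for i in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            expected = hist_t[i] * hist_p[j] / n
            num += w * O[i][j]
            den += w * expected
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error; mse is simply rmse(...) ** 2."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Perfect agreement gives a kappa of 1.0, and chance-level agreement gives 0.0, matching the sign conventions visible in the table below (early epochs even dip slightly negative).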

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
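The hyperparameters above map directly onto `transformers.TrainingArguments` fields. A minimal sketch of the implied setup follows; the `output_dir`, dataset wiring, and metric code are placeholders I have added, since the card does not document them:

```python
# Sketch only: reconstructs the documented hyperparameters in Trainer form.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

args = TrainingArguments(
    output_dir="arabert-task7-organization",  # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,       # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
```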

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0317 2 2.5793 -0.0593 2.5793 1.6060
No log 0.0635 4 1.1418 0.1253 1.1418 1.0686
No log 0.0952 6 0.8541 0.0535 0.8541 0.9242
No log 0.1270 8 1.4824 -0.2776 1.4824 1.2175
No log 0.1587 10 1.4603 -0.2256 1.4603 1.2084
No log 0.1905 12 0.9961 -0.0831 0.9961 0.9980
No log 0.2222 14 0.8001 0.0 0.8001 0.8945
No log 0.2540 16 0.8418 0.2527 0.8418 0.9175
No log 0.2857 18 0.7895 0.0 0.7895 0.8885
No log 0.3175 20 1.0083 0.0123 1.0083 1.0041
No log 0.3492 22 1.1491 0.0651 1.1491 1.0720
No log 0.3810 24 0.9603 0.0053 0.9603 0.9800
No log 0.4127 26 0.8089 0.0 0.8089 0.8994
No log 0.4444 28 0.7363 0.0 0.7363 0.8581
No log 0.4762 30 0.6947 0.1922 0.6947 0.8335
No log 0.5079 32 0.6779 0.1922 0.6779 0.8233
No log 0.5397 34 0.7882 0.0947 0.7882 0.8878
No log 0.5714 36 1.1216 0.2335 1.1216 1.0591
No log 0.6032 38 1.5564 -0.0217 1.5564 1.2476
No log 0.6349 40 1.5254 0.0 1.5254 1.2351
No log 0.6667 42 1.1670 0.2679 1.1670 1.0803
No log 0.6984 44 0.8705 0.1365 0.8705 0.9330
No log 0.7302 46 0.7984 0.0 0.7984 0.8935
No log 0.7619 48 0.7297 0.1617 0.7297 0.8542
No log 0.7937 50 0.7233 0.1617 0.7233 0.8505
No log 0.8254 52 0.7166 0.0428 0.7166 0.8465
No log 0.8571 54 0.7206 0.0 0.7206 0.8489
No log 0.8889 56 0.8169 0.1770 0.8169 0.9038
No log 0.9206 58 0.8134 0.2841 0.8134 0.9019
No log 0.9524 60 0.6959 0.2156 0.6959 0.8342
No log 0.9841 62 0.6665 0.1770 0.6665 0.8164
No log 1.0159 64 0.7695 0.2464 0.7695 0.8772
No log 1.0476 66 0.9320 0.3206 0.9320 0.9654
No log 1.0794 68 0.9653 0.3410 0.9653 0.9825
No log 1.1111 70 0.7847 0.3051 0.7847 0.8859
No log 1.1429 72 0.6635 0.2490 0.6635 0.8145
No log 1.1746 74 0.6023 0.2857 0.6023 0.7761
No log 1.2063 76 0.6292 0.3216 0.6292 0.7932
No log 1.2381 78 0.6674 0.3536 0.6674 0.8170
No log 1.2698 80 0.6567 0.1786 0.6567 0.8103
No log 1.3016 82 0.7276 0.1786 0.7276 0.8530
No log 1.3333 84 0.7639 0.2879 0.7639 0.8740
No log 1.3651 86 0.7543 0.2558 0.7543 0.8685
No log 1.3968 88 0.7538 0.2181 0.7538 0.8682
No log 1.4286 90 0.7939 0.2526 0.7939 0.8910
No log 1.4603 92 0.8404 0.2804 0.8404 0.9167
No log 1.4921 94 0.8083 0.2464 0.8083 0.8991
No log 1.5238 96 0.7930 0.2156 0.7930 0.8905
No log 1.5556 98 0.7634 0.2879 0.7634 0.8737
No log 1.5873 100 0.8372 0.2435 0.8372 0.9150
No log 1.6190 102 0.8293 0.2769 0.8293 0.9107
No log 1.6508 104 0.8369 0.2356 0.8369 0.9148
No log 1.6825 106 0.8480 0.2942 0.8480 0.9209
No log 1.7143 108 0.8106 0.2977 0.8106 0.9004
No log 1.7460 110 0.6938 0.3112 0.6938 0.8329
No log 1.7778 112 0.6509 0.1942 0.6509 0.8068
No log 1.8095 114 0.6511 0.1829 0.6511 0.8069
No log 1.8413 116 0.6453 0.1829 0.6453 0.8033
No log 1.8730 118 0.6472 0.3228 0.6472 0.8045
No log 1.9048 120 0.6300 0.2041 0.6300 0.7938
No log 1.9365 122 0.6999 0.3464 0.6999 0.8366
No log 1.9683 124 0.8302 0.3425 0.8302 0.9111
No log 2.0 126 0.8508 0.2676 0.8508 0.9224
No log 2.0317 128 0.7340 0.3484 0.7340 0.8567
No log 2.0635 130 0.6304 0.3289 0.6304 0.7940
No log 2.0952 132 0.6242 0.2783 0.6242 0.7900
No log 2.1270 134 0.6235 0.2751 0.6235 0.7896
No log 2.1587 136 0.5986 0.3081 0.5986 0.7737
No log 2.1905 138 0.6858 0.4630 0.6858 0.8281
No log 2.2222 140 0.8341 0.4228 0.8341 0.9133
No log 2.2540 142 0.7925 0.3761 0.7925 0.8902
No log 2.2857 144 0.6111 0.5050 0.6111 0.7817
No log 2.3175 146 0.5397 0.5044 0.5397 0.7346
No log 2.3492 148 0.5738 0.5556 0.5738 0.7575
No log 2.3810 150 0.6857 0.4448 0.6857 0.8281
No log 2.4127 152 0.7058 0.4032 0.7058 0.8401
No log 2.4444 154 0.7101 0.4032 0.7101 0.8427
No log 2.4762 156 0.7430 0.3988 0.7430 0.8620
No log 2.5079 158 0.7688 0.3988 0.7688 0.8768
No log 2.5397 160 0.7286 0.4122 0.7286 0.8536
No log 2.5714 162 0.7189 0.4170 0.7189 0.8479
No log 2.6032 164 0.7324 0.4076 0.7324 0.8558
No log 2.6349 166 0.7400 0.4122 0.7400 0.8603
No log 2.6667 168 0.6836 0.4359 0.6836 0.8268
No log 2.6984 170 0.5696 0.5266 0.5696 0.7547
No log 2.7302 172 0.5456 0.5107 0.5456 0.7386
No log 2.7619 174 0.5685 0.4915 0.5685 0.7540
No log 2.7937 176 0.6386 0.5498 0.6386 0.7991
No log 2.8254 178 0.6433 0.5299 0.6433 0.8021
No log 2.8571 180 0.7076 0.4926 0.7076 0.8412
No log 2.8889 182 0.6997 0.4926 0.6997 0.8365
No log 2.9206 184 0.5970 0.5877 0.5970 0.7727
No log 2.9524 186 0.5485 0.5235 0.5485 0.7406
No log 2.9841 188 0.5434 0.5460 0.5434 0.7372
No log 3.0159 190 0.5866 0.6396 0.5866 0.7659
No log 3.0476 192 0.6186 0.5445 0.6186 0.7865
No log 3.0794 194 0.6265 0.5513 0.6265 0.7915
No log 3.1111 196 0.6452 0.5378 0.6452 0.8032
No log 3.1429 198 0.5760 0.5368 0.5760 0.7589
No log 3.1746 200 0.5923 0.4758 0.5923 0.7696
No log 3.2063 202 0.6071 0.4029 0.6071 0.7792
No log 3.2381 204 0.6041 0.5304 0.6041 0.7772
No log 3.2698 206 0.7009 0.4706 0.7009 0.8372
No log 3.3016 208 0.8925 0.3866 0.8925 0.9447
No log 3.3333 210 0.8815 0.3635 0.8815 0.9389
No log 3.3651 212 0.7209 0.3974 0.7209 0.8490
No log 3.3968 214 0.6529 0.4517 0.6529 0.8080
No log 3.4286 216 0.6582 0.4140 0.6582 0.8113
No log 3.4603 218 0.7092 0.3387 0.7092 0.8421
No log 3.4921 220 0.7434 0.3872 0.7434 0.8622
No log 3.5238 222 0.6964 0.3851 0.6964 0.8345
No log 3.5556 224 0.6090 0.5492 0.6090 0.7804
No log 3.5873 226 0.6071 0.5143 0.6071 0.7792
No log 3.6190 228 0.6412 0.4636 0.6412 0.8007
No log 3.6508 230 0.8368 0.4159 0.8368 0.9148
No log 3.6825 232 0.9104 0.3993 0.9104 0.9541
No log 3.7143 234 0.8514 0.4033 0.8514 0.9227
No log 3.7460 236 0.7076 0.4448 0.7076 0.8412
No log 3.7778 238 0.5983 0.4722 0.5983 0.7735
No log 3.8095 240 0.6011 0.4569 0.6011 0.7753
No log 3.8413 242 0.5873 0.4857 0.5873 0.7664
No log 3.8730 244 0.5893 0.5687 0.5893 0.7677
No log 3.9048 246 0.5707 0.5600 0.5707 0.7555
No log 3.9365 248 0.5611 0.5522 0.5611 0.7491
No log 3.9683 250 0.5803 0.5092 0.5803 0.7618
No log 4.0 252 0.5903 0.5201 0.5903 0.7683
No log 4.0317 254 0.5971 0.5320 0.5971 0.7727
No log 4.0635 256 0.5794 0.5988 0.5794 0.7612
No log 4.0952 258 0.5522 0.5430 0.5522 0.7431
No log 4.1270 260 0.5606 0.5627 0.5606 0.7487
No log 4.1587 262 0.6711 0.5122 0.6711 0.8192
No log 4.1905 264 0.6472 0.5122 0.6472 0.8045
No log 4.2222 266 0.5445 0.5596 0.5445 0.7379
No log 4.2540 268 0.5761 0.5653 0.5761 0.7590
No log 4.2857 270 0.6004 0.5059 0.6004 0.7748
No log 4.3175 272 0.5686 0.5648 0.5686 0.7540
No log 4.3492 274 0.5850 0.4829 0.5850 0.7648
No log 4.3810 276 0.5929 0.4547 0.5929 0.7700
No log 4.4127 278 0.6001 0.4013 0.6001 0.7747
No log 4.4444 280 0.6443 0.4451 0.6443 0.8027
No log 4.4762 282 0.7600 0.3466 0.7600 0.8718
No log 4.5079 284 0.7775 0.3411 0.7775 0.8818
No log 4.5397 286 0.6477 0.4272 0.6477 0.8048
No log 4.5714 288 0.5570 0.5556 0.5570 0.7463
No log 4.6032 290 0.6874 0.4279 0.6874 0.8291
No log 4.6349 292 0.7937 0.3965 0.7937 0.8909
No log 4.6667 294 0.7461 0.4396 0.7461 0.8638
No log 4.6984 296 0.6273 0.5046 0.6273 0.7920
No log 4.7302 298 0.5377 0.5782 0.5377 0.7333
No log 4.7619 300 0.5827 0.5177 0.5827 0.7633
No log 4.7937 302 0.5775 0.5177 0.5775 0.7599
No log 4.8254 304 0.5291 0.5344 0.5291 0.7274
No log 4.8571 306 0.5534 0.5406 0.5534 0.7439
No log 4.8889 308 0.5657 0.5335 0.5657 0.7521
No log 4.9206 310 0.5439 0.5079 0.5439 0.7375
No log 4.9524 312 0.5439 0.5061 0.5439 0.7375
No log 4.9841 314 0.5439 0.5152 0.5439 0.7375
No log 5.0159 316 0.5489 0.4876 0.5489 0.7409
No log 5.0476 318 0.5309 0.4595 0.5309 0.7286
No log 5.0794 320 0.5327 0.4923 0.5327 0.7298
No log 5.1111 322 0.5414 0.4858 0.5414 0.7358
No log 5.1429 324 0.5389 0.5075 0.5389 0.7341
No log 5.1746 326 0.5385 0.5075 0.5385 0.7339
No log 5.2063 328 0.5459 0.5826 0.5459 0.7388
No log 5.2381 330 0.5377 0.5725 0.5377 0.7333
No log 5.2698 332 0.5558 0.6141 0.5558 0.7455
No log 5.3016 334 0.5415 0.5878 0.5415 0.7358
No log 5.3333 336 0.5606 0.5898 0.5606 0.7487
No log 5.3651 338 0.6813 0.5139 0.6813 0.8254
No log 5.3968 340 0.7337 0.4504 0.7337 0.8566
No log 5.4286 342 0.6653 0.4946 0.6653 0.8157
No log 5.4603 344 0.5450 0.5881 0.5450 0.7382
No log 5.4921 346 0.5356 0.5816 0.5356 0.7319
No log 5.5238 348 0.5548 0.5362 0.5548 0.7448
No log 5.5556 350 0.5560 0.5609 0.5560 0.7456
No log 5.5873 352 0.5660 0.5656 0.5660 0.7524
No log 5.6190 354 0.5807 0.5672 0.5807 0.7621
No log 5.6508 356 0.5754 0.5476 0.5754 0.7586
No log 5.6825 358 0.5710 0.5662 0.5710 0.7556
No log 5.7143 360 0.5613 0.5460 0.5613 0.7492
No log 5.7460 362 0.5679 0.5831 0.5679 0.7536
No log 5.7778 364 0.6011 0.4729 0.6011 0.7753
No log 5.8095 366 0.6288 0.4672 0.6288 0.7929
No log 5.8413 368 0.5964 0.5212 0.5964 0.7722
No log 5.8730 370 0.5645 0.5488 0.5645 0.7513
No log 5.9048 372 0.5625 0.5386 0.5625 0.7500
No log 5.9365 374 0.5506 0.5800 0.5506 0.7420
No log 5.9683 376 0.5922 0.4850 0.5922 0.7696
No log 6.0 378 0.6625 0.4556 0.6625 0.8140
No log 6.0317 380 0.6681 0.4377 0.6681 0.8174
No log 6.0635 382 0.6036 0.5031 0.6036 0.7769
No log 6.0952 384 0.5493 0.5797 0.5493 0.7411
No log 6.1270 386 0.5711 0.6075 0.5711 0.7557
No log 6.1587 388 0.5688 0.6187 0.5688 0.7542
No log 6.1905 390 0.5463 0.5480 0.5463 0.7391
No log 6.2222 392 0.5838 0.5363 0.5838 0.7641
No log 6.2540 394 0.6084 0.5696 0.6084 0.7800
No log 6.2857 396 0.5950 0.5500 0.5950 0.7713
No log 6.3175 398 0.5404 0.6146 0.5404 0.7351
No log 6.3492 400 0.5348 0.5974 0.5348 0.7313
No log 6.3810 402 0.5355 0.6426 0.5355 0.7318
No log 6.4127 404 0.5438 0.5057 0.5438 0.7375
No log 6.4444 406 0.5753 0.5438 0.5753 0.7585
No log 6.4762 408 0.6253 0.4537 0.6253 0.7908
No log 6.5079 410 0.6073 0.5233 0.6073 0.7793
No log 6.5397 412 0.5565 0.5642 0.5565 0.7460
No log 6.5714 414 0.5508 0.5953 0.5508 0.7422
No log 6.6032 416 0.5542 0.5840 0.5542 0.7444
No log 6.6349 418 0.5535 0.6105 0.5535 0.7440
No log 6.6667 420 0.5482 0.6105 0.5482 0.7404
No log 6.6984 422 0.5457 0.6105 0.5457 0.7387
No log 6.7302 424 0.5495 0.6301 0.5495 0.7413
No log 6.7619 426 0.5536 0.6301 0.5536 0.7441
No log 6.7937 428 0.5588 0.6301 0.5588 0.7476
No log 6.8254 430 0.5505 0.6105 0.5505 0.7420
No log 6.8571 432 0.5402 0.6105 0.5402 0.7350
No log 6.8889 434 0.5646 0.5956 0.5646 0.7514
No log 6.9206 436 0.6517 0.4874 0.6517 0.8073
No log 6.9524 438 0.6539 0.4874 0.6539 0.8087
No log 6.9841 440 0.5855 0.5152 0.5855 0.7652
No log 7.0159 442 0.5271 0.5649 0.5271 0.7260
No log 7.0476 444 0.5618 0.5605 0.5618 0.7496
No log 7.0794 446 0.5804 0.5403 0.5804 0.7618
No log 7.1111 448 0.5318 0.6303 0.5318 0.7292
No log 7.1429 450 0.5277 0.6567 0.5277 0.7265
No log 7.1746 452 0.5865 0.4729 0.5865 0.7658
No log 7.2063 454 0.6181 0.4562 0.6181 0.7862
No log 7.2381 456 0.5808 0.4672 0.5808 0.7621
No log 7.2698 458 0.5345 0.6489 0.5345 0.7311
No log 7.3016 460 0.5179 0.5734 0.5179 0.7197
No log 7.3333 462 0.5121 0.6330 0.5121 0.7156
No log 7.3651 464 0.5057 0.5868 0.5057 0.7111
No log 7.3968 466 0.4949 0.6330 0.4949 0.7035
No log 7.4286 468 0.4930 0.5446 0.4930 0.7021
No log 7.4603 470 0.5021 0.6197 0.5021 0.7086
No log 7.4921 472 0.5009 0.5826 0.5009 0.7078
No log 7.5238 474 0.4973 0.6105 0.4973 0.7052
No log 7.5556 476 0.4996 0.6289 0.4996 0.7068
No log 7.5873 478 0.5040 0.6639 0.5040 0.7099
No log 7.6190 480 0.5070 0.6620 0.5070 0.7121
No log 7.6508 482 0.5036 0.6620 0.5036 0.7096
No log 7.6825 484 0.4900 0.6184 0.4900 0.7000
No log 7.7143 486 0.4933 0.6292 0.4933 0.7023
No log 7.7460 488 0.4901 0.6170 0.4901 0.7000
No log 7.7778 490 0.4918 0.6426 0.4918 0.7013
No log 7.8095 492 0.5172 0.5768 0.5172 0.7192
No log 7.8413 494 0.5016 0.5995 0.5016 0.7082
No log 7.8730 496 0.4986 0.6168 0.4986 0.7061
No log 7.9048 498 0.5193 0.5438 0.5193 0.7206
0.3681 7.9365 500 0.5129 0.5584 0.5129 0.7162
0.3681 7.9683 502 0.4851 0.6601 0.4851 0.6965
0.3681 8.0 504 0.4808 0.6426 0.4808 0.6934
0.3681 8.0317 506 0.4843 0.6435 0.4843 0.6959
0.3681 8.0635 508 0.5050 0.5307 0.5050 0.7106
0.3681 8.0952 510 0.5273 0.4534 0.5273 0.7261
0.3681 8.1270 512 0.5413 0.4371 0.5413 0.7357
0.3681 8.1587 514 0.5816 0.4801 0.5816 0.7626
0.3681 8.1905 516 0.5520 0.4933 0.5520 0.7429

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
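A hedged inference sketch: the model id is taken from the title, but the card does not document the label encoding, so the handling of the output logits is an assumption (the MSE/RMSE/QWK metrics suggest ordinal essay scores):

```python
# Loads the fine-tuned checkpoint from the Hub (requires network access).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k19_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # an Arabic essay to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # raw score(s); post-processing depends on the undocumented label setup
```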
Model size

  • Format: Safetensors
  • Parameters: 0.1B
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k19_task7_organization

  • Base model: aubmindlab/bert-base-arabertv02