ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k17_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.7466
  • Qwk (Quadratic Weighted Kappa): 0.4371
  • Mse: 0.7466
  • Rmse: 0.8641
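
The exact evaluation pipeline for this model is not published. As an illustration only, metrics of this kind can be computed with scikit-learn as below; the rounding of regression outputs to integer scores before computing QWK is an assumption, not something documented for this checkpoint.

```python
# Minimal sketch (not this model's own evaluation code): computing Qwk, Mse, Rmse
# from gold scores and model predictions using scikit-learn and NumPy.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 2])            # hypothetical gold organization scores
y_pred = np.array([2.2, 2.8, 1.4, 3.6, 2.1])  # hypothetical model outputs (regression)

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# QWK requires discrete labels, so predictions are rounded here (an assumption).
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```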

Model description

More information needed

Intended uses & limitations

More information needed
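
No official usage example is provided. Assuming the checkpoint carries a single-output regression head for scoring essay organization (consistent with the MSE/RMSE metrics reported above, but still an assumption), it can be loaded roughly as follows:

```python
# Hedged sketch: loading the checkpoint for inference with transformers.
# The regression head and score semantics are assumptions, not documented.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k17_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "..."  # placeholder for an Arabic essay
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
# If the head is a single-output regression head (assumed), the logit is the score;
# otherwise fall back to the argmax over class logits.
score = logits.squeeze(-1).item() if logits.shape[-1] == 1 else logits.argmax(-1).item()
print(score)
```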

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a rough equivalent expressed as transformers TrainingArguments is sketched after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
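
Expressed as transformers TrainingArguments, these settings correspond roughly to the sketch below. This is not the original training script; dataset loading, the model head, and the compute_metrics function are omitted, and the output path and evaluation/logging cadence are assumptions (the cadence is inferred from the results table, which evaluates every 2 steps and logs training loss at step 500).

```python
# Rough TrainingArguments equivalent of the hyperparameters listed above (sketch only).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # evaluation every 2 steps, as in the results table
    eval_steps=2,
    logging_steps=500,      # first logged training loss appears at step 500
)
```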

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0370 2 4.1409 0.0079 4.1409 2.0349
No log 0.0741 4 2.3485 0.0930 2.3485 1.5325
No log 0.1111 6 2.1782 -0.1317 2.1782 1.4759
No log 0.1481 8 1.9881 -0.1043 1.9881 1.4100
No log 0.1852 10 1.7413 -0.0458 1.7413 1.3196
No log 0.2222 12 1.4026 0.0 1.4026 1.1843
No log 0.2593 14 1.5788 0.0279 1.5788 1.2565
No log 0.2963 16 1.5705 0.0057 1.5705 1.2532
No log 0.3333 18 1.4566 -0.0113 1.4566 1.2069
No log 0.3704 20 1.3311 0.0 1.3311 1.1538
No log 0.4074 22 1.3655 0.0408 1.3655 1.1685
No log 0.4444 24 1.2927 0.0408 1.2927 1.1370
No log 0.4815 26 1.1721 0.1113 1.1721 1.0827
No log 0.5185 28 1.0987 0.2496 1.0987 1.0482
No log 0.5556 30 1.0930 0.2304 1.0930 1.0455
No log 0.5926 32 1.1452 0.2644 1.1452 1.0701
No log 0.6296 34 1.2809 0.2174 1.2809 1.1318
No log 0.6667 36 1.1324 0.2796 1.1324 1.0641
No log 0.7037 38 1.1353 0.2754 1.1353 1.0655
No log 0.7407 40 1.2678 0.2476 1.2678 1.1260
No log 0.7778 42 1.2868 0.1083 1.2868 1.1344
No log 0.8148 44 1.2355 0.0759 1.2355 1.1115
No log 0.8519 46 1.1433 0.1423 1.1433 1.0692
No log 0.8889 48 1.0300 0.3478 1.0300 1.0149
No log 0.9259 50 1.0138 0.3265 1.0138 1.0069
No log 0.9630 52 1.0569 0.2805 1.0569 1.0281
No log 1.0 54 1.0057 0.2935 1.0057 1.0028
No log 1.0370 56 0.9757 0.3288 0.9757 0.9878
No log 1.0741 58 0.9776 0.3071 0.9776 0.9887
No log 1.1111 60 0.9874 0.3498 0.9874 0.9937
No log 1.1481 62 0.9946 0.3326 0.9946 0.9973
No log 1.1852 64 0.9607 0.2849 0.9607 0.9801
No log 1.2222 66 0.9259 0.3304 0.9259 0.9622
No log 1.2593 68 0.9103 0.3540 0.9103 0.9541
No log 1.2963 70 0.9895 0.2687 0.9895 0.9947
No log 1.3333 72 1.1305 0.2799 1.1305 1.0632
No log 1.3704 74 1.2104 0.3185 1.2104 1.1002
No log 1.4074 76 1.3520 0.2243 1.3520 1.1628
No log 1.4444 78 1.6157 0.1128 1.6157 1.2711
No log 1.4815 80 1.5998 0.1039 1.5998 1.2648
No log 1.5185 82 1.3648 0.1685 1.3648 1.1682
No log 1.5556 84 1.0289 0.3340 1.0289 1.0143
No log 1.5926 86 0.9177 0.3646 0.9177 0.9579
No log 1.6296 88 0.8948 0.3774 0.8948 0.9459
No log 1.6667 90 0.9332 0.3561 0.9332 0.9660
No log 1.7037 92 0.9334 0.3561 0.9334 0.9661
No log 1.7407 94 0.8853 0.3774 0.8853 0.9409
No log 1.7778 96 0.8677 0.3733 0.8677 0.9315
No log 1.8148 98 1.0056 0.3885 1.0056 1.0028
No log 1.8519 100 1.0988 0.3249 1.0988 1.0483
No log 1.8889 102 1.0559 0.2857 1.0559 1.0276
No log 1.9259 104 0.9359 0.3700 0.9359 0.9674
No log 1.9630 106 0.8925 0.3616 0.8925 0.9447
No log 2.0 108 0.9251 0.3038 0.9251 0.9618
No log 2.0370 110 0.9047 0.3781 0.9047 0.9512
No log 2.0741 112 0.8510 0.4819 0.8510 0.9225
No log 2.1111 114 0.8787 0.4424 0.8787 0.9374
No log 2.1481 116 0.9502 0.3902 0.9502 0.9748
No log 2.1852 118 0.8703 0.3902 0.8703 0.9329
No log 2.2222 120 0.8119 0.4321 0.8119 0.9010
No log 2.2593 122 0.9713 0.4907 0.9713 0.9855
No log 2.2963 124 1.0150 0.4141 1.0150 1.0075
No log 2.3333 126 0.9087 0.5160 0.9087 0.9532
No log 2.3704 128 0.8214 0.3966 0.8214 0.9063
No log 2.4074 130 0.8214 0.4354 0.8214 0.9063
No log 2.4444 132 0.8122 0.3505 0.8122 0.9012
No log 2.4815 134 0.8674 0.4237 0.8674 0.9314
No log 2.5185 136 0.9155 0.3939 0.9155 0.9568
No log 2.5556 138 0.9768 0.3551 0.9768 0.9883
No log 2.5926 140 0.9487 0.3939 0.9487 0.9740
No log 2.6296 142 0.9730 0.3444 0.9730 0.9864
No log 2.6667 144 1.0512 0.3590 1.0512 1.0253
No log 2.7037 146 1.0519 0.3954 1.0519 1.0256
No log 2.7407 148 0.9675 0.3954 0.9675 0.9836
No log 2.7778 150 0.9071 0.3753 0.9071 0.9524
No log 2.8148 152 0.8358 0.4241 0.8358 0.9142
No log 2.8519 154 0.8106 0.4601 0.8106 0.9003
No log 2.8889 156 0.7956 0.3823 0.7956 0.8920
No log 2.9259 158 0.7927 0.4550 0.7927 0.8904
No log 2.9630 160 0.7953 0.4816 0.7953 0.8918
No log 3.0 162 0.8074 0.4277 0.8074 0.8985
No log 3.0370 164 0.7982 0.4565 0.7982 0.8934
No log 3.0741 166 0.8111 0.4865 0.8111 0.9006
No log 3.1111 168 0.8611 0.4929 0.8611 0.9279
No log 3.1481 170 0.8545 0.4075 0.8545 0.9244
No log 3.1852 172 0.8347 0.3821 0.8347 0.9136
No log 3.2222 174 0.8074 0.3797 0.8074 0.8985
No log 3.2593 176 0.8080 0.3236 0.8080 0.8989
No log 3.2963 178 0.7858 0.3797 0.7858 0.8865
No log 3.3333 180 0.7579 0.4675 0.7579 0.8706
No log 3.3704 182 0.7542 0.5192 0.7542 0.8684
No log 3.4074 184 0.7522 0.5622 0.7522 0.8673
No log 3.4444 186 0.7721 0.5263 0.7721 0.8787
No log 3.4815 188 0.8044 0.5459 0.8044 0.8969
No log 3.5185 190 0.8224 0.5305 0.8224 0.9069
No log 3.5556 192 0.8278 0.5642 0.8278 0.9098
No log 3.5926 194 0.7893 0.4784 0.7893 0.8884
No log 3.6296 196 0.7770 0.5030 0.7770 0.8815
No log 3.6667 198 0.7886 0.4749 0.7886 0.8880
No log 3.7037 200 0.7561 0.4691 0.7561 0.8696
No log 3.7407 202 0.7568 0.5330 0.7568 0.8699
No log 3.7778 204 0.8257 0.5549 0.8257 0.9087
No log 3.8148 206 0.8218 0.5746 0.8218 0.9065
No log 3.8519 208 0.7688 0.5727 0.7688 0.8768
No log 3.8889 210 0.7457 0.5905 0.7457 0.8635
No log 3.9259 212 0.7552 0.5753 0.7552 0.8690
No log 3.9630 214 0.6862 0.6397 0.6862 0.8284
No log 4.0 216 0.8273 0.4695 0.8273 0.9095
No log 4.0370 218 0.8294 0.4575 0.8294 0.9107
No log 4.0741 220 0.7794 0.4575 0.7794 0.8828
No log 4.1111 222 0.7074 0.6143 0.7074 0.8410
No log 4.1481 224 0.6952 0.5774 0.6952 0.8338
No log 4.1852 226 0.7033 0.6305 0.7033 0.8386
No log 4.2222 228 0.7329 0.5602 0.7329 0.8561
No log 4.2593 230 0.8635 0.5283 0.8635 0.9292
No log 4.2963 232 1.0430 0.4308 1.0430 1.0213
No log 4.3333 234 1.0575 0.4105 1.0575 1.0283
No log 4.3704 236 0.9123 0.4982 0.9123 0.9551
No log 4.4074 238 0.7532 0.5516 0.7532 0.8679
No log 4.4444 240 0.7333 0.4524 0.7333 0.8564
No log 4.4815 242 0.7475 0.3859 0.7475 0.8646
No log 4.5185 244 0.7955 0.4595 0.7955 0.8919
No log 4.5556 246 0.8963 0.4004 0.8963 0.9468
No log 4.5926 248 0.9070 0.4144 0.9070 0.9523
No log 4.6296 250 0.8632 0.3443 0.8632 0.9291
No log 4.6667 252 0.8281 0.3760 0.8281 0.9100
No log 4.7037 254 0.7824 0.3859 0.7824 0.8846
No log 4.7407 256 0.8195 0.3983 0.8195 0.9053
No log 4.7778 258 0.8911 0.4254 0.8911 0.9440
No log 4.8148 260 0.9232 0.4186 0.9232 0.9608
No log 4.8519 262 0.9484 0.4519 0.9484 0.9739
No log 4.8889 264 0.9416 0.4954 0.9416 0.9704
No log 4.9259 266 0.9289 0.4869 0.9289 0.9638
No log 4.9630 268 0.9445 0.4414 0.9445 0.9719
No log 5.0 270 0.9890 0.4469 0.9890 0.9945
No log 5.0370 272 0.8912 0.3952 0.8912 0.9440
No log 5.0741 274 0.8413 0.3455 0.8413 0.9172
No log 5.1111 276 0.8427 0.4368 0.8427 0.9180
No log 5.1481 278 0.8265 0.4110 0.8265 0.9091
No log 5.1852 280 0.8106 0.3074 0.8106 0.9003
No log 5.2222 282 0.8144 0.3403 0.8144 0.9024
No log 5.2593 284 0.8241 0.3528 0.8241 0.9078
No log 5.2963 286 0.8394 0.3506 0.8394 0.9162
No log 5.3333 288 0.8461 0.3214 0.8461 0.9198
No log 5.3704 290 0.8835 0.4301 0.8835 0.9399
No log 5.4074 292 0.8847 0.4568 0.8847 0.9406
No log 5.4444 294 0.8213 0.4599 0.8213 0.9063
No log 5.4815 296 0.7865 0.4888 0.7865 0.8869
No log 5.5185 298 0.8013 0.4813 0.8013 0.8952
No log 5.5556 300 0.8017 0.4676 0.8017 0.8954
No log 5.5926 302 0.8306 0.5410 0.8306 0.9114
No log 5.6296 304 0.9094 0.5012 0.9094 0.9536
No log 5.6667 306 0.9216 0.4405 0.9216 0.9600
No log 5.7037 308 0.8366 0.5740 0.8366 0.9147
No log 5.7407 310 0.7923 0.5522 0.7923 0.8901
No log 5.7778 312 0.7812 0.5066 0.7812 0.8839
No log 5.8148 314 0.7711 0.4932 0.7711 0.8781
No log 5.8519 316 0.7649 0.4869 0.7649 0.8746
No log 5.8889 318 0.8030 0.5318 0.8030 0.8961
No log 5.9259 320 0.8099 0.5093 0.8099 0.8999
No log 5.9630 322 0.7844 0.4494 0.7844 0.8857
No log 6.0 324 0.7892 0.3446 0.7892 0.8884
No log 6.0370 326 0.8332 0.4355 0.8332 0.9128
No log 6.0741 328 0.8367 0.4493 0.8367 0.9147
No log 6.1111 330 0.7883 0.3800 0.7883 0.8879
No log 6.1481 332 0.7847 0.4519 0.7847 0.8858
No log 6.1852 334 0.8055 0.5195 0.8055 0.8975
No log 6.2222 336 0.8034 0.5183 0.8034 0.8963
No log 6.2593 338 0.8285 0.5622 0.8285 0.9102
No log 6.2963 340 0.8079 0.5833 0.8079 0.8988
No log 6.3333 342 0.7978 0.5833 0.7978 0.8932
No log 6.3704 344 0.7639 0.5833 0.7639 0.8740
No log 6.4074 346 0.7662 0.5516 0.7662 0.8753
No log 6.4444 348 0.7595 0.4858 0.7595 0.8715
No log 6.4815 350 0.7433 0.4223 0.7433 0.8621
No log 6.5185 352 0.7475 0.4223 0.7475 0.8646
No log 6.5556 354 0.7591 0.5218 0.7591 0.8713
No log 6.5926 356 0.7802 0.6262 0.7802 0.8833
No log 6.6296 358 0.7351 0.6137 0.7351 0.8574
No log 6.6667 360 0.7395 0.6109 0.7395 0.8600
No log 6.7037 362 0.8046 0.5411 0.8046 0.8970
No log 6.7407 364 0.7927 0.5521 0.7927 0.8903
No log 6.7778 366 0.7544 0.6230 0.7544 0.8686
No log 6.8148 368 0.7465 0.5521 0.7465 0.8640
No log 6.8519 370 0.7820 0.5387 0.7820 0.8843
No log 6.8889 372 0.8086 0.5253 0.8086 0.8992
No log 6.9259 374 0.8278 0.5241 0.8278 0.9098
No log 6.9630 376 0.7689 0.5400 0.7689 0.8769
No log 7.0 378 0.7554 0.5540 0.7554 0.8692
No log 7.0370 380 0.7480 0.5540 0.7480 0.8648
No log 7.0741 382 0.7243 0.4981 0.7243 0.8511
No log 7.1111 384 0.7162 0.4537 0.7162 0.8463
No log 7.1481 386 0.7221 0.3915 0.7221 0.8498
No log 7.1852 388 0.7105 0.5388 0.7105 0.8429
No log 7.2222 390 0.7309 0.5740 0.7309 0.8549
No log 7.2593 392 0.8228 0.4894 0.8228 0.9071
No log 7.2963 394 0.7994 0.5636 0.7994 0.8941
No log 7.3333 396 0.7082 0.5959 0.7082 0.8415
No log 7.3704 398 0.7407 0.5336 0.7407 0.8606
No log 7.4074 400 0.7843 0.4493 0.7843 0.8856
No log 7.4444 402 0.7279 0.4379 0.7279 0.8532
No log 7.4815 404 0.7366 0.4466 0.7366 0.8582
No log 7.5185 406 0.7966 0.5178 0.7966 0.8925
No log 7.5556 408 0.7864 0.4697 0.7864 0.8868
No log 7.5926 410 0.7510 0.4209 0.7510 0.8666
No log 7.6296 412 0.7324 0.5084 0.7324 0.8558
No log 7.6667 414 0.7400 0.5305 0.7400 0.8602
No log 7.7037 416 0.7879 0.5266 0.7879 0.8876
No log 7.7407 418 0.9045 0.5283 0.9045 0.9510
No log 7.7778 420 0.9258 0.5465 0.9258 0.9622
No log 7.8148 422 0.8860 0.5094 0.8860 0.9413
No log 7.8519 424 0.8155 0.5318 0.8155 0.9031
No log 7.8889 426 0.8157 0.5579 0.8157 0.9032
No log 7.9259 428 0.8059 0.5579 0.8059 0.8977
No log 7.9630 430 0.8154 0.5579 0.8154 0.9030
No log 8.0 432 0.8038 0.5387 0.8038 0.8966
No log 8.0370 434 0.7901 0.4946 0.7901 0.8889
No log 8.0741 436 0.8000 0.5306 0.8000 0.8944
No log 8.1111 438 0.7770 0.4461 0.7770 0.8815
No log 8.1481 440 0.7743 0.4461 0.7743 0.8799
No log 8.1852 442 0.7490 0.4615 0.7490 0.8655
No log 8.2222 444 0.7537 0.5084 0.7537 0.8682
No log 8.2593 446 0.8244 0.5579 0.8244 0.9079
No log 8.2963 448 0.8499 0.5579 0.8499 0.9219
No log 8.3333 450 0.8043 0.4952 0.8043 0.8968
No log 8.3704 452 0.7322 0.4728 0.7322 0.8557
No log 8.4074 454 0.7138 0.4822 0.7138 0.8448
No log 8.4444 456 0.7182 0.4838 0.7182 0.8475
No log 8.4815 458 0.7138 0.4520 0.7138 0.8449
No log 8.5185 460 0.7431 0.4728 0.7431 0.8620
No log 8.5556 462 0.7542 0.4728 0.7542 0.8684
No log 8.5926 464 0.7336 0.4981 0.7336 0.8565
No log 8.6296 466 0.7256 0.5546 0.7256 0.8518
No log 8.6667 468 0.7653 0.5898 0.7653 0.8748
No log 8.7037 470 0.8640 0.5892 0.8640 0.9295
No log 8.7407 472 0.8839 0.6181 0.8839 0.9402
No log 8.7778 474 0.8071 0.5658 0.8071 0.8984
No log 8.8148 476 0.7297 0.5098 0.7297 0.8542
No log 8.8519 478 0.7086 0.5135 0.7086 0.8418
No log 8.8889 480 0.7192 0.4932 0.7192 0.8481
No log 8.9259 482 0.7219 0.5163 0.7219 0.8496
No log 8.9630 484 0.7265 0.5125 0.7265 0.8523
No log 9.0 486 0.7811 0.5718 0.7811 0.8838
No log 9.0370 488 0.8230 0.5482 0.8230 0.9072
No log 9.0741 490 0.7979 0.5718 0.7979 0.8932
No log 9.1111 492 0.7585 0.5339 0.7585 0.8709
No log 9.1481 494 0.7840 0.4775 0.7840 0.8855
No log 9.1852 496 0.8041 0.4801 0.8041 0.8967
No log 9.2222 498 0.7626 0.4720 0.7626 0.8733
0.3082 9.2593 500 0.7733 0.5098 0.7733 0.8794
0.3082 9.2963 502 0.8443 0.5356 0.8443 0.9188
0.3082 9.3333 504 0.8994 0.4255 0.8994 0.9484
0.3082 9.3704 506 0.8687 0.4444 0.8687 0.9320
0.3082 9.4074 508 0.8044 0.4843 0.8044 0.8969
0.3082 9.4444 510 0.7582 0.4869 0.7583 0.8708
0.3082 9.4815 512 0.7466 0.4371 0.7466 0.8641

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1