ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k18_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5273
  • Qwk: 0.4402
  • Mse: 0.5273
  • Rmse: 0.7261
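Note that the evaluation loss equals the Mse, which is consistent with a regression-style scoring head trained with mean-squared-error loss; Qwk (quadratic weighted kappa) is then computed on the integer-rounded score predictions. As a hedged illustration (the exact evaluation code is not shown in this card), the two less common metrics can be sketched in plain Python:

```python
from math import sqrt

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights, for integer labels 0..n_classes-1."""
    # Observed agreement (confusion) matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    # Marginal histograms give the chance-agreement (expected) matrix
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty for distance
            num += w * O[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

With this definition, perfect agreement gives a kappa of 1.0 and systematic maximal disagreement gives -1.0, matching the negative Qwk values seen in the first epochs of the table below.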

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
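With a linear scheduler and no warmup (the Transformers default when `num_warmup_steps` is 0), the learning rate decays linearly from 2e-05 at step 0 to zero at the final training step. A minimal sketch of that schedule (the `total_steps` value here is illustrative, not taken from this run):

```python
def linear_lr(step, base_lr=2e-05, total_steps=1000, warmup_steps=0):
    """Learning rate under a linear warmup-then-decay schedule."""
    if step < warmup_steps:
        # Warmup: ramp linearly from 0 up to base_lr
        return base_lr * step / max(1, warmup_steps)
    # Decay: fall linearly from base_lr to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, halfway through training the learning rate is half the initial value (1e-05), and it reaches exactly zero on the last step.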

Training results

In the table below, the training-loss column reads "No log" until the loss is first logged at step 500 (the 0.3019 entry); validation metrics were computed every 2 steps.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0328 2 2.5041 -0.0262 2.5041 1.5824
No log 0.0656 4 1.2984 -0.0052 1.2984 1.1395
No log 0.0984 6 1.0822 -0.1740 1.0822 1.0403
No log 0.1311 8 1.0606 -0.2279 1.0606 1.0298
No log 0.1639 10 1.0551 0.0982 1.0551 1.0272
No log 0.1967 12 0.9672 0.1254 0.9672 0.9835
No log 0.2295 14 0.8543 0.0265 0.8543 0.9243
No log 0.2623 16 0.8081 0.1187 0.8081 0.8990
No log 0.2951 18 0.7912 0.0840 0.7912 0.8895
No log 0.3279 20 0.7717 0.0840 0.7717 0.8785
No log 0.3607 22 0.7815 0.0937 0.7815 0.8840
No log 0.3934 24 0.8066 0.0947 0.8066 0.8981
No log 0.4262 26 0.7508 0.2103 0.7508 0.8665
No log 0.4590 28 0.7851 0.0944 0.7851 0.8861
No log 0.4918 30 0.8015 0.0937 0.8015 0.8953
No log 0.5246 32 0.7478 0.0428 0.7478 0.8648
No log 0.5574 34 0.7842 0.0 0.7842 0.8855
No log 0.5902 36 0.9020 0.0509 0.9020 0.9497
No log 0.6230 38 0.9419 0.1724 0.9419 0.9705
No log 0.6557 40 0.8731 0.0937 0.8731 0.9344
No log 0.6885 42 0.7898 0.0 0.7898 0.8887
No log 0.7213 44 0.7551 0.0 0.7551 0.8690
No log 0.7541 46 0.7460 0.0481 0.7460 0.8637
No log 0.7869 48 0.7755 0.1372 0.7755 0.8806
No log 0.8197 50 0.7386 0.0937 0.7386 0.8594
No log 0.8525 52 0.7026 0.1660 0.7026 0.8382
No log 0.8852 54 0.6907 0.2713 0.6907 0.8311
No log 0.9180 56 0.6912 0.2572 0.6912 0.8314
No log 0.9508 58 0.6986 0.2572 0.6986 0.8358
No log 0.9836 60 0.7079 0.2506 0.7079 0.8413
No log 1.0164 62 0.7174 0.2476 0.7174 0.8470
No log 1.0492 64 0.7201 0.2353 0.7201 0.8486
No log 1.0820 66 0.7221 0.2751 0.7221 0.8498
No log 1.1148 68 0.8050 0.2193 0.8050 0.8972
No log 1.1475 70 0.7864 0.2409 0.7864 0.8868
No log 1.1803 72 0.7496 0.2218 0.7496 0.8658
No log 1.2131 74 0.7649 0.1884 0.7649 0.8746
No log 1.2459 76 0.7907 0.2205 0.7907 0.8892
No log 1.2787 78 0.7533 0.2205 0.7533 0.8679
No log 1.3115 80 0.7087 0.1790 0.7087 0.8419
No log 1.3443 82 0.9192 0.2602 0.9192 0.9587
No log 1.3770 84 0.8503 0.2988 0.8503 0.9221
No log 1.4098 86 0.6897 0.3509 0.6897 0.8305
No log 1.4426 88 0.8076 0.3510 0.8076 0.8987
No log 1.4754 90 1.1347 0.1947 1.1347 1.0652
No log 1.5082 92 1.2555 0.1003 1.2555 1.1205
No log 1.5410 94 1.0631 0.2214 1.0631 1.0311
No log 1.5738 96 0.8939 0.2604 0.8939 0.9455
No log 1.6066 98 0.7284 0.3730 0.7284 0.8535
No log 1.6393 100 0.6829 0.2745 0.6829 0.8264
No log 1.6721 102 0.6357 0.2405 0.6357 0.7973
No log 1.7049 104 0.6438 0.2046 0.6438 0.8024
No log 1.7377 106 0.6803 0.1979 0.6803 0.8248
No log 1.7705 108 0.7140 0.2611 0.7140 0.8450
No log 1.8033 110 0.6423 0.1942 0.6423 0.8014
No log 1.8361 112 0.6019 0.2572 0.6019 0.7758
No log 1.8689 114 0.5967 0.2857 0.5967 0.7724
No log 1.9016 116 0.6048 0.3323 0.6048 0.7777
No log 1.9344 118 0.6457 0.2987 0.6457 0.8036
No log 1.9672 120 0.6362 0.2218 0.6362 0.7977
No log 2.0 122 0.6574 0.2353 0.6574 0.8108
No log 2.0328 124 0.6717 0.1673 0.6717 0.8196
No log 2.0656 126 0.6372 0.3926 0.6372 0.7982
No log 2.0984 128 0.7534 0.3746 0.7534 0.8680
No log 2.1311 130 0.6894 0.3372 0.6894 0.8303
No log 2.1639 132 0.6272 0.4052 0.6272 0.7919
No log 2.1967 134 0.7492 0.3239 0.7492 0.8656
No log 2.2295 136 0.7911 0.3676 0.7911 0.8894
No log 2.2623 138 0.6549 0.3610 0.6549 0.8093
No log 2.2951 140 0.6505 0.3441 0.6505 0.8065
No log 2.3279 142 0.6289 0.3643 0.6289 0.7931
No log 2.3607 144 0.7439 0.3739 0.7439 0.8625
No log 2.3934 146 0.8074 0.2480 0.8074 0.8985
No log 2.4262 148 0.6857 0.3107 0.6857 0.8281
No log 2.4590 150 0.6535 0.2913 0.6535 0.8084
No log 2.4918 152 0.6457 0.3020 0.6457 0.8035
No log 2.5246 154 0.6614 0.1591 0.6614 0.8133
No log 2.5574 156 0.7959 0.1628 0.7959 0.8921
No log 2.5902 158 0.8212 0.2222 0.8212 0.9062
No log 2.6230 160 0.7360 0.2300 0.7360 0.8579
No log 2.6557 162 0.6602 0.3144 0.6602 0.8125
No log 2.6885 164 0.7267 0.2808 0.7267 0.8524
No log 2.7213 166 0.7046 0.2661 0.7046 0.8394
No log 2.7541 168 0.7613 0.2679 0.7613 0.8725
No log 2.7869 170 0.8154 0.2651 0.8154 0.9030
No log 2.8197 172 0.7463 0.2182 0.7463 0.8639
No log 2.8525 174 0.6981 0.2749 0.6981 0.8355
No log 2.8852 176 0.6878 0.1853 0.6878 0.8293
No log 2.9180 178 0.6821 0.2513 0.6821 0.8259
No log 2.9508 180 0.7049 0.3536 0.7049 0.8396
No log 2.9836 182 0.7245 0.3425 0.7245 0.8512
No log 3.0164 184 0.6934 0.3229 0.6934 0.8327
No log 3.0492 186 0.6666 0.2591 0.6666 0.8165
No log 3.0820 188 0.7295 0.2866 0.7295 0.8541
No log 3.1148 190 0.7953 0.3440 0.7953 0.8918
No log 3.1475 192 0.7791 0.3173 0.7791 0.8827
No log 3.1803 194 0.7399 0.3044 0.7399 0.8602
No log 3.2131 196 0.7383 0.3876 0.7383 0.8593
No log 3.2459 198 0.7185 0.3701 0.7185 0.8476
No log 3.2787 200 0.7057 0.3417 0.7057 0.8401
No log 3.3115 202 0.6750 0.3375 0.6750 0.8216
No log 3.3443 204 0.6719 0.3149 0.6719 0.8197
No log 3.3770 206 0.6526 0.3186 0.6526 0.8079
No log 3.4098 208 0.6355 0.4459 0.6355 0.7972
No log 3.4426 210 0.6322 0.4067 0.6322 0.7951
No log 3.4754 212 0.6492 0.4362 0.6492 0.8057
No log 3.5082 214 0.6899 0.4420 0.6899 0.8306
No log 3.5410 216 0.7809 0.3871 0.7809 0.8837
No log 3.5738 218 0.7538 0.4672 0.7538 0.8682
No log 3.6066 220 0.6901 0.4183 0.6901 0.8307
No log 3.6393 222 0.6826 0.4838 0.6826 0.8262
No log 3.6721 224 0.6511 0.4114 0.6511 0.8069
No log 3.7049 226 0.6670 0.4096 0.6670 0.8167
No log 3.7377 228 0.7066 0.3277 0.7066 0.8406
No log 3.7705 230 0.6801 0.3331 0.6801 0.8247
No log 3.8033 232 0.6341 0.4043 0.6341 0.7963
No log 3.8361 234 0.6244 0.3953 0.6244 0.7902
No log 3.8689 236 0.6241 0.4161 0.6241 0.7900
No log 3.9016 238 0.6214 0.4590 0.6214 0.7883
No log 3.9344 240 0.6058 0.4505 0.6058 0.7783
No log 3.9672 242 0.6459 0.4480 0.6459 0.8037
No log 4.0 244 0.6821 0.4952 0.6821 0.8259
No log 4.0328 246 0.6278 0.4636 0.6278 0.7923
No log 4.0656 248 0.5822 0.4517 0.5822 0.7630
No log 4.0984 250 0.5592 0.4212 0.5592 0.7478
No log 4.1311 252 0.5714 0.4502 0.5714 0.7559
No log 4.1639 254 0.5447 0.4847 0.5447 0.7380
No log 4.1967 256 0.6486 0.4518 0.6486 0.8054
No log 4.2295 258 0.7100 0.3829 0.7100 0.8426
No log 4.2623 260 0.6501 0.4518 0.6501 0.8063
No log 4.2951 262 0.6112 0.3859 0.6112 0.7818
No log 4.3279 264 0.6092 0.3915 0.6092 0.7805
No log 4.3607 266 0.6117 0.3702 0.6117 0.7821
No log 4.3934 268 0.6142 0.2819 0.6142 0.7837
No log 4.4262 270 0.6243 0.2589 0.6243 0.7901
No log 4.4590 272 0.6293 0.2285 0.6293 0.7933
No log 4.4918 274 0.6208 0.2677 0.6208 0.7879
No log 4.5246 276 0.6354 0.2987 0.6354 0.7972
No log 4.5574 278 0.6560 0.3377 0.6560 0.8099
No log 4.5902 280 0.6315 0.3942 0.6315 0.7947
No log 4.6230 282 0.6270 0.3942 0.6270 0.7918
No log 4.6557 284 0.6141 0.3942 0.6141 0.7836
No log 4.6885 286 0.6147 0.3942 0.6147 0.7840
No log 4.7213 288 0.6634 0.4161 0.6634 0.8145
No log 4.7541 290 0.7073 0.4594 0.7073 0.8410
No log 4.7869 292 0.6996 0.4463 0.6996 0.8364
No log 4.8197 294 0.6466 0.4064 0.6466 0.8041
No log 4.8525 296 0.6456 0.4216 0.6456 0.8035
No log 4.8852 298 0.6324 0.3919 0.6324 0.7953
No log 4.9180 300 0.6522 0.4875 0.6522 0.8076
No log 4.9508 302 0.7398 0.5200 0.7398 0.8601
No log 4.9836 304 0.7847 0.4161 0.7847 0.8859
No log 5.0164 306 0.6798 0.4690 0.6798 0.8245
No log 5.0492 308 0.5810 0.4635 0.5810 0.7622
No log 5.0820 310 0.6113 0.4234 0.6113 0.7819
No log 5.1148 312 0.6300 0.3746 0.6300 0.7937
No log 5.1475 314 0.5917 0.4701 0.5917 0.7692
No log 5.1803 316 0.6231 0.4480 0.6231 0.7894
No log 5.2131 318 0.7202 0.4987 0.7202 0.8487
No log 5.2459 320 0.7532 0.4740 0.7532 0.8679
No log 5.2787 322 0.6834 0.4761 0.6834 0.8267
No log 5.3115 324 0.6007 0.4776 0.6007 0.7751
No log 5.3443 326 0.6265 0.4260 0.6265 0.7915
No log 5.3770 328 0.6420 0.3615 0.6420 0.8013
No log 5.4098 330 0.6191 0.3615 0.6191 0.7868
No log 5.4426 332 0.5796 0.5056 0.5796 0.7613
No log 5.4754 334 0.5672 0.4364 0.5672 0.7531
No log 5.5082 336 0.6423 0.5184 0.6423 0.8014
No log 5.5410 338 0.7332 0.4740 0.7332 0.8562
No log 5.5738 340 0.8237 0.4243 0.8237 0.9076
No log 5.6066 342 0.7773 0.4243 0.7773 0.8817
No log 5.6393 344 0.6355 0.5046 0.6355 0.7972
No log 5.6721 346 0.5821 0.5509 0.5821 0.7629
No log 5.7049 348 0.5974 0.5636 0.5974 0.7729
No log 5.7377 350 0.6743 0.4527 0.6743 0.8212
No log 5.7705 352 0.6946 0.4122 0.6946 0.8334
No log 5.8033 354 0.6499 0.4341 0.6499 0.8062
No log 5.8361 356 0.5871 0.4674 0.5871 0.7662
No log 5.8689 358 0.5481 0.4249 0.5481 0.7403
No log 5.9016 360 0.5515 0.3974 0.5515 0.7427
No log 5.9344 362 0.5888 0.4434 0.5888 0.7673
No log 5.9672 364 0.6362 0.4225 0.6362 0.7976
No log 6.0 366 0.6698 0.4800 0.6698 0.8184
No log 6.0328 368 0.6479 0.4868 0.6479 0.8049
No log 6.0656 370 0.5874 0.5200 0.5874 0.7664
No log 6.0984 372 0.6255 0.4952 0.6255 0.7909
No log 6.1311 374 0.7743 0.3946 0.7743 0.8799
No log 6.1639 376 0.8734 0.3671 0.8734 0.9346
No log 6.1967 378 0.8330 0.3451 0.8330 0.9127
No log 6.2295 380 0.7046 0.3916 0.7046 0.8394
No log 6.2623 382 0.5860 0.4444 0.5860 0.7655
No log 6.2951 384 0.5908 0.4182 0.5908 0.7686
No log 6.3279 386 0.6051 0.4414 0.6051 0.7779
No log 6.3607 388 0.5861 0.4504 0.5861 0.7656
No log 6.3934 390 0.6193 0.5044 0.6193 0.7870
No log 6.4262 392 0.7188 0.2903 0.7188 0.8478
No log 6.4590 394 0.7584 0.3131 0.7584 0.8709
No log 6.4918 396 0.6983 0.3362 0.6983 0.8356
No log 6.5246 398 0.6370 0.5184 0.6370 0.7981
No log 6.5574 400 0.5933 0.4738 0.5933 0.7702
No log 6.5902 402 0.5821 0.5516 0.5821 0.7630
No log 6.6230 404 0.5538 0.4962 0.5538 0.7442
No log 6.6557 406 0.5313 0.4857 0.5313 0.7289
No log 6.6885 408 0.5236 0.5143 0.5236 0.7236
No log 6.7213 410 0.5410 0.5283 0.5410 0.7355
No log 6.7541 412 0.5571 0.4868 0.5571 0.7464
No log 6.7869 414 0.5347 0.5283 0.5347 0.7313
No log 6.8197 416 0.5081 0.5098 0.5081 0.7128
No log 6.8525 418 0.5081 0.5195 0.5081 0.7128
No log 6.8852 420 0.5635 0.5378 0.5635 0.7507
No log 6.9180 422 0.6032 0.5393 0.6032 0.7767
No log 6.9508 424 0.5958 0.5184 0.5958 0.7719
No log 6.9836 426 0.5847 0.5249 0.5847 0.7647
No log 7.0164 428 0.5932 0.5184 0.5932 0.7702
No log 7.0492 430 0.5862 0.5030 0.5862 0.7656
No log 7.0820 432 0.5872 0.5220 0.5872 0.7663
No log 7.1148 434 0.6244 0.5603 0.6244 0.7902
No log 7.1475 436 0.6430 0.5166 0.6430 0.8019
No log 7.1803 438 0.6274 0.5636 0.6274 0.7921
No log 7.2131 440 0.6072 0.6078 0.6072 0.7793
No log 7.2459 442 0.5855 0.5516 0.5855 0.7652
No log 7.2787 444 0.5756 0.5302 0.5756 0.7587
No log 7.3115 446 0.5960 0.5250 0.5960 0.7720
No log 7.3443 448 0.6149 0.5809 0.6149 0.7841
No log 7.3770 450 0.6603 0.5283 0.6603 0.8126
No log 7.4098 452 0.6507 0.5932 0.6507 0.8067
No log 7.4426 454 0.6328 0.5302 0.6328 0.7955
No log 7.4754 456 0.6519 0.4883 0.6519 0.8074
No log 7.5082 458 0.6418 0.5243 0.6418 0.8011
No log 7.5410 460 0.6560 0.5738 0.6560 0.8099
No log 7.5738 462 0.6425 0.5144 0.6425 0.8015
No log 7.6066 464 0.6158 0.4734 0.6158 0.7847
No log 7.6393 466 0.5839 0.4423 0.5839 0.7642
No log 7.6721 468 0.5973 0.4114 0.5973 0.7729
No log 7.7049 470 0.6090 0.4171 0.6090 0.7804
No log 7.7377 472 0.6098 0.4463 0.6098 0.7809
No log 7.7705 474 0.6071 0.4217 0.6071 0.7791
No log 7.8033 476 0.6061 0.4217 0.6061 0.7785
No log 7.8361 478 0.6045 0.4217 0.6045 0.7775
No log 7.8689 480 0.6001 0.3910 0.6001 0.7747
No log 7.9016 482 0.5930 0.4386 0.5930 0.7701
No log 7.9344 484 0.6119 0.4189 0.6119 0.7822
No log 7.9672 486 0.6280 0.4544 0.6280 0.7925
No log 8.0 488 0.6282 0.4834 0.6282 0.7926
No log 8.0328 490 0.5772 0.4655 0.5772 0.7597
No log 8.0656 492 0.5474 0.4160 0.5474 0.7399
No log 8.0984 494 0.5458 0.4194 0.5458 0.7388
No log 8.1311 496 0.5380 0.4253 0.5380 0.7335
No log 8.1639 498 0.5221 0.4194 0.5221 0.7226
0.3019 8.1967 500 0.5129 0.4722 0.5129 0.7162
0.3019 8.2295 502 0.5428 0.4979 0.5428 0.7367
0.3019 8.2623 504 0.5931 0.4509 0.5931 0.7701
0.3019 8.2951 506 0.6149 0.3993 0.6149 0.7842
0.3019 8.3279 508 0.5532 0.4674 0.5532 0.7438
0.3019 8.3607 510 0.5098 0.4949 0.5098 0.7140
0.3019 8.3934 512 0.5006 0.3945 0.5006 0.7076
0.3019 8.4262 514 0.5085 0.4547 0.5085 0.7131
0.3019 8.4590 516 0.5163 0.4526 0.5163 0.7185
0.3019 8.4918 518 0.5222 0.4463 0.5222 0.7226
0.3019 8.5246 520 0.5273 0.4402 0.5273 0.7261

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1