ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k15_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7209
  • Qwk (quadratic weighted kappa): 0.3387
  • Mse (mean squared error): 0.7209
  • Rmse (root mean squared error): 0.8491
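
The card does not document its evaluation code. The following is a minimal sketch, assuming integer gold scores and scikit-learn's metric implementations, of how Qwk, Mse, and Rmse are typically computed for an ordinal scoring task like this one; the label values are hypothetical.

```python
# Hedged sketch: reproduces the metric definitions above with scikit-learn.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and raw model outputs, for illustration only.
y_true = np.array([2, 3, 1, 4, 2])
y_pred = np.array([2.2, 2.8, 1.4, 3.6, 2.1])

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
# Quadratic weighted kappa compares discrete ratings, so continuous
# predictions are rounded to the nearest integer first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```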

Model description

More information needed

Intended uses & limitations

More information needed
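
No usage example is provided. As a minimal sketch, assuming the checkpoint is a single-logit regression head (consistent with the MSE/RMSE evaluation metrics above), loading it might look like this; the essay string is a placeholder:

```python
# Hedged usage sketch; assumes a regression-style sequence classification head.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k15_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

essay = "..."  # placeholder: an Arabic essay to score for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().item())  # predicted organization score
```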

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch in code follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
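
These settings map directly onto the transformers Trainer API. Below is a minimal sketch of the corresponding TrainingArguments; the output directory is a hypothetical name, and model/dataset setup is omitted because the card does not document it:

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # The Adam betas and epsilon listed above are also the library defaults.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```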

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0263 | 2 | 2.6114 | -0.0593 | 2.6114 | 1.6160 |
| No log | 0.0526 | 4 | 1.3662 | 0.0412 | 1.3662 | 1.1689 |
| No log | 0.0789 | 6 | 1.0366 | -0.1408 | 1.0366 | 1.0181 |
| No log | 0.1053 | 8 | 0.9287 | 0.0778 | 0.9287 | 0.9637 |
| No log | 0.1316 | 10 | 1.1332 | 0.0070 | 1.1332 | 1.0645 |
| No log | 0.1579 | 12 | 1.3977 | -0.1097 | 1.3977 | 1.1823 |
| No log | 0.1842 | 14 | 1.3071 | -0.1828 | 1.3071 | 1.1433 |
| No log | 0.2105 | 16 | 1.0926 | 0.0787 | 1.0926 | 1.0453 |
| No log | 0.2368 | 18 | 0.9905 | 0.0787 | 0.9905 | 0.9952 |
| No log | 0.2632 | 20 | 0.8839 | 0.1359 | 0.8839 | 0.9401 |
| No log | 0.2895 | 22 | 0.8665 | 0.0 | 0.8665 | 0.9309 |
| No log | 0.3158 | 24 | 0.8149 | 0.0410 | 0.8149 | 0.9027 |
| No log | 0.3421 | 26 | 0.7893 | -0.0127 | 0.7893 | 0.8884 |
| No log | 0.3684 | 28 | 0.8394 | 0.1268 | 0.8394 | 0.9162 |
| No log | 0.3947 | 30 | 0.9352 | 0.1308 | 0.9352 | 0.9671 |
| No log | 0.4211 | 32 | 0.9961 | 0.0646 | 0.9961 | 0.9980 |
| No log | 0.4474 | 34 | 1.0164 | 0.1298 | 1.0164 | 1.0081 |
| No log | 0.4737 | 36 | 1.0397 | 0.0973 | 1.0397 | 1.0196 |
| No log | 0.5 | 38 | 1.0867 | -0.1101 | 1.0867 | 1.0425 |
| No log | 0.5263 | 40 | 1.2202 | -0.0210 | 1.2202 | 1.1046 |
| No log | 0.5526 | 42 | 1.4327 | 0.0174 | 1.4327 | 1.1970 |
| No log | 0.5789 | 44 | 1.4707 | 0.0475 | 1.4707 | 1.2127 |
| No log | 0.6053 | 46 | 1.1569 | 0.0400 | 1.1569 | 1.0756 |
| No log | 0.6316 | 48 | 0.9935 | 0.1528 | 0.9935 | 0.9967 |
| No log | 0.6579 | 50 | 1.1130 | 0.0209 | 1.1130 | 1.0550 |
| No log | 0.6842 | 52 | 1.1576 | 0.1213 | 1.1576 | 1.0759 |
| No log | 0.7105 | 54 | 1.1930 | 0.0973 | 1.1930 | 1.0922 |
| No log | 0.7368 | 56 | 1.0684 | 0.1115 | 1.0684 | 1.0336 |
| No log | 0.7632 | 58 | 1.0599 | 0.1422 | 1.0599 | 1.0295 |
| No log | 0.7895 | 60 | 1.0905 | 0.2239 | 1.0905 | 1.0443 |
| No log | 0.8158 | 62 | 1.1419 | 0.2230 | 1.1419 | 1.0686 |
| No log | 0.8421 | 64 | 1.0858 | 0.2037 | 1.0858 | 1.0420 |
| No log | 0.8684 | 66 | 0.9096 | 0.2087 | 0.9096 | 0.9537 |
| No log | 0.8947 | 68 | 0.7219 | 0.0 | 0.7219 | 0.8497 |
| No log | 0.9211 | 70 | 0.7079 | 0.1699 | 0.7079 | 0.8414 |
| No log | 0.9474 | 72 | 0.7158 | 0.2121 | 0.7158 | 0.8461 |
| No log | 0.9737 | 74 | 0.7851 | 0.2706 | 0.7851 | 0.8861 |
| No log | 1.0 | 76 | 1.0026 | 0.2964 | 1.0026 | 1.0013 |
| No log | 1.0263 | 78 | 1.0787 | 0.2580 | 1.0787 | 1.0386 |
| No log | 1.0526 | 80 | 1.2280 | 0.1947 | 1.2280 | 1.1082 |
| No log | 1.0789 | 82 | 1.2189 | 0.1959 | 1.2189 | 1.1041 |
| No log | 1.1053 | 84 | 1.2182 | 0.2176 | 1.2182 | 1.1037 |
| No log | 1.1316 | 86 | 1.0492 | 0.2580 | 1.0492 | 1.0243 |
| No log | 1.1579 | 88 | 0.8457 | 0.2853 | 0.8457 | 0.9196 |
| No log | 1.1842 | 90 | 0.7569 | 0.1797 | 0.7569 | 0.8700 |
| No log | 1.2105 | 92 | 0.7646 | 0.2837 | 0.7646 | 0.8744 |
| No log | 1.2368 | 94 | 0.7433 | 0.2516 | 0.7433 | 0.8621 |
| No log | 1.2632 | 96 | 0.7267 | 0.2909 | 0.7267 | 0.8525 |
| No log | 1.2895 | 98 | 0.7198 | 0.2229 | 0.7198 | 0.8484 |
| No log | 1.3158 | 100 | 0.7310 | 0.1884 | 0.7310 | 0.8550 |
| No log | 1.3421 | 102 | 0.7452 | 0.1884 | 0.7452 | 0.8633 |
| No log | 1.3684 | 104 | 0.7575 | 0.1539 | 0.7575 | 0.8703 |
| No log | 1.3947 | 106 | 0.7670 | 0.1539 | 0.7670 | 0.8758 |
| No log | 1.4211 | 108 | 0.8000 | 0.1729 | 0.8000 | 0.8944 |
| No log | 1.4474 | 110 | 0.7892 | 0.1569 | 0.7892 | 0.8883 |
| No log | 1.4737 | 112 | 0.8449 | 0.2547 | 0.8449 | 0.9192 |
| No log | 1.5 | 114 | 0.9809 | 0.2369 | 0.9809 | 0.9904 |
| No log | 1.5263 | 116 | 0.8940 | 0.2892 | 0.8940 | 0.9455 |
| No log | 1.5526 | 118 | 0.7389 | 0.2158 | 0.7389 | 0.8596 |
| No log | 1.5789 | 120 | 0.8308 | 0.2555 | 0.8308 | 0.9115 |
| No log | 1.6053 | 122 | 0.8792 | 0.3002 | 0.8792 | 0.9376 |
| No log | 1.6316 | 124 | 0.7961 | 0.3617 | 0.7961 | 0.8923 |
| No log | 1.6579 | 126 | 0.7042 | 0.3308 | 0.7042 | 0.8391 |
| No log | 1.6842 | 128 | 0.7617 | 0.2932 | 0.7617 | 0.8728 |
| No log | 1.7105 | 130 | 0.7418 | 0.3329 | 0.7418 | 0.8613 |
| No log | 1.7368 | 132 | 0.6867 | 0.2386 | 0.6867 | 0.8287 |
| No log | 1.7632 | 134 | 0.7614 | 0.2642 | 0.7614 | 0.8726 |
| No log | 1.7895 | 136 | 0.8907 | 0.3401 | 0.8907 | 0.9438 |
| No log | 1.8158 | 138 | 0.9496 | 0.2877 | 0.9496 | 0.9745 |
| No log | 1.8421 | 140 | 0.8391 | 0.3376 | 0.8391 | 0.9160 |
| No log | 1.8684 | 142 | 0.7244 | 0.3739 | 0.7244 | 0.8511 |
| No log | 1.8947 | 144 | 0.6983 | 0.4495 | 0.6983 | 0.8356 |
| No log | 1.9211 | 146 | 0.7083 | 0.3972 | 0.7083 | 0.8416 |
| No log | 1.9474 | 148 | 0.7564 | 0.3896 | 0.7564 | 0.8697 |
| No log | 1.9737 | 150 | 0.7292 | 0.3631 | 0.7292 | 0.8539 |
| No log | 2.0 | 152 | 0.7076 | 0.3378 | 0.7076 | 0.8412 |
| No log | 2.0263 | 154 | 0.7992 | 0.3123 | 0.7992 | 0.8940 |
| No log | 2.0526 | 156 | 0.8510 | 0.2756 | 0.8510 | 0.9225 |
| No log | 2.0789 | 158 | 0.7901 | 0.3706 | 0.7901 | 0.8889 |
| No log | 2.1053 | 160 | 0.8273 | 0.3706 | 0.8273 | 0.9096 |
| No log | 2.1316 | 162 | 0.9694 | 0.2938 | 0.9694 | 0.9846 |
| No log | 2.1579 | 164 | 0.9429 | 0.3767 | 0.9429 | 0.9710 |
| No log | 2.1842 | 166 | 0.8155 | 0.2519 | 0.8155 | 0.9030 |
| No log | 2.2105 | 168 | 0.7651 | 0.3887 | 0.7651 | 0.8747 |
| No log | 2.2368 | 170 | 0.7690 | 0.4257 | 0.7690 | 0.8769 |
| No log | 2.2632 | 172 | 0.7564 | 0.3460 | 0.7564 | 0.8697 |
| No log | 2.2895 | 174 | 0.7505 | 0.3400 | 0.7505 | 0.8663 |
| No log | 2.3158 | 176 | 0.7352 | 0.4065 | 0.7352 | 0.8575 |
| No log | 2.3421 | 178 | 0.7190 | 0.4432 | 0.7190 | 0.8479 |
| No log | 2.3684 | 180 | 0.7200 | 0.4813 | 0.7200 | 0.8485 |
| No log | 2.3947 | 182 | 0.7545 | 0.2517 | 0.7545 | 0.8686 |
| No log | 2.4211 | 184 | 0.7076 | 0.4 | 0.7076 | 0.8412 |
| No log | 2.4474 | 186 | 0.6914 | 0.4 | 0.6914 | 0.8315 |
| No log | 2.4737 | 188 | 0.6827 | 0.3933 | 0.6827 | 0.8263 |
| No log | 2.5 | 190 | 0.6881 | 0.4196 | 0.6881 | 0.8295 |
| No log | 2.5263 | 192 | 0.7034 | 0.4098 | 0.7034 | 0.8387 |
| No log | 2.5526 | 194 | 0.7959 | 0.3247 | 0.7959 | 0.8921 |
| No log | 2.5789 | 196 | 0.8757 | 0.2948 | 0.8757 | 0.9358 |
| No log | 2.6053 | 198 | 0.7463 | 0.3662 | 0.7463 | 0.8639 |
| No log | 2.6316 | 200 | 0.7901 | 0.3971 | 0.7901 | 0.8889 |
| No log | 2.6579 | 202 | 0.7460 | 0.4155 | 0.7460 | 0.8637 |
| No log | 2.6842 | 204 | 0.6647 | 0.4023 | 0.6647 | 0.8153 |
| No log | 2.7105 | 206 | 0.6338 | 0.3910 | 0.6338 | 0.7961 |
| No log | 2.7368 | 208 | 0.6600 | 0.3213 | 0.6600 | 0.8124 |
| No log | 2.7632 | 210 | 0.6548 | 0.3628 | 0.6548 | 0.8092 |
| No log | 2.7895 | 212 | 0.6587 | 0.4051 | 0.6587 | 0.8116 |
| No log | 2.8158 | 214 | 0.6787 | 0.4301 | 0.6787 | 0.8238 |
| No log | 2.8421 | 216 | 0.6894 | 0.4094 | 0.6894 | 0.8303 |
| No log | 2.8684 | 218 | 0.7169 | 0.3223 | 0.7169 | 0.8467 |
| No log | 2.8947 | 220 | 0.7045 | 0.3554 | 0.7045 | 0.8393 |
| No log | 2.9211 | 222 | 0.7548 | 0.4495 | 0.7548 | 0.8688 |
| No log | 2.9474 | 224 | 0.7643 | 0.4719 | 0.7643 | 0.8742 |
| No log | 2.9737 | 226 | 0.7325 | 0.3575 | 0.7325 | 0.8558 |
| No log | 3.0 | 228 | 0.8097 | 0.3967 | 0.8097 | 0.8998 |
| No log | 3.0263 | 230 | 0.8403 | 0.3734 | 0.8403 | 0.9167 |
| No log | 3.0526 | 232 | 0.8309 | 0.3791 | 0.8309 | 0.9116 |
| No log | 3.0789 | 234 | 0.7274 | 0.3768 | 0.7274 | 0.8529 |
| No log | 3.1053 | 236 | 0.7283 | 0.4116 | 0.7283 | 0.8534 |
| No log | 3.1316 | 238 | 0.7187 | 0.4116 | 0.7187 | 0.8478 |
| No log | 3.1579 | 240 | 0.6636 | 0.4914 | 0.6636 | 0.8146 |
| No log | 3.1842 | 242 | 0.7092 | 0.3523 | 0.7092 | 0.8421 |
| No log | 3.2105 | 244 | 0.7735 | 0.2476 | 0.7735 | 0.8795 |
| No log | 3.2368 | 246 | 0.8439 | 0.2756 | 0.8439 | 0.9186 |
| No log | 3.2632 | 248 | 0.9420 | 0.3032 | 0.9420 | 0.9706 |
| No log | 3.2895 | 250 | 0.8581 | 0.3174 | 0.8581 | 0.9263 |
| No log | 3.3158 | 252 | 0.8433 | 0.3431 | 0.8433 | 0.9183 |
| No log | 3.3421 | 254 | 0.8367 | 0.3431 | 0.8367 | 0.9147 |
| No log | 3.3684 | 256 | 0.7639 | 0.3962 | 0.7639 | 0.8740 |
| No log | 3.3947 | 258 | 0.6843 | 0.4265 | 0.6843 | 0.8272 |
| No log | 3.4211 | 260 | 0.6830 | 0.4847 | 0.6830 | 0.8264 |
| No log | 3.4474 | 262 | 0.7209 | 0.4007 | 0.7209 | 0.8491 |
| No log | 3.4737 | 264 | 0.7132 | 0.4555 | 0.7132 | 0.8445 |
| No log | 3.5 | 266 | 0.7011 | 0.4322 | 0.7011 | 0.8373 |
| No log | 3.5263 | 268 | 0.7348 | 0.2938 | 0.7348 | 0.8572 |
| No log | 3.5526 | 270 | 0.7119 | 0.3189 | 0.7119 | 0.8437 |
| No log | 3.5789 | 272 | 0.6911 | 0.5021 | 0.6911 | 0.8313 |
| No log | 3.6053 | 274 | 0.7041 | 0.5150 | 0.7041 | 0.8391 |
| No log | 3.6316 | 276 | 0.7152 | 0.5150 | 0.7152 | 0.8457 |
| No log | 3.6579 | 278 | 0.7324 | 0.5150 | 0.7324 | 0.8558 |
| No log | 3.6842 | 280 | 0.7407 | 0.4678 | 0.7407 | 0.8606 |
| No log | 3.7105 | 282 | 0.7627 | 0.3923 | 0.7627 | 0.8734 |
| No log | 3.7368 | 284 | 0.7649 | 0.4090 | 0.7649 | 0.8746 |
| No log | 3.7632 | 286 | 0.8059 | 0.3349 | 0.8059 | 0.8977 |
| No log | 3.7895 | 288 | 0.7814 | 0.3383 | 0.7814 | 0.8840 |
| No log | 3.8158 | 290 | 0.6805 | 0.4348 | 0.6805 | 0.8249 |
| No log | 3.8421 | 292 | 0.6440 | 0.4914 | 0.6440 | 0.8025 |
| No log | 3.8684 | 294 | 0.6429 | 0.4914 | 0.6429 | 0.8018 |
| No log | 3.8947 | 296 | 0.6358 | 0.4244 | 0.6358 | 0.7974 |
| No log | 3.9211 | 298 | 0.7040 | 0.3876 | 0.7040 | 0.8391 |
| No log | 3.9474 | 300 | 0.7883 | 0.3269 | 0.7883 | 0.8879 |
| No log | 3.9737 | 302 | 0.7289 | 0.4616 | 0.7289 | 0.8538 |
| No log | 4.0 | 304 | 0.6629 | 0.3601 | 0.6629 | 0.8142 |
| No log | 4.0263 | 306 | 0.6571 | 0.3645 | 0.6571 | 0.8106 |
| No log | 4.0526 | 308 | 0.6535 | 0.3645 | 0.6535 | 0.8084 |
| No log | 4.0789 | 310 | 0.6301 | 0.3667 | 0.6301 | 0.7938 |
| No log | 4.1053 | 312 | 0.6316 | 0.4105 | 0.6316 | 0.7947 |
| No log | 4.1316 | 314 | 0.7251 | 0.4765 | 0.7251 | 0.8515 |
| No log | 4.1579 | 316 | 0.6835 | 0.4886 | 0.6835 | 0.8268 |
| No log | 4.1842 | 318 | 0.5775 | 0.4278 | 0.5775 | 0.7600 |
| No log | 4.2105 | 320 | 0.6377 | 0.4587 | 0.6377 | 0.7986 |
| No log | 4.2368 | 322 | 0.6326 | 0.4587 | 0.6326 | 0.7953 |
| No log | 4.2632 | 324 | 0.6004 | 0.4719 | 0.6004 | 0.7748 |
| No log | 4.2895 | 326 | 0.7807 | 0.4503 | 0.7807 | 0.8836 |
| No log | 4.3158 | 328 | 0.9337 | 0.3079 | 0.9337 | 0.9663 |
| No log | 4.3421 | 330 | 0.9605 | 0.3079 | 0.9605 | 0.9800 |
| No log | 4.3684 | 332 | 0.7863 | 0.4057 | 0.7863 | 0.8867 |
| No log | 4.3947 | 334 | 0.6499 | 0.4200 | 0.6499 | 0.8062 |
| No log | 4.4211 | 336 | 0.6411 | 0.4555 | 0.6411 | 0.8007 |
| No log | 4.4474 | 338 | 0.6596 | 0.3738 | 0.6596 | 0.8122 |
| No log | 4.4737 | 340 | 0.6749 | 0.3738 | 0.6749 | 0.8215 |
| No log | 4.5 | 342 | 0.7093 | 0.4205 | 0.7093 | 0.8422 |
| No log | 4.5263 | 344 | 0.7117 | 0.4205 | 0.7117 | 0.8436 |
| No log | 4.5526 | 346 | 0.6941 | 0.4006 | 0.6941 | 0.8331 |
| No log | 4.5789 | 348 | 0.6910 | 0.4300 | 0.6910 | 0.8313 |
| No log | 4.6053 | 350 | 0.6969 | 0.3213 | 0.6969 | 0.8348 |
| No log | 4.6316 | 352 | 0.7304 | 0.3457 | 0.7304 | 0.8546 |
| No log | 4.6579 | 354 | 0.7320 | 0.3457 | 0.7320 | 0.8556 |
| No log | 4.6842 | 356 | 0.7301 | 0.3457 | 0.7301 | 0.8545 |
| No log | 4.7105 | 358 | 0.7235 | 0.3409 | 0.7235 | 0.8506 |
| No log | 4.7368 | 360 | 0.7308 | 0.3994 | 0.7308 | 0.8549 |
| No log | 4.7632 | 362 | 0.7209 | 0.3308 | 0.7209 | 0.8491 |
| No log | 4.7895 | 364 | 0.6965 | 0.3691 | 0.6965 | 0.8346 |
| No log | 4.8158 | 366 | 0.6841 | 0.3691 | 0.6841 | 0.8271 |
| No log | 4.8421 | 368 | 0.6889 | 0.3754 | 0.6889 | 0.8300 |
| No log | 4.8684 | 370 | 0.6966 | 0.3331 | 0.6966 | 0.8346 |
| No log | 4.8947 | 372 | 0.6822 | 0.3833 | 0.6822 | 0.8260 |
| No log | 4.9211 | 374 | 0.6652 | 0.4719 | 0.6652 | 0.8156 |
| No log | 4.9474 | 376 | 0.6846 | 0.4986 | 0.6846 | 0.8274 |
| No log | 4.9737 | 378 | 0.6658 | 0.4717 | 0.6658 | 0.8160 |
| No log | 5.0 | 380 | 0.7051 | 0.4321 | 0.7051 | 0.8397 |
| No log | 5.0263 | 382 | 0.9097 | 0.2752 | 0.9097 | 0.9538 |
| No log | 5.0526 | 384 | 1.0318 | 0.2954 | 1.0318 | 1.0158 |
| No log | 5.0789 | 386 | 0.9598 | 0.2987 | 0.9598 | 0.9797 |
| No log | 5.1053 | 388 | 0.7847 | 0.3613 | 0.7847 | 0.8858 |
| No log | 5.1316 | 390 | 0.7445 | 0.3871 | 0.7445 | 0.8628 |
| No log | 5.1579 | 392 | 0.7458 | 0.3871 | 0.7458 | 0.8636 |
| No log | 5.1842 | 394 | 0.7344 | 0.3871 | 0.7344 | 0.8570 |
| No log | 5.2105 | 396 | 0.6911 | 0.4102 | 0.6911 | 0.8313 |
| No log | 5.2368 | 398 | 0.6766 | 0.3725 | 0.6766 | 0.8226 |
| No log | 5.2632 | 400 | 0.6891 | 0.3605 | 0.6891 | 0.8302 |
| No log | 5.2895 | 402 | 0.6933 | 0.3331 | 0.6933 | 0.8327 |
| No log | 5.3158 | 404 | 0.7042 | 0.2465 | 0.7042 | 0.8392 |
| No log | 5.3421 | 406 | 0.6915 | 0.3780 | 0.6915 | 0.8316 |
| No log | 5.3684 | 408 | 0.6790 | 0.4613 | 0.6790 | 0.8240 |
| No log | 5.3947 | 410 | 0.6818 | 0.4613 | 0.6818 | 0.8257 |
| No log | 5.4211 | 412 | 0.6852 | 0.3915 | 0.6852 | 0.8278 |
| No log | 5.4474 | 414 | 0.6810 | 0.4338 | 0.6810 | 0.8252 |
| No log | 5.4737 | 416 | 0.6853 | 0.3837 | 0.6853 | 0.8278 |
| No log | 5.5 | 418 | 0.6823 | 0.3689 | 0.6823 | 0.8260 |
| No log | 5.5263 | 420 | 0.6722 | 0.3628 | 0.6722 | 0.8199 |
| No log | 5.5526 | 422 | 0.6710 | 0.3511 | 0.6710 | 0.8191 |
| No log | 5.5789 | 424 | 0.6700 | 0.3556 | 0.6700 | 0.8185 |
| No log | 5.6053 | 426 | 0.6770 | 0.3460 | 0.6770 | 0.8228 |
| No log | 5.6316 | 428 | 0.7006 | 0.3910 | 0.7006 | 0.8370 |
| No log | 5.6579 | 430 | 0.7067 | 0.3556 | 0.7067 | 0.8407 |
| No log | 5.6842 | 432 | 0.7136 | 0.3417 | 0.7136 | 0.8447 |
| No log | 5.7105 | 434 | 0.7227 | 0.3667 | 0.7227 | 0.8501 |
| No log | 5.7368 | 436 | 0.7244 | 0.4265 | 0.7244 | 0.8511 |
| No log | 5.7632 | 438 | 0.7162 | 0.4029 | 0.7162 | 0.8463 |
| No log | 5.7895 | 440 | 0.7211 | 0.3729 | 0.7211 | 0.8492 |
| No log | 5.8158 | 442 | 0.7136 | 0.3598 | 0.7136 | 0.8447 |
| No log | 5.8421 | 444 | 0.7800 | 0.3891 | 0.7800 | 0.8832 |
| No log | 5.8684 | 446 | 0.8301 | 0.3333 | 0.8301 | 0.9111 |
| No log | 5.8947 | 448 | 0.7777 | 0.4239 | 0.7777 | 0.8819 |
| No log | 5.9211 | 450 | 0.7472 | 0.3222 | 0.7472 | 0.8644 |
| No log | 5.9474 | 452 | 0.7339 | 0.3160 | 0.7339 | 0.8567 |
| No log | 5.9737 | 454 | 0.7175 | 0.2806 | 0.7175 | 0.8471 |
| No log | 6.0 | 456 | 0.7144 | 0.3964 | 0.7144 | 0.8452 |
| No log | 6.0263 | 458 | 0.7491 | 0.3265 | 0.7491 | 0.8655 |
| No log | 6.0526 | 460 | 0.8631 | 0.1763 | 0.8631 | 0.9290 |
| No log | 6.0789 | 462 | 0.8878 | 0.2237 | 0.8878 | 0.9422 |
| No log | 6.1053 | 464 | 0.8455 | 0.2499 | 0.8455 | 0.9195 |
| No log | 6.1316 | 466 | 0.7594 | 0.4107 | 0.7594 | 0.8715 |
| No log | 6.1579 | 468 | 0.7549 | 0.4106 | 0.7549 | 0.8688 |
| No log | 6.1842 | 470 | 0.7337 | 0.4448 | 0.7337 | 0.8565 |
| No log | 6.2105 | 472 | 0.7074 | 0.4137 | 0.7074 | 0.8410 |
| No log | 6.2368 | 474 | 0.8323 | 0.2599 | 0.8323 | 0.9123 |
| No log | 6.2632 | 476 | 0.9882 | 0.3015 | 0.9882 | 0.9941 |
| No log | 6.2895 | 478 | 1.0109 | 0.2727 | 1.0109 | 1.0054 |
| No log | 6.3158 | 480 | 0.8862 | 0.2806 | 0.8862 | 0.9414 |
| No log | 6.3421 | 482 | 0.7344 | 0.2815 | 0.7344 | 0.8570 |
| No log | 6.3684 | 484 | 0.6684 | 0.4762 | 0.6684 | 0.8175 |
| No log | 6.3947 | 486 | 0.6638 | 0.4281 | 0.6638 | 0.8147 |
| No log | 6.4211 | 488 | 0.6559 | 0.4281 | 0.6559 | 0.8099 |
| No log | 6.4474 | 490 | 0.6332 | 0.5189 | 0.6332 | 0.7957 |
| No log | 6.4737 | 492 | 0.6772 | 0.4161 | 0.6772 | 0.8229 |
| No log | 6.5 | 494 | 0.7425 | 0.2651 | 0.7425 | 0.8617 |
| No log | 6.5263 | 496 | 0.7404 | 0.2920 | 0.7404 | 0.8604 |
| No log | 6.5526 | 498 | 0.6942 | 0.3695 | 0.6942 | 0.8332 |
| 0.3706 | 6.5789 | 500 | 0.6603 | 0.4556 | 0.6603 | 0.8126 |
| 0.3706 | 6.6053 | 502 | 0.6625 | 0.4556 | 0.6625 | 0.8139 |
| 0.3706 | 6.6316 | 504 | 0.7388 | 0.3067 | 0.7388 | 0.8595 |
| 0.3706 | 6.6579 | 506 | 0.8375 | 0.2730 | 0.8375 | 0.9151 |
| 0.3706 | 6.6842 | 508 | 0.8253 | 0.2456 | 0.8253 | 0.9084 |
| 0.3706 | 6.7105 | 510 | 0.7209 | 0.3387 | 0.7209 | 0.8491 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1