ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k9_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6522
  • Qwk: 0.4985
  • Mse: 0.6522
  • Rmse: 0.8076
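The evaluation metrics above (Qwk, Mse, Rmse) can be reproduced from raw predictions. A minimal sketch in plain NumPy, using illustrative labels rather than this model's actual evaluation outputs:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic Weighted Kappa (QWK): agreement on ordinal labels,
    penalizing disagreements by the squared distance between classes."""
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    weights = np.array([[(i - j) ** 2 / (n_classes - 1) ** 2
                         for j in range(n_classes)] for i in range(n_classes)])
    # Expected agreement under independence of the two label distributions.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Illustrative predictions, not from this model's evaluation set.
y_true = np.array([0, 1, 2, 3, 2])
y_pred = np.array([0, 1, 1, 3, 2])

qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
mse = float(np.mean((y_true - y_pred) ** 2))
rmse = mse ** 0.5
```

Note that Mse equals the validation loss here, which suggests the model was trained with an MSE objective on ordinal scores.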

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
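The linear scheduler listed above decays the learning rate from its initial value to zero over the course of training. A minimal sketch of that schedule in pure Python (the optional warmup_steps parameter is an assumption; the hyperparameters above do not mention warmup):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Per-step learning rate for a linear schedule: optional linear
    warmup up to base_lr, then linear decay to zero at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
    return base_lr * remaining
```

With num_epochs=100 and roughly 47 optimizer steps per epoch (as the step column of the results table below implies), total_steps would be about 4700.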

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0426 2 2.7107 -0.0084 2.7107 1.6464
No log 0.0851 4 1.4742 -0.0197 1.4742 1.2141
No log 0.1277 6 1.3844 -0.1928 1.3844 1.1766
No log 0.1702 8 1.2348 -0.0681 1.2348 1.1112
No log 0.2128 10 1.2209 0.0164 1.2209 1.1050
No log 0.2553 12 1.0702 0.1827 1.0702 1.0345
No log 0.2979 14 1.0368 0.2460 1.0368 1.0182
No log 0.3404 16 0.9869 0.1815 0.9869 0.9935
No log 0.3830 18 0.9557 -0.0462 0.9557 0.9776
No log 0.4255 20 0.9500 0.0053 0.9500 0.9747
No log 0.4681 22 0.8793 0.0078 0.8793 0.9377
No log 0.5106 24 0.7878 0.0937 0.7878 0.8876
No log 0.5532 26 0.7141 0.2063 0.7141 0.8450
No log 0.5957 28 0.6872 0.1539 0.6872 0.8290
No log 0.6383 30 0.7247 0.2041 0.7247 0.8513
No log 0.6809 32 0.7439 0.2080 0.7439 0.8625
No log 0.7234 34 0.7762 0.1372 0.7762 0.8810
No log 0.7660 36 0.8467 0.2526 0.8467 0.9202
No log 0.8085 38 0.9011 0.2736 0.9011 0.9493
No log 0.8511 40 1.0243 0.2533 1.0243 1.0121
No log 0.8936 42 0.9207 0.2613 0.9207 0.9595
No log 0.9362 44 0.7374 0.2148 0.7374 0.8587
No log 0.9787 46 0.7555 0.1294 0.7555 0.8692
No log 1.0213 48 0.7903 0.1800 0.7903 0.8890
No log 1.0638 50 0.7995 0.2096 0.7995 0.8941
No log 1.1064 52 0.7578 0.1176 0.7578 0.8705
No log 1.1489 54 0.7606 0.2681 0.7606 0.8722
No log 1.1915 56 0.8622 0.1993 0.8622 0.9285
No log 1.2340 58 0.8070 0.2550 0.8070 0.8984
No log 1.2766 60 0.7323 0.0713 0.7323 0.8557
No log 1.3191 62 0.7793 0.1489 0.7793 0.8828
No log 1.3617 64 0.7544 0.1091 0.7544 0.8686
No log 1.4043 66 0.7645 0.1492 0.7645 0.8744
No log 1.4468 68 0.8101 0.2862 0.8101 0.9001
No log 1.4894 70 0.9133 0.3230 0.9133 0.9557
No log 1.5319 72 1.0468 0.2051 1.0468 1.0232
No log 1.5745 74 0.9473 0.2530 0.9473 0.9733
No log 1.6170 76 0.8577 0.1198 0.8577 0.9261
No log 1.6596 78 0.8955 0.1528 0.8955 0.9463
No log 1.7021 80 0.8666 0.0377 0.8666 0.9309
No log 1.7447 82 0.7950 0.1440 0.7950 0.8916
No log 1.7872 84 0.7609 0.2135 0.7609 0.8723
No log 1.8298 86 0.7876 0.2148 0.7876 0.8875
No log 1.8723 88 0.8131 0.1850 0.8131 0.9017
No log 1.9149 90 0.7632 0.1386 0.7632 0.8736
No log 1.9574 92 0.7623 0.2981 0.7623 0.8731
No log 2.0 94 0.7630 0.3372 0.7630 0.8735
No log 2.0426 96 0.6914 0.2847 0.6914 0.8315
No log 2.0851 98 0.6398 0.3781 0.6398 0.7999
No log 2.1277 100 0.6529 0.4635 0.6529 0.8080
No log 2.1702 102 0.6176 0.4322 0.6176 0.7859
No log 2.2128 104 0.7422 0.3121 0.7422 0.8615
No log 2.2553 106 0.7807 0.3440 0.7807 0.8836
No log 2.2979 108 0.6868 0.3837 0.6868 0.8287
No log 2.3404 110 0.7186 0.3408 0.7186 0.8477
No log 2.3830 112 0.9050 0.3251 0.9050 0.9513
No log 2.4255 114 0.9243 0.3274 0.9243 0.9614
No log 2.4681 116 0.8808 0.3385 0.8808 0.9385
No log 2.5106 118 0.7457 0.2862 0.7457 0.8636
No log 2.5532 120 0.7351 0.2038 0.7351 0.8574
No log 2.5957 122 0.8401 0.1222 0.8401 0.9166
No log 2.6383 124 0.9322 0.0520 0.9322 0.9655
No log 2.6809 126 0.9522 0.1222 0.9522 0.9758
No log 2.7234 128 1.0040 0.1930 1.0040 1.0020
No log 2.7660 130 1.0433 0.0812 1.0433 1.0214
No log 2.8085 132 0.9830 0.1566 0.9830 0.9915
No log 2.8511 134 0.8870 0.2077 0.8870 0.9418
No log 2.8936 136 0.7904 0.2386 0.7904 0.8890
No log 2.9362 138 0.7792 0.2334 0.7792 0.8827
No log 2.9787 140 0.7673 0.3031 0.7673 0.8759
No log 3.0213 142 0.7617 0.3031 0.7617 0.8727
No log 3.0638 144 0.7750 0.2783 0.7750 0.8803
No log 3.1064 146 0.7868 0.3649 0.7868 0.8870
No log 3.1489 148 0.7824 0.3569 0.7824 0.8846
No log 3.1915 150 0.7578 0.3458 0.7578 0.8705
No log 3.2340 152 0.7320 0.4013 0.7320 0.8556
No log 3.2766 154 0.8305 0.2626 0.8305 0.9113
No log 3.3191 156 0.8641 0.3193 0.8641 0.9296
No log 3.3617 158 0.7151 0.3291 0.7151 0.8457
No log 3.4043 160 0.6852 0.3864 0.6852 0.8278
No log 3.4468 162 0.6863 0.4582 0.6863 0.8284
No log 3.4894 164 0.6712 0.4402 0.6712 0.8193
No log 3.5319 166 0.7599 0.3505 0.7599 0.8717
No log 3.5745 168 0.9594 0.2428 0.9594 0.9795
No log 3.6170 170 1.1297 0.1912 1.1297 1.0629
No log 3.6596 172 1.0968 0.1924 1.0968 1.0473
No log 3.7021 174 0.8873 0.3290 0.8873 0.9420
No log 3.7447 176 0.7754 0.3202 0.7754 0.8806
No log 3.7872 178 0.7303 0.2085 0.7303 0.8546
No log 3.8298 180 0.7193 0.2815 0.7193 0.8481
No log 3.8723 182 0.7927 0.2171 0.7927 0.8903
No log 3.9149 184 0.8186 0.2817 0.8186 0.9048
No log 3.9574 186 0.8145 0.2149 0.8145 0.9025
No log 4.0 188 0.8365 0.2471 0.8365 0.9146
No log 4.0426 190 0.8534 0.2607 0.8534 0.9238
No log 4.0851 192 0.7823 0.2574 0.7823 0.8845
No log 4.1277 194 0.7177 0.2170 0.7177 0.8472
No log 4.1702 196 0.6956 0.2574 0.6956 0.8340
No log 4.2128 198 0.6808 0.3243 0.6808 0.8251
No log 4.2553 200 0.6751 0.3502 0.6751 0.8216
No log 4.2979 202 0.6842 0.3324 0.6842 0.8272
No log 4.3404 204 0.7127 0.3763 0.7127 0.8442
No log 4.3830 206 0.6806 0.3498 0.6806 0.8250
No log 4.4255 208 0.6533 0.3837 0.6533 0.8083
No log 4.4681 210 0.6559 0.3886 0.6559 0.8099
No log 4.5106 212 0.6277 0.4681 0.6277 0.7923
No log 4.5532 214 0.6598 0.3688 0.6598 0.8123
No log 4.5957 216 0.6306 0.4278 0.6306 0.7941
No log 4.6383 218 0.6046 0.4701 0.6046 0.7776
No log 4.6809 220 0.6204 0.5227 0.6204 0.7876
No log 4.7234 222 0.6377 0.5117 0.6377 0.7986
No log 4.7660 224 0.6137 0.4898 0.6137 0.7834
No log 4.8085 226 0.6710 0.4157 0.6710 0.8191
No log 4.8511 228 0.7457 0.4265 0.7457 0.8636
No log 4.8936 230 0.6499 0.4125 0.6499 0.8061
No log 4.9362 232 0.6576 0.5538 0.6576 0.8109
No log 4.9787 234 0.7241 0.3337 0.7241 0.8509
No log 5.0213 236 0.7104 0.3157 0.7104 0.8428
No log 5.0638 238 0.6227 0.5339 0.6227 0.7891
No log 5.1064 240 0.6089 0.4569 0.6089 0.7803
No log 5.1489 242 0.6129 0.4361 0.6129 0.7829
No log 5.1915 244 0.6489 0.3930 0.6489 0.8055
No log 5.2340 246 0.7823 0.2939 0.7823 0.8845
No log 5.2766 248 0.7422 0.2915 0.7422 0.8615
No log 5.3191 250 0.6555 0.4118 0.6555 0.8097
No log 5.3617 252 0.6432 0.4022 0.6432 0.8020
No log 5.4043 254 0.6459 0.4022 0.6459 0.8037
No log 5.4468 256 0.6565 0.5117 0.6565 0.8102
No log 5.4894 258 0.6310 0.4701 0.6310 0.7943
No log 5.5319 260 0.6618 0.3793 0.6618 0.8135
No log 5.5745 262 0.6439 0.4747 0.6439 0.8024
No log 5.6170 264 0.6276 0.4747 0.6276 0.7922
No log 5.6596 266 0.6147 0.5125 0.6147 0.7840
No log 5.7021 268 0.6272 0.4697 0.6272 0.7920
No log 5.7447 270 0.7287 0.3957 0.7287 0.8537
No log 5.7872 272 0.7012 0.4375 0.7012 0.8374
No log 5.8298 274 0.6273 0.4747 0.6273 0.7920
No log 5.8723 276 0.6885 0.4642 0.6885 0.8298
No log 5.9149 278 0.6701 0.4576 0.6701 0.8186
No log 5.9574 280 0.6511 0.4820 0.6511 0.8069
No log 6.0 282 0.6749 0.4422 0.6749 0.8216
No log 6.0426 284 0.6707 0.4495 0.6707 0.8189
No log 6.0851 286 0.7197 0.4504 0.7197 0.8484
No log 6.1277 288 0.7214 0.4819 0.7214 0.8493
No log 6.1702 290 0.6972 0.4211 0.6972 0.8350
No log 6.2128 292 0.7366 0.4131 0.7366 0.8583
No log 6.2553 294 0.8379 0.2993 0.8379 0.9153
No log 6.2979 296 0.8487 0.3258 0.8487 0.9213
No log 6.3404 298 0.7299 0.3808 0.7299 0.8543
No log 6.3830 300 0.6650 0.5079 0.6650 0.8155
No log 6.4255 302 0.7218 0.4879 0.7218 0.8496
No log 6.4681 304 0.7740 0.4303 0.7740 0.8798
No log 6.5106 306 0.6735 0.5290 0.6735 0.8207
No log 6.5532 308 0.6517 0.5179 0.6517 0.8073
No log 6.5957 310 0.6361 0.5021 0.6361 0.7976
No log 6.6383 312 0.6417 0.4914 0.6417 0.8011
No log 6.6809 314 0.6424 0.4358 0.6424 0.8015
No log 6.7234 316 0.6383 0.5268 0.6383 0.7989
No log 6.7660 318 0.6499 0.3810 0.6499 0.8062
No log 6.8085 320 0.6304 0.4547 0.6304 0.7940
No log 6.8511 322 0.6357 0.4768 0.6357 0.7973
No log 6.8936 324 0.6307 0.4768 0.6307 0.7942
No log 6.9362 326 0.6352 0.4364 0.6352 0.7970
No log 6.9787 328 0.6313 0.4082 0.6313 0.7946
No log 7.0213 330 0.6390 0.5326 0.6390 0.7994
No log 7.0638 332 0.6404 0.5326 0.6404 0.8002
No log 7.1064 334 0.6407 0.4322 0.6407 0.8004
No log 7.1489 336 0.6521 0.4105 0.6521 0.8075
No log 7.1915 338 0.6833 0.4241 0.6833 0.8266
No log 7.2340 340 0.6678 0.3856 0.6678 0.8172
No log 7.2766 342 0.6363 0.4082 0.6363 0.7977
No log 7.3191 344 0.6194 0.4380 0.6194 0.7870
No log 7.3617 346 0.6097 0.4380 0.6097 0.7809
No log 7.4043 348 0.6070 0.4820 0.6070 0.7791
No log 7.4468 350 0.6031 0.4701 0.6031 0.7766
No log 7.4894 352 0.6093 0.4919 0.6093 0.7806
No log 7.5319 354 0.6094 0.4516 0.6094 0.7806
No log 7.5745 356 0.6175 0.4516 0.6175 0.7858
No log 7.6170 358 0.6619 0.4913 0.6619 0.8136
No log 7.6596 360 0.7030 0.4562 0.7030 0.8384
No log 7.7021 362 0.6711 0.4534 0.6711 0.8192
No log 7.7447 364 0.6393 0.5134 0.6393 0.7996
No log 7.7872 366 0.7175 0.4925 0.7175 0.8470
No log 7.8298 368 0.7257 0.4189 0.7257 0.8519
No log 7.8723 370 0.6662 0.4828 0.6662 0.8162
No log 7.9149 372 0.6695 0.5177 0.6695 0.8182
No log 7.9574 374 0.7480 0.4389 0.7480 0.8649
No log 8.0 376 0.7947 0.4434 0.7947 0.8914
No log 8.0426 378 0.7757 0.4822 0.7757 0.8807
No log 8.0851 380 0.7492 0.5176 0.7492 0.8656
No log 8.1277 382 0.7575 0.5411 0.7575 0.8703
No log 8.1702 384 0.7225 0.5044 0.7225 0.8500
No log 8.2128 386 0.7002 0.4955 0.7002 0.8368
No log 8.2553 388 0.7693 0.4197 0.7693 0.8771
No log 8.2979 390 0.7702 0.4056 0.7702 0.8776
No log 8.3404 392 0.7766 0.4479 0.7766 0.8812
No log 8.3830 394 0.7946 0.3870 0.7946 0.8914
No log 8.4255 396 0.8110 0.3832 0.8110 0.9006
No log 8.4681 398 0.7897 0.4299 0.7897 0.8886
No log 8.5106 400 0.7091 0.5150 0.7091 0.8421
No log 8.5532 402 0.6615 0.5245 0.6615 0.8133
No log 8.5957 404 0.6383 0.5044 0.6383 0.7989
No log 8.6383 406 0.6388 0.5568 0.6388 0.7992
No log 8.6809 408 0.6322 0.5632 0.6322 0.7951
No log 8.7234 410 0.6303 0.4495 0.6303 0.7939
No log 8.7660 412 0.6577 0.5368 0.6577 0.8110
No log 8.8085 414 0.6885 0.4841 0.6885 0.8298
No log 8.8511 416 0.7035 0.4783 0.7035 0.8388
No log 8.8936 418 0.7042 0.4353 0.7042 0.8392
No log 8.9362 420 0.6828 0.4348 0.6828 0.8263
No log 8.9787 422 0.6805 0.4496 0.6805 0.8249
No log 9.0213 424 0.7345 0.4772 0.7345 0.8570
No log 9.0638 426 0.7746 0.4199 0.7746 0.8801
No log 9.1064 428 0.7146 0.3909 0.7146 0.8454
No log 9.1489 430 0.6563 0.4758 0.6563 0.8101
No log 9.1915 432 0.6442 0.4813 0.6442 0.8026
No log 9.2340 434 0.6434 0.4915 0.6434 0.8021
No log 9.2766 436 0.6466 0.5037 0.6466 0.8041
No log 9.3191 438 0.6472 0.5190 0.6472 0.8045
No log 9.3617 440 0.6436 0.4806 0.6436 0.8023
No log 9.4043 442 0.6542 0.5174 0.6542 0.8088
No log 9.4468 444 0.6575 0.4794 0.6575 0.8109
No log 9.4894 446 0.6393 0.4806 0.6393 0.7995
No log 9.5319 448 0.6365 0.4928 0.6365 0.7978
No log 9.5745 450 0.6273 0.4928 0.6273 0.7920
No log 9.6170 452 0.6194 0.5201 0.6194 0.7870
No log 9.6596 454 0.6069 0.4757 0.6069 0.7790
No log 9.7021 456 0.6034 0.5195 0.6034 0.7768
No log 9.7447 458 0.6126 0.5268 0.6126 0.7827
No log 9.7872 460 0.6256 0.4768 0.6256 0.7910
No log 9.8298 462 0.6326 0.5076 0.6326 0.7954
No log 9.8723 464 0.6272 0.4768 0.6272 0.7920
No log 9.9149 466 0.6376 0.4637 0.6376 0.7985
No log 9.9574 468 0.6504 0.4595 0.6504 0.8064
No log 10.0 470 0.6532 0.5057 0.6532 0.8082
No log 10.0426 472 0.6629 0.5087 0.6629 0.8142
No log 10.0851 474 0.6560 0.4820 0.6560 0.8100
No log 10.1277 476 0.6624 0.4885 0.6624 0.8139
No log 10.1702 478 0.6653 0.4820 0.6653 0.8157
No log 10.2128 480 0.6643 0.4820 0.6643 0.8150
No log 10.2553 482 0.6535 0.5107 0.6535 0.8084
No log 10.2979 484 0.6455 0.4820 0.6455 0.8035
No log 10.3404 486 0.6447 0.5340 0.6447 0.8029
No log 10.3830 488 0.6486 0.5340 0.6486 0.8054
No log 10.4255 490 0.6520 0.4820 0.6520 0.8074
No log 10.4681 492 0.6497 0.5009 0.6497 0.8061
No log 10.5106 494 0.6629 0.4461 0.6629 0.8142
No log 10.5532 496 0.6579 0.4461 0.6579 0.8111
No log 10.5957 498 0.6388 0.5357 0.6388 0.7993
0.3595 10.6383 500 0.6424 0.5357 0.6424 0.8015
0.3595 10.6809 502 0.6511 0.5640 0.6511 0.8069
0.3595 10.7234 504 0.6502 0.5640 0.6502 0.8064
0.3595 10.7660 506 0.6405 0.5057 0.6405 0.8003
0.3595 10.8085 508 0.6677 0.4734 0.6677 0.8171
0.3595 10.8511 510 0.6622 0.4795 0.6622 0.8137
0.3595 10.8936 512 0.6430 0.5340 0.6430 0.8019
0.3595 10.9362 514 0.6522 0.4985 0.6522 0.8076

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
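With the framework versions above, the checkpoint can be loaded via the transformers Auto classes. A sketch, assuming the model is hosted under the repo id in the title and exposes a sequence-classification head (the input string is illustrative):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k9_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

# Score one essay (Arabic text); logits shape depends on the saved head config.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits
```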
Safetensors model size: 0.1B params (F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k9_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02