ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k13_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7071
  • Qwk: 0.3567
  • Mse: 0.7071
  • Rmse: 0.8409
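Here Qwk is quadratic weighted kappa, and Rmse is simply the square root of Mse; the fact that Loss equals Mse is consistent with a mean-squared-error regression objective. As a reference point, the sketch below shows one way such numbers can be computed with scikit-learn; the label and prediction arrays are hypothetical and only illustrate the metric calls.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and raw model outputs, for illustration only.
y_true = np.array([0, 1, 2, 1, 3])
y_pred_raw = np.array([0.2, 1.4, 1.8, 0.9, 2.6])

mse = mean_squared_error(y_true, y_pred_raw)   # "Mse" column
rmse = np.sqrt(mse)                            # "Rmse" column

# Quadratic weighted kappa ("Qwk") is defined on discrete scores, so the
# continuous outputs are rounded and clipped to the label range first.
y_pred = np.clip(np.rint(y_pred_raw), y_true.min(), y_true.max()).astype(int)
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

print(f"Mse={mse:.4f}  Rmse={rmse:.4f}  Qwk={qwk:.4f}")
```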

Model description

More information needed

Intended uses & limitations

More information needed
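No usage example is provided with this card. The following is a minimal, hypothetical sketch of loading the checkpoint with Transformers for essay scoring, assuming a single-output regression head (consistent with the reported Loss equalling Mse); the input text is a placeholder.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k13_task7_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Placeholder Arabic text; replace with a real essay to score.
text = "نص المقال هنا"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()

print(f"Predicted organization score: {score:.2f}")
```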

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
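The training script itself is not included in the card. A rough sketch of an equivalent Trainer setup under these hyperparameters could look like the following; the output path is hypothetical, the dataset objects and metrics function are omitted, and num_labels=1 is an assumption based on the MSE-style loss reported above.

```python
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_labels=1 assumes a regression head, matching the MSE loss/metrics above.
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)

args = TrainingArguments(
    output_dir="arabert-task7-organization",   # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=100,
    seed=42,
    lr_scheduler_type="linear",
    # The default optimizer uses betas=(0.9, 0.999) and epsilon=1e-8,
    # matching the Adam settings listed above.
)

trainer = Trainer(
    model=model,
    args=args,
    # train_dataset=..., eval_dataset=..., compute_metrics=...  (not released)
)
# trainer.train()
```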

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0455 2 2.5163 -0.0449 2.5163 1.5863
No log 0.0909 4 1.2588 0.0726 1.2588 1.1220
No log 0.1364 6 1.2992 -0.2509 1.2992 1.1398
No log 0.1818 8 1.2875 -0.2580 1.2875 1.1347
No log 0.2273 10 1.0711 -0.0654 1.0711 1.0349
No log 0.2727 12 1.0578 -0.0687 1.0578 1.0285
No log 0.3182 14 0.9715 -0.0970 0.9715 0.9857
No log 0.3636 16 0.9314 0.1358 0.9314 0.9651
No log 0.4091 18 1.1539 0.0169 1.1539 1.0742
No log 0.4545 20 1.1202 0.0169 1.1202 1.0584
No log 0.5 22 0.8871 0.1673 0.8871 0.9419
No log 0.5455 24 0.7330 0.1372 0.7330 0.8561
No log 0.5909 26 0.7351 -0.0054 0.7351 0.8574
No log 0.6364 28 0.7844 0.0359 0.7844 0.8856
No log 0.6818 30 0.8217 0.0 0.8217 0.9065
No log 0.7273 32 0.8278 0.0 0.8278 0.9098
No log 0.7727 34 0.8465 0.0 0.8465 0.9201
No log 0.8182 36 0.8291 0.0 0.8291 0.9106
No log 0.8636 38 0.7992 -0.0027 0.7992 0.8940
No log 0.9091 40 0.7893 -0.0027 0.7893 0.8884
No log 0.9545 42 0.7824 0.1561 0.7824 0.8845
No log 1.0 44 0.7760 0.1962 0.7760 0.8809
No log 1.0455 46 0.8247 0.1327 0.8247 0.9081
No log 1.0909 48 0.8473 0.1327 0.8473 0.9205
No log 1.1364 50 0.8271 0.0846 0.8271 0.9094
No log 1.1818 52 0.8927 0.0053 0.8927 0.9448
No log 1.2273 54 0.8989 0.0944 0.8989 0.9481
No log 1.2727 56 0.8655 0.0474 0.8655 0.9303
No log 1.3182 58 0.8366 -0.0103 0.8366 0.9147
No log 1.3636 60 0.8069 0.2145 0.8069 0.8983
No log 1.4091 62 0.8049 0.2027 0.8049 0.8971
No log 1.4545 64 0.8026 0.0295 0.8026 0.8959
No log 1.5 66 0.7967 0.1807 0.7967 0.8926
No log 1.5455 68 0.7758 0.1094 0.7758 0.8808
No log 1.5909 70 0.7803 0.1660 0.7803 0.8834
No log 1.6364 72 0.8112 0.0898 0.8112 0.9007
No log 1.6818 74 0.8073 0.0455 0.8073 0.8985
No log 1.7273 76 0.7927 0.0870 0.7927 0.8904
No log 1.7727 78 0.7839 0.1904 0.7839 0.8854
No log 1.8182 80 0.7877 0.1569 0.7877 0.8875
No log 1.8636 82 0.8034 0.1942 0.8034 0.8963
No log 1.9091 84 0.8365 0.1548 0.8365 0.9146
No log 1.9545 86 0.8889 0.0569 0.8889 0.9428
No log 2.0 88 0.8669 0.0526 0.8669 0.9311
No log 2.0455 90 0.8231 0.0905 0.8231 0.9073
No log 2.0909 92 0.7989 0.0410 0.7989 0.8938
No log 2.1364 94 0.7358 0.2677 0.7358 0.8578
No log 2.1818 96 0.7302 0.2336 0.7302 0.8545
No log 2.2273 98 0.7456 0.1187 0.7456 0.8635
No log 2.2727 100 0.7493 0.1187 0.7493 0.8656
No log 2.3182 102 0.7406 0.1863 0.7406 0.8606
No log 2.3636 104 0.8554 0.2866 0.8554 0.9249
No log 2.4091 106 0.8938 0.2467 0.8938 0.9454
No log 2.4545 108 0.8187 0.3369 0.8187 0.9048
No log 2.5 110 0.8395 0.2837 0.8395 0.9162
No log 2.5455 112 0.9267 0.2373 0.9267 0.9627
No log 2.5909 114 1.1087 0.1750 1.1087 1.0530
No log 2.6364 116 1.1481 0.1775 1.1481 1.0715
No log 2.6818 118 0.9846 0.1249 0.9846 0.9922
No log 2.7273 120 0.8192 0.2550 0.8192 0.9051
No log 2.7727 122 0.7295 0.2677 0.7295 0.8541
No log 2.8182 124 0.7032 0.2677 0.7032 0.8386
No log 2.8636 126 0.6827 0.2677 0.6827 0.8263
No log 2.9091 128 0.6815 0.2336 0.6815 0.8255
No log 2.9545 130 0.6891 0.2677 0.6891 0.8301
No log 3.0 132 0.7115 0.3755 0.7115 0.8435
No log 3.0455 134 0.7761 0.3372 0.7761 0.8810
No log 3.0909 136 0.7631 0.3167 0.7631 0.8736
No log 3.1364 138 0.7020 0.3465 0.7020 0.8378
No log 3.1818 140 0.6957 0.3050 0.6957 0.8341
No log 3.2273 142 0.7284 0.2751 0.7284 0.8535
No log 3.2727 144 0.7321 0.3023 0.7321 0.8556
No log 3.3182 146 0.7140 0.2717 0.7140 0.8450
No log 3.3636 148 0.7303 0.3041 0.7303 0.8546
No log 3.4091 150 0.7290 0.3041 0.7290 0.8538
No log 3.4545 152 0.7213 0.3002 0.7213 0.8493
No log 3.5 154 0.7492 0.3226 0.7492 0.8656
No log 3.5455 156 0.7982 0.2870 0.7982 0.8934
No log 3.5909 158 0.7464 0.3457 0.7464 0.8640
No log 3.6364 160 0.7019 0.3160 0.7019 0.8378
No log 3.6818 162 0.7590 0.3976 0.7590 0.8712
No log 3.7273 164 0.7510 0.4144 0.7510 0.8666
No log 3.7727 166 0.6939 0.3816 0.6939 0.8330
No log 3.8182 168 0.6686 0.3884 0.6686 0.8177
No log 3.8636 170 0.6555 0.3467 0.6555 0.8096
No log 3.9091 172 0.6726 0.4036 0.6726 0.8201
No log 3.9545 174 0.7161 0.3902 0.7161 0.8462
No log 4.0 176 0.6888 0.4249 0.6888 0.8299
No log 4.0455 178 0.6291 0.3530 0.6291 0.7931
No log 4.0909 180 0.6162 0.3945 0.6162 0.7850
No log 4.1364 182 0.6117 0.3945 0.6117 0.7821
No log 4.1818 184 0.6020 0.4206 0.6020 0.7759
No log 4.2273 186 0.6037 0.4136 0.6037 0.7770
No log 4.2727 188 0.6105 0.3144 0.6105 0.7814
No log 4.3182 190 0.6960 0.3586 0.6960 0.8342
No log 4.3636 192 0.7893 0.3847 0.7893 0.8884
No log 4.4091 194 0.6958 0.3844 0.6958 0.8341
No log 4.4545 196 0.6025 0.4006 0.6025 0.7762
No log 4.5 198 0.6609 0.4674 0.6609 0.8130
No log 4.5455 200 0.7863 0.4527 0.7863 0.8867
No log 4.5909 202 0.7670 0.4527 0.7670 0.8758
No log 4.6364 204 0.6211 0.4315 0.6211 0.7881
No log 4.6818 206 0.6108 0.4872 0.6108 0.7815
No log 4.7273 208 0.6556 0.4243 0.6556 0.8097
No log 4.7727 210 0.6547 0.4684 0.6547 0.8091
No log 4.8182 212 0.6160 0.5157 0.6160 0.7848
No log 4.8636 214 0.6029 0.3974 0.6029 0.7765
No log 4.9091 216 0.6164 0.3902 0.6164 0.7851
No log 4.9545 218 0.6399 0.4467 0.6399 0.8000
No log 5.0 220 0.7162 0.4077 0.7162 0.8463
No log 5.0455 222 0.8434 0.3906 0.8434 0.9184
No log 5.0909 224 0.8174 0.4177 0.8174 0.9041
No log 5.1364 226 0.7233 0.4166 0.7233 0.8504
No log 5.1818 228 0.8111 0.3854 0.8111 0.9006
No log 5.2273 230 1.0463 0.2176 1.0463 1.0229
No log 5.2727 232 1.0610 0.2176 1.0610 1.0301
No log 5.3182 234 0.8820 0.3706 0.8820 0.9392
No log 5.3636 236 0.7729 0.3768 0.7729 0.8791
No log 5.4091 238 0.7088 0.4037 0.7088 0.8419
No log 5.4545 240 0.7128 0.4460 0.7128 0.8442
No log 5.5 242 0.8408 0.4183 0.8408 0.9170
No log 5.5455 244 0.8339 0.3786 0.8339 0.9132
No log 5.5909 246 0.7583 0.2917 0.7583 0.8708
No log 5.6364 248 0.7569 0.3159 0.7569 0.8700
No log 5.6818 250 0.8237 0.3727 0.8237 0.9076
No log 5.7273 252 0.7983 0.4051 0.7983 0.8935
No log 5.7727 254 0.6522 0.3914 0.6522 0.8076
No log 5.8182 256 0.5787 0.4934 0.5787 0.7608
No log 5.8636 258 0.5943 0.4478 0.5943 0.7709
No log 5.9091 260 0.6000 0.5389 0.6000 0.7746
No log 5.9545 262 0.6312 0.4464 0.6312 0.7945
No log 6.0 264 0.6809 0.4199 0.6809 0.8252
No log 6.0455 266 0.5847 0.4513 0.5847 0.7647
No log 6.0909 268 0.5319 0.5151 0.5319 0.7293
No log 6.1364 270 0.5704 0.5463 0.5704 0.7552
No log 6.1818 272 0.7008 0.3727 0.7008 0.8371
No log 6.2273 274 0.7620 0.3849 0.7620 0.8729
No log 6.2727 276 0.7243 0.4170 0.7243 0.8511
No log 6.3182 278 0.6436 0.4802 0.6436 0.8022
No log 6.3636 280 0.6284 0.4378 0.6284 0.7927
No log 6.4091 282 0.6578 0.4354 0.6578 0.8111
No log 6.4545 284 0.6571 0.4354 0.6571 0.8106
No log 6.5 286 0.6408 0.4253 0.6408 0.8005
No log 6.5455 288 0.6295 0.4364 0.6295 0.7934
No log 6.5909 290 0.6030 0.4738 0.6030 0.7765
No log 6.6364 292 0.6184 0.5098 0.6184 0.7864
No log 6.6818 294 0.6259 0.4795 0.6259 0.7912
No log 6.7273 296 0.5856 0.4966 0.5856 0.7652
No log 6.7727 298 0.5563 0.4719 0.5563 0.7458
No log 6.8182 300 0.5598 0.4820 0.5598 0.7482
No log 6.8636 302 0.5522 0.4866 0.5522 0.7431
No log 6.9091 304 0.5528 0.4459 0.5528 0.7435
No log 6.9545 306 0.5590 0.4705 0.5590 0.7476
No log 7.0 308 0.5634 0.4330 0.5634 0.7506
No log 7.0455 310 0.5686 0.3416 0.5686 0.7540
No log 7.0909 312 0.5919 0.3738 0.5919 0.7693
No log 7.1364 314 0.6200 0.4602 0.6200 0.7874
No log 7.1818 316 0.6227 0.3482 0.6227 0.7891
No log 7.2273 318 0.6299 0.3316 0.6299 0.7937
No log 7.2727 320 0.6382 0.2633 0.6382 0.7989
No log 7.3182 322 0.6405 0.2633 0.6405 0.8003
No log 7.3636 324 0.6477 0.2988 0.6477 0.8048
No log 7.4091 326 0.6404 0.3002 0.6404 0.8003
No log 7.4545 328 0.6607 0.4227 0.6607 0.8128
No log 7.5 330 0.6584 0.3934 0.6584 0.8114
No log 7.5455 332 0.6538 0.3738 0.6538 0.8086
No log 7.5909 334 0.6785 0.4356 0.6785 0.8237
No log 7.6364 336 0.7295 0.4334 0.7295 0.8541
No log 7.6818 338 0.7522 0.4369 0.7522 0.8673
No log 7.7273 340 0.6867 0.4078 0.6867 0.8287
No log 7.7727 342 0.6594 0.4322 0.6594 0.8121
No log 7.8182 344 0.6372 0.4236 0.6372 0.7982
No log 7.8636 346 0.6420 0.4838 0.6420 0.8012
No log 7.9091 348 0.6842 0.5354 0.6842 0.8272
No log 7.9545 350 0.7093 0.4993 0.7093 0.8422
No log 8.0 352 0.6506 0.6450 0.6506 0.8066
No log 8.0455 354 0.6074 0.5982 0.6074 0.7794
No log 8.0909 356 0.6151 0.5902 0.6151 0.7843
No log 8.1364 358 0.7130 0.5721 0.7130 0.8444
No log 8.1818 360 0.7500 0.5308 0.7500 0.8660
No log 8.2273 362 0.6487 0.5981 0.6487 0.8054
No log 8.2727 364 0.5886 0.5701 0.5886 0.7672
No log 8.3182 366 0.6469 0.4727 0.6469 0.8043
No log 8.3636 368 0.6638 0.4654 0.6638 0.8147
No log 8.4091 370 0.6566 0.4654 0.6566 0.8103
No log 8.4545 372 0.6392 0.4179 0.6392 0.7995
No log 8.5 374 0.6476 0.3978 0.6476 0.8047
No log 8.5455 376 0.6598 0.4074 0.6598 0.8123
No log 8.5909 378 0.6649 0.4044 0.6649 0.8154
No log 8.6364 380 0.6928 0.4261 0.6928 0.8324
No log 8.6818 382 0.6868 0.4016 0.6868 0.8287
No log 8.7273 384 0.6629 0.4221 0.6629 0.8142
No log 8.7727 386 0.6497 0.4458 0.6497 0.8060
No log 8.8182 388 0.6473 0.4516 0.6473 0.8045
No log 8.8636 390 0.6399 0.4516 0.6399 0.8000
No log 8.9091 392 0.6528 0.3972 0.6528 0.8080
No log 8.9545 394 0.7213 0.3546 0.7213 0.8493
No log 9.0 396 0.7134 0.3546 0.7134 0.8446
No log 9.0455 398 0.6859 0.3546 0.6859 0.8282
No log 9.0909 400 0.6436 0.4022 0.6436 0.8023
No log 9.1364 402 0.6462 0.4601 0.6462 0.8039
No log 9.1818 404 0.6467 0.4889 0.6467 0.8041
No log 9.2273 406 0.6165 0.4758 0.6165 0.7852
No log 9.2727 408 0.5972 0.4657 0.5972 0.7728
No log 9.3182 410 0.5852 0.5079 0.5852 0.7650
No log 9.3636 412 0.5781 0.4466 0.5781 0.7603
No log 9.4091 414 0.5756 0.4869 0.5756 0.7587
No log 9.4545 416 0.5552 0.4809 0.5552 0.7451
No log 9.5 418 0.5489 0.5344 0.5489 0.7409
No log 9.5455 420 0.5428 0.5422 0.5428 0.7368
No log 9.5909 422 0.5392 0.4615 0.5392 0.7343
No log 9.6364 424 0.5680 0.5479 0.5680 0.7537
No log 9.6818 426 0.5594 0.5479 0.5594 0.7479
No log 9.7273 428 0.5365 0.5389 0.5365 0.7325
No log 9.7727 430 0.5765 0.5239 0.5765 0.7593
No log 9.8182 432 0.6043 0.4954 0.6043 0.7774
No log 9.8636 434 0.5977 0.5488 0.5977 0.7731
No log 9.9091 436 0.6139 0.5438 0.6139 0.7835
No log 9.9545 438 0.6283 0.5411 0.6283 0.7927
No log 10.0 440 0.6486 0.5324 0.6486 0.8054
No log 10.0455 442 0.6426 0.4397 0.6426 0.8016
No log 10.0909 444 0.6371 0.3787 0.6371 0.7982
No log 10.1364 446 0.6330 0.3787 0.6330 0.7956
No log 10.1818 448 0.6463 0.3837 0.6463 0.8040
No log 10.2273 450 0.6760 0.4302 0.6760 0.8222
No log 10.2727 452 0.6944 0.4795 0.6944 0.8333
No log 10.3182 454 0.6517 0.4627 0.6517 0.8073
No log 10.3636 456 0.6390 0.4627 0.6390 0.7994
No log 10.4091 458 0.6494 0.5133 0.6494 0.8058
No log 10.4545 460 0.6960 0.4982 0.6960 0.8343
No log 10.5 462 0.7037 0.4982 0.7037 0.8388
No log 10.5455 464 0.6534 0.4982 0.6534 0.8083
No log 10.5909 466 0.5934 0.5098 0.5934 0.7703
No log 10.6364 468 0.5670 0.4874 0.5670 0.7530
No log 10.6818 470 0.5759 0.5368 0.5759 0.7589
No log 10.7273 472 0.5753 0.5051 0.5753 0.7585
No log 10.7727 474 0.5974 0.4924 0.5974 0.7729
No log 10.8182 476 0.6243 0.5103 0.6243 0.7901
No log 10.8636 478 0.6488 0.5229 0.6488 0.8055
No log 10.9091 480 0.6695 0.5321 0.6695 0.8182
No log 10.9545 482 0.7169 0.5567 0.7169 0.8467
No log 11.0 484 0.6809 0.5013 0.6809 0.8252
No log 11.0455 486 0.6341 0.5173 0.6341 0.7963
No log 11.0909 488 0.6296 0.5324 0.6296 0.7935
No log 11.1364 490 0.6582 0.4721 0.6582 0.8113
No log 11.1818 492 0.7056 0.4144 0.7056 0.8400
No log 11.2273 494 0.7135 0.4189 0.7135 0.8447
No log 11.2727 496 0.6937 0.4340 0.6937 0.8329
No log 11.3182 498 0.6759 0.4681 0.6759 0.8221
0.3569 11.3636 500 0.6712 0.4681 0.6712 0.8192
0.3569 11.4091 502 0.6780 0.4513 0.6780 0.8234
0.3569 11.4545 504 0.6702 0.4282 0.6702 0.8186
0.3569 11.5 506 0.6452 0.4393 0.6452 0.8032
0.3569 11.5455 508 0.6524 0.4393 0.6524 0.8077
0.3569 11.5909 510 0.7071 0.3567 0.7071 0.8409

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
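As a quick sanity check, the installed library versions can be compared against the list above:

```python
# Print installed versions to compare with the framework versions listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.44.2
print("PyTorch:", torch.__version__)              # expected 2.4.0+cu118
print("Datasets:", datasets.__version__)          # expected 2.21.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.19.1
```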