ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5298
  • Qwk: 0.5022
  • Mse: 0.5298
  • Rmse: 0.7279
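Qwk here is Cohen's quadratic weighted kappa, and Rmse is the square root of the Mse; note that the reported Loss equals the Mse, which suggests a mean-squared-error objective. A minimal pure-Python sketch of how these metrics are defined (toy data, not this card's evaluation code):

```python
import math

def mse(y_true, y_pred):
    """Mean squared error; the card's Loss column equals Mse."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic weights (the Qwk column)."""
    n = len(y_true)
    # Observed confusion matrix over integer labels
    obs = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Marginal histograms of true and predicted labels
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(num_classes)) for j in range(num_classes)]
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / (num_classes - 1) ** 2  # quadratic disagreement weight
            num += w * obs[i][j]
            den += w * hist_t[i] * hist_p[j] / n  # expected count under independence
    return 1.0 - num / den

y_true, y_pred = [0, 1, 2, 1, 0], [0, 1, 2, 2, 0]
print(mse(y_true, y_pred))                      # 0.2
print(math.sqrt(mse(y_true, y_pred)))           # RMSE ≈ 0.4472
print(quadratic_weighted_kappa(y_true, y_pred, 3))  # ≈ 0.8571
```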

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
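The linear scheduler decays the learning rate from 2e-05 toward zero over the scheduled run. From the results table, epoch 1.0 corresponds to step 44, so 100 scheduled epochs would be roughly 4400 optimizer steps. A sketch of that schedule, assuming zero warmup steps (none are listed):

```python
def linear_lr(step, base_lr=2e-05, total_steps=4400, warmup_steps=0):
    """Linear warmup (if any) followed by linear decay to zero,
    mirroring lr_scheduler_type: linear."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)

print(linear_lr(0))     # 2e-05 at the start
print(linear_lr(2200))  # 1e-05 halfway through the scheduled run
print(linear_lr(4400))  # 0.0 at the scheduled end
```

Since the logged results stop at step 520 (epoch ~11.8), training appears to have ended well before the scheduled 100 epochs, so the decay would only have been partially traversed.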

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0455 2 2.4361 -0.0646 2.4361 1.5608
No log 0.0909 4 1.2781 0.0679 1.2781 1.1305
No log 0.1364 6 1.1987 -0.0886 1.1987 1.0948
No log 0.1818 8 1.0693 0.1010 1.0693 1.0341
No log 0.2273 10 0.8312 0.0444 0.8312 0.9117
No log 0.2727 12 0.7905 0.1724 0.7905 0.8891
No log 0.3182 14 1.3614 0.1694 1.3614 1.1668
No log 0.3636 16 1.2133 0.1694 1.2133 1.1015
No log 0.4091 18 0.9059 0.2008 0.9059 0.9518
No log 0.4545 20 0.8375 -0.0426 0.8375 0.9152
No log 0.5 22 0.7690 0.0840 0.7690 0.8769
No log 0.5455 24 0.7958 0.1139 0.7958 0.8921
No log 0.5909 26 0.7407 0.1922 0.7407 0.8606
No log 0.6364 28 0.7684 0.0 0.7684 0.8766
No log 0.6818 30 0.8974 0.0547 0.8974 0.9473
No log 0.7273 32 1.0154 0.1313 1.0154 1.0077
No log 0.7727 34 0.9447 0.1957 0.9447 0.9719
No log 0.8182 36 0.8862 0.2331 0.8862 0.9414
No log 0.8636 38 0.7866 0.1739 0.7866 0.8869
No log 0.9091 40 0.6531 0.1942 0.6531 0.8081
No log 0.9545 42 0.6932 0.1456 0.6932 0.8326
No log 1.0 44 0.8511 0.0 0.8511 0.9225
No log 1.0455 46 0.8871 0.0 0.8871 0.9419
No log 1.0909 48 0.7932 0.0481 0.7932 0.8906
No log 1.1364 50 0.9002 0.1724 0.9002 0.9488
No log 1.1818 52 0.9945 0.2087 0.9945 0.9972
No log 1.2273 54 0.9051 0.0947 0.9051 0.9514
No log 1.2727 56 0.8769 0.1739 0.8769 0.9364
No log 1.3182 58 0.8373 0.1352 0.8373 0.9150
No log 1.3636 60 0.8862 0.2027 0.8862 0.9414
No log 1.4091 62 0.9602 0.1618 0.9602 0.9799
No log 1.4545 64 0.9607 0.2508 0.9607 0.9802
No log 1.5 66 0.8251 0.2672 0.8251 0.9083
No log 1.5455 68 0.7079 0.2324 0.7079 0.8414
No log 1.5909 70 0.7096 0.2642 0.7096 0.8424
No log 1.6364 72 0.7687 0.3094 0.7687 0.8767
No log 1.6818 74 0.9230 0.3173 0.9230 0.9608
No log 1.7273 76 1.1829 0.1453 1.1829 1.0876
No log 1.7727 78 1.2465 0.1670 1.2465 1.1165
No log 1.8182 80 1.1358 0.1924 1.1358 1.0658
No log 1.8636 82 0.9687 0.3111 0.9687 0.9842
No log 1.9091 84 0.7652 0.3013 0.7652 0.8748
No log 1.9545 86 0.6228 0.1702 0.6228 0.7892
No log 2.0 88 0.6131 0.2270 0.6131 0.7830
No log 2.0455 90 0.6173 0.2270 0.6173 0.7857
No log 2.0909 92 0.6168 0.0428 0.6168 0.7854
No log 2.1364 94 0.6222 0.0428 0.6222 0.7888
No log 2.1818 96 0.6264 0.0889 0.6264 0.7915
No log 2.2273 98 0.6197 0.0840 0.6197 0.7872
No log 2.2727 100 0.6168 0.2087 0.6168 0.7854
No log 2.3182 102 0.7209 0.2526 0.7209 0.8491
No log 2.3636 104 0.8578 0.3509 0.8578 0.9262
No log 2.4091 106 0.8921 0.3521 0.8921 0.9445
No log 2.4545 108 0.8419 0.3553 0.8419 0.9175
No log 2.5 110 0.8287 0.3695 0.8287 0.9103
No log 2.5455 112 0.7505 0.3416 0.7505 0.8663
No log 2.5909 114 0.5843 0.2443 0.5843 0.7644
No log 2.6364 116 0.5513 0.3039 0.5513 0.7425
No log 2.6818 118 0.5849 0.3832 0.5849 0.7648
No log 2.7273 120 0.5938 0.4151 0.5938 0.7706
No log 2.7727 122 0.5578 0.4339 0.5578 0.7469
No log 2.8182 124 0.5359 0.4224 0.5359 0.7320
No log 2.8636 126 0.5418 0.4224 0.5418 0.7361
No log 2.9091 128 0.5829 0.4298 0.5829 0.7635
No log 2.9545 130 0.5593 0.5044 0.5593 0.7478
No log 3.0 132 0.5682 0.5101 0.5682 0.7538
No log 3.0455 134 0.6113 0.4235 0.6113 0.7819
No log 3.0909 136 0.8492 0.3274 0.8492 0.9215
No log 3.1364 138 0.9724 0.2246 0.9724 0.9861
No log 3.1818 140 0.8844 0.3173 0.8844 0.9404
No log 3.2273 142 0.6482 0.4329 0.6482 0.8051
No log 3.2727 144 0.5954 0.4538 0.5954 0.7716
No log 3.3182 146 0.6191 0.4235 0.6191 0.7869
No log 3.3636 148 0.6293 0.4174 0.6293 0.7933
No log 3.4091 150 0.6113 0.4345 0.6113 0.7819
No log 3.4545 152 0.6401 0.4370 0.6401 0.8001
No log 3.5 154 0.6712 0.3653 0.6712 0.8192
No log 3.5455 156 0.6721 0.3695 0.6721 0.8198
No log 3.5909 158 0.7966 0.3678 0.7966 0.8925
No log 3.6364 160 0.8054 0.3183 0.8054 0.8974
No log 3.6818 162 0.6980 0.3633 0.6980 0.8354
No log 3.7273 164 0.7073 0.3633 0.7073 0.8410
No log 3.7727 166 0.9149 0.3446 0.9149 0.9565
No log 3.8182 168 1.0928 0.1737 1.0928 1.0454
No log 3.8636 170 1.0027 0.2537 1.0027 1.0014
No log 3.9091 172 0.7035 0.2758 0.7035 0.8387
No log 3.9545 174 0.6277 0.3382 0.6277 0.7923
No log 4.0 176 0.6787 0.4190 0.6787 0.8238
No log 4.0455 178 0.6251 0.4147 0.6251 0.7907
No log 4.0909 180 0.6405 0.5339 0.6405 0.8003
No log 4.1364 182 0.6977 0.4729 0.6977 0.8353
No log 4.1818 184 0.6706 0.5212 0.6706 0.8189
No log 4.2273 186 0.6194 0.5339 0.6194 0.7870
No log 4.2727 188 0.5878 0.5361 0.5878 0.7667
No log 4.3182 190 0.5902 0.5076 0.5902 0.7682
No log 4.3636 192 0.5919 0.5076 0.5919 0.7693
No log 4.4091 194 0.5871 0.5457 0.5871 0.7662
No log 4.4545 196 0.6653 0.4463 0.6653 0.8157
No log 4.5 198 0.8670 0.3805 0.8670 0.9311
No log 4.5455 200 1.0075 0.2457 1.0075 1.0037
No log 4.5909 202 0.9349 0.3567 0.9349 0.9669
No log 4.6364 204 0.8157 0.3867 0.8157 0.9032
No log 4.6818 206 0.6546 0.4489 0.6546 0.8090
No log 4.7273 208 0.6256 0.4249 0.6256 0.7909
No log 4.7727 210 0.6803 0.3230 0.6803 0.8248
No log 4.8182 212 0.6463 0.3407 0.6463 0.8039
No log 4.8636 214 0.5931 0.3688 0.5931 0.7701
No log 4.9091 216 0.5966 0.3974 0.5966 0.7724
No log 4.9545 218 0.6472 0.3936 0.6472 0.8045
No log 5.0 220 0.6551 0.4548 0.6551 0.8094
No log 5.0455 222 0.6605 0.5233 0.6605 0.8127
No log 5.0909 224 0.6475 0.5809 0.6475 0.8047
No log 5.1364 226 0.6709 0.5524 0.6709 0.8191
No log 5.1818 228 0.7837 0.4246 0.7837 0.8852
No log 5.2273 230 0.9144 0.3532 0.9144 0.9562
No log 5.2727 232 0.9463 0.3352 0.9463 0.9728
No log 5.3182 234 0.8471 0.3596 0.8471 0.9204
No log 5.3636 236 0.7058 0.5569 0.7058 0.8401
No log 5.4091 238 0.7457 0.4690 0.7457 0.8636
No log 5.4545 240 0.7626 0.4269 0.7626 0.8733
No log 5.5 242 0.7182 0.4355 0.7182 0.8475
No log 5.5455 244 0.6473 0.4589 0.6473 0.8046
No log 5.5909 246 0.5824 0.4425 0.5824 0.7631
No log 5.6364 248 0.5878 0.3970 0.5878 0.7667
No log 5.6818 250 0.5774 0.4471 0.5774 0.7599
No log 5.7273 252 0.6246 0.4589 0.6246 0.7903
No log 5.7727 254 0.6781 0.4926 0.6781 0.8235
No log 5.8182 256 0.6721 0.5061 0.6721 0.8198
No log 5.8636 258 0.6356 0.5378 0.6356 0.7973
No log 5.9091 260 0.5494 0.5970 0.5494 0.7412
No log 5.9545 262 0.5511 0.5970 0.5511 0.7423
No log 6.0 264 0.6162 0.4952 0.6162 0.7850
No log 6.0455 266 0.7660 0.4033 0.7660 0.8752
No log 6.0909 268 0.7409 0.4504 0.7409 0.8607
No log 6.1364 270 0.5719 0.5736 0.5719 0.7562
No log 6.1818 272 0.5267 0.5656 0.5267 0.7257
No log 6.2273 274 0.5501 0.6183 0.5501 0.7417
No log 6.2727 276 0.5926 0.5445 0.5926 0.7698
No log 6.3182 278 0.5826 0.5692 0.5826 0.7633
No log 6.3636 280 0.5593 0.5985 0.5593 0.7478
No log 6.4091 282 0.5736 0.5438 0.5736 0.7574
No log 6.4545 284 0.6148 0.5474 0.6148 0.7841
No log 6.5 286 0.6089 0.5543 0.6089 0.7803
No log 6.5455 288 0.5456 0.5432 0.5456 0.7386
No log 6.5909 290 0.5286 0.4949 0.5286 0.7271
No log 6.6364 292 0.5321 0.5915 0.5321 0.7295
No log 6.6818 294 0.5992 0.5650 0.5992 0.7741
No log 6.7273 296 0.6591 0.5073 0.6591 0.8119
No log 6.7727 298 0.6411 0.5387 0.6411 0.8007
No log 6.8182 300 0.5932 0.5436 0.5932 0.7702
No log 6.8636 302 0.5746 0.5358 0.5746 0.7580
No log 6.9091 304 0.6371 0.4948 0.6371 0.7982
No log 6.9545 306 0.7069 0.4784 0.7069 0.8408
No log 7.0 308 0.6625 0.4948 0.6625 0.8139
No log 7.0455 310 0.5736 0.5589 0.5736 0.7574
No log 7.0909 312 0.5286 0.5430 0.5286 0.7270
No log 7.1364 314 0.5220 0.5890 0.5220 0.7225
No log 7.1818 316 0.5372 0.5702 0.5372 0.7330
No log 7.2273 318 0.5493 0.5718 0.5493 0.7411
No log 7.2727 320 0.5274 0.5266 0.5274 0.7262
No log 7.3182 322 0.5347 0.5283 0.5347 0.7312
No log 7.3636 324 0.5385 0.4958 0.5385 0.7338
No log 7.4091 326 0.5348 0.5321 0.5348 0.7313
No log 7.4545 328 0.5323 0.5687 0.5323 0.7296
No log 7.5 330 0.5354 0.5687 0.5354 0.7317
No log 7.5455 332 0.5418 0.5687 0.5418 0.7361
No log 7.5909 334 0.5535 0.4091 0.5535 0.7440
No log 7.6364 336 0.6283 0.4576 0.6283 0.7927
No log 7.6818 338 0.6307 0.4576 0.6307 0.7942
No log 7.7273 340 0.5502 0.3919 0.5502 0.7418
No log 7.7727 342 0.5758 0.4139 0.5758 0.7588
No log 7.8182 344 0.6581 0.3573 0.6581 0.8112
No log 7.8636 346 0.6683 0.3573 0.6683 0.8175
No log 7.9091 348 0.6004 0.3616 0.6004 0.7748
No log 7.9545 350 0.5198 0.5574 0.5198 0.7209
No log 8.0 352 0.5091 0.5286 0.5091 0.7135
No log 8.0455 354 0.5042 0.5475 0.5042 0.7101
No log 8.0909 356 0.5675 0.5718 0.5675 0.7533
No log 8.1364 358 0.6121 0.5178 0.6121 0.7824
No log 8.1818 360 0.6253 0.4825 0.6253 0.7907
No log 8.2273 362 0.5507 0.5731 0.5507 0.7421
No log 8.2727 364 0.5136 0.5930 0.5136 0.7167
No log 8.3182 366 0.5210 0.5846 0.5210 0.7218
No log 8.3636 368 0.5876 0.5152 0.5876 0.7666
No log 8.4091 370 0.7400 0.4615 0.7400 0.8602
No log 8.4545 372 0.7526 0.4297 0.7526 0.8675
No log 8.5 374 0.6356 0.5363 0.6356 0.7972
No log 8.5455 376 0.5599 0.5846 0.5599 0.7483
No log 8.5909 378 0.5564 0.5930 0.5564 0.7459
No log 8.6364 380 0.5594 0.5765 0.5594 0.7479
No log 8.6818 382 0.5958 0.4850 0.5958 0.7719
No log 8.7273 384 0.6271 0.4575 0.6271 0.7919
No log 8.7727 386 0.5981 0.4918 0.5981 0.7734
No log 8.8182 388 0.5585 0.5319 0.5585 0.7473
No log 8.8636 390 0.5599 0.5993 0.5599 0.7483
No log 8.9091 392 0.5715 0.5679 0.5715 0.7560
No log 8.9545 394 0.5458 0.6185 0.5458 0.7388
No log 9.0 396 0.5322 0.5319 0.5322 0.7295
No log 9.0455 398 0.5953 0.5030 0.5953 0.7716
No log 9.0909 400 0.7279 0.4161 0.7279 0.8532
No log 9.1364 402 0.7800 0.3760 0.7800 0.8832
No log 9.1818 404 0.7368 0.3085 0.7368 0.8584
No log 9.2273 406 0.6375 0.3620 0.6375 0.7984
No log 9.2727 408 0.5683 0.3093 0.5683 0.7539
No log 9.3182 410 0.5516 0.4561 0.5516 0.7427
No log 9.3636 412 0.5423 0.4847 0.5423 0.7364
No log 9.4091 414 0.5457 0.4538 0.5457 0.7387
No log 9.4545 416 0.5995 0.4694 0.5995 0.7743
No log 9.5 418 0.6254 0.4633 0.6254 0.7908
No log 9.5455 420 0.5780 0.5250 0.5780 0.7603
No log 9.5909 422 0.5698 0.5860 0.5698 0.7548
No log 9.6364 424 0.6192 0.4587 0.6192 0.7869
No log 9.6818 426 0.6179 0.4371 0.6179 0.7860
No log 9.7273 428 0.5558 0.5440 0.5558 0.7455
No log 9.7727 430 0.5442 0.5617 0.5442 0.7377
No log 9.8182 432 0.5865 0.5230 0.5865 0.7658
No log 9.8636 434 0.6123 0.4652 0.6123 0.7825
No log 9.9091 436 0.5924 0.4652 0.5924 0.7697
No log 9.9545 438 0.5664 0.5144 0.5664 0.7526
No log 10.0 440 0.5577 0.5212 0.5577 0.7468
No log 10.0455 442 0.5693 0.5300 0.5693 0.7545
No log 10.0909 444 0.5605 0.5300 0.5605 0.7487
No log 10.1364 446 0.5353 0.5702 0.5353 0.7316
No log 10.1818 448 0.5129 0.5781 0.5129 0.7162
No log 10.2273 450 0.5004 0.5750 0.5004 0.7074
No log 10.2727 452 0.4951 0.5750 0.4951 0.7036
No log 10.3182 454 0.4963 0.5750 0.4963 0.7045
No log 10.3636 456 0.5402 0.5438 0.5402 0.7350
No log 10.4091 458 0.5731 0.5101 0.5731 0.7570
No log 10.4545 460 0.5519 0.5317 0.5519 0.7429
No log 10.5 462 0.5298 0.5373 0.5298 0.7279
No log 10.5455 464 0.5188 0.5397 0.5188 0.7203
No log 10.5909 466 0.5111 0.5457 0.5111 0.7149
No log 10.6364 468 0.5081 0.5440 0.5081 0.7128
No log 10.6818 470 0.5203 0.5343 0.5203 0.7213
No log 10.7273 472 0.5496 0.5320 0.5496 0.7414
No log 10.7727 474 0.5399 0.5398 0.5399 0.7347
No log 10.8182 476 0.5103 0.5065 0.5103 0.7144
No log 10.8636 478 0.5073 0.4768 0.5073 0.7122
No log 10.9091 480 0.5075 0.4768 0.5075 0.7124
No log 10.9545 482 0.5067 0.5022 0.5067 0.7118
No log 11.0 484 0.5078 0.6183 0.5078 0.7126
No log 11.0455 486 0.5279 0.5943 0.5279 0.7266
No log 11.0909 488 0.5609 0.5014 0.5609 0.7489
No log 11.1364 490 0.5743 0.5014 0.5743 0.7578
No log 11.1818 492 0.5594 0.5751 0.5594 0.7479
No log 11.2273 494 0.5376 0.5750 0.5376 0.7332
No log 11.2727 496 0.5338 0.4768 0.5338 0.7306
No log 11.3182 498 0.5258 0.5326 0.5258 0.7251
0.3341 11.3636 500 0.5254 0.5022 0.5254 0.7248
0.3341 11.4091 502 0.5339 0.5430 0.5339 0.7307
0.3341 11.4545 504 0.5455 0.5750 0.5455 0.7386
0.3341 11.5 506 0.5462 0.5970 0.5462 0.7391
0.3341 11.5455 508 0.5342 0.5357 0.5342 0.7309
0.3341 11.5909 510 0.5334 0.5357 0.5334 0.7303
0.3341 11.6364 512 0.5309 0.4953 0.5309 0.7286
0.3341 11.6818 514 0.5332 0.5022 0.5332 0.7302
0.3341 11.7273 516 0.5312 0.5022 0.5312 0.7288
0.3341 11.7727 518 0.5379 0.4768 0.5379 0.7334
0.3341 11.8182 520 0.5298 0.5022 0.5298 0.7279

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k13_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02