ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k9_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (dataset details are not provided in this card). It achieves the following results on the evaluation set:

  • Loss: 0.7808
  • Qwk (quadratic weighted kappa): 0.5610
  • Mse: 0.7808
  • Rmse: 0.8836
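
The evaluation script is not included in this card; the sketch below shows how Qwk, Mse, and Rmse are conventionally computed from integer score predictions (the example labels are hypothetical, for illustration only):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    """Compute the metrics reported above from integer score labels."""
    mse = mean_squared_error(y_true, y_pred)
    return {
        # Quadratic weighting penalizes large score disagreements more.
        "qwk": cohen_kappa_score(y_true, y_pred, weights="quadratic"),
        "mse": mse,
        "rmse": float(np.sqrt(mse)),
    }

# Hypothetical gold scores and predictions, not taken from this model.
metrics = eval_metrics([0, 1, 2, 3, 4], [0, 1, 2, 2, 4])
```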

Model description

More information needed
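
Although the card gives no description, the checkpoint can be loaded with the standard `transformers` auto classes. A minimal loading sketch, assuming a single-logit regression head (which the reported Mse/Rmse suggest, but which this card does not confirm):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k9_task2_organization"

def score_text(text: str, model_id: str = MODEL_ID) -> float:
    """Download the checkpoint and return its raw score for one text."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumes num_labels == 1; a multi-label head would need argmax instead.
    return logits.squeeze().item()
```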

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
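
These settings correspond to Adam with a linear learning-rate decay and no warmup. A torch-only sketch of the optimizer and schedule (the stand-in module and the 510-step total, read off the results table, are assumptions, not the actual training script):

```python
import torch

torch.manual_seed(42)  # seed: 42

# Stand-in module; the real run fine-tunes aubmindlab/bert-base-arabertv02.
model = torch.nn.Linear(768, 1)

# learning_rate, betas, and epsilon as listed above.
optimizer = torch.optim.Adam(
    model.parameters(), lr=2e-5, betas=(0.9, 0.999), eps=1e-8
)

# "linear" schedule: LR decays linearly from 2e-5 to 0 over all steps
# (510 optimization steps = 10 epochs x 51 steps, per the results table).
total_steps = 510
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda step: max(0.0, 1.0 - step / total_steps)
)
```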

Training results

Evaluation ran every 2 optimization steps; training loss is logged only every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0392 2 3.9224 0.0118 3.9224 1.9805
No log 0.0784 4 1.8528 0.0110 1.8528 1.3612
No log 0.1176 6 1.0945 0.0345 1.0945 1.0462
No log 0.1569 8 1.0320 -0.1331 1.0320 1.0159
No log 0.1961 10 0.8082 0.0959 0.8082 0.8990
No log 0.2353 12 0.6971 0.1687 0.6971 0.8349
No log 0.2745 14 0.7217 0.1546 0.7217 0.8495
No log 0.3137 16 0.7418 0.1442 0.7418 0.8613
No log 0.3529 18 0.7112 0.1442 0.7112 0.8433
No log 0.3922 20 0.7243 0.1063 0.7243 0.8510
No log 0.4314 22 0.6721 0.1519 0.6721 0.8198
No log 0.4706 24 0.6862 0.2819 0.6862 0.8284
No log 0.5098 26 0.6899 0.2769 0.6899 0.8306
No log 0.5490 28 0.7276 0.2172 0.7276 0.8530
No log 0.5882 30 0.6812 0.2844 0.6812 0.8253
No log 0.6275 32 0.6298 0.2736 0.6298 0.7936
No log 0.6667 34 0.6353 0.2917 0.6353 0.7971
No log 0.7059 36 0.6113 0.2061 0.6113 0.7819
No log 0.7451 38 0.6471 0.1916 0.6471 0.8044
No log 0.7843 40 0.6222 0.2187 0.6222 0.7888
No log 0.8235 42 0.5712 0.3720 0.5712 0.7557
No log 0.8627 44 0.5698 0.4017 0.5698 0.7548
No log 0.9020 46 0.5751 0.3797 0.5751 0.7583
No log 0.9412 48 0.5798 0.3963 0.5798 0.7614
No log 0.9804 50 0.5927 0.4480 0.5927 0.7699
No log 1.0196 52 0.5791 0.5126 0.5791 0.7610
No log 1.0588 54 0.6074 0.4965 0.6074 0.7794
No log 1.0980 56 0.5954 0.5599 0.5954 0.7716
No log 1.1373 58 0.5896 0.5893 0.5896 0.7679
No log 1.1765 60 0.5690 0.5747 0.5690 0.7543
No log 1.2157 62 0.5699 0.5317 0.5699 0.7549
No log 1.2549 64 0.6036 0.5496 0.6036 0.7769
No log 1.2941 66 0.6501 0.5417 0.6501 0.8063
No log 1.3333 68 0.8347 0.4263 0.8347 0.9136
No log 1.3725 70 0.8790 0.4263 0.8790 0.9375
No log 1.4118 72 0.6891 0.5279 0.6891 0.8301
No log 1.4510 74 0.6067 0.4550 0.6067 0.7789
No log 1.4902 76 0.6224 0.4465 0.6224 0.7889
No log 1.5294 78 0.6810 0.5372 0.6810 0.8252
No log 1.5686 80 0.6555 0.5463 0.6555 0.8096
No log 1.6078 82 0.6035 0.5825 0.6035 0.7769
No log 1.6471 84 0.6548 0.5487 0.6548 0.8092
No log 1.6863 86 0.6936 0.5164 0.6936 0.8328
No log 1.7255 88 0.7113 0.5333 0.7113 0.8434
No log 1.7647 90 0.6886 0.5220 0.6886 0.8298
No log 1.8039 92 0.7518 0.5392 0.7518 0.8670
No log 1.8431 94 0.8957 0.4893 0.8957 0.9464
No log 1.8824 96 0.9263 0.4703 0.9263 0.9625
No log 1.9216 98 0.8958 0.5074 0.8958 0.9465
No log 1.9608 100 0.8217 0.4877 0.8217 0.9065
No log 2.0 102 0.8092 0.5057 0.8092 0.8995
No log 2.0392 104 0.8030 0.5112 0.8030 0.8961
No log 2.0784 106 0.7796 0.5473 0.7796 0.8830
No log 2.1176 108 0.8669 0.4661 0.8669 0.9311
No log 2.1569 110 0.8875 0.4373 0.8875 0.9421
No log 2.1961 112 0.8242 0.4956 0.8242 0.9079
No log 2.2353 114 0.7421 0.4994 0.7421 0.8614
No log 2.2745 116 0.7375 0.4783 0.7375 0.8588
No log 2.3137 118 0.8004 0.4803 0.8004 0.8947
No log 2.3529 120 1.0073 0.4297 1.0073 1.0037
No log 2.3922 122 1.1968 0.3638 1.1968 1.0940
No log 2.4314 124 1.2199 0.3788 1.2199 1.1045
No log 2.4706 126 0.9808 0.4139 0.9808 0.9903
No log 2.5098 128 0.9424 0.4588 0.9424 0.9708
No log 2.5490 130 0.8569 0.5000 0.8569 0.9257
No log 2.5882 132 0.8433 0.4905 0.8433 0.9183
No log 2.6275 134 0.8859 0.4965 0.8859 0.9412
No log 2.6667 136 1.0319 0.3992 1.0319 1.0158
No log 2.7059 138 1.1034 0.4192 1.1034 1.0504
No log 2.7451 140 1.0218 0.4326 1.0218 1.0108
No log 2.7843 142 0.8553 0.4995 0.8553 0.9248
No log 2.8235 144 0.7877 0.5110 0.7877 0.8875
No log 2.8627 146 0.7771 0.5174 0.7771 0.8815
No log 2.9020 148 0.8267 0.4945 0.8267 0.9092
No log 2.9412 150 1.0159 0.4567 1.0159 1.0079
No log 2.9804 152 1.1785 0.4557 1.1785 1.0856
No log 3.0196 154 1.2135 0.4357 1.2135 1.1016
No log 3.0588 156 1.1054 0.4600 1.1054 1.0514
No log 3.0980 158 1.0853 0.4606 1.0853 1.0418
No log 3.1373 160 1.0354 0.4658 1.0354 1.0176
No log 3.1765 162 0.8970 0.5055 0.8970 0.9471
No log 3.2157 164 0.9070 0.5455 0.9070 0.9523
No log 3.2549 166 0.9465 0.4760 0.9465 0.9729
No log 3.2941 168 1.0594 0.4823 1.0594 1.0293
No log 3.3333 170 1.1758 0.4837 1.1758 1.0844
No log 3.3725 172 1.0939 0.4754 1.0939 1.0459
No log 3.4118 174 0.9155 0.4699 0.9155 0.9568
No log 3.4510 176 0.8445 0.5428 0.8445 0.9190
No log 3.4902 178 0.8572 0.5184 0.8572 0.9258
No log 3.5294 180 0.8830 0.5025 0.8830 0.9397
No log 3.5686 182 0.8203 0.5197 0.8203 0.9057
No log 3.6078 184 0.7199 0.5470 0.7199 0.8485
No log 3.6471 186 0.6967 0.5638 0.6967 0.8347
No log 3.6863 188 0.6992 0.5637 0.6992 0.8362
No log 3.7255 190 0.7196 0.5650 0.7196 0.8483
No log 3.7647 192 0.7607 0.5453 0.7607 0.8722
No log 3.8039 194 0.8900 0.5225 0.8900 0.9434
No log 3.8431 196 1.0379 0.4773 1.0379 1.0188
No log 3.8824 198 1.1053 0.4671 1.1053 1.0513
No log 3.9216 200 1.2521 0.4766 1.2521 1.1190
No log 3.9608 202 1.2307 0.4709 1.2307 1.1094
No log 4.0 204 1.1470 0.4685 1.1470 1.0710
No log 4.0392 206 1.1631 0.4600 1.1631 1.0785
No log 4.0784 208 1.2462 0.4672 1.2462 1.1163
No log 4.1176 210 1.2161 0.4672 1.2161 1.1028
No log 4.1569 212 1.1318 0.4943 1.1318 1.0639
No log 4.1961 214 0.9474 0.4685 0.9474 0.9733
No log 4.2353 216 0.8542 0.5239 0.8542 0.9242
No log 4.2745 218 0.8438 0.5235 0.8438 0.9186
No log 4.3137 220 0.7958 0.5231 0.7958 0.8921
No log 4.3529 222 0.8266 0.5217 0.8266 0.9092
No log 4.3922 224 0.9384 0.4636 0.9384 0.9687
No log 4.4314 226 0.9236 0.4359 0.9236 0.9610
No log 4.4706 228 0.8514 0.4239 0.8514 0.9227
No log 4.5098 230 0.8152 0.4740 0.8152 0.9029
No log 4.5490 232 0.8277 0.4548 0.8277 0.9098
No log 4.5882 234 0.8650 0.4727 0.8650 0.9300
No log 4.6275 236 0.8463 0.5021 0.8463 0.9200
No log 4.6667 238 0.8042 0.5245 0.8042 0.8968
No log 4.7059 240 0.7945 0.5241 0.7945 0.8913
No log 4.7451 242 0.7605 0.5179 0.7605 0.8721
No log 4.7843 244 0.7085 0.5053 0.7085 0.8417
No log 4.8235 246 0.6991 0.4979 0.6991 0.8361
No log 4.8627 248 0.7247 0.5136 0.7247 0.8513
No log 4.9020 250 0.7501 0.4965 0.7501 0.8661
No log 4.9412 252 0.7835 0.4668 0.7835 0.8852
No log 4.9804 254 0.7970 0.5051 0.7970 0.8927
No log 5.0196 256 0.8089 0.5147 0.8089 0.8994
No log 5.0588 258 0.8148 0.5082 0.8148 0.9027
No log 5.0980 260 0.8142 0.4993 0.8142 0.9023
No log 5.1373 262 0.7984 0.5358 0.7984 0.8935
No log 5.1765 264 0.8140 0.5361 0.8140 0.9022
No log 5.2157 266 0.8129 0.5389 0.8129 0.9016
No log 5.2549 268 0.8120 0.5168 0.8120 0.9011
No log 5.2941 270 0.8349 0.5133 0.8349 0.9137
No log 5.3333 272 0.8874 0.5053 0.8874 0.9420
No log 5.3725 274 0.9353 0.5061 0.9353 0.9671
No log 5.4118 276 0.9361 0.5038 0.9361 0.9675
No log 5.4510 278 0.8981 0.5125 0.8981 0.9477
No log 5.4902 280 0.8392 0.5208 0.8392 0.9161
No log 5.5294 282 0.7760 0.5512 0.7760 0.8809
No log 5.5686 284 0.7563 0.5663 0.7563 0.8696
No log 5.6078 286 0.7675 0.5356 0.7675 0.8761
No log 5.6471 288 0.8124 0.5286 0.8124 0.9014
No log 5.6863 290 0.8113 0.5405 0.8113 0.9007
No log 5.7255 292 0.7860 0.5523 0.7860 0.8865
No log 5.7647 294 0.7394 0.5373 0.7394 0.8599
No log 5.8039 296 0.7211 0.5706 0.7211 0.8492
No log 5.8431 298 0.7075 0.5567 0.7075 0.8411
No log 5.8824 300 0.7028 0.5311 0.7028 0.8384
No log 5.9216 302 0.7212 0.5872 0.7212 0.8492
No log 5.9608 304 0.7914 0.5115 0.7914 0.8896
No log 6.0 306 0.9307 0.4280 0.9307 0.9647
No log 6.0392 308 1.0129 0.4247 1.0129 1.0064
No log 6.0784 310 0.9529 0.4296 0.9529 0.9762
No log 6.1176 312 0.8220 0.4393 0.8220 0.9066
No log 6.1569 314 0.7037 0.5220 0.7037 0.8389
No log 6.1961 316 0.6758 0.5406 0.6758 0.8221
No log 6.2353 318 0.6739 0.5472 0.6739 0.8209
No log 6.2745 320 0.6973 0.5800 0.6973 0.8350
No log 6.3137 322 0.7692 0.5336 0.7692 0.8771
No log 6.3529 324 0.8105 0.5418 0.8105 0.9003
No log 6.3922 326 0.8381 0.5206 0.8381 0.9155
No log 6.4314 328 0.8233 0.5287 0.8233 0.9074
No log 6.4706 330 0.8173 0.5248 0.8173 0.9041
No log 6.5098 332 0.7951 0.5359 0.7951 0.8917
No log 6.5490 334 0.7806 0.5588 0.7806 0.8835
No log 6.5882 336 0.7976 0.5089 0.7976 0.8931
No log 6.6275 338 0.8523 0.5089 0.8523 0.9232
No log 6.6667 340 0.8979 0.4993 0.8979 0.9476
No log 6.7059 342 0.8675 0.5068 0.8675 0.9314
No log 6.7451 344 0.8218 0.5089 0.8218 0.9065
No log 6.7843 346 0.7731 0.5439 0.7731 0.8792
No log 6.8235 348 0.7531 0.5235 0.7531 0.8678
No log 6.8627 350 0.7369 0.5265 0.7369 0.8584
No log 6.9020 352 0.7398 0.5181 0.7398 0.8601
No log 6.9412 354 0.7494 0.5185 0.7494 0.8657
No log 6.9804 356 0.7850 0.5140 0.7850 0.8860
No log 7.0196 358 0.8126 0.4809 0.8126 0.9015
No log 7.0588 360 0.8178 0.4815 0.8178 0.9043
No log 7.0980 362 0.8184 0.4815 0.8184 0.9047
No log 7.1373 364 0.8077 0.4817 0.8077 0.8987
No log 7.1765 366 0.7922 0.5145 0.7922 0.8901
No log 7.2157 368 0.7951 0.5201 0.7951 0.8917
No log 7.2549 370 0.8176 0.4968 0.8176 0.9042
No log 7.2941 372 0.8318 0.5059 0.8318 0.9121
No log 7.3333 374 0.8443 0.4873 0.8443 0.9188
No log 7.3725 376 0.8397 0.4873 0.8397 0.9164
No log 7.4118 378 0.8305 0.4879 0.8305 0.9113
No log 7.4510 380 0.8037 0.5391 0.8037 0.8965
No log 7.4902 382 0.7506 0.5658 0.7506 0.8664
No log 7.5294 384 0.7152 0.5159 0.7152 0.8457
No log 7.5686 386 0.7121 0.5159 0.7121 0.8438
No log 7.6078 388 0.7300 0.5380 0.7300 0.8544
No log 7.6471 390 0.7916 0.5494 0.7916 0.8897
No log 7.6863 392 0.8561 0.4835 0.8561 0.9252
No log 7.7255 394 0.8862 0.4819 0.8862 0.9414
No log 7.7647 396 0.8794 0.4877 0.8794 0.9378
No log 7.8039 398 0.8536 0.4952 0.8536 0.9239
No log 7.8431 400 0.8074 0.5472 0.8074 0.8986
No log 7.8824 402 0.7858 0.5420 0.7858 0.8865
No log 7.9216 404 0.7952 0.5497 0.7952 0.8918
No log 7.9608 406 0.8022 0.5376 0.8022 0.8957
No log 8.0 408 0.8165 0.5424 0.8165 0.9036
No log 8.0392 410 0.8106 0.5346 0.8106 0.9003
No log 8.0784 412 0.8166 0.5113 0.8166 0.9037
No log 8.1176 414 0.8417 0.5021 0.8417 0.9175
No log 8.1569 416 0.8304 0.5036 0.8304 0.9113
No log 8.1961 418 0.8121 0.5130 0.8121 0.9012
No log 8.2353 420 0.7944 0.5328 0.7944 0.8913
No log 8.2745 422 0.7737 0.5414 0.7737 0.8796
No log 8.3137 424 0.7527 0.5534 0.7527 0.8676
No log 8.3529 426 0.7415 0.5548 0.7415 0.8611
No log 8.3922 428 0.7270 0.5437 0.7270 0.8526
No log 8.4314 430 0.7259 0.5437 0.7259 0.8520
No log 8.4706 432 0.7262 0.5437 0.7262 0.8522
No log 8.5098 434 0.7179 0.5591 0.7179 0.8473
No log 8.5490 436 0.7210 0.5410 0.7210 0.8491
No log 8.5882 438 0.7399 0.5481 0.7399 0.8602
No log 8.6275 440 0.7739 0.5321 0.7739 0.8797
No log 8.6667 442 0.7967 0.5304 0.7967 0.8926
No log 8.7059 444 0.8079 0.5396 0.8079 0.8989
No log 8.7451 446 0.8117 0.5396 0.8117 0.9009
No log 8.7843 448 0.8159 0.5396 0.8159 0.9033
No log 8.8235 450 0.8308 0.5379 0.8308 0.9115
No log 8.8627 452 0.8589 0.5106 0.8589 0.9268
No log 8.9020 454 0.8729 0.4828 0.8729 0.9343
No log 8.9412 456 0.8727 0.4828 0.8727 0.9342
No log 8.9804 458 0.8547 0.4828 0.8547 0.9245
No log 9.0196 460 0.8302 0.5329 0.8302 0.9112
No log 9.0588 462 0.8030 0.5333 0.8030 0.8961
No log 9.0980 464 0.7785 0.5546 0.7785 0.8823
No log 9.1373 466 0.7647 0.5351 0.7647 0.8745
No log 9.1765 468 0.7574 0.5306 0.7574 0.8703
No log 9.2157 470 0.7487 0.5385 0.7487 0.8653
No log 9.2549 472 0.7463 0.5340 0.7463 0.8639
No log 9.2941 474 0.7511 0.5306 0.7511 0.8667
No log 9.3333 476 0.7565 0.5238 0.7565 0.8697
No log 9.3725 478 0.7654 0.5442 0.7654 0.8749
No log 9.4118 480 0.7745 0.5442 0.7745 0.8801
No log 9.4510 482 0.7851 0.5532 0.7851 0.8861
No log 9.4902 484 0.7937 0.5333 0.7937 0.8909
No log 9.5294 486 0.7990 0.5371 0.7990 0.8938
No log 9.5686 488 0.7976 0.5371 0.7976 0.8931
No log 9.6078 490 0.7919 0.5457 0.7919 0.8899
No log 9.6471 492 0.7886 0.5532 0.7886 0.8880
No log 9.6863 494 0.7859 0.5532 0.7859 0.8865
No log 9.7255 496 0.7859 0.5532 0.7859 0.8865
No log 9.7647 498 0.7859 0.5532 0.7859 0.8865
0.4377 9.8039 500 0.7850 0.5597 0.7850 0.8860
0.4377 9.8431 502 0.7829 0.5597 0.7829 0.8848
0.4377 9.8824 504 0.7815 0.5661 0.7815 0.8840
0.4377 9.9216 506 0.7811 0.5661 0.7811 0.8838
0.4377 9.9608 508 0.7811 0.5610 0.7811 0.8838
0.4377 10.0 510 0.7808 0.5610 0.7808 0.8836

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Full model ID: MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k9_task2_organization
  • Model size: 0.1B params
  • Tensor type: F32 (Safetensors)
  • Fine-tuned from: aubmindlab/bert-base-arabertv02