ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k11_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7467
  • Qwk (Quadratic Weighted Kappa): 0.5344
  • Mse: 0.7467
  • Rmse: 0.8641
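
These metrics can be reproduced with scikit-learn and NumPy. The sketch below uses placeholder label arrays (`y_true`, `y_pred`), not the actual evaluation data:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Placeholder ordinal labels for illustration -- not the actual evaluation data.
y_true = np.array([0, 1, 2, 2, 3, 4])
y_pred = np.array([0, 1, 1, 2, 3, 3])

# Quadratic Weighted Kappa (the "Qwk" column)
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE and its square root (the "Mse" and "Rmse" columns)
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
```

Note that Rmse is simply the square root of Mse throughout the results table.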

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
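
With the linear scheduler and no warmup (the transformers default when warmup is unset), the learning rate decays linearly from 2e-05 to 0 over the 620 optimization steps recorded in the table below. A minimal sketch of that schedule shape, not the actual training code:

```python
def linear_lr(step, base_lr=2e-05, total_steps=620, warmup_steps=0):
    """Transformers-style linear schedule: linear warmup, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Learning rate at the start, midpoint, and end of training
lrs = [linear_lr(s) for s in (0, 310, 620)]
```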

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0323 2 3.8919 -0.0078 3.8919 1.9728
No log 0.0645 4 1.8864 0.0934 1.8864 1.3735
No log 0.0968 6 1.0609 0.0424 1.0609 1.0300
No log 0.1290 8 0.8590 0.0231 0.8590 0.9268
No log 0.1613 10 1.0649 -0.0083 1.0649 1.0320
No log 0.1935 12 0.9097 0.1245 0.9097 0.9538
No log 0.2258 14 0.9166 0.1317 0.9166 0.9574
No log 0.2581 16 0.8468 0.1771 0.8468 0.9202
No log 0.2903 18 0.9637 0.1208 0.9637 0.9817
No log 0.3226 20 1.1228 0.2395 1.1228 1.0596
No log 0.3548 22 0.7706 0.2656 0.7706 0.8778
No log 0.3871 24 0.6853 0.3517 0.6853 0.8278
No log 0.4194 26 0.6947 0.3544 0.6947 0.8335
No log 0.4516 28 0.6540 0.3667 0.6540 0.8087
No log 0.4839 30 0.5989 0.3241 0.5989 0.7739
No log 0.5161 32 0.7971 0.2919 0.7971 0.8928
No log 0.5484 34 0.8876 0.1924 0.8876 0.9421
No log 0.5806 36 0.6806 0.3048 0.6806 0.8250
No log 0.6129 38 0.5760 0.4245 0.5760 0.7590
No log 0.6452 40 0.7297 0.3690 0.7297 0.8542
No log 0.6774 42 0.7605 0.4037 0.7605 0.8720
No log 0.7097 44 0.6425 0.4246 0.6425 0.8016
No log 0.7419 46 0.6061 0.3831 0.6061 0.7785
No log 0.7742 48 0.6417 0.4087 0.6417 0.8010
No log 0.8065 50 0.7108 0.4060 0.7108 0.8431
No log 0.8387 52 0.7481 0.4088 0.7481 0.8649
No log 0.8710 54 0.6734 0.4847 0.6734 0.8206
No log 0.9032 56 0.6463 0.5351 0.6463 0.8040
No log 0.9355 58 0.7559 0.4956 0.7559 0.8694
No log 0.9677 60 0.7451 0.5417 0.7451 0.8632
No log 1.0 62 0.6228 0.4723 0.6228 0.7892
No log 1.0323 64 0.5917 0.4946 0.5917 0.7692
No log 1.0645 66 0.5973 0.4909 0.5973 0.7728
No log 1.0968 68 0.6866 0.5434 0.6866 0.8286
No log 1.1290 70 0.9867 0.4700 0.9867 0.9933
No log 1.1613 72 1.0941 0.4308 1.0941 1.0460
No log 1.1935 74 0.8806 0.4961 0.8806 0.9384
No log 1.2258 76 0.6552 0.5009 0.6552 0.8094
No log 1.2581 78 0.6475 0.5097 0.6475 0.8047
No log 1.2903 80 0.7787 0.5517 0.7787 0.8824
No log 1.3226 82 0.9123 0.4872 0.9123 0.9552
No log 1.3548 84 1.0899 0.4306 1.0899 1.0440
No log 1.3871 86 0.9183 0.4878 0.9183 0.9583
No log 1.4194 88 0.6690 0.6047 0.6690 0.8179
No log 1.4516 90 0.6800 0.5668 0.6800 0.8246
No log 1.4839 92 0.6482 0.5465 0.6482 0.8051
No log 1.5161 94 0.7494 0.5558 0.7494 0.8657
No log 1.5484 96 0.7873 0.5529 0.7873 0.8873
No log 1.5806 98 1.0576 0.4620 1.0576 1.0284
No log 1.6129 100 1.1500 0.4341 1.1500 1.0724
No log 1.6452 102 0.9781 0.5342 0.9781 0.9890
No log 1.6774 104 0.6616 0.5724 0.6616 0.8134
No log 1.7097 106 0.7048 0.5041 0.7048 0.8395
No log 1.7419 108 0.5994 0.5228 0.5994 0.7742
No log 1.7742 110 0.5772 0.5833 0.5772 0.7598
No log 1.8065 112 0.9010 0.4546 0.9010 0.9492
No log 1.8387 114 0.9083 0.4399 0.9083 0.9530
No log 1.8710 116 0.6567 0.5250 0.6567 0.8104
No log 1.9032 118 0.4979 0.4924 0.4979 0.7056
No log 1.9355 120 0.5917 0.4498 0.5917 0.7692
No log 1.9677 122 0.6567 0.4196 0.6567 0.8104
No log 2.0 124 0.5930 0.4750 0.5930 0.7701
No log 2.0323 126 0.6872 0.5130 0.6872 0.8290
No log 2.0645 128 0.9774 0.5075 0.9774 0.9887
No log 2.0968 130 0.9859 0.4910 0.9859 0.9929
No log 2.1290 132 0.9797 0.5109 0.9797 0.9898
No log 2.1613 134 0.8621 0.5338 0.8621 0.9285
No log 2.1935 136 0.9533 0.4716 0.9533 0.9764
No log 2.2258 138 0.9438 0.4830 0.9438 0.9715
No log 2.2581 140 0.8474 0.5041 0.8474 0.9206
No log 2.2903 142 0.8078 0.5174 0.8078 0.8988
No log 2.3226 144 0.9897 0.4747 0.9897 0.9948
No log 2.3548 146 1.0384 0.4521 1.0384 1.0190
No log 2.3871 148 0.8664 0.5105 0.8664 0.9308
No log 2.4194 150 0.6845 0.4553 0.6845 0.8273
No log 2.4516 152 0.6278 0.4495 0.6278 0.7923
No log 2.4839 154 0.6523 0.4894 0.6523 0.8076
No log 2.5161 156 0.8259 0.4629 0.8259 0.9088
No log 2.5484 158 0.8951 0.4317 0.8951 0.9461
No log 2.5806 160 0.8689 0.4447 0.8689 0.9321
No log 2.6129 162 0.7335 0.4991 0.7335 0.8564
No log 2.6452 164 0.6412 0.4988 0.6412 0.8007
No log 2.6774 166 0.6658 0.5250 0.6658 0.8160
No log 2.7097 168 0.7467 0.5 0.7467 0.8641
No log 2.7419 170 0.8126 0.5075 0.8126 0.9014
No log 2.7742 172 0.8783 0.5133 0.8783 0.9372
No log 2.8065 174 1.0818 0.4787 1.0818 1.0401
No log 2.8387 176 1.2149 0.4627 1.2149 1.1022
No log 2.8710 178 1.0101 0.4872 1.0101 1.0050
No log 2.9032 180 0.8026 0.5037 0.8026 0.8959
No log 2.9355 182 0.7967 0.5225 0.7967 0.8926
No log 2.9677 184 0.9351 0.4898 0.9351 0.9670
No log 3.0 186 1.0268 0.4555 1.0268 1.0133
No log 3.0323 188 0.9654 0.4881 0.9654 0.9826
No log 3.0645 190 0.8468 0.4819 0.8468 0.9202
No log 3.0968 192 0.8943 0.4967 0.8943 0.9457
No log 3.1290 194 1.0725 0.4569 1.0725 1.0356
No log 3.1613 196 1.1496 0.4698 1.1496 1.0722
No log 3.1935 198 0.9674 0.4578 0.9674 0.9836
No log 3.2258 200 0.9129 0.4906 0.9129 0.9555
No log 3.2581 202 0.9164 0.4898 0.9164 0.9573
No log 3.2903 204 1.0699 0.4744 1.0699 1.0343
No log 3.3226 206 1.2021 0.4628 1.2021 1.0964
No log 3.3548 208 1.1463 0.4570 1.1463 1.0706
No log 3.3871 210 0.8724 0.4640 0.8724 0.9340
No log 3.4194 212 0.6661 0.5447 0.6661 0.8161
No log 3.4516 214 0.6786 0.5481 0.6786 0.8237
No log 3.4839 216 0.6861 0.5798 0.6861 0.8283
No log 3.5161 218 0.8011 0.5048 0.8011 0.8950
No log 3.5484 220 0.9892 0.4642 0.9892 0.9946
No log 3.5806 222 1.2804 0.4571 1.2804 1.1315
No log 3.6129 224 1.4216 0.4578 1.4216 1.1923
No log 3.6452 226 1.3385 0.4705 1.3385 1.1569
No log 3.6774 228 1.1104 0.4760 1.1104 1.0538
No log 3.7097 230 0.9980 0.5037 0.9980 0.9990
No log 3.7419 232 1.0174 0.5147 1.0174 1.0087
No log 3.7742 234 1.1005 0.4966 1.1005 1.0491
No log 3.8065 236 1.1465 0.5027 1.1465 1.0707
No log 3.8387 238 1.1878 0.4907 1.1878 1.0898
No log 3.8710 240 1.1875 0.4826 1.1875 1.0897
No log 3.9032 242 1.0447 0.4823 1.0447 1.0221
No log 3.9355 244 0.8666 0.5622 0.8666 0.9309
No log 3.9677 246 0.8285 0.4934 0.8285 0.9102
No log 4.0 248 0.8262 0.4852 0.8262 0.9090
No log 4.0323 250 0.7932 0.5230 0.7932 0.8906
No log 4.0645 252 0.8113 0.5721 0.8113 0.9007
No log 4.0968 254 0.8627 0.4788 0.8627 0.9288
No log 4.1290 256 0.8327 0.5371 0.8327 0.9125
No log 4.1613 258 0.8523 0.5407 0.8523 0.9232
No log 4.1935 260 0.8685 0.5722 0.8685 0.9319
No log 4.2258 262 0.8819 0.5332 0.8819 0.9391
No log 4.2581 264 0.8910 0.4969 0.8910 0.9440
No log 4.2903 266 0.9206 0.5620 0.9206 0.9595
No log 4.3226 268 1.0250 0.5120 1.0250 1.0124
No log 4.3548 270 1.1017 0.4424 1.1017 1.0496
No log 4.3871 272 1.1647 0.4227 1.1647 1.0792
No log 4.4194 274 1.0472 0.4561 1.0472 1.0233
No log 4.4516 276 0.8535 0.5022 0.8535 0.9238
No log 4.4839 278 0.7381 0.5851 0.7381 0.8591
No log 4.5161 280 0.7314 0.6050 0.7314 0.8552
No log 4.5484 282 0.7646 0.6017 0.7646 0.8744
No log 4.5806 284 0.7875 0.6075 0.7875 0.8874
No log 4.6129 286 0.8142 0.5661 0.8142 0.9023
No log 4.6452 288 0.8577 0.5435 0.8577 0.9261
No log 4.6774 290 0.8661 0.5546 0.8661 0.9306
No log 4.7097 292 0.9229 0.5244 0.9229 0.9607
No log 4.7419 294 1.0206 0.4546 1.0206 1.0103
No log 4.7742 296 0.9562 0.5166 0.9562 0.9779
No log 4.8065 298 0.8746 0.5241 0.8746 0.9352
No log 4.8387 300 0.8637 0.5310 0.8637 0.9294
No log 4.8710 302 0.8672 0.5402 0.8672 0.9312
No log 4.9032 304 0.9143 0.5248 0.9143 0.9562
No log 4.9355 306 0.8899 0.5325 0.8899 0.9433
No log 4.9677 308 0.8680 0.5393 0.8680 0.9317
No log 5.0 310 0.8379 0.5110 0.8379 0.9154
No log 5.0323 312 0.8163 0.5453 0.8163 0.9035
No log 5.0645 314 0.8051 0.5410 0.8051 0.8973
No log 5.0968 316 0.8857 0.4972 0.8857 0.9411
No log 5.1290 318 1.0094 0.4671 1.0094 1.0047
No log 5.1613 320 1.0656 0.4382 1.0656 1.0323
No log 5.1935 322 1.1306 0.4238 1.1306 1.0633
No log 5.2258 324 1.1029 0.4352 1.1029 1.0502
No log 5.2581 326 1.0482 0.4481 1.0482 1.0238
No log 5.2903 328 0.9727 0.4965 0.9727 0.9863
No log 5.3226 330 0.8320 0.5003 0.8320 0.9121
No log 5.3548 332 0.7744 0.5112 0.7744 0.8800
No log 5.3871 334 0.7555 0.5244 0.7555 0.8692
No log 5.4194 336 0.7381 0.5589 0.7381 0.8591
No log 5.4516 338 0.7420 0.5488 0.7420 0.8614
No log 5.4839 340 0.7810 0.4917 0.7810 0.8837
No log 5.5161 342 0.8415 0.4849 0.8415 0.9173
No log 5.5484 344 0.8127 0.5008 0.8127 0.9015
No log 5.5806 346 0.8353 0.5051 0.8353 0.9140
No log 5.6129 348 0.8821 0.4971 0.8821 0.9392
No log 5.6452 350 0.9002 0.4853 0.9002 0.9488
No log 5.6774 352 0.9238 0.4733 0.9238 0.9611
No log 5.7097 354 0.9381 0.4733 0.9381 0.9685
No log 5.7419 356 0.9582 0.4678 0.9582 0.9789
No log 5.7742 358 0.9184 0.4971 0.9184 0.9583
No log 5.8065 360 0.8182 0.5276 0.8182 0.9046
No log 5.8387 362 0.7953 0.5758 0.7953 0.8918
No log 5.8710 364 0.8131 0.5562 0.8131 0.9017
No log 5.9032 366 0.8427 0.5487 0.8427 0.9180
No log 5.9355 368 0.8732 0.5547 0.8732 0.9345
No log 5.9677 370 0.9191 0.5105 0.9191 0.9587
No log 6.0 372 0.9099 0.5105 0.9099 0.9539
No log 6.0323 374 0.8425 0.5161 0.8425 0.9179
No log 6.0645 376 0.7691 0.5617 0.7691 0.8770
No log 6.0968 378 0.7401 0.5755 0.7401 0.8603
No log 6.1290 380 0.7239 0.5767 0.7239 0.8508
No log 6.1613 382 0.7458 0.5299 0.7458 0.8636
No log 6.1935 384 0.8043 0.4915 0.8043 0.8968
No log 6.2258 386 0.8146 0.5016 0.8146 0.9026
No log 6.2581 388 0.7959 0.5182 0.7959 0.8921
No log 6.2903 390 0.8057 0.5048 0.8057 0.8976
No log 6.3226 392 0.8185 0.5231 0.8185 0.9047
No log 6.3548 394 0.7960 0.5279 0.7960 0.8922
No log 6.3871 396 0.8225 0.5257 0.8225 0.9069
No log 6.4194 398 0.8564 0.5082 0.8564 0.9254
No log 6.4516 400 0.8834 0.5114 0.8834 0.9399
No log 6.4839 402 0.8975 0.4874 0.8975 0.9474
No log 6.5161 404 0.9071 0.4684 0.9071 0.9524
No log 6.5484 406 0.8901 0.4627 0.8901 0.9434
No log 6.5806 408 0.8780 0.4791 0.8780 0.9370
No log 6.6129 410 0.8097 0.5572 0.8097 0.8998
No log 6.6452 412 0.7507 0.5824 0.7507 0.8664
No log 6.6774 414 0.7380 0.5644 0.7380 0.8590
No log 6.7097 416 0.7605 0.5489 0.7605 0.8721
No log 6.7419 418 0.8575 0.4865 0.8575 0.9260
No log 6.7742 420 0.9693 0.4494 0.9693 0.9845
No log 6.8065 422 1.0342 0.4438 1.0342 1.0169
No log 6.8387 424 1.0161 0.4451 1.0161 1.0080
No log 6.8710 426 0.9874 0.4448 0.9874 0.9937
No log 6.9032 428 0.9087 0.4723 0.9087 0.9533
No log 6.9355 430 0.8134 0.5054 0.8134 0.9019
No log 6.9677 432 0.7728 0.5478 0.7728 0.8791
No log 7.0 434 0.7475 0.5519 0.7475 0.8646
No log 7.0323 436 0.7470 0.5519 0.7470 0.8643
No log 7.0645 438 0.7840 0.5295 0.7840 0.8854
No log 7.0968 440 0.8205 0.5071 0.8205 0.9058
No log 7.1290 442 0.8739 0.4588 0.8739 0.9348
No log 7.1613 444 0.8755 0.4788 0.8755 0.9357
No log 7.1935 446 0.8452 0.4934 0.8452 0.9193
No log 7.2258 448 0.8150 0.5080 0.8150 0.9028
No log 7.2581 450 0.7749 0.5380 0.7749 0.8803
No log 7.2903 452 0.7394 0.5461 0.7394 0.8599
No log 7.3226 454 0.7278 0.5560 0.7278 0.8531
No log 7.3548 456 0.7450 0.5403 0.7450 0.8631
No log 7.3871 458 0.7538 0.5515 0.7538 0.8682
No log 7.4194 460 0.7710 0.5025 0.7710 0.8780
No log 7.4516 462 0.7781 0.5025 0.7781 0.8821
No log 7.4839 464 0.7732 0.5319 0.7732 0.8793
No log 7.5161 466 0.7858 0.5307 0.7858 0.8865
No log 7.5484 468 0.7685 0.5378 0.7685 0.8767
No log 7.5806 470 0.7227 0.5506 0.7227 0.8501
No log 7.6129 472 0.7104 0.5729 0.7104 0.8429
No log 7.6452 474 0.7164 0.5919 0.7164 0.8464
No log 7.6774 476 0.7417 0.5763 0.7417 0.8612
No log 7.7097 478 0.7851 0.5534 0.7851 0.8860
No log 7.7419 480 0.8636 0.4981 0.8636 0.9293
No log 7.7742 482 0.9552 0.4821 0.9552 0.9774
No log 7.8065 484 1.0334 0.4469 1.0334 1.0166
No log 7.8387 486 1.0779 0.4645 1.0779 1.0382
No log 7.8710 488 1.0610 0.4645 1.0610 1.0301
No log 7.9032 490 1.0167 0.4595 1.0167 1.0083
No log 7.9355 492 0.9749 0.4446 0.9749 0.9874
No log 7.9677 494 0.9017 0.4688 0.9017 0.9496
No log 8.0 496 0.8212 0.5008 0.8212 0.9062
No log 8.0323 498 0.7554 0.5307 0.7554 0.8691
0.4559 8.0645 500 0.7365 0.5438 0.7365 0.8582
0.4559 8.0968 502 0.7429 0.5438 0.7429 0.8619
0.4559 8.1290 504 0.7733 0.5106 0.7733 0.8794
0.4559 8.1613 506 0.8212 0.4635 0.8212 0.9062
0.4559 8.1935 508 0.8955 0.4668 0.8955 0.9463
0.4559 8.2258 510 0.9390 0.4717 0.9390 0.9690
0.4559 8.2581 512 0.9517 0.4812 0.9517 0.9755
0.4559 8.2903 514 0.9427 0.4723 0.9427 0.9709
0.4559 8.3226 516 0.9373 0.4723 0.9373 0.9682
0.4559 8.3548 518 0.9158 0.4668 0.9158 0.9570
0.4559 8.3871 520 0.8864 0.4732 0.8864 0.9415
0.4559 8.4194 522 0.8558 0.4752 0.8558 0.9251
0.4559 8.4516 524 0.8501 0.4752 0.8501 0.9220
0.4559 8.4839 526 0.8514 0.4752 0.8514 0.9227
0.4559 8.5161 528 0.8425 0.4993 0.8425 0.9179
0.4559 8.5484 530 0.8398 0.4993 0.8398 0.9164
0.4559 8.5806 532 0.8257 0.5072 0.8257 0.9087
0.4559 8.6129 534 0.8324 0.5007 0.8324 0.9124
0.4559 8.6452 536 0.8382 0.5 0.8382 0.9156
0.4559 8.6774 538 0.8495 0.4985 0.8495 0.9217
0.4559 8.7097 540 0.8402 0.4985 0.8402 0.9166
0.4559 8.7419 542 0.8373 0.4985 0.8373 0.9151
0.4559 8.7742 544 0.8309 0.4992 0.8309 0.9115
0.4559 8.8065 546 0.8214 0.5131 0.8214 0.9063
0.4559 8.8387 548 0.8216 0.5131 0.8216 0.9064
0.4559 8.8710 550 0.8425 0.5035 0.8425 0.9179
0.4559 8.9032 552 0.8518 0.5035 0.8518 0.9229
0.4559 8.9355 554 0.8578 0.5035 0.8578 0.9262
0.4559 8.9677 556 0.8463 0.5154 0.8463 0.9200
0.4559 9.0 558 0.8339 0.5015 0.8339 0.9132
0.4559 9.0323 560 0.8337 0.5015 0.8337 0.9130
0.4559 9.0645 562 0.8200 0.5089 0.8200 0.9055
0.4559 9.0968 564 0.7994 0.4995 0.7994 0.8941
0.4559 9.1290 566 0.7837 0.5255 0.7837 0.8853
0.4559 9.1613 568 0.7803 0.5322 0.7803 0.8833
0.4559 9.1935 570 0.7825 0.5189 0.7825 0.8846
0.4559 9.2258 572 0.7894 0.5122 0.7894 0.8885
0.4559 9.2581 574 0.7974 0.5122 0.7974 0.8929
0.4559 9.2903 576 0.8058 0.5023 0.8058 0.8976
0.4559 9.3226 578 0.8152 0.5023 0.8152 0.9029
0.4559 9.3548 580 0.8258 0.5172 0.8258 0.9087
0.4559 9.3871 582 0.8345 0.5163 0.8345 0.9135
0.4559 9.4194 584 0.8408 0.5035 0.8408 0.9170
0.4559 9.4516 586 0.8343 0.5163 0.8343 0.9134
0.4559 9.4839 588 0.8261 0.5163 0.8261 0.9089
0.4559 9.5161 590 0.8082 0.5182 0.8082 0.8990
0.4559 9.5484 592 0.7902 0.5122 0.7902 0.8889
0.4559 9.5806 594 0.7781 0.5122 0.7781 0.8821
0.4559 9.6129 596 0.7699 0.5189 0.7699 0.8774
0.4559 9.6452 598 0.7648 0.5255 0.7648 0.8745
0.4559 9.6774 600 0.7608 0.5276 0.7608 0.8723
0.4559 9.7097 602 0.7550 0.5344 0.7550 0.8689
0.4559 9.7419 604 0.7500 0.5356 0.7500 0.8660
0.4559 9.7742 606 0.7487 0.5344 0.7487 0.8653
0.4559 9.8065 608 0.7475 0.5344 0.7475 0.8646
0.4559 9.8387 610 0.7467 0.5344 0.7467 0.8641
0.4559 9.8710 612 0.7455 0.5344 0.7455 0.8634
0.4559 9.9032 614 0.7453 0.5344 0.7453 0.8633
0.4559 9.9355 616 0.7456 0.5344 0.7456 0.8635
0.4559 9.9677 618 0.7461 0.5344 0.7461 0.8638
0.4559 10.0 620 0.7467 0.5344 0.7467 0.8641
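
The final-epoch metrics are not the best in the table: validation loss bottoms out at 0.4979 (epoch 1.9032) and Qwk peaks at 0.6075 (epoch 4.5806). A quick sketch of checkpoint selection over a few rows copied from the table:

```python
# (epoch, validation_loss, qwk) -- rows copied from the table above
rows = [
    (1.9032, 0.4979, 0.4924),
    (4.5806, 0.7875, 0.6075),
    (9.9032, 0.7453, 0.5344),
    (10.0,   0.7467, 0.5344),
]

# Pick the checkpoint with the lowest validation loss and the one with the highest Qwk
best_loss = min(rows, key=lambda r: r[1])
best_qwk = max(rows, key=lambda r: r[2])
```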

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1