ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k11_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8168
  • Qwk: 0.5036
  • Mse: 0.8168
  • Rmse: 0.9038
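These metrics are related: the model is trained with an MSE objective, so Loss equals Mse here, and Rmse is its square root. Qwk is quadratic weighted kappa, the standard agreement metric for ordinal scoring tasks like this one. A minimal, dependency-free sketch of both (the helper name is illustrative, not from the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Qwk: agreement between integer labels, penalizing large disagreements quadratically."""
    n = len(y_true)
    # Observed confusion matrix O and its marginal histograms
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    # Quadratic disagreement weight, and expected agreement under independence
    w = lambda i, j: (i - j) ** 2 / (n_classes - 1) ** 2
    num = sum(w(i, j) * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w(i, j) * hist_t[i] * hist_p[j] / n
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

# Rmse is just the square root of the reported Mse:
print(round(math.sqrt(0.8168), 4))  # 0.9038, matching the table
```

Qwk is 1.0 for perfect agreement, 0 for chance-level agreement, and negative when predictions disagree more than chance.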

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
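With these settings and the 620 optimizer steps shown in the training table, the linear schedule decays the learning rate from 2e-05 to 0 over training. A hand-rolled sketch of that schedule (assuming zero warmup, the transformers default when no warmup is configured):

```python
def linear_lr(step, base_lr=2e-5, total_steps=620, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to zero.
    total_steps=620 matches the final step in the training table below."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

At step 310 (halfway) the rate is 1e-05, and at step 620 it reaches zero.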

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0323 2 4.0641 -0.0204 4.0641 2.0160
No log 0.0645 4 2.0954 0.0307 2.0954 1.4475
No log 0.0968 6 1.4884 0.0572 1.4884 1.2200
No log 0.1290 8 1.1007 0.0 1.1007 1.0491
No log 0.1613 10 0.7655 0.1874 0.7655 0.8750
No log 0.1935 12 0.7370 0.2326 0.7370 0.8585
No log 0.2258 14 0.7575 0.2141 0.7575 0.8703
No log 0.2581 16 0.7205 0.1826 0.7205 0.8488
No log 0.2903 18 0.6354 0.2379 0.6354 0.7971
No log 0.3226 20 0.5828 0.3694 0.5828 0.7634
No log 0.3548 22 0.6133 0.4230 0.6133 0.7831
No log 0.3871 24 0.5981 0.4105 0.5981 0.7733
No log 0.4194 26 0.6591 0.2736 0.6591 0.8118
No log 0.4516 28 0.7930 0.1485 0.7930 0.8905
No log 0.4839 30 0.7506 0.2210 0.7506 0.8664
No log 0.5161 32 0.7706 0.2620 0.7706 0.8778
No log 0.5484 34 0.7424 0.2959 0.7424 0.8616
No log 0.5806 36 0.6637 0.3195 0.6637 0.8147
No log 0.6129 38 0.6580 0.2769 0.6580 0.8112
No log 0.6452 40 0.6286 0.3169 0.6286 0.7928
No log 0.6774 42 0.6428 0.3393 0.6428 0.8018
No log 0.7097 44 0.6578 0.3341 0.6578 0.8111
No log 0.7419 46 0.6184 0.4086 0.6184 0.7864
No log 0.7742 48 0.6290 0.3270 0.6290 0.7931
No log 0.8065 50 0.6335 0.3224 0.6335 0.7959
No log 0.8387 52 0.5971 0.4284 0.5971 0.7727
No log 0.8710 54 0.5841 0.3797 0.5841 0.7643
No log 0.9032 56 0.5766 0.3757 0.5766 0.7593
No log 0.9355 58 0.5391 0.3789 0.5391 0.7343
No log 0.9677 60 0.5145 0.4390 0.5145 0.7173
No log 1.0 62 0.5613 0.3722 0.5613 0.7492
No log 1.0323 64 0.5841 0.3888 0.5841 0.7643
No log 1.0645 66 0.6259 0.3794 0.6259 0.7911
No log 1.0968 68 0.7051 0.3811 0.7051 0.8397
No log 1.1290 70 0.6400 0.4480 0.6400 0.8000
No log 1.1613 72 0.5567 0.5116 0.5567 0.7461
No log 1.1935 74 0.5492 0.5320 0.5492 0.7411
No log 1.2258 76 0.5571 0.5268 0.5571 0.7464
No log 1.2581 78 0.6242 0.5295 0.6242 0.7901
No log 1.2903 80 0.7222 0.4757 0.7222 0.8498
No log 1.3226 82 0.8668 0.3174 0.8668 0.9310
No log 1.3548 84 1.0035 0.3020 1.0035 1.0017
No log 1.3871 86 0.9812 0.3430 0.9812 0.9906
No log 1.4194 88 0.7439 0.5029 0.7439 0.8625
No log 1.4516 90 0.6257 0.5668 0.6257 0.7910
No log 1.4839 92 0.6299 0.5707 0.6299 0.7936
No log 1.5161 94 0.6266 0.5725 0.6266 0.7916
No log 1.5484 96 0.7451 0.5206 0.7451 0.8632
No log 1.5806 98 0.9807 0.4560 0.9807 0.9903
No log 1.6129 100 1.0512 0.4261 1.0512 1.0253
No log 1.6452 102 0.9029 0.4714 0.9029 0.9502
No log 1.6774 104 0.7958 0.4844 0.7958 0.8921
No log 1.7097 106 0.6978 0.5348 0.6978 0.8354
No log 1.7419 108 0.6967 0.5124 0.6967 0.8347
No log 1.7742 110 0.7085 0.4770 0.7085 0.8417
No log 1.8065 112 0.8030 0.4884 0.8030 0.8961
No log 1.8387 114 0.9246 0.4977 0.9246 0.9616
No log 1.8710 116 0.7977 0.5344 0.7977 0.8931
No log 1.9032 118 0.6931 0.4730 0.6931 0.8325
No log 1.9355 120 0.7534 0.4901 0.7534 0.8680
No log 1.9677 122 0.7375 0.5377 0.7375 0.8588
No log 2.0 124 0.9200 0.5316 0.9200 0.9592
No log 2.0323 126 0.8483 0.5184 0.8483 0.9210
No log 2.0645 128 0.7749 0.5439 0.7749 0.8803
No log 2.0968 130 0.7153 0.5157 0.7153 0.8458
No log 2.1290 132 0.7097 0.5566 0.7097 0.8424
No log 2.1613 134 0.7958 0.4871 0.7958 0.8921
No log 2.1935 136 0.9737 0.4579 0.9737 0.9868
No log 2.2258 138 0.9520 0.4636 0.9520 0.9757
No log 2.2581 140 0.9830 0.4891 0.9830 0.9914
No log 2.2903 142 1.0615 0.4575 1.0615 1.0303
No log 2.3226 144 1.1649 0.4277 1.1649 1.0793
No log 2.3548 146 1.0658 0.4466 1.0658 1.0324
No log 2.3871 148 0.9055 0.5520 0.9055 0.9516
No log 2.4194 150 0.8949 0.5011 0.8949 0.9460
No log 2.4516 152 0.9135 0.4731 0.9135 0.9558
No log 2.4839 154 1.0100 0.4148 1.0100 1.0050
No log 2.5161 156 0.9832 0.3905 0.9832 0.9916
No log 2.5484 158 0.7507 0.4720 0.7507 0.8664
No log 2.5806 160 0.6064 0.5245 0.6064 0.7787
No log 2.6129 162 0.6604 0.5164 0.6604 0.8127
No log 2.6452 164 0.6710 0.5473 0.6710 0.8191
No log 2.6774 166 0.7045 0.5049 0.7045 0.8393
No log 2.7097 168 0.9057 0.4511 0.9057 0.9517
No log 2.7419 170 0.8690 0.4566 0.8690 0.9322
No log 2.7742 172 0.7074 0.4715 0.7074 0.8411
No log 2.8065 174 0.7023 0.4635 0.7023 0.8380
No log 2.8387 176 0.7577 0.5018 0.7577 0.8704
No log 2.8710 178 0.7293 0.5143 0.7293 0.8540
No log 2.9032 180 0.7352 0.5258 0.7352 0.8574
No log 2.9355 182 0.8023 0.5166 0.8023 0.8957
No log 2.9677 184 0.9155 0.5087 0.9155 0.9568
No log 3.0 186 1.0901 0.4755 1.0901 1.0441
No log 3.0323 188 1.1907 0.4509 1.1907 1.0912
No log 3.0645 190 1.1316 0.4732 1.1316 1.0638
No log 3.0968 192 1.1986 0.4507 1.1986 1.0948
No log 3.1290 194 0.9949 0.4782 0.9949 0.9974
No log 3.1613 196 0.8799 0.5462 0.8799 0.9380
No log 3.1935 198 0.8952 0.5196 0.8952 0.9462
No log 3.2258 200 0.9049 0.5291 0.9049 0.9512
No log 3.2581 202 1.0701 0.4860 1.0701 1.0344
No log 3.2903 204 1.0939 0.4855 1.0939 1.0459
No log 3.3226 206 0.9484 0.4921 0.9484 0.9739
No log 3.3548 208 0.8129 0.5286 0.8129 0.9016
No log 3.3871 210 0.6957 0.5633 0.6957 0.8341
No log 3.4194 212 0.6825 0.5384 0.6825 0.8261
No log 3.4516 214 0.8439 0.4868 0.8439 0.9187
No log 3.4839 216 0.9589 0.4060 0.9589 0.9792
No log 3.5161 218 0.9646 0.4455 0.9646 0.9821
No log 3.5484 220 0.9015 0.4992 0.9015 0.9495
No log 3.5806 222 0.9232 0.4874 0.9232 0.9609
No log 3.6129 224 0.9129 0.5407 0.9129 0.9554
No log 3.6452 226 0.9591 0.5095 0.9591 0.9793
No log 3.6774 228 1.1055 0.4441 1.1055 1.0514
No log 3.7097 230 1.2378 0.4239 1.2378 1.1125
No log 3.7419 232 1.1938 0.4328 1.1938 1.0926
No log 3.7742 234 1.0021 0.4846 1.0021 1.0011
No log 3.8065 236 0.9204 0.5253 0.9204 0.9594
No log 3.8387 238 0.9099 0.5429 0.9099 0.9539
No log 3.8710 240 1.0122 0.5116 1.0122 1.0061
No log 3.9032 242 1.0681 0.4957 1.0681 1.0335
No log 3.9355 244 0.9423 0.5121 0.9423 0.9707
No log 3.9677 246 0.8348 0.5146 0.8348 0.9137
No log 4.0 248 0.6971 0.4935 0.6971 0.8349
No log 4.0323 250 0.6609 0.5222 0.6609 0.8130
No log 4.0645 252 0.6805 0.5504 0.6805 0.8249
No log 4.0968 254 0.7926 0.4903 0.7926 0.8903
No log 4.1290 256 0.9535 0.4722 0.9535 0.9765
No log 4.1613 258 0.9991 0.4514 0.9991 0.9996
No log 4.1935 260 0.9479 0.4846 0.9479 0.9736
No log 4.2258 262 0.8318 0.5505 0.8318 0.9120
No log 4.2581 264 0.7980 0.5595 0.7980 0.8933
No log 4.2903 266 0.7463 0.6032 0.7463 0.8639
No log 4.3226 268 0.7460 0.5479 0.7460 0.8637
No log 4.3548 270 0.7844 0.5110 0.7844 0.8856
No log 4.3871 272 0.8549 0.5170 0.8549 0.9246
No log 4.4194 274 0.9513 0.5118 0.9513 0.9754
No log 4.4516 276 0.8896 0.5132 0.8896 0.9432
No log 4.4839 278 0.7382 0.4567 0.7382 0.8592
No log 4.5161 280 0.6939 0.4803 0.6939 0.8330
No log 4.5484 282 0.7072 0.4617 0.7072 0.8409
No log 4.5806 284 0.7681 0.4910 0.7681 0.8764
No log 4.6129 286 0.9145 0.5085 0.9145 0.9563
No log 4.6452 288 0.9771 0.5157 0.9771 0.9885
No log 4.6774 290 0.9495 0.5085 0.9495 0.9744
No log 4.7097 292 0.8754 0.5045 0.8754 0.9356
No log 4.7419 294 0.8768 0.5345 0.8768 0.9364
No log 4.7742 296 0.8831 0.5115 0.8831 0.9397
No log 4.8065 298 0.9377 0.5031 0.9377 0.9683
No log 4.8387 300 1.0338 0.4809 1.0338 1.0168
No log 4.8710 302 0.9646 0.4895 0.9646 0.9821
No log 4.9032 304 0.8273 0.4816 0.8273 0.9096
No log 4.9355 306 0.7134 0.5400 0.7134 0.8446
No log 4.9677 308 0.6759 0.5375 0.6759 0.8221
No log 5.0 310 0.6857 0.5746 0.6857 0.8281
No log 5.0323 312 0.7488 0.5223 0.7488 0.8653
No log 5.0645 314 0.7626 0.5178 0.7626 0.8733
No log 5.0968 316 0.7355 0.5376 0.7355 0.8576
No log 5.1290 318 0.7929 0.4976 0.7929 0.8904
No log 5.1613 320 0.7859 0.5076 0.7859 0.8865
No log 5.1935 322 0.7544 0.4759 0.7544 0.8686
No log 5.2258 324 0.7803 0.4754 0.7803 0.8834
No log 5.2581 326 0.8432 0.4958 0.8432 0.9183
No log 5.2903 328 0.9166 0.4880 0.9166 0.9574
No log 5.3226 330 0.9719 0.4810 0.9719 0.9859
No log 5.3548 332 0.9716 0.4926 0.9716 0.9857
No log 5.3871 334 0.8892 0.5081 0.8892 0.9430
No log 5.4194 336 0.7958 0.5219 0.7958 0.8921
No log 5.4516 338 0.7566 0.5344 0.7566 0.8698
No log 5.4839 340 0.7008 0.5519 0.7008 0.8372
No log 5.5161 342 0.6693 0.5849 0.6693 0.8181
No log 5.5484 344 0.6916 0.6002 0.6916 0.8316
No log 5.5806 346 0.7931 0.5513 0.7931 0.8906
No log 5.6129 348 0.8710 0.4754 0.8710 0.9333
No log 5.6452 350 0.9052 0.4734 0.9052 0.9514
No log 5.6774 352 0.8759 0.4682 0.8759 0.9359
No log 5.7097 354 0.7964 0.5104 0.7964 0.8924
No log 5.7419 356 0.7396 0.5093 0.7396 0.8600
No log 5.7742 358 0.6821 0.5040 0.6821 0.8259
No log 5.8065 360 0.6691 0.5418 0.6691 0.8180
No log 5.8387 362 0.6871 0.5260 0.6871 0.8289
No log 5.8710 364 0.7408 0.5010 0.7408 0.8607
No log 5.9032 366 0.7939 0.5110 0.7939 0.8910
No log 5.9355 368 0.8229 0.5400 0.8229 0.9072
No log 5.9677 370 0.9110 0.4998 0.9110 0.9545
No log 6.0 372 0.9071 0.5224 0.9071 0.9524
No log 6.0323 374 0.8473 0.5557 0.8473 0.9205
No log 6.0645 376 0.7826 0.5637 0.7826 0.8846
No log 6.0968 378 0.7729 0.5334 0.7729 0.8792
No log 6.1290 380 0.7744 0.5572 0.7744 0.8800
No log 6.1613 382 0.8313 0.5201 0.8313 0.9118
No log 6.1935 384 0.9220 0.5154 0.9220 0.9602
No log 6.2258 386 0.9009 0.5229 0.9009 0.9492
No log 6.2581 388 0.8297 0.5228 0.8297 0.9109
No log 6.2903 390 0.8226 0.5171 0.8226 0.9070
No log 6.3226 392 0.7780 0.5047 0.7780 0.8821
No log 6.3548 394 0.7704 0.4995 0.7704 0.8777
No log 6.3871 396 0.8081 0.4849 0.8081 0.8990
No log 6.4194 398 0.8350 0.4886 0.8350 0.9138
No log 6.4516 400 0.8413 0.5149 0.8413 0.9172
No log 6.4839 402 0.8712 0.4968 0.8712 0.9334
No log 6.5161 404 0.8722 0.5082 0.8722 0.9339
No log 6.5484 406 0.8266 0.4910 0.8266 0.9092
No log 6.5806 408 0.8041 0.4926 0.8041 0.8967
No log 6.6129 410 0.8358 0.4997 0.8358 0.9142
No log 6.6452 412 0.8807 0.4901 0.8807 0.9384
No log 6.6774 414 0.8869 0.4762 0.8869 0.9418
No log 6.7097 416 0.8181 0.5021 0.8181 0.9045
No log 6.7419 418 0.7757 0.4992 0.7757 0.8807
No log 6.7742 420 0.7629 0.4941 0.7629 0.8734
No log 6.8065 422 0.7747 0.4773 0.7747 0.8802
No log 6.8387 424 0.8409 0.5043 0.8409 0.9170
No log 6.8710 426 0.8624 0.5035 0.8624 0.9287
No log 6.9032 428 0.8431 0.5043 0.8431 0.9182
No log 6.9355 430 0.8395 0.4893 0.8395 0.9162
No log 6.9677 432 0.8194 0.4839 0.8194 0.9052
No log 7.0 434 0.8460 0.5008 0.8460 0.9198
No log 7.0323 436 0.8863 0.5013 0.8863 0.9414
No log 7.0645 438 0.8995 0.5078 0.8995 0.9484
No log 7.0968 440 0.8633 0.5152 0.8633 0.9291
No log 7.1290 442 0.7806 0.5216 0.7806 0.8835
No log 7.1613 444 0.7458 0.4942 0.7458 0.8636
No log 7.1935 446 0.7454 0.4890 0.7454 0.8634
No log 7.2258 448 0.7627 0.4937 0.7627 0.8733
No log 7.2581 450 0.8013 0.5067 0.8013 0.8952
No log 7.2903 452 0.7976 0.5182 0.7976 0.8931
No log 7.3226 454 0.7837 0.5015 0.7837 0.8853
No log 7.3548 456 0.7418 0.4855 0.7418 0.8613
No log 7.3871 458 0.6936 0.5114 0.6936 0.8329
No log 7.4194 460 0.6756 0.5222 0.6756 0.8220
No log 7.4516 462 0.6849 0.5190 0.6849 0.8276
No log 7.4839 464 0.7053 0.4989 0.7053 0.8398
No log 7.5161 466 0.7183 0.5078 0.7183 0.8475
No log 7.5484 468 0.7129 0.5027 0.7129 0.8443
No log 7.5806 470 0.6963 0.4981 0.6963 0.8344
No log 7.6129 472 0.6925 0.5190 0.6925 0.8322
No log 7.6452 474 0.6993 0.4847 0.6993 0.8362
No log 7.6774 476 0.7185 0.4699 0.7185 0.8477
No log 7.7097 478 0.7292 0.4795 0.7292 0.8539
No log 7.7419 480 0.7416 0.4829 0.7416 0.8612
No log 7.7742 482 0.7576 0.4829 0.7576 0.8704
No log 7.8065 484 0.7425 0.4901 0.7425 0.8617
No log 7.8387 486 0.7508 0.4773 0.7508 0.8665
No log 7.8710 488 0.7956 0.4914 0.7956 0.8920
No log 7.9032 490 0.8514 0.5117 0.8514 0.9227
No log 7.9355 492 0.8603 0.5166 0.8603 0.9275
No log 7.9677 494 0.8557 0.5117 0.8557 0.9250
No log 8.0 496 0.8503 0.5117 0.8503 0.9221
No log 8.0323 498 0.8351 0.5076 0.8351 0.9138
0.4809 8.0645 500 0.8461 0.5117 0.8461 0.9198
0.4809 8.0968 502 0.8460 0.5126 0.8460 0.9198
0.4809 8.1290 504 0.8359 0.5076 0.8359 0.9143
0.4809 8.1613 506 0.8675 0.5111 0.8676 0.9314
0.4809 8.1935 508 0.8756 0.5157 0.8756 0.9358
0.4809 8.2258 510 0.8489 0.5070 0.8489 0.9214
0.4809 8.2581 512 0.8220 0.5028 0.8220 0.9066
0.4809 8.2903 514 0.7792 0.5052 0.7792 0.8827
0.4809 8.3226 516 0.7481 0.4915 0.7481 0.8650
0.4809 8.3548 518 0.7359 0.4986 0.7359 0.8579
0.4809 8.3871 520 0.7368 0.5041 0.7368 0.8584
0.4809 8.4194 522 0.7353 0.5221 0.7353 0.8575
0.4809 8.4516 524 0.7317 0.5221 0.7317 0.8554
0.4809 8.4839 526 0.7350 0.5221 0.7350 0.8573
0.4809 8.5161 528 0.7319 0.4986 0.7319 0.8555
0.4809 8.5484 530 0.7472 0.5211 0.7472 0.8644
0.4809 8.5806 532 0.7709 0.5044 0.7709 0.8780
0.4809 8.6129 534 0.8020 0.5044 0.8020 0.8956
0.4809 8.6452 536 0.8409 0.5071 0.8409 0.9170
0.4809 8.6774 538 0.8623 0.5123 0.8623 0.9286
0.4809 8.7097 540 0.8749 0.5170 0.8749 0.9353
0.4809 8.7419 542 0.8744 0.5174 0.8744 0.9351
0.4809 8.7742 544 0.8627 0.5174 0.8627 0.9288
0.4809 8.8065 546 0.8390 0.5078 0.8390 0.9160
0.4809 8.8387 548 0.8058 0.5036 0.8058 0.8976
0.4809 8.8710 550 0.7891 0.5044 0.7891 0.8883
0.4809 8.9032 552 0.7743 0.4655 0.7743 0.8800
0.4809 8.9355 554 0.7658 0.4715 0.7658 0.8751
0.4809 8.9677 556 0.7663 0.4720 0.7663 0.8754
0.4809 9.0 558 0.7717 0.4659 0.7717 0.8785
0.4809 9.0323 560 0.7900 0.4818 0.7900 0.8888
0.4809 9.0645 562 0.8071 0.5036 0.8071 0.8984
0.4809 9.0968 564 0.8157 0.5028 0.8157 0.9032
0.4809 9.1290 566 0.8353 0.5078 0.8353 0.9139
0.4809 9.1613 568 0.8453 0.5063 0.8453 0.9194
0.4809 9.1935 570 0.8356 0.5129 0.8356 0.9141
0.4809 9.2258 572 0.8281 0.5078 0.8281 0.9100
0.4809 9.2581 574 0.8107 0.5036 0.8107 0.9004
0.4809 9.2903 576 0.7933 0.5044 0.7933 0.8907
0.4809 9.3226 578 0.7779 0.4659 0.7779 0.8820
0.4809 9.3548 580 0.7712 0.4659 0.7712 0.8782
0.4809 9.3871 582 0.7723 0.4659 0.7723 0.8788
0.4809 9.4194 584 0.7792 0.4659 0.7792 0.8827
0.4809 9.4516 586 0.7915 0.4655 0.7915 0.8896
0.4809 9.4839 588 0.8078 0.4985 0.8078 0.8988
0.4809 9.5161 590 0.8245 0.5028 0.8245 0.9080
0.4809 9.5484 592 0.8353 0.5078 0.8353 0.9140
0.4809 9.5806 594 0.8464 0.5129 0.8464 0.9200
0.4809 9.6129 596 0.8547 0.5129 0.8547 0.9245
0.4809 9.6452 598 0.8568 0.5175 0.8568 0.9256
0.4809 9.6774 600 0.8572 0.5129 0.8572 0.9259
0.4809 9.7097 602 0.8523 0.5129 0.8523 0.9232
0.4809 9.7419 604 0.8457 0.5078 0.8457 0.9196
0.4809 9.7742 606 0.8397 0.5078 0.8397 0.9164
0.4809 9.8065 608 0.8363 0.5078 0.8363 0.9145
0.4809 9.8387 610 0.8299 0.5078 0.8299 0.9110
0.4809 9.8710 612 0.8243 0.5036 0.8243 0.9079
0.4809 9.9032 614 0.8202 0.5036 0.8202 0.9057
0.4809 9.9355 616 0.8182 0.5036 0.8182 0.9046
0.4809 9.9677 618 0.8172 0.5036 0.8172 0.9040
0.4809 10.0 620 0.8168 0.5036 0.8168 0.9038
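A few bookkeeping facts can be read off the table: epoch 1.0 falls at step 62, so with train_batch_size=8 the training set holds roughly 62 × 8 ≈ 496 examples, and 10 epochs give the 620 total steps in the final row. The training-loss column reads "No log" until step 500 because the Trainer reports it only every logging_steps (500 by default), which is why the single value 0.4809 first appears there. A quick check (the ≈496 figure assumes full batches, which the log does not confirm):

```python
steps_per_epoch = 62                       # epoch 1.0 is logged at step 62
batch_size = 8                             # train_batch_size above
approx_train_examples = steps_per_epoch * batch_size
total_steps = steps_per_epoch * 10         # num_epochs = 10
print(approx_train_examples, total_steps)  # 496 620
```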

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1