ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7611
  • Qwk: 0.4959
  • Mse: 0.7611
  • Rmse: 0.8724
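For context, a minimal usage sketch is shown below. It assumes the checkpoint is published as MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k10_task2_organization and that the fine-tuned head is a single-output (regression-style) sequence-classification head producing an organization score, which would be consistent with the MSE/RMSE/QWK metrics above; adjust accordingly if the actual head differs.

```python
# Hedged usage sketch: assumes a single-logit regression-style head on top of AraBERT,
# consistent with the MSE/RMSE/QWK metrics reported above.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = (
    "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_"
    "FineTuningAraBERT_run2_AugV5_k10_task2_organization"
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "نص تجريبي لتقييم تنظيم المقال"  # placeholder Arabic input
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits

# With num_labels == 1 the single logit is the predicted score; otherwise take the argmax.
score = logits.squeeze().item() if logits.shape[-1] == 1 else logits.argmax(-1).item()
print(score)
```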

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
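Everything not listed above follows the standard Hugging Face Trainer defaults. Below is a hedged sketch of equivalent TrainingArguments; the model, datasets, and compute_metrics function are assumptions that would need to be supplied separately.

```python
# Hedged sketch of TrainingArguments matching the hyperparameters above;
# the output directory name is hypothetical, and train/eval datasets plus
# compute_metrics are left to the reader.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # the log below reports evaluation metrics every 2 steps
    eval_steps=2,
    logging_steps=500,      # training loss first appears at step 500 in the log below
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics)
# trainer.train()
```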

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0345 2 3.9145 0.0017 3.9145 1.9785
No log 0.0690 4 1.9625 0.0540 1.9625 1.4009
No log 0.1034 6 1.1246 0.0696 1.1246 1.0605
No log 0.1379 8 0.7532 0.1146 0.7532 0.8679
No log 0.1724 10 0.7395 0.2638 0.7395 0.8600
No log 0.2069 12 0.7716 0.2009 0.7716 0.8784
No log 0.2414 14 0.8085 0.1819 0.8085 0.8992
No log 0.2759 16 0.7699 0.1727 0.7699 0.8774
No log 0.3103 18 0.7491 0.0800 0.7491 0.8655
No log 0.3448 20 0.8822 -0.0384 0.8822 0.9392
No log 0.3793 22 1.0507 0.0 1.0507 1.0250
No log 0.4138 24 0.9115 0.0324 0.9115 0.9547
No log 0.4483 26 0.7034 0.1073 0.7034 0.8387
No log 0.4828 28 0.6501 0.2151 0.6501 0.8063
No log 0.5172 30 0.6409 0.2149 0.6409 0.8006
No log 0.5517 32 0.6022 0.1930 0.6022 0.7760
No log 0.5862 34 0.5921 0.2270 0.5921 0.7694
No log 0.6207 36 0.6437 0.3558 0.6437 0.8023
No log 0.6552 38 0.5707 0.3653 0.5707 0.7554
No log 0.6897 40 0.7511 0.3868 0.7511 0.8667
No log 0.7241 42 0.7931 0.3741 0.7931 0.8905
No log 0.7586 44 0.5968 0.5108 0.5968 0.7725
No log 0.7931 46 0.8410 0.4101 0.8410 0.9171
No log 0.8276 48 1.0542 0.1990 1.0542 1.0267
No log 0.8621 50 0.9607 0.2841 0.9607 0.9802
No log 0.8966 52 0.7047 0.4196 0.7047 0.8394
No log 0.9310 54 0.6471 0.3633 0.6471 0.8044
No log 0.9655 56 0.9678 0.2540 0.9678 0.9838
No log 1.0 58 1.0307 0.2031 1.0307 1.0152
No log 1.0345 60 0.7887 0.3349 0.7887 0.8881
No log 1.0690 62 0.5906 0.3976 0.5906 0.7685
No log 1.1034 64 0.6515 0.4516 0.6515 0.8072
No log 1.1379 66 0.6267 0.3926 0.6267 0.7916
No log 1.1724 68 0.6033 0.4699 0.6033 0.7767
No log 1.2069 70 0.6109 0.4784 0.6109 0.7816
No log 1.2414 72 0.6451 0.5043 0.6451 0.8032
No log 1.2759 74 0.7366 0.4640 0.7366 0.8582
No log 1.3103 76 0.7469 0.4813 0.7469 0.8643
No log 1.3448 78 0.7089 0.5520 0.7089 0.8420
No log 1.3793 80 0.8209 0.5080 0.8209 0.9060
No log 1.4138 82 0.8321 0.5200 0.8321 0.9122
No log 1.4483 84 0.7738 0.5064 0.7738 0.8797
No log 1.4828 86 0.6896 0.5352 0.6896 0.8304
No log 1.5172 88 0.6806 0.5303 0.6806 0.8250
No log 1.5517 90 0.7403 0.5561 0.7403 0.8604
No log 1.5862 92 0.9000 0.4566 0.9000 0.9487
No log 1.6207 94 0.9110 0.4559 0.9110 0.9545
No log 1.6552 96 0.7505 0.5601 0.7505 0.8663
No log 1.6897 98 0.6512 0.5427 0.6512 0.8070
No log 1.7241 100 0.6664 0.5364 0.6664 0.8163
No log 1.7586 102 0.6844 0.5612 0.6844 0.8273
No log 1.7931 104 0.7363 0.5225 0.7363 0.8581
No log 1.8276 106 0.7873 0.4570 0.7873 0.8873
No log 1.8621 108 0.7973 0.4666 0.7973 0.8929
No log 1.8966 110 0.8245 0.4383 0.8245 0.9080
No log 1.9310 112 0.7863 0.5370 0.7863 0.8868
No log 1.9655 114 0.7368 0.5480 0.7368 0.8584
No log 2.0 116 0.7671 0.5200 0.7671 0.8758
No log 2.0345 118 0.7307 0.5223 0.7307 0.8548
No log 2.0690 120 0.6384 0.5397 0.6384 0.7990
No log 2.1034 122 0.7124 0.5381 0.7124 0.8441
No log 2.1379 124 0.7765 0.4629 0.7765 0.8812
No log 2.1724 126 0.7105 0.5236 0.7105 0.8429
No log 2.2069 128 0.7208 0.5577 0.7208 0.8490
No log 2.2414 130 0.7680 0.5617 0.7680 0.8763
No log 2.2759 132 0.7621 0.5089 0.7621 0.8730
No log 2.3103 134 0.7758 0.5036 0.7758 0.8808
No log 2.3448 136 0.8090 0.5456 0.8090 0.8994
No log 2.3793 138 0.9838 0.4787 0.9838 0.9919
No log 2.4138 140 1.2493 0.4035 1.2493 1.1177
No log 2.4483 142 1.2539 0.4110 1.2539 1.1198
No log 2.4828 144 1.0792 0.4366 1.0792 1.0389
No log 2.5172 146 0.8687 0.5128 0.8687 0.9321
No log 2.5517 148 0.8312 0.4911 0.8312 0.9117
No log 2.5862 150 0.9134 0.4914 0.9134 0.9557
No log 2.6207 152 1.0679 0.4830 1.0679 1.0334
No log 2.6552 154 1.1072 0.4779 1.1072 1.0522
No log 2.6897 156 1.0369 0.4582 1.0369 1.0183
No log 2.7241 158 0.9266 0.4613 0.9266 0.9626
No log 2.7586 160 0.9325 0.5016 0.9325 0.9657
No log 2.7931 162 1.0204 0.4575 1.0204 1.0102
No log 2.8276 164 1.0420 0.4652 1.0420 1.0208
No log 2.8621 166 1.0520 0.4592 1.0520 1.0257
No log 2.8966 168 1.0339 0.4591 1.0339 1.0168
No log 2.9310 170 0.9467 0.4748 0.9467 0.9730
No log 2.9655 172 0.8499 0.4909 0.8499 0.9219
No log 3.0 174 0.8242 0.5060 0.8242 0.9079
No log 3.0345 176 0.8555 0.4974 0.8555 0.9249
No log 3.0690 178 0.8828 0.4852 0.8828 0.9396
No log 3.1034 180 0.9012 0.4762 0.9012 0.9493
No log 3.1379 182 0.8823 0.5127 0.8823 0.9393
No log 3.1724 184 0.8905 0.4985 0.8905 0.9436
No log 3.2069 186 0.8714 0.4918 0.8714 0.9335
No log 3.2414 188 0.8470 0.4929 0.8470 0.9204
No log 3.2759 190 0.8308 0.4770 0.8308 0.9115
No log 3.3103 192 0.8001 0.5044 0.8001 0.8945
No log 3.3448 194 0.7545 0.4863 0.7545 0.8686
No log 3.3793 196 0.7265 0.5003 0.7265 0.8524
No log 3.4138 198 0.7187 0.5229 0.7187 0.8478
No log 3.4483 200 0.7680 0.4919 0.7680 0.8764
No log 3.4828 202 0.9378 0.4434 0.9378 0.9684
No log 3.5172 204 1.2643 0.4194 1.2643 1.1244
No log 3.5517 206 1.6111 0.3530 1.6111 1.2693
No log 3.5862 208 1.5980 0.3467 1.5980 1.2641
No log 3.6207 210 1.3546 0.3652 1.3546 1.1639
No log 3.6552 212 1.1738 0.4035 1.1738 1.0834
No log 3.6897 214 1.0500 0.4317 1.0500 1.0247
No log 3.7241 216 1.0489 0.4628 1.0489 1.0242
No log 3.7586 218 1.0698 0.4677 1.0698 1.0343
No log 3.7931 220 1.1505 0.4528 1.1505 1.0726
No log 3.8276 222 1.2410 0.4469 1.2410 1.1140
No log 3.8621 224 1.2248 0.4457 1.2248 1.1067
No log 3.8966 226 1.1422 0.4569 1.1422 1.0688
No log 3.9310 228 1.1076 0.4558 1.1076 1.0524
No log 3.9655 230 1.0479 0.4661 1.0479 1.0237
No log 4.0 232 0.9070 0.4939 0.9070 0.9524
No log 4.0345 234 0.8546 0.4950 0.8546 0.9244
No log 4.0690 236 0.8076 0.5061 0.8076 0.8987
No log 4.1034 238 0.8113 0.4905 0.8113 0.9007
No log 4.1379 240 0.8319 0.4905 0.8319 0.9121
No log 4.1724 242 0.8349 0.4806 0.8349 0.9137
No log 4.2069 244 0.8405 0.5069 0.8405 0.9168
No log 4.2414 246 0.7997 0.5108 0.7997 0.8943
No log 4.2759 248 0.7317 0.5159 0.7317 0.8554
No log 4.3103 250 0.7176 0.5075 0.7176 0.8471
No log 4.3448 252 0.7138 0.5052 0.7138 0.8449
No log 4.3793 254 0.7169 0.5231 0.7169 0.8467
No log 4.4138 256 0.7259 0.5060 0.7259 0.8520
No log 4.4483 258 0.7411 0.5015 0.7411 0.8609
No log 4.4828 260 0.8066 0.5148 0.8066 0.8981
No log 4.5172 262 0.8551 0.5213 0.8551 0.9247
No log 4.5517 264 0.8619 0.4870 0.8619 0.9284
No log 4.5862 266 0.8184 0.4873 0.8184 0.9047
No log 4.6207 268 0.7846 0.5096 0.7846 0.8858
No log 4.6552 270 0.7743 0.5265 0.7743 0.8799
No log 4.6897 272 0.7823 0.5066 0.7823 0.8845
No log 4.7241 274 0.8650 0.4608 0.8650 0.9300
No log 4.7586 276 0.9375 0.4402 0.9375 0.9683
No log 4.7931 278 0.9653 0.4300 0.9653 0.9825
No log 4.8276 280 0.8885 0.4614 0.8885 0.9426
No log 4.8621 282 0.8569 0.4791 0.8569 0.9257
No log 4.8966 284 0.8916 0.4702 0.8916 0.9443
No log 4.9310 286 0.9892 0.4253 0.9892 0.9946
No log 4.9655 288 0.9977 0.4253 0.9977 0.9988
No log 5.0 290 0.9787 0.4253 0.9787 0.9893
No log 5.0345 292 0.9458 0.4318 0.9458 0.9725
No log 5.0690 294 0.8553 0.4752 0.8553 0.9248
No log 5.1034 296 0.8072 0.4799 0.8072 0.8984
No log 5.1379 298 0.7974 0.4956 0.7974 0.8930
No log 5.1724 300 0.7891 0.5276 0.7891 0.8883
No log 5.2069 302 0.7896 0.5065 0.7896 0.8886
No log 5.2414 304 0.8200 0.4935 0.8200 0.9055
No log 5.2759 306 0.8200 0.5 0.8200 0.9055
No log 5.3103 308 0.7997 0.4995 0.7997 0.8942
No log 5.3448 310 0.8007 0.5099 0.8007 0.8948
No log 5.3793 312 0.8066 0.5071 0.8066 0.8981
No log 5.4138 314 0.8389 0.4804 0.8389 0.9159
No log 5.4483 316 0.8452 0.4827 0.8452 0.9194
No log 5.4828 318 0.8742 0.4736 0.8742 0.9350
No log 5.5172 320 0.8862 0.4100 0.8862 0.9414
No log 5.5517 322 0.9026 0.4214 0.9026 0.9501
No log 5.5862 324 0.8383 0.4337 0.8383 0.9156
No log 5.6207 326 0.7507 0.4585 0.7507 0.8664
No log 5.6552 328 0.7372 0.5316 0.7372 0.8586
No log 5.6897 330 0.7785 0.5565 0.7785 0.8824
No log 5.7241 332 0.8063 0.5417 0.8063 0.8980
No log 5.7586 334 0.8134 0.5243 0.8134 0.9019
No log 5.7931 336 0.8067 0.5138 0.8067 0.8981
No log 5.8276 338 0.7767 0.5219 0.7767 0.8813
No log 5.8621 340 0.7836 0.5043 0.7836 0.8852
No log 5.8966 342 0.7966 0.4816 0.7966 0.8925
No log 5.9310 344 0.8093 0.4810 0.8093 0.8996
No log 5.9655 346 0.7868 0.4810 0.7868 0.8870
No log 6.0 348 0.7716 0.4764 0.7716 0.8784
No log 6.0345 350 0.7502 0.5136 0.7502 0.8662
No log 6.0690 352 0.7300 0.5170 0.7300 0.8544
No log 6.1034 354 0.7289 0.5437 0.7289 0.8538
No log 6.1379 356 0.7483 0.5143 0.7483 0.8650
No log 6.1724 358 0.7904 0.4790 0.7904 0.8891
No log 6.2069 360 0.8038 0.5010 0.8038 0.8966
No log 6.2414 362 0.8055 0.4462 0.8055 0.8975
No log 6.2759 364 0.7801 0.4542 0.7801 0.8832
No log 6.3103 366 0.7696 0.5050 0.7696 0.8773
No log 6.3448 368 0.7633 0.5044 0.7633 0.8737
No log 6.3793 370 0.7733 0.4624 0.7733 0.8794
No log 6.4138 372 0.7789 0.4677 0.7789 0.8825
No log 6.4483 374 0.8128 0.4524 0.8128 0.9015
No log 6.4828 376 0.8365 0.4673 0.8365 0.9146
No log 6.5172 378 0.8660 0.4719 0.8660 0.9306
No log 6.5517 380 0.8424 0.4565 0.8424 0.9178
No log 6.5862 382 0.8181 0.4608 0.8181 0.9045
No log 6.6207 384 0.8118 0.4612 0.8118 0.9010
No log 6.6552 386 0.7955 0.4680 0.7955 0.8919
No log 6.6897 388 0.7925 0.4754 0.7925 0.8902
No log 6.7241 390 0.7718 0.4775 0.7718 0.8785
No log 6.7586 392 0.7559 0.4770 0.7559 0.8694
No log 6.7931 394 0.7445 0.4737 0.7445 0.8628
No log 6.8276 396 0.7545 0.4737 0.7545 0.8686
No log 6.8621 398 0.7634 0.4694 0.7634 0.8737
No log 6.8966 400 0.7611 0.4621 0.7611 0.8724
No log 6.9310 402 0.7756 0.4253 0.7756 0.8807
No log 6.9655 404 0.7673 0.4418 0.7673 0.8760
No log 7.0 406 0.7403 0.5024 0.7403 0.8604
No log 7.0345 408 0.7438 0.5110 0.7438 0.8624
No log 7.0690 410 0.7697 0.4971 0.7697 0.8773
No log 7.1034 412 0.7946 0.4940 0.7946 0.8914
No log 7.1379 414 0.8066 0.4977 0.8066 0.8981
No log 7.1724 416 0.8059 0.4898 0.8059 0.8977
No log 7.2069 418 0.7795 0.5021 0.7795 0.8829
No log 7.2414 420 0.7310 0.5264 0.7310 0.8550
No log 7.2759 422 0.7033 0.5323 0.7033 0.8387
No log 7.3103 424 0.6985 0.5408 0.6985 0.8357
No log 7.3448 426 0.7163 0.5248 0.7163 0.8464
No log 7.3793 428 0.7300 0.5127 0.7300 0.8544
No log 7.4138 430 0.7521 0.5108 0.7521 0.8673
No log 7.4483 432 0.7623 0.5016 0.7623 0.8731
No log 7.4828 434 0.7664 0.5016 0.7664 0.8754
No log 7.5172 436 0.7647 0.5068 0.7647 0.8745
No log 7.5517 438 0.7641 0.5 0.7641 0.8741
No log 7.5862 440 0.7629 0.5111 0.7629 0.8735
No log 7.6207 442 0.7665 0.4820 0.7665 0.8755
No log 7.6552 444 0.8064 0.4565 0.8064 0.8980
No log 7.6897 446 0.8606 0.4595 0.8606 0.9277
No log 7.7241 448 0.9216 0.4529 0.9216 0.9600
No log 7.7586 450 0.9369 0.4529 0.9369 0.9679
No log 7.7931 452 0.9035 0.4529 0.9035 0.9505
No log 7.8276 454 0.8630 0.4549 0.8630 0.9290
No log 7.8621 456 0.8496 0.4499 0.8496 0.9217
No log 7.8966 458 0.8399 0.4398 0.8399 0.9165
No log 7.9310 460 0.8296 0.4700 0.8296 0.9108
No log 7.9655 462 0.8409 0.4816 0.8409 0.9170
No log 8.0 464 0.8501 0.4696 0.8501 0.9220
No log 8.0345 466 0.8363 0.4713 0.8363 0.9145
No log 8.0690 468 0.8358 0.4785 0.8358 0.9142
No log 8.1034 470 0.8560 0.4413 0.8560 0.9252
No log 8.1379 472 0.8719 0.4367 0.8719 0.9338
No log 8.1724 474 0.8903 0.4516 0.8903 0.9436
No log 8.2069 476 0.8723 0.4367 0.8723 0.9340
No log 8.2414 478 0.8513 0.4419 0.8513 0.9226
No log 8.2759 480 0.8172 0.4591 0.8172 0.9040
No log 8.3103 482 0.7781 0.4847 0.7781 0.8821
No log 8.3448 484 0.7608 0.5145 0.7608 0.8723
No log 8.3793 486 0.7607 0.5262 0.7607 0.8722
No log 8.4138 488 0.7624 0.5 0.7624 0.8732
No log 8.4483 490 0.7617 0.5038 0.7617 0.8728
No log 8.4828 492 0.7483 0.5088 0.7483 0.8650
No log 8.5172 494 0.7468 0.5088 0.7468 0.8642
No log 8.5517 496 0.7393 0.4820 0.7393 0.8598
No log 8.5862 498 0.7363 0.4820 0.7363 0.8581
0.4564 8.6207 500 0.7341 0.4820 0.7341 0.8568
0.4564 8.6552 502 0.7394 0.4820 0.7394 0.8599
0.4564 8.6897 504 0.7431 0.4900 0.7431 0.8620
0.4564 8.7241 506 0.7529 0.4817 0.7529 0.8677
0.4564 8.7586 508 0.7733 0.4931 0.7733 0.8793
0.4564 8.7931 510 0.8005 0.4581 0.8005 0.8947
0.4564 8.8276 512 0.8298 0.4518 0.8298 0.9109
0.4564 8.8621 514 0.8563 0.4434 0.8563 0.9253
0.4564 8.8966 516 0.8595 0.4434 0.8595 0.9271
0.4564 8.9310 518 0.8475 0.4518 0.8475 0.9206
0.4564 8.9655 520 0.8369 0.4518 0.8369 0.9148
0.4564 9.0 522 0.8208 0.4521 0.8208 0.9060
0.4564 9.0345 524 0.8131 0.4604 0.8131 0.9017
0.4564 9.0690 526 0.8054 0.4736 0.8054 0.8974
0.4564 9.1034 528 0.7897 0.4762 0.7897 0.8887
0.4564 9.1379 530 0.7748 0.4923 0.7748 0.8802
0.4564 9.1724 532 0.7642 0.4700 0.7642 0.8742
0.4564 9.2069 534 0.7543 0.4971 0.7543 0.8685
0.4564 9.2414 536 0.7444 0.5048 0.7444 0.8628
0.4564 9.2759 538 0.7397 0.5048 0.7397 0.8600
0.4564 9.3103 540 0.7394 0.5048 0.7394 0.8599
0.4564 9.3448 542 0.7423 0.5048 0.7423 0.8615
0.4564 9.3793 544 0.7430 0.5048 0.7430 0.8620
0.4564 9.4138 546 0.7476 0.5039 0.7476 0.8646
0.4564 9.4483 548 0.7559 0.4834 0.7559 0.8694
0.4564 9.4828 550 0.7669 0.4968 0.7669 0.8757
0.4564 9.5172 552 0.7750 0.4596 0.7750 0.8804
0.4564 9.5517 554 0.7756 0.4592 0.7756 0.8807
0.4564 9.5862 556 0.7727 0.4592 0.7727 0.8790
0.4564 9.6207 558 0.7734 0.4592 0.7734 0.8794
0.4564 9.6552 560 0.7723 0.4645 0.7723 0.8788
0.4564 9.6897 562 0.7718 0.4645 0.7718 0.8785
0.4564 9.7241 564 0.7705 0.4592 0.7705 0.8778
0.4564 9.7586 566 0.7684 0.4539 0.7684 0.8766
0.4564 9.7931 568 0.7667 0.4675 0.7667 0.8756
0.4564 9.8276 570 0.7658 0.4881 0.7658 0.8751
0.4564 9.8621 572 0.7644 0.4959 0.7644 0.8743
0.4564 9.8966 574 0.7632 0.4959 0.7632 0.8736
0.4564 9.9310 576 0.7621 0.4959 0.7621 0.8730
0.4564 9.9655 578 0.7614 0.4959 0.7614 0.8726
0.4564 10.0 580 0.7611 0.4959 0.7611 0.8724
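In the table above, Validation Loss, Qwk, Mse, and Rmse are evaluation-set metrics logged every two optimizer steps, while Training Loss is only reported from step 500 onward (hence the "No log" entries). A hedged sketch of how such metrics can be computed with scikit-learn, assuming integer gold scores and rounding continuous predictions for the kappa, is:

```python
# Hedged metric sketch: quadratic weighted kappa (Qwk), MSE, and RMSE,
# assuming integer gold scores and continuous model predictions.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(predictions, labels):
    predictions = np.asarray(predictions, dtype=float).squeeze()
    labels = np.asarray(labels, dtype=float).squeeze()
    mse = mean_squared_error(labels, predictions)
    qwk = cohen_kappa_score(
        labels.round().astype(int),
        predictions.round().astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Example: compute_metrics([0.4, 1.8, 2.1], [0, 2, 2])
```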

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
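Pinning these versions may help when reproducing the run; a small convenience sketch that prints the locally installed versions for comparison:

```python
# Convenience sketch: print installed versions to compare against those listed above.
import datasets
import tokenizers
import torch
import transformers

for name, module in [
    ("Transformers", transformers),
    ("Pytorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```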