ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7879
  • QWK (Quadratic Weighted Kappa): 0.5480
  • MSE: 0.7879
  • RMSE: 0.8877
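
For reference, metrics of this kind can be reproduced from a set of gold scores and model predictions with scikit-learn. This is a minimal sketch; the `y_true` and `y_pred` arrays are hypothetical placeholders, since the actual evaluation data is not included in this card:

```python
# Sketch: computing QWK, MSE, and RMSE as reported above.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 3, 2, 1])  # hypothetical gold scores
y_pred = np.array([0, 1, 2, 2, 2, 1])  # hypothetical predicted scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"QWK: {qwk:.4f}  MSE: {mse:.4f}  RMSE: {rmse:.4f}")
```

Note that when MSE is computed on integer score labels, the reported Loss and MSE coincide, as they do in the table below.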

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
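
A minimal sketch of how these hyperparameters map onto transformers.TrainingArguments. The output path is a placeholder, and the eval/logging step values are inferred from the results table below (evaluation every 2 steps, first logged training loss at step 500), not documented in the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="steps",           # inferred: the table evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,               # inferred: first training loss logged at step 500
)
```

The model, datasets, and metric function would then be wired into a transformers.Trainer in the usual way.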

Training results

Training-loss entries read "No log" until the first logged value at step 500.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0345 2 4.0652 -0.0009 4.0652 2.0162
No log 0.0690 4 2.1214 0.0703 2.1214 1.4565
No log 0.1034 6 1.2092 0.0678 1.2092 1.0996
No log 0.1379 8 0.9316 -0.0343 0.9316 0.9652
No log 0.1724 10 0.7414 0.0610 0.7414 0.8611
No log 0.2069 12 0.7188 0.1727 0.7188 0.8478
No log 0.2414 14 0.6895 0.2101 0.6895 0.8303
No log 0.2759 16 0.6382 0.1876 0.6382 0.7989
No log 0.3103 18 0.6128 0.2671 0.6128 0.7828
No log 0.3448 20 0.5872 0.3152 0.5872 0.7663
No log 0.3793 22 0.5808 0.3335 0.5808 0.7621
No log 0.4138 24 0.6119 0.3313 0.6119 0.7822
No log 0.4483 26 0.6604 0.3098 0.6604 0.8126
No log 0.4828 28 0.6365 0.3906 0.6365 0.7978
No log 0.5172 30 0.6158 0.3622 0.6158 0.7847
No log 0.5517 32 0.6156 0.3466 0.6156 0.7846
No log 0.5862 34 0.7369 0.2192 0.7369 0.8584
No log 0.6207 36 0.8212 0.3067 0.8212 0.9062
No log 0.6552 38 0.7922 0.3063 0.7922 0.8900
No log 0.6897 40 0.7665 0.2919 0.7665 0.8755
No log 0.7241 42 0.6805 0.3961 0.6805 0.8249
No log 0.7586 44 0.6643 0.4194 0.6643 0.8150
No log 0.7931 46 0.6393 0.3979 0.6393 0.7996
No log 0.8276 48 0.6590 0.4456 0.6590 0.8118
No log 0.8621 50 0.7158 0.4872 0.7158 0.8460
No log 0.8966 52 0.6696 0.4982 0.6696 0.8183
No log 0.9310 54 0.7283 0.3951 0.7283 0.8534
No log 0.9655 56 0.7341 0.4135 0.7341 0.8568
No log 1.0 58 0.6153 0.3961 0.6153 0.7844
No log 1.0345 60 0.6132 0.4930 0.6132 0.7831
No log 1.0690 62 0.6579 0.5135 0.6579 0.8111
No log 1.1034 64 0.6953 0.5063 0.6953 0.8339
No log 1.1379 66 0.7644 0.5114 0.7644 0.8743
No log 1.1724 68 0.8702 0.4918 0.8702 0.9329
No log 1.2069 70 0.7333 0.5070 0.7333 0.8563
No log 1.2414 72 0.6599 0.4921 0.6599 0.8124
No log 1.2759 74 0.7018 0.4912 0.7018 0.8377
No log 1.3103 76 0.8108 0.5266 0.8108 0.9004
No log 1.3448 78 1.2260 0.4056 1.2260 1.1073
No log 1.3793 80 1.2148 0.4263 1.2148 1.1022
No log 1.4138 82 0.9399 0.4438 0.9399 0.9695
No log 1.4483 84 0.7822 0.5120 0.7822 0.8844
No log 1.4828 86 0.8103 0.5042 0.8103 0.9002
No log 1.5172 88 0.9864 0.4202 0.9864 0.9932
No log 1.5517 90 1.1445 0.3992 1.1445 1.0698
No log 1.5862 92 1.0854 0.4439 1.0854 1.0418
No log 1.6207 94 0.9631 0.4787 0.9631 0.9814
No log 1.6552 96 0.9775 0.4385 0.9775 0.9887
No log 1.6897 98 1.0267 0.4918 1.0267 1.0132
No log 1.7241 100 1.0863 0.4613 1.0863 1.0423
No log 1.7586 102 0.9598 0.4902 0.9598 0.9797
No log 1.7931 104 0.8666 0.4743 0.8666 0.9309
No log 1.8276 106 0.8273 0.4859 0.8273 0.9095
No log 1.8621 108 0.8138 0.5184 0.8138 0.9021
No log 1.8966 110 0.8252 0.5381 0.8252 0.9084
No log 1.9310 112 0.7829 0.5544 0.7829 0.8848
No log 1.9655 114 0.7858 0.5114 0.7858 0.8864
No log 2.0 116 0.8772 0.5106 0.8772 0.9366
No log 2.0345 118 0.8429 0.4775 0.8429 0.9181
No log 2.0690 120 0.9180 0.4372 0.9180 0.9581
No log 2.1034 122 1.0487 0.4250 1.0487 1.0240
No log 2.1379 124 1.0140 0.4462 1.0140 1.0070
No log 2.1724 126 0.9622 0.4424 0.9622 0.9809
No log 2.2069 128 1.1389 0.4777 1.1389 1.0672
No log 2.2414 130 1.2043 0.4716 1.2043 1.0974
No log 2.2759 132 1.1574 0.4725 1.1574 1.0758
No log 2.3103 134 1.0329 0.4860 1.0329 1.0163
No log 2.3448 136 0.8843 0.5370 0.8843 0.9404
No log 2.3793 138 0.9092 0.4859 0.9092 0.9535
No log 2.4138 140 0.9971 0.4920 0.9971 0.9986
No log 2.4483 142 0.8726 0.5139 0.8726 0.9341
No log 2.4828 144 0.8345 0.5065 0.8345 0.9135
No log 2.5172 146 1.1333 0.4560 1.1333 1.0646
No log 2.5517 148 1.2002 0.4394 1.2002 1.0956
No log 2.5862 150 1.0016 0.5181 1.0016 1.0008
No log 2.6207 152 0.8550 0.5189 0.8550 0.9247
No log 2.6552 154 0.8495 0.5315 0.8495 0.9217
No log 2.6897 156 0.8157 0.5570 0.8157 0.9032
No log 2.7241 158 0.7874 0.5333 0.7874 0.8873
No log 2.7586 160 0.7795 0.5172 0.7795 0.8829
No log 2.7931 162 0.7681 0.5133 0.7681 0.8764
No log 2.8276 164 0.8663 0.5531 0.8663 0.9308
No log 2.8621 166 1.1432 0.4462 1.1432 1.0692
No log 2.8966 168 1.1895 0.4507 1.1895 1.0906
No log 2.9310 170 1.0400 0.4882 1.0400 1.0198
No log 2.9655 172 0.7570 0.5464 0.7570 0.8701
No log 3.0 174 0.8331 0.4154 0.8331 0.9127
No log 3.0345 176 0.8473 0.4132 0.8473 0.9205
No log 3.0690 178 0.7328 0.5122 0.7328 0.8561
No log 3.1034 180 0.7968 0.5518 0.7968 0.8926
No log 3.1379 182 0.8416 0.5595 0.8416 0.9174
No log 3.1724 184 0.7114 0.5605 0.7114 0.8434
No log 3.2069 186 0.6971 0.4507 0.6971 0.8349
No log 3.2414 188 0.7737 0.4386 0.7737 0.8796
No log 3.2759 190 0.7282 0.4682 0.7282 0.8533
No log 3.3103 192 0.8089 0.5272 0.8089 0.8994
No log 3.3448 194 0.9963 0.4795 0.9963 0.9981
No log 3.3793 196 0.9912 0.4824 0.9912 0.9956
No log 3.4138 198 0.8901 0.4937 0.8901 0.9435
No log 3.4483 200 0.8864 0.4825 0.8864 0.9415
No log 3.4828 202 0.8953 0.4768 0.8953 0.9462
No log 3.5172 204 0.8986 0.4972 0.8986 0.9480
No log 3.5517 206 0.8426 0.5069 0.8426 0.9179
No log 3.5862 208 0.7847 0.5335 0.7847 0.8858
No log 3.6207 210 0.7516 0.5722 0.7516 0.8670
No log 3.6552 212 0.7297 0.5494 0.7297 0.8542
No log 3.6897 214 0.7377 0.5520 0.7377 0.8589
No log 3.7241 216 0.7105 0.5782 0.7105 0.8429
No log 3.7586 218 0.7374 0.5911 0.7374 0.8587
No log 3.7931 220 0.7602 0.5745 0.7602 0.8719
No log 3.8276 222 0.7699 0.5524 0.7699 0.8774
No log 3.8621 224 0.8149 0.5355 0.8149 0.9027
No log 3.8966 226 0.8187 0.5381 0.8187 0.9048
No log 3.9310 228 0.8374 0.5215 0.8374 0.9151
No log 3.9655 230 0.8365 0.5513 0.8365 0.9146
No log 4.0 232 0.7834 0.5710 0.7834 0.8851
No log 4.0345 234 0.7258 0.5588 0.7258 0.8519
No log 4.0690 236 0.6726 0.5776 0.6726 0.8201
No log 4.1034 238 0.6933 0.5881 0.6933 0.8326
No log 4.1379 240 0.7220 0.5863 0.7220 0.8497
No log 4.1724 242 0.7625 0.5398 0.7625 0.8732
No log 4.2069 244 0.8657 0.5172 0.8657 0.9304
No log 4.2414 246 0.8532 0.5133 0.8532 0.9237
No log 4.2759 248 0.8281 0.5217 0.8281 0.9100
No log 4.3103 250 0.8279 0.4952 0.8279 0.9099
No log 4.3448 252 0.8237 0.5003 0.8237 0.9076
No log 4.3793 254 0.8431 0.4744 0.8431 0.9182
No log 4.4138 256 0.8494 0.4947 0.8494 0.9216
No log 4.4483 258 0.9081 0.5095 0.9081 0.9530
No log 4.4828 260 0.9340 0.5264 0.9340 0.9664
No log 4.5172 262 0.9580 0.5302 0.9580 0.9788
No log 4.5517 264 0.9404 0.5193 0.9404 0.9697
No log 4.5862 266 0.9517 0.5310 0.9517 0.9755
No log 4.6207 268 1.0096 0.5224 1.0096 1.0048
No log 4.6552 270 1.1317 0.4596 1.1317 1.0638
No log 4.6897 272 1.0908 0.4599 1.0908 1.0444
No log 4.7241 274 0.8990 0.5260 0.8990 0.9482
No log 4.7586 276 0.8204 0.5640 0.8204 0.9058
No log 4.7931 278 0.8580 0.5520 0.8580 0.9263
No log 4.8276 280 0.7999 0.5514 0.7999 0.8944
No log 4.8621 282 0.7924 0.5127 0.7924 0.8902
No log 4.8966 284 0.9807 0.4478 0.9807 0.9903
No log 4.9310 286 1.0936 0.4204 1.0936 1.0458
No log 4.9655 288 1.0024 0.4509 1.0024 1.0012
No log 5.0 290 0.8540 0.5037 0.8540 0.9241
No log 5.0345 292 0.8338 0.5414 0.8338 0.9131
No log 5.0690 294 0.9376 0.4702 0.9376 0.9683
No log 5.1034 296 0.9577 0.4702 0.9577 0.9786
No log 5.1379 298 0.9233 0.4995 0.9233 0.9609
No log 5.1724 300 1.0045 0.4789 1.0045 1.0022
No log 5.2069 302 1.2117 0.4624 1.2117 1.1008
No log 5.2414 304 1.2598 0.4263 1.2598 1.1224
No log 5.2759 306 1.1159 0.4528 1.1159 1.0564
No log 5.3103 308 0.9115 0.4871 0.9115 0.9547
No log 5.3448 310 0.8050 0.5484 0.8050 0.8972
No log 5.3793 312 0.7840 0.5176 0.7840 0.8855
No log 5.4138 314 0.7478 0.5222 0.7478 0.8648
No log 5.4483 316 0.7301 0.5496 0.7301 0.8544
No log 5.4828 318 0.8030 0.5008 0.8030 0.8961
No log 5.5172 320 0.9063 0.4827 0.9063 0.9520
No log 5.5517 322 0.9618 0.4756 0.9618 0.9807
No log 5.5862 324 0.9617 0.4950 0.9617 0.9807
No log 5.6207 326 0.8957 0.5111 0.8957 0.9464
No log 5.6552 328 0.8078 0.5425 0.8078 0.8988
No log 5.6897 330 0.7901 0.5199 0.7901 0.8889
No log 5.7241 332 0.7781 0.5271 0.7781 0.8821
No log 5.7586 334 0.7887 0.5398 0.7887 0.8881
No log 5.7931 336 0.8025 0.5239 0.8025 0.8958
No log 5.8276 338 0.7612 0.5276 0.7612 0.8725
No log 5.8621 340 0.7130 0.5668 0.7130 0.8444
No log 5.8966 342 0.6964 0.5629 0.6964 0.8345
No log 5.9310 344 0.6599 0.6007 0.6599 0.8124
No log 5.9655 346 0.6732 0.5993 0.6732 0.8205
No log 6.0 348 0.7150 0.5628 0.7150 0.8456
No log 6.0345 350 0.7383 0.5535 0.7383 0.8592
No log 6.0690 352 0.7501 0.5470 0.7501 0.8661
No log 6.1034 354 0.7570 0.5333 0.7570 0.8700
No log 6.1379 356 0.8066 0.5272 0.8066 0.8981
No log 6.1724 358 0.9317 0.5176 0.9317 0.9652
No log 6.2069 360 0.9925 0.4901 0.9925 0.9962
No log 6.2414 362 0.9433 0.4851 0.9433 0.9712
No log 6.2759 364 0.8781 0.4925 0.8781 0.9371
No log 6.3103 366 0.8248 0.4990 0.8248 0.9082
No log 6.3448 368 0.8046 0.5267 0.8046 0.8970
No log 6.3793 370 0.8303 0.4997 0.8303 0.9112
No log 6.4138 372 0.8691 0.4982 0.8691 0.9322
No log 6.4483 374 0.8761 0.5129 0.8761 0.9360
No log 6.4828 376 0.9102 0.5111 0.9102 0.9541
No log 6.5172 378 0.8871 0.4905 0.8871 0.9418
No log 6.5517 380 0.9053 0.4955 0.9053 0.9515
No log 6.5862 382 0.8786 0.4955 0.8786 0.9373
No log 6.6207 384 0.8366 0.5095 0.8366 0.9146
No log 6.6552 386 0.8148 0.5067 0.8148 0.9027
No log 6.6897 388 0.8019 0.5052 0.8019 0.8955
No log 6.7241 390 0.8358 0.5013 0.8358 0.9142
No log 6.7586 392 0.8741 0.4916 0.8741 0.9349
No log 6.7931 394 0.8819 0.4916 0.8819 0.9391
No log 6.8276 396 0.8273 0.5106 0.8273 0.9096
No log 6.8621 398 0.7518 0.5163 0.7518 0.8671
No log 6.8966 400 0.7041 0.5157 0.7041 0.8391
No log 6.9310 402 0.7057 0.5303 0.7057 0.8401
No log 6.9655 404 0.7350 0.4968 0.7350 0.8573
No log 7.0 406 0.7709 0.5218 0.7709 0.8780
No log 7.0345 408 0.8481 0.5084 0.8481 0.9209
No log 7.0690 410 0.9113 0.5114 0.9113 0.9546
No log 7.1034 412 0.9612 0.4998 0.9612 0.9804
No log 7.1379 414 0.9824 0.5024 0.9824 0.9912
No log 7.1724 416 0.9924 0.5071 0.9924 0.9962
No log 7.2069 418 0.9540 0.5005 0.9540 0.9767
No log 7.2414 420 0.8738 0.4865 0.8738 0.9348
No log 7.2759 422 0.8049 0.4794 0.8049 0.8971
No log 7.3103 424 0.7876 0.5090 0.7876 0.8875
No log 7.3448 426 0.7889 0.5157 0.7889 0.8882
No log 7.3793 428 0.7990 0.4992 0.7990 0.8939
No log 7.4138 430 0.8353 0.4882 0.8353 0.9140
No log 7.4483 432 0.9275 0.5066 0.9275 0.9631
No log 7.4828 434 0.9997 0.4633 0.9997 0.9998
No log 7.5172 436 1.0136 0.4513 1.0136 1.0068
No log 7.5517 438 1.0314 0.4504 1.0314 1.0156
No log 7.5862 440 0.9983 0.4735 0.9983 0.9992
No log 7.6207 442 0.9112 0.5051 0.9112 0.9546
No log 7.6552 444 0.8243 0.5128 0.8243 0.9079
No log 7.6897 446 0.7618 0.5216 0.7618 0.8728
No log 7.7241 448 0.7427 0.5443 0.7427 0.8618
No log 7.7586 450 0.7359 0.5522 0.7359 0.8579
No log 7.7931 452 0.7385 0.5148 0.7385 0.8594
No log 7.8276 454 0.7679 0.5218 0.7679 0.8763
No log 7.8621 456 0.8091 0.5115 0.8091 0.8995
No log 7.8966 458 0.8693 0.5116 0.8693 0.9324
No log 7.9310 460 0.8830 0.5108 0.8830 0.9397
No log 7.9655 462 0.8468 0.5201 0.8468 0.9202
No log 8.0 464 0.7848 0.5266 0.7848 0.8859
No log 8.0345 466 0.7249 0.5355 0.7249 0.8514
No log 8.0690 468 0.7090 0.5242 0.7090 0.8420
No log 8.1034 470 0.7154 0.5258 0.7154 0.8458
No log 8.1379 472 0.7323 0.5258 0.7323 0.8557
No log 8.1724 474 0.7529 0.5242 0.7529 0.8677
No log 8.2069 476 0.7782 0.5324 0.7782 0.8822
No log 8.2414 478 0.7902 0.5172 0.7902 0.8889
No log 8.2759 480 0.7931 0.5091 0.7931 0.8906
No log 8.3103 482 0.7781 0.5336 0.7781 0.8821
No log 8.3448 484 0.7845 0.4951 0.7845 0.8857
No log 8.3793 486 0.7680 0.5208 0.7680 0.8764
No log 8.4138 488 0.7527 0.5368 0.7527 0.8676
No log 8.4483 490 0.7540 0.5234 0.7540 0.8683
No log 8.4828 492 0.7588 0.5234 0.7588 0.8711
No log 8.5172 494 0.7458 0.5368 0.7458 0.8636
No log 8.5517 496 0.7396 0.5286 0.7396 0.8600
No log 8.5862 498 0.7465 0.5211 0.7465 0.8640
0.4473 8.6207 500 0.7467 0.5211 0.7467 0.8641
0.4473 8.6552 502 0.7496 0.5371 0.7496 0.8658
0.4473 8.6897 504 0.7518 0.5371 0.7518 0.8670
0.4473 8.7241 506 0.7718 0.5137 0.7718 0.8785
0.4473 8.7586 508 0.7936 0.5235 0.7936 0.8908
0.4473 8.7931 510 0.8113 0.5174 0.8113 0.9007
0.4473 8.8276 512 0.8181 0.5106 0.8181 0.9045
0.4473 8.8621 514 0.8124 0.5174 0.8124 0.9014
0.4473 8.8966 516 0.8093 0.5174 0.8093 0.8996
0.4473 8.9310 518 0.7932 0.5174 0.7932 0.8906
0.4473 8.9655 520 0.7646 0.5326 0.7646 0.8744
0.4473 9.0 522 0.7371 0.5169 0.7371 0.8585
0.4473 9.0345 524 0.7163 0.5519 0.7163 0.8464
0.4473 9.0690 526 0.7076 0.5519 0.7076 0.8412
0.4473 9.1034 528 0.7103 0.5519 0.7103 0.8428
0.4473 9.1379 530 0.7107 0.5460 0.7107 0.8430
0.4473 9.1724 532 0.7089 0.5519 0.7089 0.8420
0.4473 9.2069 534 0.7061 0.5519 0.7061 0.8403
0.4473 9.2414 536 0.7087 0.5465 0.7087 0.8419
0.4473 9.2759 538 0.7189 0.5385 0.7189 0.8479
0.4473 9.3103 540 0.7311 0.5253 0.7311 0.8550
0.4473 9.3448 542 0.7405 0.5299 0.7405 0.8605
0.4473 9.3793 544 0.7509 0.5351 0.7509 0.8665
0.4473 9.4138 546 0.7673 0.5454 0.7673 0.8760
0.4473 9.4483 548 0.7754 0.5374 0.7754 0.8806
0.4473 9.4828 550 0.7834 0.5374 0.7834 0.8851
0.4473 9.5172 552 0.7862 0.5374 0.7862 0.8867
0.4473 9.5517 554 0.7807 0.5374 0.7807 0.8836
0.4473 9.5862 556 0.7776 0.5374 0.7776 0.8818
0.4473 9.6207 558 0.7753 0.5440 0.7753 0.8805
0.4473 9.6552 560 0.7737 0.5453 0.7737 0.8796
0.4473 9.6897 562 0.7744 0.5453 0.7744 0.8800
0.4473 9.7241 564 0.7752 0.5440 0.7752 0.8804
0.4473 9.7586 566 0.7781 0.5440 0.7781 0.8821
0.4473 9.7931 568 0.7820 0.5374 0.7820 0.8843
0.4473 9.8276 570 0.7864 0.5480 0.7864 0.8868
0.4473 9.8621 572 0.7885 0.5480 0.7885 0.8880
0.4473 9.8966 574 0.7894 0.5480 0.7894 0.8885
0.4473 9.9310 576 0.7890 0.5480 0.7890 0.8883
0.4473 9.9655 578 0.7884 0.5480 0.7884 0.8879
0.4473 10.0 580 0.7879 0.5480 0.7879 0.8877

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
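
A minimal sketch of loading the checkpoint with the standard transformers API. The MSE/RMSE metrics suggest a regression-style scoring head, but the head configuration is not documented in this card, so that reading (and the example input) is an assumption:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k10_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt")  # hypothetical Arabic input
with torch.no_grad():
    logits = model(**inputs).logits  # raw score(s); interpretation depends on the head
print(logits)
```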