ArabicNewSplits5_FineTuningAraBERT_run1_AugV5_k10_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7313
  • Qwk: 0.6905
  • Mse: 0.7313
  • Rmse: 0.8551
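The reported Qwk is Cohen's kappa with quadratic weights, and Rmse is simply the square root of Mse (note that Loss and Mse are identical above, consistent with a regression head trained on an MSE loss). A minimal pure-Python sketch of the Qwk computation, using placeholder scores rather than the model's actual outputs:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    n = len(y_true)
    # Observed agreement matrix.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix from the marginal label histograms.
    hist_t = [sum(1 for t in y_true if t == c) for c in range(n_classes)]
    hist_p = [sum(1 for p in y_pred if p == c) for c in range(n_classes)]
    expected = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic disagreement weights: distant score pairs cost more.
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * observed[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * expected[i][j]
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

# Placeholder integer scores; the real evaluation labels are not published.
labels = [0, 1, 2, 3, 4, 2, 1]
preds = [0, 1, 2, 2, 4, 3, 1]
print(quadratic_weighted_kappa(labels, preds, n_classes=5))

# Rmse is the square root of Mse, e.g. the final evaluation row:
print(math.sqrt(0.7313))  # ≈ 0.8551
```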

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
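With a linear lr_scheduler_type, the learning rate decays from its initial value to zero over the total number of optimizer steps (630 here, per the results table below). A minimal sketch of that schedule, assuming zero warmup steps (the transformers default when none is listed):

```python
def linear_schedule_lr(step, base_lr=2e-05, warmup_steps=0, total_steps=630):
    """Learning rate at a given optimizer step under a linear schedule."""
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at total_steps.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# The rate at the start, midpoint, and end of the 10-epoch run:
print(linear_schedule_lr(0))    # 2e-05
print(linear_schedule_lr(315))  # 1e-05
print(linear_schedule_lr(630))  # 0.0
```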

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0317 2 5.3624 -0.0390 5.3624 2.3157
No log 0.0635 4 3.0038 0.0859 3.0038 1.7332
No log 0.0952 6 1.9875 0.0341 1.9875 1.4098
No log 0.1270 8 1.6567 -0.0127 1.6567 1.2871
No log 0.1587 10 2.3963 -0.2153 2.3963 1.5480
No log 0.1905 12 2.2869 -0.1729 2.2869 1.5123
No log 0.2222 14 1.8120 -0.1670 1.8120 1.3461
No log 0.2540 16 1.7592 -0.0751 1.7592 1.3263
No log 0.2857 18 1.6276 0.0066 1.6276 1.2758
No log 0.3175 20 1.5140 0.0301 1.5140 1.2305
No log 0.3492 22 1.3094 0.2270 1.3094 1.1443
No log 0.3810 24 1.1123 0.3318 1.1123 1.0547
No log 0.4127 26 1.0491 0.3706 1.0491 1.0243
No log 0.4444 28 1.0336 0.3651 1.0336 1.0167
No log 0.4762 30 1.2457 0.2584 1.2457 1.1161
No log 0.5079 32 1.8739 0.2427 1.8739 1.3689
No log 0.5397 34 2.5148 0.2224 2.5148 1.5858
No log 0.5714 36 2.5519 0.2230 2.5519 1.5975
No log 0.6032 38 2.0128 0.2840 2.0128 1.4187
No log 0.6349 40 1.5244 0.2762 1.5244 1.2346
No log 0.6667 42 1.3312 0.3578 1.3312 1.1538
No log 0.6984 44 1.0940 0.4004 1.0940 1.0459
No log 0.7302 46 1.0836 0.4487 1.0836 1.0409
No log 0.7619 48 0.8912 0.5559 0.8912 0.9440
No log 0.7937 50 0.8695 0.6193 0.8695 0.9325
No log 0.8254 52 1.2619 0.5133 1.2619 1.1233
No log 0.8571 54 1.4285 0.4716 1.4285 1.1952
No log 0.8889 56 1.0931 0.5600 1.0931 1.0455
No log 0.9206 58 0.8925 0.6344 0.8925 0.9447
No log 0.9524 60 0.8282 0.6166 0.8282 0.9101
No log 0.9841 62 0.8545 0.5802 0.8545 0.9244
No log 1.0159 64 0.9971 0.5316 0.9971 0.9986
No log 1.0476 66 1.4182 0.4324 1.4182 1.1909
No log 1.0794 68 1.5204 0.4129 1.5204 1.2330
No log 1.1111 70 1.2532 0.4519 1.2532 1.1195
No log 1.1429 72 0.9703 0.5121 0.9703 0.9850
No log 1.1746 74 0.8335 0.5244 0.8335 0.9130
No log 1.2063 76 0.8102 0.5824 0.8102 0.9001
No log 1.2381 78 0.7866 0.6303 0.7866 0.8869
No log 1.2698 80 0.8353 0.6303 0.8353 0.9139
No log 1.3016 82 0.9445 0.6016 0.9445 0.9719
No log 1.3333 84 1.0055 0.5880 1.0055 1.0028
No log 1.3651 86 0.8081 0.6027 0.8081 0.8989
No log 1.3968 88 0.7707 0.6343 0.7707 0.8779
No log 1.4286 90 0.8836 0.5627 0.8836 0.9400
No log 1.4603 92 0.8418 0.5994 0.8418 0.9175
No log 1.4921 94 0.7421 0.6523 0.7421 0.8615
No log 1.5238 96 1.2268 0.5060 1.2268 1.1076
No log 1.5556 98 1.6148 0.4143 1.6148 1.2708
No log 1.5873 100 1.3343 0.5050 1.3343 1.1551
No log 1.6190 102 0.8038 0.6500 0.8038 0.8965
No log 1.6508 104 0.7295 0.6730 0.7295 0.8541
No log 1.6825 106 0.7295 0.6844 0.7295 0.8541
No log 1.7143 108 0.7578 0.6426 0.7578 0.8705
No log 1.7460 110 0.9024 0.5965 0.9024 0.9499
No log 1.7778 112 0.8676 0.6210 0.8676 0.9314
No log 1.8095 114 0.7159 0.6925 0.7159 0.8461
No log 1.8413 116 0.6400 0.7088 0.6400 0.8000
No log 1.8730 118 0.6952 0.6852 0.6952 0.8338
No log 1.9048 120 0.6559 0.6637 0.6559 0.8099
No log 1.9365 122 0.6234 0.7265 0.6234 0.7895
No log 1.9683 124 0.6856 0.7314 0.6856 0.8280
No log 2.0 126 0.7214 0.7223 0.7214 0.8493
No log 2.0317 128 0.6775 0.7019 0.6775 0.8231
No log 2.0635 130 0.7223 0.6897 0.7223 0.8499
No log 2.0952 132 0.8111 0.6759 0.8111 0.9006
No log 2.1270 134 0.7426 0.6681 0.7426 0.8617
No log 2.1587 136 0.7499 0.6758 0.7499 0.8660
No log 2.1905 138 0.7294 0.6900 0.7294 0.8541
No log 2.2222 140 0.7890 0.6593 0.7890 0.8882
No log 2.2540 142 0.8971 0.6561 0.8971 0.9472
No log 2.2857 144 0.8037 0.6666 0.8037 0.8965
No log 2.3175 146 0.7011 0.6650 0.7011 0.8373
No log 2.3492 148 0.6961 0.6612 0.6961 0.8343
No log 2.3810 150 0.6891 0.6612 0.6891 0.8301
No log 2.4127 152 0.6859 0.6625 0.6859 0.8282
No log 2.4444 154 0.7239 0.6619 0.7239 0.8508
No log 2.4762 156 0.8751 0.6446 0.8751 0.9355
No log 2.5079 158 0.8500 0.6439 0.8500 0.9219
No log 2.5397 160 0.7251 0.6848 0.7251 0.8516
No log 2.5714 162 0.7276 0.6531 0.7276 0.8530
No log 2.6032 164 0.7423 0.6611 0.7423 0.8616
No log 2.6349 166 0.7602 0.6618 0.7602 0.8719
No log 2.6667 168 0.7818 0.6382 0.7818 0.8842
No log 2.6984 170 0.7864 0.6427 0.7864 0.8868
No log 2.7302 172 0.7797 0.6448 0.7797 0.8830
No log 2.7619 174 0.8528 0.6167 0.8528 0.9235
No log 2.7937 176 1.0502 0.5677 1.0502 1.0248
No log 2.8254 178 0.9713 0.5872 0.9713 0.9855
No log 2.8571 180 0.8149 0.6338 0.8149 0.9027
No log 2.8889 182 0.7990 0.6669 0.7990 0.8939
No log 2.9206 184 0.8105 0.6690 0.8105 0.9003
No log 2.9524 186 0.7363 0.6676 0.7363 0.8581
No log 2.9841 188 0.7390 0.6866 0.7390 0.8596
No log 3.0159 190 0.7315 0.6645 0.7315 0.8553
No log 3.0476 192 0.7323 0.6636 0.7323 0.8558
No log 3.0794 194 0.7440 0.6921 0.7440 0.8625
No log 3.1111 196 0.8274 0.6578 0.8274 0.9096
No log 3.1429 198 0.8815 0.6362 0.8815 0.9389
No log 3.1746 200 0.8349 0.6648 0.8349 0.9137
No log 3.2063 202 0.7686 0.6880 0.7686 0.8767
No log 3.2381 204 0.7598 0.6975 0.7598 0.8716
No log 3.2698 206 0.7391 0.6957 0.7391 0.8597
No log 3.3016 208 0.7276 0.6799 0.7276 0.8530
No log 3.3333 210 0.7560 0.6929 0.7560 0.8695
No log 3.3651 212 0.8041 0.6832 0.8041 0.8967
No log 3.3968 214 0.8920 0.6616 0.8920 0.9444
No log 3.4286 216 0.8820 0.6573 0.8820 0.9392
No log 3.4603 218 0.8148 0.6875 0.8148 0.9027
No log 3.4921 220 0.7062 0.6974 0.7062 0.8403
No log 3.5238 222 0.6959 0.6818 0.6959 0.8342
No log 3.5556 224 0.7432 0.6816 0.7432 0.8621
No log 3.5873 226 0.8619 0.6415 0.8619 0.9284
No log 3.6190 228 0.8886 0.6318 0.8886 0.9427
No log 3.6508 230 0.8539 0.6481 0.8539 0.9240
No log 3.6825 232 0.8324 0.6754 0.8324 0.9124
No log 3.7143 234 0.8335 0.6811 0.8335 0.9129
No log 3.7460 236 0.8442 0.6673 0.8442 0.9188
No log 3.7778 238 0.9441 0.6137 0.9441 0.9717
No log 3.8095 240 1.0134 0.6154 1.0134 1.0067
No log 3.8413 242 0.9665 0.6229 0.9665 0.9831
No log 3.8730 244 0.8577 0.6877 0.8577 0.9261
No log 3.9048 246 0.7031 0.7049 0.7031 0.8385
No log 3.9365 248 0.6801 0.7098 0.6801 0.8247
No log 3.9683 250 0.6785 0.6865 0.6785 0.8237
No log 4.0 252 0.6800 0.6865 0.6800 0.8246
No log 4.0317 254 0.6808 0.7148 0.6808 0.8251
No log 4.0635 256 0.6751 0.6582 0.6751 0.8216
No log 4.0952 258 0.6684 0.6718 0.6684 0.8175
No log 4.1270 260 0.6856 0.7153 0.6856 0.8280
No log 4.1587 262 0.7113 0.6978 0.7113 0.8434
No log 4.1905 264 0.6727 0.7236 0.6727 0.8202
No log 4.2222 266 0.6598 0.6789 0.6598 0.8123
No log 4.2540 268 0.6654 0.6789 0.6654 0.8157
No log 4.2857 270 0.6863 0.6866 0.6863 0.8284
No log 4.3175 272 0.7128 0.6921 0.7128 0.8443
No log 4.3492 274 0.7709 0.6930 0.7709 0.8780
No log 4.3810 276 0.8370 0.6537 0.8370 0.9149
No log 4.4127 278 0.8153 0.6820 0.8153 0.9030
No log 4.4444 280 0.7690 0.6950 0.7690 0.8770
No log 4.4762 282 0.7551 0.6734 0.7551 0.8690
No log 4.5079 284 0.8204 0.7053 0.8204 0.9058
No log 4.5397 286 0.8811 0.6278 0.8811 0.9387
No log 4.5714 288 0.8041 0.7046 0.8041 0.8967
No log 4.6032 290 0.7542 0.7049 0.7542 0.8684
No log 4.6349 292 0.7698 0.6868 0.7698 0.8774
No log 4.6667 294 0.7454 0.6868 0.7454 0.8633
No log 4.6984 296 0.7641 0.6546 0.7641 0.8741
No log 4.7302 298 0.7815 0.6377 0.7815 0.8840
No log 4.7619 300 0.7249 0.6820 0.7249 0.8514
No log 4.7937 302 0.7174 0.6944 0.7174 0.8470
No log 4.8254 304 0.6936 0.7173 0.6936 0.8328
No log 4.8571 306 0.7174 0.6703 0.7174 0.8470
No log 4.8889 308 0.7992 0.6432 0.7992 0.8940
No log 4.9206 310 0.8619 0.6359 0.8619 0.9284
No log 4.9524 312 1.0115 0.6010 1.0115 1.0057
No log 4.9841 314 1.0431 0.5940 1.0431 1.0213
No log 5.0159 316 1.0346 0.6312 1.0346 1.0172
No log 5.0476 318 0.9494 0.6398 0.9494 0.9744
No log 5.0794 320 0.8351 0.6508 0.8351 0.9138
No log 5.1111 322 0.7812 0.6701 0.7812 0.8839
No log 5.1429 324 0.8163 0.6454 0.8163 0.9035
No log 5.1746 326 0.9087 0.6593 0.9087 0.9533
No log 5.2063 328 0.9950 0.6298 0.9950 0.9975
No log 5.2381 330 1.0751 0.5972 1.0751 1.0369
No log 5.2698 332 0.9752 0.5951 0.9752 0.9875
No log 5.3016 334 0.7963 0.6798 0.7963 0.8924
No log 5.3333 336 0.6726 0.7007 0.6726 0.8201
No log 5.3651 338 0.6356 0.7465 0.6356 0.7973
No log 5.3968 340 0.6347 0.7592 0.6347 0.7967
No log 5.4286 342 0.6466 0.7567 0.6466 0.8041
No log 5.4603 344 0.6955 0.6852 0.6955 0.8340
No log 5.4921 346 0.7935 0.6695 0.7935 0.8908
No log 5.5238 348 0.8402 0.6228 0.8402 0.9166
No log 5.5556 350 0.8185 0.6532 0.8185 0.9047
No log 5.5873 352 0.7841 0.6875 0.7841 0.8855
No log 5.6190 354 0.7090 0.6852 0.7090 0.8420
No log 5.6508 356 0.6625 0.7636 0.6625 0.8139
No log 5.6825 358 0.6629 0.7380 0.6629 0.8142
No log 5.7143 360 0.7089 0.6832 0.7089 0.8420
No log 5.7460 362 0.7305 0.6753 0.7305 0.8547
No log 5.7778 364 0.7021 0.6832 0.7021 0.8379
No log 5.8095 366 0.6587 0.7386 0.6587 0.8116
No log 5.8413 368 0.6186 0.7416 0.6186 0.7865
No log 5.8730 370 0.6163 0.6778 0.6163 0.7851
No log 5.9048 372 0.6245 0.6757 0.6245 0.7902
No log 5.9365 374 0.6055 0.6858 0.6055 0.7781
No log 5.9683 376 0.6182 0.7406 0.6182 0.7862
No log 6.0 378 0.7000 0.6942 0.7000 0.8366
No log 6.0317 380 0.7572 0.6758 0.7572 0.8702
No log 6.0635 382 0.7736 0.6868 0.7736 0.8795
No log 6.0952 384 0.7659 0.6821 0.7659 0.8752
No log 6.1270 386 0.7419 0.6735 0.7419 0.8613
No log 6.1587 388 0.6933 0.7044 0.6933 0.8327
No log 6.1905 390 0.6429 0.7462 0.6429 0.8018
No log 6.2222 392 0.6459 0.7462 0.6459 0.8037
No log 6.2540 394 0.6958 0.7080 0.6958 0.8341
No log 6.2857 396 0.8075 0.6678 0.8075 0.8986
No log 6.3175 398 0.8703 0.6320 0.8703 0.9329
No log 6.3492 400 0.8210 0.6512 0.8210 0.9061
No log 6.3810 402 0.7679 0.6692 0.7679 0.8763
No log 6.4127 404 0.7093 0.7100 0.7093 0.8422
No log 6.4444 406 0.6921 0.7289 0.6921 0.8319
No log 6.4762 408 0.7131 0.7084 0.7131 0.8445
No log 6.5079 410 0.7721 0.6605 0.7721 0.8787
No log 6.5397 412 0.8381 0.6239 0.8381 0.9155
No log 6.5714 414 0.9277 0.5816 0.9277 0.9632
No log 6.6032 416 0.9115 0.5926 0.9115 0.9547
No log 6.6349 418 0.8513 0.6047 0.8513 0.9227
No log 6.6667 420 0.7753 0.6508 0.7753 0.8805
No log 6.6984 422 0.7444 0.6773 0.7444 0.8628
No log 6.7302 424 0.7628 0.6710 0.7628 0.8734
No log 6.7619 426 0.7996 0.6406 0.7996 0.8942
No log 6.7937 428 0.7920 0.6492 0.7920 0.8899
No log 6.8254 430 0.7613 0.6717 0.7613 0.8725
No log 6.8571 432 0.7348 0.7031 0.7348 0.8572
No log 6.8889 434 0.7180 0.7131 0.7180 0.8474
No log 6.9206 436 0.6861 0.7158 0.6861 0.8283
No log 6.9524 438 0.6668 0.7206 0.6668 0.8166
No log 6.9841 440 0.6695 0.7397 0.6695 0.8183
No log 7.0159 442 0.6681 0.7436 0.6681 0.8173
No log 7.0476 444 0.6640 0.7453 0.6640 0.8149
No log 7.0794 446 0.6658 0.7453 0.6658 0.8160
No log 7.1111 448 0.6987 0.7075 0.6987 0.8359
No log 7.1429 450 0.7365 0.6845 0.7365 0.8582
No log 7.1746 452 0.7914 0.6262 0.7914 0.8896
No log 7.2063 454 0.8039 0.6262 0.8039 0.8966
No log 7.2381 456 0.7810 0.6462 0.7810 0.8838
No log 7.2698 458 0.7221 0.7061 0.7221 0.8497
No log 7.3016 460 0.6824 0.7218 0.6824 0.8261
No log 7.3333 462 0.6667 0.7259 0.6667 0.8165
No log 7.3651 464 0.6707 0.7153 0.6707 0.8190
No log 7.3968 466 0.6762 0.7215 0.6762 0.8223
No log 7.4286 468 0.7134 0.7175 0.7134 0.8446
No log 7.4603 470 0.8078 0.6462 0.8078 0.8988
No log 7.4921 472 0.9445 0.6115 0.9445 0.9719
No log 7.5238 474 1.0057 0.6025 1.0057 1.0029
No log 7.5556 476 1.0105 0.6096 1.0105 1.0052
No log 7.5873 478 0.9656 0.6121 0.9656 0.9827
No log 7.6190 480 0.8876 0.6386 0.8876 0.9421
No log 7.6508 482 0.7837 0.6640 0.7837 0.8853
No log 7.6825 484 0.7152 0.6838 0.7152 0.8457
No log 7.7143 486 0.6902 0.7226 0.6902 0.8308
No log 7.7460 488 0.6950 0.6938 0.6950 0.8336
No log 7.7778 490 0.7322 0.6788 0.7322 0.8557
No log 7.8095 492 0.7831 0.6669 0.7831 0.8850
No log 7.8413 494 0.8237 0.6375 0.8237 0.9076
No log 7.8730 496 0.8643 0.6155 0.8643 0.9297
No log 7.9048 498 0.8751 0.6110 0.8751 0.9355
0.4172 7.9365 500 0.8659 0.6259 0.8659 0.9305
0.4172 7.9683 502 0.8320 0.6396 0.8320 0.9121
0.4172 8.0 504 0.8275 0.6404 0.8275 0.9096
0.4172 8.0317 506 0.8160 0.6439 0.8160 0.9033
0.4172 8.0635 508 0.8136 0.6474 0.8136 0.9020
0.4172 8.0952 510 0.8019 0.6582 0.8019 0.8955
0.4172 8.1270 512 0.7648 0.6753 0.7648 0.8745
0.4172 8.1587 514 0.7267 0.6839 0.7267 0.8525
0.4172 8.1905 516 0.7040 0.7200 0.7040 0.8391
0.4172 8.2222 518 0.6905 0.7227 0.6905 0.8309
0.4172 8.2540 520 0.6968 0.7102 0.6968 0.8347
0.4172 8.2857 522 0.7160 0.7216 0.7160 0.8462
0.4172 8.3175 524 0.7484 0.6821 0.7484 0.8651
0.4172 8.3492 526 0.8095 0.6582 0.8095 0.8997
0.4172 8.3810 528 0.8557 0.6196 0.8557 0.9250
0.4172 8.4127 530 0.8773 0.6159 0.8773 0.9366
0.4172 8.4444 532 0.8689 0.6220 0.8689 0.9322
0.4172 8.4762 534 0.8354 0.6341 0.8354 0.9140
0.4172 8.5079 536 0.8038 0.6566 0.8038 0.8965
0.4172 8.5397 538 0.7861 0.6614 0.7861 0.8866
0.4172 8.5714 540 0.7766 0.6789 0.7766 0.8812
0.4172 8.6032 542 0.7750 0.6800 0.7750 0.8803
0.4172 8.6349 544 0.7741 0.6793 0.7741 0.8798
0.4172 8.6667 546 0.7838 0.6457 0.7838 0.8853
0.4172 8.6984 548 0.8055 0.6294 0.8055 0.8975
0.4172 8.7302 550 0.8259 0.6173 0.8259 0.9088
0.4172 8.7619 552 0.8363 0.6182 0.8363 0.9145
0.4172 8.7937 554 0.8515 0.6182 0.8515 0.9228
0.4172 8.8254 556 0.8503 0.6182 0.8503 0.9221
0.4172 8.8571 558 0.8262 0.6210 0.8262 0.9089
0.4172 8.8889 560 0.8048 0.6390 0.8048 0.8971
0.4172 8.9206 562 0.7825 0.6705 0.7825 0.8846
0.4172 8.9524 564 0.7530 0.6836 0.7530 0.8678
0.4172 8.9841 566 0.7347 0.6836 0.7347 0.8572
0.4172 9.0159 568 0.7181 0.6868 0.7181 0.8474
0.4172 9.0476 570 0.7176 0.6868 0.7176 0.8471
0.4172 9.0794 572 0.7266 0.6868 0.7266 0.8524
0.4172 9.1111 574 0.7424 0.6836 0.7424 0.8616
0.4172 9.1429 576 0.7639 0.6818 0.7639 0.8740
0.4172 9.1746 578 0.7715 0.6774 0.7715 0.8783
0.4172 9.2063 580 0.7667 0.6774 0.7667 0.8756
0.4172 9.2381 582 0.7483 0.6818 0.7483 0.8651
0.4172 9.2698 584 0.7394 0.6836 0.7394 0.8599
0.4172 9.3016 586 0.7245 0.6836 0.7245 0.8512
0.4172 9.3333 588 0.7047 0.7112 0.7047 0.8395
0.4172 9.3651 590 0.6957 0.7215 0.6957 0.8341
0.4172 9.3968 592 0.6941 0.7215 0.6941 0.8331
0.4172 9.4286 594 0.6988 0.7112 0.6988 0.8360
0.4172 9.4603 596 0.7017 0.7112 0.7017 0.8377
0.4172 9.4921 598 0.7042 0.7112 0.7042 0.8392
0.4172 9.5238 600 0.7026 0.7112 0.7026 0.8382
0.4172 9.5556 602 0.7040 0.7089 0.7040 0.8390
0.4172 9.5873 604 0.7103 0.7150 0.7103 0.8428
0.4172 9.6190 606 0.7172 0.6868 0.7172 0.8469
0.4172 9.6508 608 0.7283 0.6836 0.7283 0.8534
0.4172 9.6825 610 0.7373 0.6836 0.7373 0.8586
0.4172 9.7143 612 0.7408 0.6836 0.7408 0.8607
0.4172 9.7460 614 0.7403 0.6836 0.7403 0.8604
0.4172 9.7778 616 0.7371 0.6836 0.7371 0.8585
0.4172 9.8095 618 0.7360 0.6836 0.7360 0.8579
0.4172 9.8413 620 0.7348 0.6836 0.7348 0.8572
0.4172 9.8730 622 0.7340 0.6836 0.7340 0.8567
0.4172 9.9048 624 0.7331 0.6905 0.7331 0.8562
0.4172 9.9365 626 0.7324 0.6905 0.7324 0.8558
0.4172 9.9683 628 0.7316 0.6905 0.7316 0.8553
0.4172 10.0 630 0.7313 0.6905 0.7313 0.8551
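The final checkpoint (Qwk 0.6905) is not the strongest one logged: Qwk peaks at 0.7636 around step 356, and validation loss bottoms out at 0.6055 around step 374. A minimal sketch of scanning the evaluation history for the best step, using a few of the rows above (a full log would be parsed the same way):

```python
# (step, validation_loss, qwk) for a few rows from the table above.
rows = [
    (340, 0.6347, 0.7592),
    (356, 0.6625, 0.7636),
    (374, 0.6055, 0.6858),
    (630, 0.7313, 0.6905),
]

best_by_qwk = max(rows, key=lambda r: r[2])
best_by_loss = min(rows, key=lambda r: r[1])
print(best_by_qwk)   # step 356: the Qwk peak in this run
print(best_by_loss)  # step 374: the lowest validation loss
```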

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
