MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k9_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7244
  • QWK: 0.5147
  • MSE: 0.7244
  • RMSE: 0.8511
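These metrics can be reproduced with scikit-learn. A minimal sketch on hypothetical gold and predicted essay scores (the 0–4 label scale here is an assumption for illustration, not taken from the card):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold and predicted organization scores on a 0-4 scale.
y_true = np.array([0, 1, 2, 3, 4, 2, 1, 3])
y_pred = np.array([0, 1, 2, 2, 3, 2, 2, 3])

# QWK: Cohen's kappa with quadratic weights, the usual choice for
# ordinal essay-scoring labels.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE / RMSE treat the labels as numeric values.
mse = mean_squared_error(y_true, y_pred)   # 0.375 for this toy data
rmse = np.sqrt(mse)
```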

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
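With a linear scheduler and no warmup steps listed, the learning rate simply decays from 2e-05 toward 0 over the run. A minimal sketch of that schedule (the absence of warmup is an assumption, since the card does not mention any):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Learning rate after `step` optimizer steps under a linear decay
    from `base_lr` to 0 over `total_steps` (no warmup assumed)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# Halfway through training the learning rate has halved:
# linear_lr(0, 600)   -> 2e-05
# linear_lr(300, 600) -> 1e-05
```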

Training results

Training loss was only logged every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0667 2 4.2897 0.0182 4.2897 2.0712
No log 0.1333 4 2.4675 -0.0180 2.4675 1.5708
No log 0.2 6 1.8893 0.0203 1.8893 1.3745
No log 0.2667 8 2.3108 -0.0019 2.3108 1.5201
No log 0.3333 10 2.4384 -0.0155 2.4384 1.5615
No log 0.4 12 1.4480 0.0030 1.4480 1.2033
No log 0.4667 14 1.1316 0.0938 1.1316 1.0638
No log 0.5333 16 1.1167 0.2489 1.1167 1.0568
No log 0.6 18 1.3487 0.0496 1.3487 1.1613
No log 0.6667 20 1.3461 0.0878 1.3461 1.1602
No log 0.7333 22 1.1704 0.1471 1.1704 1.0819
No log 0.8 24 0.9619 0.2935 0.9619 0.9808
No log 0.8667 26 0.9685 0.3127 0.9685 0.9841
No log 0.9333 28 1.0081 0.2100 1.0081 1.0040
No log 1.0 30 1.1068 0.2271 1.1068 1.0520
No log 1.0667 32 1.0501 0.2613 1.0501 1.0247
No log 1.1333 34 0.9696 0.3048 0.9696 0.9847
No log 1.2 36 0.9381 0.3916 0.9381 0.9685
No log 1.2667 38 1.0586 0.2125 1.0586 1.0289
No log 1.3333 40 1.1642 0.1744 1.1642 1.0790
No log 1.4 42 1.0579 0.2094 1.0579 1.0286
No log 1.4667 44 0.8792 0.3625 0.8792 0.9376
No log 1.5333 46 0.8569 0.3435 0.8569 0.9257
No log 1.6 48 0.8631 0.3896 0.8631 0.9291
No log 1.6667 50 0.9179 0.3832 0.9179 0.9581
No log 1.7333 52 1.0192 0.2471 1.0192 1.0096
No log 1.8 54 1.0616 0.2618 1.0616 1.0304
No log 1.8667 56 0.9878 0.2618 0.9878 0.9939
No log 1.9333 58 0.8042 0.4217 0.8042 0.8968
No log 2.0 60 0.7140 0.4887 0.7140 0.8450
No log 2.0667 62 0.7288 0.4606 0.7288 0.8537
No log 2.1333 64 0.6959 0.4871 0.6959 0.8342
No log 2.2 66 0.6703 0.5138 0.6703 0.8187
No log 2.2667 68 0.6329 0.5823 0.6329 0.7956
No log 2.3333 70 0.6863 0.6060 0.6863 0.8284
No log 2.4 72 0.6643 0.6071 0.6643 0.8151
No log 2.4667 74 0.6390 0.6143 0.6390 0.7994
No log 2.5333 76 0.6446 0.5939 0.6446 0.8028
No log 2.6 78 0.6642 0.6039 0.6642 0.8150
No log 2.6667 80 0.6804 0.6167 0.6804 0.8249
No log 2.7333 82 0.7051 0.6025 0.7051 0.8397
No log 2.8 84 0.6654 0.5774 0.6654 0.8157
No log 2.8667 86 0.7141 0.5396 0.7141 0.8450
No log 2.9333 88 0.7137 0.5042 0.7137 0.8448
No log 3.0 90 0.7093 0.5763 0.7093 0.8422
No log 3.0667 92 0.7111 0.6247 0.7111 0.8433
No log 3.1333 94 0.7884 0.5902 0.7884 0.8879
No log 3.2 96 0.8066 0.5411 0.8066 0.8981
No log 3.2667 98 0.7198 0.5614 0.7198 0.8484
No log 3.3333 100 0.8062 0.5160 0.8062 0.8979
No log 3.4 102 0.8701 0.5362 0.8701 0.9328
No log 3.4667 104 0.7137 0.6166 0.7137 0.8448
No log 3.5333 106 0.7519 0.5842 0.7519 0.8671
No log 3.6 108 1.3342 0.4308 1.3342 1.1551
No log 3.6667 110 1.5821 0.3040 1.5821 1.2578
No log 3.7333 112 1.3341 0.3330 1.3341 1.1550
No log 3.8 114 0.8352 0.5210 0.8352 0.9139
No log 3.8667 116 0.6850 0.5314 0.6850 0.8276
No log 3.9333 118 0.7990 0.4578 0.7990 0.8939
No log 4.0 120 0.7684 0.5141 0.7684 0.8766
No log 4.0667 122 0.6632 0.6488 0.6632 0.8144
No log 4.1333 124 0.6608 0.5599 0.6608 0.8129
No log 4.2 126 0.8451 0.5400 0.8451 0.9193
No log 4.2667 128 0.9716 0.4681 0.9716 0.9857
No log 4.3333 130 0.8326 0.4371 0.8326 0.9125
No log 4.4 132 0.7492 0.4938 0.7492 0.8656
No log 4.4667 134 0.7021 0.5740 0.7021 0.8379
No log 4.5333 136 0.6836 0.6112 0.6836 0.8268
No log 4.6 138 0.6734 0.5894 0.6734 0.8206
No log 4.6667 140 0.6849 0.5729 0.6849 0.8276
No log 4.7333 142 0.7434 0.5475 0.7434 0.8622
No log 4.8 144 0.7970 0.5346 0.7970 0.8927
No log 4.8667 146 0.7224 0.5510 0.7224 0.8499
No log 4.9333 148 0.6318 0.6306 0.6318 0.7949
No log 5.0 150 0.6209 0.5905 0.6209 0.7880
No log 5.0667 152 0.6127 0.6111 0.6127 0.7828
No log 5.1333 154 0.6789 0.6318 0.6789 0.8239
No log 5.2 156 0.8405 0.5468 0.8405 0.9168
No log 5.2667 158 0.8646 0.5482 0.8646 0.9298
No log 5.3333 160 0.7941 0.5828 0.7941 0.8911
No log 5.4 162 0.7457 0.5629 0.7457 0.8635
No log 5.4667 164 0.7182 0.5396 0.7182 0.8475
No log 5.5333 166 0.6563 0.6310 0.6563 0.8101
No log 5.6 168 0.6360 0.6562 0.6360 0.7975
No log 5.6667 170 0.6882 0.5730 0.6882 0.8296
No log 5.7333 172 0.7964 0.4960 0.7964 0.8924
No log 5.8 174 0.8342 0.3804 0.8342 0.9134
No log 5.8667 176 0.7338 0.5208 0.7338 0.8566
No log 5.9333 178 0.6923 0.6078 0.6923 0.8320
No log 6.0 180 0.7596 0.5650 0.7596 0.8715
No log 6.0667 182 0.7876 0.5650 0.7876 0.8875
No log 6.1333 184 0.7007 0.6067 0.7007 0.8371
No log 6.2 186 0.8123 0.5668 0.8123 0.9013
No log 6.2667 188 0.9609 0.4581 0.9609 0.9802
No log 6.3333 190 0.8789 0.5316 0.8789 0.9375
No log 6.4 192 0.7983 0.5430 0.7983 0.8934
No log 6.4667 194 0.7999 0.5041 0.7999 0.8944
No log 6.5333 196 0.7951 0.5059 0.7951 0.8917
No log 6.6 198 0.7533 0.5603 0.7533 0.8679
No log 6.6667 200 0.7632 0.5596 0.7632 0.8736
No log 6.7333 202 0.8093 0.4757 0.8093 0.8996
No log 6.8 204 0.8983 0.4612 0.8983 0.9478
No log 6.8667 206 0.8981 0.4300 0.8981 0.9477
No log 6.9333 208 0.8577 0.4731 0.8577 0.9261
No log 7.0 210 0.8629 0.4503 0.8629 0.9290
No log 7.0667 212 0.7940 0.5279 0.7940 0.8911
No log 7.1333 214 0.8026 0.4940 0.8026 0.8959
No log 7.2 216 0.7828 0.3860 0.7828 0.8847
No log 7.2667 218 0.7783 0.3877 0.7783 0.8822
No log 7.3333 220 0.7568 0.5301 0.7568 0.8699
No log 7.4 222 0.7278 0.5587 0.7278 0.8531
No log 7.4667 224 0.7363 0.5603 0.7363 0.8581
No log 7.5333 226 0.7649 0.6116 0.7649 0.8746
No log 7.6 228 0.7648 0.6108 0.7648 0.8745
No log 7.6667 230 0.7449 0.5914 0.7449 0.8631
No log 7.7333 232 0.7685 0.6063 0.7685 0.8766
No log 7.8 234 0.7787 0.5493 0.7787 0.8824
No log 7.8667 236 0.8676 0.4825 0.8676 0.9315
No log 7.9333 238 0.8216 0.5245 0.8216 0.9064
No log 8.0 240 0.7577 0.5364 0.7577 0.8704
No log 8.0667 242 0.7369 0.5379 0.7369 0.8584
No log 8.1333 244 0.6962 0.5796 0.6962 0.8344
No log 8.2 246 0.6626 0.5917 0.6626 0.8140
No log 8.2667 248 0.6859 0.6084 0.6859 0.8282
No log 8.3333 250 0.7136 0.5644 0.7136 0.8447
No log 8.4 252 0.6789 0.5859 0.6789 0.8240
No log 8.4667 254 0.6773 0.5774 0.6773 0.8230
No log 8.5333 256 0.6826 0.6272 0.6826 0.8262
No log 8.6 258 0.7361 0.5978 0.7361 0.8580
No log 8.6667 260 0.8214 0.5387 0.8214 0.9063
No log 8.7333 262 0.6845 0.6410 0.6845 0.8273
No log 8.8 264 0.5906 0.6687 0.5906 0.7685
No log 8.8667 266 0.5817 0.7082 0.5817 0.7627
No log 8.9333 268 0.5992 0.6815 0.5992 0.7741
No log 9.0 270 0.6088 0.6664 0.6088 0.7803
No log 9.0667 272 0.6375 0.6455 0.6375 0.7984
No log 9.1333 274 0.6649 0.6630 0.6649 0.8154
No log 9.2 276 0.6353 0.6575 0.6353 0.7970
No log 9.2667 278 0.6351 0.6626 0.6351 0.7969
No log 9.3333 280 0.6242 0.6447 0.6242 0.7900
No log 9.4 282 0.6499 0.6038 0.6499 0.8062
No log 9.4667 284 0.6765 0.6337 0.6765 0.8225
No log 9.5333 286 0.6773 0.5545 0.6773 0.8230
No log 9.6 288 0.6777 0.5866 0.6777 0.8232
No log 9.6667 290 0.6704 0.5842 0.6704 0.8188
No log 9.7333 292 0.6378 0.5949 0.6378 0.7986
No log 9.8 294 0.6570 0.6237 0.6570 0.8106
No log 9.8667 296 0.7394 0.5924 0.7394 0.8599
No log 9.9333 298 0.8777 0.4681 0.8777 0.9369
No log 10.0 300 0.7818 0.5607 0.7818 0.8842
No log 10.0667 302 0.6791 0.5846 0.6791 0.8241
No log 10.1333 304 0.6299 0.5898 0.6299 0.7937
No log 10.2 306 0.6512 0.5898 0.6512 0.8070
No log 10.2667 308 0.6444 0.5872 0.6444 0.8028
No log 10.3333 310 0.6393 0.5846 0.6393 0.7996
No log 10.4 312 0.5887 0.7019 0.5887 0.7672
No log 10.4667 314 0.5831 0.6988 0.5831 0.7636
No log 10.5333 316 0.6121 0.6547 0.6121 0.7824
No log 10.6 318 0.7075 0.6218 0.7075 0.8412
No log 10.6667 320 0.6871 0.6099 0.6871 0.8289
No log 10.7333 322 0.6349 0.6305 0.6349 0.7968
No log 10.8 324 0.6254 0.5503 0.6254 0.7908
No log 10.8667 326 0.6310 0.5618 0.6310 0.7943
No log 10.9333 328 0.6389 0.5618 0.6389 0.7993
No log 11.0 330 0.6430 0.5618 0.6430 0.8019
No log 11.0667 332 0.6929 0.6377 0.6929 0.8324
No log 11.1333 334 0.7443 0.5800 0.7443 0.8627
No log 11.2 336 0.7234 0.5942 0.7234 0.8505
No log 11.2667 338 0.6609 0.6305 0.6609 0.8129
No log 11.3333 340 0.6186 0.6018 0.6186 0.7865
No log 11.4 342 0.6158 0.6491 0.6158 0.7847
No log 11.4667 344 0.6373 0.6144 0.6373 0.7983
No log 11.5333 346 0.6180 0.6598 0.6180 0.7861
No log 11.6 348 0.6332 0.6822 0.6332 0.7958
No log 11.6667 350 0.7676 0.5725 0.7676 0.8761
No log 11.7333 352 0.7537 0.5725 0.7537 0.8682
No log 11.8 354 0.6474 0.6529 0.6474 0.8046
No log 11.8667 356 0.6081 0.6537 0.6081 0.7798
No log 11.9333 358 0.6244 0.5659 0.6244 0.7902
No log 12.0 360 0.6165 0.5771 0.6165 0.7852
No log 12.0667 362 0.6281 0.6073 0.6281 0.7925
No log 12.1333 364 0.7261 0.6099 0.7261 0.8521
No log 12.2 366 0.7316 0.6331 0.7316 0.8553
No log 12.2667 368 0.6356 0.6485 0.6356 0.7973
No log 12.3333 370 0.6148 0.7115 0.6148 0.7841
No log 12.4 372 0.6148 0.6830 0.6148 0.7841
No log 12.4667 374 0.6162 0.6681 0.6162 0.7850
No log 12.5333 376 0.6238 0.5880 0.6238 0.7898
No log 12.6 378 0.6412 0.5959 0.6412 0.8008
No log 12.6667 380 0.6424 0.5959 0.6424 0.8015
No log 12.7333 382 0.6295 0.6045 0.6295 0.7934
No log 12.8 384 0.6561 0.6429 0.6561 0.8100
No log 12.8667 386 0.6896 0.6257 0.6896 0.8304
No log 12.9333 388 0.6280 0.6045 0.6280 0.7924
No log 13.0 390 0.5989 0.6359 0.5989 0.7739
No log 13.0667 392 0.6251 0.6256 0.6251 0.7906
No log 13.1333 394 0.6261 0.5988 0.6261 0.7913
No log 13.2 396 0.6084 0.6380 0.6084 0.7800
No log 13.2667 398 0.6170 0.5855 0.6170 0.7855
No log 13.3333 400 0.6303 0.6045 0.6303 0.7939
No log 13.4 402 0.6382 0.6501 0.6382 0.7989
No log 13.4667 404 0.6444 0.5738 0.6444 0.8027
No log 13.5333 406 0.6164 0.6935 0.6164 0.7851
No log 13.6 408 0.6027 0.6935 0.6027 0.7764
No log 13.6667 410 0.5917 0.7012 0.5917 0.7692
No log 13.7333 412 0.5959 0.6528 0.5959 0.7719
No log 13.8 414 0.6100 0.6639 0.6100 0.7810
No log 13.8667 416 0.6454 0.5663 0.6454 0.8034
No log 13.9333 418 0.6686 0.5846 0.6686 0.8177
No log 14.0 420 0.6558 0.5822 0.6558 0.8098
No log 14.0667 422 0.6137 0.6405 0.6137 0.7834
No log 14.1333 424 0.5959 0.6866 0.5959 0.7719
No log 14.2 426 0.5954 0.7320 0.5954 0.7717
No log 14.2667 428 0.5966 0.6872 0.5966 0.7724
No log 14.3333 430 0.6054 0.6695 0.6054 0.7780
No log 14.4 432 0.6017 0.7041 0.6017 0.7757
No log 14.4667 434 0.6026 0.6843 0.6026 0.7763
No log 14.5333 436 0.5966 0.6843 0.5966 0.7724
No log 14.6 438 0.5929 0.6843 0.5929 0.7700
No log 14.6667 440 0.5808 0.7122 0.5808 0.7621
No log 14.7333 442 0.5778 0.7070 0.5778 0.7601
No log 14.8 444 0.6074 0.6249 0.6074 0.7794
No log 14.8667 446 0.6291 0.6137 0.6291 0.7931
No log 14.9333 448 0.5946 0.6045 0.5946 0.7711
No log 15.0 450 0.5799 0.7095 0.5799 0.7615
No log 15.0667 452 0.5805 0.6460 0.5805 0.7619
No log 15.1333 454 0.6039 0.6371 0.6039 0.7771
No log 15.2 456 0.6234 0.6157 0.6234 0.7895
No log 15.2667 458 0.6427 0.6262 0.6427 0.8017
No log 15.3333 460 0.6151 0.6167 0.6151 0.7843
No log 15.4 462 0.5860 0.6451 0.5860 0.7655
No log 15.4667 464 0.5925 0.6625 0.5925 0.7697
No log 15.5333 466 0.5940 0.6307 0.5940 0.7707
No log 15.6 468 0.6153 0.6025 0.6153 0.7844
No log 15.6667 470 0.6406 0.5678 0.6406 0.8004
No log 15.7333 472 0.6417 0.6054 0.6417 0.8011
No log 15.8 474 0.6475 0.5898 0.6475 0.8047
No log 15.8667 476 0.6364 0.6302 0.6364 0.7977
No log 15.9333 478 0.6312 0.6025 0.6312 0.7945
No log 16.0 480 0.6546 0.5880 0.6546 0.8091
No log 16.0667 482 0.6841 0.5607 0.6841 0.8271
No log 16.1333 484 0.6678 0.5844 0.6678 0.8172
No log 16.2 486 0.6318 0.6564 0.6318 0.7949
No log 16.2667 488 0.6117 0.6740 0.6117 0.7821
No log 16.3333 490 0.6124 0.6564 0.6124 0.7825
No log 16.4 492 0.6501 0.5833 0.6501 0.8063
No log 16.4667 494 0.6887 0.5266 0.6887 0.8299
No log 16.5333 496 0.6825 0.5383 0.6825 0.8261
No log 16.6 498 0.6530 0.5751 0.6530 0.8081
0.2897 16.6667 500 0.6319 0.5590 0.6319 0.7949
0.2897 16.7333 502 0.6287 0.5610 0.6287 0.7929
0.2897 16.8 504 0.6257 0.5905 0.6257 0.7910
0.2897 16.8667 506 0.6259 0.5786 0.6259 0.7912
0.2897 16.9333 508 0.6288 0.5894 0.6288 0.7929
0.2897 17.0 510 0.6482 0.5718 0.6482 0.8051
0.2897 17.0667 512 0.6383 0.5614 0.6383 0.7989
0.2897 17.1333 514 0.6054 0.6415 0.6054 0.7781
0.2897 17.2 516 0.6066 0.6778 0.6066 0.7788
0.2897 17.2667 518 0.6064 0.6684 0.6064 0.7787
0.2897 17.3333 520 0.6019 0.6985 0.6019 0.7758
0.2897 17.4 522 0.6023 0.6197 0.6023 0.7761
0.2897 17.4667 524 0.6173 0.6073 0.6173 0.7857
0.2897 17.5333 526 0.6434 0.5688 0.6434 0.8021
0.2897 17.6 528 0.6447 0.5595 0.6447 0.8029
0.2897 17.6667 530 0.6064 0.6055 0.6064 0.7787
0.2897 17.7333 532 0.5958 0.7110 0.5958 0.7719
0.2897 17.8 534 0.5922 0.7026 0.5922 0.7695
0.2897 17.8667 536 0.5910 0.6931 0.5910 0.7688
0.2897 17.9333 538 0.5833 0.7115 0.5833 0.7637
0.2897 18.0 540 0.5952 0.6797 0.5952 0.7715
0.2897 18.0667 542 0.5899 0.7259 0.5899 0.7680
0.2897 18.1333 544 0.5898 0.6806 0.5898 0.7680
0.2897 18.2 546 0.5866 0.6528 0.5866 0.7659
0.2897 18.2667 548 0.5892 0.6566 0.5892 0.7676
0.2897 18.3333 550 0.5901 0.6566 0.5901 0.7682
0.2897 18.4 552 0.6151 0.5975 0.6151 0.7843
0.2897 18.4667 554 0.6405 0.5975 0.6405 0.8003
0.2897 18.5333 556 0.6356 0.5666 0.6356 0.7973
0.2897 18.6 558 0.6304 0.5622 0.6304 0.7940
0.2897 18.6667 560 0.6322 0.6195 0.6322 0.7951
0.2897 18.7333 562 0.6284 0.6497 0.6284 0.7927
0.2897 18.8 564 0.6147 0.5871 0.6147 0.7840
0.2897 18.8667 566 0.6389 0.5585 0.6389 0.7993
0.2897 18.9333 568 0.6756 0.5846 0.6756 0.8219
0.2897 19.0 570 0.6745 0.6035 0.6745 0.8213
0.2897 19.0667 572 0.6955 0.5852 0.6955 0.8339
0.2897 19.1333 574 0.6556 0.5895 0.6556 0.8097
0.2897 19.2 576 0.6528 0.6128 0.6528 0.8080
0.2897 19.2667 578 0.6391 0.6157 0.6391 0.7994
0.2897 19.3333 580 0.6316 0.5869 0.6316 0.7948
0.2897 19.4 582 0.6106 0.5835 0.6106 0.7814
0.2897 19.4667 584 0.6070 0.6117 0.6070 0.7791
0.2897 19.5333 586 0.5950 0.6427 0.5950 0.7714
0.2897 19.6 588 0.5849 0.6427 0.5849 0.7648
0.2897 19.6667 590 0.5775 0.6424 0.5775 0.7599
0.2897 19.7333 592 0.6030 0.6147 0.6030 0.7765
0.2897 19.8 594 0.5978 0.6147 0.5978 0.7732
0.2897 19.8667 596 0.5589 0.6564 0.5589 0.7476
0.2897 19.9333 598 0.6034 0.6330 0.6034 0.7768
0.2897 20.0 600 0.6935 0.5865 0.6935 0.8328
0.2897 20.0667 602 0.6723 0.6057 0.6723 0.8199
0.2897 20.1333 604 0.5889 0.6623 0.5889 0.7674
0.2897 20.2 606 0.5891 0.5944 0.5891 0.7675
0.2897 20.2667 608 0.6778 0.5824 0.6778 0.8233
0.2897 20.3333 610 0.6962 0.5824 0.6962 0.8344
0.2897 20.4 612 0.6474 0.6053 0.6474 0.8046
0.2897 20.4667 614 0.6090 0.5980 0.6090 0.7804
0.2897 20.5333 616 0.6161 0.6078 0.6161 0.7849
0.2897 20.6 618 0.6139 0.6049 0.6139 0.7835
0.2897 20.6667 620 0.6242 0.6055 0.6242 0.7900
0.2897 20.7333 622 0.7093 0.5370 0.7093 0.8422
0.2897 20.8 624 0.8783 0.4764 0.8783 0.9372
0.2897 20.8667 626 0.9189 0.4964 0.9189 0.9586
0.2897 20.9333 628 0.8367 0.5295 0.8367 0.9147
0.2897 21.0 630 0.7244 0.5147 0.7244 0.8511
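Two consistency checks hold throughout the table: the validation loss equals the MSE in every row (which indicates a mean-squared-error training objective on the scores), and RMSE is its square root. For the final row, for example:

```python
import math

final_mse = 0.7244                # last row / reported evaluation MSE
final_rmse = math.sqrt(final_mse)
print(round(final_rmse, 4))       # -> 0.8511, matching the reported RMSE
```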

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
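A hypothetical pinned install matching the versions above (the PyTorch cu118 wheel index is an assumption about how the original `2.4.0+cu118` build was obtained):

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```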

Safetensors

  • Model size: 0.1B params
  • Tensor type: F32