ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k7_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a metric-computation sketch follows the list):

  • Loss: 0.5784
  • Qwk: 0.5874
  • Mse: 0.5784
  • Rmse: 0.7606
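Loss and Mse coincide here, which is consistent with the model being trained as a regressor under an MSE objective, with Qwk (quadratic weighted kappa) computed on discretized predictions. Below is a minimal sketch of how these metrics are conventionally computed; it is not the author's evaluation code, and the scores are hypothetical:

```python
# Conventional metric computation for essay-scoring tasks.
# `y_true` and `y_pred` are hypothetical scores, for illustration only.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [3, 2, 4, 1, 3]  # hypothetical gold organization scores
y_pred = [3, 2, 3, 2, 4]  # hypothetical model predictions, rounded to ints

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = mse ** 0.5                                             # Rmse = sqrt(Mse)

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```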

Model description

More information needed

Intended uses & limitations

More information needed
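Since usage is not documented, here is a minimal inference sketch. It assumes the checkpoint exposes a single-logit regression head (suggested by the MSE/RMSE metrics); the essay text is a placeholder:

```python
# Minimal inference sketch, assuming a single-logit regression head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = ("MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
            "FineTuningAraBERT_run2_AugV5_k7_task5_organization")
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # placeholder: an Arabic essay to score for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"Predicted organization score: {score:.2f}")
```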

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
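The sketch below maps these hyperparameters onto Hugging Face TrainingArguments. It is an assumption-laden reconstruction, not the author's training script: the regression head (num_labels=1), output directory, and dataset placeholders are all hypothetical.

```python
# Sketch of the listed hyperparameters as Hugging Face TrainingArguments.
from transformers import (AutoModelForSequenceClassification, Trainer,
                          TrainingArguments)

model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,  # assumption: single-score regression head
)

args = TrainingArguments(
    output_dir="arabert-task5-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,       # Adam settings as reported on the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

train_ds = eval_ds = None  # placeholders: the dataset splits are unreported
trainer = Trainer(model=model, args=args,
                  train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()  # uncomment once real datasets are supplied
```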

Training results

The final row (step 510) matches the evaluation results reported at the top of the card.

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0541 2 4.0022 -0.0092 4.0022 2.0006
No log 0.1081 4 2.2289 0.0790 2.2289 1.4929
No log 0.1622 6 1.4834 -0.0078 1.4834 1.2180
No log 0.2162 8 1.0969 0.2341 1.0969 1.0473
No log 0.2703 10 1.1663 0.1284 1.1663 1.0800
No log 0.3243 12 1.2020 0.1379 1.2020 1.0964
No log 0.3784 14 1.2609 0.0427 1.2609 1.1229
No log 0.4324 16 1.1813 0.1101 1.1813 1.0869
No log 0.4865 18 1.1649 0.1389 1.1649 1.0793
No log 0.5405 20 1.1239 0.1276 1.1239 1.0602
No log 0.5946 22 1.5251 0.1138 1.5251 1.2350
No log 0.6486 24 1.3851 0.1362 1.3851 1.1769
No log 0.7027 26 0.9634 0.3011 0.9634 0.9815
No log 0.7568 28 1.0564 0.3210 1.0564 1.0278
No log 0.8108 30 0.9284 0.4344 0.9284 0.9635
No log 0.8649 32 1.1053 0.2547 1.1053 1.0514
No log 0.9189 34 1.5393 0.1825 1.5393 1.2407
No log 0.9730 36 1.3579 0.2089 1.3579 1.1653
No log 1.0270 38 0.9269 0.3753 0.9269 0.9628
No log 1.0811 40 0.8411 0.4943 0.8411 0.9171
No log 1.1351 42 1.0202 0.3782 1.0202 1.0100
No log 1.1892 44 0.8738 0.5176 0.8738 0.9348
No log 1.2432 46 0.7992 0.3998 0.7992 0.8940
No log 1.2973 48 0.8055 0.4951 0.8055 0.8975
No log 1.3514 50 0.7449 0.5758 0.7449 0.8631
No log 1.4054 52 0.8226 0.5020 0.8226 0.9070
No log 1.4595 54 0.8707 0.4894 0.8707 0.9331
No log 1.5135 56 0.7513 0.5383 0.7513 0.8668
No log 1.5676 58 0.6702 0.5688 0.6702 0.8186
No log 1.6216 60 0.7079 0.6053 0.7079 0.8414
No log 1.6757 62 0.9282 0.4487 0.9282 0.9635
No log 1.7297 64 0.8926 0.4681 0.8926 0.9448
No log 1.7838 66 0.6672 0.6762 0.6672 0.8168
No log 1.8378 68 0.6649 0.6018 0.6649 0.8154
No log 1.8919 70 0.6631 0.6275 0.6631 0.8143
No log 1.9459 72 0.6746 0.6297 0.6746 0.8213
No log 2.0 74 0.7076 0.5828 0.7076 0.8412
No log 2.0541 76 0.6829 0.5934 0.6829 0.8264
No log 2.1081 78 0.6803 0.5736 0.6803 0.8248
No log 2.1622 80 0.6455 0.5939 0.6455 0.8034
No log 2.2162 82 0.6698 0.5992 0.6698 0.8184
No log 2.2703 84 0.6378 0.5626 0.6378 0.7986
No log 2.3243 86 0.6551 0.5770 0.6551 0.8094
No log 2.3784 88 0.6083 0.6815 0.6083 0.7799
No log 2.4324 90 0.7251 0.5119 0.7251 0.8516
No log 2.4865 92 0.7235 0.5443 0.7235 0.8506
No log 2.5405 94 0.6209 0.5972 0.6209 0.7880
No log 2.5946 96 0.6406 0.6006 0.6406 0.8004
No log 2.6486 98 0.6377 0.5869 0.6377 0.7985
No log 2.7027 100 0.6462 0.5948 0.6462 0.8039
No log 2.7568 102 0.6523 0.6199 0.6523 0.8077
No log 2.8108 104 0.6237 0.5972 0.6237 0.7898
No log 2.8649 106 0.8063 0.5396 0.8063 0.8980
No log 2.9189 108 0.8789 0.4014 0.8789 0.9375
No log 2.9730 110 0.6702 0.6725 0.6702 0.8187
No log 3.0270 112 0.6478 0.5891 0.6478 0.8048
No log 3.0811 114 0.6360 0.5891 0.6360 0.7975
No log 3.1351 116 0.6412 0.6461 0.6412 0.8008
No log 3.1892 118 0.7018 0.5770 0.7018 0.8377
No log 3.2432 120 0.6017 0.6691 0.6017 0.7757
No log 3.2973 122 0.6296 0.6177 0.6296 0.7935
No log 3.3514 124 0.6206 0.6063 0.6206 0.7878
No log 3.4054 126 0.5662 0.6926 0.5662 0.7525
No log 3.4595 128 0.6019 0.6497 0.6019 0.7758
No log 3.5135 130 0.5479 0.7025 0.5479 0.7402
No log 3.5676 132 0.5443 0.7189 0.5443 0.7377
No log 3.6216 134 0.5967 0.6704 0.5967 0.7725
No log 3.6757 136 0.5625 0.6662 0.5625 0.7500
No log 3.7297 138 0.5465 0.7199 0.5465 0.7392
No log 3.7838 140 0.5866 0.6661 0.5866 0.7659
No log 3.8378 142 0.6635 0.5739 0.6635 0.8146
No log 3.8919 144 0.5897 0.6402 0.5897 0.7679
No log 3.9459 146 0.6143 0.5156 0.6143 0.7838
No log 4.0 148 0.6449 0.4995 0.6449 0.8031
No log 4.0541 150 0.6102 0.6043 0.6102 0.7812
No log 4.1081 152 0.5758 0.6286 0.5758 0.7588
No log 4.1622 154 0.5787 0.5795 0.5787 0.7607
No log 4.2162 156 0.5991 0.6466 0.5991 0.7740
No log 4.2703 158 0.5460 0.6566 0.5460 0.7389
No log 4.3243 160 0.5739 0.6241 0.5739 0.7576
No log 4.3784 162 0.6989 0.5598 0.6989 0.8360
No log 4.4324 164 0.6505 0.5559 0.6505 0.8066
No log 4.4865 166 0.5890 0.5303 0.5890 0.7674
No log 4.5405 168 0.5920 0.5303 0.5920 0.7694
No log 4.5946 170 0.6359 0.5482 0.6359 0.7974
No log 4.6486 172 0.8217 0.5574 0.8217 0.9065
No log 4.7027 174 0.8455 0.5556 0.8455 0.9195
No log 4.7568 176 0.6388 0.5981 0.6388 0.7992
No log 4.8108 178 0.5541 0.6488 0.5541 0.7444
No log 4.8649 180 0.6211 0.5828 0.6211 0.7881
No log 4.9189 182 0.5703 0.5212 0.5703 0.7552
No log 4.9730 184 0.5796 0.5859 0.5796 0.7613
No log 5.0270 186 0.5679 0.5859 0.5679 0.7536
No log 5.0811 188 0.5469 0.6107 0.5469 0.7395
No log 5.1351 190 0.6123 0.6466 0.6123 0.7825
No log 5.1892 192 0.7554 0.5966 0.7554 0.8692
No log 5.2432 194 0.7618 0.5873 0.7618 0.8728
No log 5.2973 196 0.7752 0.6086 0.7752 0.8805
No log 5.3514 198 0.5971 0.6327 0.5971 0.7727
No log 5.4054 200 0.5253 0.6822 0.5253 0.7247
No log 5.4595 202 0.5625 0.6386 0.5625 0.7500
No log 5.5135 204 0.6388 0.5781 0.6388 0.7993
No log 5.5676 206 0.6301 0.5793 0.6301 0.7938
No log 5.6216 208 0.5761 0.5807 0.5761 0.7590
No log 5.6757 210 0.5677 0.5862 0.5677 0.7534
No log 5.7297 212 0.5455 0.6745 0.5455 0.7386
No log 5.7838 214 0.5480 0.6795 0.5480 0.7402
No log 5.8378 216 0.5407 0.6419 0.5407 0.7353
No log 5.8919 218 0.5616 0.6165 0.5616 0.7494
No log 5.9459 220 0.5792 0.5653 0.5792 0.7610
No log 6.0 222 0.6291 0.5819 0.6291 0.7932
No log 6.0541 224 0.7493 0.6396 0.7493 0.8656
No log 6.1081 226 0.6874 0.6161 0.6874 0.8291
No log 6.1622 228 0.5667 0.5795 0.5667 0.7528
No log 6.2162 230 0.5757 0.6165 0.5757 0.7587
No log 6.2703 232 0.5517 0.6575 0.5517 0.7428
No log 6.3243 234 0.5703 0.6470 0.5703 0.7552
No log 6.3784 236 0.6111 0.6347 0.6111 0.7817
No log 6.4324 238 0.5680 0.6470 0.5680 0.7536
No log 6.4865 240 0.5664 0.6673 0.5664 0.7526
No log 6.5405 242 0.5903 0.6557 0.5903 0.7683
No log 6.5946 244 0.5934 0.6511 0.5934 0.7703
No log 6.6486 246 0.5579 0.6712 0.5579 0.7469
No log 6.7027 248 0.6101 0.6445 0.6101 0.7811
No log 6.7568 250 0.6604 0.6173 0.6604 0.8127
No log 6.8108 252 0.5962 0.5902 0.5962 0.7721
No log 6.8649 254 0.6026 0.6123 0.6026 0.7763
No log 6.9189 256 0.6123 0.5932 0.6123 0.7825
No log 6.9730 258 0.6113 0.5555 0.6113 0.7819
No log 7.0270 260 0.6863 0.6150 0.6863 0.8284
No log 7.0811 262 0.7416 0.5683 0.7416 0.8612
No log 7.1351 264 0.6970 0.5583 0.6970 0.8349
No log 7.1892 266 0.6092 0.6099 0.6092 0.7805
No log 7.2432 268 0.6218 0.5522 0.6218 0.7885
No log 7.2973 270 0.6247 0.5183 0.6247 0.7904
No log 7.3514 272 0.6239 0.5459 0.6239 0.7899
No log 7.4054 274 0.6180 0.5759 0.6180 0.7861
No log 7.4595 276 0.5967 0.6184 0.5967 0.7725
No log 7.5135 278 0.6065 0.6122 0.6065 0.7788
No log 7.5676 280 0.7024 0.5491 0.7024 0.8381
No log 7.6216 282 0.6900 0.5509 0.6900 0.8307
No log 7.6757 284 0.6081 0.5978 0.6081 0.7798
No log 7.7297 286 0.6302 0.5678 0.6302 0.7938
No log 7.7838 288 0.7883 0.5389 0.7883 0.8879
No log 7.8378 290 0.7913 0.5389 0.7913 0.8895
No log 7.8919 292 0.6343 0.6082 0.6343 0.7964
No log 7.9459 294 0.5300 0.7010 0.5300 0.7280
No log 8.0 296 0.5454 0.6814 0.5454 0.7385
No log 8.0541 298 0.5963 0.6500 0.5963 0.7722
No log 8.1081 300 0.5875 0.6276 0.5875 0.7665
No log 8.1622 302 0.5522 0.6400 0.5522 0.7431
No log 8.2162 304 0.5590 0.6516 0.5590 0.7477
No log 8.2703 306 0.5745 0.6461 0.5745 0.7580
No log 8.3243 308 0.6168 0.6599 0.6168 0.7854
No log 8.3784 310 0.7839 0.5583 0.7839 0.8854
No log 8.4324 312 0.8137 0.5385 0.8137 0.9020
No log 8.4865 314 0.7214 0.5579 0.7214 0.8494
No log 8.5405 316 0.6033 0.5694 0.6033 0.7768
No log 8.5946 318 0.6125 0.5302 0.6125 0.7826
No log 8.6486 320 0.6138 0.5302 0.6138 0.7834
No log 8.7027 322 0.6124 0.5568 0.6124 0.7825
No log 8.7568 324 0.6204 0.6129 0.6204 0.7877
No log 8.8108 326 0.5797 0.6725 0.5797 0.7614
No log 8.8649 328 0.5272 0.6658 0.5272 0.7261
No log 8.9189 330 0.5799 0.6502 0.5799 0.7615
No log 8.9730 332 0.5748 0.6748 0.5748 0.7582
No log 9.0270 334 0.5491 0.6500 0.5491 0.7410
No log 9.0811 336 0.5606 0.5543 0.5606 0.7487
No log 9.1351 338 0.5906 0.4925 0.5906 0.7685
No log 9.1892 340 0.5815 0.5440 0.5815 0.7626
No log 9.2432 342 0.5593 0.5677 0.5593 0.7478
No log 9.2973 344 0.5630 0.6387 0.5630 0.7503
No log 9.3514 346 0.5910 0.6596 0.5910 0.7688
No log 9.4054 348 0.5739 0.6377 0.5739 0.7576
No log 9.4595 350 0.5540 0.6128 0.5540 0.7443
No log 9.5135 352 0.5618 0.6363 0.5618 0.7495
No log 9.5676 354 0.5569 0.6337 0.5569 0.7463
No log 9.6216 356 0.5386 0.6517 0.5386 0.7339
No log 9.6757 358 0.5723 0.6929 0.5723 0.7565
No log 9.7297 360 0.6041 0.6914 0.6041 0.7772
No log 9.7838 362 0.5916 0.6623 0.5916 0.7692
No log 9.8378 364 0.5678 0.6207 0.5678 0.7535
No log 9.8919 366 0.5793 0.5441 0.5793 0.7611
No log 9.9459 368 0.5839 0.5441 0.5839 0.7642
No log 10.0 370 0.5984 0.5694 0.5984 0.7735
No log 10.0541 372 0.6380 0.5819 0.6380 0.7988
No log 10.1081 374 0.6572 0.5706 0.6572 0.8107
No log 10.1622 376 0.6555 0.5706 0.6555 0.8096
No log 10.2162 378 0.6819 0.5355 0.6819 0.8258
No log 10.2703 380 0.7248 0.5606 0.7248 0.8514
No log 10.3243 382 0.7082 0.5718 0.7082 0.8415
No log 10.3784 384 0.6499 0.5905 0.6499 0.8062
No log 10.4324 386 0.6170 0.5422 0.6170 0.7855
No log 10.4865 388 0.6172 0.5653 0.6172 0.7856
No log 10.5405 390 0.6146 0.5905 0.6146 0.7840
No log 10.5946 392 0.6960 0.6025 0.6960 0.8343
No log 10.6486 394 0.7230 0.5576 0.7230 0.8503
No log 10.7027 396 0.6383 0.6109 0.6383 0.7989
No log 10.7568 398 0.6020 0.5402 0.6020 0.7759
No log 10.8108 400 0.6762 0.5663 0.6762 0.8223
No log 10.8649 402 0.7218 0.5953 0.7218 0.8496
No log 10.9189 404 0.6561 0.5988 0.6561 0.8100
No log 10.9730 406 0.5602 0.6345 0.5602 0.7485
No log 11.0270 408 0.5773 0.6479 0.5773 0.7598
No log 11.0811 410 0.6476 0.6534 0.6476 0.8047
No log 11.1351 412 0.6821 0.5920 0.6821 0.8259
No log 11.1892 414 0.6475 0.5697 0.6475 0.8047
No log 11.2432 416 0.5774 0.5894 0.5774 0.7599
No log 11.2973 418 0.5773 0.6276 0.5773 0.7598
No log 11.3514 420 0.5798 0.6500 0.5798 0.7615
No log 11.4054 422 0.5699 0.6602 0.5699 0.7549
No log 11.4595 424 0.5615 0.6756 0.5615 0.7494
No log 11.5135 426 0.5576 0.6756 0.5576 0.7467
No log 11.5676 428 0.5569 0.6861 0.5569 0.7463
No log 11.6216 430 0.5622 0.6720 0.5622 0.7498
No log 11.6757 432 0.5752 0.6537 0.5752 0.7584
No log 11.7297 434 0.6076 0.6315 0.6076 0.7795
No log 11.7838 436 0.6418 0.5728 0.6418 0.8011
No log 11.8378 438 0.6898 0.5479 0.6898 0.8306
No log 11.8919 440 0.6825 0.5463 0.6825 0.8261
No log 11.9459 442 0.6236 0.6422 0.6236 0.7897
No log 12.0 444 0.5935 0.6252 0.5935 0.7704
No log 12.0541 446 0.5916 0.6229 0.5916 0.7692
No log 12.1081 448 0.5924 0.6154 0.5924 0.7697
No log 12.1622 450 0.5963 0.6154 0.5963 0.7722
No log 12.2162 452 0.6023 0.6262 0.6023 0.7761
No log 12.2703 454 0.6008 0.6262 0.6008 0.7751
No log 12.3243 456 0.5908 0.6154 0.5908 0.7687
No log 12.3784 458 0.6198 0.6060 0.6198 0.7873
No log 12.4324 460 0.6837 0.6130 0.6837 0.8269
No log 12.4865 462 0.6779 0.6130 0.6779 0.8233
No log 12.5405 464 0.6110 0.5865 0.6110 0.7817
No log 12.5946 466 0.5824 0.6057 0.5824 0.7631
No log 12.6486 468 0.5996 0.6218 0.5996 0.7744
No log 12.7027 470 0.6323 0.6310 0.6323 0.7952
No log 12.7568 472 0.6009 0.6670 0.6009 0.7752
No log 12.8108 474 0.5775 0.6043 0.5775 0.7599
No log 12.8649 476 0.5473 0.6517 0.5473 0.7398
No log 12.9189 478 0.6257 0.6072 0.6257 0.7910
No log 12.9730 480 0.7180 0.5722 0.7180 0.8474
No log 13.0270 482 0.6944 0.5508 0.6944 0.8333
No log 13.0811 484 0.6003 0.6269 0.6003 0.7748
No log 13.1351 486 0.5427 0.6254 0.5427 0.7367
No log 13.1892 488 0.5480 0.6332 0.5480 0.7403
No log 13.2432 490 0.5500 0.6254 0.5500 0.7416
No log 13.2973 492 0.5738 0.5225 0.5738 0.7575
No log 13.3514 494 0.6113 0.5636 0.6113 0.7819
No log 13.4054 496 0.6085 0.5627 0.6085 0.7801
No log 13.4595 498 0.5720 0.5928 0.5720 0.7563
0.2748 13.5135 500 0.5435 0.6437 0.5435 0.7373
0.2748 13.5676 502 0.5474 0.6317 0.5474 0.7399
0.2748 13.6216 504 0.5598 0.6084 0.5598 0.7482
0.2748 13.6757 506 0.5658 0.5874 0.5658 0.7522
0.2748 13.7297 508 0.5713 0.6084 0.5713 0.7559
0.2748 13.7838 510 0.5784 0.5874 0.5784 0.7606

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1