ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k2_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.5802
  • QWK (quadratic weighted kappa): 0.6099
  • MSE: 0.5802
  • RMSE: 0.7617
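
The sketch below shows one way to load the checkpoint for inference. It is a minimal example, assuming the model is published under the repo id named in this card and was trained with a standard `AutoModelForSequenceClassification` head; the number of labels and the post-processing of the logits depend on the actual organization-scoring scheme, which the card does not document.

```python
# Minimal usage sketch (assumptions: repo id as in the card title, single
# sequence-classification/regression head; adapt post-processing to the task).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k2_task5_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization (placeholder)
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# For a regression head (num_labels == 1) the raw logit is the predicted score;
# for a classification head take argmax over the logits instead.
print(logits.squeeze().tolist())
```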

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
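
For reference, the hyperparameters above map onto `transformers.TrainingArguments` roughly as sketched below. This is a hedged reconstruction, not the original training script; `output_dir` and everything outside the listed values are illustrative placeholders.

```python
# Hedged sketch: listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```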

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1538 2 4.1726 -0.0118 4.1726 2.0427
No log 0.3077 4 2.4193 0.0465 2.4193 1.5554
No log 0.4615 6 1.5711 0.0143 1.5711 1.2534
No log 0.6154 8 1.2725 0.0462 1.2725 1.1280
No log 0.7692 10 1.0545 0.2667 1.0545 1.0269
No log 0.9231 12 1.0507 0.2239 1.0507 1.0251
No log 1.0769 14 1.1576 0.1361 1.1576 1.0759
No log 1.2308 16 1.0091 0.2316 1.0091 1.0045
No log 1.3846 18 0.9495 0.2865 0.9495 0.9744
No log 1.5385 20 0.9673 0.2764 0.9673 0.9835
No log 1.6923 22 1.0250 0.2478 1.0250 1.0124
No log 1.8462 24 0.8537 0.3896 0.8537 0.9240
No log 2.0 26 0.8663 0.4754 0.8663 0.9308
No log 2.1538 28 1.0248 0.4668 1.0248 1.0123
No log 2.3077 30 1.4296 0.2847 1.4296 1.1957
No log 2.4615 32 2.0362 0.2495 2.0362 1.4269
No log 2.6154 34 1.5887 0.3583 1.5887 1.2604
No log 2.7692 36 0.7324 0.6500 0.7324 0.8558
No log 2.9231 38 0.9198 0.5443 0.9198 0.9590
No log 3.0769 40 1.0971 0.3083 1.0971 1.0474
No log 3.2308 42 0.6226 0.5823 0.6226 0.7890
No log 3.3846 44 0.7857 0.5530 0.7857 0.8864
No log 3.5385 46 1.0247 0.4754 1.0247 1.0123
No log 3.6923 48 0.8764 0.5690 0.8764 0.9362
No log 3.8462 50 0.6209 0.6297 0.6209 0.7879
No log 4.0 52 0.7702 0.6029 0.7702 0.8776
No log 4.1538 54 0.9271 0.4852 0.9271 0.9629
No log 4.3077 56 0.7361 0.6401 0.7361 0.8579
No log 4.4615 58 0.5948 0.6350 0.5948 0.7712
No log 4.6154 60 0.7468 0.6382 0.7468 0.8642
No log 4.7692 62 0.7357 0.6565 0.7357 0.8577
No log 4.9231 64 0.6131 0.6509 0.6131 0.7830
No log 5.0769 66 0.5878 0.6664 0.5878 0.7667
No log 5.2308 68 0.5907 0.6628 0.5907 0.7686
No log 5.3846 70 0.5852 0.6374 0.5852 0.7650
No log 5.5385 72 0.6772 0.6678 0.6772 0.8229
No log 5.6923 74 0.7330 0.6291 0.7330 0.8562
No log 5.8462 76 0.7830 0.6201 0.7830 0.8849
No log 6.0 78 0.7960 0.6201 0.7960 0.8922
No log 6.1538 80 0.6320 0.6737 0.6320 0.7950
No log 6.3077 82 0.5979 0.6125 0.5979 0.7733
No log 6.4615 84 0.6220 0.6411 0.6220 0.7887
No log 6.6154 86 0.6129 0.6403 0.6129 0.7829
No log 6.7692 88 0.5642 0.6484 0.5642 0.7511
No log 6.9231 90 0.5645 0.6301 0.5645 0.7513
No log 7.0769 92 0.5766 0.6830 0.5766 0.7593
No log 7.2308 94 0.6244 0.5826 0.6244 0.7902
No log 7.3846 96 0.8437 0.5819 0.8437 0.9185
No log 7.5385 98 0.7807 0.6263 0.7807 0.8836
No log 7.6923 100 0.5970 0.6447 0.5970 0.7727
No log 7.8462 102 0.6133 0.6958 0.6133 0.7831
No log 8.0 104 0.6207 0.6696 0.6207 0.7878
No log 8.1538 106 0.5666 0.6846 0.5666 0.7527
No log 8.3077 108 0.7925 0.6421 0.7925 0.8902
No log 8.4615 110 1.0515 0.4607 1.0515 1.0254
No log 8.6154 112 0.8762 0.5798 0.8762 0.9361
No log 8.7692 114 0.5839 0.6820 0.5839 0.7642
No log 8.9231 116 0.5703 0.6693 0.5703 0.7552
No log 9.0769 118 0.5803 0.6966 0.5803 0.7618
No log 9.2308 120 0.7264 0.5759 0.7264 0.8523
No log 9.3846 122 0.6685 0.5792 0.6685 0.8176
No log 9.5385 124 0.5733 0.6249 0.5733 0.7571
No log 9.6923 126 0.5725 0.6664 0.5725 0.7566
No log 9.8462 128 0.6026 0.6589 0.6026 0.7763
No log 10.0 130 0.5933 0.6564 0.5933 0.7703
No log 10.1538 132 0.6120 0.6464 0.6120 0.7823
No log 10.3077 134 0.6528 0.6828 0.6528 0.8079
No log 10.4615 136 0.6747 0.6451 0.6747 0.8214
No log 10.6154 138 0.6483 0.7063 0.6483 0.8052
No log 10.7692 140 0.6029 0.6853 0.6029 0.7765
No log 10.9231 142 0.5832 0.6748 0.5832 0.7637
No log 11.0769 144 0.5987 0.6516 0.5987 0.7738
No log 11.2308 146 0.5791 0.6625 0.5791 0.7610
No log 11.3846 148 0.5656 0.6733 0.5656 0.7521
No log 11.5385 150 0.5755 0.6857 0.5755 0.7586
No log 11.6923 152 0.5681 0.7034 0.5681 0.7537
No log 11.8462 154 0.6633 0.6442 0.6633 0.8144
No log 12.0 156 0.7578 0.5969 0.7578 0.8705
No log 12.1538 158 0.7148 0.5969 0.7148 0.8455
No log 12.3077 160 0.5779 0.6401 0.5779 0.7602
No log 12.4615 162 0.5840 0.6269 0.5840 0.7642
No log 12.6154 164 0.6014 0.6092 0.6014 0.7755
No log 12.7692 166 0.5649 0.6252 0.5649 0.7516
No log 12.9231 168 0.5701 0.7104 0.5701 0.7550
No log 13.0769 170 0.6655 0.5902 0.6655 0.8158
No log 13.2308 172 0.6281 0.6657 0.6281 0.7926
No log 13.3846 174 0.5503 0.6813 0.5503 0.7418
No log 13.5385 176 0.6325 0.7116 0.6325 0.7953
No log 13.6923 178 0.6465 0.6682 0.6465 0.8041
No log 13.8462 180 0.5930 0.6751 0.5930 0.7701
No log 14.0 182 0.5565 0.6537 0.5565 0.7460
No log 14.1538 184 0.5784 0.6544 0.5784 0.7605
No log 14.3077 186 0.6012 0.6704 0.6012 0.7754
No log 14.4615 188 0.6015 0.7285 0.6015 0.7756
No log 14.6154 190 0.5778 0.7077 0.5778 0.7601
No log 14.7692 192 0.5794 0.7070 0.5794 0.7612
No log 14.9231 194 0.5754 0.7178 0.5754 0.7585
No log 15.0769 196 0.5690 0.6995 0.5690 0.7543
No log 15.2308 198 0.5858 0.6108 0.5858 0.7653
No log 15.3846 200 0.5904 0.6228 0.5904 0.7684
No log 15.5385 202 0.5987 0.6488 0.5987 0.7737
No log 15.6923 204 0.5743 0.7272 0.5743 0.7578
No log 15.8462 206 0.6122 0.6649 0.6122 0.7824
No log 16.0 208 0.6875 0.6274 0.6875 0.8292
No log 16.1538 210 0.6686 0.6395 0.6686 0.8177
No log 16.3077 212 0.5982 0.7283 0.5982 0.7734
No log 16.4615 214 0.5831 0.6949 0.5831 0.7636
No log 16.6154 216 0.5817 0.5817 0.5817 0.7627
No log 16.7692 218 0.5832 0.6134 0.5832 0.7637
No log 16.9231 220 0.5689 0.6133 0.5689 0.7542
No log 17.0769 222 0.5759 0.6239 0.5759 0.7589
No log 17.2308 224 0.5835 0.6133 0.5835 0.7638
No log 17.3846 226 0.5991 0.5714 0.5991 0.7740
No log 17.5385 228 0.5765 0.6537 0.5765 0.7593
No log 17.6923 230 0.5775 0.6830 0.5775 0.7599
No log 17.8462 232 0.5615 0.6464 0.5615 0.7494
No log 18.0 234 0.5622 0.6350 0.5622 0.7498
No log 18.1538 236 0.5584 0.6417 0.5584 0.7473
No log 18.3077 238 0.5373 0.6500 0.5373 0.7330
No log 18.4615 240 0.5372 0.6575 0.5372 0.7329
No log 18.6154 242 0.6069 0.6072 0.6069 0.7790
No log 18.7692 244 0.7254 0.6413 0.7254 0.8517
No log 18.9231 246 0.7541 0.6055 0.7541 0.8684
No log 19.0769 248 0.7341 0.6175 0.7341 0.8568
No log 19.2308 250 0.6155 0.6780 0.6155 0.7845
No log 19.3846 252 0.5495 0.6764 0.5495 0.7413
No log 19.5385 254 0.5554 0.6764 0.5554 0.7452
No log 19.6923 256 0.5417 0.6619 0.5417 0.7360
No log 19.8462 258 0.5420 0.6460 0.5420 0.7362
No log 20.0 260 0.5511 0.6046 0.5511 0.7424
No log 20.1538 262 0.5556 0.5845 0.5556 0.7454
No log 20.3077 264 0.5543 0.6479 0.5543 0.7445
No log 20.4615 266 0.5785 0.6865 0.5785 0.7606
No log 20.6154 268 0.5913 0.6948 0.5913 0.7690
No log 20.7692 270 0.5814 0.6693 0.5814 0.7625
No log 20.9231 272 0.5713 0.6916 0.5713 0.7558
No log 21.0769 274 0.5974 0.6281 0.5974 0.7729
No log 21.2308 276 0.6067 0.6107 0.6067 0.7789
No log 21.3846 278 0.5770 0.6325 0.5770 0.7596
No log 21.5385 280 0.5663 0.6672 0.5663 0.7526
No log 21.6923 282 0.5757 0.6347 0.5757 0.7587
No log 21.8462 284 0.5810 0.6648 0.5810 0.7622
No log 22.0 286 0.5655 0.6455 0.5655 0.7520
No log 22.1538 288 0.5568 0.6545 0.5568 0.7462
No log 22.3077 290 0.5663 0.6134 0.5663 0.7526
No log 22.4615 292 0.5687 0.5771 0.5687 0.7541
No log 22.6154 294 0.5646 0.5950 0.5646 0.7514
No log 22.7692 296 0.5629 0.6500 0.5629 0.7503
No log 22.9231 298 0.6052 0.6612 0.6052 0.7779
No log 23.0769 300 0.6843 0.6662 0.6843 0.8272
No log 23.2308 302 0.6842 0.6688 0.6842 0.8272
No log 23.3846 304 0.6432 0.6615 0.6432 0.8020
No log 23.5385 306 0.6146 0.6630 0.6146 0.7840
No log 23.6923 308 0.5941 0.6552 0.5941 0.7708
No log 23.8462 310 0.5795 0.6465 0.5795 0.7612
No log 24.0 312 0.6105 0.6402 0.6105 0.7813
No log 24.1538 314 0.6413 0.6421 0.6413 0.8008
No log 24.3077 316 0.5939 0.6215 0.5939 0.7706
No log 24.4615 318 0.5512 0.5771 0.5512 0.7425
No log 24.6154 320 0.5393 0.6753 0.5393 0.7344
No log 24.7692 322 0.5468 0.6641 0.5468 0.7395
No log 24.9231 324 0.5615 0.6929 0.5615 0.7493
No log 25.0769 326 0.5513 0.7253 0.5513 0.7425
No log 25.2308 328 0.5967 0.6818 0.5967 0.7725
No log 25.3846 330 0.6678 0.6427 0.6678 0.8172
No log 25.5385 332 0.6695 0.6450 0.6695 0.8182
No log 25.6923 334 0.6018 0.6899 0.6018 0.7757
No log 25.8462 336 0.5466 0.6830 0.5466 0.7393
No log 26.0 338 0.5785 0.6779 0.5785 0.7606
No log 26.1538 340 0.6052 0.6469 0.6052 0.7779
No log 26.3077 342 0.5917 0.6779 0.5917 0.7692
No log 26.4615 344 0.5733 0.6681 0.5733 0.7572
No log 26.6154 346 0.6050 0.5748 0.6050 0.7778
No log 26.7692 348 0.6243 0.6080 0.6243 0.7901
No log 26.9231 350 0.6025 0.6115 0.6025 0.7762
No log 27.0769 352 0.5709 0.6433 0.5709 0.7556
No log 27.2308 354 0.5720 0.6641 0.5720 0.7563
No log 27.3846 356 0.5662 0.6641 0.5662 0.7525
No log 27.5385 358 0.5768 0.6154 0.5768 0.7595
No log 27.6923 360 0.6460 0.6226 0.6460 0.8037
No log 27.8462 362 0.7496 0.6111 0.7496 0.8658
No log 28.0 364 0.7838 0.5260 0.7838 0.8853
No log 28.1538 366 0.7476 0.5868 0.7476 0.8646
No log 28.3077 368 0.6393 0.6525 0.6393 0.7996
No log 28.4615 370 0.5874 0.6393 0.5874 0.7664
No log 28.6154 372 0.5699 0.6433 0.5699 0.7549
No log 28.7692 374 0.5760 0.6227 0.5760 0.7589
No log 28.9231 376 0.5919 0.6278 0.5919 0.7693
No log 29.0769 378 0.6046 0.6119 0.6046 0.7775
No log 29.2308 380 0.6040 0.5917 0.6040 0.7772
No log 29.3846 382 0.5934 0.6129 0.5934 0.7703
No log 29.5385 384 0.5753 0.5856 0.5753 0.7585
No log 29.6923 386 0.5826 0.6262 0.5826 0.7633
No log 29.8462 388 0.5972 0.6276 0.5972 0.7728
No log 30.0 390 0.5955 0.6262 0.5955 0.7717
No log 30.1538 392 0.5883 0.6433 0.5883 0.7670
No log 30.3077 394 0.5933 0.6046 0.5933 0.7702
No log 30.4615 396 0.5941 0.6262 0.5941 0.7708
No log 30.6154 398 0.5998 0.6276 0.5998 0.7744
No log 30.7692 400 0.6212 0.6361 0.6212 0.7882
No log 30.9231 402 0.6144 0.6328 0.6144 0.7838
No log 31.0769 404 0.5850 0.6397 0.5850 0.7649
No log 31.2308 406 0.5644 0.6157 0.5644 0.7513
No log 31.3846 408 0.5671 0.6619 0.5671 0.7530
No log 31.5385 410 0.5799 0.6636 0.5799 0.7615
No log 31.6923 412 0.5779 0.6890 0.5779 0.7602
No log 31.8462 414 0.5679 0.6593 0.5679 0.7536
No log 32.0 416 0.5616 0.6288 0.5616 0.7494
No log 32.1538 418 0.5638 0.6288 0.5638 0.7509
No log 32.3077 420 0.5669 0.6288 0.5669 0.7529
No log 32.4615 422 0.5690 0.5887 0.5690 0.7543
No log 32.6154 424 0.5768 0.6593 0.5768 0.7595
No log 32.7692 426 0.6088 0.6174 0.6088 0.7803
No log 32.9231 428 0.6137 0.6174 0.6137 0.7834
No log 33.0769 430 0.5867 0.6278 0.5867 0.7660
No log 33.2308 432 0.5761 0.6288 0.5761 0.7590
No log 33.3846 434 0.5969 0.6397 0.5969 0.7726
No log 33.5385 436 0.6030 0.6493 0.6030 0.7765
No log 33.6923 438 0.5910 0.6380 0.5910 0.7688
No log 33.8462 440 0.5638 0.6537 0.5638 0.7509
No log 34.0 442 0.5585 0.6249 0.5585 0.7473
No log 34.1538 444 0.5610 0.6593 0.5610 0.7490
No log 34.3077 446 0.5793 0.6256 0.5793 0.7611
No log 34.4615 448 0.5802 0.6368 0.5802 0.7617
No log 34.6154 450 0.5685 0.6358 0.5685 0.7540
No log 34.7692 452 0.5519 0.6259 0.5519 0.7429
No log 34.9231 454 0.5660 0.6335 0.5660 0.7523
No log 35.0769 456 0.6022 0.5986 0.6022 0.7760
No log 35.2308 458 0.6049 0.5986 0.6049 0.7778
No log 35.3846 460 0.5716 0.6278 0.5716 0.7560
No log 35.5385 462 0.5518 0.6720 0.5518 0.7428
No log 35.6923 464 0.5799 0.7286 0.5799 0.7615
No log 35.8462 466 0.6108 0.6826 0.6108 0.7815
No log 36.0 468 0.5992 0.6661 0.5992 0.7741
No log 36.1538 470 0.5706 0.6659 0.5706 0.7554
No log 36.3077 472 0.5772 0.6680 0.5772 0.7597
No log 36.4615 474 0.5926 0.6455 0.5926 0.7698
No log 36.6154 476 0.5996 0.6455 0.5996 0.7743
No log 36.7692 478 0.5869 0.6356 0.5869 0.7661
No log 36.9231 480 0.5651 0.6500 0.5651 0.7517
No log 37.0769 482 0.5504 0.6680 0.5504 0.7419
No log 37.2308 484 0.5455 0.6720 0.5455 0.7386
No log 37.3846 486 0.5437 0.6330 0.5437 0.7374
No log 37.5385 488 0.5421 0.6442 0.5421 0.7363
No log 37.6923 490 0.5462 0.6442 0.5462 0.7391
No log 37.8462 492 0.5493 0.6770 0.5493 0.7411
No log 38.0 494 0.5538 0.6461 0.5538 0.7442
No log 38.1538 496 0.5648 0.6753 0.5648 0.7515
No log 38.3077 498 0.5904 0.6444 0.5904 0.7684
0.2598 38.4615 500 0.6005 0.6552 0.6005 0.7749
0.2598 38.6154 502 0.6394 0.6552 0.6394 0.7996
0.2598 38.7692 504 0.6688 0.6317 0.6688 0.8178
0.2598 38.9231 506 0.6392 0.6411 0.6392 0.7995
0.2598 39.0769 508 0.6166 0.6215 0.6166 0.7852
0.2598 39.2308 510 0.5802 0.6099 0.5802 0.7617
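
The validation columns above (QWK, MSE, RMSE) can be reproduced along the lines of the sketch below. This is an assumption about the evaluation setup, not the card's actual `compute_metrics`: it treats the gold and predicted organization scores as numeric values and rounds them to integers before computing the quadratic weighted kappa.

```python
# Hedged metric sketch: QWK on rounded scores, plus MSE/RMSE on raw values.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds, labels):
    """Return the card's metrics for predicted vs. gold scores."""
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(labels.round().astype(int),
                            preds.round().astype(int),
                            weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Example: compute_metrics([2.1, 3.0, 1.2], [2, 3, 1])
```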

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1