ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5889
  • Qwk: 0.5323
  • Mse: 0.5889
  • Rmse: 0.7674
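
The metrics above can be reproduced from predictions with scikit-learn; as a sketch (the labels below are illustrative, not from the actual evaluation set):

```python
# Sketch: how the reported metrics (Qwk, Mse, Rmse) relate to predictions.
# The example scores are illustrative, not from the eval set.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 3, 2, 1])
y_pred = np.array([0, 1, 2, 2, 2, 0])

# Quadratic weighted kappa: agreement on ordinal scores, penalising
# larger disagreements quadratically.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # note Rmse = sqrt(Mse), as in the results above
```

Note that Mse equals the validation loss throughout the table below, which suggests the model was trained with a mean-squared-error objective.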

Model description

More information needed

Intended uses & limitations

More information needed
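
Pending proper documentation, a hedged usage sketch follows. The sequence-classification head and the meaning of the raw logits are assumptions, since the card does not state the task format:

```python
# Hypothetical usage sketch: score an essay's organization with this
# checkpoint. Assumptions (not documented in this card): the model carries
# a sequence-classification head and its raw logits are the score(s).
def score_organization(
    text: str,
    model_id: str = (
        "MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_"
        "FineTuningAraBERT_run2_AugV5_k8_task2_organization"
    ),
):
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.squeeze().tolist()
```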

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
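
The "linear" scheduler decays the learning rate from its initial value to zero over the total number of optimizer steps. A minimal sketch, with warmup omitted and the step count inferred from the log below (44 steps per epoch, 100 epochs):

```python
# Sketch of the "linear" lr_scheduler_type: the learning rate decays
# linearly from base_lr to zero over total_steps (no warmup shown).
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-5) -> float:
    return base_lr * max(0.0, 1.0 - step / total_steps)

# e.g. with 100 epochs at ~44 optimizer steps/epoch, as the log suggests:
total_steps = 4400
start = linear_lr(0, total_steps)       # 2e-05
halfway = linear_lr(2200, total_steps)  # 1e-05
end = linear_lr(4400, total_steps)      # 0.0
```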

Training results

Validation metrics are reported every two steps. The training loss is logged only every 500 steps, so rows before step 500 show "No log"; the first logged value (0.3486) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0455 2 4.4033 -0.0538 4.4033 2.0984
No log 0.0909 4 2.3087 0.0288 2.3087 1.5194
No log 0.1364 6 1.2428 0.0361 1.2428 1.1148
No log 0.1818 8 1.0802 -0.0175 1.0802 1.0393
No log 0.2273 10 1.1948 0.0909 1.1948 1.0931
No log 0.2727 12 1.0569 0.0674 1.0569 1.0280
No log 0.3182 14 1.0328 0.0727 1.0328 1.0163
No log 0.3636 16 1.2413 0.1739 1.2413 1.1142
No log 0.4091 18 1.8109 0.1215 1.8109 1.3457
No log 0.4545 20 1.4325 0.1563 1.4325 1.1969
No log 0.5 22 0.8889 0.2263 0.8889 0.9428
No log 0.5455 24 0.6830 0.3714 0.6830 0.8265
No log 0.5909 26 0.6340 0.4111 0.6340 0.7962
No log 0.6364 28 0.6108 0.4212 0.6108 0.7815
No log 0.6818 30 0.6854 0.4688 0.6854 0.8279
No log 0.7273 32 1.1176 0.3067 1.1176 1.0572
No log 0.7727 34 1.0687 0.2424 1.0687 1.0338
No log 0.8182 36 0.9908 0.2144 0.9908 0.9954
No log 0.8636 38 0.9330 0.1723 0.9330 0.9659
No log 0.9091 40 1.1090 0.1027 1.1090 1.0531
No log 0.9545 42 1.1448 0.1079 1.1448 1.0700
No log 1.0 44 0.8808 0.2453 0.8808 0.9385
No log 1.0455 46 0.6092 0.3923 0.6092 0.7805
No log 1.0909 48 0.6660 0.4096 0.6660 0.8161
No log 1.1364 50 0.6824 0.4211 0.6824 0.8261
No log 1.1818 52 0.5946 0.4260 0.5946 0.7711
No log 1.2273 54 0.6795 0.3616 0.6795 0.8243
No log 1.2727 56 0.6125 0.4061 0.6125 0.7827
No log 1.3182 58 0.6032 0.4881 0.6032 0.7767
No log 1.3636 60 0.5920 0.4963 0.5920 0.7694
No log 1.4091 62 0.7287 0.4563 0.7287 0.8536
No log 1.4545 64 1.1781 0.3566 1.1781 1.0854
No log 1.5 66 1.1201 0.3789 1.1201 1.0583
No log 1.5455 68 0.6920 0.5588 0.6920 0.8319
No log 1.5909 70 0.6277 0.5505 0.6277 0.7923
No log 1.6364 72 0.6089 0.5525 0.6089 0.7803
No log 1.6818 74 0.6083 0.5357 0.6083 0.7800
No log 1.7273 76 0.6958 0.5555 0.6958 0.8341
No log 1.7727 78 1.1820 0.4007 1.1820 1.0872
No log 1.8182 80 1.1266 0.4343 1.1266 1.0614
No log 1.8636 82 0.7125 0.5977 0.7125 0.8441
No log 1.9091 84 0.6431 0.5241 0.6431 0.8019
No log 1.9545 86 0.6329 0.4986 0.6329 0.7955
No log 2.0 88 0.8788 0.4862 0.8788 0.9374
No log 2.0455 90 1.0403 0.3973 1.0403 1.0200
No log 2.0909 92 0.7697 0.5436 0.7697 0.8773
No log 2.1364 94 0.5988 0.5274 0.5988 0.7738
No log 2.1818 96 0.5804 0.5316 0.5804 0.7618
No log 2.2273 98 0.5692 0.5320 0.5692 0.7545
No log 2.2727 100 0.5692 0.5427 0.5692 0.7545
No log 2.3182 102 0.5878 0.5030 0.5878 0.7667
No log 2.3636 104 0.5802 0.5391 0.5802 0.7617
No log 2.4091 106 0.6295 0.4990 0.6295 0.7934
No log 2.4545 108 0.6622 0.5281 0.6622 0.8138
No log 2.5 110 0.6207 0.5313 0.6207 0.7878
No log 2.5455 112 0.6344 0.5236 0.6344 0.7965
No log 2.5909 114 0.6612 0.5731 0.6612 0.8132
No log 2.6364 116 0.9994 0.4444 0.9994 0.9997
No log 2.6818 118 0.9680 0.4552 0.9680 0.9839
No log 2.7273 120 0.7574 0.5393 0.7574 0.8703
No log 2.7727 122 0.6920 0.5772 0.6920 0.8319
No log 2.8182 124 0.6820 0.5152 0.6820 0.8259
No log 2.8636 126 0.6941 0.4657 0.6941 0.8331
No log 2.9091 128 0.8589 0.4668 0.8589 0.9268
No log 2.9545 130 0.7585 0.4588 0.7585 0.8709
No log 3.0 132 0.6715 0.5005 0.6715 0.8194
No log 3.0455 134 0.7463 0.5207 0.7463 0.8639
No log 3.0909 136 0.6932 0.5227 0.6932 0.8326
No log 3.1364 138 0.8267 0.4726 0.8267 0.9092
No log 3.1818 140 0.9203 0.4098 0.9203 0.9593
No log 3.2273 142 0.8558 0.4876 0.8558 0.9251
No log 3.2727 144 0.7390 0.5286 0.7390 0.8597
No log 3.3182 146 0.6446 0.4648 0.6446 0.8029
No log 3.3636 148 0.6473 0.4636 0.6473 0.8045
No log 3.4091 150 0.6271 0.4144 0.6271 0.7919
No log 3.4545 152 0.6477 0.4415 0.6477 0.8048
No log 3.5 154 0.6561 0.4426 0.6561 0.8100
No log 3.5455 156 0.6944 0.5114 0.6944 0.8333
No log 3.5909 158 0.7083 0.5028 0.7083 0.8416
No log 3.6364 160 0.7219 0.4999 0.7219 0.8496
No log 3.6818 162 0.7577 0.5039 0.7577 0.8704
No log 3.7273 164 0.7012 0.5458 0.7012 0.8374
No log 3.7727 166 0.7922 0.4948 0.7922 0.8900
No log 3.8182 168 0.8855 0.4421 0.8855 0.9410
No log 3.8636 170 0.7425 0.4990 0.7425 0.8617
No log 3.9091 172 0.6592 0.5179 0.6592 0.8119
No log 3.9545 174 0.7246 0.5030 0.7246 0.8512
No log 4.0 176 0.6534 0.5410 0.6534 0.8083
No log 4.0455 178 0.6907 0.4705 0.6907 0.8311
No log 4.0909 180 0.8960 0.4224 0.8960 0.9466
No log 4.1364 182 0.8901 0.4297 0.8901 0.9435
No log 4.1818 184 0.6941 0.4546 0.6941 0.8331
No log 4.2273 186 0.6441 0.4599 0.6441 0.8025
No log 4.2727 188 0.6551 0.5421 0.6551 0.8094
No log 4.3182 190 0.6616 0.5282 0.6616 0.8134
No log 4.3636 192 0.7745 0.5163 0.7745 0.8800
No log 4.4091 194 0.8546 0.5011 0.8546 0.9244
No log 4.4545 196 0.7713 0.4924 0.7713 0.8783
No log 4.5 198 0.7163 0.4858 0.7163 0.8464
No log 4.5455 200 0.7447 0.4037 0.7447 0.8629
No log 4.5909 202 0.7205 0.3885 0.7205 0.8488
No log 4.6364 204 0.6594 0.4171 0.6594 0.8120
No log 4.6818 206 0.7172 0.4723 0.7172 0.8469
No log 4.7273 208 0.7432 0.5260 0.7432 0.8621
No log 4.7727 210 0.6747 0.5201 0.6747 0.8214
No log 4.8182 212 0.7873 0.4621 0.7873 0.8873
No log 4.8636 214 0.9349 0.4449 0.9349 0.9669
No log 4.9091 216 0.8006 0.4881 0.8006 0.8947
No log 4.9545 218 0.6345 0.4753 0.6345 0.7966
No log 5.0 220 0.7377 0.4855 0.7377 0.8589
No log 5.0455 222 0.7726 0.5013 0.7726 0.8790
No log 5.0909 224 0.6606 0.5276 0.6606 0.8128
No log 5.1364 226 0.6328 0.5414 0.6328 0.7955
No log 5.1818 228 0.6817 0.5085 0.6817 0.8257
No log 5.2273 230 0.6813 0.5503 0.6813 0.8254
No log 5.2727 232 0.6295 0.5336 0.6295 0.7934
No log 5.3182 234 0.6250 0.5318 0.6250 0.7906
No log 5.3636 236 0.6101 0.5373 0.6101 0.7811
No log 5.4091 238 0.6317 0.5170 0.6317 0.7948
No log 5.4545 240 0.6610 0.5425 0.6610 0.8130
No log 5.5 242 0.6394 0.5777 0.6394 0.7996
No log 5.5455 244 0.6321 0.5421 0.6321 0.7950
No log 5.5909 246 0.6207 0.5521 0.6207 0.7878
No log 5.6364 248 0.6070 0.5623 0.6070 0.7791
No log 5.6818 250 0.6041 0.5221 0.6041 0.7773
No log 5.7273 252 0.6121 0.5397 0.6121 0.7824
No log 5.7727 254 0.6140 0.5531 0.6140 0.7836
No log 5.8182 256 0.6139 0.5930 0.6139 0.7835
No log 5.8636 258 0.6031 0.5601 0.6031 0.7766
No log 5.9091 260 0.6053 0.5285 0.6053 0.7780
No log 5.9545 262 0.6116 0.5563 0.6116 0.7821
No log 6.0 264 0.6215 0.5379 0.6215 0.7884
No log 6.0455 266 0.7053 0.4936 0.7053 0.8398
No log 6.0909 268 0.7237 0.4936 0.7237 0.8507
No log 6.1364 270 0.7920 0.4654 0.7920 0.8899
No log 6.1818 272 1.0079 0.4543 1.0079 1.0040
No log 6.2273 274 1.1400 0.4045 1.1400 1.0677
No log 6.2727 276 0.9443 0.4805 0.9443 0.9718
No log 6.3182 278 0.7015 0.5482 0.7015 0.8375
No log 6.3636 280 0.6383 0.5582 0.6383 0.7990
No log 6.4091 282 0.6221 0.5645 0.6221 0.7887
No log 6.4545 284 0.6249 0.5016 0.6249 0.7905
No log 6.5 286 0.6038 0.4511 0.6038 0.7771
No log 6.5455 288 0.5781 0.5251 0.5781 0.7604
No log 6.5909 290 0.5810 0.5272 0.5810 0.7623
No log 6.6364 292 0.5753 0.5421 0.5753 0.7585
No log 6.6818 294 0.5873 0.4983 0.5873 0.7664
No log 6.7273 296 0.5982 0.5069 0.5982 0.7734
No log 6.7727 298 0.5980 0.5698 0.5980 0.7733
No log 6.8182 300 0.6651 0.5410 0.6651 0.8155
No log 6.8636 302 0.6810 0.5344 0.6810 0.8253
No log 6.9091 304 0.6363 0.4926 0.6363 0.7977
No log 6.9545 306 0.6089 0.4659 0.6089 0.7804
No log 7.0 308 0.6079 0.4414 0.6079 0.7797
No log 7.0455 310 0.6146 0.4392 0.6146 0.7840
No log 7.0909 312 0.6184 0.4592 0.6184 0.7864
No log 7.1364 314 0.6822 0.5173 0.6822 0.8259
No log 7.1818 316 0.6787 0.4942 0.6787 0.8239
No log 7.2273 318 0.6192 0.5203 0.6192 0.7869
No log 7.2727 320 0.6235 0.4834 0.6235 0.7896
No log 7.3182 322 0.6527 0.4650 0.6527 0.8079
No log 7.3636 324 0.6243 0.4533 0.6243 0.7901
No log 7.4091 326 0.6160 0.4533 0.6160 0.7848
No log 7.4545 328 0.6270 0.4797 0.6270 0.7919
No log 7.5 330 0.7291 0.4902 0.7291 0.8539
No log 7.5455 332 0.7224 0.5027 0.7224 0.8500
No log 7.5909 334 0.6219 0.5171 0.6219 0.7886
No log 7.6364 336 0.5974 0.5471 0.5974 0.7729
No log 7.6818 338 0.6089 0.5282 0.6089 0.7803
No log 7.7273 340 0.6117 0.5480 0.6117 0.7821
No log 7.7727 342 0.6733 0.5403 0.6733 0.8205
No log 7.8182 344 0.7453 0.4913 0.7453 0.8633
No log 7.8636 346 0.7504 0.4913 0.7504 0.8663
No log 7.9091 348 0.6930 0.5561 0.6930 0.8325
No log 7.9545 350 0.6527 0.5566 0.6527 0.8079
No log 8.0 352 0.6665 0.5386 0.6665 0.8164
No log 8.0455 354 0.7263 0.4885 0.7263 0.8522
No log 8.0909 356 0.7288 0.4896 0.7288 0.8537
No log 8.1364 358 0.6450 0.4754 0.6450 0.8031
No log 8.1818 360 0.5817 0.4864 0.5817 0.7627
No log 8.2273 362 0.5714 0.5144 0.5714 0.7559
No log 8.2727 364 0.5934 0.5107 0.5934 0.7703
No log 8.3182 366 0.5956 0.5332 0.5956 0.7717
No log 8.3636 368 0.5572 0.5876 0.5572 0.7464
No log 8.4091 370 0.5492 0.5544 0.5492 0.7411
No log 8.4545 372 0.5473 0.5470 0.5473 0.7398
No log 8.5 374 0.5521 0.5684 0.5521 0.7430
No log 8.5455 376 0.5757 0.5651 0.5757 0.7588
No log 8.5909 378 0.5707 0.5627 0.5707 0.7555
No log 8.6364 380 0.5654 0.5320 0.5654 0.7519
No log 8.6818 382 0.5681 0.5578 0.5681 0.7537
No log 8.7273 384 0.5694 0.5451 0.5694 0.7546
No log 8.7727 386 0.5706 0.5045 0.5706 0.7554
No log 8.8182 388 0.5693 0.5127 0.5693 0.7545
No log 8.8636 390 0.5680 0.5408 0.5680 0.7536
No log 8.9091 392 0.6025 0.5017 0.6025 0.7762
No log 8.9545 394 0.5974 0.4651 0.5974 0.7729
No log 9.0 396 0.5916 0.5173 0.5916 0.7692
No log 9.0455 398 0.5865 0.5385 0.5865 0.7658
No log 9.0909 400 0.5708 0.5172 0.5708 0.7555
No log 9.1364 402 0.5645 0.5046 0.5645 0.7513
No log 9.1818 404 0.5666 0.5183 0.5666 0.7528
No log 9.2273 406 0.5724 0.4777 0.5724 0.7565
No log 9.2727 408 0.5747 0.5158 0.5747 0.7581
No log 9.3182 410 0.5900 0.5005 0.5900 0.7681
No log 9.3636 412 0.5965 0.5089 0.5965 0.7723
No log 9.4091 414 0.5992 0.5153 0.5992 0.7741
No log 9.4545 416 0.6150 0.5476 0.6150 0.7842
No log 9.5 418 0.6291 0.5481 0.6291 0.7931
No log 9.5455 420 0.6125 0.5481 0.6125 0.7826
No log 9.5909 422 0.6059 0.5656 0.6059 0.7784
No log 9.6364 424 0.5984 0.5323 0.5984 0.7736
No log 9.6818 426 0.6042 0.5421 0.6042 0.7773
No log 9.7273 428 0.6061 0.5506 0.6061 0.7785
No log 9.7727 430 0.6154 0.5918 0.6154 0.7845
No log 9.8182 432 0.6172 0.5787 0.6172 0.7856
No log 9.8636 434 0.6036 0.5915 0.6036 0.7769
No log 9.9091 436 0.6062 0.5855 0.6062 0.7786
No log 9.9545 438 0.6077 0.5486 0.6077 0.7796
No log 10.0 440 0.5865 0.5704 0.5865 0.7658
No log 10.0455 442 0.5846 0.5930 0.5846 0.7646
No log 10.0909 444 0.5907 0.5613 0.5907 0.7686
No log 10.1364 446 0.5961 0.5833 0.5961 0.7720
No log 10.1818 448 0.5985 0.5927 0.5985 0.7736
No log 10.2273 450 0.5925 0.5706 0.5925 0.7697
No log 10.2727 452 0.5816 0.5678 0.5816 0.7626
No log 10.3182 454 0.5766 0.5073 0.5766 0.7593
No log 10.3636 456 0.5694 0.5514 0.5694 0.7546
No log 10.4091 458 0.6255 0.4734 0.6255 0.7909
No log 10.4545 460 0.6831 0.4384 0.6831 0.8265
No log 10.5 462 0.6729 0.4394 0.6729 0.8203
No log 10.5455 464 0.5944 0.4868 0.5944 0.7710
No log 10.5909 466 0.5620 0.5314 0.5620 0.7497
No log 10.6364 468 0.5582 0.5638 0.5582 0.7471
No log 10.6818 470 0.5757 0.5263 0.5757 0.7588
No log 10.7273 472 0.6025 0.4758 0.6025 0.7762
No log 10.7727 474 0.5964 0.4939 0.5964 0.7723
No log 10.8182 476 0.6207 0.5742 0.6207 0.7878
No log 10.8636 478 0.6519 0.5383 0.6519 0.8074
No log 10.9091 480 0.6894 0.5426 0.6894 0.8303
No log 10.9545 482 0.7172 0.5463 0.7172 0.8469
No log 11.0 484 0.6856 0.5680 0.6856 0.8280
No log 11.0455 486 0.6338 0.5665 0.6338 0.7961
No log 11.0909 488 0.6416 0.4742 0.6416 0.8010
No log 11.1364 490 0.6449 0.4620 0.6449 0.8030
No log 11.1818 492 0.6210 0.4518 0.6210 0.7880
No log 11.2273 494 0.5882 0.4857 0.5882 0.7670
No log 11.2727 496 0.5817 0.5665 0.5817 0.7627
No log 11.3182 498 0.5974 0.5476 0.5974 0.7729
0.3486 11.3636 500 0.6440 0.5858 0.6440 0.8025
0.3486 11.4091 502 0.7294 0.5243 0.7294 0.8541
0.3486 11.4545 504 0.7317 0.4901 0.7317 0.8554
0.3486 11.5 506 0.6473 0.5478 0.6473 0.8045
0.3486 11.5455 508 0.5977 0.5444 0.5977 0.7731
0.3486 11.5909 510 0.5889 0.5323 0.5889 0.7674

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
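
To reproduce this environment, the versions above can be pinned; the CUDA 11.8 wheel index is an assumption inferred from the `+cu118` build tag:

```shell
# Pin the framework versions listed above; the cu118 index URL is
# inferred from the "+cu118" PyTorch build tag, not stated in the card.
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
```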
Model size

  • 0.1B params (Safetensors, F32 tensors)
