ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not documented. It achieves the following results on the evaluation set:

  • Loss: 0.5775
  • QWK (quadratic weighted kappa): 0.6470
  • MSE: 0.5775
  • RMSE: 0.7599
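
The loss and the MSE are identical, which suggests the model was trained as a regressor with a mean-squared-error objective. The snippet below is a minimal sketch of how these metrics can be reproduced with scikit-learn; `y_true` and `y_pred` are hypothetical stand-ins for the evaluation labels and the model's (rounded) predictions.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])  # gold organization scores (illustrative values)
y_pred = np.array([3, 2, 3, 2, 4])  # model predictions rounded to integer scores

# Quadratic weighted kappa: agreement between two integer ratings,
# penalizing larger disagreements quadratically.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(f"QWK: {qwk:.4f}  MSE: {mse:.4f}  RMSE: {rmse:.4f}")
```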

Model description

More information needed

Intended uses & limitations

More information needed
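
In the absence of documented usage guidance, the sketch below shows one plausible way to run inference. Everything here is an assumption: that the checkpoint is hosted under the repo id below, that it exposes a single-output regression head for essay-organization scoring (consistent with the loss equalling the MSE above), and that inputs fit within 512 tokens.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repo id, taken from the model name above.
repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task5_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```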

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
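
For reference, here is a sketch of these settings expressed as Hugging Face TrainingArguments. The output path is a hypothetical placeholder and the model/data wiring is omitted; the listed Adam betas and epsilon match the Trainer's default AdamW configuration, so no explicit optimizer setup is needed.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```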

Training results

The training loss is logged every 500 steps, so rows before step 500 show "No log". Columns are: training loss, epoch, step, validation loss, QWK, MSE, and RMSE.

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0426 2 4.0568 -0.0174 4.0568 2.0142
No log 0.0851 4 2.4043 0.0014 2.4043 1.5506
No log 0.1277 6 2.0332 -0.0111 2.0332 1.4259
No log 0.1702 8 1.1460 0.1426 1.1460 1.0705
No log 0.2128 10 1.1926 0.0293 1.1926 1.0920
No log 0.2553 12 1.6686 -0.0858 1.6686 1.2917
No log 0.2979 14 1.6574 -0.0858 1.6574 1.2874
No log 0.3404 16 1.4189 -0.0858 1.4189 1.1912
No log 0.3830 18 1.2863 -0.0477 1.2863 1.1341
No log 0.4255 20 1.1859 0.1176 1.1859 1.0890
No log 0.4681 22 1.1302 0.2692 1.1302 1.0631
No log 0.5106 24 1.1241 0.2787 1.1241 1.0602
No log 0.5532 26 1.1141 0.2912 1.1141 1.0555
No log 0.5957 28 1.1284 0.1389 1.1284 1.0623
No log 0.6383 30 1.1268 0.1631 1.1268 1.0615
No log 0.6809 32 1.1620 0.0938 1.1620 1.0779
No log 0.7234 34 1.1801 0.0466 1.1801 1.0863
No log 0.7660 36 1.1574 0.1296 1.1574 1.0758
No log 0.8085 38 1.1175 0.1509 1.1175 1.0571
No log 0.8511 40 1.1649 0.1711 1.1649 1.0793
No log 0.8936 42 1.2084 0.1591 1.2084 1.0993
No log 0.9362 44 1.2767 0.0318 1.2767 1.1299
No log 0.9787 46 1.3148 0.0701 1.3148 1.1466
No log 1.0213 48 1.4443 -0.0032 1.4443 1.2018
No log 1.0638 50 1.6010 0.0380 1.6010 1.2653
No log 1.1064 52 1.4612 0.0760 1.4612 1.2088
No log 1.1489 54 1.1901 0.1433 1.1901 1.0909
No log 1.1915 56 1.0081 0.3577 1.0081 1.0041
No log 1.2340 58 0.9894 0.2370 0.9894 0.9947
No log 1.2766 60 1.0339 0.2478 1.0339 1.0168
No log 1.3191 62 0.9449 0.3428 0.9449 0.9721
No log 1.3617 64 1.0186 0.3738 1.0186 1.0092
No log 1.4043 66 1.1030 0.3521 1.1030 1.0503
No log 1.4468 68 0.9725 0.3391 0.9725 0.9862
No log 1.4894 70 0.9160 0.3937 0.9160 0.9571
No log 1.5319 72 0.9540 0.3969 0.9540 0.9767
No log 1.5745 74 0.9525 0.4175 0.9525 0.9760
No log 1.6170 76 1.2637 0.3528 1.2637 1.1242
No log 1.6596 78 1.5836 0.2086 1.5836 1.2584
No log 1.7021 80 1.3278 0.3646 1.3278 1.1523
No log 1.7447 82 1.0764 0.3681 1.0764 1.0375
No log 1.7872 84 1.2287 0.3400 1.2287 1.1085
No log 1.8298 86 1.2764 0.3373 1.2764 1.1298
No log 1.8723 88 1.2593 0.3336 1.2593 1.1222
No log 1.9149 90 0.9158 0.3864 0.9158 0.9570
No log 1.9574 92 0.7083 0.4854 0.7083 0.8416
No log 2.0 94 0.6711 0.4981 0.6711 0.8192
No log 2.0426 96 0.6570 0.5516 0.6570 0.8106
No log 2.0851 98 0.6746 0.5385 0.6746 0.8214
No log 2.1277 100 0.6715 0.5695 0.6715 0.8194
No log 2.1702 102 0.6270 0.5485 0.6270 0.7918
No log 2.2128 104 0.6436 0.5391 0.6436 0.8023
No log 2.2553 106 0.6519 0.5966 0.6519 0.8074
No log 2.2979 108 0.6434 0.5698 0.6434 0.8022
No log 2.3404 110 0.6132 0.5388 0.6132 0.7830
No log 2.3830 112 0.6054 0.6006 0.6054 0.7781
No log 2.4255 114 0.6507 0.5568 0.6507 0.8066
No log 2.4681 116 0.6107 0.6217 0.6107 0.7815
No log 2.5106 118 0.5713 0.6087 0.5713 0.7558
No log 2.5532 120 0.5476 0.5917 0.5476 0.7400
No log 2.5957 122 0.5708 0.6517 0.5708 0.7555
No log 2.6383 124 0.5938 0.6593 0.5938 0.7706
No log 2.6809 126 0.6029 0.6175 0.6029 0.7765
No log 2.7234 128 0.5990 0.5943 0.5990 0.7740
No log 2.7660 130 0.6322 0.5811 0.6322 0.7951
No log 2.8085 132 0.6248 0.5529 0.6248 0.7905
No log 2.8511 134 0.9085 0.5190 0.9085 0.9532
No log 2.8936 136 1.0124 0.4757 1.0124 1.0062
No log 2.9362 138 0.7140 0.5375 0.7140 0.8450
No log 2.9787 140 0.6816 0.4963 0.6816 0.8256
No log 3.0213 142 0.6921 0.4546 0.6921 0.8319
No log 3.0638 144 0.7280 0.4546 0.7280 0.8532
No log 3.1064 146 0.6962 0.4697 0.6962 0.8344
No log 3.1489 148 0.6866 0.4943 0.6866 0.8286
No log 3.1915 150 0.7252 0.4792 0.7252 0.8516
No log 3.2340 152 0.7783 0.5217 0.7783 0.8822
No log 3.2766 154 0.7228 0.5217 0.7228 0.8502
No log 3.3191 156 0.6205 0.5580 0.6205 0.7877
No log 3.3617 158 0.6317 0.4783 0.6317 0.7948
No log 3.4043 160 0.6539 0.4641 0.6539 0.8086
No log 3.4468 162 0.6815 0.4606 0.6815 0.8255
No log 3.4894 164 0.6980 0.4444 0.6980 0.8355
No log 3.5319 166 0.6953 0.4589 0.6953 0.8339
No log 3.5745 168 0.7034 0.4512 0.7034 0.8387
No log 3.6170 170 0.6760 0.4783 0.6760 0.8222
No log 3.6596 172 0.6699 0.4935 0.6699 0.8185
No log 3.7021 174 0.6846 0.5684 0.6846 0.8274
No log 3.7447 176 0.7563 0.6150 0.7563 0.8696
No log 3.7872 178 0.6604 0.5327 0.6604 0.8127
No log 3.8298 180 0.6797 0.5504 0.6797 0.8245
No log 3.8723 182 0.7258 0.5370 0.7258 0.8519
No log 3.9149 184 0.6601 0.5647 0.6601 0.8124
No log 3.9574 186 1.1192 0.4030 1.1192 1.0579
No log 4.0 188 1.9789 0.2171 1.9789 1.4067
No log 4.0426 190 2.1234 0.2072 2.1234 1.4572
No log 4.0851 192 1.6632 0.2259 1.6632 1.2896
No log 4.1277 194 0.9633 0.3243 0.9633 0.9815
No log 4.1702 196 0.7306 0.5056 0.7306 0.8548
No log 4.2128 198 0.9929 0.4641 0.9929 0.9965
No log 4.2553 200 0.9257 0.4420 0.9257 0.9621
No log 4.2979 202 0.7307 0.4949 0.7307 0.8548
No log 4.3404 204 0.6744 0.5153 0.6744 0.8212
No log 4.3830 206 0.6821 0.5183 0.6821 0.8259
No log 4.4255 208 0.6567 0.5033 0.6567 0.8104
No log 4.4681 210 0.7195 0.4952 0.7195 0.8482
No log 4.5106 212 0.7875 0.4818 0.7875 0.8874
No log 4.5532 214 0.9549 0.4987 0.9549 0.9772
No log 4.5957 216 1.0347 0.4277 1.0347 1.0172
No log 4.6383 218 0.9151 0.3453 0.9151 0.9566
No log 4.6809 220 0.7909 0.3960 0.7909 0.8893
No log 4.7234 222 0.7996 0.5441 0.7996 0.8942
No log 4.7660 224 0.8491 0.5495 0.8491 0.9214
No log 4.8085 226 0.7432 0.5706 0.7432 0.8621
No log 4.8511 228 0.6718 0.5346 0.6718 0.8196
No log 4.8936 230 0.7806 0.4921 0.7806 0.8835
No log 4.9362 232 0.7530 0.5137 0.7530 0.8678
No log 4.9787 234 0.6739 0.5206 0.6739 0.8209
No log 5.0213 236 0.6470 0.5495 0.6470 0.8043
No log 5.0638 238 0.6607 0.5703 0.6607 0.8128
No log 5.1064 240 0.6766 0.6047 0.6766 0.8225
No log 5.1489 242 0.7677 0.5708 0.7677 0.8762
No log 5.1915 244 0.7277 0.5928 0.7277 0.8530
No log 5.2340 246 0.7471 0.5898 0.7471 0.8643
No log 5.2766 248 0.7970 0.5379 0.7970 0.8927
No log 5.3191 250 0.7093 0.5568 0.7093 0.8422
No log 5.3617 252 0.7368 0.5199 0.7368 0.8584
No log 5.4043 254 0.9372 0.4729 0.9372 0.9681
No log 5.4468 256 1.0368 0.3944 1.0368 1.0182
No log 5.4894 258 0.9740 0.4961 0.9740 0.9869
No log 5.5319 260 0.8651 0.4021 0.8651 0.9301
No log 5.5745 262 0.8402 0.4738 0.8402 0.9166
No log 5.6170 264 0.8397 0.4738 0.8397 0.9164
No log 5.6596 266 0.7944 0.5098 0.7944 0.8913
No log 5.7021 268 0.8273 0.5098 0.8273 0.9095
No log 5.7447 270 0.9823 0.4638 0.9823 0.9911
No log 5.7872 272 0.8887 0.4945 0.8887 0.9427
No log 5.8298 274 0.6504 0.5795 0.6504 0.8065
No log 5.8723 276 0.6580 0.5304 0.6580 0.8112
No log 5.9149 278 0.7385 0.5636 0.7385 0.8594
No log 5.9574 280 0.6768 0.5076 0.6768 0.8227
No log 6.0 282 0.6467 0.6252 0.6467 0.8042
No log 6.0426 284 0.6450 0.6024 0.6450 0.8031
No log 6.0851 286 0.6741 0.5938 0.6741 0.8210
No log 6.1277 288 0.7150 0.5288 0.7150 0.8456
No log 6.1702 290 0.7526 0.5153 0.7526 0.8675
No log 6.2128 292 0.7679 0.4919 0.7679 0.8763
No log 6.2553 294 0.7543 0.4903 0.7543 0.8685
No log 6.2979 296 0.7623 0.5088 0.7623 0.8731
No log 6.3404 298 0.8085 0.4382 0.8085 0.8992
No log 6.3830 300 0.7912 0.4501 0.7912 0.8895
No log 6.4255 302 0.7258 0.5703 0.7258 0.8520
No log 6.4681 304 0.7086 0.5823 0.7086 0.8418
No log 6.5106 306 0.7390 0.4109 0.7390 0.8597
No log 6.5532 308 0.7498 0.4109 0.7498 0.8659
No log 6.5957 310 0.6673 0.5464 0.6673 0.8169
No log 6.6383 312 0.6837 0.5455 0.6837 0.8269
No log 6.6809 314 0.7121 0.4850 0.7121 0.8439
No log 6.7234 316 0.6954 0.5345 0.6954 0.8339
No log 6.7660 318 0.7367 0.4977 0.7367 0.8583
No log 6.8085 320 0.9491 0.4961 0.9491 0.9742
No log 6.8511 322 0.9181 0.5167 0.9181 0.9582
No log 6.8936 324 0.7107 0.4984 0.7107 0.8430
No log 6.9362 326 0.6631 0.5017 0.6631 0.8143
No log 6.9787 328 0.7239 0.5385 0.7239 0.8508
No log 7.0213 330 0.7848 0.5675 0.7848 0.8859
No log 7.0638 332 0.7171 0.5707 0.7171 0.8468
No log 7.1064 334 0.6339 0.5510 0.6339 0.7962
No log 7.1489 336 0.7130 0.5242 0.7130 0.8444
No log 7.1915 338 0.7325 0.5242 0.7325 0.8558
No log 7.2340 340 0.6837 0.5084 0.6837 0.8269
No log 7.2766 342 0.7480 0.4709 0.7480 0.8648
No log 7.3191 344 0.8868 0.4568 0.8868 0.9417
No log 7.3617 346 0.8091 0.4911 0.8091 0.8995
No log 7.4043 348 0.6718 0.6397 0.6718 0.8197
No log 7.4468 350 0.6912 0.5112 0.6912 0.8314
No log 7.4894 352 0.7492 0.5381 0.7492 0.8656
No log 7.5319 354 0.7389 0.5112 0.7389 0.8596
No log 7.5745 356 0.7881 0.3860 0.7881 0.8877
No log 7.6170 358 0.8684 0.3551 0.8684 0.9319
No log 7.6596 360 0.8395 0.3556 0.8395 0.9163
No log 7.7021 362 0.8046 0.3960 0.8046 0.8970
No log 7.7447 364 0.7960 0.4781 0.7960 0.8922
No log 7.7872 366 0.7676 0.4903 0.7676 0.8761
No log 7.8298 368 0.7555 0.4345 0.7555 0.8692
No log 7.8723 370 0.7322 0.4610 0.7322 0.8557
No log 7.9149 372 0.7315 0.4227 0.7315 0.8553
No log 7.9574 374 0.7184 0.3861 0.7184 0.8476
No log 8.0 376 0.6792 0.5153 0.6792 0.8241
No log 8.0426 378 0.6868 0.5153 0.6868 0.8287
No log 8.0851 380 0.6896 0.5168 0.6896 0.8304
No log 8.1277 382 0.7034 0.4537 0.7034 0.8387
No log 8.1702 384 0.7026 0.4935 0.7026 0.8382
No log 8.2128 386 0.7081 0.5088 0.7081 0.8415
No log 8.2553 388 0.6940 0.5331 0.6940 0.8331
No log 8.2979 390 0.6644 0.5168 0.6644 0.8151
No log 8.3404 392 0.6766 0.4745 0.6766 0.8226
No log 8.3830 394 0.6919 0.4727 0.6919 0.8318
No log 8.4255 396 0.6366 0.4760 0.6366 0.7979
No log 8.4681 398 0.6078 0.5274 0.6078 0.7796
No log 8.5106 400 0.5997 0.5402 0.5997 0.7744
No log 8.5532 402 0.6025 0.4888 0.6025 0.7762
No log 8.5957 404 0.5955 0.5017 0.5955 0.7717
No log 8.6383 406 0.5704 0.5656 0.5704 0.7552
No log 8.6809 408 0.6070 0.6466 0.6070 0.7791
No log 8.7234 410 0.7412 0.6322 0.7412 0.8609
No log 8.7660 412 0.7747 0.6122 0.7747 0.8802
No log 8.8085 414 0.6638 0.6616 0.6638 0.8147
No log 8.8511 416 0.5471 0.6429 0.5471 0.7397
No log 8.8936 418 0.5366 0.6028 0.5366 0.7325
No log 8.9362 420 0.5605 0.6195 0.5605 0.7487
No log 8.9787 422 0.5858 0.6479 0.5858 0.7654
No log 9.0213 424 0.5829 0.6195 0.5829 0.7635
No log 9.0638 426 0.6303 0.6733 0.6303 0.7939
No log 9.1064 428 0.6526 0.6733 0.6526 0.8078
No log 9.1489 430 0.6537 0.6507 0.6537 0.8085
No log 9.1915 432 0.5919 0.5692 0.5919 0.7694
No log 9.2340 434 0.6038 0.5866 0.6038 0.7770
No log 9.2766 436 0.5858 0.5866 0.5858 0.7654
No log 9.3191 438 0.5680 0.6479 0.5680 0.7537
No log 9.3617 440 0.5712 0.6087 0.5712 0.7558
No log 9.4043 442 0.5857 0.5910 0.5857 0.7653
No log 9.4468 444 0.6173 0.6172 0.6173 0.7857
No log 9.4894 446 0.6179 0.5316 0.6179 0.7861
No log 9.5319 448 0.6186 0.5316 0.6186 0.7865
No log 9.5745 450 0.6139 0.5450 0.6139 0.7835
No log 9.6170 452 0.6249 0.6422 0.6249 0.7905
No log 9.6596 454 0.6457 0.6377 0.6457 0.8036
No log 9.7021 456 0.6139 0.6377 0.6139 0.7835
No log 9.7447 458 0.5990 0.6485 0.5990 0.7740
No log 9.7872 460 0.5441 0.6036 0.5441 0.7376
No log 9.8298 462 0.5366 0.6046 0.5366 0.7326
No log 9.8723 464 0.5420 0.5943 0.5420 0.7362
No log 9.9149 466 0.5353 0.5886 0.5353 0.7316
No log 9.9574 468 0.5473 0.5926 0.5473 0.7398
No log 10.0 470 0.5723 0.5302 0.5723 0.7565
No log 10.0426 472 0.5933 0.5165 0.5933 0.7703
No log 10.0851 474 0.5868 0.5165 0.5868 0.7660
No log 10.1277 476 0.5565 0.5570 0.5565 0.7460
No log 10.1702 478 0.5382 0.6451 0.5382 0.7336
No log 10.2128 480 0.5144 0.6598 0.5144 0.7172
No log 10.2553 482 0.5057 0.6566 0.5057 0.7111
No log 10.2979 484 0.5102 0.6441 0.5102 0.7143
No log 10.3404 486 0.5575 0.6328 0.5575 0.7466
No log 10.3830 488 0.5413 0.6473 0.5413 0.7357
No log 10.4255 490 0.4896 0.6911 0.4896 0.6997
No log 10.4681 492 0.5317 0.6733 0.5317 0.7292
No log 10.5106 494 0.5584 0.6701 0.5584 0.7473
No log 10.5532 496 0.5357 0.6602 0.5357 0.7319
No log 10.5957 498 0.5269 0.7082 0.5269 0.7259
0.319 10.6383 500 0.5412 0.6970 0.5412 0.7357
0.319 10.6809 502 0.5513 0.6770 0.5513 0.7425
0.319 10.7234 504 0.5564 0.6932 0.5564 0.7459
0.319 10.7660 506 0.5627 0.6932 0.5627 0.7502
0.319 10.8085 508 0.5664 0.7041 0.5664 0.7526
0.319 10.8511 510 0.6247 0.6768 0.6247 0.7904
0.319 10.8936 512 0.6216 0.6585 0.6216 0.7884
0.319 10.9362 514 0.5819 0.6217 0.5819 0.7629
0.319 10.9787 516 0.5751 0.6320 0.5751 0.7584
0.319 11.0213 518 0.5775 0.6470 0.5775 0.7599

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
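
A quick way to check an environment against these versions (a convenience sketch; matching the exact patch versions may not be strictly required):

```python
import datasets
import tokenizers
import torch
import transformers

for name, module in [
    ("Transformers", transformers),
    ("PyTorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```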

Model size

0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4,019 fine-tunes of that base model).