ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8361
  • Qwk: 0.5411
  • Mse: 0.8361
  • Rmse: 0.9144

Model description

More information needed

Intended uses & limitations

More information needed
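In the absence of documented usage, the checkpoint can presumably be loaded as a standard sequence-classification model. The sketch below is untested: the repo namespace (MayBashendy/…) is taken from the page itself, but whether the head is a regression (single logit) or an ordinal classifier, and what the labels mean, is not documented; argmax is shown for the classification case only. Running it requires downloading the weights.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hedged sketch: assumes a sequence-classification head that scores
# an essay's organization; label semantics are unverified.
repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("essay text here", return_tensors="pt")  # an Arabic essay goes here
with torch.no_grad():
    logits = model(**inputs).logits
score = logits.argmax(dim=-1).item()  # only valid if the head is a classifier
```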

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
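Since this card follows the auto-generated Trainer format, the hyperparameters above correspond roughly to a TrainingArguments configuration like the following sketch; `output_dir` is illustrative, and the Adam betas/epsilon shown are the Trainer defaults that match the card.

```python
from transformers import TrainingArguments

# Configuration sketch matching the card's reported hyperparameters.
training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # illustrative name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,     # Trainer defaults, as listed on the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```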

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0392 2 4.3390 0.0163 4.3390 2.0830
No log 0.0784 4 3.1711 0.0080 3.1711 1.7808
No log 0.1176 6 1.8291 0.0629 1.8291 1.3525
No log 0.1569 8 1.3150 0.1021 1.3150 1.1467
No log 0.1961 10 1.4439 -0.0697 1.4439 1.2016
No log 0.2353 12 1.1873 0.1585 1.1873 1.0896
No log 0.2745 14 1.2630 0.0974 1.2630 1.1238
No log 0.3137 16 1.2934 0.1354 1.2934 1.1373
No log 0.3529 18 1.3609 -0.0451 1.3609 1.1666
No log 0.3922 20 1.3466 -0.0300 1.3466 1.1604
No log 0.4314 22 1.3356 0.0339 1.3356 1.1557
No log 0.4706 24 1.1176 0.3338 1.1176 1.0572
No log 0.5098 26 1.0756 0.2936 1.0756 1.0371
No log 0.5490 28 0.9859 0.3307 0.9859 0.9929
No log 0.5882 30 0.9643 0.3073 0.9643 0.9820
No log 0.6275 32 0.9007 0.3854 0.9007 0.9490
No log 0.6667 34 0.9622 0.4213 0.9622 0.9809
No log 0.7059 36 0.9708 0.4383 0.9708 0.9853
No log 0.7451 38 0.9878 0.4100 0.9878 0.9939
No log 0.7843 40 1.0558 0.3998 1.0558 1.0275
No log 0.8235 42 0.9242 0.5643 0.9242 0.9614
No log 0.8627 44 0.7955 0.5241 0.7955 0.8919
No log 0.9020 46 0.7980 0.5329 0.7980 0.8933
No log 0.9412 48 0.8442 0.6061 0.8442 0.9188
No log 0.9804 50 1.0407 0.4839 1.0407 1.0202
No log 1.0196 52 1.1216 0.3570 1.1216 1.0591
No log 1.0588 54 1.0024 0.4949 1.0024 1.0012
No log 1.0980 56 0.8697 0.5816 0.8697 0.9326
No log 1.1373 58 0.8600 0.5816 0.8600 0.9274
No log 1.1765 60 0.8561 0.5673 0.8561 0.9253
No log 1.2157 62 0.9019 0.4699 0.9019 0.9497
No log 1.2549 64 0.8546 0.5415 0.8546 0.9245
No log 1.2941 66 0.8366 0.5308 0.8366 0.9147
No log 1.3333 68 0.8373 0.5102 0.8373 0.9150
No log 1.3725 70 0.8966 0.4919 0.8966 0.9469
No log 1.4118 72 1.1326 0.3388 1.1326 1.0642
No log 1.4510 74 1.1207 0.3289 1.1207 1.0586
No log 1.4902 76 0.9146 0.5290 0.9146 0.9563
No log 1.5294 78 0.9239 0.5012 0.9239 0.9612
No log 1.5686 80 0.9205 0.4914 0.9205 0.9594
No log 1.6078 82 0.8815 0.4496 0.8815 0.9389
No log 1.6471 84 0.9873 0.4961 0.9873 0.9936
No log 1.6863 86 0.9612 0.5115 0.9612 0.9804
No log 1.7255 88 0.9322 0.4411 0.9322 0.9655
No log 1.7647 90 0.9161 0.4718 0.9161 0.9571
No log 1.8039 92 0.9150 0.4718 0.9150 0.9566
No log 1.8431 94 0.9314 0.4403 0.9314 0.9651
No log 1.8824 96 0.9626 0.5011 0.9626 0.9811
No log 1.9216 98 0.9502 0.4914 0.9502 0.9748
No log 1.9608 100 0.9553 0.5180 0.9553 0.9774
No log 2.0 102 0.9492 0.5196 0.9492 0.9743
No log 2.0392 104 0.9319 0.5121 0.9319 0.9653
No log 2.0784 106 0.9399 0.4358 0.9399 0.9695
No log 2.1176 108 1.0022 0.4567 1.0022 1.0011
No log 2.1569 110 0.9675 0.3985 0.9675 0.9836
No log 2.1961 112 1.0184 0.4842 1.0184 1.0091
No log 2.2353 114 1.0343 0.5439 1.0343 1.0170
No log 2.2745 116 1.0025 0.5171 1.0025 1.0012
No log 2.3137 118 0.9541 0.4731 0.9541 0.9768
No log 2.3529 120 0.9240 0.5143 0.9240 0.9612
No log 2.3922 122 0.8736 0.4894 0.8736 0.9346
No log 2.4314 124 0.8481 0.4747 0.8481 0.9209
No log 2.4706 126 0.8490 0.4939 0.8490 0.9214
No log 2.5098 128 0.8457 0.5136 0.8457 0.9196
No log 2.5490 130 0.9305 0.5704 0.9305 0.9646
No log 2.5882 132 0.8684 0.5802 0.8684 0.9319
No log 2.6275 134 0.8357 0.5669 0.8357 0.9141
No log 2.6667 136 0.7962 0.5621 0.7962 0.8923
No log 2.7059 138 0.9018 0.5431 0.9018 0.9496
No log 2.7451 140 1.0108 0.4976 1.0108 1.0054
No log 2.7843 142 0.8833 0.4667 0.8833 0.9399
No log 2.8235 144 0.8146 0.3830 0.8146 0.9025
No log 2.8627 146 0.8029 0.4908 0.8029 0.8960
No log 2.9020 148 0.7534 0.5044 0.7534 0.8680
No log 2.9412 150 0.7207 0.5675 0.7207 0.8489
No log 2.9804 152 0.7896 0.5763 0.7896 0.8886
No log 3.0196 154 0.8883 0.5458 0.8883 0.9425
No log 3.0588 156 0.7851 0.5920 0.7851 0.8861
No log 3.0980 158 0.7777 0.6271 0.7777 0.8819
No log 3.1373 160 0.7791 0.6305 0.7791 0.8827
No log 3.1765 162 0.7703 0.5937 0.7703 0.8777
No log 3.2157 164 0.7848 0.5611 0.7848 0.8859
No log 3.2549 166 0.8604 0.5892 0.8604 0.9276
No log 3.2941 168 0.8466 0.5647 0.8466 0.9201
No log 3.3333 170 0.8205 0.5735 0.8205 0.9058
No log 3.3725 172 0.8292 0.5879 0.8292 0.9106
No log 3.4118 174 0.8713 0.5601 0.8713 0.9334
No log 3.4510 176 0.8535 0.5624 0.8535 0.9238
No log 3.4902 178 0.8391 0.5424 0.8391 0.9160
No log 3.5294 180 0.8333 0.4582 0.8333 0.9128
No log 3.5686 182 0.8298 0.4936 0.8298 0.9109
No log 3.6078 184 0.8462 0.4870 0.8462 0.9199
No log 3.6471 186 0.9183 0.5495 0.9183 0.9583
No log 3.6863 188 1.0800 0.5489 1.0800 1.0392
No log 3.7255 190 1.1408 0.5358 1.1408 1.0681
No log 3.7647 192 1.0451 0.5526 1.0451 1.0223
No log 3.8039 194 0.8646 0.5039 0.8646 0.9298
No log 3.8431 196 0.8304 0.4853 0.8304 0.9113
No log 3.8824 198 0.8522 0.5008 0.8522 0.9231
No log 3.9216 200 0.8346 0.4699 0.8346 0.9136
No log 3.9608 202 0.8770 0.5305 0.8770 0.9365
No log 4.0 204 0.8574 0.4926 0.8574 0.9259
No log 4.0392 206 0.8427 0.4618 0.8427 0.9180
No log 4.0784 208 0.8458 0.3913 0.8458 0.9197
No log 4.1176 210 0.8551 0.4304 0.8551 0.9247
No log 4.1569 212 0.8266 0.4728 0.8266 0.9092
No log 4.1961 214 0.8493 0.4632 0.8493 0.9216
No log 4.2353 216 0.8255 0.4993 0.8255 0.9086
No log 4.2745 218 0.8336 0.4993 0.8336 0.9130
No log 4.3137 220 0.8400 0.5295 0.8400 0.9165
No log 4.3529 222 0.8757 0.5114 0.8757 0.9358
No log 4.3922 224 0.8357 0.5275 0.8357 0.9142
No log 4.4314 226 0.8055 0.5462 0.8055 0.8975
No log 4.4706 228 0.8990 0.4829 0.8990 0.9482
No log 4.5098 230 0.9906 0.4444 0.9906 0.9953
No log 4.5490 232 0.9171 0.4849 0.9171 0.9577
No log 4.5882 234 0.8014 0.4815 0.8014 0.8952
No log 4.6275 236 0.8638 0.5177 0.8638 0.9294
No log 4.6667 238 1.0202 0.5224 1.0202 1.0101
No log 4.7059 240 0.9798 0.4883 0.9798 0.9898
No log 4.7451 242 0.8498 0.5647 0.8498 0.9219
No log 4.7843 244 0.8324 0.5279 0.8324 0.9124
No log 4.8235 246 0.8286 0.5279 0.8286 0.9103
No log 4.8627 248 0.8093 0.5411 0.8093 0.8996
No log 4.9020 250 0.7805 0.5204 0.7805 0.8834
No log 4.9412 252 0.7706 0.5556 0.7706 0.8778
No log 4.9804 254 0.8082 0.5812 0.8082 0.8990
No log 5.0196 256 0.8358 0.5269 0.8358 0.9142
No log 5.0588 258 0.7784 0.6288 0.7784 0.8823
No log 5.0980 260 0.7799 0.6241 0.7799 0.8831
No log 5.1373 262 0.8340 0.5728 0.8340 0.9133
No log 5.1765 264 0.8336 0.5458 0.8336 0.9130
No log 5.2157 266 0.8319 0.5345 0.8319 0.9121
No log 5.2549 268 0.8207 0.4794 0.8207 0.9059
No log 5.2941 270 0.8046 0.3991 0.8046 0.8970
No log 5.3333 272 0.7943 0.3987 0.7943 0.8913
No log 5.3725 274 0.7896 0.4218 0.7896 0.8886
No log 5.4118 276 0.7945 0.4534 0.7945 0.8913
No log 5.4510 278 0.7943 0.3852 0.7943 0.8913
No log 5.4902 280 0.8186 0.4603 0.8186 0.9047
No log 5.5294 282 0.8494 0.5542 0.8494 0.9216
No log 5.5686 284 0.9098 0.5600 0.9098 0.9538
No log 5.6078 286 0.8693 0.5600 0.8693 0.9324
No log 5.6471 288 0.7850 0.5073 0.7850 0.8860
No log 5.6863 290 0.8154 0.4465 0.8154 0.9030
No log 5.7255 292 0.8813 0.4741 0.8813 0.9388
No log 5.7647 294 0.8395 0.4331 0.8395 0.9162
No log 5.8039 296 0.8105 0.4980 0.8105 0.9003
No log 5.8431 298 0.9452 0.5614 0.9452 0.9722
No log 5.8824 300 0.9990 0.5471 0.9990 0.9995
No log 5.9216 302 0.8805 0.5515 0.8805 0.9383
No log 5.9608 304 0.7853 0.4160 0.7853 0.8861
No log 6.0 306 0.8060 0.4598 0.8060 0.8978
No log 6.0392 308 0.7794 0.4598 0.7794 0.8828
No log 6.0784 310 0.7702 0.5408 0.7702 0.8776
No log 6.1176 312 0.8461 0.5658 0.8461 0.9198
No log 6.1569 314 0.8483 0.5658 0.8483 0.9211
No log 6.1961 316 0.8126 0.5390 0.8126 0.9014
No log 6.2353 318 0.7923 0.5435 0.7923 0.8901
No log 6.2745 320 0.7870 0.4834 0.7870 0.8871
No log 6.3137 322 0.8131 0.4724 0.8131 0.9017
No log 6.3529 324 0.9059 0.5577 0.9059 0.9518
No log 6.3922 326 0.9568 0.5199 0.9568 0.9782
No log 6.4314 328 1.0070 0.5280 1.0070 1.0035
No log 6.4706 330 0.9527 0.5224 0.9527 0.9760
No log 6.5098 332 0.8292 0.5660 0.8292 0.9106
No log 6.5490 334 0.7479 0.5548 0.7479 0.8648
No log 6.5882 336 0.7430 0.5439 0.7430 0.8620
No log 6.6275 338 0.7477 0.5322 0.7477 0.8647
No log 6.6667 340 0.7680 0.5971 0.7680 0.8764
No log 6.7059 342 0.8607 0.5370 0.8607 0.9277
No log 6.7451 344 0.9665 0.5304 0.9665 0.9831
No log 6.7843 346 0.9626 0.5233 0.9626 0.9811
No log 6.8235 348 0.9042 0.5130 0.9042 0.9509
No log 6.8627 350 0.8280 0.4143 0.8280 0.9100
No log 6.9020 352 0.8019 0.4620 0.8019 0.8955
No log 6.9412 354 0.8078 0.5390 0.8078 0.8988
No log 6.9804 356 0.8204 0.5487 0.8204 0.9058
No log 7.0196 358 0.8385 0.5279 0.8385 0.9157
No log 7.0588 360 0.8447 0.5465 0.8447 0.9191
No log 7.0980 362 0.8569 0.4964 0.8569 0.9257
No log 7.1373 364 0.9630 0.4991 0.9630 0.9813
No log 7.1765 366 1.0235 0.4475 1.0235 1.0117
No log 7.2157 368 0.9669 0.4197 0.9669 0.9833
No log 7.2549 370 0.8880 0.3627 0.8880 0.9423
No log 7.2941 372 0.9081 0.4363 0.9081 0.9529
No log 7.3333 374 0.9592 0.4840 0.9592 0.9794
No log 7.3725 376 0.9552 0.5014 0.9552 0.9773
No log 7.4118 378 0.8842 0.5042 0.8842 0.9403
No log 7.4510 380 0.8214 0.4340 0.8214 0.9063
No log 7.4902 382 0.8120 0.4340 0.8120 0.9011
No log 7.5294 384 0.8256 0.5411 0.8256 0.9086
No log 7.5686 386 0.8395 0.5390 0.8395 0.9162
No log 7.6078 388 0.8130 0.5548 0.8130 0.9017
No log 7.6471 390 0.7916 0.5011 0.7916 0.8897
No log 7.6863 392 0.7922 0.4760 0.7922 0.8901
No log 7.7255 394 0.7928 0.4760 0.7928 0.8904
No log 7.7647 396 0.7967 0.5361 0.7967 0.8926
No log 7.8039 398 0.8061 0.5738 0.8061 0.8978
No log 7.8431 400 0.8119 0.5730 0.8119 0.9011
No log 7.8824 402 0.8206 0.5728 0.8206 0.9059
No log 7.9216 404 0.8140 0.5611 0.8140 0.9022
No log 7.9608 406 0.8045 0.5880 0.8045 0.8969
No log 8.0 408 0.7933 0.6120 0.7933 0.8907
No log 8.0392 410 0.7993 0.5892 0.7993 0.8940
No log 8.0784 412 0.7860 0.5983 0.7860 0.8866
No log 8.1176 414 0.7771 0.5391 0.7771 0.8815
No log 8.1569 416 0.7740 0.5214 0.7740 0.8798
No log 8.1961 418 0.7780 0.5769 0.7780 0.8820
No log 8.2353 420 0.8088 0.5706 0.8088 0.8993
No log 8.2745 422 0.8658 0.5682 0.8658 0.9305
No log 8.3137 424 0.8623 0.5539 0.8623 0.9286
No log 8.3529 426 0.8157 0.5385 0.8157 0.9032
No log 8.3922 428 0.7895 0.4757 0.7895 0.8885
No log 8.4314 430 0.7917 0.4548 0.7917 0.8898
No log 8.4706 432 0.7948 0.4548 0.7948 0.8915
No log 8.5098 434 0.7991 0.5028 0.7991 0.8939
No log 8.5490 436 0.7882 0.5773 0.7882 0.8878
No log 8.5882 438 0.7758 0.5596 0.7758 0.8808
No log 8.6275 440 0.7697 0.5242 0.7697 0.8773
No log 8.6667 442 0.7701 0.5773 0.7701 0.8775
No log 8.7059 444 0.7728 0.5773 0.7728 0.8791
No log 8.7451 446 0.7758 0.5573 0.7758 0.8808
No log 8.7843 448 0.7774 0.5773 0.7774 0.8817
No log 8.8235 450 0.7862 0.5773 0.7862 0.8867
No log 8.8627 452 0.7919 0.5773 0.7919 0.8899
No log 8.9020 454 0.8019 0.5773 0.8019 0.8955
No log 8.9412 456 0.8108 0.5886 0.8108 0.9004
No log 8.9804 458 0.8118 0.5886 0.8118 0.9010
No log 9.0196 460 0.8145 0.5886 0.8145 0.9025
No log 9.0588 462 0.8053 0.5391 0.8053 0.8974
No log 9.0980 464 0.8075 0.5391 0.8075 0.8986
No log 9.1373 466 0.8047 0.5204 0.8047 0.8970
No log 9.1765 468 0.7909 0.5011 0.7909 0.8893
No log 9.2157 470 0.7839 0.5011 0.7839 0.8854
No log 9.2549 472 0.7871 0.5204 0.7871 0.8872
No log 9.2941 474 0.8246 0.5698 0.8246 0.9081
No log 9.3333 476 0.8261 0.5698 0.8261 0.9089
No log 9.3725 478 0.7843 0.5203 0.7843 0.8856
No log 9.4118 480 0.7642 0.4548 0.7642 0.8742
No log 9.4510 482 0.7512 0.4980 0.7512 0.8667
No log 9.4902 484 0.7383 0.5983 0.7383 0.8593
No log 9.5294 486 0.7351 0.5892 0.7351 0.8574
No log 9.5686 488 0.7181 0.5684 0.7181 0.8474
No log 9.6078 490 0.7140 0.5322 0.7140 0.8450
No log 9.6471 492 0.7296 0.5474 0.7296 0.8542
No log 9.6863 494 0.7306 0.5451 0.7306 0.8547
No log 9.7255 496 0.7309 0.6237 0.7309 0.8549
No log 9.7647 498 0.7987 0.5981 0.7987 0.8937
0.3162 9.8039 500 0.8345 0.5781 0.8345 0.9135
0.3162 9.8431 502 0.8285 0.5680 0.8285 0.9102
0.3162 9.8824 504 0.8142 0.5563 0.8142 0.9023
0.3162 9.9216 506 0.8326 0.5365 0.8326 0.9125
0.3162 9.9608 508 0.8258 0.5026 0.8258 0.9087
0.3162 10.0 510 0.8361 0.5411 0.8361 0.9144

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1