ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k18_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 0.6887
  • Qwk (quadratic weighted kappa): 0.4758
  • Mse (mean squared error): 0.6887
  • Rmse (root mean squared error): 0.8299
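The card does not include the evaluation code, but the metric names are standard. As a reference, a self-contained, dependency-free sketch of how quadratic weighted kappa (QWK) and RMSE are typically computed over ordinal labels (not the card's actual evaluation script):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred):
    """Quadratic weighted kappa: chance-corrected agreement between ordinal
    ratings, penalizing disagreements by squared label distance."""
    labels = sorted(set(y_true) | set(y_pred))
    n, total = len(labels), len(y_true)
    idx = {lab: i for i, lab in enumerate(labels)}
    # Observed confusion matrix.
    obs = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        obs[idx[t]][idx[p]] += 1
    # Marginal histograms of true and predicted labels.
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n)) for j in range(n)]
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2 if n > 1 else 0.0
            num += w * obs[i][j]
            den += w * hist_t[i] * hist_p[j] / total  # expected counts
    return 1.0 - num / den if den else 1.0

def rmse(y_true, y_pred):
    """Root mean squared error between numeric ratings."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

A QWK of 0.4758, as reported above, sits in the range usually described as moderate agreement with the reference scores.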

Model description

More information needed

Intended uses & limitations

More information needed
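The card provides no usage example. A minimal inference sketch is given below; everything beyond the model ID is an assumption (in particular, the single Loss/MSE/RMSE metrics suggest a one-output regression head for essay organization scoring, but the card does not confirm this):

```python
# Sketch only: head type and scoring interpretation are assumptions,
# not stated in the model card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k18_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Hypothetical input: an Arabic essay to be scored for organization.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
score = logits.squeeze().item()  # assumed single regression output
```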

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0222 2 2.9818 -0.0677 2.9818 1.7268
No log 0.0444 4 1.2323 0.1508 1.2323 1.1101
No log 0.0667 6 0.9494 0.1882 0.9494 0.9744
No log 0.0889 8 0.6847 0.3273 0.6847 0.8275
No log 0.1111 10 0.8615 0.3170 0.8615 0.9282
No log 0.1333 12 1.1090 -0.1390 1.1090 1.0531
No log 0.1556 14 1.2130 -0.1277 1.2130 1.1014
No log 0.1778 16 1.0788 0.0236 1.0788 1.0386
No log 0.2 18 1.0958 0.1401 1.0958 1.0468
No log 0.2222 20 1.0187 0.2954 1.0187 1.0093
No log 0.2444 22 0.9658 0.3337 0.9658 0.9828
No log 0.2667 24 0.9190 0.3170 0.9190 0.9586
No log 0.2889 26 1.0349 0.3225 1.0349 1.0173
No log 0.3111 28 1.0244 0.3225 1.0244 1.0121
No log 0.3333 30 0.9033 0.2193 0.9033 0.9504
No log 0.3556 32 0.8186 0.1699 0.8186 0.9048
No log 0.3778 34 0.7945 0.1187 0.7945 0.8913
No log 0.4 36 0.7668 0.0393 0.7668 0.8757
No log 0.4222 38 0.7282 -0.0027 0.7282 0.8533
No log 0.4444 40 0.6697 0.0798 0.6697 0.8183
No log 0.4667 42 0.6276 0.1617 0.6276 0.7922
No log 0.4889 44 0.5912 0.2711 0.5912 0.7689
No log 0.5111 46 0.5619 0.3029 0.5619 0.7496
No log 0.5333 48 0.5455 0.5413 0.5455 0.7386
No log 0.5556 50 0.6169 0.4424 0.6169 0.7854
No log 0.5778 52 0.5468 0.5772 0.5468 0.7395
No log 0.6 54 0.5119 0.5600 0.5119 0.7155
No log 0.6222 56 0.5168 0.6001 0.5168 0.7189
No log 0.6444 58 0.6218 0.4997 0.6218 0.7885
No log 0.6667 60 0.6438 0.4606 0.6438 0.8023
No log 0.6889 62 0.7727 0.3767 0.7727 0.8790
No log 0.7111 64 0.9674 0.3777 0.9674 0.9836
No log 0.7333 66 0.7568 0.3638 0.7568 0.8699
No log 0.7556 68 0.6581 0.4495 0.6581 0.8112
No log 0.7778 70 0.6390 0.5412 0.6390 0.7994
No log 0.8 72 0.6719 0.5628 0.6719 0.8197
No log 0.8222 74 0.7345 0.5530 0.7345 0.8570
No log 0.8444 76 0.7567 0.5724 0.7567 0.8699
No log 0.8667 78 0.6986 0.5656 0.6986 0.8358
No log 0.8889 80 0.6185 0.5737 0.6185 0.7865
No log 0.9111 82 0.6184 0.6076 0.6184 0.7864
No log 0.9333 84 0.6846 0.4634 0.6846 0.8274
No log 0.9556 86 0.6083 0.5666 0.6083 0.7800
No log 0.9778 88 0.5758 0.4314 0.5758 0.7588
No log 1.0 90 0.6143 0.2943 0.6143 0.7837
No log 1.0222 92 0.6646 0.3155 0.6646 0.8152
No log 1.0444 94 0.6841 0.2718 0.6841 0.8271
No log 1.0667 96 0.6749 0.3050 0.6749 0.8215
No log 1.0889 98 0.6679 0.3050 0.6679 0.8173
No log 1.1111 100 0.6690 0.3081 0.6690 0.8179
No log 1.1333 102 0.6732 0.3015 0.6732 0.8205
No log 1.1556 104 0.6901 0.2893 0.6901 0.8307
No log 1.1778 106 0.7142 0.3196 0.7142 0.8451
No log 1.2 108 0.7389 0.3822 0.7389 0.8596
No log 1.2222 110 0.8840 0.4028 0.8840 0.9402
No log 1.2444 112 1.0750 0.3785 1.0750 1.0368
No log 1.2667 114 0.8985 0.3652 0.8985 0.9479
No log 1.2889 116 0.7475 0.4430 0.7475 0.8646
No log 1.3111 118 0.7282 0.4270 0.7282 0.8534
No log 1.3333 120 0.6883 0.3445 0.6883 0.8296
No log 1.3556 122 0.6782 0.3011 0.6782 0.8235
No log 1.3778 124 0.7025 0.4684 0.7025 0.8382
No log 1.4 126 0.6974 0.4513 0.6974 0.8351
No log 1.4222 128 0.7427 0.4173 0.7427 0.8618
No log 1.4444 130 0.7203 0.4404 0.7203 0.8487
No log 1.4667 132 0.6920 0.4334 0.6920 0.8319
No log 1.4889 134 0.6367 0.4606 0.6367 0.7979
No log 1.5111 136 0.5407 0.5307 0.5407 0.7353
No log 1.5333 138 0.5423 0.5703 0.5423 0.7364
No log 1.5556 140 0.5580 0.4576 0.5580 0.7470
No log 1.5778 142 0.6563 0.4562 0.6563 0.8101
No log 1.6 144 0.7881 0.4437 0.7881 0.8878
No log 1.6222 146 0.9350 0.4362 0.9350 0.9669
No log 1.6444 148 0.7628 0.4018 0.7628 0.8734
No log 1.6667 150 0.5788 0.4855 0.5788 0.7608
No log 1.6889 152 0.5451 0.4729 0.5451 0.7383
No log 1.7111 154 0.5410 0.4803 0.5410 0.7355
No log 1.7333 156 0.5501 0.5448 0.5501 0.7417
No log 1.7556 158 0.6350 0.4067 0.6350 0.7969
No log 1.7778 160 0.6682 0.4067 0.6682 0.8175
No log 1.8 162 0.6840 0.4349 0.6840 0.8270
No log 1.8222 164 0.5909 0.4144 0.5909 0.7687
No log 1.8444 166 0.5122 0.5567 0.5122 0.7157
No log 1.8667 168 0.5275 0.5640 0.5275 0.7263
No log 1.8889 170 0.5627 0.4907 0.5627 0.7502
No log 1.9111 172 0.7074 0.4614 0.7074 0.8411
No log 1.9333 174 0.9148 0.4883 0.9148 0.9565
No log 1.9556 176 1.0473 0.4856 1.0473 1.0234
No log 1.9778 178 1.1668 0.3795 1.1668 1.0802
No log 2.0 180 1.0058 0.3849 1.0058 1.0029
No log 2.0222 182 0.7667 0.3918 0.7667 0.8756
No log 2.0444 184 0.6368 0.3622 0.6368 0.7980
No log 2.0667 186 0.6084 0.3011 0.6084 0.7800
No log 2.0889 188 0.5969 0.4243 0.5969 0.7726
No log 2.1111 190 0.6605 0.4430 0.6605 0.8127
No log 2.1333 192 0.8014 0.3933 0.8014 0.8952
No log 2.1556 194 0.8351 0.3688 0.8351 0.9138
No log 2.1778 196 0.7614 0.4382 0.7614 0.8726
No log 2.2 198 0.6615 0.5586 0.6615 0.8133
No log 2.2222 200 0.6579 0.4745 0.6579 0.8111
No log 2.2444 202 0.6574 0.4582 0.6574 0.8108
No log 2.2667 204 0.6782 0.4444 0.6782 0.8236
No log 2.2889 206 0.7021 0.4295 0.7021 0.8379
No log 2.3111 208 0.6850 0.4723 0.6850 0.8277
No log 2.3333 210 0.6344 0.4684 0.6344 0.7965
No log 2.3556 212 0.6472 0.4408 0.6472 0.8045
No log 2.3778 214 0.6289 0.4484 0.6289 0.7931
No log 2.4 216 0.6292 0.4642 0.6292 0.7933
No log 2.4222 218 0.6638 0.4193 0.6638 0.8147
No log 2.4444 220 0.6203 0.4997 0.6203 0.7876
No log 2.4667 222 0.5683 0.4547 0.5683 0.7539
No log 2.4889 224 0.5498 0.4547 0.5498 0.7415
No log 2.5111 226 0.5498 0.4547 0.5498 0.7415
No log 2.5333 228 0.5932 0.5291 0.5932 0.7702
No log 2.5556 230 0.5868 0.4997 0.5868 0.7660
No log 2.5778 232 0.6175 0.4542 0.6175 0.7858
No log 2.6 234 0.6303 0.5015 0.6303 0.7939
No log 2.6222 236 0.6268 0.4948 0.6268 0.7917
No log 2.6444 238 0.6136 0.4866 0.6136 0.7833
No log 2.6667 240 0.6130 0.4895 0.6130 0.7830
No log 2.6889 242 0.6748 0.4444 0.6748 0.8215
No log 2.7111 244 0.6689 0.4444 0.6689 0.8178
No log 2.7333 246 0.6297 0.4352 0.6297 0.7935
No log 2.7556 248 0.6691 0.4093 0.6691 0.8180
No log 2.7778 250 0.7296 0.4199 0.7296 0.8542
No log 2.8 252 0.6887 0.4114 0.6887 0.8299
No log 2.8222 254 0.6845 0.3958 0.6845 0.8273
No log 2.8444 256 0.7301 0.4002 0.7301 0.8545
No log 2.8667 258 0.7592 0.3934 0.7592 0.8713
No log 2.8889 260 0.7879 0.3688 0.7879 0.8876
No log 2.9111 262 0.7862 0.3319 0.7862 0.8867
No log 2.9333 264 0.7665 0.3195 0.7665 0.8755
No log 2.9556 266 0.8145 0.3105 0.8145 0.9025
No log 2.9778 268 0.8642 0.3274 0.8642 0.9296
No log 3.0 270 0.9573 0.3357 0.9573 0.9784
No log 3.0222 272 0.9260 0.3669 0.9260 0.9623
No log 3.0444 274 0.7996 0.3645 0.7996 0.8942
No log 3.0667 276 0.7185 0.3723 0.7185 0.8476
No log 3.0889 278 0.7182 0.3723 0.7182 0.8474
No log 3.1111 280 0.7704 0.2892 0.7704 0.8777
No log 3.1333 282 0.7969 0.3371 0.7969 0.8927
No log 3.1556 284 0.7122 0.3723 0.7122 0.8439
No log 3.1778 286 0.6665 0.3688 0.6665 0.8164
No log 3.2 288 0.6806 0.4112 0.6806 0.8250
No log 3.2222 290 0.7996 0.3887 0.7996 0.8942
No log 3.2444 292 0.8297 0.4305 0.8297 0.9109
No log 3.2667 294 0.7218 0.4351 0.7218 0.8496
No log 3.2889 296 0.6426 0.4532 0.6426 0.8016
No log 3.3111 298 0.6289 0.4754 0.6289 0.7930
No log 3.3333 300 0.6681 0.4920 0.6681 0.8174
No log 3.3556 302 0.6538 0.4702 0.6538 0.8086
No log 3.3778 304 0.5988 0.5201 0.5988 0.7738
No log 3.4 306 0.5994 0.5404 0.5994 0.7742
No log 3.4222 308 0.6100 0.5201 0.6100 0.7810
No log 3.4444 310 0.6619 0.4464 0.6619 0.8136
No log 3.4667 312 0.7105 0.4404 0.7105 0.8429
No log 3.4889 314 0.6860 0.4476 0.6860 0.8282
No log 3.5111 316 0.6354 0.4542 0.6354 0.7971
No log 3.5333 318 0.6245 0.4597 0.6245 0.7902
No log 3.5556 320 0.6398 0.3985 0.6398 0.7999
No log 3.5778 322 0.7170 0.4093 0.7170 0.8468
No log 3.6 324 0.8081 0.3847 0.8081 0.8989
No log 3.6222 326 0.8307 0.3782 0.8307 0.9114
No log 3.6444 328 0.8170 0.3274 0.8170 0.9039
No log 3.6667 330 0.8481 0.2934 0.8481 0.9209
No log 3.6889 332 0.8110 0.3688 0.8110 0.9005
No log 3.7111 334 0.6974 0.4350 0.6974 0.8351
No log 3.7333 336 0.6234 0.3865 0.6234 0.7896
No log 3.7556 338 0.5900 0.4201 0.5900 0.7681
No log 3.7778 340 0.5996 0.5030 0.5996 0.7744
No log 3.8 342 0.6311 0.5133 0.6311 0.7944
No log 3.8222 344 0.6705 0.4772 0.6705 0.8189
No log 3.8444 346 0.6718 0.4217 0.6718 0.8196
No log 3.8667 348 0.6928 0.4002 0.6928 0.8323
No log 3.8889 350 0.7101 0.4197 0.7101 0.8427
No log 3.9111 352 0.6919 0.4089 0.6919 0.8318
No log 3.9333 354 0.7077 0.4089 0.7077 0.8412
No log 3.9556 356 0.7128 0.3972 0.7128 0.8443
No log 3.9778 358 0.7859 0.3008 0.7859 0.8865
No log 4.0 360 0.8544 0.3194 0.8544 0.9243
No log 4.0222 362 0.7864 0.3169 0.7864 0.8868
No log 4.0444 364 0.7247 0.3972 0.7247 0.8513
No log 4.0667 366 0.7563 0.3564 0.7563 0.8696
No log 4.0889 368 0.7872 0.3425 0.7872 0.8873
No log 4.1111 370 0.7773 0.3819 0.7773 0.8816
No log 4.1333 372 0.7456 0.3940 0.7456 0.8635
No log 4.1556 374 0.6769 0.4329 0.6769 0.8227
No log 4.1778 376 0.6052 0.4827 0.6052 0.7780
No log 4.2 378 0.5679 0.5468 0.5679 0.7536
No log 4.2222 380 0.5806 0.5426 0.5806 0.7620
No log 4.2444 382 0.6445 0.4468 0.6445 0.8028
No log 4.2667 384 0.7305 0.3909 0.7305 0.8547
No log 4.2889 386 0.7068 0.3909 0.7068 0.8407
No log 4.3111 388 0.6370 0.4424 0.6370 0.7981
No log 4.3333 390 0.5923 0.4392 0.5923 0.7696
No log 4.3556 392 0.5918 0.4392 0.5918 0.7693
No log 4.3778 394 0.6812 0.4424 0.6812 0.8254
No log 4.4 396 0.7433 0.3829 0.7433 0.8622
No log 4.4222 398 0.7268 0.4350 0.7268 0.8525
No log 4.4444 400 0.6323 0.4665 0.6323 0.7952
No log 4.4667 402 0.5900 0.5342 0.5900 0.7681
No log 4.4889 404 0.5807 0.5403 0.5807 0.7620
No log 4.5111 406 0.5515 0.5859 0.5515 0.7426
No log 4.5333 408 0.5690 0.5568 0.5690 0.7543
No log 4.5556 410 0.5650 0.5639 0.5650 0.7517
No log 4.5778 412 0.6166 0.5292 0.6166 0.7852
No log 4.6 414 0.7826 0.2964 0.7826 0.8846
No log 4.6222 416 0.9019 0.2872 0.9019 0.9497
No log 4.6444 418 0.8458 0.2779 0.8458 0.9197
No log 4.6667 420 0.6791 0.4562 0.6791 0.8241
No log 4.6889 422 0.5833 0.5755 0.5833 0.7637
No log 4.7111 424 0.5613 0.4782 0.5613 0.7492
No log 4.7333 426 0.6107 0.5510 0.6107 0.7815
No log 4.7556 428 0.6316 0.5670 0.6316 0.7948
No log 4.7778 430 0.6375 0.5471 0.6375 0.7985
No log 4.8 432 0.5672 0.5068 0.5672 0.7532
No log 4.8222 434 0.5430 0.4907 0.5430 0.7369
No log 4.8444 436 0.5447 0.4402 0.5447 0.7380
No log 4.8667 438 0.5607 0.4289 0.5607 0.7488
No log 4.8889 440 0.5911 0.4576 0.5911 0.7688
No log 4.9111 442 0.6767 0.4808 0.6767 0.8226
No log 4.9333 444 0.6967 0.4512 0.6967 0.8347
No log 4.9556 446 0.6453 0.4424 0.6453 0.8033
No log 4.9778 448 0.5886 0.5048 0.5886 0.7672
No log 5.0 450 0.5987 0.5048 0.5987 0.7737
No log 5.0222 452 0.5806 0.4728 0.5806 0.7620
No log 5.0444 454 0.5808 0.4728 0.5808 0.7621
No log 5.0667 456 0.5482 0.5091 0.5482 0.7404
No log 5.0889 458 0.5287 0.5161 0.5287 0.7271
No log 5.1111 460 0.5381 0.5290 0.5381 0.7335
No log 5.1333 462 0.5919 0.4197 0.5919 0.7693
No log 5.1556 464 0.6140 0.4512 0.6140 0.7836
No log 5.1778 466 0.5796 0.4502 0.5796 0.7613
No log 5.2 468 0.5580 0.4576 0.5580 0.7470
No log 5.2222 470 0.5653 0.4997 0.5653 0.7519
No log 5.2444 472 0.5833 0.4582 0.5833 0.7637
No log 5.2667 474 0.6058 0.4582 0.6058 0.7784
No log 5.2889 476 0.6143 0.4582 0.6143 0.7838
No log 5.3111 478 0.6275 0.4808 0.6275 0.7921
No log 5.3333 480 0.6536 0.4568 0.6536 0.8085
No log 5.3556 482 0.6124 0.5042 0.6124 0.7826
No log 5.3778 484 0.5531 0.5701 0.5531 0.7437
No log 5.4 486 0.5245 0.5559 0.5245 0.7242
No log 5.4222 488 0.5443 0.5559 0.5443 0.7378
No log 5.4444 490 0.5890 0.5117 0.5890 0.7674
No log 5.4667 492 0.5687 0.5308 0.5687 0.7541
No log 5.4889 494 0.5501 0.5485 0.5501 0.7417
No log 5.5111 496 0.5177 0.5912 0.5177 0.7195
No log 5.5333 498 0.5261 0.5912 0.5261 0.7253
0.3219 5.5556 500 0.5772 0.5254 0.5772 0.7597
0.3219 5.5778 502 0.6489 0.4444 0.6489 0.8056
0.3219 5.6 504 0.7495 0.3665 0.7495 0.8657
0.3219 5.6222 506 0.8259 0.2964 0.8259 0.9088
0.3219 5.6444 508 0.7582 0.4089 0.7582 0.8707
0.3219 5.6667 510 0.6887 0.4758 0.6887 0.8299

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Model size: 0.1B params (Safetensors)
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k18_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of that base model).