ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6960
  • Qwk: 0.3618
  • Mse: 0.6960
  • Rmse: 0.8343
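The card does not show how these metrics are computed, but for ordinal essay-scoring tasks like this one, Qwk is quadratic weighted kappa and Rmse is root mean squared error over the predicted ratings. Below is a minimal, dependency-free sketch of both metrics; the function names and rating range are illustrative, not the actual evaluation code used for this model.

```python
# Illustrative implementations of the two headline metrics above:
# quadratic weighted kappa (Qwk) and root mean squared error (Rmse).
import math

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Cohen's kappa with quadratic weights over integer ratings."""
    n = max_rating - min_rating + 1
    # Observed rating co-occurrence matrix
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t - min_rating][p - min_rating] += 1
    num_items = len(y_true)
    # Marginal histograms of true and predicted ratings
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(n)) for j in range(n)]
    numerator = denominator = 0.0
    for i in range(n):
        for j in range(n):
            w = ((i - j) ** 2) / ((n - 1) ** 2)  # quadratic disagreement weight
            expected = hist_t[i] * hist_p[j] / num_items
            numerator += w * observed[i][j]
            denominator += w * expected
    return 1.0 - numerator / denominator

def rmse(y_true, y_pred):
    """Root mean squared error between two rating sequences."""
    return math.sqrt(
        sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    )
```

Perfect agreement gives a kappa of 1.0, chance-level agreement gives 0.0, and systematic disagreement can go negative, which is why small negative Qwk values appear in the early epochs of the table below.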

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
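As a point of reference for the `lr_scheduler_type: linear` setting: with no warmup steps (none are listed above), a linear schedule decays the learning rate from `learning_rate` down to 0 over the total number of optimizer steps. The sketch below is an assumption about that behavior, not the training code itself, and the step counts are illustrative.

```python
# Sketch of a warmup-free linear learning-rate schedule: the LR falls
# linearly from base_lr at step 0 to 0 at the final step.
def linear_lr(step, total_steps, base_lr=2e-05):
    """Learning rate at a given optimizer step under a linear decay schedule."""
    remaining = max(0.0, (total_steps - step) / total_steps)
    return base_lr * remaining
```

For example, halfway through training the learning rate is half the initial value, and it reaches exactly 0 at the last step.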

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0233 2 2.6177 -0.0407 2.6177 1.6179
No log 0.0465 4 1.3528 0.0991 1.3528 1.1631
No log 0.0698 6 1.1761 -0.1680 1.1761 1.0845
No log 0.0930 8 1.0555 0.0156 1.0555 1.0274
No log 0.1163 10 0.9944 0.1461 0.9944 0.9972
No log 0.1395 12 0.9031 0.1777 0.9031 0.9503
No log 0.1628 14 0.7740 0.2204 0.7740 0.8798
No log 0.1860 16 0.7643 0.1901 0.7643 0.8742
No log 0.2093 18 0.8616 0.2193 0.8616 0.9282
No log 0.2326 20 0.8418 0.2518 0.8418 0.9175
No log 0.2558 22 0.7422 0.2890 0.7422 0.8615
No log 0.2791 24 0.7637 0.2784 0.7637 0.8739
No log 0.3023 26 0.8850 0.3076 0.8850 0.9407
No log 0.3256 28 0.8937 0.2836 0.8937 0.9454
No log 0.3488 30 0.7686 0.2819 0.7686 0.8767
No log 0.3721 32 0.7695 0.2981 0.7695 0.8772
No log 0.3953 34 0.8025 0.3737 0.8025 0.8958
No log 0.4186 36 0.8238 0.3590 0.8238 0.9076
No log 0.4419 38 0.6532 0.3050 0.6532 0.8082
No log 0.4651 40 0.6548 0.1181 0.6548 0.8092
No log 0.4884 42 0.6212 0.1846 0.6212 0.7882
No log 0.5116 44 0.6143 0.3894 0.6143 0.7838
No log 0.5349 46 0.5937 0.3416 0.5937 0.7705
No log 0.5581 48 0.6877 0.2528 0.6877 0.8293
No log 0.5814 50 0.8410 0.3151 0.8410 0.9171
No log 0.6047 52 0.7416 0.3739 0.7416 0.8611
No log 0.6279 54 0.7133 0.4003 0.7133 0.8446
No log 0.6512 56 0.7736 0.3160 0.7736 0.8796
No log 0.6744 58 0.8222 0.3160 0.8222 0.9067
No log 0.6977 60 0.6014 0.4435 0.6014 0.7755
No log 0.7209 62 0.7139 0.5562 0.7139 0.8449
No log 0.7442 64 0.6909 0.5123 0.6909 0.8312
No log 0.7674 66 0.6203 0.4196 0.6203 0.7876
No log 0.7907 68 0.6854 0.4210 0.6854 0.8279
No log 0.8140 70 0.6332 0.3984 0.6332 0.7957
No log 0.8372 72 0.6343 0.4244 0.6343 0.7964
No log 0.8605 74 0.6472 0.3138 0.6472 0.8045
No log 0.8837 76 0.6267 0.3933 0.6267 0.7917
No log 0.9070 78 0.6935 0.4542 0.6935 0.8327
No log 0.9302 80 0.6423 0.3966 0.6423 0.8014
No log 0.9535 82 0.6427 0.3692 0.6427 0.8017
No log 0.9767 84 0.8382 0.3439 0.8382 0.9155
No log 1.0 86 1.0602 0.1972 1.0602 1.0296
No log 1.0233 88 0.9753 0.2726 0.9753 0.9876
No log 1.0465 90 0.7444 0.3294 0.7444 0.8628
No log 1.0698 92 0.8465 0.3603 0.8465 0.9201
No log 1.0930 94 0.9212 0.3365 0.9212 0.9598
No log 1.1163 96 0.8037 0.4180 0.8037 0.8965
No log 1.1395 98 0.7888 0.3789 0.7888 0.8882
No log 1.1628 100 0.9859 0.3216 0.9859 0.9929
No log 1.1860 102 0.9788 0.3847 0.9788 0.9894
No log 1.2093 104 0.8237 0.3610 0.8237 0.9076
No log 1.2326 106 0.7674 0.3902 0.7674 0.8760
No log 1.2558 108 0.8879 0.4134 0.8879 0.9423
No log 1.2791 110 0.9397 0.3233 0.9397 0.9694
No log 1.3023 112 0.8774 0.4275 0.8774 0.9367
No log 1.3256 114 0.7892 0.4698 0.7892 0.8884
No log 1.3488 116 0.7788 0.2202 0.7788 0.8825
No log 1.3721 118 0.7754 0.2769 0.7754 0.8806
No log 1.3953 120 0.8074 0.4226 0.8074 0.8986
No log 1.4186 122 0.9130 0.3600 0.9130 0.9555
No log 1.4419 124 1.0239 0.3431 1.0239 1.0119
No log 1.4651 126 0.9025 0.3781 0.9025 0.9500
No log 1.4884 128 0.8136 0.4360 0.8136 0.9020
No log 1.5116 130 0.8083 0.3859 0.8083 0.8991
No log 1.5349 132 0.7912 0.3927 0.7912 0.8895
No log 1.5581 134 0.8344 0.4315 0.8344 0.9135
No log 1.5814 136 0.8985 0.4051 0.8985 0.9479
No log 1.6047 138 0.9778 0.3359 0.9778 0.9888
No log 1.6279 140 0.8968 0.3825 0.8968 0.9470
No log 1.6512 142 0.7186 0.4393 0.7186 0.8477
No log 1.6744 144 0.6845 0.3667 0.6845 0.8273
No log 1.6977 146 0.7123 0.4260 0.7123 0.8440
No log 1.7209 148 0.7833 0.4369 0.7833 0.8851
No log 1.7442 150 0.7566 0.4335 0.7566 0.8698
No log 1.7674 152 0.7126 0.4432 0.7126 0.8441
No log 1.7907 154 0.7499 0.3836 0.7499 0.8660
No log 1.8140 156 0.7741 0.4315 0.7741 0.8798
No log 1.8372 158 0.8574 0.3683 0.8574 0.9259
No log 1.8605 160 0.8433 0.3602 0.8433 0.9183
No log 1.8837 162 0.8016 0.4224 0.8016 0.8953
No log 1.9070 164 0.7830 0.3959 0.7830 0.8849
No log 1.9302 166 0.8801 0.3671 0.8801 0.9381
No log 1.9535 168 1.0317 0.3418 1.0317 1.0157
No log 1.9767 170 0.9610 0.3137 0.9610 0.9803
No log 2.0 172 0.8130 0.3623 0.8130 0.9016
No log 2.0233 174 0.7866 0.3822 0.7866 0.8869
No log 2.0465 176 0.8951 0.3786 0.8951 0.9461
No log 2.0698 178 0.9685 0.3593 0.9685 0.9841
No log 2.0930 180 0.8809 0.3611 0.8809 0.9385
No log 2.1163 182 0.8347 0.4200 0.8347 0.9136
No log 2.1395 184 0.7502 0.4529 0.7502 0.8661
No log 2.1628 186 0.7124 0.4889 0.7124 0.8440
No log 2.1860 188 0.7031 0.4695 0.7031 0.8385
No log 2.2093 190 0.7032 0.5047 0.7032 0.8386
No log 2.2326 192 0.6859 0.4933 0.6859 0.8282
No log 2.2558 194 0.7098 0.4124 0.7098 0.8425
No log 2.2791 196 0.7704 0.4286 0.7704 0.8777
No log 2.3023 198 0.7405 0.4562 0.7405 0.8605
No log 2.3256 200 0.6859 0.5559 0.6859 0.8282
No log 2.3488 202 0.6871 0.4762 0.6871 0.8289
No log 2.3721 204 0.6760 0.4762 0.6760 0.8222
No log 2.3953 206 0.6590 0.5308 0.6590 0.8118
No log 2.4186 208 0.8160 0.4007 0.8160 0.9033
No log 2.4419 210 0.9062 0.3577 0.9062 0.9520
No log 2.4651 212 0.8082 0.4085 0.8082 0.8990
No log 2.4884 214 0.6304 0.5184 0.6304 0.7940
No log 2.5116 216 0.6416 0.3683 0.6416 0.8010
No log 2.5349 218 0.6851 0.4044 0.6851 0.8277
No log 2.5581 220 0.6358 0.3683 0.6358 0.7974
No log 2.5814 222 0.6470 0.4513 0.6470 0.8044
No log 2.6047 224 0.7900 0.3981 0.7900 0.8888
No log 2.6279 226 0.9666 0.3247 0.9666 0.9831
No log 2.6512 228 0.9342 0.3455 0.9342 0.9665
No log 2.6744 230 0.7604 0.3981 0.7604 0.8720
No log 2.6977 232 0.6746 0.5357 0.6746 0.8213
No log 2.7209 234 0.6935 0.3425 0.6935 0.8327
No log 2.7442 236 0.6745 0.3501 0.6745 0.8213
No log 2.7674 238 0.6700 0.4134 0.6700 0.8185
No log 2.7907 240 0.8492 0.3846 0.8492 0.9215
No log 2.8140 242 0.9017 0.3709 0.9017 0.9496
No log 2.8372 244 0.7849 0.3991 0.7849 0.8859
No log 2.8605 246 0.6793 0.4134 0.6793 0.8242
No log 2.8837 248 0.6466 0.2535 0.6466 0.8041
No log 2.9070 250 0.6438 0.2958 0.6438 0.8024
No log 2.9302 252 0.6485 0.3738 0.6485 0.8053
No log 2.9535 254 0.8083 0.4726 0.8083 0.8990
No log 2.9767 256 0.9710 0.3102 0.9710 0.9854
No log 3.0 258 0.9542 0.3325 0.9542 0.9768
No log 3.0233 260 0.8376 0.4457 0.8376 0.9152
No log 3.0465 262 0.7289 0.4451 0.7289 0.8538
No log 3.0698 264 0.6988 0.4292 0.6988 0.8359
No log 3.0930 266 0.7216 0.4247 0.7216 0.8495
No log 3.1163 268 0.7307 0.4247 0.7307 0.8548
No log 3.1395 270 0.7525 0.3996 0.7525 0.8675
No log 3.1628 272 0.7928 0.4093 0.7928 0.8904
No log 3.1860 274 0.8365 0.4057 0.8365 0.9146
No log 3.2093 276 0.8402 0.4199 0.8402 0.9166
No log 3.2326 278 0.7258 0.4020 0.7258 0.8519
No log 3.2558 280 0.6611 0.4013 0.6611 0.8131
No log 3.2791 282 0.6379 0.2591 0.6379 0.7987
No log 3.3023 284 0.6592 0.3763 0.6592 0.8119
No log 3.3256 286 0.7055 0.3942 0.7055 0.8400
No log 3.3488 288 0.7231 0.3688 0.7231 0.8504
No log 3.3721 290 0.7339 0.3891 0.7339 0.8567
No log 3.3953 292 0.7210 0.4106 0.7210 0.8491
No log 3.4186 294 0.7002 0.4750 0.7002 0.8368
No log 3.4419 296 0.7080 0.4557 0.7080 0.8415
No log 3.4651 298 0.6870 0.4707 0.6870 0.8289
No log 3.4884 300 0.6720 0.5315 0.6720 0.8198
No log 3.5116 302 0.7110 0.4424 0.7110 0.8432
No log 3.5349 304 0.8293 0.4462 0.8293 0.9106
No log 3.5581 306 0.8785 0.3580 0.8785 0.9373
No log 3.5814 308 0.8035 0.4332 0.8035 0.8964
No log 3.6047 310 0.7707 0.4362 0.7707 0.8779
No log 3.6279 312 0.7207 0.5013 0.7207 0.8489
No log 3.6512 314 0.6810 0.5300 0.6810 0.8252
No log 3.6744 316 0.6762 0.5497 0.6762 0.8223
No log 3.6977 318 0.6616 0.5497 0.6616 0.8134
No log 3.7209 320 0.6267 0.5008 0.6267 0.7916
No log 3.7442 322 0.6236 0.4264 0.6236 0.7897
No log 3.7674 324 0.6437 0.4616 0.6437 0.8023
No log 3.7907 326 0.6519 0.4778 0.6519 0.8074
No log 3.8140 328 0.6278 0.4167 0.6278 0.7924
No log 3.8372 330 0.6218 0.5067 0.6218 0.7885
No log 3.8605 332 0.6443 0.4744 0.6443 0.8027
No log 3.8837 334 0.6449 0.4761 0.6449 0.8031
No log 3.9070 336 0.6186 0.5332 0.6186 0.7865
No log 3.9302 338 0.6155 0.5413 0.6155 0.7845
No log 3.9535 340 0.6262 0.4848 0.6262 0.7913
No log 3.9767 342 0.6403 0.5190 0.6403 0.8002
No log 4.0 344 0.7364 0.4379 0.7364 0.8581
No log 4.0233 346 0.8120 0.4142 0.8120 0.9011
No log 4.0465 348 0.7349 0.4874 0.7349 0.8573
No log 4.0698 350 0.6662 0.4406 0.6662 0.8162
No log 4.0930 352 0.6382 0.4370 0.6382 0.7989
No log 4.1163 354 0.6355 0.4562 0.6355 0.7972
No log 4.1395 356 0.6213 0.4543 0.6213 0.7882
No log 4.1628 358 0.6267 0.4013 0.6267 0.7917
No log 4.1860 360 0.6550 0.4212 0.6550 0.8093
No log 4.2093 362 0.6450 0.3961 0.6450 0.8031
No log 4.2326 364 0.6646 0.3984 0.6646 0.8152
No log 4.2558 366 0.6822 0.3984 0.6822 0.8259
No log 4.2791 368 0.6717 0.4166 0.6717 0.8195
No log 4.3023 370 0.6783 0.4321 0.6783 0.8236
No log 4.3256 372 0.6549 0.3738 0.6549 0.8093
No log 4.3488 374 0.6327 0.3984 0.6327 0.7954
No log 4.3721 376 0.6191 0.3171 0.6191 0.7869
No log 4.3953 378 0.6218 0.3267 0.6218 0.7885
No log 4.4186 380 0.6331 0.3814 0.6331 0.7956
No log 4.4419 382 0.6411 0.4076 0.6411 0.8007
No log 4.4651 384 0.6962 0.4052 0.6962 0.8344
No log 4.4884 386 0.7590 0.4512 0.7590 0.8712
No log 4.5116 388 0.7259 0.4036 0.7259 0.8520
No log 4.5349 390 0.6640 0.3183 0.6640 0.8149
No log 4.5581 392 0.6640 0.2973 0.6640 0.8149
No log 4.5814 394 0.7422 0.3732 0.7422 0.8615
No log 4.6047 396 0.8395 0.5310 0.8395 0.9163
No log 4.6279 398 0.9228 0.4094 0.9228 0.9606
No log 4.6512 400 0.7961 0.5200 0.7961 0.8923
No log 4.6744 402 0.7068 0.2722 0.7068 0.8407
No log 4.6977 404 0.6499 0.2973 0.6499 0.8062
No log 4.7209 406 0.6300 0.2652 0.6300 0.7937
No log 4.7442 408 0.6179 0.2652 0.6179 0.7861
No log 4.7674 410 0.6421 0.3425 0.6421 0.8013
No log 4.7907 412 0.7603 0.4788 0.7603 0.8719
No log 4.8140 414 0.8069 0.4419 0.8069 0.8983
No log 4.8372 416 0.7171 0.4664 0.7171 0.8468
No log 4.8605 418 0.6127 0.2591 0.6127 0.7827
No log 4.8837 420 0.6670 0.4037 0.6670 0.8167
No log 4.9070 422 0.6864 0.4250 0.6864 0.8285
No log 4.9302 424 0.6178 0.3883 0.6178 0.7860
No log 4.9535 426 0.6316 0.3525 0.6316 0.7947
No log 4.9767 428 0.7197 0.4430 0.7197 0.8483
No log 5.0 430 0.7174 0.4430 0.7174 0.8470
No log 5.0233 432 0.6448 0.2690 0.6448 0.8030
No log 5.0465 434 0.6316 0.3391 0.6316 0.7948
No log 5.0698 436 0.6314 0.3485 0.6314 0.7946
No log 5.0930 438 0.6258 0.3239 0.6258 0.7911
No log 5.1163 440 0.6337 0.4020 0.6337 0.7961
No log 5.1395 442 0.6486 0.4247 0.6486 0.8054
No log 5.1628 444 0.7130 0.4491 0.7130 0.8444
No log 5.1860 446 0.7271 0.4089 0.7271 0.8527
No log 5.2093 448 0.6797 0.4491 0.6797 0.8245
No log 5.2326 450 0.6396 0.4134 0.6396 0.7997
No log 5.2558 452 0.6102 0.3811 0.6102 0.7812
No log 5.2791 454 0.6118 0.3883 0.6118 0.7822
No log 5.3023 456 0.6060 0.3835 0.6060 0.7784
No log 5.3256 458 0.6178 0.2872 0.6178 0.7860
No log 5.3488 460 0.6593 0.4190 0.6593 0.8120
No log 5.3721 462 0.6823 0.4167 0.6823 0.8260
No log 5.3953 464 0.6490 0.3518 0.6490 0.8056
No log 5.4186 466 0.5907 0.2890 0.5907 0.7685
No log 5.4419 468 0.5913 0.3385 0.5913 0.7689
No log 5.4651 470 0.5979 0.2884 0.5979 0.7732
No log 5.4884 472 0.6075 0.2715 0.6075 0.7794
No log 5.5116 474 0.6239 0.2929 0.6239 0.7899
No log 5.5349 476 0.6470 0.3122 0.6470 0.8044
No log 5.5581 478 0.6663 0.3792 0.6663 0.8162
No log 5.5814 480 0.6685 0.3884 0.6685 0.8176
No log 5.6047 482 0.6628 0.4901 0.6628 0.8141
No log 5.6279 484 0.6657 0.4413 0.6657 0.8159
No log 5.6512 486 0.6680 0.4413 0.6680 0.8173
No log 5.6744 488 0.6809 0.4413 0.6809 0.8251
No log 5.6977 490 0.7129 0.4302 0.7129 0.8444
No log 5.7209 492 0.7139 0.4302 0.7139 0.8449
No log 5.7442 494 0.6928 0.3574 0.6928 0.8323
No log 5.7674 496 0.6828 0.3325 0.6828 0.8263
No log 5.7907 498 0.6791 0.3133 0.6791 0.8241
0.3344 5.8140 500 0.6762 0.3887 0.6762 0.8223
0.3344 5.8372 502 0.7007 0.3425 0.7007 0.8371
0.3344 5.8605 504 0.7248 0.3569 0.7248 0.8514
0.3344 5.8837 506 0.6988 0.4001 0.6988 0.8359
0.3344 5.9070 508 0.7027 0.3138 0.7027 0.8383
0.3344 5.9302 510 0.7178 0.3102 0.7178 0.8472
0.3344 5.9535 512 0.7350 0.3953 0.7350 0.8573
0.3344 5.9767 514 0.7715 0.3971 0.7715 0.8784
0.3344 6.0 516 0.7721 0.3711 0.7721 0.8787
0.3344 6.0233 518 0.7333 0.3701 0.7333 0.8564
0.3344 6.0465 520 0.7055 0.3598 0.7055 0.8400
0.3344 6.0698 522 0.6937 0.2747 0.6937 0.8329
0.3344 6.0930 524 0.6960 0.3618 0.6960 0.8343
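A log this long is easiest to scan programmatically. The sketch below picks the checkpoint with the best validation Qwk from a handful of rows excerpted from the table above (epoch, step, validation loss, Qwk); note that the final reported checkpoint (Qwk 0.3618) is not the best one seen during training.

```python
# A small excerpt of the training log above, as (epoch, step, val_loss, qwk).
rows = [
    (0.7209, 62, 0.7139, 0.5562),
    (2.3256, 200, 0.6859, 0.5559),
    (5.4186, 466, 0.5907, 0.2890),
    (6.0930, 524, 0.6960, 0.3618),  # final checkpoint reported above
]

# Select the row with the highest validation Qwk.
best = max(rows, key=lambda r: r[3])
```

On this excerpt, `best` is the step-62 checkpoint (Qwk 0.5562), suggesting that early stopping or checkpoint selection by Qwk could have been worthwhile here.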

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32
Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task7_organization

  • Finetuned from aubmindlab/bert-base-arabertv02