ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5282
  • QWK: 0.5886
  • MSE: 0.5282
  • RMSE: 0.7268

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
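The linear scheduler listed above decays the learning rate from its initial value down to zero over the course of training (after any warmup). A minimal sketch of that schedule, using the 2e-05 base rate from the list (`linear_lr` is a hypothetical helper for illustration, not the Trainer's internal function):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up to base_lr over warmup_steps,
    then decay linearly to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# With no warmup, the rate starts at 2e-05 and reaches 0 at the final step:
print(linear_lr(0, 100), linear_lr(50, 100), linear_lr(100, 100))
```

With 100 epochs and small per-step decay, the effective learning rate stays near 2e-05 early on and shrinks steadily, which matches the gradual loss improvement visible in the results table below.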

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0233 2 3.9954 -0.0192 3.9954 1.9988
No log 0.0465 4 2.3571 0.0074 2.3571 1.5353
No log 0.0698 6 2.0494 0.0243 2.0494 1.4316
No log 0.0930 8 1.1989 0.2268 1.1989 1.0949
No log 0.1163 10 1.1104 0.1685 1.1104 1.0538
No log 0.1395 12 1.7830 -0.0334 1.7830 1.3353
No log 0.1628 14 1.9000 -0.0046 1.9000 1.3784
No log 0.1860 16 1.4093 -0.0328 1.4093 1.1871
No log 0.2093 18 1.1304 0.1794 1.1304 1.0632
No log 0.2326 20 1.1489 0.0888 1.1489 1.0719
No log 0.2558 22 1.0809 0.1203 1.0809 1.0397
No log 0.2791 24 1.0118 0.2967 1.0118 1.0059
No log 0.3023 26 1.0504 0.2734 1.0504 1.0249
No log 0.3256 28 1.1071 0.2416 1.1071 1.0522
No log 0.3488 30 1.1100 0.1645 1.1100 1.0536
No log 0.3721 32 1.0506 0.2811 1.0506 1.0250
No log 0.3953 34 1.0271 0.1881 1.0271 1.0134
No log 0.4186 36 1.0735 0.1625 1.0735 1.0361
No log 0.4419 38 1.0572 0.2559 1.0572 1.0282
No log 0.4651 40 1.0633 0.2678 1.0633 1.0312
No log 0.4884 42 0.9608 0.3272 0.9608 0.9802
No log 0.5116 44 0.9996 0.2049 0.9996 0.9998
No log 0.5349 46 1.2506 0.0613 1.2506 1.1183
No log 0.5581 48 1.2418 0.0996 1.2418 1.1144
No log 0.5814 50 1.0599 0.2513 1.0599 1.0295
No log 0.6047 52 0.8177 0.3631 0.8177 0.9043
No log 0.6279 54 0.9985 0.2262 0.9985 0.9993
No log 0.6512 56 1.5629 -0.3216 1.5629 1.2502
No log 0.6744 58 1.9715 -0.2661 1.9715 1.4041
No log 0.6977 60 1.8082 -0.0990 1.8082 1.3447
No log 0.7209 62 1.4418 -0.0461 1.4418 1.2007
No log 0.7442 64 1.2564 0.0909 1.2564 1.1209
No log 0.7674 66 1.0557 0.2196 1.0557 1.0275
No log 0.7907 68 0.9233 0.3935 0.9233 0.9609
No log 0.8140 70 0.9498 0.2367 0.9498 0.9746
No log 0.8372 72 0.9389 0.3541 0.9389 0.9690
No log 0.8605 74 0.8065 0.4551 0.8065 0.8981
No log 0.8837 76 0.8142 0.4867 0.8142 0.9024
No log 0.9070 78 0.8960 0.4696 0.8960 0.9466
No log 0.9302 80 0.8834 0.4327 0.8834 0.9399
No log 0.9535 82 0.8670 0.3977 0.8670 0.9311
No log 0.9767 84 0.8810 0.3840 0.8810 0.9386
No log 1.0 86 0.9470 0.2545 0.9470 0.9732
No log 1.0233 88 1.1370 0.3001 1.1370 1.0663
No log 1.0465 90 1.2188 0.2389 1.2188 1.1040
No log 1.0698 92 1.1741 0.2165 1.1741 1.0836
No log 1.0930 94 1.0416 0.2766 1.0416 1.0206
No log 1.1163 96 1.1177 0.1640 1.1177 1.0572
No log 1.1395 98 1.1677 0.1909 1.1677 1.0806
No log 1.1628 100 1.0326 0.2744 1.0326 1.0162
No log 1.1860 102 0.9821 0.3902 0.9821 0.9910
No log 1.2093 104 0.9818 0.4539 0.9818 0.9908
No log 1.2326 106 0.9229 0.4539 0.9229 0.9607
No log 1.2558 108 0.8785 0.5028 0.8785 0.9373
No log 1.2791 110 0.8727 0.4542 0.8727 0.9342
No log 1.3023 112 0.9036 0.4061 0.9036 0.9506
No log 1.3256 114 0.9469 0.3830 0.9469 0.9731
No log 1.3488 116 0.9435 0.3792 0.9435 0.9713
No log 1.3721 118 0.8511 0.4400 0.8511 0.9226
No log 1.3953 120 0.7138 0.4731 0.7138 0.8449
No log 1.4186 122 0.6691 0.4642 0.6691 0.8180
No log 1.4419 124 0.6571 0.5627 0.6571 0.8106
No log 1.4651 126 0.7293 0.5178 0.7293 0.8540
No log 1.4884 128 0.8413 0.5789 0.8413 0.9172
No log 1.5116 130 1.0351 0.3766 1.0351 1.0174
No log 1.5349 132 0.9967 0.4420 0.9967 0.9984
No log 1.5581 134 0.7893 0.5451 0.7893 0.8884
No log 1.5814 136 0.6179 0.5996 0.6179 0.7861
No log 1.6047 138 0.5922 0.6008 0.5922 0.7695
No log 1.6279 140 0.5967 0.5877 0.5967 0.7725
No log 1.6512 142 0.6758 0.5651 0.6758 0.8221
No log 1.6744 144 0.7097 0.6079 0.7097 0.8424
No log 1.6977 146 0.6936 0.5875 0.6936 0.8328
No log 1.7209 148 0.6567 0.5673 0.6567 0.8104
No log 1.7442 150 0.6053 0.6210 0.6053 0.7780
No log 1.7674 152 0.6092 0.6176 0.6092 0.7805
No log 1.7907 154 0.6922 0.5242 0.6922 0.8320
No log 1.8140 156 0.7303 0.5242 0.7303 0.8546
No log 1.8372 158 0.6969 0.5630 0.6969 0.8348
No log 1.8605 160 0.6362 0.6176 0.6362 0.7976
No log 1.8837 162 0.6284 0.5771 0.6284 0.7927
No log 1.9070 164 0.6555 0.5397 0.6555 0.8096
No log 1.9302 166 0.7178 0.5319 0.7178 0.8473
No log 1.9535 168 0.7575 0.4552 0.7575 0.8704
No log 1.9767 170 0.7539 0.4552 0.7539 0.8683
No log 2.0 172 0.7159 0.5065 0.7159 0.8461
No log 2.0233 174 0.6264 0.5650 0.6264 0.7915
No log 2.0465 176 0.6021 0.6044 0.6021 0.7760
No log 2.0698 178 0.5852 0.6622 0.5852 0.7650
No log 2.0930 180 0.6015 0.5823 0.6015 0.7755
No log 2.1163 182 0.7329 0.5253 0.7329 0.8561
No log 2.1395 184 0.8857 0.4542 0.8857 0.9411
No log 2.1628 186 0.8503 0.4169 0.8503 0.9221
No log 2.1860 188 0.7008 0.4965 0.7008 0.8371
No log 2.2093 190 0.6389 0.5677 0.6389 0.7993
No log 2.2326 192 0.6391 0.6040 0.6391 0.7994
No log 2.2558 194 0.6979 0.5799 0.6979 0.8354
No log 2.2791 196 0.7894 0.5020 0.7894 0.8885
No log 2.3023 198 0.7930 0.4681 0.7930 0.8905
No log 2.3256 200 0.7744 0.4563 0.7744 0.8800
No log 2.3488 202 0.6917 0.5572 0.6917 0.8317
No log 2.3721 204 0.6296 0.5775 0.6296 0.7935
No log 2.3953 206 0.6092 0.6021 0.6092 0.7805
No log 2.4186 208 0.5820 0.6295 0.5820 0.7629
No log 2.4419 210 0.5959 0.5675 0.5959 0.7719
No log 2.4651 212 0.5430 0.6500 0.5430 0.7369
No log 2.4884 214 0.5340 0.6623 0.5340 0.7307
No log 2.5116 216 0.5750 0.6395 0.5750 0.7583
No log 2.5349 218 0.7026 0.5593 0.7026 0.8382
No log 2.5581 220 0.8392 0.5908 0.8392 0.9161
No log 2.5814 222 0.8512 0.5465 0.8512 0.9226
No log 2.6047 224 0.7201 0.5458 0.7201 0.8486
No log 2.6279 226 0.6025 0.6377 0.6025 0.7762
No log 2.6512 228 0.5890 0.5898 0.5890 0.7675
No log 2.6744 230 0.6202 0.5331 0.6202 0.7875
No log 2.6977 232 0.6304 0.4781 0.6304 0.7940
No log 2.7209 234 0.6822 0.5601 0.6822 0.8259
No log 2.7442 236 0.8282 0.4879 0.8282 0.9100
No log 2.7674 238 0.9095 0.4435 0.9095 0.9537
No log 2.7907 240 0.8967 0.4115 0.8967 0.9469
No log 2.8140 242 0.7970 0.4912 0.7970 0.8927
No log 2.8372 244 0.7374 0.4478 0.7374 0.8587
No log 2.8605 246 0.7116 0.5088 0.7116 0.8436
No log 2.8837 248 0.7710 0.5020 0.7710 0.8780
No log 2.9070 250 0.9390 0.4454 0.9390 0.9690
No log 2.9302 252 1.0134 0.4084 1.0134 1.0067
No log 2.9535 254 0.9458 0.4181 0.9458 0.9725
No log 2.9767 256 0.8327 0.4417 0.8327 0.9125
No log 3.0 258 0.7695 0.4812 0.7695 0.8772
No log 3.0233 260 0.7428 0.3902 0.7428 0.8618
No log 3.0465 262 0.7734 0.4565 0.7734 0.8795
No log 3.0698 264 0.8461 0.4898 0.8461 0.9198
No log 3.0930 266 0.8402 0.5252 0.8402 0.9166
No log 3.1163 268 0.7369 0.4937 0.7369 0.8584
No log 3.1395 270 0.6794 0.4248 0.6794 0.8243
No log 3.1628 272 0.6703 0.4391 0.6703 0.8187
No log 3.1860 274 0.6573 0.5260 0.6573 0.8107
No log 3.2093 276 0.7550 0.5356 0.7550 0.8689
No log 3.2326 278 0.9249 0.4882 0.9249 0.9617
No log 3.2558 280 0.9108 0.4777 0.9108 0.9544
No log 3.2791 282 0.7695 0.5242 0.7695 0.8772
No log 3.3023 284 0.6759 0.5718 0.6759 0.8221
No log 3.3256 286 0.6578 0.6167 0.6578 0.8110
No log 3.3488 288 0.6631 0.6053 0.6631 0.8143
No log 3.3721 290 0.6443 0.6112 0.6443 0.8027
No log 3.3953 292 0.6632 0.5992 0.6632 0.8144
No log 3.4186 294 0.6653 0.5477 0.6653 0.8156
No log 3.4419 296 0.6623 0.5495 0.6623 0.8138
No log 3.4651 298 0.6727 0.5003 0.6727 0.8202
No log 3.4884 300 0.6975 0.5123 0.6975 0.8352
No log 3.5116 302 0.7237 0.5676 0.7237 0.8507
No log 3.5349 304 0.7313 0.5397 0.7313 0.8551
No log 3.5581 306 0.7561 0.5356 0.7561 0.8695
No log 3.5814 308 0.7857 0.5636 0.7857 0.8864
No log 3.6047 310 0.7606 0.5033 0.7606 0.8721
No log 3.6279 312 0.6991 0.5305 0.6991 0.8361
No log 3.6512 314 0.6789 0.5210 0.6789 0.8240
No log 3.6744 316 0.7028 0.4940 0.7028 0.8384
No log 3.6977 318 0.7519 0.5041 0.7519 0.8671
No log 3.7209 320 0.7627 0.5279 0.7627 0.8733
No log 3.7442 322 0.7323 0.5292 0.7323 0.8558
No log 3.7674 324 0.7035 0.4579 0.7035 0.8387
No log 3.7907 326 0.7188 0.4318 0.7188 0.8478
No log 3.8140 328 0.7517 0.4444 0.7517 0.8670
No log 3.8372 330 0.7746 0.4681 0.7746 0.8801
No log 3.8605 332 0.8114 0.4565 0.8114 0.9008
No log 3.8837 334 0.8366 0.4976 0.8366 0.9146
No log 3.9070 336 0.7866 0.5106 0.7866 0.8869
No log 3.9302 338 0.7231 0.5358 0.7231 0.8504
No log 3.9535 340 0.6987 0.5387 0.6987 0.8359
No log 3.9767 342 0.6714 0.4697 0.6714 0.8194
No log 4.0 344 0.6375 0.5113 0.6375 0.7985
No log 4.0233 346 0.6078 0.5687 0.6078 0.7796
No log 4.0465 348 0.6020 0.6150 0.6020 0.7759
No log 4.0698 350 0.6184 0.5477 0.6184 0.7864
No log 4.0930 352 0.7116 0.5579 0.7116 0.8436
No log 4.1163 354 0.8230 0.5365 0.8230 0.9072
No log 4.1395 356 0.7903 0.5272 0.7903 0.8890
No log 4.1628 358 0.6752 0.5766 0.6752 0.8217
No log 4.1860 360 0.5815 0.6360 0.5815 0.7626
No log 4.2093 362 0.5624 0.5989 0.5624 0.7499
No log 4.2326 364 0.5665 0.6020 0.5665 0.7526
No log 4.2558 366 0.5983 0.5992 0.5983 0.7735
No log 4.2791 368 0.6771 0.5344 0.6771 0.8229
No log 4.3023 370 0.7206 0.4889 0.7206 0.8489
No log 4.3256 372 0.6899 0.5675 0.6899 0.8306
No log 4.3488 374 0.6587 0.5675 0.6587 0.8116
No log 4.3721 376 0.6778 0.5655 0.6778 0.8233
No log 4.3953 378 0.7116 0.5510 0.7116 0.8435
No log 4.4186 380 0.7603 0.5695 0.7603 0.8719
No log 4.4419 382 0.7978 0.5082 0.7978 0.8932
No log 4.4651 384 0.8320 0.5082 0.8320 0.9122
No log 4.4884 386 0.7831 0.4987 0.7831 0.8849
No log 4.5116 388 0.6684 0.5561 0.6684 0.8176
No log 4.5349 390 0.6182 0.5981 0.6182 0.7862
No log 4.5581 392 0.5929 0.6276 0.5929 0.7700
No log 4.5814 394 0.6141 0.6413 0.6141 0.7837
No log 4.6047 396 0.6386 0.5953 0.6386 0.7991
No log 4.6279 398 0.6582 0.5953 0.6582 0.8113
No log 4.6512 400 0.6961 0.5372 0.6961 0.8343
No log 4.6744 402 0.7424 0.4681 0.7424 0.8617
No log 4.6977 404 0.7529 0.4681 0.7529 0.8677
No log 4.7209 406 0.7308 0.3782 0.7308 0.8549
No log 4.7442 408 0.7223 0.4593 0.7223 0.8499
No log 4.7674 410 0.7062 0.4882 0.7062 0.8404
No log 4.7907 412 0.6914 0.5127 0.6914 0.8315
No log 4.8140 414 0.6860 0.6232 0.6860 0.8283
No log 4.8372 416 0.6946 0.5529 0.6946 0.8334
No log 4.8605 418 0.7004 0.5498 0.7004 0.8369
No log 4.8837 420 0.7168 0.5330 0.7168 0.8466
No log 4.9070 422 0.7367 0.5677 0.7367 0.8583
No log 4.9302 424 0.7094 0.5076 0.7094 0.8422
No log 4.9535 426 0.6876 0.5025 0.6876 0.8292
No log 4.9767 428 0.6848 0.4781 0.6848 0.8275
No log 5.0 430 0.6955 0.4642 0.6955 0.8339
No log 5.0233 432 0.7175 0.4622 0.7175 0.8470
No log 5.0465 434 0.7838 0.2873 0.7838 0.8853
No log 5.0698 436 0.8276 0.3021 0.8276 0.9097
No log 5.0930 438 0.8115 0.3169 0.8115 0.9008
No log 5.1163 440 0.7606 0.4023 0.7606 0.8721
No log 5.1395 442 0.7175 0.4301 0.7175 0.8470
No log 5.1628 444 0.6671 0.5244 0.6671 0.8168
No log 5.1860 446 0.6449 0.5011 0.6449 0.8031
No log 5.2093 448 0.6545 0.5244 0.6545 0.8090
No log 5.2326 450 0.7120 0.5419 0.7120 0.8438
No log 5.2558 452 0.7468 0.5278 0.7468 0.8642
No log 5.2791 454 0.7232 0.5253 0.7232 0.8504
No log 5.3023 456 0.6866 0.5292 0.6866 0.8286
No log 5.3256 458 0.6599 0.4949 0.6599 0.8123
No log 5.3488 460 0.6696 0.4815 0.6696 0.8183
No log 5.3721 462 0.6559 0.4949 0.6559 0.8099
No log 5.3953 464 0.6695 0.4697 0.6695 0.8182
No log 5.4186 466 0.6628 0.5229 0.6628 0.8141
No log 5.4419 468 0.6167 0.4480 0.6167 0.7853
No log 5.4651 470 0.5946 0.5412 0.5946 0.7711
No log 5.4884 472 0.5973 0.4871 0.5973 0.7728
No log 5.5116 474 0.6485 0.5686 0.6485 0.8053
No log 5.5349 476 0.7052 0.5124 0.7052 0.8398
No log 5.5581 478 0.6892 0.5124 0.6892 0.8302
No log 5.5814 480 0.6093 0.5937 0.6093 0.7805
No log 5.6047 482 0.5644 0.6232 0.5644 0.7513
No log 5.6279 484 0.5390 0.6445 0.5390 0.7342
No log 5.6512 486 0.5286 0.6648 0.5286 0.7270
No log 5.6744 488 0.5244 0.6566 0.5244 0.7241
No log 5.6977 490 0.5198 0.6736 0.5198 0.7210
No log 5.7209 492 0.5111 0.6745 0.5111 0.7149
No log 5.7442 494 0.5080 0.6681 0.5080 0.7127
No log 5.7674 496 0.5028 0.6796 0.5028 0.7091
No log 5.7907 498 0.5033 0.6978 0.5033 0.7095
0.393 5.8140 500 0.5127 0.6978 0.5127 0.7160
0.393 5.8372 502 0.5514 0.6872 0.5514 0.7425
0.393 5.8605 504 0.5643 0.6973 0.5643 0.7512
0.393 5.8837 506 0.5448 0.6973 0.5448 0.7381
0.393 5.9070 508 0.5330 0.6488 0.5330 0.7301
0.393 5.9302 510 0.5282 0.5886 0.5282 0.7268

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32, Safetensors)