ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8862
  • Qwk: 0.2817
  • Mse: 0.8862
  • Rmse: 0.9414
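
The Qwk, Mse, and Rmse figures above can be reproduced from predictions and gold labels. The sketch below is a minimal, dependency-free illustration (sample labels are invented for demonstration; they are not from this model's evaluation set) of quadratic-weighted kappa, mean squared error, and root mean squared error as typically computed for ordinal essay-scoring tasks:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic-weighted Cohen's kappa (the Qwk metric above)."""
    n = len(y_true)
    # Observed confusion matrix: rows = gold label, columns = prediction.
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    hist_t = [sum(row) for row in O]                                  # gold label counts
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]  # prediction counts
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement weight
            e = hist_t[i] * hist_p[j] / n             # expected count under independence
            num += w * O[i][j]
            den += w * e
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy example with 3 score classes (illustrative data only).
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 2, 2, 1, 1]
m = mse(y_true, y_pred)
print(round(m, 4), round(math.sqrt(m), 4))                     # MSE and RMSE
print(round(quadratic_weighted_kappa(y_true, y_pred, 3), 4))   # Qwk
```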

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
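
With lr_scheduler_type: linear, the learning rate ramps up over any warmup steps and then decays linearly to zero over the remaining training steps. The sketch below illustrates that schedule; the total of 4700 steps is inferred from the results table (epoch 2.0 falls at step 94, i.e. 47 steps per epoch × 100 epochs), and warmup is assumed to be 0 since no warmup setting is listed:

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear LR schedule: optional warmup from 0 to base_lr, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

total = 100 * 47  # num_epochs x steps_per_epoch (47 inferred from the table)
print(linear_lr(0, total))      # start of training: full base_lr
print(linear_lr(2350, total))   # halfway: half the base_lr
print(linear_lr(total, total))  # end of training: 0
```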

Training results

The training loss column is only populated every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0426 2 2.6222 -0.0262 2.6222 1.6193
No log 0.0851 4 1.3154 0.1819 1.3154 1.1469
No log 0.1277 6 1.0685 -0.1866 1.0685 1.0337
No log 0.1702 8 1.0982 -0.0895 1.0982 1.0479
No log 0.2128 10 1.0233 0.0952 1.0233 1.0116
No log 0.2553 12 0.7663 0.0200 0.7663 0.8754
No log 0.2979 14 0.9527 0.2672 0.9527 0.9761
No log 0.3404 16 1.0356 0.2164 1.0356 1.0176
No log 0.3830 18 0.8514 0.2435 0.8514 0.9227
No log 0.4255 20 0.6919 0.0359 0.6919 0.8318
No log 0.4681 22 0.7646 0.3032 0.7646 0.8744
No log 0.5106 24 0.8280 0.2211 0.8280 0.9099
No log 0.5532 26 0.8271 0.2562 0.8271 0.9095
No log 0.5957 28 0.7801 0.2410 0.7801 0.8832
No log 0.6383 30 0.7039 0.3238 0.7039 0.8390
No log 0.6809 32 0.6543 0.1604 0.6543 0.8089
No log 0.7234 34 0.7966 0.3218 0.7966 0.8925
No log 0.7660 36 0.8572 0.3270 0.8572 0.9258
No log 0.8085 38 0.6704 0.3302 0.6704 0.8188
No log 0.8511 40 0.6536 0.4217 0.6536 0.8084
No log 0.8936 42 0.7815 0.2732 0.7815 0.8840
No log 0.9362 44 0.9513 0.2335 0.9513 0.9754
No log 0.9787 46 1.1418 0.1737 1.1418 1.0686
No log 1.0213 48 1.2924 0.1224 1.2924 1.1368
No log 1.0638 50 1.1384 0.1971 1.1384 1.0670
No log 1.1064 52 0.8336 0.3492 0.8336 0.9130
No log 1.1489 54 0.6983 0.0481 0.6983 0.8356
No log 1.1915 56 0.7184 0.1550 0.7184 0.8476
No log 1.2340 58 0.8314 0.2843 0.8314 0.9118
No log 1.2766 60 0.7542 0.2754 0.7542 0.8684
No log 1.3191 62 0.6848 0.1922 0.6848 0.8275
No log 1.3617 64 0.7760 0.2132 0.7760 0.8809
No log 1.4043 66 0.9102 0.2508 0.9102 0.9540
No log 1.4468 68 0.8952 0.3398 0.8952 0.9461
No log 1.4894 70 0.7623 0.2073 0.7623 0.8731
No log 1.5319 72 0.6984 0.1007 0.6984 0.8357
No log 1.5745 74 0.7121 0.2652 0.7121 0.8438
No log 1.6170 76 0.7044 0.2883 0.7044 0.8393
No log 1.6596 78 0.7279 0.2817 0.7279 0.8532
No log 1.7021 80 0.7677 0.2754 0.7677 0.8762
No log 1.7447 82 0.8226 0.2463 0.8226 0.9070
No log 1.7872 84 0.8590 0.2463 0.8590 0.9268
No log 1.8298 86 0.7444 0.3099 0.7444 0.8628
No log 1.8723 88 0.6599 0.2718 0.6599 0.8124
No log 1.9149 90 0.6606 0.2206 0.6606 0.8128
No log 1.9574 92 0.6628 0.2471 0.6628 0.8141
No log 2.0 94 0.6555 0.3050 0.6555 0.8096
No log 2.0426 96 0.7214 0.3099 0.7214 0.8494
No log 2.0851 98 0.7781 0.3746 0.7781 0.8821
No log 2.1277 100 0.7686 0.3819 0.7686 0.8767
No log 2.1702 102 0.7395 0.4089 0.7395 0.8599
No log 2.2128 104 0.7558 0.4404 0.7558 0.8694
No log 2.2553 106 0.7491 0.4329 0.7491 0.8655
No log 2.2979 108 0.7111 0.4491 0.7111 0.8433
No log 2.3404 110 0.6802 0.4330 0.6802 0.8247
No log 2.3830 112 0.6555 0.3452 0.6555 0.8096
No log 2.4255 114 0.6724 0.3990 0.6724 0.8200
No log 2.4681 116 0.7190 0.4234 0.7190 0.8479
No log 2.5106 118 0.8150 0.3344 0.8150 0.9028
No log 2.5532 120 0.7310 0.3859 0.7310 0.8550
No log 2.5957 122 0.8426 0.3044 0.8426 0.9179
No log 2.6383 124 0.8683 0.3076 0.8683 0.9318
No log 2.6809 126 0.7427 0.3950 0.7427 0.8618
No log 2.7234 128 0.7672 0.3996 0.7672 0.8759
No log 2.7660 130 0.7490 0.3918 0.7490 0.8655
No log 2.8085 132 0.6983 0.2847 0.6983 0.8356
No log 2.8511 134 0.6723 0.3088 0.6723 0.8199
No log 2.8936 136 0.6360 0.4463 0.6360 0.7975
No log 2.9362 138 0.6321 0.5143 0.6321 0.7950
No log 2.9787 140 0.6539 0.3887 0.6539 0.8087
No log 3.0213 142 0.7679 0.4389 0.7679 0.8763
No log 3.0638 144 0.7465 0.4662 0.7465 0.8640
No log 3.1064 146 0.6876 0.4619 0.6876 0.8292
No log 3.1489 148 0.7224 0.4544 0.7224 0.8499
No log 3.1915 150 0.7095 0.4678 0.7095 0.8423
No log 3.2340 152 0.7913 0.4587 0.7913 0.8895
No log 3.2766 154 0.9747 0.4186 0.9747 0.9873
No log 3.3191 156 0.9198 0.4528 0.9198 0.9591
No log 3.3617 158 0.7933 0.4176 0.7933 0.8907
No log 3.4043 160 0.7802 0.4126 0.7802 0.8833
No log 3.4468 162 0.8298 0.4176 0.8298 0.9109
No log 3.4894 164 0.9230 0.4539 0.9230 0.9607
No log 3.5319 166 0.8711 0.4334 0.8711 0.9333
No log 3.5745 168 0.8128 0.3746 0.8128 0.9016
No log 3.6170 170 0.7696 0.4073 0.7696 0.8772
No log 3.6596 172 0.7599 0.3937 0.7599 0.8717
No log 3.7021 174 0.8185 0.3475 0.8185 0.9047
No log 3.7447 176 1.0229 0.2482 1.0229 1.0114
No log 3.7872 178 0.9959 0.2529 0.9959 0.9979
No log 3.8298 180 0.8892 0.4522 0.8892 0.9430
No log 3.8723 182 0.8370 0.4239 0.8370 0.9149
No log 3.9149 184 0.7653 0.4562 0.7653 0.8748
No log 3.9574 186 0.7677 0.4731 0.7677 0.8762
No log 4.0 188 0.7440 0.4488 0.7440 0.8626
No log 4.0426 190 0.7453 0.4470 0.7453 0.8633
No log 4.0851 192 0.7443 0.4397 0.7443 0.8627
No log 4.1277 194 0.7288 0.4397 0.7288 0.8537
No log 4.1702 196 0.7058 0.4397 0.7058 0.8401
No log 4.2128 198 0.6992 0.3239 0.6992 0.8362
No log 4.2553 200 0.7151 0.2800 0.7151 0.8456
No log 4.2979 202 0.6891 0.3679 0.6891 0.8301
No log 4.3404 204 0.6892 0.3937 0.6892 0.8302
No log 4.3830 206 0.8298 0.4369 0.8298 0.9110
No log 4.4255 208 0.8540 0.3826 0.8540 0.9241
No log 4.4681 210 0.7481 0.4434 0.7481 0.8649
No log 4.5106 212 0.6958 0.4016 0.6958 0.8341
No log 4.5532 214 0.6893 0.3791 0.6893 0.8302
No log 4.5957 216 0.6853 0.3791 0.6853 0.8278
No log 4.6383 218 0.6725 0.4065 0.6725 0.8200
No log 4.6809 220 0.6689 0.4505 0.6689 0.8179
No log 4.7234 222 0.6613 0.3906 0.6613 0.8132
No log 4.7660 224 0.6523 0.4065 0.6523 0.8077
No log 4.8085 226 0.7390 0.4059 0.7390 0.8596
No log 4.8511 228 0.8756 0.3287 0.8756 0.9357
No log 4.8936 230 0.9953 0.3395 0.9953 0.9976
No log 4.9362 232 0.9164 0.3051 0.9164 0.9573
No log 4.9787 234 0.7479 0.3287 0.7479 0.8648
No log 5.0213 236 0.6710 0.2712 0.6710 0.8192
No log 5.0638 238 0.6782 0.2085 0.6782 0.8235
No log 5.1064 240 0.6817 0.2072 0.6817 0.8256
No log 5.1489 242 0.7765 0.3287 0.7765 0.8812
No log 5.1915 244 0.9407 0.2964 0.9407 0.9699
No log 5.2340 246 0.9154 0.3731 0.9154 0.9567
No log 5.2766 248 0.7503 0.3355 0.7503 0.8662
No log 5.3191 250 0.6969 0.2398 0.6969 0.8348
No log 5.3617 252 0.7332 0.3291 0.7332 0.8563
No log 5.4043 254 0.7120 0.2944 0.7120 0.8438
No log 5.4468 256 0.6896 0.4051 0.6896 0.8304
No log 5.4894 258 0.6857 0.5107 0.6857 0.8280
No log 5.5319 260 0.6864 0.4757 0.6864 0.8285
No log 5.5745 262 0.7029 0.4149 0.7029 0.8384
No log 5.6170 264 0.7433 0.3959 0.7433 0.8622
No log 5.6596 266 0.7516 0.4140 0.7516 0.8669
No log 5.7021 268 0.7409 0.4140 0.7409 0.8607
No log 5.7447 270 0.6866 0.4422 0.6866 0.8286
No log 5.7872 272 0.6875 0.4986 0.6875 0.8292
No log 5.8298 274 0.7178 0.4830 0.7178 0.8472
No log 5.8723 276 0.6990 0.5158 0.6990 0.8361
No log 5.9149 278 0.7425 0.4140 0.7425 0.8617
No log 5.9574 280 0.7338 0.3994 0.7338 0.8566
No log 6.0 282 0.6944 0.3933 0.6944 0.8333
No log 6.0426 284 0.8407 0.4369 0.8407 0.9169
No log 6.0851 286 1.0092 0.3029 1.0092 1.0046
No log 6.1277 288 0.9282 0.2651 0.9282 0.9634
No log 6.1702 290 0.7517 0.3060 0.7517 0.8670
No log 6.2128 292 0.6759 0.2842 0.6759 0.8222
No log 6.2553 294 0.6896 0.2578 0.6896 0.8304
No log 6.2979 296 0.6781 0.3435 0.6781 0.8235
No log 6.3404 298 0.6776 0.3239 0.6776 0.8232
No log 6.3830 300 0.6814 0.3762 0.6814 0.8255
No log 6.4255 302 0.6844 0.3408 0.6844 0.8273
No log 6.4681 304 0.7028 0.3425 0.7028 0.8383
No log 6.5106 306 0.7041 0.3713 0.7041 0.8391
No log 6.5532 308 0.6473 0.4222 0.6473 0.8046
No log 6.5957 310 0.6324 0.4536 0.6324 0.7952
No log 6.6383 312 0.6302 0.4991 0.6302 0.7938
No log 6.6809 314 0.6182 0.4137 0.6182 0.7863
No log 6.7234 316 0.6185 0.4116 0.6185 0.7865
No log 6.7660 318 0.6419 0.4222 0.6419 0.8012
No log 6.8085 320 0.6264 0.4168 0.6264 0.7914
No log 6.8511 322 0.6202 0.4771 0.6202 0.7875
No log 6.8936 324 0.6594 0.5868 0.6594 0.8120
No log 6.9362 326 0.6586 0.6053 0.6586 0.8115
No log 6.9787 328 0.6499 0.5649 0.6499 0.8061
No log 7.0213 330 0.6368 0.5336 0.6368 0.7980
No log 7.0638 332 0.6708 0.4377 0.6708 0.8190
No log 7.1064 334 0.7095 0.3842 0.7095 0.8423
No log 7.1489 336 0.6760 0.4155 0.6760 0.8222
No log 7.1915 338 0.6251 0.5479 0.6251 0.7906
No log 7.2340 340 0.6482 0.5592 0.6482 0.8051
No log 7.2766 342 0.6963 0.5030 0.6963 0.8344
No log 7.3191 344 0.7262 0.5046 0.7262 0.8522
No log 7.3617 346 0.6846 0.5220 0.6846 0.8274
No log 7.4043 348 0.6167 0.5649 0.6167 0.7853
No log 7.4468 350 0.6537 0.4533 0.6537 0.8085
No log 7.4894 352 0.6843 0.4532 0.6843 0.8272
No log 7.5319 354 0.6439 0.4505 0.6439 0.8025
No log 7.5745 356 0.6188 0.5217 0.6188 0.7867
No log 7.6170 358 0.6249 0.5596 0.6249 0.7905
No log 7.6596 360 0.6180 0.4991 0.6180 0.7861
No log 7.7021 362 0.6447 0.3788 0.6447 0.8029
No log 7.7447 364 0.7070 0.2995 0.7070 0.8408
No log 7.7872 366 0.7063 0.3261 0.7063 0.8404
No log 7.8298 368 0.6579 0.3498 0.6579 0.8111
No log 7.8723 370 0.6362 0.3762 0.6362 0.7976
No log 7.9149 372 0.6334 0.4678 0.6334 0.7959
No log 7.9574 374 0.6383 0.4127 0.6383 0.7989
No log 8.0 376 0.6372 0.3890 0.6372 0.7982
No log 8.0426 378 0.6448 0.3183 0.6448 0.8030
No log 8.0851 380 0.6707 0.3116 0.6707 0.8189
No log 8.1277 382 0.7217 0.3399 0.7217 0.8495
No log 8.1702 384 0.7201 0.2817 0.7201 0.8486
No log 8.2128 386 0.6895 0.2913 0.6895 0.8303
No log 8.2553 388 0.6805 0.2913 0.6805 0.8249
No log 8.2979 390 0.6879 0.3471 0.6879 0.8294
No log 8.3404 392 0.7276 0.4029 0.7276 0.8530
No log 8.3830 394 0.7526 0.3885 0.7526 0.8675
No log 8.4255 396 0.7360 0.4587 0.7360 0.8579
No log 8.4681 398 0.7100 0.4681 0.7100 0.8426
No log 8.5106 400 0.6899 0.4126 0.6899 0.8306
No log 8.5532 402 0.6596 0.4470 0.6596 0.8122
No log 8.5957 404 0.6589 0.5092 0.6589 0.8117
No log 8.6383 406 0.6542 0.5373 0.6542 0.8088
No log 8.6809 408 0.6621 0.4168 0.6621 0.8137
No log 8.7234 410 0.7199 0.4157 0.7199 0.8485
No log 8.7660 412 0.8660 0.3333 0.8660 0.9306
No log 8.8085 414 0.8914 0.3333 0.8914 0.9442
No log 8.8511 416 0.7605 0.3891 0.7605 0.8721
No log 8.8936 418 0.6956 0.4201 0.6956 0.8340
No log 8.9362 420 0.7083 0.3566 0.7083 0.8416
No log 8.9787 422 0.7409 0.3124 0.7409 0.8608
No log 9.0213 424 0.6969 0.3610 0.6969 0.8348
No log 9.0638 426 0.6752 0.4222 0.6752 0.8217
No log 9.1064 428 0.7878 0.3653 0.7878 0.8876
No log 9.1489 430 0.8296 0.3754 0.8296 0.9108
No log 9.1915 432 0.7898 0.3586 0.7898 0.8887
No log 9.2340 434 0.7272 0.3942 0.7272 0.8527
No log 9.2766 436 0.6787 0.4059 0.6787 0.8238
No log 9.3191 438 0.6898 0.3810 0.6898 0.8306
No log 9.3617 440 0.6999 0.3762 0.6999 0.8366
No log 9.4043 442 0.6858 0.3786 0.6858 0.8281
No log 9.4468 444 0.6843 0.4358 0.6843 0.8273
No log 9.4894 446 0.7461 0.3737 0.7461 0.8638
No log 9.5319 448 0.8412 0.3869 0.8412 0.9172
No log 9.5745 450 0.8173 0.3319 0.8173 0.9040
No log 9.6170 452 0.7304 0.3196 0.7304 0.8546
No log 9.6596 454 0.7083 0.3813 0.7083 0.8416
No log 9.7021 456 0.7083 0.3239 0.7083 0.8416
No log 9.7447 458 0.7211 0.3239 0.7211 0.8492
No log 9.7872 460 0.7377 0.2929 0.7377 0.8589
No log 9.8298 462 0.7533 0.2591 0.7533 0.8679
No log 9.8723 464 0.7831 0.2237 0.7831 0.8849
No log 9.9149 466 0.8344 0.3329 0.8344 0.9135
No log 9.9574 468 0.8527 0.3329 0.8527 0.9234
No log 10.0 470 0.8064 0.3355 0.8064 0.8980
No log 10.0426 472 0.7378 0.3485 0.7378 0.8590
No log 10.0851 474 0.7527 0.4294 0.7527 0.8676
No log 10.1277 476 0.7778 0.3590 0.7778 0.8820
No log 10.1702 478 0.7426 0.3831 0.7426 0.8618
No log 10.2128 480 0.7034 0.4217 0.7034 0.8387
No log 10.2553 482 0.7357 0.3196 0.7357 0.8577
No log 10.2979 484 0.8199 0.3060 0.8199 0.9055
No log 10.3404 486 0.7969 0.3060 0.7969 0.8927
No log 10.3830 488 0.7143 0.2913 0.7143 0.8452
No log 10.4255 490 0.6865 0.3022 0.6865 0.8285
No log 10.4681 492 0.7350 0.2552 0.7350 0.8573
No log 10.5106 494 0.7405 0.2552 0.7405 0.8605
No log 10.5532 496 0.7106 0.2744 0.7106 0.8429
No log 10.5957 498 0.7268 0.2685 0.7268 0.8525
0.3903 10.6383 500 0.7723 0.2913 0.7723 0.8788
0.3903 10.6809 502 0.7874 0.2913 0.7874 0.8873
0.3903 10.7234 504 0.7960 0.2847 0.7960 0.8922
0.3903 10.7660 506 0.7874 0.3050 0.7874 0.8874
0.3903 10.8085 508 0.7820 0.1686 0.7820 0.8843
0.3903 10.8511 510 0.7722 0.2121 0.7722 0.8788
0.3903 10.8936 512 0.7768 0.2121 0.7768 0.8814
0.3903 10.9362 514 0.8160 0.1648 0.8160 0.9033
0.3903 10.9787 516 0.8862 0.2817 0.8862 0.9414
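
Note that the final checkpoint (validation loss 0.8862, Qwk 0.2817) is well below the best intermediate results in the table, e.g. a lowest validation loss of 0.6167 at epoch ~7.40 and a highest Qwk of 0.6053 at epoch ~6.94. A small sketch of picking the best row, using a few (epoch, validation_loss, qwk) tuples transcribed from the table above:

```python
# A few rows transcribed from the training results table: (epoch, val_loss, qwk).
rows = [
    (6.9362, 0.6586, 0.6053),
    (7.4043, 0.6167, 0.5649),
    (10.9787, 0.8862, 0.2817),
]

best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
best_by_qwk = max(rows, key=lambda r: r[2])   # highest quadratic-weighted kappa
print(best_by_loss)  # (7.4043, 0.6167, 0.5649)
print(best_by_qwk)   # (6.9362, 0.6586, 0.6053)
```

In practice this is what `load_best_model_at_end=True` with an appropriate `metric_for_best_model` in the Hugging Face `TrainingArguments` automates; whether that was used here is not stated in the card.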

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensors)

Model tree

MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task7_organization is fine-tuned from aubmindlab/bert-base-arabertv02.