ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5329
  • Qwk: 0.3729
  • Mse: 0.5329
  • Rmse: 0.7300
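
The checkpoint can be loaded with the Transformers library like any other fine-tuned AraBERT model. Below is a minimal loading sketch, assuming the model carries a sequence-classification (score-regression) head on top of aubmindlab/bert-base-arabertv02 and is hosted under the repository ID used below; adjust the head class if the actual task setup differs.

# Minimal loading sketch; the sequence-classification head is an assumption.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

text = "نص المقال هنا"  # placeholder Arabic input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # interpretation of the score depends on how the head was trained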

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
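
For reproducibility, these settings map onto the Hugging Face Trainer roughly as sketched below. This is not the original training script: the output directory, the regression-style head, and the dataset objects are placeholder assumptions.

# Rough mapping of the listed hyperparameters onto TrainingArguments (not the original script).
from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", num_labels=1  # num_labels=1 (regression) is an assumption
)

args = TrainingArguments(
    output_dir="arabert-task7-organization",  # placeholder name
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,    # betas and epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)

# trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()  # train_ds / eval_ds are placeholders for the (unspecified) tokenized datasets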

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 2.6516 -0.1213 2.6516 1.6284
No log 0.0769 4 1.5011 -0.0104 1.5011 1.2252
No log 0.1154 6 1.0079 0.0069 1.0079 1.0039
No log 0.1538 8 1.0805 0.0994 1.0805 1.0395
No log 0.1923 10 1.2034 -0.0436 1.2034 1.0970
No log 0.2308 12 0.9608 0.1734 0.9608 0.9802
No log 0.2692 14 0.7688 0.1489 0.7688 0.8768
No log 0.3077 16 0.7703 0.1327 0.7703 0.8777
No log 0.3462 18 0.8288 -0.0500 0.8288 0.9104
No log 0.3846 20 0.8465 -0.0500 0.8465 0.9200
No log 0.4231 22 0.8748 -0.0426 0.8748 0.9353
No log 0.4615 24 0.8245 0.0 0.8245 0.9080
No log 0.5 26 0.7621 0.0 0.7621 0.8730
No log 0.5385 28 0.7125 0.0 0.7125 0.8441
No log 0.5769 30 0.7220 0.0643 0.7220 0.8497
No log 0.6154 32 0.7910 0.0747 0.7910 0.8894
No log 0.6538 34 0.7496 0.1866 0.7496 0.8658
No log 0.6923 36 0.6848 0.1327 0.6848 0.8276
No log 0.7308 38 0.7796 0.2740 0.7796 0.8829
No log 0.7692 40 0.6891 0.2412 0.6891 0.8301
No log 0.8077 42 0.7131 0.3701 0.7131 0.8445
No log 0.8462 44 1.2911 0.2336 1.2911 1.1363
No log 0.8846 46 1.5072 0.0728 1.5072 1.2277
No log 0.9231 48 1.3631 0.1202 1.3631 1.1675
No log 0.9615 50 1.0898 0.2259 1.0898 1.0439
No log 1.0 52 0.8380 0.1866 0.8380 0.9154
No log 1.0385 54 0.7397 0.0327 0.7397 0.8600
No log 1.0769 56 0.7201 0.1282 0.7201 0.8486
No log 1.1154 58 0.7431 0.1744 0.7431 0.8620
No log 1.1538 60 0.7133 0.1702 0.7133 0.8445
No log 1.1923 62 0.6816 0.0840 0.6816 0.8256
No log 1.2308 64 0.7250 0.1972 0.7250 0.8514
No log 1.2692 66 0.8372 0.2244 0.8372 0.9150
No log 1.3077 68 0.8851 0.2012 0.8851 0.9408
No log 1.3462 70 0.8687 0.1416 0.8687 0.9321
No log 1.3846 72 0.7922 0.0851 0.7922 0.8900
No log 1.4231 74 0.7127 0.1863 0.7127 0.8442
No log 1.4615 76 0.7867 0.1358 0.7867 0.8870
No log 1.5 78 0.8201 0.2132 0.8201 0.9056
No log 1.5385 80 0.7420 0.0889 0.7420 0.8614
No log 1.5769 82 0.7806 0.1598 0.7806 0.8835
No log 1.6154 84 0.9581 0.1853 0.9581 0.9788
No log 1.6538 86 1.0082 0.2727 1.0082 1.0041
No log 1.6923 88 0.9366 0.1251 0.9366 0.9678
No log 1.7308 90 0.7920 0.2182 0.7920 0.8899
No log 1.7692 92 0.7208 0.2135 0.7208 0.8490
No log 1.8077 94 0.7314 0.1867 0.7314 0.8552
No log 1.8462 96 0.7410 0.1673 0.7410 0.8608
No log 1.8846 98 0.8344 0.2063 0.8344 0.9135
No log 1.9231 100 1.0015 0.3324 1.0015 1.0008
No log 1.9615 102 0.9506 0.3960 0.9506 0.9750
No log 2.0 104 0.7750 0.1918 0.7750 0.8804
No log 2.0385 106 0.7728 0.2718 0.7728 0.8791
No log 2.0769 108 0.8575 0.2632 0.8575 0.9260
No log 2.1154 110 1.2026 0.1849 1.2026 1.0966
No log 2.1538 112 1.4132 0.1484 1.4132 1.1888
No log 2.1923 114 1.3104 0.1581 1.3104 1.1447
No log 2.2308 116 1.1598 0.2635 1.1598 1.0770
No log 2.2692 118 0.9477 0.3606 0.9477 0.9735
No log 2.3077 120 0.8359 0.3302 0.8359 0.9143
No log 2.3462 122 0.8273 0.3302 0.8273 0.9096
No log 2.3846 124 0.9023 0.3234 0.9023 0.9499
No log 2.4231 126 0.8906 0.3562 0.8906 0.9437
No log 2.4615 128 0.8379 0.3256 0.8379 0.9154
No log 2.5 130 0.7589 0.3329 0.7589 0.8711
No log 2.5385 132 0.7963 0.3526 0.7963 0.8924
No log 2.5769 134 0.9020 0.3239 0.9020 0.9498
No log 2.6154 136 1.3051 0.1875 1.3051 1.1424
No log 2.6538 138 1.4635 0.2029 1.4635 1.2097
No log 2.6923 140 1.2992 0.2366 1.2992 1.1398
No log 2.7308 142 0.9766 0.3310 0.9766 0.9882
No log 2.7692 144 0.7380 0.1598 0.7380 0.8591
No log 2.8077 146 0.7374 0.1432 0.7374 0.8587
No log 2.8462 148 0.7649 0.1803 0.7649 0.8746
No log 2.8846 150 0.7321 0.1867 0.7321 0.8557
No log 2.9231 152 0.7202 0.2294 0.7202 0.8487
No log 2.9615 154 0.7338 0.1953 0.7338 0.8566
No log 3.0 156 0.7612 0.2662 0.7612 0.8724
No log 3.0385 158 0.7434 0.2988 0.7434 0.8622
No log 3.0769 160 0.7024 0.2591 0.7024 0.8381
No log 3.1154 162 0.7147 0.3517 0.7147 0.8454
No log 3.1538 164 0.7032 0.2857 0.7032 0.8386
No log 3.1923 166 0.7117 0.2872 0.7117 0.8436
No log 3.2308 168 0.7244 0.2590 0.7244 0.8511
No log 3.2692 170 0.7416 0.2530 0.7416 0.8611
No log 3.3077 172 0.7256 0.2590 0.7256 0.8518
No log 3.3462 174 0.7670 0.2847 0.7670 0.8758
No log 3.3846 176 0.7818 0.2633 0.7818 0.8842
No log 3.4231 178 0.7317 0.2204 0.7317 0.8554
No log 3.4615 180 0.7063 0.2447 0.7063 0.8404
No log 3.5 182 0.7018 0.3123 0.7018 0.8377
No log 3.5385 184 0.7045 0.2685 0.7045 0.8394
No log 3.5769 186 0.7123 0.2621 0.7123 0.8440
No log 3.6154 188 0.7429 0.3127 0.7429 0.8619
No log 3.6538 190 0.7202 0.2784 0.7202 0.8487
No log 3.6923 192 0.7832 0.4275 0.7832 0.8850
No log 3.7308 194 0.8857 0.4296 0.8857 0.9411
No log 3.7692 196 0.8749 0.4296 0.8749 0.9354
No log 3.8077 198 0.8769 0.4133 0.8769 0.9364
No log 3.8462 200 0.8122 0.3319 0.8122 0.9012
No log 3.8846 202 0.8072 0.3069 0.8072 0.8984
No log 3.9231 204 0.7207 0.2171 0.7207 0.8490
No log 3.9615 206 0.6979 0.1901 0.6979 0.8354
No log 4.0 208 0.7352 0.2171 0.7352 0.8574
No log 4.0385 210 0.8432 0.3494 0.8432 0.9182
No log 4.0769 212 0.8644 0.3746 0.8644 0.9297
No log 4.1154 214 0.7891 0.3494 0.7891 0.8883
No log 4.1538 216 0.6721 0.2981 0.6721 0.8198
No log 4.1923 218 0.6392 0.3123 0.6392 0.7995
No log 4.2308 220 0.6266 0.3524 0.6266 0.7916
No log 4.2692 222 0.6142 0.3524 0.6142 0.7837
No log 4.3077 224 0.6572 0.4134 0.6572 0.8107
No log 4.3462 226 0.7701 0.4917 0.7701 0.8776
No log 4.3846 228 0.8131 0.4917 0.8131 0.9017
No log 4.4231 230 0.7769 0.4917 0.7769 0.8814
No log 4.4615 232 0.7980 0.4917 0.7980 0.8933
No log 4.5 234 0.7840 0.4255 0.7840 0.8854
No log 4.5385 236 0.6612 0.4167 0.6612 0.8131
No log 4.5769 238 0.6130 0.3341 0.6130 0.7829
No log 4.6154 240 0.6208 0.3649 0.6208 0.7879
No log 4.6538 242 0.6277 0.3267 0.6277 0.7923
No log 4.6923 244 0.6286 0.3267 0.6286 0.7928
No log 4.7308 246 0.6736 0.4644 0.6736 0.8207
No log 4.7692 248 0.7190 0.4648 0.7190 0.8480
No log 4.8077 250 0.6850 0.5310 0.6850 0.8277
No log 4.8462 252 0.6079 0.4147 0.6079 0.7797
No log 4.8846 254 0.5847 0.4516 0.5847 0.7646
No log 4.9231 256 0.5740 0.4678 0.5740 0.7576
No log 4.9615 258 0.5678 0.3347 0.5678 0.7535
No log 5.0 260 0.5682 0.4299 0.5682 0.7538
No log 5.0385 262 0.5741 0.4299 0.5741 0.7577
No log 5.0769 264 0.6097 0.3551 0.6097 0.7808
No log 5.1154 266 0.6430 0.3640 0.6430 0.8019
No log 5.1538 268 0.6118 0.3239 0.6118 0.7822
No log 5.1923 270 0.5900 0.4182 0.5900 0.7681
No log 5.2308 272 0.6377 0.3614 0.6377 0.7986
No log 5.2692 274 0.6216 0.2998 0.6216 0.7884
No log 5.3077 276 0.5976 0.4182 0.5976 0.7730
No log 5.3462 278 0.6972 0.3329 0.6972 0.8350
No log 5.3846 280 0.7462 0.4275 0.7462 0.8638
No log 5.4231 282 0.7448 0.3981 0.7448 0.8630
No log 5.4615 284 0.6420 0.4157 0.6420 0.8013
No log 5.5 286 0.6223 0.3883 0.6223 0.7889
No log 5.5385 288 0.7162 0.3367 0.7162 0.8463
No log 5.5769 290 0.7168 0.3367 0.7168 0.8466
No log 5.6154 292 0.6376 0.3443 0.6376 0.7985
No log 5.6538 294 0.6532 0.4060 0.6532 0.8082
No log 5.6923 296 0.7621 0.3630 0.7621 0.8730
No log 5.7308 298 0.8207 0.4328 0.8207 0.9059
No log 5.7692 300 0.7540 0.3746 0.7540 0.8684
No log 5.8077 302 0.6792 0.4076 0.6792 0.8241
No log 5.8462 304 0.6579 0.3416 0.6579 0.8111
No log 5.8846 306 0.6501 0.3081 0.6501 0.8063
No log 5.9231 308 0.6583 0.3622 0.6583 0.8114
No log 5.9615 310 0.7252 0.2967 0.7252 0.8516
No log 6.0 312 0.8488 0.4030 0.8488 0.9213
No log 6.0385 314 0.8671 0.4255 0.8671 0.9312
No log 6.0769 316 0.8008 0.3384 0.8008 0.8949
No log 6.1154 318 0.7128 0.3261 0.7128 0.8443
No log 6.1538 320 0.6565 0.3498 0.6565 0.8102
No log 6.1923 322 0.6494 0.4091 0.6494 0.8058
No log 6.2308 324 0.6500 0.4222 0.6500 0.8062
No log 6.2692 326 0.6465 0.4222 0.6465 0.8041
No log 6.3077 328 0.6633 0.3525 0.6633 0.8144
No log 6.3462 330 0.7453 0.4648 0.7453 0.8633
No log 6.3846 332 0.7816 0.4574 0.7816 0.8841
No log 6.4231 334 0.7118 0.4562 0.7118 0.8437
No log 6.4615 336 0.6348 0.3662 0.6348 0.7968
No log 6.5 338 0.6083 0.3308 0.6083 0.7800
No log 6.5385 340 0.6225 0.2072 0.6225 0.7890
No log 6.5769 342 0.6220 0.3111 0.6220 0.7887
No log 6.6154 344 0.6347 0.4001 0.6347 0.7967
No log 6.6538 346 0.6414 0.4001 0.6414 0.8009
No log 6.6923 348 0.6618 0.3737 0.6618 0.8135
No log 6.7308 350 0.6936 0.3996 0.6936 0.8328
No log 6.7692 352 0.7737 0.3819 0.7737 0.8796
No log 6.8077 354 0.7901 0.3746 0.7901 0.8889
No log 6.8462 356 0.7696 0.3819 0.7696 0.8773
No log 6.8846 358 0.7934 0.3425 0.7934 0.8907
No log 6.9231 360 0.8727 0.3346 0.8727 0.9342
No log 6.9615 362 0.8780 0.3727 0.8780 0.9370
No log 7.0 364 0.7752 0.4243 0.7752 0.8805
No log 7.0385 366 0.6894 0.3339 0.6894 0.8303
No log 7.0769 368 0.6675 0.2857 0.6675 0.8170
No log 7.1154 370 0.6651 0.3078 0.6651 0.8156
No log 7.1538 372 0.6628 0.3155 0.6628 0.8142
No log 7.1923 374 0.6741 0.3996 0.6741 0.8210
No log 7.2308 376 0.6885 0.4247 0.6885 0.8298
No log 7.2692 378 0.6648 0.3127 0.6648 0.8153
No log 7.3077 380 0.6402 0.2441 0.6402 0.8001
No log 7.3462 382 0.6461 0.2170 0.6461 0.8038
No log 7.3846 384 0.6418 0.2193 0.6418 0.8011
No log 7.4231 386 0.6394 0.3123 0.6394 0.7997
No log 7.4615 388 0.6553 0.3267 0.6553 0.8095
No log 7.5 390 0.7450 0.4482 0.7450 0.8631
No log 7.5385 392 0.8936 0.3131 0.8936 0.9453
No log 7.5769 394 0.9083 0.3337 0.9083 0.9530
No log 7.6154 396 0.8426 0.3324 0.8426 0.9180
No log 7.6538 398 0.7993 0.3892 0.7993 0.8940
No log 7.6923 400 0.7410 0.4387 0.7410 0.8608
No log 7.7308 402 0.7275 0.4387 0.7275 0.8529
No log 7.7692 404 0.7655 0.4154 0.7655 0.8749
No log 7.8077 406 0.8466 0.3761 0.8466 0.9201
No log 7.8462 408 0.8834 0.3183 0.8834 0.9399
No log 7.8846 410 0.8064 0.4438 0.8064 0.8980
No log 7.9231 412 0.7136 0.4190 0.7136 0.8447
No log 7.9615 414 0.6838 0.3814 0.6838 0.8269
No log 8.0 416 0.6590 0.4044 0.6590 0.8118
No log 8.0385 418 0.6645 0.4100 0.6645 0.8152
No log 8.0769 420 0.6612 0.4100 0.6612 0.8131
No log 8.1154 422 0.6689 0.3814 0.6689 0.8179
No log 8.1538 424 0.7127 0.4307 0.7127 0.8442
No log 8.1923 426 0.6825 0.4076 0.6825 0.8261
No log 8.2308 428 0.6217 0.3524 0.6217 0.7885
No log 8.2692 430 0.6181 0.3788 0.6181 0.7862
No log 8.3077 432 0.6062 0.4375 0.6062 0.7786
No log 8.3462 434 0.5786 0.3970 0.5786 0.7607
No log 8.3846 436 0.5874 0.4291 0.5874 0.7664
No log 8.4231 438 0.5965 0.4124 0.5965 0.7724
No log 8.4615 440 0.5766 0.4291 0.5766 0.7593
No log 8.5 442 0.5697 0.4837 0.5697 0.7548
No log 8.5385 444 0.5660 0.4526 0.5660 0.7523
No log 8.5769 446 0.5674 0.4742 0.5674 0.7532
No log 8.6154 448 0.5765 0.4386 0.5765 0.7593
No log 8.6538 450 0.5625 0.4484 0.5625 0.7500
No log 8.6923 452 0.5570 0.5344 0.5570 0.7463
No log 8.7308 454 0.5572 0.4837 0.5572 0.7464
No log 8.7692 456 0.5633 0.4816 0.5633 0.7505
No log 8.8077 458 0.5542 0.4569 0.5542 0.7444
No log 8.8462 460 0.5545 0.4742 0.5545 0.7447
No log 8.8846 462 0.5676 0.4849 0.5676 0.7534
No log 8.9231 464 0.5467 0.4722 0.5467 0.7394
No log 8.9615 466 0.5847 0.4437 0.5847 0.7647
No log 9.0 468 0.6067 0.4437 0.6067 0.7789
No log 9.0385 470 0.5640 0.4867 0.5640 0.7510
No log 9.0769 472 0.5363 0.4291 0.5363 0.7323
No log 9.1154 474 0.5346 0.3728 0.5346 0.7312
No log 9.1538 476 0.5329 0.3809 0.5329 0.7300
No log 9.1923 478 0.5227 0.4338 0.5227 0.7230
No log 9.2308 480 0.5194 0.3970 0.5194 0.7207
No log 9.2692 482 0.5504 0.4867 0.5504 0.7419
No log 9.3077 484 0.5723 0.4352 0.5723 0.7565
No log 9.3462 486 0.5329 0.4816 0.5329 0.7300
No log 9.3846 488 0.5205 0.4463 0.5205 0.7215
No log 9.4231 490 0.5248 0.4444 0.5248 0.7245
No log 9.4615 492 0.5379 0.3958 0.5379 0.7334
No log 9.5 494 0.5351 0.3958 0.5351 0.7315
No log 9.5385 496 0.5306 0.3679 0.5306 0.7284
No log 9.5769 498 0.5315 0.4526 0.5315 0.7290
0.38 9.6154 500 0.5416 0.4837 0.5416 0.7359
0.38 9.6538 502 0.5430 0.4837 0.5430 0.7369
0.38 9.6923 504 0.5278 0.4878 0.5278 0.7265
0.38 9.7308 506 0.5283 0.4991 0.5283 0.7269
0.38 9.7692 508 0.5251 0.4722 0.5251 0.7246
0.38 9.8077 510 0.5381 0.4526 0.5381 0.7335
0.38 9.8462 512 0.5575 0.4291 0.5575 0.7467
0.38 9.8846 514 0.5416 0.4526 0.5416 0.7360
0.38 9.9231 516 0.5329 0.3729 0.5329 0.7300
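
In the table above, Qwk, Mse, and Rmse presumably denote quadratic weighted Cohen's kappa, mean squared error, and root mean squared error on the validation set. A minimal sketch of how such values can be computed with scikit-learn follows; the labels are illustrative only.

# Illustrative computation of QWK, MSE, and RMSE; the labels below are made up.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 3, 1])
y_pred = np.array([0, 1, 1, 2, 3, 2])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")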

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
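
To work from an environment close to the one listed above, the installed packages can be checked against these versions. A small sketch, assuming the packages are already installed and visible to importlib.metadata:

# Compare installed package versions against those listed in this card.
import importlib.metadata as md

expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0+cu118",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
for pkg, trained_with in expected.items():
    try:
        installed = md.version(pkg)
    except md.PackageNotFoundError:
        installed = "not installed"
    print(f"{pkg}: installed {installed}, card lists {trained_with}")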