ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6296
  • Qwk: 0.4747
  • Mse: 0.6296
  • Rmse: 0.7935
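
Here Qwk is the quadratically weighted Cohen's kappa between predicted and gold organization scores, and Rmse is the square root of Mse. A minimal pure-Python sketch of these metrics (the function names and the assumption of integer labels 0..num_labels-1 are illustrative, not from the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_labels):
    """Cohen's kappa with quadratic weights over integer labels 0..num_labels-1."""
    n = len(y_true)
    observed = [[0.0] * num_labels for _ in range(num_labels)]
    hist_true = [0.0] * num_labels
    hist_pred = [0.0] * num_labels
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
        hist_true[t] += 1
        hist_pred[p] += 1
    num = den = 0.0
    for i in range(num_labels):
        for j in range(num_labels):
            w = (i - j) ** 2 / (num_labels - 1) ** 2    # quadratic disagreement penalty
            expected = hist_true[i] * hist_pred[j] / n  # chance-agreement count
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Note that the reported Loss equals Mse throughout the log, which suggests the model is trained with a regression head under an MSE objective.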

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
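
With lr_scheduler_type: linear and no warmup, the learning rate decays from 2e-05 toward 0 over the run. A small sketch of that schedule; the total step count is an assumption (the log below evaluates every 2 steps and reaches epoch 2.0 at step 114, i.e. roughly 57 optimizer steps per epoch, so ~5700 steps over 100 epochs):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linearly decay the learning rate from base_lr to 0 (no warmup)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

TOTAL_STEPS = 57 * 100  # assumed: ~57 steps/epoch x num_epochs=100

# schedule at the start, middle, and end of training
start = linear_lr(0, TOTAL_STEPS)
middle = linear_lr(TOTAL_STEPS // 2, TOTAL_STEPS)
end = linear_lr(TOTAL_STEPS, TOTAL_STEPS)
```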

Training results

The training-loss column reads "No log" until step 500, the first step at which it was recorded.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0351 2 2.5993 -0.1089 2.5993 1.6122
No log 0.0702 4 1.3925 0.0704 1.3925 1.1800
No log 0.1053 6 0.9779 -0.0658 0.9779 0.9889
No log 0.1404 8 0.8384 0.0787 0.8384 0.9156
No log 0.1754 10 0.8048 0.1706 0.8048 0.8971
No log 0.2105 12 1.0519 0.2183 1.0519 1.0256
No log 0.2456 14 1.0611 0.2183 1.0611 1.0301
No log 0.2807 16 0.7720 0.1790 0.7720 0.8786
No log 0.3158 18 0.8311 0.3331 0.8311 0.9116
No log 0.3509 20 0.7914 0.3477 0.7914 0.8896
No log 0.3860 22 0.7727 0.3802 0.7727 0.8790
No log 0.4211 24 0.7075 0.2819 0.7075 0.8411
No log 0.4561 26 0.8642 0.2518 0.8642 0.9296
No log 0.4912 28 1.1893 0.1497 1.1893 1.0906
No log 0.5263 30 1.1137 0.2059 1.1137 1.0553
No log 0.5614 32 0.8921 0.3231 0.8921 0.9445
No log 0.5965 34 0.6775 0.2336 0.6775 0.8231
No log 0.6316 36 0.6633 0.2002 0.6633 0.8144
No log 0.6667 38 0.7332 0.3426 0.7332 0.8563
No log 0.7018 40 0.7069 0.3426 0.7069 0.8407
No log 0.7368 42 0.6880 0.2543 0.6880 0.8295
No log 0.7719 44 0.6940 0.3675 0.6940 0.8331
No log 0.8070 46 0.8101 0.2808 0.8101 0.9001
No log 0.8421 48 0.7906 0.2498 0.7906 0.8891
No log 0.8772 50 0.8138 0.2995 0.8138 0.9021
No log 0.9123 52 0.7323 0.2319 0.7323 0.8557
No log 0.9474 54 0.7122 0.2319 0.7122 0.8439
No log 0.9825 56 0.7817 0.2662 0.7817 0.8841
No log 1.0175 58 0.9987 0.2910 0.9987 0.9994
No log 1.0526 60 1.0215 0.3052 1.0215 1.0107
No log 1.0877 62 0.8413 0.3699 0.8413 0.9172
No log 1.1228 64 0.6826 0.3224 0.6826 0.8262
No log 1.1579 66 0.7721 0.3189 0.7721 0.8787
No log 1.1930 68 0.9948 0.2474 0.9948 0.9974
No log 1.2281 70 1.0649 0.1737 1.0649 1.0320
No log 1.2632 72 0.8546 0.2505 0.8546 0.9244
No log 1.2982 74 0.6777 0.1903 0.6777 0.8232
No log 1.3333 76 0.7186 0.2589 0.7186 0.8477
No log 1.3684 78 0.8088 0.3699 0.8088 0.8993
No log 1.4035 80 0.7776 0.2932 0.7776 0.8818
No log 1.4386 82 0.7249 0.2383 0.7249 0.8514
No log 1.4737 84 0.7271 0.2751 0.7271 0.8527
No log 1.5088 86 0.7405 0.2383 0.7405 0.8605
No log 1.5439 88 0.7591 0.2383 0.7591 0.8713
No log 1.5789 90 0.8920 0.3239 0.8920 0.9444
No log 1.6140 92 0.9941 0.3411 0.9941 0.9970
No log 1.6491 94 0.9277 0.3346 0.9277 0.9632
No log 1.6842 96 0.9016 0.3290 0.9016 0.9495
No log 1.7193 98 0.8597 0.3160 0.8597 0.9272
No log 1.7544 100 0.9324 0.2872 0.9324 0.9656
No log 1.7895 102 0.9731 0.2524 0.9731 0.9865
No log 1.8246 104 0.7540 0.3817 0.7540 0.8684
No log 1.8596 106 0.7439 0.3899 0.7439 0.8625
No log 1.8947 108 0.8255 0.3849 0.8255 0.9086
No log 1.9298 110 0.6831 0.3671 0.6831 0.8265
No log 1.9649 112 0.6665 0.3762 0.6665 0.8164
No log 2.0 114 0.7088 0.3425 0.7088 0.8419
No log 2.0351 116 0.7042 0.3471 0.7042 0.8392
No log 2.0702 118 0.6582 0.3782 0.6582 0.8113
No log 2.1053 120 0.6710 0.2685 0.6710 0.8191
No log 2.1404 122 0.7429 0.3399 0.7429 0.8619
No log 2.1754 124 0.7860 0.3794 0.7860 0.8866
No log 2.2105 126 0.9488 0.3596 0.9488 0.9741
No log 2.2456 128 0.9464 0.3965 0.9464 0.9728
No log 2.2807 130 0.8791 0.3955 0.8791 0.9376
No log 2.3158 132 1.1338 0.3096 1.1338 1.0648
No log 2.3509 134 1.4677 0.2655 1.4677 1.2115
No log 2.3860 136 1.2222 0.3160 1.2222 1.1055
No log 2.4211 138 0.8203 0.3261 0.8203 0.9057
No log 2.4561 140 0.6480 0.3296 0.6480 0.8050
No log 2.4912 142 0.6552 0.3863 0.6552 0.8094
No log 2.5263 144 0.6397 0.3341 0.6397 0.7998
No log 2.5614 146 0.8023 0.3564 0.8023 0.8957
No log 2.5965 148 0.8978 0.2939 0.8978 0.9475
No log 2.6316 150 0.7543 0.3564 0.7543 0.8685
No log 2.6667 152 0.6690 0.3865 0.6690 0.8179
No log 2.7018 154 0.6735 0.3837 0.6735 0.8207
No log 2.7368 156 0.6885 0.3665 0.6885 0.8298
No log 2.7719 158 0.9490 0.3627 0.9490 0.9742
No log 2.8070 160 1.1529 0.2444 1.1529 1.0737
No log 2.8421 162 0.9888 0.3627 0.9888 0.9944
No log 2.8772 164 0.7059 0.4081 0.7059 0.8402
No log 2.9123 166 0.6624 0.3862 0.6624 0.8139
No log 2.9474 168 0.7452 0.3099 0.7452 0.8633
No log 2.9825 170 0.6978 0.3167 0.6978 0.8354
No log 3.0175 172 0.6349 0.4019 0.6349 0.7968
No log 3.0526 174 0.6770 0.3737 0.6770 0.8228
No log 3.0877 176 0.6894 0.3444 0.6894 0.8303
No log 3.1228 178 0.6421 0.3782 0.6421 0.8013
No log 3.1579 180 0.6573 0.2389 0.6573 0.8107
No log 3.1930 182 0.6547 0.2389 0.6547 0.8092
No log 3.2281 184 0.6329 0.2540 0.6329 0.7956
No log 3.2632 186 0.6565 0.4592 0.6565 0.8102
No log 3.2982 188 0.6727 0.4158 0.6727 0.8202
No log 3.3333 190 0.6843 0.3662 0.6843 0.8272
No log 3.3684 192 0.6202 0.4547 0.6202 0.7875
No log 3.4035 194 0.6138 0.4397 0.6138 0.7835
No log 3.4386 196 0.6052 0.4253 0.6052 0.7779
No log 3.4737 198 0.6009 0.5042 0.6009 0.7752
No log 3.5088 200 0.5949 0.4972 0.5949 0.7713
No log 3.5439 202 0.5969 0.4722 0.5969 0.7726
No log 3.5789 204 0.6040 0.4527 0.6040 0.7772
No log 3.6140 206 0.6204 0.4486 0.6204 0.7876
No log 3.6491 208 0.6062 0.4423 0.6062 0.7786
No log 3.6842 210 0.6380 0.4576 0.6380 0.7988
No log 3.7193 212 0.7512 0.4223 0.7512 0.8667
No log 3.7544 214 0.6932 0.4451 0.6932 0.8326
No log 3.7895 216 0.6411 0.4875 0.6411 0.8007
No log 3.8246 218 0.6032 0.4068 0.6032 0.7767
No log 3.8596 220 0.7526 0.3640 0.7526 0.8675
No log 3.8947 222 0.9156 0.2066 0.9156 0.9569
No log 3.9298 224 0.8713 0.3274 0.8713 0.9334
No log 3.9649 226 0.7097 0.4395 0.7097 0.8424
No log 4.0 228 0.6101 0.2884 0.6101 0.7811
No log 4.0351 230 0.6734 0.3737 0.6734 0.8206
No log 4.0702 232 0.7719 0.2843 0.7719 0.8786
No log 4.1053 234 0.7465 0.3519 0.7465 0.8640
No log 4.1404 236 0.6325 0.3296 0.6325 0.7953
No log 4.1754 238 0.6115 0.3702 0.6115 0.7820
No log 4.2105 240 0.6232 0.4044 0.6232 0.7894
No log 4.2456 242 0.7091 0.4089 0.7091 0.8421
No log 4.2807 244 0.7336 0.4089 0.7336 0.8565
No log 4.3158 246 0.6449 0.4212 0.6449 0.8031
No log 4.3509 248 0.5970 0.4147 0.5970 0.7727
No log 4.3860 250 0.5875 0.3426 0.5875 0.7665
No log 4.4211 252 0.5876 0.3502 0.5876 0.7665
No log 4.4561 254 0.6318 0.3840 0.6318 0.7949
No log 4.4912 256 0.7017 0.3590 0.7017 0.8377
No log 4.5263 258 0.7069 0.3609 0.7069 0.8408
No log 4.5614 260 0.6473 0.3840 0.6473 0.8045
No log 4.5965 262 0.6130 0.3840 0.6130 0.7829
No log 4.6316 264 0.5868 0.4229 0.5868 0.7661
No log 4.6667 266 0.5742 0.4505 0.5742 0.7578
No log 4.7018 268 0.5739 0.4742 0.5739 0.7576
No log 4.7368 270 0.5805 0.4300 0.5805 0.7619
No log 4.7719 272 0.5915 0.4576 0.5915 0.7691
No log 4.8070 274 0.5709 0.4701 0.5709 0.7556
No log 4.8421 276 0.5685 0.4527 0.5685 0.7540
No log 4.8772 278 0.5732 0.4527 0.5732 0.7571
No log 4.9123 280 0.5566 0.4240 0.5566 0.7461
No log 4.9474 282 0.5606 0.4659 0.5606 0.7487
No log 4.9825 284 0.6286 0.3894 0.6286 0.7929
No log 5.0175 286 0.6844 0.3918 0.6844 0.8273
No log 5.0526 288 0.6368 0.4076 0.6368 0.7980
No log 5.0877 290 0.6214 0.3976 0.6214 0.7883
No log 5.1228 292 0.6600 0.4076 0.6600 0.8124
No log 5.1579 294 0.6494 0.3942 0.6494 0.8059
No log 5.1930 296 0.6124 0.3763 0.6124 0.7825
No log 5.2281 298 0.5921 0.4101 0.5921 0.7695
No log 5.2632 300 0.6097 0.4463 0.6097 0.7808
No log 5.2982 302 0.6814 0.3936 0.6814 0.8255
No log 5.3333 304 0.9566 0.4056 0.9566 0.9781
No log 5.3684 306 1.1125 0.3241 1.1125 1.0548
No log 5.4035 308 0.9951 0.3174 0.9951 0.9975
No log 5.4386 310 0.7285 0.4272 0.7285 0.8535
No log 5.4737 312 0.6028 0.4044 0.6028 0.7764
No log 5.5088 314 0.6284 0.4367 0.6284 0.7927
No log 5.5439 316 0.6106 0.3988 0.6106 0.7814
No log 5.5789 318 0.5896 0.3808 0.5896 0.7678
No log 5.6140 320 0.6162 0.3524 0.6162 0.7850
No log 5.6491 322 0.6538 0.4270 0.6538 0.8086
No log 5.6842 324 0.7078 0.3723 0.7078 0.8413
No log 5.7193 326 0.6786 0.3794 0.6786 0.8238
No log 5.7544 328 0.5978 0.4747 0.5978 0.7732
No log 5.7895 330 0.5827 0.4384 0.5827 0.7634
No log 5.8246 332 0.5884 0.4407 0.5884 0.7671
No log 5.8596 334 0.5780 0.4878 0.5780 0.7602
No log 5.8947 336 0.5732 0.5361 0.5732 0.7571
No log 5.9298 338 0.5661 0.5344 0.5661 0.7524
No log 5.9649 340 0.5581 0.5584 0.5581 0.7471
No log 6.0 342 0.5533 0.5042 0.5533 0.7438
No log 6.0351 344 0.5526 0.4809 0.5526 0.7434
No log 6.0702 346 0.5540 0.5095 0.5540 0.7443
No log 6.1053 348 0.5699 0.4984 0.5699 0.7549
No log 6.1404 350 0.5665 0.5404 0.5665 0.7527
No log 6.1754 352 0.5620 0.4918 0.5620 0.7497
No log 6.2105 354 0.5810 0.4569 0.5810 0.7622
No log 6.2456 356 0.5591 0.4829 0.5591 0.7477
No log 6.2807 358 0.5593 0.4768 0.5593 0.7479
No log 6.3158 360 0.5484 0.4299 0.5484 0.7405
No log 6.3509 362 0.5519 0.4384 0.5519 0.7429
No log 6.3860 364 0.5485 0.3808 0.5485 0.7406
No log 6.4211 366 0.5458 0.4211 0.5458 0.7388
No log 6.4561 368 0.5382 0.4299 0.5382 0.7336
No log 6.4912 370 0.5343 0.5361 0.5343 0.7310
No log 6.5263 372 0.5327 0.5061 0.5327 0.7299
No log 6.5614 374 0.5695 0.4448 0.5695 0.7546
No log 6.5965 376 0.6322 0.5101 0.6322 0.7951
No log 6.6316 378 0.6120 0.4713 0.6120 0.7823
No log 6.6667 380 0.5725 0.4444 0.5725 0.7566
No log 6.7018 382 0.6153 0.5632 0.6153 0.7844
No log 6.7368 384 0.6885 0.4845 0.6885 0.8297
No log 6.7719 386 0.6741 0.4622 0.6741 0.8211
No log 6.8070 388 0.6107 0.6214 0.6107 0.7815
No log 6.8421 390 0.5970 0.4742 0.5970 0.7726
No log 6.8772 392 0.6030 0.4444 0.6030 0.7765
No log 6.9123 394 0.6080 0.4505 0.6080 0.7798
No log 6.9474 396 0.6051 0.4742 0.6051 0.7779
No log 6.9825 398 0.6112 0.4678 0.6112 0.7818
No log 7.0175 400 0.6205 0.3982 0.6205 0.7877
No log 7.0526 402 0.6351 0.3357 0.6351 0.7969
No log 7.0877 404 0.6405 0.3635 0.6405 0.8003
No log 7.1228 406 0.6073 0.3906 0.6073 0.7793
No log 7.1579 408 0.5926 0.4742 0.5926 0.7698
No log 7.1930 410 0.5865 0.4742 0.5865 0.7659
No log 7.2281 412 0.5916 0.4555 0.5916 0.7691
No log 7.2632 414 0.6054 0.4828 0.6054 0.7781
No log 7.2982 416 0.6250 0.5184 0.6250 0.7906
No log 7.3333 418 0.6292 0.5254 0.6292 0.7932
No log 7.3684 420 0.6870 0.4745 0.6870 0.8289
No log 7.4035 422 0.7781 0.3960 0.7781 0.8821
No log 7.4386 424 0.7928 0.3731 0.7928 0.8904
No log 7.4737 426 0.7055 0.4014 0.7055 0.8399
No log 7.5088 428 0.6378 0.4769 0.6378 0.7986
No log 7.5439 430 0.5485 0.4722 0.5485 0.7406
No log 7.5789 432 0.5395 0.4991 0.5395 0.7345
No log 7.6140 434 0.5397 0.4991 0.5397 0.7347
No log 7.6491 436 0.5440 0.4463 0.5440 0.7376
No log 7.6842 438 0.5944 0.4705 0.5944 0.7710
No log 7.7193 440 0.6408 0.4769 0.6408 0.8005
No log 7.7544 442 0.6015 0.4705 0.6015 0.7756
No log 7.7895 444 0.5597 0.4194 0.5597 0.7481
No log 7.8246 446 0.5659 0.3860 0.5659 0.7523
No log 7.8596 448 0.5736 0.3860 0.5736 0.7574
No log 7.8947 450 0.5727 0.4126 0.5727 0.7568
No log 7.9298 452 0.5716 0.4299 0.5716 0.7560
No log 7.9649 454 0.5725 0.4463 0.5725 0.7566
No log 8.0 456 0.5882 0.4681 0.5882 0.7669
No log 8.0351 458 0.6656 0.4315 0.6656 0.8158
No log 8.0702 460 0.7340 0.4829 0.7340 0.8567
No log 8.1053 462 0.6724 0.4644 0.6724 0.8200
No log 8.1404 464 0.5916 0.4934 0.5916 0.7691
No log 8.1754 466 0.5652 0.4972 0.5652 0.7518
No log 8.2105 468 0.5725 0.4876 0.5725 0.7566
No log 8.2456 470 0.5632 0.4876 0.5632 0.7504
No log 8.2807 472 0.5516 0.4722 0.5516 0.7427
No log 8.3158 474 0.5790 0.4743 0.5790 0.7609
No log 8.3509 476 0.6039 0.5184 0.6039 0.7771
No log 8.3860 478 0.6233 0.5184 0.6233 0.7895
No log 8.4211 480 0.5910 0.4985 0.5910 0.7687
No log 8.4561 482 0.5752 0.4904 0.5752 0.7584
No log 8.4912 484 0.5924 0.5092 0.5924 0.7697
No log 8.5263 486 0.5958 0.5092 0.5958 0.7719
No log 8.5614 488 0.6048 0.5286 0.6048 0.7777
No log 8.5965 490 0.6157 0.5549 0.6157 0.7847
No log 8.6316 492 0.6338 0.4985 0.6338 0.7961
No log 8.6667 494 0.6408 0.4661 0.6408 0.8005
No log 8.7018 496 0.6576 0.4642 0.6576 0.8109
No log 8.7368 498 0.6342 0.4724 0.6342 0.7964
0.3449 8.7719 500 0.6118 0.5141 0.6118 0.7822
0.3449 8.8070 502 0.5997 0.4904 0.5997 0.7744
0.3449 8.8421 504 0.6062 0.4386 0.6062 0.7786
0.3449 8.8772 506 0.6059 0.4953 0.6059 0.7784
0.3449 8.9123 508 0.6171 0.4441 0.6171 0.7855
0.3449 8.9474 510 0.6296 0.4747 0.6296 0.7935
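
The final checkpoint (Qwk 0.4747) is not the best one in the log; the peak Qwk of 0.6214 occurs at step 388. If checkpoint selection on Qwk were desired, it could be done with logic like the following (the list reproduces only a few rows from the table above):

```python
# (epoch, step, validation_loss, qwk) for a few rows from the log above
log = [
    (5.7544, 328, 0.5978, 0.4747),
    (5.9649, 340, 0.5581, 0.5584),
    (6.8070, 388, 0.6107, 0.6214),
    (8.9474, 510, 0.6296, 0.4747),
]

# pick the checkpoint with the highest Qwk (higher is better)
best = max(log, key=lambda row: row[3])
```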

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task7_organization

This model is fine-tuned from aubmindlab/bert-base-arabertv02.