ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k5_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in this card). It achieves the following results on the evaluation set (metric definitions are sketched after this list):

  • Loss: 0.6723
  • Qwk: 0.3769
  • Mse: 0.6723
  • Rmse: 0.8200
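
Qwk denotes quadratically weighted Cohen's kappa and Rmse is the square root of Mse. The card does not include evaluation code; the following is a minimal sketch, assuming integer ordinal essay scores, of how these metrics are commonly computed with scikit-learn (the example labels are hypothetical):

```python
# Minimal sketch of the reported metrics for ordinal scores (hypothetical data).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 2])   # hypothetical gold organization scores
y_pred = np.array([2, 2, 1, 3, 2])   # hypothetical model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```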

Model description

More information needed

Intended uses & limitations

More information needed
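
In the absence of documented usage, the snippet below is a hypothetical loading sketch. It assumes the checkpoint exposes a standard sequence-classification/regression head; how the output maps to organization scores is not documented in this card.

```python
# Hypothetical usage sketch; head type and score mapping are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k5_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص مقال تجريبي", return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.logits)  # interpret according to the (undocumented) label setup
```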

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
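
A minimal sketch of how these settings map onto Hugging Face TrainingArguments; the output directory is a placeholder, and any options not listed above (e.g. logging or saving strategy) are left at their defaults:

```python
# Sketch of the listed hyperparameters as TrainingArguments (output_dir is a placeholder).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical path
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```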

Training results

("No log" in the Training Loss column means the training loss had not yet been logged; the first logged value appears at step 500. The evaluation results reported at the top of this card correspond to the final row, epoch 28.3333.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1111 2 2.7098 -0.0449 2.7098 1.6461
No log 0.2222 4 1.4777 0.0782 1.4777 1.2156
No log 0.3333 6 0.7834 0.1786 0.7834 0.8851
No log 0.4444 8 0.7792 0.1407 0.7792 0.8827
No log 0.5556 10 0.9759 0.1628 0.9759 0.9879
No log 0.6667 12 1.0939 -0.0340 1.0939 1.0459
No log 0.7778 14 1.0270 0.0168 1.0270 1.0134
No log 0.8889 16 1.0775 0.0692 1.0775 1.0380
No log 1.0 18 1.0521 0.1897 1.0521 1.0257
No log 1.1111 20 0.8148 0.1298 0.8148 0.9027
No log 1.2222 22 0.7892 0.2445 0.7892 0.8883
No log 1.3333 24 0.8604 0.2839 0.8604 0.9276
No log 1.4444 26 0.7986 0.2995 0.7986 0.8936
No log 1.5556 28 0.7157 0.0827 0.7157 0.8460
No log 1.6667 30 0.7478 0.0919 0.7478 0.8647
No log 1.7778 32 0.7188 0.0097 0.7188 0.8478
No log 1.8889 34 0.6982 0.2308 0.6982 0.8356
No log 2.0 36 0.8307 0.2967 0.8307 0.9114
No log 2.1111 38 0.9094 0.2000 0.9094 0.9536
No log 2.2222 40 0.7899 0.2817 0.7899 0.8888
No log 2.3333 42 0.7940 0.0441 0.7940 0.8911
No log 2.4444 44 0.8845 0.0919 0.8845 0.9405
No log 2.5556 46 0.8238 0.0514 0.8238 0.9076
No log 2.6667 48 0.7309 0.1136 0.7309 0.8549
No log 2.7778 50 0.8708 0.3494 0.8708 0.9331
No log 2.8889 52 1.1666 0.2100 1.1666 1.0801
No log 3.0 54 1.1634 0.2264 1.1634 1.0786
No log 3.1111 56 0.9192 0.3194 0.9192 0.9588
No log 3.2222 58 0.7340 0.4137 0.7340 0.8568
No log 3.3333 60 0.9285 0.1793 0.9285 0.9636
No log 3.4444 62 0.9868 0.2020 0.9868 0.9934
No log 3.5556 64 0.8057 0.2624 0.8057 0.8976
No log 3.6667 66 0.7775 0.3344 0.7775 0.8818
No log 3.7778 68 0.9361 0.3287 0.9361 0.9675
No log 3.8889 70 0.9012 0.3287 0.9012 0.9493
No log 4.0 72 0.8072 0.3169 0.8072 0.8984
No log 4.1111 74 0.6927 0.3867 0.6927 0.8323
No log 4.2222 76 0.6750 0.3840 0.6750 0.8216
No log 4.3333 78 0.7333 0.3169 0.7333 0.8563
No log 4.4444 80 0.7332 0.3302 0.7332 0.8563
No log 4.5556 82 0.7798 0.2724 0.7798 0.8830
No log 4.6667 84 0.9752 0.2533 0.9752 0.9875
No log 4.7778 86 1.0180 0.2488 1.0180 1.0090
No log 4.8889 88 0.9101 0.3732 0.9101 0.9540
No log 5.0 90 1.0413 0.2156 1.0413 1.0204
No log 5.1111 92 0.9588 0.2686 0.9588 0.9792
No log 5.2222 94 0.8594 0.3586 0.8594 0.9270
No log 5.3333 96 1.2277 0.3384 1.2277 1.1080
No log 5.4444 98 1.4006 0.3101 1.4006 1.1835
No log 5.5556 100 1.0712 0.4272 1.0712 1.0350
No log 5.6667 102 0.6182 0.3662 0.6182 0.7862
No log 5.7778 104 0.5299 0.6544 0.5299 0.7280
No log 5.8889 106 0.5455 0.6500 0.5455 0.7386
No log 6.0 108 0.5179 0.6257 0.5179 0.7196
No log 6.1111 110 0.5279 0.6010 0.5279 0.7265
No log 6.2222 112 0.5691 0.4464 0.5691 0.7544
No log 6.3333 114 0.5350 0.5307 0.5350 0.7314
No log 6.4444 116 0.5267 0.4983 0.5267 0.7258
No log 6.5556 118 0.5488 0.4122 0.5488 0.7408
No log 6.6667 120 0.6290 0.3712 0.6290 0.7931
No log 6.7778 122 0.7284 0.3194 0.7284 0.8535
No log 6.8889 124 0.7267 0.3562 0.7267 0.8525
No log 7.0 126 0.7312 0.3754 0.7312 0.8551
No log 7.1111 128 0.7664 0.3806 0.7664 0.8754
No log 7.2222 130 0.7156 0.3483 0.7156 0.8459
No log 7.3333 132 0.7446 0.3973 0.7446 0.8629
No log 7.4444 134 0.8691 0.3887 0.8691 0.9322
No log 7.5556 136 0.9019 0.4051 0.9019 0.9497
No log 7.6667 138 0.8369 0.3623 0.8369 0.9148
No log 7.7778 140 0.6901 0.3518 0.6901 0.8307
No log 7.8889 142 0.6119 0.4768 0.6119 0.7822
No log 8.0 144 0.6071 0.4265 0.6071 0.7792
No log 8.1111 146 0.6928 0.3648 0.6928 0.8324
No log 8.2222 148 0.8571 0.4287 0.8571 0.9258
No log 8.3333 150 0.7649 0.4732 0.7649 0.8746
No log 8.4444 152 0.6951 0.4789 0.6951 0.8337
No log 8.5556 154 0.6892 0.4281 0.6892 0.8302
No log 8.6667 156 0.7123 0.4421 0.7123 0.8440
No log 8.7778 158 0.7965 0.4114 0.7965 0.8925
No log 8.8889 160 0.8314 0.3538 0.8314 0.9118
No log 9.0 162 0.7374 0.3494 0.7374 0.8587
No log 9.1111 164 0.6828 0.3372 0.6828 0.8263
No log 9.2222 166 0.6484 0.3518 0.6484 0.8052
No log 9.3333 168 0.6817 0.2692 0.6817 0.8257
No log 9.4444 170 0.9137 0.3577 0.9137 0.9559
No log 9.5556 172 1.0300 0.3759 1.0300 1.0149
No log 9.6667 174 0.8104 0.4528 0.8104 0.9002
No log 9.7778 176 0.6698 0.4302 0.6698 0.8184
No log 9.8889 178 0.6122 0.4438 0.6122 0.7824
No log 10.0 180 0.6072 0.4419 0.6072 0.7792
No log 10.1111 182 0.6816 0.3843 0.6816 0.8256
No log 10.2222 184 0.6902 0.3843 0.6902 0.8308
No log 10.3333 186 0.6156 0.3942 0.6156 0.7846
No log 10.4444 188 0.6053 0.4257 0.6053 0.7780
No log 10.5556 190 0.6108 0.4470 0.6108 0.7816
No log 10.6667 192 0.6393 0.4397 0.6393 0.7996
No log 10.7778 194 0.7008 0.4408 0.7008 0.8371
No log 10.8889 196 0.6832 0.4227 0.6832 0.8265
No log 11.0 198 0.6195 0.5003 0.6195 0.7871
No log 11.1111 200 0.6175 0.4953 0.6175 0.7858
No log 11.2222 202 0.6515 0.3769 0.6515 0.8071
No log 11.3333 204 0.6763 0.3221 0.6763 0.8224
No log 11.4444 206 0.7074 0.3590 0.7074 0.8410
No log 11.5556 208 0.6205 0.4413 0.6205 0.7877
No log 11.6667 210 0.6035 0.4105 0.6035 0.7768
No log 11.7778 212 0.6114 0.4505 0.6114 0.7819
No log 11.8889 214 0.6267 0.4067 0.6267 0.7916
No log 12.0 216 0.6770 0.4036 0.6770 0.8228
No log 12.1111 218 0.6535 0.3891 0.6535 0.8084
No log 12.2222 220 0.6547 0.3441 0.6547 0.8092
No log 12.3333 222 0.6838 0.3842 0.6838 0.8269
No log 12.4444 224 0.7114 0.3936 0.7114 0.8435
No log 12.5556 226 0.6372 0.4358 0.6372 0.7982
No log 12.6667 228 0.6183 0.3964 0.6183 0.7863
No log 12.7778 230 0.6104 0.4768 0.6104 0.7813
No log 12.8889 232 0.6507 0.3287 0.6507 0.8067
No log 13.0 234 0.8082 0.3822 0.8082 0.8990
No log 13.1111 236 0.8165 0.3822 0.8165 0.9036
No log 13.2222 238 0.6693 0.3287 0.6693 0.8181
No log 13.3333 240 0.6292 0.4895 0.6292 0.7932
No log 13.4444 242 0.6215 0.3729 0.6215 0.7883
No log 13.5556 244 0.6260 0.3754 0.6260 0.7912
No log 13.6667 246 0.6185 0.4752 0.6185 0.7865
No log 13.7778 248 0.7082 0.3843 0.7082 0.8415
No log 13.8889 250 0.7596 0.4014 0.7596 0.8716
No log 14.0 252 0.7005 0.3677 0.7005 0.8369
No log 14.1111 254 0.6942 0.3700 0.6942 0.8332
No log 14.2222 256 0.6775 0.3640 0.6775 0.8231
No log 14.3333 258 0.6850 0.3640 0.6850 0.8277
No log 14.4444 260 0.7300 0.3843 0.7300 0.8544
No log 14.5556 262 0.7808 0.3770 0.7808 0.8836
No log 14.6667 264 0.7598 0.3770 0.7598 0.8717
No log 14.7778 266 0.7308 0.4036 0.7308 0.8548
No log 14.8889 268 0.7113 0.3590 0.7113 0.8434
No log 15.0 270 0.6728 0.4001 0.6728 0.8202
No log 15.1111 272 0.6569 0.4753 0.6569 0.8105
No log 15.2222 274 0.6452 0.4418 0.6452 0.8033
No log 15.3333 276 0.6295 0.4555 0.6295 0.7934
No log 15.4444 278 0.6298 0.4576 0.6298 0.7936
No log 15.5556 280 0.6352 0.4352 0.6352 0.7970
No log 15.6667 282 0.6679 0.4352 0.6679 0.8172
No log 15.7778 284 0.6677 0.3425 0.6677 0.8171
No log 15.8889 286 0.6444 0.4114 0.6444 0.8028
No log 16.0 288 0.6533 0.4285 0.6533 0.8083
No log 16.1111 290 0.6384 0.4285 0.6384 0.7990
No log 16.2222 292 0.6044 0.4222 0.6044 0.7775
No log 16.3333 294 0.7067 0.3732 0.7067 0.8407
No log 16.4444 296 0.7693 0.4038 0.7693 0.8771
No log 16.5556 298 0.6960 0.3195 0.6960 0.8343
No log 16.6667 300 0.5901 0.4610 0.5901 0.7682
No log 16.7778 302 0.5841 0.3889 0.5841 0.7643
No log 16.8889 304 0.5816 0.3808 0.5816 0.7626
No log 17.0 306 0.6157 0.4337 0.6157 0.7847
No log 17.1111 308 0.6658 0.3656 0.6658 0.8160
No log 17.2222 310 0.6569 0.3723 0.6569 0.8105
No log 17.3333 312 0.6295 0.4701 0.6295 0.7934
No log 17.4444 314 0.6422 0.4278 0.6422 0.8014
No log 17.5556 316 0.6885 0.3329 0.6885 0.8298
No log 17.6667 318 0.7451 0.3822 0.7451 0.8632
No log 17.7778 320 0.7703 0.3822 0.7703 0.8777
No log 17.8889 322 0.6907 0.3393 0.6907 0.8311
No log 18.0 324 0.6392 0.3603 0.6392 0.7995
No log 18.1111 326 0.6497 0.4068 0.6497 0.8061
No log 18.2222 328 0.6515 0.3691 0.6515 0.8072
No log 18.3333 330 0.6720 0.3401 0.6720 0.8197
No log 18.4444 332 0.7046 0.3590 0.7046 0.8394
No log 18.5556 334 0.6900 0.3482 0.6900 0.8307
No log 18.6667 336 0.6614 0.3554 0.6614 0.8132
No log 18.7778 338 0.6726 0.3554 0.6726 0.8201
No log 18.8889 340 0.6889 0.3489 0.6889 0.8300
No log 19.0 342 0.7312 0.3470 0.7312 0.8551
No log 19.1111 344 0.7399 0.3106 0.7399 0.8602
No log 19.2222 346 0.6791 0.3679 0.6791 0.8241
No log 19.3333 348 0.6630 0.3613 0.6630 0.8143
No log 19.4444 350 0.6351 0.4051 0.6351 0.7969
No log 19.5556 352 0.6185 0.4051 0.6185 0.7865
No log 19.6667 354 0.5994 0.4377 0.5994 0.7742
No log 19.7778 356 0.6163 0.3931 0.6163 0.7851
No log 19.8889 358 0.6339 0.4176 0.6339 0.7962
No log 20.0 360 0.6413 0.3971 0.6413 0.8008
No log 20.1111 362 0.6060 0.4176 0.6060 0.7785
No log 20.2222 364 0.6352 0.3971 0.6352 0.7970
No log 20.3333 366 0.7129 0.3669 0.7129 0.8443
No log 20.4444 368 0.7090 0.3463 0.7090 0.8420
No log 20.5556 370 0.6589 0.3732 0.6589 0.8117
No log 20.6667 372 0.6684 0.3248 0.6684 0.8175
No log 20.7778 374 0.6913 0.3615 0.6913 0.8315
No log 20.8889 376 0.7122 0.3032 0.7122 0.8439
No log 21.0 378 0.7435 0.3344 0.7435 0.8622
No log 21.1111 380 0.7156 0.3866 0.7156 0.8459
No log 21.2222 382 0.7177 0.3526 0.7177 0.8472
No log 21.3333 384 0.7230 0.3648 0.7230 0.8503
No log 21.4444 386 0.6521 0.4374 0.6521 0.8075
No log 21.5556 388 0.6092 0.4402 0.6092 0.7805
No log 21.6667 390 0.6060 0.4299 0.6060 0.7785
No log 21.7778 392 0.6071 0.4828 0.6071 0.7792
No log 21.8889 394 0.6769 0.3746 0.6769 0.8227
No log 22.0 396 0.8222 0.3688 0.8222 0.9068
No log 22.1111 398 0.8852 0.3403 0.8852 0.9409
No log 22.2222 400 0.7850 0.3520 0.7850 0.8860
No log 22.3333 402 0.6722 0.3121 0.6722 0.8199
No log 22.4444 404 0.6313 0.4378 0.6313 0.7945
No log 22.5556 406 0.6368 0.4659 0.6368 0.7980
No log 22.6667 408 0.6651 0.3458 0.6651 0.8156
No log 22.7778 410 0.8287 0.3679 0.8287 0.9103
No log 22.8889 412 0.9932 0.3936 0.9932 0.9966
No log 23.0 414 0.9294 0.4217 0.9294 0.9641
No log 23.1111 416 0.7734 0.4273 0.7734 0.8794
No log 23.2222 418 0.7099 0.3395 0.7099 0.8426
No log 23.3333 420 0.7055 0.3433 0.7055 0.8399
No log 23.4444 422 0.6746 0.3284 0.6746 0.8214
No log 23.5556 424 0.6675 0.3723 0.6675 0.8170
No log 23.6667 426 0.6870 0.3183 0.6870 0.8289
No log 23.7778 428 0.6729 0.3544 0.6729 0.8203
No log 23.8889 430 0.6605 0.3399 0.6605 0.8127
No log 24.0 432 0.6570 0.3545 0.6570 0.8105
No log 24.1111 434 0.6628 0.4212 0.6628 0.8141
No log 24.2222 436 0.6928 0.4212 0.6928 0.8323
No log 24.3333 438 0.8041 0.2892 0.8041 0.8967
No log 24.4444 440 0.9869 0.3284 0.9869 0.9934
No log 24.5556 442 0.9882 0.3233 0.9882 0.9941
No log 24.6667 444 0.9123 0.3650 0.9123 0.9551
No log 24.7778 446 0.8404 0.2934 0.8404 0.9167
No log 24.8889 448 0.7442 0.3256 0.7442 0.8627
No log 25.0 450 0.7011 0.3699 0.7011 0.8373
No log 25.1111 452 0.6625 0.3032 0.6625 0.8139
No log 25.2222 454 0.6148 0.4267 0.6148 0.7841
No log 25.3333 456 0.5909 0.4136 0.5909 0.7687
No log 25.4444 458 0.5871 0.4402 0.5871 0.7663
No log 25.5556 460 0.6003 0.4267 0.6003 0.7748
No log 25.6667 462 0.6486 0.3662 0.6486 0.8053
No log 25.7778 464 0.6899 0.3630 0.6899 0.8306
No log 25.8889 466 0.7124 0.3256 0.7124 0.8440
No log 26.0 468 0.7657 0.3134 0.7657 0.8750
No log 26.1111 470 0.7737 0.3456 0.7737 0.8796
No log 26.2222 472 0.7455 0.3609 0.7455 0.8634
No log 26.3333 474 0.7323 0.3409 0.7323 0.8558
No log 26.4444 476 0.7502 0.3344 0.7502 0.8662
No log 26.5556 478 0.7781 0.3194 0.7781 0.8821
No log 26.6667 480 0.7507 0.3256 0.7507 0.8664
No log 26.7778 482 0.7245 0.3169 0.7245 0.8512
No log 26.8889 484 0.7153 0.2967 0.7153 0.8458
No log 27.0 486 0.7006 0.3518 0.7006 0.8370
No log 27.1111 488 0.6734 0.3755 0.6734 0.8206
No log 27.2222 490 0.6435 0.3197 0.6435 0.8022
No log 27.3333 492 0.6308 0.3604 0.6308 0.7942
No log 27.4444 494 0.6292 0.3863 0.6292 0.7932
No log 27.5556 496 0.6318 0.4613 0.6318 0.7949
No log 27.6667 498 0.6398 0.3574 0.6398 0.7999
0.2746 27.7778 500 0.6658 0.4103 0.6658 0.8160
0.2746 27.8889 502 0.7796 0.3068 0.7796 0.8829
0.2746 28.0 504 0.8717 0.3403 0.8717 0.9337
0.2746 28.1111 506 0.8134 0.3274 0.8134 0.9019
0.2746 28.2222 508 0.7296 0.3001 0.7296 0.8542
0.2746 28.3333 510 0.6723 0.3769 0.6723 0.8200

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
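
To reproduce this environment, the versions above can be pinned; below is a minimal version-check sketch (the pip commands in the comments are an assumption about how these builds are typically installed, not taken from the card):

```python
# Sketch for verifying the pinned environment reported above.
# Assumed install commands (not from the card):
#   pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
#   pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
import transformers, datasets, tokenizers, torch

for name, module in [("Transformers", transformers), ("Pytorch", torch),
                     ("Datasets", datasets), ("Tokenizers", tokenizers)]:
    print(name, module.__version__)  # compare with the versions listed above
```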