ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k3_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9194
  • Qwk: 0.3678
  • Mse: 0.9194
  • Rmse: 0.9589
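The metrics above relate directly to each other: Rmse is the square root of Mse (0.9589² ≈ 0.9194), and Qwk (quadratic weighted kappa) measures agreement between predicted and true ordinal scores, penalising larger disagreements quadratically. A minimal pure-Python sketch of these three metrics (a hypothetical re-implementation, not the exact evaluation code used for this model):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred):
    """Qwk for integer ratings: 1 - (weighted observed / weighted expected)."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    n_ratings = hi - lo + 1
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0.0] * n_ratings for _ in range(n_ratings)]
    for t, p in zip(y_true, y_pred):
        observed[t - lo][p - lo] += 1
    # Expected counts from the marginal histograms
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(n_ratings)) for j in range(n_ratings)]
    numer = denom = 0.0
    for i in range(n_ratings):
        for j in range(n_ratings):
            # Quadratic penalty grows with the distance between ratings
            w = ((i - j) ** 2) / ((n_ratings - 1) ** 2) if n_ratings > 1 else 0.0
            numer += w * observed[i][j]
            denom += w * hist_t[i] * hist_p[j] / n
    return 1.0 - numer / denom if denom else 1.0

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement gives Qwk = 1.0, chance-level agreement gives 0, and systematic disagreement is negative, so the reported Qwk of 0.3678 indicates modest agreement beyond chance.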

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
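With lr_scheduler_type set to linear and no warmup, the learning rate decays linearly from 2e-05 at step 0 down to 0 at the final training step (the behaviour of transformers' get_linear_schedule_with_warmup). A small sketch of that schedule, assuming zero warmup steps as implied by the settings above:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))
```

For example, with the 12 optimizer steps per epoch implied by the results table and 100 epochs (1200 total steps), the learning rate is halved to 1e-05 at step 600.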

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1667 2 4.1555 0.0178 4.1555 2.0385
No log 0.3333 4 2.2358 0.0868 2.2358 1.4953
No log 0.5 6 1.5810 0.0185 1.5810 1.2574
No log 0.6667 8 1.4663 0.1033 1.4663 1.2109
No log 0.8333 10 1.0526 0.1476 1.0526 1.0260
No log 1.0 12 1.2922 -0.0274 1.2922 1.1367
No log 1.1667 14 1.5059 0.0214 1.5059 1.2272
No log 1.3333 16 1.2251 0.1837 1.2251 1.1069
No log 1.5 18 1.1602 0.1727 1.1602 1.0771
No log 1.6667 20 1.1171 0.1699 1.1171 1.0569
No log 1.8333 22 1.0786 0.1881 1.0786 1.0386
No log 2.0 24 1.0729 0.1671 1.0729 1.0358
No log 2.1667 26 1.0922 0.2318 1.0922 1.0451
No log 2.3333 28 1.2344 0.0849 1.2344 1.1111
No log 2.5 30 1.2532 0.0849 1.2532 1.1195
No log 2.6667 32 1.1240 0.1979 1.1240 1.0602
No log 2.8333 34 1.1732 0.1086 1.1732 1.0831
No log 3.0 36 1.3042 0.0380 1.3042 1.1420
No log 3.1667 38 1.3270 0.0760 1.3270 1.1519
No log 3.3333 40 1.0754 0.1979 1.0754 1.0370
No log 3.5 42 0.9548 0.2954 0.9548 0.9771
No log 3.6667 44 0.9522 0.3356 0.9522 0.9758
No log 3.8333 46 0.9728 0.3508 0.9728 0.9863
No log 4.0 48 0.9820 0.3078 0.9820 0.9909
No log 4.1667 50 0.9984 0.2591 0.9984 0.9992
No log 4.3333 52 1.0297 0.2591 1.0297 1.0147
No log 4.5 54 1.1043 0.1848 1.1043 1.0509
No log 4.6667 56 1.0506 0.2265 1.0506 1.0250
No log 4.8333 58 1.0847 0.2857 1.0847 1.0415
No log 5.0 60 1.2446 0.3041 1.2446 1.1156
No log 5.1667 62 1.1084 0.3281 1.1084 1.0528
No log 5.3333 64 1.0501 0.2192 1.0501 1.0247
No log 5.5 66 1.4640 0.1012 1.4640 1.2100
No log 5.6667 68 1.6466 0.0816 1.6466 1.2832
No log 5.8333 70 1.3225 0.2283 1.3225 1.1500
No log 6.0 72 1.0032 0.2521 1.0032 1.0016
No log 6.1667 74 0.9689 0.3280 0.9689 0.9843
No log 6.3333 76 0.9716 0.2504 0.9716 0.9857
No log 6.5 78 0.9611 0.3052 0.9611 0.9804
No log 6.6667 80 0.9998 0.3676 0.9998 0.9999
No log 6.8333 82 1.0413 0.3616 1.0413 1.0205
No log 7.0 84 1.0192 0.3407 1.0192 1.0096
No log 7.1667 86 1.0342 0.3143 1.0342 1.0170
No log 7.3333 88 1.0148 0.3101 1.0148 1.0074
No log 7.5 90 1.0338 0.3454 1.0338 1.0167
No log 7.6667 92 1.1898 0.3390 1.1898 1.0908
No log 7.8333 94 1.1169 0.3844 1.1169 1.0569
No log 8.0 96 1.0332 0.3781 1.0332 1.0165
No log 8.1667 98 1.0426 0.3933 1.0426 1.0211
No log 8.3333 100 1.0503 0.3661 1.0503 1.0248
No log 8.5 102 1.0314 0.3992 1.0314 1.0156
No log 8.6667 104 1.1296 0.1645 1.1296 1.0628
No log 8.8333 106 1.0912 0.0983 1.0912 1.0446
No log 9.0 108 0.9957 0.2366 0.9957 0.9979
No log 9.1667 110 0.9674 0.3202 0.9674 0.9836
No log 9.3333 112 0.8918 0.4192 0.8918 0.9443
No log 9.5 114 0.8273 0.5391 0.8273 0.9095
No log 9.6667 116 0.8207 0.4776 0.8207 0.9059
No log 9.8333 118 0.8335 0.5017 0.8335 0.9130
No log 10.0 120 0.8554 0.4524 0.8554 0.9249
No log 10.1667 122 0.9117 0.4161 0.9117 0.9548
No log 10.3333 124 0.9519 0.3695 0.9519 0.9757
No log 10.5 126 0.9090 0.4259 0.9090 0.9534
No log 10.6667 128 0.9512 0.4264 0.9512 0.9753
No log 10.8333 130 0.9769 0.4244 0.9769 0.9884
No log 11.0 132 0.9513 0.3762 0.9513 0.9753
No log 11.1667 134 0.9445 0.3740 0.9445 0.9718
No log 11.3333 136 0.9545 0.3314 0.9545 0.9770
No log 11.5 138 0.9738 0.3603 0.9738 0.9868
No log 11.6667 140 0.9704 0.4128 0.9704 0.9851
No log 11.8333 142 1.0342 0.3861 1.0342 1.0170
No log 12.0 144 1.0075 0.4359 1.0075 1.0037
No log 12.1667 146 0.9279 0.3314 0.9279 0.9633
No log 12.3333 148 0.9177 0.3446 0.9177 0.9580
No log 12.5 150 0.8975 0.3335 0.8975 0.9473
No log 12.6667 152 0.8982 0.3446 0.8982 0.9477
No log 12.8333 154 0.9000 0.3915 0.9000 0.9487
No log 13.0 156 0.8994 0.3398 0.8994 0.9484
No log 13.1667 158 0.9015 0.3367 0.9015 0.9495
No log 13.3333 160 1.0427 0.3723 1.0427 1.0211
No log 13.5 162 1.1591 0.2570 1.1591 1.0766
No log 13.6667 164 0.9791 0.3603 0.9791 0.9895
No log 13.8333 166 0.9215 0.4158 0.9215 0.9600
No log 14.0 168 0.9676 0.4023 0.9676 0.9837
No log 14.1667 170 0.9217 0.4169 0.9217 0.9601
No log 14.3333 172 0.9028 0.3896 0.9028 0.9502
No log 14.5 174 0.9728 0.3988 0.9728 0.9863
No log 14.6667 176 0.9589 0.3853 0.9589 0.9793
No log 14.8333 178 0.8869 0.3476 0.8869 0.9418
No log 15.0 180 0.9007 0.3112 0.9007 0.9491
No log 15.1667 182 0.9637 0.4208 0.9637 0.9817
No log 15.3333 184 0.9597 0.4174 0.9597 0.9796
No log 15.5 186 0.9290 0.2921 0.9290 0.9638
No log 15.6667 188 0.9254 0.2667 0.9254 0.9620
No log 15.8333 190 0.9244 0.2794 0.9244 0.9615
No log 16.0 192 0.9783 0.1699 0.9783 0.9891
No log 16.1667 194 1.0174 0.2864 1.0174 1.0087
No log 16.3333 196 0.9447 0.2499 0.9447 0.9720
No log 16.5 198 0.8975 0.3074 0.8975 0.9474
No log 16.6667 200 0.9105 0.3896 0.9105 0.9542
No log 16.8333 202 0.9009 0.3896 0.9009 0.9491
No log 17.0 204 0.9004 0.3194 0.9004 0.9489
No log 17.1667 206 1.0175 0.2886 1.0175 1.0087
No log 17.3333 208 1.0331 0.2886 1.0331 1.0164
No log 17.5 210 0.9366 0.2842 0.9366 0.9678
No log 17.6667 212 0.9233 0.4087 0.9233 0.9609
No log 17.8333 214 1.0634 0.4023 1.0634 1.0312
No log 18.0 216 1.0324 0.3993 1.0324 1.0161
No log 18.1667 218 0.8574 0.4312 0.8574 0.9259
No log 18.3333 220 0.8647 0.3563 0.8647 0.9299
No log 18.5 222 0.8958 0.4168 0.8958 0.9465
No log 18.6667 224 0.8839 0.3506 0.8839 0.9401
No log 18.8333 226 0.8470 0.3011 0.8470 0.9203
No log 19.0 228 0.8346 0.3178 0.8346 0.9136
No log 19.1667 230 0.8504 0.4019 0.8504 0.9222
No log 19.3333 232 0.8360 0.3540 0.8360 0.9143
No log 19.5 234 0.8688 0.4209 0.8688 0.9321
No log 19.6667 236 0.9295 0.4334 0.9295 0.9641
No log 19.8333 238 0.9566 0.3845 0.9566 0.9781
No log 20.0 240 0.9229 0.4310 0.9229 0.9607
No log 20.1667 242 0.8890 0.4310 0.8890 0.9429
No log 20.3333 244 0.8241 0.4048 0.8241 0.9078
No log 20.5 246 0.8121 0.3979 0.8121 0.9012
No log 20.6667 248 0.8193 0.4613 0.8193 0.9051
No log 20.8333 250 0.9412 0.4696 0.9412 0.9701
No log 21.0 252 1.0578 0.3738 1.0578 1.0285
No log 21.1667 254 1.0654 0.3902 1.0654 1.0322
No log 21.3333 256 0.9703 0.3474 0.9703 0.9851
No log 21.5 258 0.8535 0.3678 0.8535 0.9238
No log 21.6667 260 0.8383 0.4295 0.8383 0.9156
No log 21.8333 262 0.8399 0.4192 0.8399 0.9164
No log 22.0 264 0.8195 0.3548 0.8195 0.9053
No log 22.1667 266 0.8735 0.4327 0.8735 0.9346
No log 22.3333 268 0.9489 0.4326 0.9489 0.9741
No log 22.5 270 0.9410 0.3804 0.9410 0.9700
No log 22.6667 272 0.9418 0.3804 0.9418 0.9705
No log 22.8333 274 0.8955 0.4579 0.8955 0.9463
No log 23.0 276 0.8535 0.3414 0.8535 0.9239
No log 23.1667 278 0.8541 0.4198 0.8541 0.9242
No log 23.3333 280 0.8511 0.4713 0.8511 0.9226
No log 23.5 282 0.8280 0.3859 0.8280 0.9100
No log 23.6667 284 0.8261 0.3596 0.8261 0.9089
No log 23.8333 286 0.8313 0.3697 0.8313 0.9117
No log 24.0 288 0.8263 0.3301 0.8263 0.9090
No log 24.1667 290 0.8337 0.3178 0.8337 0.9131
No log 24.3333 292 0.8300 0.3178 0.8300 0.9111
No log 24.5 294 0.8345 0.3280 0.8345 0.9135
No log 24.6667 296 0.8790 0.4078 0.8790 0.9375
No log 24.8333 298 0.9357 0.3474 0.9357 0.9673
No log 25.0 300 0.9026 0.3617 0.9026 0.9500
No log 25.1667 302 0.8449 0.3172 0.8449 0.9192
No log 25.3333 304 0.8556 0.3356 0.8556 0.9250
No log 25.5 306 0.8995 0.4472 0.8995 0.9484
No log 25.6667 308 0.8922 0.3700 0.8922 0.9446
No log 25.8333 310 0.8619 0.3356 0.8619 0.9284
No log 26.0 312 0.8548 0.3172 0.8548 0.9246
No log 26.1667 314 0.8737 0.3506 0.8737 0.9347
No log 26.3333 316 0.9011 0.3687 0.9011 0.9492
No log 26.5 318 0.8911 0.3678 0.8911 0.9440
No log 26.6667 320 0.8780 0.2865 0.8780 0.9370
No log 26.8333 322 0.8829 0.3074 0.8829 0.9396
No log 27.0 324 0.8900 0.3172 0.8900 0.9434
No log 27.1667 326 0.9074 0.3583 0.9074 0.9526
No log 27.3333 328 0.8980 0.3293 0.8980 0.9476
No log 27.5 330 0.8720 0.2988 0.8720 0.9338
No log 27.6667 332 0.8598 0.2742 0.8598 0.9272
No log 27.8333 334 0.8520 0.2742 0.8520 0.9230
No log 28.0 336 0.8697 0.3678 0.8697 0.9326
No log 28.1667 338 0.9069 0.3658 0.9069 0.9523
No log 28.3333 340 0.9215 0.3760 0.9215 0.9599
No log 28.5 342 0.9222 0.3802 0.9222 0.9603
No log 28.6667 344 0.8991 0.3799 0.8991 0.9482
No log 28.8333 346 0.8772 0.3293 0.8772 0.9366
No log 29.0 348 0.8647 0.3837 0.8647 0.9299
No log 29.1667 350 0.8688 0.3981 0.8688 0.9321
No log 29.3333 352 0.8695 0.3981 0.8695 0.9324
No log 29.5 354 0.8865 0.3474 0.8865 0.9416
No log 29.6667 356 0.8927 0.3474 0.8927 0.9448
No log 29.8333 358 0.9206 0.3992 0.9206 0.9595
No log 30.0 360 0.9219 0.3474 0.9219 0.9602
No log 30.1667 362 0.9524 0.2746 0.9524 0.9759
No log 30.3333 364 0.9386 0.2510 0.9386 0.9688
No log 30.5 366 0.9281 0.3728 0.9281 0.9634
No log 30.6667 368 0.9131 0.3992 0.9131 0.9556
No log 30.8333 370 0.9241 0.3992 0.9241 0.9613
No log 31.0 372 0.9356 0.3424 0.9356 0.9672
No log 31.1667 374 0.9581 0.3523 0.9581 0.9788
No log 31.3333 376 0.9928 0.3424 0.9928 0.9964
No log 31.5 378 0.9943 0.3695 0.9943 0.9971
No log 31.6667 380 0.9524 0.3503 0.9524 0.9759
No log 31.8333 382 0.8848 0.3819 0.8848 0.9406
No log 32.0 384 0.8553 0.3569 0.8553 0.9248
No log 32.1667 386 0.8602 0.3713 0.8602 0.9275
No log 32.3333 388 0.8563 0.3713 0.8563 0.9253
No log 32.5 390 0.8536 0.4097 0.8536 0.9239
No log 32.6667 392 0.8569 0.3860 0.8569 0.9257
No log 32.8333 394 0.8638 0.3474 0.8638 0.9294
No log 33.0 396 0.8655 0.3474 0.8655 0.9303
No log 33.1667 398 0.8660 0.3414 0.8660 0.9306
No log 33.3333 400 0.8631 0.3358 0.8631 0.9291
No log 33.5 402 0.8649 0.4048 0.8649 0.9300
No log 33.6667 404 0.9312 0.3270 0.9312 0.9650
No log 33.8333 406 1.0539 0.3269 1.0539 1.0266
No log 34.0 408 1.0784 0.3269 1.0784 1.0385
No log 34.1667 410 0.9663 0.3872 0.9663 0.9830
No log 34.3333 412 0.8915 0.3532 0.8915 0.9442
No log 34.5 414 0.8991 0.3786 0.8991 0.9482
No log 34.6667 416 0.9348 0.4476 0.9348 0.9668
No log 34.8333 418 0.9238 0.3861 0.9238 0.9611
No log 35.0 420 0.8877 0.3799 0.8877 0.9422
No log 35.1667 422 0.8739 0.3697 0.8739 0.9348
No log 35.3333 424 0.8755 0.3697 0.8755 0.9357
No log 35.5 426 0.8792 0.3697 0.8792 0.9376
No log 35.6667 428 0.9140 0.4220 0.9140 0.9560
No log 35.8333 430 0.9565 0.4489 0.9565 0.9780
No log 36.0 432 0.9664 0.4244 0.9664 0.9830
No log 36.1667 434 0.9289 0.4198 0.9289 0.9638
No log 36.3333 436 0.8896 0.3414 0.8896 0.9432
No log 36.5 438 0.8973 0.3569 0.8973 0.9473
No log 36.6667 440 0.9183 0.3323 0.9183 0.9583
No log 36.8333 442 0.9083 0.3446 0.9083 0.9530
No log 37.0 444 0.8967 0.3403 0.8967 0.9470
No log 37.1667 446 0.8987 0.2988 0.8987 0.9480
No log 37.3333 448 0.9010 0.2988 0.9010 0.9492
No log 37.5 450 0.9093 0.3151 0.9093 0.9536
No log 37.6667 452 0.9276 0.3214 0.9276 0.9631
No log 37.8333 454 0.9249 0.3089 0.9249 0.9617
No log 38.0 456 0.9169 0.2842 0.9169 0.9575
No log 38.1667 458 0.9185 0.3536 0.9185 0.9584
No log 38.3333 460 0.9151 0.3536 0.9151 0.9566
No log 38.5 462 0.9065 0.3536 0.9065 0.9521
No log 38.6667 464 0.8999 0.3293 0.8999 0.9486
No log 38.8333 466 0.8917 0.3697 0.8917 0.9443
No log 39.0 468 0.9068 0.2690 0.9068 0.9522
No log 39.1667 470 0.9355 0.3080 0.9355 0.9672
No log 39.3333 472 0.9285 0.2761 0.9285 0.9636
No log 39.5 474 0.9356 0.2857 0.9356 0.9673
No log 39.6667 476 0.9167 0.2857 0.9167 0.9574
No log 39.8333 478 0.8983 0.3033 0.8983 0.9478
No log 40.0 480 0.9216 0.3236 0.9216 0.9600
No log 40.1667 482 0.9372 0.3799 0.9372 0.9681
No log 40.3333 484 0.9346 0.4186 0.9346 0.9668
No log 40.5 486 0.9210 0.3678 0.9210 0.9597
No log 40.6667 488 0.8939 0.3293 0.8939 0.9454
No log 40.8333 490 0.8911 0.2888 0.8911 0.9440
No log 41.0 492 0.8967 0.3301 0.8967 0.9469
No log 41.1667 494 0.8993 0.3713 0.8993 0.9483
No log 41.3333 496 0.8905 0.3301 0.8905 0.9437
No log 41.5 498 0.8875 0.3052 0.8875 0.9421
0.2391 41.6667 500 0.9184 0.4227 0.9184 0.9583
0.2391 41.8333 502 0.9273 0.4227 0.9273 0.9629
0.2391 42.0 504 0.9223 0.4227 0.9223 0.9604
0.2391 42.1667 506 0.9239 0.4093 0.9239 0.9612
0.2391 42.3333 508 0.9168 0.4216 0.9168 0.9575
0.2391 42.5 510 0.9194 0.3678 0.9194 0.9589

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32
Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k3_task5_organization
