ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k18_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0523
  • Qwk: 0.4790
  • Mse: 1.0523
  • Rmse: 1.0258
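Qwk is the quadratic weighted kappa, which rewards predictions that land close to the true score band rather than exactly on it; note that Rmse is simply the square root of Mse throughout. A minimal pure-Python sketch of the three metrics, assuming integer labels 0..K-1 (this is an illustration, not the exact evaluation code used for this model):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """QWK for integer labels in 0..n_classes-1.

    1 = perfect agreement, 0 = chance-level, negative = worse than chance.
    """
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            # Quadratic disagreement weight: 0 on the diagonal, 1 at the corners.
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    m = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return m, math.sqrt(m)
```

For example, predictions that are off by one band still score a moderately positive QWK, while swapping the extreme bands scores negatively.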

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
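With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward 0 over the scheduled run. A sketch of that schedule (the 58 steps per epoch are read off the training table below; zero warmup and the exact decay formula are assumptions, not extracted from the training script):

```python
def linear_lr(step, base_lr=2e-5, total_steps=100 * 58, warmup_steps=0):
    """Linear decay to 0 after an optional warmup, in the style of
    transformers' linear schedule. All arguments here are assumptions
    mirroring the hyperparameters listed above."""
    if step < warmup_steps:
        # Linear ramp-up from 0 to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay over the remaining steps, clamped at 0.
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)
```

Note that training stopped at step 510 (epoch ~8.8), so only the first ~9% of this schedule was actually traversed.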

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0345 2 3.8827 -0.0151 3.8827 1.9705
No log 0.0690 4 2.3503 -0.0189 2.3503 1.5331
No log 0.1034 6 3.3976 -0.0343 3.3976 1.8433
No log 0.1379 8 4.3908 0.0171 4.3908 2.0954
No log 0.1724 10 2.9131 -0.0295 2.9131 1.7068
No log 0.2069 12 1.3737 0.0894 1.3737 1.1720
No log 0.2414 14 1.0962 0.2416 1.0962 1.0470
No log 0.2759 16 1.0484 0.3449 1.0484 1.0239
No log 0.3103 18 1.1433 0.3927 1.1433 1.0692
No log 0.3448 20 1.5454 0.1429 1.5454 1.2431
No log 0.3793 22 1.9269 0.1296 1.9269 1.3881
No log 0.4138 24 2.1738 0.1465 2.1738 1.4744
No log 0.4483 26 2.2513 0.1221 2.2513 1.5004
No log 0.4828 28 1.6165 0.2351 1.6165 1.2714
No log 0.5172 30 1.1390 0.2560 1.1390 1.0672
No log 0.5517 32 0.9792 0.2526 0.9792 0.9895
No log 0.5862 34 1.0101 0.3066 1.0101 1.0050
No log 0.6207 36 1.1235 0.4012 1.1235 1.0600
No log 0.6552 38 1.1648 0.3542 1.1648 1.0793
No log 0.6897 40 1.1022 0.3243 1.1022 1.0499
No log 0.7241 42 1.1218 0.3243 1.1218 1.0591
No log 0.7586 44 1.0679 0.2977 1.0679 1.0334
No log 0.7931 46 1.0611 0.2567 1.0611 1.0301
No log 0.8276 48 0.9578 0.3691 0.9578 0.9787
No log 0.8621 50 0.8676 0.2770 0.8676 0.9315
No log 0.8966 52 0.8749 0.2770 0.8749 0.9354
No log 0.9310 54 0.8883 0.3498 0.8883 0.9425
No log 0.9655 56 0.9563 0.3250 0.9563 0.9779
No log 1.0 58 1.1407 0.3283 1.1407 1.0681
No log 1.0345 60 1.0584 0.2711 1.0584 1.0288
No log 1.0690 62 0.9629 0.3129 0.9629 0.9813
No log 1.1034 64 1.0630 0.3902 1.0630 1.0310
No log 1.1379 66 1.1496 0.2471 1.1496 1.0722
No log 1.1724 68 1.1000 0.3024 1.1000 1.0488
No log 1.2069 70 0.9757 0.3326 0.9757 0.9878
No log 1.2414 72 0.9358 0.3198 0.9358 0.9674
No log 1.2759 74 0.9934 0.1794 0.9934 0.9967
No log 1.3103 76 1.0084 0.2188 1.0084 1.0042
No log 1.3448 78 1.0067 0.3713 1.0067 1.0033
No log 1.3793 80 0.9656 0.3455 0.9656 0.9827
No log 1.4138 82 0.9832 0.3198 0.9832 0.9915
No log 1.4483 84 1.0364 0.2795 1.0364 1.0180
No log 1.4828 86 1.0609 0.3063 1.0609 1.0300
No log 1.5172 88 1.0794 0.2582 1.0794 1.0389
No log 1.5517 90 1.0631 0.3860 1.0631 1.0311
No log 1.5862 92 1.0703 0.3915 1.0703 1.0346
No log 1.6207 94 1.2684 0.3070 1.2684 1.1262
No log 1.6552 96 1.4565 0.3333 1.4565 1.2068
No log 1.6897 98 1.3804 0.3318 1.3804 1.1749
No log 1.7241 100 1.1446 0.2837 1.1446 1.0698
No log 1.7586 102 1.0397 0.3133 1.0397 1.0197
No log 1.7931 104 1.0443 0.4373 1.0443 1.0219
No log 1.8276 106 1.0965 0.4392 1.0965 1.0471
No log 1.8621 108 1.0463 0.4206 1.0463 1.0229
No log 1.8966 110 1.0460 0.3482 1.0460 1.0228
No log 1.9310 112 1.0853 0.3711 1.0853 1.0418
No log 1.9655 114 1.1043 0.3758 1.1043 1.0509
No log 2.0 116 1.1332 0.3718 1.1332 1.0645
No log 2.0345 118 1.1377 0.3917 1.1377 1.0666
No log 2.0690 120 1.1435 0.3648 1.1435 1.0693
No log 2.1034 122 1.1013 0.4232 1.1013 1.0494
No log 2.1379 124 0.9759 0.3607 0.9759 0.9879
No log 2.1724 126 0.8873 0.3820 0.8873 0.9420
No log 2.2069 128 0.8663 0.3625 0.8663 0.9307
No log 2.2414 130 0.8625 0.4089 0.8625 0.9287
No log 2.2759 132 0.8833 0.4056 0.8833 0.9398
No log 2.3103 134 0.9124 0.4313 0.9124 0.9552
No log 2.3448 136 0.9654 0.3286 0.9654 0.9826
No log 2.3793 138 1.1335 0.3687 1.1335 1.0647
No log 2.4138 140 1.0551 0.3918 1.0551 1.0272
No log 2.4483 142 1.0378 0.4737 1.0378 1.0187
No log 2.4828 144 1.1433 0.4264 1.1433 1.0693
No log 2.5172 146 1.0562 0.4503 1.0562 1.0277
No log 2.5517 148 1.0218 0.2694 1.0218 1.0109
No log 2.5862 150 0.9906 0.3514 0.9906 0.9953
No log 2.6207 152 0.9382 0.3224 0.9382 0.9686
No log 2.6552 154 0.9202 0.2941 0.9202 0.9592
No log 2.6897 156 0.9203 0.3357 0.9203 0.9593
No log 2.7241 158 0.9347 0.4272 0.9347 0.9668
No log 2.7586 160 0.9881 0.4273 0.9881 0.9940
No log 2.7931 162 1.0588 0.4309 1.0588 1.0290
No log 2.8276 164 1.1410 0.4973 1.1410 1.0682
No log 2.8621 166 1.3562 0.2730 1.3562 1.1646
No log 2.8966 168 1.4988 0.2789 1.4988 1.2242
No log 2.9310 170 1.3422 0.2230 1.3422 1.1585
No log 2.9655 172 1.1816 0.4020 1.1816 1.0870
No log 3.0 174 1.1237 0.3924 1.1237 1.0601
No log 3.0345 176 1.1163 0.3843 1.1163 1.0566
No log 3.0690 178 1.2763 0.4032 1.2763 1.1297
No log 3.1034 180 1.3986 0.3243 1.3986 1.1826
No log 3.1379 182 1.2453 0.3345 1.2453 1.1159
No log 3.1724 184 0.9703 0.4161 0.9703 0.9850
No log 3.2069 186 0.9294 0.3804 0.9294 0.9641
No log 3.2414 188 0.9516 0.4845 0.9516 0.9755
No log 3.2759 190 0.9303 0.4223 0.9303 0.9645
No log 3.3103 192 1.0026 0.4196 1.0026 1.0013
No log 3.3448 194 1.1127 0.4976 1.1127 1.0548
No log 3.3793 196 1.0237 0.4568 1.0237 1.0118
No log 3.4138 198 0.9068 0.3454 0.9068 0.9523
No log 3.4483 200 0.8817 0.4118 0.8817 0.9390
No log 3.4828 202 0.9082 0.4729 0.9082 0.9530
No log 3.5172 204 0.9099 0.4581 0.9099 0.9539
No log 3.5517 206 0.8902 0.4350 0.8902 0.9435
No log 3.5862 208 0.9101 0.4215 0.9101 0.9540
No log 3.6207 210 0.9443 0.4807 0.9443 0.9717
No log 3.6552 212 0.9715 0.4369 0.9715 0.9856
No log 3.6897 214 0.9362 0.3792 0.9362 0.9676
No log 3.7241 216 0.9121 0.3742 0.9121 0.9550
No log 3.7586 218 0.9766 0.4585 0.9766 0.9882
No log 3.7931 220 1.1081 0.4787 1.1081 1.0527
No log 3.8276 222 1.1726 0.4301 1.1726 1.0829
No log 3.8621 224 1.0732 0.4585 1.0732 1.0360
No log 3.8966 226 0.9207 0.4230 0.9207 0.9595
No log 3.9310 228 0.8838 0.4114 0.8838 0.9401
No log 3.9655 230 0.9041 0.4230 0.9041 0.9508
No log 4.0 232 0.9400 0.4745 0.9400 0.9696
No log 4.0345 234 0.9597 0.4930 0.9597 0.9796
No log 4.0690 236 0.9359 0.4306 0.9359 0.9674
No log 4.1034 238 0.8728 0.4069 0.8728 0.9342
No log 4.1379 240 0.8607 0.3981 0.8607 0.9277
No log 4.1724 242 0.8876 0.4405 0.8876 0.9421
No log 4.2069 244 0.8922 0.4160 0.8922 0.9445
No log 4.2414 246 0.9119 0.5073 0.9119 0.9549
No log 4.2759 248 0.9036 0.4364 0.9036 0.9506
No log 4.3103 250 0.9250 0.4760 0.9250 0.9618
No log 4.3448 252 0.9678 0.4076 0.9678 0.9838
No log 4.3793 254 0.9431 0.4063 0.9431 0.9711
No log 4.4138 256 0.9360 0.3705 0.9360 0.9675
No log 4.4483 258 0.9044 0.4186 0.9044 0.9510
No log 4.4828 260 0.9198 0.3705 0.9198 0.9590
No log 4.5172 262 1.0117 0.4058 1.0117 1.0058
No log 4.5517 264 1.0365 0.4286 1.0365 1.0181
No log 4.5862 266 0.9587 0.4033 0.9587 0.9791
No log 4.6207 268 0.9625 0.3842 0.9625 0.9811
No log 4.6552 270 1.0778 0.4405 1.0778 1.0382
No log 4.6897 272 1.2736 0.3972 1.2736 1.1286
No log 4.7241 274 1.2229 0.4874 1.2229 1.1059
No log 4.7586 276 1.0118 0.4510 1.0118 1.0059
No log 4.7931 278 0.8990 0.3640 0.8990 0.9482
No log 4.8276 280 0.8903 0.3640 0.8903 0.9435
No log 4.8621 282 0.9633 0.4171 0.9633 0.9815
No log 4.8966 284 1.1899 0.4491 1.1899 1.0908
No log 4.9310 286 1.1471 0.4487 1.1471 1.0710
No log 4.9655 288 0.9659 0.3512 0.9659 0.9828
No log 5.0 290 0.9598 0.3607 0.9598 0.9797
No log 5.0345 292 1.0185 0.3629 1.0185 1.0092
No log 5.0690 294 1.0927 0.3766 1.0927 1.0453
No log 5.1034 296 1.2132 0.4186 1.2132 1.1015
No log 5.1379 298 1.1753 0.3972 1.1753 1.0841
No log 5.1724 300 1.0267 0.3436 1.0267 1.0132
No log 5.2069 302 0.9602 0.3144 0.9602 0.9799
No log 5.2414 304 0.9394 0.3308 0.9394 0.9692
No log 5.2759 306 0.9721 0.3463 0.9721 0.9859
No log 5.3103 308 1.0627 0.3972 1.0627 1.0309
No log 5.3448 310 1.0346 0.4589 1.0346 1.0172
No log 5.3793 312 0.8910 0.3809 0.8910 0.9439
No log 5.4138 314 0.8535 0.3437 0.8535 0.9238
No log 5.4483 316 0.8935 0.3447 0.8935 0.9453
No log 5.4828 318 0.9043 0.3842 0.9043 0.9510
No log 5.5172 320 1.0219 0.4613 1.0219 1.0109
No log 5.5517 322 1.1213 0.4693 1.1213 1.0589
No log 5.5862 324 1.0448 0.4615 1.0448 1.0222
No log 5.6207 326 0.9554 0.4376 0.9554 0.9775
No log 5.6552 328 0.9215 0.4273 0.9215 0.9600
No log 5.6897 330 0.9059 0.3678 0.9059 0.9518
No log 5.7241 332 0.9500 0.3728 0.9500 0.9747
No log 5.7586 334 0.9382 0.3688 0.9382 0.9686
No log 5.7931 336 0.8925 0.2939 0.8925 0.9447
No log 5.8276 338 0.8878 0.4229 0.8878 0.9422
No log 5.8621 340 0.8877 0.4013 0.8877 0.9422
No log 5.8966 342 0.9140 0.3923 0.9140 0.9560
No log 5.9310 344 0.9673 0.4822 0.9673 0.9835
No log 5.9655 346 1.0367 0.4589 1.0367 1.0182
No log 6.0 348 0.9746 0.4792 0.9746 0.9872
No log 6.0345 350 0.8739 0.3535 0.8739 0.9348
No log 6.0690 352 0.8633 0.3572 0.8633 0.9291
No log 6.1034 354 0.8985 0.4677 0.8985 0.9479
No log 6.1379 356 0.9457 0.4449 0.9457 0.9725
No log 6.1724 358 1.0018 0.3975 1.0018 1.0009
No log 6.2069 360 1.0198 0.3728 1.0198 1.0099
No log 6.2414 362 0.9638 0.3624 0.9638 0.9817
No log 6.2759 364 0.9300 0.3772 0.9300 0.9644
No log 6.3103 366 0.9246 0.3860 0.9246 0.9616
No log 6.3448 368 0.9481 0.3877 0.9481 0.9737
No log 6.3793 370 1.0751 0.4497 1.0751 1.0369
No log 6.4138 372 1.0104 0.4400 1.0104 1.0052
No log 6.4483 374 0.8589 0.4003 0.8589 0.9267
No log 6.4828 376 0.8292 0.5179 0.8292 0.9106
No log 6.5172 378 0.8432 0.5071 0.8432 0.9183
No log 6.5517 380 0.8499 0.3601 0.8499 0.9219
No log 6.5862 382 1.0001 0.4694 1.0001 1.0001
No log 6.6207 384 1.1535 0.4681 1.1535 1.0740
No log 6.6552 386 1.1194 0.4592 1.1194 1.0580
No log 6.6897 388 0.9916 0.4228 0.9916 0.9958
No log 6.7241 390 0.9721 0.3065 0.9721 0.9859
No log 6.7586 392 0.9257 0.3159 0.9257 0.9621
No log 6.7931 394 0.8857 0.2416 0.8857 0.9411
No log 6.8276 396 0.8897 0.2951 0.8897 0.9432
No log 6.8621 398 0.9185 0.3029 0.9185 0.9584
No log 6.8966 400 0.9333 0.3134 0.9333 0.9661
No log 6.9310 402 0.9419 0.3207 0.9419 0.9705
No log 6.9655 404 1.0028 0.3917 1.0028 1.0014
No log 7.0 406 1.0369 0.3946 1.0369 1.0183
No log 7.0345 408 1.0417 0.4116 1.0417 1.0206
No log 7.0690 410 1.0446 0.3348 1.0446 1.0221
No log 7.1034 412 1.0095 0.2742 1.0095 1.0047
No log 7.1379 414 1.0162 0.3430 1.0162 1.0081
No log 7.1724 416 0.9666 0.3404 0.9666 0.9832
No log 7.2069 418 0.9279 0.2786 0.9279 0.9633
No log 7.2414 420 0.9377 0.3457 0.9377 0.9683
No log 7.2759 422 0.9122 0.3457 0.9122 0.9551
No log 7.3103 424 0.8928 0.2899 0.8928 0.9449
No log 7.3448 426 0.9117 0.3601 0.9117 0.9549
No log 7.3793 428 0.8795 0.3289 0.8795 0.9378
No log 7.4138 430 0.8721 0.3357 0.8721 0.9339
No log 7.4483 432 0.8763 0.3224 0.8763 0.9361
No log 7.4828 434 0.9084 0.5222 0.9084 0.9531
No log 7.5172 436 0.8770 0.4268 0.8770 0.9365
No log 7.5517 438 0.8508 0.3980 0.8508 0.9224
No log 7.5862 440 0.9223 0.4205 0.9223 0.9604
No log 7.6207 442 1.0067 0.4400 1.0067 1.0034
No log 7.6552 444 0.9343 0.4822 0.9343 0.9666
No log 7.6897 446 0.8419 0.4247 0.8419 0.9175
No log 7.7241 448 0.8061 0.4463 0.8061 0.8979
No log 7.7586 450 0.8104 0.4221 0.8104 0.9002
No log 7.7931 452 0.8290 0.4327 0.8290 0.9105
No log 7.8276 454 0.8036 0.4088 0.8036 0.8964
No log 7.8621 456 0.7561 0.5431 0.7561 0.8695
No log 7.8966 458 0.7412 0.5057 0.7412 0.8609
No log 7.9310 460 0.7442 0.4706 0.7442 0.8627
No log 7.9655 462 0.7870 0.4421 0.7870 0.8871
No log 8.0 464 0.8435 0.4103 0.8435 0.9184
No log 8.0345 466 0.8581 0.4306 0.8581 0.9263
No log 8.0690 468 0.8054 0.4425 0.8054 0.8974
No log 8.1034 470 0.7978 0.4444 0.7978 0.8932
No log 8.1379 472 0.7923 0.3985 0.7923 0.8901
No log 8.1724 474 0.7926 0.4235 0.7926 0.8903
No log 8.2069 476 0.7973 0.4297 0.7973 0.8929
No log 8.2414 478 0.7842 0.3967 0.7842 0.8856
No log 8.2759 480 0.7854 0.3970 0.7854 0.8862
No log 8.3103 482 0.8013 0.4622 0.8013 0.8952
No log 8.3448 484 0.8078 0.4622 0.8078 0.8988
No log 8.3793 486 0.7642 0.5328 0.7642 0.8742
No log 8.4138 488 0.8224 0.4719 0.8224 0.9069
No log 8.4483 490 1.0762 0.5161 1.0762 1.0374
No log 8.4828 492 1.1014 0.5145 1.1014 1.0495
No log 8.5172 494 0.9148 0.4810 0.9148 0.9565
No log 8.5517 496 0.7722 0.4577 0.7722 0.8787
No log 8.5862 498 0.8145 0.5435 0.8145 0.9025
0.312 8.6207 500 0.8249 0.4903 0.8249 0.9083
0.312 8.6552 502 0.8091 0.4072 0.8091 0.8995
0.312 8.6897 504 0.8141 0.3455 0.8141 0.9023
0.312 8.7241 506 0.8404 0.3563 0.8404 0.9167
0.312 8.7586 508 0.9525 0.4156 0.9525 0.9759
0.312 8.7931 510 1.0523 0.4790 1.0523 1.0258

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k18_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.