ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k18_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9607
  • Qwk: 0.3693 (quadratic weighted kappa)
  • Mse: 0.9607
  • Rmse: 0.9801
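
The reported metrics are related: Rmse is the square root of Mse, and Qwk is Cohen's kappa with quadratic weights, a standard agreement metric for ordinal essay scores. Below is a minimal pure-Python sketch of quadratic weighted kappa; the integer rating range passed in is an assumption, since this card does not state the label set used for the organization trait.

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Cohen's kappa with quadratic weights over an integer rating scale.

    Assumes at least two distinct rating levels occur; the rating range
    is a caller-supplied assumption, not stated in this model card.
    """
    n = max_rating - min_rating + 1
    num_items = len(y_true)
    # Observed confusion counts between true and predicted ratings.
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t - min_rating][p - min_rating] += 1
    # Marginal histograms, used for the expected (chance) counts.
    hist_t = Counter(t - min_rating for t in y_true)
    hist_p = Counter(p - min_rating for p in y_pred)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2  # quadratic disagreement weight
            expected = hist_t[i] * hist_p[j] / num_items
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

print(round(quadratic_weighted_kappa([1, 2, 3], [2, 2, 3], 1, 3), 4))  # -> 0.6667

# Rmse is simply the square root of Mse, so the two headline values
# agree up to rounding: sqrt(0.9607) is approximately 0.9801.
```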

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
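
With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward 0 over the total number of optimizer steps. A sketch of that schedule, assuming zero warmup steps (the card does not report a warmup setting) and the per-epoch step count visible in the table below (epoch 1.0 lands on step 84):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: optional warmup up to base_lr, then linear decay to 0.

    warmup_steps=0 is an assumption; this card does not report warmup.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

# 84 steps per epoch times 100 configured epochs would give 8400 total steps.
print(linear_lr(0, 8400))     # -> 2e-05
print(linear_lr(4200, 8400))  # -> 1e-05
```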

Training results
(In the table below, "No log" means the running training loss had not yet been reported; it is first logged at step 500.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0238 2 3.9453 -0.0254 3.9453 1.9863
No log 0.0476 4 2.3001 0.0203 2.3001 1.5166
No log 0.0714 6 1.5631 -0.0180 1.5631 1.2502
No log 0.0952 8 1.2223 0.2543 1.2223 1.1056
No log 0.1190 10 1.0466 0.2061 1.0466 1.0230
No log 0.1429 12 0.9997 0.2416 0.9997 0.9999
No log 0.1667 14 1.0407 0.1901 1.0407 1.0201
No log 0.1905 16 1.2339 0.1628 1.2339 1.1108
No log 0.2143 18 1.5974 0.1004 1.5974 1.2639
No log 0.2381 20 1.6884 0.0416 1.6884 1.2994
No log 0.2619 22 1.4663 0.0883 1.4663 1.2109
No log 0.2857 24 1.2079 0.2236 1.2079 1.0990
No log 0.3095 26 1.0128 0.2490 1.0128 1.0064
No log 0.3333 28 1.0455 0.1532 1.0455 1.0225
No log 0.3571 30 1.1363 0.2038 1.1363 1.0660
No log 0.3810 32 1.4882 0.1428 1.4882 1.2199
No log 0.4048 34 2.0126 0.0494 2.0126 1.4187
No log 0.4286 36 1.9009 0.0422 1.9009 1.3787
No log 0.4524 38 1.5664 0.1294 1.5664 1.2516
No log 0.4762 40 1.5625 0.1379 1.5625 1.2500
No log 0.5 42 1.7348 0.1174 1.7348 1.3171
No log 0.5238 44 1.5068 0.1667 1.5068 1.2275
No log 0.5476 46 1.2689 0.2574 1.2689 1.1264
No log 0.5714 48 1.2214 0.1995 1.2214 1.1052
No log 0.5952 50 1.3520 0.2238 1.3520 1.1628
No log 0.6190 52 1.3102 0.2484 1.3102 1.1446
No log 0.6429 54 1.1242 0.2352 1.1242 1.0603
No log 0.6667 56 1.0316 0.24 1.0316 1.0157
No log 0.6905 58 1.0260 0.2254 1.0260 1.0129
No log 0.7143 60 1.0972 0.2208 1.0972 1.0475
No log 0.7381 62 1.3092 0.2239 1.3092 1.1442
No log 0.7619 64 1.4478 0.0964 1.4478 1.2032
No log 0.7857 66 1.4465 0.1064 1.4465 1.2027
No log 0.8095 68 1.4278 0.1064 1.4278 1.1949
No log 0.8333 70 1.1616 0.2913 1.1616 1.0778
No log 0.8571 72 1.1008 0.3062 1.1008 1.0492
No log 0.8810 74 1.1520 0.2755 1.1520 1.0733
No log 0.9048 76 1.1037 0.3102 1.1037 1.0506
No log 0.9286 78 1.0001 0.3951 1.0001 1.0001
No log 0.9524 80 0.9763 0.3951 0.9763 0.9881
No log 0.9762 82 0.9624 0.3933 0.9624 0.9810
No log 1.0 84 1.0184 0.3402 1.0184 1.0091
No log 1.0238 86 1.0104 0.3590 1.0104 1.0052
No log 1.0476 88 1.0267 0.4631 1.0267 1.0133
No log 1.0714 90 1.0810 0.3480 1.0810 1.0397
No log 1.0952 92 1.0145 0.4631 1.0145 1.0072
No log 1.1190 94 1.0482 0.3117 1.0482 1.0238
No log 1.1429 96 1.1914 0.2465 1.1914 1.0915
No log 1.1667 98 1.1021 0.2465 1.1021 1.0498
No log 1.1905 100 0.9548 0.4175 0.9548 0.9771
No log 1.2143 102 1.0441 0.4257 1.0441 1.0218
No log 1.2381 104 1.1006 0.3621 1.1006 1.0491
No log 1.2619 106 1.0425 0.4025 1.0425 1.0210
No log 1.2857 108 1.0469 0.2693 1.0469 1.0232
No log 1.3095 110 1.0340 0.2714 1.0340 1.0169
No log 1.3333 112 0.9856 0.2921 0.9856 0.9928
No log 1.3571 114 0.9903 0.3765 0.9903 0.9951
No log 1.3810 116 0.9762 0.4273 0.9762 0.9880
No log 1.4048 118 0.9506 0.3620 0.9506 0.9750
No log 1.4286 120 0.9503 0.3966 0.9503 0.9748
No log 1.4524 122 0.9380 0.3620 0.9380 0.9685
No log 1.4762 124 0.9513 0.3821 0.9513 0.9754
No log 1.5 126 0.9503 0.3966 0.9503 0.9748
No log 1.5238 128 0.9414 0.4444 0.9414 0.9703
No log 1.5476 130 1.0242 0.4499 1.0242 1.0120
No log 1.5714 132 1.1261 0.3361 1.1261 1.0612
No log 1.5952 134 1.0596 0.3654 1.0596 1.0294
No log 1.6190 136 0.9891 0.3380 0.9891 0.9945
No log 1.6429 138 1.0070 0.3414 1.0070 1.0035
No log 1.6667 140 1.0259 0.2928 1.0259 1.0129
No log 1.6905 142 1.0133 0.3083 1.0133 1.0066
No log 1.7143 144 0.9511 0.3821 0.9511 0.9753
No log 1.7381 146 0.9150 0.3729 0.9150 0.9566
No log 1.7619 148 0.9089 0.4271 0.9089 0.9533
No log 1.7857 150 0.9129 0.3765 0.9129 0.9555
No log 1.8095 152 0.9290 0.3974 0.9290 0.9638
No log 1.8333 154 0.8677 0.4401 0.8677 0.9315
No log 1.8571 156 0.8682 0.4720 0.8682 0.9318
No log 1.8810 158 0.9001 0.4734 0.9001 0.9487
No log 1.9048 160 0.8802 0.4181 0.8802 0.9382
No log 1.9286 162 0.8879 0.4568 0.8879 0.9423
No log 1.9524 164 1.0723 0.3696 1.0723 1.0355
No log 1.9762 166 1.1257 0.3983 1.1257 1.0610
No log 2.0 168 0.9038 0.3879 0.9038 0.9507
No log 2.0238 170 0.8696 0.4198 0.8696 0.9325
No log 2.0476 172 0.8705 0.4321 0.8705 0.9330
No log 2.0714 174 0.9222 0.3804 0.9222 0.9603
No log 2.0952 176 1.0103 0.4020 1.0103 1.0051
No log 2.1190 178 1.0052 0.3446 1.0052 1.0026
No log 2.1429 180 1.0023 0.3820 1.0023 1.0011
No log 2.1667 182 0.9776 0.4219 0.9776 0.9888
No log 2.1905 184 0.9584 0.4208 0.9584 0.9790
No log 2.2143 186 0.9354 0.3933 0.9354 0.9671
No log 2.2381 188 0.9238 0.3933 0.9238 0.9612
No log 2.2619 190 0.9492 0.4048 0.9492 0.9743
No log 2.2857 192 0.9822 0.3559 0.9822 0.9911
No log 2.3095 194 0.9961 0.3661 0.9961 0.9981
No log 2.3333 196 1.0772 0.2507 1.0772 1.0379
No log 2.3571 198 1.1919 0.1807 1.1919 1.0918
No log 2.3810 200 1.2118 0.1291 1.2118 1.1008
No log 2.4048 202 1.0836 0.2579 1.0836 1.0410
No log 2.4286 204 1.0951 0.2878 1.0951 1.0464
No log 2.4524 206 1.0537 0.3389 1.0537 1.0265
No log 2.4762 208 1.0301 0.2995 1.0301 1.0149
No log 2.5 210 0.9766 0.3210 0.9766 0.9882
No log 2.5238 212 0.9395 0.3838 0.9395 0.9693
No log 2.5476 214 0.9454 0.3513 0.9454 0.9723
No log 2.5714 216 0.9408 0.3879 0.9408 0.9700
No log 2.5952 218 0.9214 0.3879 0.9214 0.9599
No log 2.6190 220 0.9958 0.3842 0.9958 0.9979
No log 2.6429 222 1.0806 0.3972 1.0806 1.0395
No log 2.6667 224 1.0436 0.3455 1.0436 1.0216
No log 2.6905 226 1.0357 0.2921 1.0357 1.0177
No log 2.7143 228 1.0165 0.3418 1.0165 1.0082
No log 2.7381 230 1.0025 0.3149 1.0025 1.0012
No log 2.7619 232 1.0171 0.3005 1.0171 1.0085
No log 2.7857 234 1.0112 0.3124 1.0112 1.0056
No log 2.8095 236 0.9420 0.3590 0.9420 0.9706
No log 2.8333 238 0.8979 0.4423 0.8979 0.9476
No log 2.8571 240 0.9116 0.4244 0.9116 0.9548
No log 2.8810 242 0.9020 0.4244 0.9020 0.9497
No log 2.9048 244 0.9516 0.3404 0.9516 0.9755
No log 2.9286 246 1.0026 0.3957 1.0026 1.0013
No log 2.9524 248 1.0555 0.3826 1.0555 1.0274
No log 2.9762 250 1.1093 0.3938 1.1093 1.0532
No log 3.0 252 1.0870 0.2655 1.0870 1.0426
No log 3.0238 254 1.0317 0.3032 1.0317 1.0157
No log 3.0476 256 0.9934 0.3932 0.9934 0.9967
No log 3.0714 258 0.9968 0.3577 0.9968 0.9984
No log 3.0952 260 1.0140 0.2993 1.0140 1.0070
No log 3.1190 262 1.0046 0.3725 1.0046 1.0023
No log 3.1429 264 1.0020 0.3728 1.0020 1.0010
No log 3.1667 266 0.9659 0.3590 0.9659 0.9828
No log 3.1905 268 0.9681 0.4119 0.9681 0.9839
No log 3.2143 270 0.9232 0.3914 0.9232 0.9608
No log 3.2381 272 0.9225 0.4027 0.9225 0.9605
No log 3.2619 274 1.0247 0.3845 1.0247 1.0123
No log 3.2857 276 1.0425 0.3488 1.0425 1.0210
No log 3.3095 278 0.9653 0.3269 0.9653 0.9825
No log 3.3333 280 0.9430 0.4010 0.9430 0.9711
No log 3.3571 282 0.9755 0.4010 0.9755 0.9877
No log 3.3810 284 1.0750 0.3846 1.0750 1.0368
No log 3.4048 286 1.2045 0.3326 1.2045 1.0975
No log 3.4286 288 1.2001 0.3642 1.2001 1.0955
No log 3.4524 290 1.1175 0.3945 1.1175 1.0571
No log 3.4762 292 1.0582 0.3483 1.0582 1.0287
No log 3.5 294 1.0747 0.3139 1.0747 1.0367
No log 3.5238 296 1.0717 0.3510 1.0717 1.0352
No log 3.5476 298 1.0107 0.3615 1.0107 1.0053
No log 3.5714 300 1.0278 0.3217 1.0278 1.0138
No log 3.5952 302 1.0867 0.3250 1.0867 1.0424
No log 3.6190 304 1.1375 0.3654 1.1375 1.0665
No log 3.6429 306 1.0610 0.3753 1.0610 1.0300
No log 3.6667 308 0.9427 0.4271 0.9427 0.9709
No log 3.6905 310 0.9297 0.4439 0.9297 0.9642
No log 3.7143 312 0.9459 0.3802 0.9459 0.9726
No log 3.7381 314 1.0154 0.3217 1.0154 1.0076
No log 3.7619 316 1.0296 0.3217 1.0296 1.0147
No log 3.7857 318 1.0213 0.3103 1.0213 1.0106
No log 3.8095 320 0.9539 0.3725 0.9539 0.9767
No log 3.8333 322 0.9242 0.3284 0.9242 0.9614
No log 3.8571 324 0.9309 0.3378 0.9309 0.9648
No log 3.8810 326 0.9460 0.3634 0.9460 0.9726
No log 3.9048 328 0.9365 0.3800 0.9365 0.9677
No log 3.9286 330 0.9452 0.3836 0.9452 0.9722
No log 3.9524 332 0.9487 0.3797 0.9487 0.9740
No log 3.9762 334 0.9823 0.2767 0.9823 0.9911
No log 4.0 336 1.0560 0.3584 1.0560 1.0276
No log 4.0238 338 1.0315 0.3361 1.0315 1.0156
No log 4.0476 340 0.9593 0.2790 0.9593 0.9795
No log 4.0714 342 0.9379 0.3356 0.9379 0.9684
No log 4.0952 344 0.9067 0.3737 0.9067 0.9522
No log 4.1190 346 0.8949 0.3915 0.8949 0.9460
No log 4.1429 348 0.9066 0.4661 0.9066 0.9522
No log 4.1667 350 0.9366 0.4289 0.9366 0.9678
No log 4.1905 352 0.9274 0.4661 0.9274 0.9630
No log 4.2143 354 0.9461 0.3784 0.9461 0.9727
No log 4.2381 356 0.9994 0.3502 0.9994 0.9997
No log 4.2619 358 1.0705 0.3196 1.0705 1.0346
No log 4.2857 360 1.0499 0.2655 1.0499 1.0246
No log 4.3095 362 0.9835 0.3243 0.9835 0.9917
No log 4.3333 364 0.9618 0.3263 0.9618 0.9807
No log 4.3571 366 0.9620 0.3540 0.9620 0.9808
No log 4.3810 368 0.9846 0.4153 0.9846 0.9923
No log 4.4048 370 1.0865 0.3626 1.0865 1.0424
No log 4.4286 372 1.1860 0.3333 1.1860 1.0890
No log 4.4524 374 1.1436 0.3756 1.1436 1.0694
No log 4.4762 376 1.0334 0.3765 1.0334 1.0166
No log 4.5 378 1.0157 0.3210 1.0157 1.0078
No log 4.5238 380 1.0324 0.3004 1.0324 1.0161
No log 4.5476 382 0.9986 0.3343 0.9986 0.9993
No log 4.5714 384 0.9774 0.2995 0.9774 0.9886
No log 4.5952 386 0.9925 0.3842 0.9925 0.9962
No log 4.6190 388 0.9489 0.2910 0.9489 0.9741
No log 4.6429 390 0.8974 0.3095 0.8974 0.9473
No log 4.6667 392 0.9096 0.3229 0.9096 0.9537
No log 4.6905 394 0.9573 0.3091 0.9573 0.9784
No log 4.7143 396 1.0460 0.2974 1.0460 1.0227
No log 4.7381 398 1.0796 0.3427 1.0796 1.0391
No log 4.7619 400 1.0950 0.4492 1.0950 1.0464
No log 4.7857 402 1.0573 0.4119 1.0573 1.0282
No log 4.8095 404 1.0208 0.3747 1.0208 1.0103
No log 4.8333 406 1.0239 0.3861 1.0239 1.0119
No log 4.8571 408 1.0376 0.4119 1.0376 1.0186
No log 4.8810 410 0.9971 0.4008 0.9971 0.9986
No log 4.9048 412 0.9997 0.4373 0.9997 0.9998
No log 4.9286 414 1.0218 0.4379 1.0218 1.0108
No log 4.9524 416 0.9744 0.4373 0.9744 0.9871
No log 4.9762 418 0.9294 0.4153 0.9294 0.9641
No log 5.0 420 0.9382 0.3765 0.9382 0.9686
No log 5.0238 422 0.9386 0.3540 0.9386 0.9688
No log 5.0476 424 0.9804 0.3338 0.9804 0.9901
No log 5.0714 426 0.9917 0.3634 0.9917 0.9958
No log 5.0952 428 0.9766 0.3652 0.9766 0.9883
No log 5.1190 430 0.9812 0.3652 0.9812 0.9905
No log 5.1429 432 0.9576 0.3915 0.9576 0.9786
No log 5.1667 434 0.9589 0.3797 0.9589 0.9792
No log 5.1905 436 0.9664 0.3059 0.9664 0.9831
No log 5.2143 438 0.9619 0.3802 0.9619 0.9808
No log 5.2381 440 1.0598 0.3862 1.0598 1.0295
No log 5.2619 442 1.1043 0.3953 1.1043 1.0509
No log 5.2857 444 1.0164 0.3939 1.0164 1.0081
No log 5.3095 446 0.9081 0.3609 0.9081 0.9529
No log 5.3333 448 0.8922 0.3622 0.8922 0.9446
No log 5.3571 450 0.9033 0.3820 0.9033 0.9504
No log 5.3810 452 0.9367 0.4503 0.9367 0.9678
No log 5.4048 454 1.0630 0.4379 1.0630 1.0310
No log 5.4286 456 1.1063 0.4002 1.1063 1.0518
No log 5.4524 458 1.0431 0.3771 1.0431 1.0213
No log 5.4762 460 0.9669 0.3615 0.9669 0.9833
No log 5.5 462 0.9095 0.3744 0.9095 0.9537
No log 5.5238 464 0.9132 0.3762 0.9132 0.9556
No log 5.5476 466 0.9480 0.3048 0.9480 0.9737
No log 5.5714 468 1.0176 0.3463 1.0176 1.0088
No log 5.5952 470 1.0958 0.2947 1.0958 1.0468
No log 5.6190 472 1.0975 0.3149 1.0975 1.0476
No log 5.6429 474 1.0874 0.3453 1.0874 1.0428
No log 5.6667 476 1.1128 0.3595 1.1128 1.0549
No log 5.6905 478 1.1094 0.3573 1.1094 1.0533
No log 5.7143 480 1.1533 0.2749 1.1533 1.0739
No log 5.7381 482 1.2202 0.2870 1.2202 1.1046
No log 5.7619 484 1.1861 0.2832 1.1861 1.0891
No log 5.7857 486 1.0657 0.2907 1.0657 1.0323
No log 5.8095 488 1.0063 0.3383 1.0063 1.0031
No log 5.8333 490 0.9842 0.3725 0.9842 0.9921
No log 5.8571 492 1.0041 0.3565 1.0041 1.0020
No log 5.8810 494 1.0170 0.4181 1.0170 1.0085
No log 5.9048 496 1.0366 0.4191 1.0366 1.0182
No log 5.9286 498 1.0184 0.4375 1.0184 1.0092
0.2992 5.9524 500 0.9742 0.4630 0.9742 0.9870
0.2992 5.9762 502 0.9394 0.4271 0.9394 0.9692
0.2992 6.0 504 0.9537 0.4503 0.9537 0.9766
0.2992 6.0238 506 0.9620 0.4503 0.9620 0.9808
0.2992 6.0476 508 0.9492 0.4271 0.9492 0.9743
0.2992 6.0714 510 0.9607 0.3693 0.9607 0.9801
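
Note that the log ends at epoch 6.0714 of the 100 configured epochs, and the headline metrics correspond to this last logged row, not to the best checkpoint seen during training. A small sketch for locating the best validation Qwk, using a few rows sampled from the table above rather than the full log:

```python
# (epoch, validation_loss, qwk) for a few rows sampled from the table above.
rows = [
    (0.0238, 3.9453, -0.0254),
    (1.8810, 0.9001, 0.4734),
    (5.9524, 0.9742, 0.4630),
    (6.0714, 0.9607, 0.3693),  # final row, matching the headline metrics
]

# Pick the row with the highest validation Qwk.
best_epoch, best_loss, best_qwk = max(rows, key=lambda r: r[2])
print(best_epoch, best_qwk)  # -> 1.881 0.4734
```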

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k18_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.