ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k9_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6042
  • Qwk (quadratic weighted kappa): 0.375
  • Mse (mean squared error): 0.6042
  • Rmse (root mean squared error): 0.7773
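For reference, Qwk is Cohen's kappa with quadratic weights, and Rmse is simply the square root of Mse. A minimal pure-Python sketch of both relationships (the label values and 3-class setup below are illustrative assumptions, not this model's actual score scale):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, computed from scratch."""
    n = len(y_true)
    # Observed co-occurrence matrix of (true, predicted) label pairs.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix under chance agreement, from the marginal histograms.
    hist_t = [y_true.count(c) for c in range(n_classes)]
    hist_p = [y_pred.count(c) for c in range(n_classes)]
    expected = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance.
    weight = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
              for i in range(n_classes)]
    num = sum(weight[i][j] * observed[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(weight[i][j] * expected[i][j]
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

# Rmse in this card is just sqrt(Mse): sqrt(0.6042) ≈ 0.7773.
print(round(math.sqrt(0.6042), 4))

# Perfect agreement yields kappa = 1.0 (illustrative labels, 3 classes assumed).
print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], 3))
```

Note that scikit-learn's `cohen_kappa_score(..., weights="quadratic")` computes the same quantity; the hand-rolled version above only serves to make the metric's definition explicit.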

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
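With lr_scheduler_type: linear and no warmup configured, the learning rate decays linearly from 2e-05 to 0 over the run. Assuming 43 optimizer steps per epoch (which the step/epoch columns in the results table imply, i.e. 430 steps total), the schedule's arithmetic can be sketched in plain Python — this is a sketch of what the scheduler computes, not the Transformers implementation itself:

```python
BASE_LR = 2e-05
TOTAL_STEPS = 430  # 43 steps/epoch x 10 epochs, inferred from the results table

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS, warmup_steps=0):
    """Linear warmup (assumed 0 here) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * (remaining / max(1, total_steps - warmup_steps))

# Full rate at step 0, half at the midpoint, zero at the end of epoch 10.
print(linear_lr(0), linear_lr(215), linear_lr(430))
```

In a Transformers training script this schedule would typically come from `get_linear_schedule_with_warmup` (or `lr_scheduler_type="linear"` in `TrainingArguments`) rather than being written by hand.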

Training results

Validation metrics were recorded every 2 steps. The "No log" entries in the first column mean no training loss was logged between evaluations: the run's 430 total steps fall below the Trainer's default logging interval of 500 steps.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0465 2 3.2319 -0.0160 3.2319 1.7978
No log 0.0930 4 1.7152 0.0255 1.7152 1.3097
No log 0.1395 6 1.2155 0.0255 1.2155 1.1025
No log 0.1860 8 0.9023 0.0118 0.9023 0.9499
No log 0.2326 10 0.6263 0.1467 0.6263 0.7914
No log 0.2791 12 0.6229 0.0569 0.6229 0.7892
No log 0.3256 14 0.6500 0.1008 0.6500 0.8062
No log 0.3721 16 0.9170 0.1402 0.9170 0.9576
No log 0.4186 18 0.6752 0.0388 0.6752 0.8217
No log 0.4651 20 0.6870 0.0 0.6870 0.8288
No log 0.5116 22 0.6346 0.0 0.6346 0.7966
No log 0.5581 24 0.5999 -0.0081 0.5999 0.7745
No log 0.6047 26 0.6353 0.0327 0.6353 0.7971
No log 0.6512 28 0.6596 0.0952 0.6596 0.8121
No log 0.6977 30 0.7022 0.0805 0.7022 0.8380
No log 0.7442 32 0.6506 0.0692 0.6506 0.8066
No log 0.7907 34 0.6850 0.0476 0.6850 0.8276
No log 0.8372 36 0.8403 0.1484 0.8403 0.9167
No log 0.8837 38 0.7983 0.1323 0.7983 0.8935
No log 0.9302 40 0.8224 0.1064 0.8224 0.9068
No log 0.9767 42 0.9035 0.1179 0.9035 0.9505
No log 1.0233 44 0.7064 0.2593 0.7064 0.8405
No log 1.0698 46 0.8482 0.0385 0.8482 0.9210
No log 1.1163 48 0.7098 0.2405 0.7098 0.8425
No log 1.1628 50 0.8081 0.1619 0.8081 0.8990
No log 1.2093 52 0.9779 0.1055 0.9779 0.9889
No log 1.2558 54 0.7628 0.2077 0.7628 0.8734
No log 1.3023 56 0.5638 0.1407 0.5638 0.7509
No log 1.3488 58 0.6617 0.0080 0.6617 0.8135
No log 1.3953 60 0.7394 0.0080 0.7394 0.8599
No log 1.4419 62 0.6013 0.0 0.6013 0.7754
No log 1.4884 64 0.6346 0.2201 0.6346 0.7966
No log 1.5349 66 0.7522 0.2077 0.7522 0.8673
No log 1.5814 68 0.8094 0.1228 0.8094 0.8997
No log 1.6279 70 0.5963 0.3684 0.5963 0.7722
No log 1.6744 72 0.6517 0.0604 0.6517 0.8073
No log 1.7209 74 0.7799 0.0380 0.7799 0.8831
No log 1.7674 76 0.6828 0.1195 0.6828 0.8263
No log 1.8140 78 0.5573 0.2179 0.5573 0.7465
No log 1.8605 80 0.6641 0.2637 0.6641 0.8149
No log 1.9070 82 0.5475 0.2471 0.5475 0.7399
No log 1.9535 84 0.5472 0.2832 0.5472 0.7397
No log 2.0 86 0.5965 0.3089 0.5965 0.7723
No log 2.0465 88 0.7196 0.1786 0.7196 0.8483
No log 2.0930 90 0.6216 0.3617 0.6216 0.7884
No log 2.1395 92 0.7520 0.1429 0.7520 0.8672
No log 2.1860 94 0.6626 0.3200 0.6626 0.8140
No log 2.2326 96 0.5550 0.2179 0.5550 0.7450
No log 2.2791 98 0.5617 0.2083 0.5617 0.7495
No log 2.3256 100 0.6442 0.3786 0.6442 0.8026
No log 2.3721 102 0.8466 0.2191 0.8466 0.9201
No log 2.4186 104 0.9294 0.1587 0.9294 0.9640
No log 2.4651 106 0.6929 0.3786 0.6929 0.8324
No log 2.5116 108 0.6121 0.3524 0.6121 0.7824
No log 2.5581 110 0.7215 0.2676 0.7215 0.8494
No log 2.6047 112 1.0525 0.2171 1.0525 1.0259
No log 2.6512 114 1.2549 0.0102 1.2549 1.1202
No log 2.6977 116 0.8882 0.2184 0.8882 0.9425
No log 2.7442 118 0.5858 0.2780 0.5858 0.7654
No log 2.7907 120 0.6727 0.3498 0.6727 0.8202
No log 2.8372 122 1.0466 0.1095 1.0466 1.0231
No log 2.8837 124 1.1144 0.0933 1.1144 1.0556
No log 2.9302 126 0.7676 0.2542 0.7676 0.8761
No log 2.9767 128 0.5370 0.2842 0.5370 0.7328
No log 3.0233 130 0.5705 0.4227 0.5705 0.7553
No log 3.0698 132 0.7552 0.2900 0.7552 0.8690
No log 3.1163 134 0.7387 0.2900 0.7387 0.8595
No log 3.1628 136 0.7200 0.2920 0.7200 0.8485
No log 3.2093 138 0.8429 0.2184 0.8429 0.9181
No log 3.2558 140 0.7714 0.3195 0.7714 0.8783
No log 3.3023 142 0.5788 0.3089 0.5788 0.7608
No log 3.3488 144 0.6245 0.4051 0.6245 0.7903
No log 3.3953 146 0.8190 0.2131 0.8190 0.9050
No log 3.4419 148 0.7552 0.3388 0.7552 0.8690
No log 3.4884 150 0.5827 0.24 0.5827 0.7633
No log 3.5349 152 0.5820 0.3180 0.5820 0.7629
No log 3.5814 154 0.7462 0.3702 0.7462 0.8638
No log 3.6279 156 0.8575 0.25 0.8575 0.9260
No log 3.6744 158 0.7534 0.3103 0.7534 0.8680
No log 3.7209 160 0.7710 0.2775 0.7710 0.8781
No log 3.7674 162 1.1379 0.1141 1.1379 1.0667
No log 3.8140 164 1.1172 0.0621 1.1172 1.0570
No log 3.8605 166 0.6464 0.3237 0.6464 0.8040
No log 3.9070 168 0.5666 0.2350 0.5666 0.7527
No log 3.9535 170 0.5603 0.3407 0.5603 0.7485
No log 4.0 172 0.8331 0.2542 0.8331 0.9127
No log 4.0465 174 1.0857 0.1439 1.0857 1.0420
No log 4.0930 176 0.9228 0.3030 0.9228 0.9606
No log 4.1395 178 0.5654 0.3299 0.5654 0.7519
No log 4.1860 180 0.5851 0.4286 0.5851 0.7649
No log 4.2326 182 0.5493 0.3684 0.5493 0.7412
No log 4.2791 184 0.7955 0.3414 0.7955 0.8919
No log 4.3256 186 1.2087 0.0345 1.2087 1.0994
No log 4.3721 188 1.1352 0.1886 1.1352 1.0654
No log 4.4186 190 0.6819 0.3585 0.6819 0.8258
No log 4.4651 192 0.5190 0.4536 0.5190 0.7204
No log 4.5116 194 0.6314 0.3242 0.6314 0.7946
No log 4.5581 196 0.5524 0.4167 0.5524 0.7433
No log 4.6047 198 0.5305 0.3118 0.5305 0.7284
No log 4.6512 200 0.6293 0.3706 0.6293 0.7933
No log 4.6977 202 0.5763 0.3797 0.5763 0.7591
No log 4.7442 204 0.5295 0.3878 0.5295 0.7277
No log 4.7907 206 0.5582 0.3299 0.5582 0.7471
No log 4.8372 208 0.5655 0.4227 0.5655 0.7520
No log 4.8837 210 0.9712 0.2115 0.9712 0.9855
No log 4.9302 212 1.2212 0.1429 1.2212 1.1051
No log 4.9767 214 0.9911 0.1642 0.9911 0.9956
No log 5.0233 216 0.6064 0.4510 0.6064 0.7787
No log 5.0698 218 0.5753 0.3704 0.5753 0.7585
No log 5.1163 220 0.5535 0.3905 0.5535 0.7440
No log 5.1628 222 0.5635 0.3878 0.5635 0.7507
No log 5.2093 224 0.5759 0.3641 0.5759 0.7589
No log 5.2558 226 0.6389 0.2965 0.6389 0.7993
No log 5.3023 228 0.5548 0.3263 0.5548 0.7448
No log 5.3488 230 0.4924 0.4225 0.4924 0.7017
No log 5.3953 232 0.5166 0.3263 0.5166 0.7188
No log 5.4419 234 0.6234 0.3725 0.6234 0.7896
No log 5.4884 236 0.6310 0.3725 0.6310 0.7943
No log 5.5349 238 0.6226 0.4059 0.6226 0.7890
No log 5.5814 240 0.6704 0.4010 0.6704 0.8188
No log 5.6279 242 0.5760 0.3398 0.5760 0.7589
No log 5.6744 244 0.5467 0.3398 0.5467 0.7394
No log 5.7209 246 0.5735 0.3398 0.5735 0.7573
No log 5.7674 248 0.5823 0.3892 0.5823 0.7631
No log 5.8140 250 0.6838 0.3623 0.6838 0.8269
No log 5.8605 252 0.7376 0.3514 0.7376 0.8588
No log 5.9070 254 0.6440 0.3892 0.6440 0.8025
No log 5.9535 256 0.5729 0.3237 0.5729 0.7569
No log 6.0 258 0.6166 0.3831 0.6166 0.7852
No log 6.0465 260 0.8510 0.2698 0.8510 0.9225
No log 6.0930 262 0.9259 0.2797 0.9259 0.9623
No log 6.1395 264 0.7171 0.3091 0.7171 0.8468
No log 6.1860 266 0.5636 0.3645 0.5636 0.7507
No log 6.2326 268 0.5692 0.3433 0.5692 0.7545
No log 6.2791 270 0.5523 0.3402 0.5523 0.7431
No log 6.3256 272 0.7389 0.3271 0.7389 0.8596
No log 6.3721 274 0.9861 0.1698 0.9861 0.9930
No log 6.4186 276 0.9280 0.1692 0.9280 0.9633
No log 6.4651 278 0.7160 0.3684 0.7160 0.8461
No log 6.5116 280 0.5457 0.3623 0.5457 0.7387
No log 6.5581 282 0.5655 0.3548 0.5655 0.7520
No log 6.6047 284 0.5698 0.3548 0.5698 0.7548
No log 6.6512 286 0.5315 0.3508 0.5315 0.7290
No log 6.6977 288 0.5608 0.3433 0.5608 0.7489
No log 6.7442 290 0.6712 0.3725 0.6712 0.8193
No log 6.7907 292 0.6719 0.3725 0.6719 0.8197
No log 6.8372 294 0.5937 0.4400 0.5937 0.7705
No log 6.8837 296 0.5283 0.4171 0.5283 0.7268
No log 6.9302 298 0.5282 0.36 0.5282 0.7268
No log 6.9767 300 0.5742 0.5025 0.5742 0.7578
No log 7.0233 302 0.6452 0.4400 0.6452 0.8032
No log 7.0698 304 0.6850 0.4340 0.6850 0.8277
No log 7.1163 306 0.6787 0.4783 0.6787 0.8238
No log 7.1628 308 0.6677 0.4455 0.6677 0.8172
No log 7.2093 310 0.6787 0.4450 0.6787 0.8239
No log 7.2558 312 0.6228 0.4112 0.6228 0.7892
No log 7.3023 314 0.5599 0.3469 0.5599 0.7483
No log 7.3488 316 0.5607 0.3990 0.5607 0.7488
No log 7.3953 318 0.6279 0.3814 0.6279 0.7924
No log 7.4419 320 0.6624 0.3367 0.6624 0.8139
No log 7.4884 322 0.6532 0.3367 0.6532 0.8082
No log 7.5349 324 0.6712 0.3367 0.6712 0.8193
No log 7.5814 326 0.6131 0.3439 0.6131 0.7830
No log 7.6279 328 0.5574 0.3161 0.5574 0.7466
No log 7.6744 330 0.5587 0.3469 0.5587 0.7475
No log 7.7209 332 0.6108 0.3814 0.6108 0.7815
No log 7.7674 334 0.7439 0.3301 0.7439 0.8625
No log 7.8140 336 0.8061 0.2881 0.8061 0.8978
No log 7.8605 338 0.7651 0.3973 0.7651 0.8747
No log 7.9070 340 0.6580 0.3814 0.6580 0.8111
No log 7.9535 342 0.5903 0.3535 0.5903 0.7683
No log 8.0 344 0.5864 0.3535 0.5864 0.7657
No log 8.0465 346 0.5949 0.3535 0.5949 0.7713
No log 8.0930 348 0.5765 0.3333 0.5765 0.7593
No log 8.1395 350 0.5708 0.3333 0.5708 0.7555
No log 8.1860 352 0.5678 0.3333 0.5678 0.7535
No log 8.2326 354 0.5711 0.3333 0.5711 0.7557
No log 8.2791 356 0.6076 0.375 0.6076 0.7795
No log 8.3256 358 0.6953 0.3663 0.6953 0.8339
No log 8.3721 360 0.7490 0.3607 0.7490 0.8654
No log 8.4186 362 0.7106 0.3548 0.7106 0.8430
No log 8.4651 364 0.6274 0.3299 0.6274 0.7921
No log 8.5116 366 0.5579 0.3469 0.5579 0.7469
No log 8.5581 368 0.5463 0.3508 0.5463 0.7391
No log 8.6047 370 0.5508 0.3469 0.5508 0.7421
No log 8.6512 372 0.5532 0.3469 0.5532 0.7438
No log 8.6977 374 0.5550 0.3469 0.5550 0.7450
No log 8.7442 376 0.5607 0.3469 0.5607 0.7488
No log 8.7907 378 0.5848 0.4051 0.5848 0.7647
No log 8.8372 380 0.6424 0.4112 0.6424 0.8015
No log 8.8837 382 0.6926 0.2941 0.6926 0.8322
No log 8.9302 384 0.6994 0.2919 0.6994 0.8363
No log 8.9767 386 0.6802 0.3367 0.6802 0.8247
No log 9.0233 388 0.6745 0.3367 0.6745 0.8213
No log 9.0698 390 0.6388 0.3814 0.6388 0.7992
No log 9.1163 392 0.6271 0.4400 0.6271 0.7919
No log 9.1628 394 0.6093 0.4051 0.6093 0.7806
No log 9.2093 396 0.5995 0.3535 0.5995 0.7743
No log 9.2558 398 0.5865 0.3161 0.5865 0.7658
No log 9.3023 400 0.5910 0.3161 0.5910 0.7687
No log 9.3488 402 0.6030 0.4400 0.6030 0.7766
No log 9.3953 404 0.6176 0.4112 0.6176 0.7858
No log 9.4419 406 0.6239 0.4112 0.6239 0.7899
No log 9.4884 408 0.6251 0.4112 0.6251 0.7906
No log 9.5349 410 0.6280 0.4112 0.6280 0.7925
No log 9.5814 412 0.6350 0.4112 0.6350 0.7969
No log 9.6279 414 0.6339 0.4112 0.6339 0.7962
No log 9.6744 416 0.6281 0.4112 0.6281 0.7925
No log 9.7209 418 0.6163 0.4112 0.6163 0.7850
No log 9.7674 420 0.6094 0.4112 0.6094 0.7807
No log 9.8140 422 0.6064 0.375 0.6064 0.7787
No log 9.8605 424 0.6052 0.375 0.6052 0.7779
No log 9.9070 426 0.6062 0.375 0.6062 0.7786
No log 9.9535 428 0.6052 0.375 0.6052 0.7779
No log 10.0 430 0.6042 0.375 0.6042 0.7773
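When selecting a checkpoint from a log like this, note that the step with the lowest validation loss and the step with the highest Qwk need not coincide — here the final checkpoint (step 430) is best on neither. A small sketch using a few (step, validation loss, Qwk) triples copied from rows of the table above:

```python
# (step, validation_loss, qwk) triples copied from a few rows of the table above
rows = [
    (230, 0.4924, 0.4225),  # epoch ~5.35: lowest validation loss in the run
    (300, 0.5742, 0.5025),  # epoch ~6.98: highest Qwk in the run
    (342, 0.5903, 0.3535),  # epoch ~7.95
    (430, 0.6042, 0.3750),  # epoch 10.0: final checkpoint, reported above
]

best_by_loss = min(rows, key=lambda r: r[1])
best_by_qwk = max(rows, key=lambda r: r[2])

print("lowest validation loss at step", best_by_loss[0])
print("highest Qwk at step", best_by_qwk[0])
```

Because the reported metrics come from the last epoch rather than the best one, rerunning with `load_best_model_at_end=True` and `metric_for_best_model` set to the Qwk metric would likely report a stronger checkpoint.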

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k9_task3_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4023 fine-tunes of that base model).