ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k9_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5442
  • Qwk: 0.3892
  • Mse: 0.5442
  • Rmse: 0.7377
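
Since Loss and Mse coincide (0.5442), the training objective was likely mean-squared error on a numeric score, with Rmse = sqrt(Mse) ≈ 0.7377 and Qwk being quadratic weighted kappa. A minimal sketch of how these metrics are computed (the labels and predictions below are made up, not from the actual evaluation set):

```python
# Sketch of the reported metrics on made-up data (not the card's eval set).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 3, 1, 0, 2])  # hypothetical gold scores
y_pred = np.array([0, 1, 1, 2, 2, 1, 1, 2])  # hypothetical predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = float(np.sqrt(mse))                                    # Rmse = sqrt(Mse)
```

On the card's own numbers, sqrt(0.5442) ≈ 0.7377, which matches the reported Rmse.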

Model description

More information needed

Intended uses & limitations

More information needed
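
Although the intended task is undocumented, the "task3_organization" suffix and the regression-style metrics (Mse/Rmse/Qwk) suggest scoring an "organization" trait of Arabic text. A hypothetical inference sketch under that assumption (the head type and label format are not confirmed by the card, and running it downloads the checkpoint):

```python
# Hypothetical usage sketch: load the checkpoint and score one text.
# Assumes a sequence-classification/regression head; the exact task
# format is not documented in this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k9_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "نص عربي للتقييم"  # placeholder Arabic text to score
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits  # shape depends on the (undocumented) head
print(logits)
```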

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
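
The list above maps onto Hugging Face TrainingArguments roughly as follows (a sketch only; output_dir is an assumption, and the actual training script is not published with this card):

```python
# Hypothetical reconstruction of the TrainingArguments implied by the
# hyperparameter list (output_dir is an assumption).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k9_task3_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,   # epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```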

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0465 2 3.4572 -0.0066 3.4572 1.8594
No log 0.0930 4 1.7549 -0.0101 1.7549 1.3247
No log 0.1395 6 1.1272 0.0376 1.1272 1.0617
No log 0.1860 8 1.7482 0.0617 1.7482 1.3222
No log 0.2326 10 0.9512 0.0847 0.9512 0.9753
No log 0.2791 12 0.8519 0.0957 0.8519 0.9230
No log 0.3256 14 0.6201 0.0388 0.6201 0.7874
No log 0.3721 16 0.6267 0.0476 0.6267 0.7917
No log 0.4186 18 0.9489 0.0244 0.9489 0.9741
No log 0.4651 20 0.9988 0.0522 0.9988 0.9994
No log 0.5116 22 0.7931 0.0980 0.7931 0.8906
No log 0.5581 24 0.6205 -0.0303 0.6205 0.7877
No log 0.6047 26 0.6066 0.0569 0.6066 0.7789
No log 0.6512 28 0.6063 -0.0081 0.6063 0.7787
No log 0.6977 30 0.6731 0.0877 0.6731 0.8204
No log 0.7442 32 0.8507 0.1050 0.8507 0.9223
No log 0.7907 34 0.7794 0.1238 0.7794 0.8828
No log 0.8372 36 0.6049 0.2308 0.6049 0.7778
No log 0.8837 38 0.7732 0.1475 0.7732 0.8793
No log 0.9302 40 0.8512 0.1930 0.8512 0.9226
No log 0.9767 42 0.6090 0.2000 0.6090 0.7804
No log 1.0233 44 1.2304 0.0657 1.2304 1.1092
No log 1.0698 46 1.6672 0.0924 1.6672 1.2912
No log 1.1163 48 0.9711 0.1417 0.9711 0.9855
No log 1.1628 50 0.5861 0.2308 0.5861 0.7656
No log 1.2093 52 0.6929 0.1915 0.6929 0.8324
No log 1.2558 54 0.5746 0.1781 0.5746 0.7580
No log 1.3023 56 0.6024 0.3103 0.6024 0.7761
No log 1.3488 58 0.7535 0.1930 0.7535 0.8680
No log 1.3953 60 0.7090 0.2381 0.7090 0.8420
No log 1.4419 62 0.5292 0.2298 0.5292 0.7274
No log 1.4884 64 0.8551 0.1597 0.8551 0.9247
No log 1.5349 66 0.9407 0.1648 0.9407 0.9699
No log 1.5814 68 0.6603 0.3171 0.6603 0.8126
No log 1.6279 70 0.6081 0.2281 0.6081 0.7798
No log 1.6744 72 0.6855 0.3016 0.6855 0.8280
No log 1.7209 74 0.5659 0.2405 0.5659 0.7523
No log 1.7674 76 0.6248 0.2516 0.6248 0.7904
No log 1.8140 78 0.7758 0.1818 0.7758 0.8808
No log 1.8605 80 0.5869 0.1773 0.5869 0.7661
No log 1.9070 82 0.5312 0.3067 0.5312 0.7288
No log 1.9535 84 0.5689 0.3333 0.5689 0.7542
No log 2.0 86 0.6154 0.3118 0.6154 0.7845
No log 2.0465 88 0.5536 0.3086 0.5536 0.7440
No log 2.0930 90 0.5018 0.2632 0.5018 0.7084
No log 2.1395 92 0.5387 0.3171 0.5387 0.7340
No log 2.1860 94 0.6165 0.3016 0.6165 0.7852
No log 2.2326 96 0.7576 0.3138 0.7576 0.8704
No log 2.2791 98 0.6015 0.3905 0.6015 0.7756
No log 2.3256 100 0.6472 0.4067 0.6472 0.8045
No log 2.3721 102 1.0972 0.1837 1.0972 1.0475
No log 2.4186 104 1.4312 0.0886 1.4312 1.1963
No log 2.4651 106 1.1889 0.1020 1.1889 1.0903
No log 2.5116 108 0.7436 0.2863 0.7436 0.8623
No log 2.5581 110 0.8284 0.2863 0.8284 0.9102
No log 2.6047 112 1.2385 0.1096 1.2385 1.1129
No log 2.6512 114 1.0140 0.1161 1.0140 1.0070
No log 2.6977 116 0.7627 0.3202 0.7627 0.8733
No log 2.7442 118 0.8940 0.2063 0.8940 0.9455
No log 2.7907 120 0.8767 0.1741 0.8767 0.9363
No log 2.8372 122 0.7615 0.2068 0.7615 0.8726
No log 2.8837 124 0.7768 0.2068 0.7768 0.8814
No log 2.9302 126 0.5842 0.4043 0.5842 0.7644
No log 2.9767 128 0.7597 0.2414 0.7597 0.8716
No log 3.0233 130 0.9051 0.2191 0.9051 0.9514
No log 3.0698 132 0.9013 0.1875 0.9013 0.9494
No log 3.1163 134 0.5636 0.4526 0.5636 0.7507
No log 3.1628 136 0.5721 0.4526 0.5721 0.7564
No log 3.2093 138 0.5700 0.4526 0.5700 0.7550
No log 3.2558 140 0.7402 0.3242 0.7402 0.8604
No log 3.3023 142 0.6772 0.4182 0.6772 0.8229
No log 3.3488 144 0.7289 0.4286 0.7289 0.8538
No log 3.3953 146 0.9002 0.2714 0.9002 0.9488
No log 3.4419 148 0.6229 0.4483 0.6229 0.7892
No log 3.4884 150 0.6325 0.4138 0.6325 0.7953
No log 3.5349 152 0.8803 0.2124 0.8803 0.9382
No log 3.5814 154 0.6588 0.3909 0.6588 0.8117
No log 3.6279 156 0.5171 0.5282 0.5171 0.7191
No log 3.6744 158 0.5385 0.4924 0.5385 0.7339
No log 3.7209 160 0.6468 0.2711 0.6468 0.8042
No log 3.7674 162 0.5719 0.4396 0.5719 0.7563
No log 3.8140 164 0.4983 0.4743 0.4983 0.7059
No log 3.8605 166 0.5911 0.2780 0.5911 0.7688
No log 3.9070 168 0.9206 0.1667 0.9206 0.9595
No log 3.9535 170 0.8014 0.2500 0.8014 0.8952
No log 4.0 172 0.4876 0.4802 0.4876 0.6983
No log 4.0465 174 0.4838 0.4545 0.4838 0.6956
No log 4.0930 176 0.5311 0.4924 0.5311 0.7288
No log 4.1395 178 0.6555 0.3580 0.6555 0.8097
No log 4.1860 180 0.6088 0.3628 0.6088 0.7802
No log 4.2326 182 0.5029 0.4839 0.5029 0.7091
No log 4.2791 184 0.5180 0.4882 0.5180 0.7197
No log 4.3256 186 0.6425 0.3473 0.6425 0.8016
No log 4.3721 188 1.1111 0.2102 1.1111 1.0541
No log 4.4186 190 1.1200 0.2102 1.1200 1.0583
No log 4.4651 192 0.7042 0.4023 0.7042 0.8391
No log 4.5116 194 0.5156 0.4737 0.5156 0.7180
No log 4.5581 196 0.5310 0.4286 0.5310 0.7287
No log 4.6047 198 0.4959 0.4819 0.4959 0.7042
No log 4.6512 200 0.7238 0.3882 0.7238 0.8508
No log 4.6977 202 0.6920 0.4109 0.6920 0.8319
No log 4.7442 204 0.5190 0.4709 0.5190 0.7204
No log 4.7907 206 0.5366 0.4709 0.5366 0.7325
No log 4.8372 208 0.6254 0.3504 0.6254 0.7908
No log 4.8837 210 0.5764 0.4393 0.5764 0.7592
No log 4.9302 212 0.6255 0.3665 0.6255 0.7909
No log 4.9767 214 0.5750 0.4286 0.5750 0.7583
No log 5.0233 216 0.5914 0.4123 0.5914 0.7690
No log 5.0698 218 0.5500 0.4227 0.5500 0.7416
No log 5.1163 220 0.5670 0.4286 0.5670 0.7530
No log 5.1628 222 0.5867 0.4074 0.5867 0.7660
No log 5.2093 224 0.5234 0.4346 0.5234 0.7235
No log 5.2558 226 0.5288 0.4346 0.5288 0.7272
No log 5.3023 228 0.5668 0.4051 0.5668 0.7528
No log 5.3488 230 0.5949 0.3548 0.5949 0.7713
No log 5.3953 232 0.5476 0.4343 0.5476 0.7400
No log 5.4419 234 0.5210 0.4343 0.5210 0.7218
No log 5.4884 236 0.5784 0.3962 0.5784 0.7605
No log 5.5349 238 0.6693 0.3755 0.6693 0.8181
No log 5.5814 240 0.6264 0.3480 0.6264 0.7915
No log 5.6279 242 0.5506 0.4341 0.5506 0.7420
No log 5.6744 244 0.6512 0.3418 0.6512 0.8070
No log 5.7209 246 0.8376 0.2409 0.8376 0.9152
No log 5.7674 248 0.7237 0.3651 0.7237 0.8507
No log 5.8140 250 0.4950 0.4404 0.4950 0.7035
No log 5.8605 252 0.4495 0.5111 0.4495 0.6704
No log 5.9070 254 0.4576 0.4839 0.4576 0.6764
No log 5.9535 256 0.5138 0.4286 0.5138 0.7168
No log 6.0 258 0.6456 0.3793 0.6456 0.8035
No log 6.0465 260 0.7077 0.4050 0.7077 0.8413
No log 6.0930 262 0.6292 0.3391 0.6292 0.7932
No log 6.1395 264 0.5511 0.3990 0.5511 0.7424
No log 6.1860 266 0.5411 0.4286 0.5411 0.7356
No log 6.2326 268 0.5928 0.3892 0.5928 0.7700
No log 6.2791 270 0.5621 0.3990 0.5621 0.7497
No log 6.3256 272 0.5552 0.3990 0.5552 0.7451
No log 6.3721 274 0.5565 0.3990 0.5565 0.7460
No log 6.4186 276 0.6098 0.3524 0.6098 0.7809
No log 6.4651 278 0.6448 0.3153 0.6448 0.8030
No log 6.5116 280 0.5861 0.3600 0.5861 0.7655
No log 6.5581 282 0.5055 0.4667 0.5055 0.7110
No log 6.6047 284 0.4965 0.4917 0.4965 0.7046
No log 6.6512 286 0.5052 0.4483 0.5052 0.7108
No log 6.6977 288 0.5675 0.3407 0.5675 0.7533
No log 6.7442 290 0.6761 0.3103 0.6761 0.8223
No log 6.7907 292 0.6821 0.2727 0.6821 0.8259
No log 6.8372 294 0.6402 0.2775 0.6402 0.8001
No log 6.8837 296 0.6491 0.2743 0.6491 0.8057
No log 6.9302 298 0.5770 0.3846 0.5770 0.7596
No log 6.9767 300 0.5374 0.4175 0.5374 0.7331
No log 7.0233 302 0.5449 0.4123 0.5449 0.7381
No log 7.0698 304 0.6131 0.3305 0.6131 0.7830
No log 7.1163 306 0.6554 0.3834 0.6554 0.8095
No log 7.1628 308 0.6691 0.3834 0.6691 0.8180
No log 7.2093 310 0.6681 0.3548 0.6681 0.8174
No log 7.2558 312 0.6550 0.3613 0.6550 0.8093
No log 7.3023 314 0.7104 0.3333 0.7104 0.8428
No log 7.3488 316 0.6733 0.2759 0.6733 0.8205
No log 7.3953 318 0.5694 0.3892 0.5694 0.7546
No log 7.4419 320 0.4980 0.4348 0.4980 0.7057
No log 7.4884 322 0.4877 0.5056 0.4877 0.6984
No log 7.5349 324 0.4894 0.4917 0.4894 0.6996
No log 7.5814 326 0.4998 0.4350 0.4998 0.7070
No log 7.6279 328 0.5249 0.4639 0.5249 0.7245
No log 7.6744 330 0.5663 0.3990 0.5663 0.7525
No log 7.7209 332 0.5927 0.3892 0.5927 0.7699
No log 7.7674 334 0.5688 0.3990 0.5688 0.7542
No log 7.8140 336 0.5243 0.4462 0.5243 0.7241
No log 7.8605 338 0.5207 0.4694 0.5207 0.7216
No log 7.9070 340 0.5330 0.4400 0.5330 0.7301
No log 7.9535 342 0.5851 0.4286 0.5851 0.7649
No log 8.0 344 0.6667 0.3333 0.6667 0.8165
No log 8.0465 346 0.6894 0.3306 0.6894 0.8303
No log 8.0930 348 0.6400 0.3684 0.6400 0.8000
No log 8.1395 350 0.5899 0.4175 0.5899 0.7680
No log 8.1860 352 0.5420 0.4639 0.5420 0.7362
No log 8.2326 354 0.5332 0.4639 0.5332 0.7302
No log 8.2791 356 0.5549 0.4286 0.5549 0.7449
No log 8.3256 358 0.5924 0.3394 0.5924 0.7697
No log 8.3721 360 0.6195 0.3648 0.6195 0.7871
No log 8.4186 362 0.6400 0.3648 0.6400 0.8000
No log 8.4651 364 0.6037 0.3722 0.6037 0.7770
No log 8.5116 366 0.5524 0.4229 0.5524 0.7432
No log 8.5581 368 0.5460 0.4286 0.5460 0.7389
No log 8.6047 370 0.5306 0.4171 0.5306 0.7284
No log 8.6512 372 0.5262 0.4455 0.5262 0.7254
No log 8.6977 374 0.5450 0.4171 0.5450 0.7382
No log 8.7442 376 0.5866 0.3665 0.5866 0.7659
No log 8.7907 378 0.6064 0.3684 0.6064 0.7787
No log 8.8372 380 0.6136 0.3648 0.6136 0.7833
No log 8.8837 382 0.6090 0.3394 0.6090 0.7804
No log 8.9302 384 0.6129 0.3363 0.6129 0.7829
No log 8.9767 386 0.6120 0.3333 0.6120 0.7823
No log 9.0233 388 0.6206 0.3333 0.6206 0.7878
No log 9.0698 390 0.5993 0.3394 0.5993 0.7741
No log 9.1163 392 0.5751 0.3665 0.5751 0.7583
No log 9.1628 394 0.5633 0.3744 0.5633 0.7505
No log 9.2093 396 0.5652 0.3744 0.5652 0.7518
No log 9.2558 398 0.5828 0.3394 0.5828 0.7634
No log 9.3023 400 0.5963 0.3394 0.5963 0.7722
No log 9.3488 402 0.6002 0.3394 0.6002 0.7747
No log 9.3953 404 0.5990 0.3394 0.5990 0.7740
No log 9.4419 406 0.5946 0.3394 0.5946 0.7711
No log 9.4884 408 0.5738 0.3427 0.5738 0.7575
No log 9.5349 410 0.5586 0.3892 0.5586 0.7474
No log 9.5814 412 0.5531 0.3892 0.5531 0.7437
No log 9.6279 414 0.5453 0.3892 0.5453 0.7384
No log 9.6744 416 0.5448 0.3892 0.5448 0.7381
No log 9.7209 418 0.5435 0.3892 0.5435 0.7372
No log 9.7674 420 0.5398 0.4175 0.5398 0.7347
No log 9.8140 422 0.5405 0.4175 0.5405 0.7352
No log 9.8605 424 0.5404 0.4175 0.5404 0.7351
No log 9.9070 426 0.5417 0.4175 0.5417 0.7360
No log 9.9535 428 0.5437 0.3892 0.5437 0.7373
No log 10.0 430 0.5442 0.3892 0.5442 0.7377

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k9_task3_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02