ArabicNewSplits5_FineTuningAraBERT_run2_AugV5_k7_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card does not record which). It achieves the following results on the evaluation set:

  • Loss: 0.4984
  • Qwk: 0.4595
  • Mse: 0.4984
  • Rmse: 0.7060
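
The Qwk figure is quadratic weighted kappa, a chance-corrected agreement score for ordinal labels, and Rmse is just the square root of Mse (sqrt(0.4984) ≈ 0.7060). A minimal pure-Python sketch of both metrics, assuming integer ratings in 0..n_classes-1 (the actual label scheme is not documented in this card):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights over integer ratings."""
    # Observed confusion matrix.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    # Marginal histograms; the expected matrix is their outer product / n.
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight
            num += w * O[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Perfect agreement yields a kappa of 1.0; the 0.4595 above indicates moderate agreement between predicted and gold ratings.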

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
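
The hyperparameters above can be expressed as a Trainer configuration. This is a sketch, not the authors' actual script: the dataset is undocumented, `output_dir` is a placeholder, and `num_labels=1` is an assumption (a single-output regression head, consistent with the MSE/RMSE eval metrics).

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Assumption: one regression output, so the Trainer uses MSE loss.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=1)

args = TrainingArguments(
    output_dir="outputs",            # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default,
    # matching the optimizer listed above.
)
```

A `Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)` would then reproduce the schedule, given the (undocumented) data.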

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0465 2 3.1771 -0.0014 3.1771 1.7824
No log 0.0930 4 1.8248 -0.0130 1.8248 1.3509
No log 0.1395 6 1.6001 0.0390 1.6001 1.2650
No log 0.1860 8 1.4545 0.0130 1.4545 1.2060
No log 0.2326 10 0.9374 0.1464 0.9374 0.9682
No log 0.2791 12 0.5887 0.0286 0.5887 0.7673
No log 0.3256 14 0.6512 -0.0159 0.6512 0.8070
No log 0.3721 16 0.8097 -0.2414 0.8097 0.8998
No log 0.4186 18 1.0902 -0.0268 1.0902 1.0441
No log 0.4651 20 1.2290 0.0000 1.2290 1.1086
No log 0.5116 22 1.3512 0.0000 1.3512 1.1624
No log 0.5581 24 1.1242 0.0000 1.1242 1.0603
No log 0.6047 26 0.8776 0.0388 0.8776 0.9368
No log 0.6512 28 0.7513 0.0899 0.7513 0.8668
No log 0.6977 30 0.7005 0.1111 0.7005 0.8370
No log 0.7442 32 0.7277 0.2258 0.7277 0.8531
No log 0.7907 34 0.9080 0.0794 0.9080 0.9529
No log 0.8372 36 0.8416 0.0894 0.8416 0.9174
No log 0.8837 38 0.7594 0.1443 0.7594 0.8714
No log 0.9302 40 0.6652 0.2201 0.6652 0.8156
No log 0.9767 42 0.6503 0.2099 0.6503 0.8064
No log 1.0233 44 0.5946 0.2340 0.5946 0.7711
No log 1.0698 46 0.7504 0.2000 0.7504 0.8663
No log 1.1163 48 0.8933 0.1712 0.8933 0.9452
No log 1.1628 50 0.6611 0.1913 0.6611 0.8131
No log 1.2093 52 0.5686 0.0476 0.5686 0.7540
No log 1.2558 54 0.5706 0.0476 0.5706 0.7554
No log 1.3023 56 0.5968 0.2821 0.5968 0.7725
No log 1.3488 58 0.9351 0.1736 0.9351 0.9670
No log 1.3953 60 0.9219 0.1736 0.9219 0.9602
No log 1.4419 62 0.6536 0.2000 0.6536 0.8085
No log 1.4884 64 0.5968 -0.0233 0.5968 0.7725
No log 1.5349 66 0.5761 -0.0081 0.5761 0.7590
No log 1.5814 68 0.5552 0.1884 0.5552 0.7451
No log 1.6279 70 0.7048 0.1919 0.7048 0.8395
No log 1.6744 72 0.9368 0.1276 0.9368 0.9679
No log 1.7209 74 0.8543 0.2000 0.8543 0.9243
No log 1.7674 76 0.5443 0.2593 0.5443 0.7377
No log 1.8140 78 0.5387 0.1111 0.5387 0.7339
No log 1.8605 80 0.5094 0.0933 0.5094 0.7137
No log 1.9070 82 0.7147 0.2850 0.7147 0.8454
No log 1.9535 84 1.1452 0.0968 1.1452 1.0701
No log 2.0 86 1.1445 0.0916 1.1445 1.0698
No log 2.0465 88 0.5698 0.3103 0.5698 0.7548
No log 2.0930 90 0.6227 0.1407 0.6227 0.7891
No log 2.1395 92 0.6674 0.1429 0.6674 0.8169
No log 2.1860 94 0.5015 0.1220 0.5015 0.7082
No log 2.2326 96 0.6673 0.2549 0.6673 0.8169
No log 2.2791 98 0.8627 0.1644 0.8627 0.9288
No log 2.3256 100 0.6575 0.2464 0.6575 0.8108
No log 2.3721 102 0.5660 0.2381 0.5660 0.7523
No log 2.4186 104 0.4965 0.2121 0.4965 0.7046
No log 2.4651 106 0.5124 0.0476 0.5124 0.7158
No log 2.5116 108 0.5160 0.1628 0.5160 0.7184
No log 2.5581 110 0.5415 0.2704 0.5415 0.7359
No log 2.6047 112 0.5908 0.2184 0.5908 0.7686
No log 2.6512 114 0.5228 0.3253 0.5228 0.7230
No log 2.6977 116 0.5428 0.3103 0.5428 0.7367
No log 2.7442 118 0.6237 0.2527 0.6237 0.7897
No log 2.7907 120 0.8688 0.1673 0.8688 0.9321
No log 2.8372 122 0.9005 0.1385 0.9005 0.9489
No log 2.8837 124 0.9952 0.0222 0.9952 0.9976
No log 2.9302 126 0.8650 0.1347 0.8650 0.9301
No log 2.9767 128 0.6642 0.2593 0.6642 0.8150
No log 3.0233 130 0.7428 0.2239 0.7428 0.8618
No log 3.0698 132 0.5916 0.3617 0.5916 0.7692
No log 3.1163 134 0.7108 0.3469 0.7108 0.8431
No log 3.1628 136 0.8032 0.1636 0.8032 0.8962
No log 3.2093 138 0.5610 0.3927 0.5610 0.7490
No log 3.2558 140 0.5085 0.3469 0.5085 0.7131
No log 3.3023 142 0.5185 0.3469 0.5185 0.7200
No log 3.3488 144 0.5529 0.3171 0.5529 0.7436
No log 3.3953 146 0.5163 0.4620 0.5163 0.7186
No log 3.4419 148 0.6084 0.2746 0.6084 0.7800
No log 3.4884 150 0.5784 0.3548 0.5784 0.7605
No log 3.5349 152 0.5467 0.3224 0.5467 0.7394
No log 3.5814 154 0.7033 0.2074 0.7033 0.8387
No log 3.6279 156 0.5848 0.2821 0.5848 0.7647
No log 3.6744 158 0.5641 0.2644 0.5641 0.7510
No log 3.7209 160 0.6521 0.2766 0.6521 0.8075
No log 3.7674 162 0.6580 0.2593 0.6580 0.8112
No log 3.8140 164 0.6110 0.1902 0.6110 0.7817
No log 3.8605 166 0.6338 0.2485 0.6338 0.7961
No log 3.9070 168 0.6168 0.0064 0.6168 0.7854
No log 3.9535 170 0.6306 0.0180 0.6306 0.7941
No log 4.0 172 0.6352 0.1461 0.6352 0.7970
No log 4.0465 174 0.6400 0.1913 0.6400 0.8000
No log 4.0930 176 0.8677 0.2000 0.8677 0.9315
No log 4.1395 178 0.8135 0.2000 0.8135 0.9020
No log 4.1860 180 0.6378 0.1919 0.6378 0.7986
No log 4.2326 182 1.0495 0.1289 1.0495 1.0244
No log 4.2791 184 1.0627 0.1289 1.0627 1.0309
No log 4.3256 186 0.6704 0.1683 0.6704 0.8188
No log 4.3721 188 0.8044 0.2356 0.8044 0.8969
No log 4.4186 190 0.9694 0.1360 0.9694 0.9846
No log 4.4651 192 0.7398 0.3052 0.7398 0.8601
No log 4.5116 194 0.5650 0.2707 0.5650 0.7517
No log 4.5581 196 0.7043 0.2607 0.7043 0.8393
No log 4.6047 198 0.8065 0.2838 0.8065 0.8981
No log 4.6512 200 0.6121 0.2475 0.6121 0.7824
No log 4.6977 202 0.5536 0.2558 0.5536 0.7441
No log 4.7442 204 0.6333 0.3103 0.6333 0.7958
No log 4.7907 206 0.5367 0.2444 0.5367 0.7326
No log 4.8372 208 0.5233 0.4286 0.5233 0.7234
No log 4.8837 210 0.5420 0.4627 0.5420 0.7362
No log 4.9302 212 0.5335 0.4286 0.5335 0.7304
No log 4.9767 214 0.5385 0.2707 0.5385 0.7338
No log 5.0233 216 0.5590 0.3478 0.5590 0.7477
No log 5.0698 218 0.6312 0.2323 0.6312 0.7945
No log 5.1163 220 0.6869 0.2000 0.6869 0.8288
No log 5.1628 222 0.5827 0.3446 0.5827 0.7634
No log 5.2093 224 0.6901 0.3208 0.6901 0.8307
No log 5.2558 226 0.8304 0.2000 0.8304 0.9112
No log 5.3023 228 0.7189 0.2593 0.7189 0.8479
No log 5.3488 230 0.5802 0.2787 0.5802 0.7617
No log 5.3953 232 0.5824 0.2609 0.5824 0.7631
No log 5.4419 234 0.5722 0.2766 0.5722 0.7564
No log 5.4884 236 0.6866 0.3143 0.6866 0.8286
No log 5.5349 238 0.7380 0.2593 0.7380 0.8591
No log 5.5814 240 0.6102 0.3846 0.6102 0.7812
No log 5.6279 242 0.5612 0.2766 0.5612 0.7492
No log 5.6744 244 0.5715 0.1908 0.5715 0.7560
No log 5.7209 246 0.5616 0.2432 0.5616 0.7494
No log 5.7674 248 0.5942 0.3052 0.5942 0.7709
No log 5.8140 250 0.6062 0.3333 0.6062 0.7786
No log 5.8605 252 0.5641 0.3488 0.5641 0.7511
No log 5.9070 254 0.5433 0.2941 0.5433 0.7371
No log 5.9535 256 0.5898 0.3365 0.5898 0.7680
No log 6.0 258 0.6076 0.3628 0.6076 0.7795
No log 6.0465 260 0.5760 0.3878 0.5760 0.7590
No log 6.0930 262 0.5054 0.3978 0.5054 0.7109
No log 6.1395 264 0.4923 0.4225 0.4923 0.7016
No log 6.1860 266 0.4888 0.4112 0.4888 0.6992
No log 6.2326 268 0.5056 0.3575 0.5056 0.7111
No log 6.2791 270 0.5671 0.2621 0.5671 0.7531
No log 6.3256 272 0.5532 0.2709 0.5532 0.7438
No log 6.3721 274 0.5196 0.3617 0.5196 0.7208
No log 6.4186 276 0.5251 0.3617 0.5251 0.7246
No log 6.4651 278 0.5599 0.2941 0.5599 0.7483
No log 6.5116 280 0.5481 0.4286 0.5481 0.7404
No log 6.5581 282 0.5255 0.3978 0.5255 0.7249
No log 6.6047 284 0.5319 0.3769 0.5319 0.7293
No log 6.6512 286 0.5621 0.2709 0.5621 0.7498
No log 6.6977 288 0.5420 0.3402 0.5420 0.7362
No log 6.7442 290 0.5177 0.3575 0.5177 0.7195
No log 6.7907 292 0.5085 0.3369 0.5085 0.7131
No log 6.8372 294 0.5151 0.3224 0.5151 0.7177
No log 6.8837 296 0.5040 0.2967 0.5040 0.7100
No log 6.9302 298 0.5027 0.3149 0.5027 0.7090
No log 6.9767 300 0.5027 0.3797 0.5027 0.7090
No log 7.0233 302 0.5194 0.3663 0.5194 0.7207
No log 7.0698 304 0.5110 0.3367 0.5110 0.7148
No log 7.1163 306 0.4992 0.3730 0.4992 0.7066
No log 7.1628 308 0.5058 0.5000 0.5058 0.7112
No log 7.2093 310 0.5004 0.4105 0.5004 0.7074
No log 7.2558 312 0.5002 0.3927 0.5002 0.7073
No log 7.3023 314 0.5209 0.3561 0.5209 0.7217
No log 7.3488 316 0.5260 0.3561 0.5260 0.7252
No log 7.3953 318 0.5098 0.3769 0.5098 0.7140
No log 7.4419 320 0.5094 0.3990 0.5094 0.7137
No log 7.4884 322 0.5133 0.3769 0.5133 0.7165
No log 7.5349 324 0.5142 0.3878 0.5142 0.7171
No log 7.5814 326 0.5145 0.3878 0.5145 0.7173
No log 7.6279 328 0.5127 0.3878 0.5127 0.7160
No log 7.6744 330 0.5220 0.3043 0.5220 0.7225
No log 7.7209 332 0.5482 0.2871 0.5482 0.7404
No log 7.7674 334 0.5447 0.2871 0.5447 0.7381
No log 7.8140 336 0.5201 0.3043 0.5201 0.7212
No log 7.8605 338 0.5139 0.3797 0.5139 0.7169
No log 7.9070 340 0.5142 0.3407 0.5142 0.7171
No log 7.9535 342 0.5173 0.3043 0.5173 0.7192
No log 8.0 344 0.5211 0.2941 0.5211 0.7219
No log 8.0465 346 0.5244 0.3263 0.5244 0.7242
No log 8.0930 348 0.5113 0.3402 0.5113 0.7150
No log 8.1395 350 0.5100 0.3591 0.5100 0.7142
No log 8.1860 352 0.5180 0.3898 0.5180 0.7197
No log 8.2326 354 0.5239 0.3898 0.5239 0.7238
No log 8.2791 356 0.5117 0.3898 0.5117 0.7153
No log 8.3256 358 0.5027 0.3829 0.5027 0.7090
No log 8.3721 360 0.4917 0.3708 0.4917 0.7012
No log 8.4186 362 0.4882 0.3661 0.4882 0.6987
No log 8.4651 364 0.4874 0.3258 0.4874 0.6982
No log 8.5116 366 0.4932 0.3478 0.4932 0.7023
No log 8.5581 368 0.4974 0.3369 0.4974 0.7053
No log 8.6047 370 0.4936 0.3478 0.4936 0.7025
No log 8.6512 372 0.4919 0.3862 0.4919 0.7014
No log 8.6977 374 0.4927 0.3862 0.4927 0.7019
No log 8.7442 376 0.4917 0.3548 0.4917 0.7012
No log 8.7907 378 0.4907 0.4098 0.4907 0.7005
No log 8.8372 380 0.4922 0.4468 0.4922 0.7016
No log 8.8837 382 0.4964 0.4583 0.4964 0.7046
No log 8.9302 384 0.5062 0.3661 0.5062 0.7114
No log 8.9767 386 0.5090 0.3661 0.5090 0.7135
No log 9.0233 388 0.5133 0.3778 0.5133 0.7164
No log 9.0698 390 0.5101 0.3661 0.5101 0.7142
No log 9.1163 392 0.5015 0.4583 0.5015 0.7082
No log 9.1628 394 0.4973 0.4468 0.4973 0.7052
No log 9.2093 396 0.4980 0.4098 0.4980 0.7057
No log 9.2558 398 0.4985 0.4098 0.4985 0.7060
No log 9.3023 400 0.4987 0.4098 0.4987 0.7062
No log 9.3488 402 0.4994 0.3548 0.4994 0.7067
No log 9.3953 404 0.5024 0.3369 0.5024 0.7088
No log 9.4419 406 0.5034 0.3369 0.5034 0.7095
No log 9.4884 408 0.5053 0.3263 0.5053 0.7109
No log 9.5349 410 0.5035 0.3369 0.5035 0.7096
No log 9.5814 412 0.5005 0.3591 0.5005 0.7075
No log 9.6279 414 0.4995 0.3591 0.4995 0.7067
No log 9.6744 416 0.4988 0.3591 0.4988 0.7063
No log 9.7209 418 0.4981 0.3591 0.4981 0.7058
No log 9.7674 420 0.4979 0.3978 0.4979 0.7057
No log 9.8140 422 0.4979 0.3978 0.4979 0.7056
No log 9.8605 424 0.4981 0.4098 0.4981 0.7058
No log 9.9070 426 0.4982 0.4098 0.4982 0.7059
No log 9.9535 428 0.4984 0.4468 0.4984 0.7059
No log 10.0 430 0.4984 0.4595 0.4984 0.7060

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
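
To reproduce this environment, the pinned versions above can be installed as follows (a sketch; the `+cu118` PyTorch build assumes CUDA 11.8, and the index URL is the standard PyTorch wheel index for that CUDA version):

```shell
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```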