ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k11_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8817
  • Qwk: 0.4359
  • Mse: 0.8817
  • Rmse: 0.9390
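These metrics are related: RMSE is just the square root of MSE (0.9390 ≈ √0.8817), and Qwk is quadratic weighted kappa, the usual agreement metric for ordinal scoring tasks like this one. A minimal sketch of how they can be computed with scikit-learn, using made-up prediction and label arrays (not the actual evaluation data):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical integer scores; the real eval set is not published with this card.
y_true = np.array([0, 1, 2, 3, 2, 1, 0, 3])
y_pred = np.array([0, 2, 2, 3, 1, 1, 1, 3])

# Quadratic weighted kappa penalizes disagreements by squared distance.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # RMSE is always sqrt(MSE)
print(qwk, mse, rmse)
```

Note that Loss and Mse coincide in the tables below, which suggests the model was trained as a regressor with an MSE objective and its outputs rounded to discrete scores for Qwk.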

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
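With `lr_scheduler_type: linear`, the learning rate decays linearly from 2e-05 to zero over the total number of training steps (transformers' `get_linear_schedule_with_warmup` behavior). A small sketch of that rule; the step counts are illustrative, not taken from this run:

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear warmup to base_lr, then linear decay to 0."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr during warmup.
        return base_lr * step / max(1, warmup_steps)
    # Decay proportionally to the steps remaining after warmup.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0, 1000))     # 2e-05 at the start (no warmup)
print(linear_lr(500, 1000))   # 1e-05 halfway through
print(linear_lr(1000, 1000))  # 0.0 at the final step
```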

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0571 2 4.5215 -0.0103 4.5215 2.1264
No log 0.1143 4 2.6020 -0.0285 2.6020 1.6131
No log 0.1714 6 1.7886 0.0062 1.7886 1.3374
No log 0.2286 8 1.4062 0.0393 1.4062 1.1859
No log 0.2857 10 1.3003 0.0623 1.3003 1.1403
No log 0.3429 12 1.2280 0.1593 1.2280 1.1081
No log 0.4 14 1.2470 0.1507 1.2470 1.1167
No log 0.4571 16 1.3825 -0.0436 1.3825 1.1758
No log 0.5143 18 1.5021 0.0488 1.5021 1.2256
No log 0.5714 20 1.2939 0.0449 1.2939 1.1375
No log 0.6286 22 1.1634 0.3041 1.1634 1.0786
No log 0.6857 24 1.1345 0.3307 1.1345 1.0651
No log 0.7429 26 1.1323 0.3195 1.1323 1.0641
No log 0.8 28 1.1059 0.3045 1.1059 1.0516
No log 0.8571 30 1.0681 0.3474 1.0681 1.0335
No log 0.9143 32 1.0732 0.3045 1.0732 1.0360
No log 0.9714 34 1.1604 0.1848 1.1604 1.0772
No log 1.0286 36 1.3546 0.0254 1.3546 1.1639
No log 1.0857 38 1.4846 0.0403 1.4846 1.2184
No log 1.1429 40 1.4045 0.0403 1.4045 1.1851
No log 1.2 42 1.1847 0.1735 1.1847 1.0884
No log 1.2571 44 1.0782 0.4051 1.0782 1.0383
No log 1.3143 46 1.3152 0.0414 1.3152 1.1468
No log 1.3714 48 1.5326 -0.0697 1.5326 1.2380
No log 1.4286 50 1.0530 0.2408 1.0530 1.0261
No log 1.4857 52 0.8911 0.4330 0.8911 0.9440
No log 1.5429 54 1.2632 0.2058 1.2632 1.1239
No log 1.6 56 1.2420 0.2058 1.2420 1.1145
No log 1.6571 58 1.0656 0.3237 1.0656 1.0323
No log 1.7143 60 1.0395 0.2440 1.0395 1.0195
No log 1.7714 62 1.0751 0.2674 1.0751 1.0369
No log 1.8286 64 1.0308 0.2605 1.0308 1.0153
No log 1.8857 66 0.9947 0.4321 0.9947 0.9973
No log 1.9429 68 1.0780 0.2149 1.0780 1.0382
No log 2.0 70 1.1716 0.1920 1.1716 1.0824
No log 2.0571 72 1.2025 0.2014 1.2025 1.0966
No log 2.1143 74 1.2239 0.2014 1.2239 1.1063
No log 2.1714 76 1.0547 0.3577 1.0547 1.0270
No log 2.2286 78 0.9713 0.4371 0.9713 0.9855
No log 2.2857 80 0.9489 0.4075 0.9489 0.9741
No log 2.3429 82 0.9388 0.3117 0.9388 0.9689
No log 2.4 84 0.9797 0.3093 0.9797 0.9898
No log 2.4571 86 1.0101 0.4202 1.0101 1.0050
No log 2.5143 88 1.0293 0.3839 1.0293 1.0146
No log 2.5714 90 1.0943 0.3551 1.0943 1.0461
No log 2.6286 92 1.0701 0.3760 1.0701 1.0345
No log 2.6857 94 1.0152 0.4020 1.0152 1.0076
No log 2.7429 96 0.9587 0.4628 0.9587 0.9791
No log 2.8 98 0.9784 0.3869 0.9784 0.9891
No log 2.8571 100 0.9538 0.4060 0.9538 0.9766
No log 2.9143 102 0.9745 0.3600 0.9745 0.9872
No log 2.9714 104 0.9286 0.3892 0.9286 0.9637
No log 3.0286 106 0.9566 0.4110 0.9566 0.9781
No log 3.0857 108 1.1196 0.3556 1.1196 1.0581
No log 3.1429 110 1.0999 0.3225 1.0999 1.0487
No log 3.2 112 1.1103 0.3705 1.1103 1.0537
No log 3.2571 114 1.1493 0.3600 1.1493 1.0720
No log 3.3143 116 1.0250 0.2976 1.0250 1.0124
No log 3.3714 118 0.9906 0.2033 0.9906 0.9953
No log 3.4286 120 1.1105 0.1443 1.1105 1.0538
No log 3.4857 122 0.9268 0.3363 0.9268 0.9627
No log 3.5429 124 0.8256 0.5503 0.8256 0.9086
No log 3.6 126 1.0520 0.4275 1.0520 1.0257
No log 3.6571 128 1.1514 0.4138 1.1514 1.0730
No log 3.7143 130 0.9096 0.5792 0.9096 0.9537
No log 3.7714 132 0.9407 0.4788 0.9407 0.9699
No log 3.8286 134 1.0967 0.4966 1.0967 1.0472
No log 3.8857 136 1.0119 0.5677 1.0119 1.0059
No log 3.9429 138 0.9105 0.5997 0.9105 0.9542
No log 4.0 140 1.0696 0.4674 1.0696 1.0342
No log 4.0571 142 1.2302 0.3525 1.2302 1.1091
No log 4.1143 144 1.1159 0.4010 1.1159 1.0563
No log 4.1714 146 0.9277 0.4420 0.9277 0.9632
No log 4.2286 148 0.8986 0.5058 0.8986 0.9480
No log 4.2857 150 0.9006 0.5076 0.9006 0.9490
No log 4.3429 152 0.9068 0.4945 0.9068 0.9523
No log 4.4 154 0.9187 0.4960 0.9187 0.9585
No log 4.4571 156 0.9417 0.4534 0.9417 0.9704
No log 4.5143 158 0.9458 0.4258 0.9458 0.9725
No log 4.5714 160 0.9938 0.3548 0.9938 0.9969
No log 4.6286 162 1.1449 0.4219 1.1449 1.0700
No log 4.6857 164 1.0527 0.3310 1.0527 1.0260
No log 4.7429 166 0.9983 0.3327 0.9983 0.9992
No log 4.8 168 0.9751 0.3387 0.9751 0.9875
No log 4.8571 170 1.0612 0.3281 1.0612 1.0302
No log 4.9143 172 1.0658 0.3902 1.0658 1.0324
No log 4.9714 174 0.9442 0.2993 0.9442 0.9717
No log 5.0286 176 0.9444 0.2679 0.9444 0.9718
No log 5.0857 178 0.9481 0.3015 0.9481 0.9737
No log 5.1429 180 1.0336 0.3707 1.0336 1.0166
No log 5.2 182 1.0477 0.3326 1.0477 1.0236
No log 5.2571 184 0.9829 0.3115 0.9829 0.9914
No log 5.3143 186 0.9367 0.2850 0.9367 0.9679
No log 5.3714 188 0.9316 0.2971 0.9316 0.9652
No log 5.4286 190 0.9541 0.2608 0.9541 0.9768
No log 5.4857 192 0.9056 0.3619 0.9056 0.9516
No log 5.5429 194 0.8974 0.4331 0.8974 0.9473
No log 5.6 196 0.9075 0.5446 0.9075 0.9526
No log 5.6571 198 0.8600 0.4420 0.8600 0.9273
No log 5.7143 200 0.9475 0.3656 0.9475 0.9734
No log 5.7714 202 0.9689 0.3656 0.9689 0.9843
No log 5.8286 204 0.9392 0.3687 0.9392 0.9691
No log 5.8857 206 0.9604 0.3434 0.9604 0.9800
No log 5.9429 208 0.9446 0.3666 0.9446 0.9719
No log 6.0 210 0.9227 0.3666 0.9227 0.9606
No log 6.0571 212 0.9910 0.3897 0.9910 0.9955
No log 6.1143 214 1.2039 0.3943 1.2039 1.0972
No log 6.1714 216 1.3930 0.2895 1.3930 1.1803
No log 6.2286 218 1.2968 0.3324 1.2968 1.1388
No log 6.2857 220 1.0017 0.4010 1.0017 1.0009
No log 6.3429 222 0.8375 0.4377 0.8375 0.9152
No log 6.4 224 0.8153 0.4691 0.8153 0.9029
No log 6.4571 226 0.8143 0.4691 0.8143 0.9024
No log 6.5143 228 0.8226 0.3641 0.8226 0.9070
No log 6.5714 230 0.9118 0.3908 0.9118 0.9549
No log 6.6286 232 0.9774 0.4098 0.9774 0.9886
No log 6.6857 234 0.8713 0.3908 0.8713 0.9334
No log 6.7429 236 0.8731 0.3913 0.8731 0.9344
No log 6.8 238 0.9311 0.4013 0.9311 0.9649
No log 6.8571 240 0.8891 0.3928 0.8891 0.9429
No log 6.9143 242 0.9173 0.4418 0.9173 0.9578
No log 6.9714 244 1.0413 0.4890 1.0413 1.0205
No log 7.0286 246 0.9726 0.5015 0.9726 0.9862
No log 7.0857 248 0.7743 0.5057 0.7743 0.8800
No log 7.1429 250 0.7883 0.5462 0.7883 0.8879
No log 7.2 252 0.8593 0.5122 0.8593 0.9270
No log 7.2571 254 0.8045 0.5451 0.8045 0.8970
No log 7.3143 256 0.7749 0.4691 0.7749 0.8803
No log 7.3714 258 0.9022 0.4252 0.9022 0.9499
No log 7.4286 260 1.0440 0.4580 1.0440 1.0218
No log 7.4857 262 0.9752 0.5015 0.9752 0.9875
No log 7.5429 264 0.8046 0.5131 0.8046 0.8970
No log 7.6 266 0.8196 0.4959 0.8196 0.9053
No log 7.6571 268 0.9029 0.4630 0.9029 0.9502
No log 7.7143 270 0.8645 0.4632 0.8645 0.9298
No log 7.7714 272 0.8092 0.5410 0.8092 0.8996
No log 7.8286 274 0.9046 0.4706 0.9046 0.9511
No log 7.8857 276 1.0831 0.4586 1.0831 1.0407
No log 7.9429 278 1.0717 0.4856 1.0717 1.0352
No log 8.0 280 0.9074 0.5080 0.9074 0.9526
No log 8.0571 282 0.8281 0.5194 0.8281 0.9100
No log 8.1143 284 0.8054 0.4789 0.8054 0.8975
No log 8.1714 286 0.8002 0.4893 0.8002 0.8945
No log 8.2286 288 0.7987 0.4534 0.7987 0.8937
No log 8.2857 290 0.7929 0.4534 0.7929 0.8904
No log 8.3429 292 0.8073 0.4218 0.8073 0.8985
No log 8.4 294 0.8651 0.4420 0.8651 0.9301
No log 8.4571 296 1.0795 0.4362 1.0795 1.0390
No log 8.5143 298 1.2327 0.4601 1.2327 1.1103
No log 8.5714 300 1.1401 0.4362 1.1401 1.0677
No log 8.6286 302 0.9519 0.4337 0.9519 0.9757
No log 8.6857 304 0.8298 0.4356 0.8298 0.9109
No log 8.7429 306 0.8036 0.4898 0.8036 0.8964
No log 8.8 308 0.8338 0.3941 0.8338 0.9131
No log 8.8571 310 0.9353 0.4449 0.9353 0.9671
No log 8.9143 312 1.0209 0.4596 1.0209 1.0104
No log 8.9714 314 0.9804 0.4444 0.9804 0.9901
No log 9.0286 316 0.9579 0.4251 0.9579 0.9787
No log 9.0857 318 0.8446 0.4920 0.8446 0.9190
No log 9.1429 320 0.7671 0.5606 0.7671 0.8758
No log 9.2 322 0.7682 0.5434 0.7682 0.8764
No log 9.2571 324 0.8059 0.5528 0.8059 0.8977
No log 9.3143 326 0.9409 0.4048 0.9409 0.9700
No log 9.3714 328 1.1684 0.4654 1.1684 1.0809
No log 9.4286 330 1.2234 0.4261 1.2234 1.1061
No log 9.4857 332 1.1001 0.4217 1.1001 1.0489
No log 9.5429 334 0.9048 0.3993 0.9048 0.9512
No log 9.6 336 0.8204 0.4724 0.8204 0.9057
No log 9.6571 338 0.7960 0.5351 0.7960 0.8922
No log 9.7143 340 0.8141 0.4724 0.8141 0.9023
No log 9.7714 342 0.8739 0.4072 0.8739 0.9348
No log 9.8286 344 0.8524 0.4328 0.8524 0.9232
No log 9.8857 346 0.8055 0.4656 0.8055 0.8975
No log 9.9429 348 0.7839 0.5399 0.7839 0.8854
No log 10.0 350 0.7806 0.4942 0.7806 0.8835
No log 10.0571 352 0.8027 0.5028 0.8027 0.8959
No log 10.1143 354 0.8322 0.4859 0.8322 0.9123
No log 10.1714 356 0.8618 0.4059 0.8618 0.9283
No log 10.2286 358 0.8568 0.4343 0.8568 0.9257
No log 10.2857 360 0.7788 0.4757 0.7788 0.8825
No log 10.3429 362 0.7316 0.5195 0.7316 0.8553
No log 10.4 364 0.7154 0.5279 0.7154 0.8458
No log 10.4571 366 0.7064 0.5957 0.7064 0.8405
No log 10.5143 368 0.7437 0.5270 0.7437 0.8624
No log 10.5714 370 0.8188 0.5083 0.8188 0.9049
No log 10.6286 372 0.8124 0.5130 0.8124 0.9013
No log 10.6857 374 0.7822 0.5250 0.7822 0.8844
No log 10.7429 376 0.7261 0.5431 0.7261 0.8521
No log 10.8 378 0.7012 0.5633 0.7012 0.8374
No log 10.8571 380 0.7095 0.6333 0.7095 0.8423
No log 10.9143 382 0.6996 0.6220 0.6996 0.8364
No log 10.9714 384 0.7087 0.5556 0.7087 0.8419
No log 11.0286 386 0.7372 0.5102 0.7372 0.8586
No log 11.0857 388 0.7551 0.4965 0.7551 0.8689
No log 11.1429 390 0.7377 0.4257 0.7377 0.8589
No log 11.2 392 0.7406 0.4813 0.7406 0.8606
No log 11.2571 394 0.7357 0.4813 0.7357 0.8577
No log 11.3143 396 0.7686 0.5338 0.7686 0.8767
No log 11.3714 398 0.7876 0.5498 0.7876 0.8874
No log 11.4286 400 0.7356 0.5432 0.7356 0.8576
No log 11.4857 402 0.7207 0.5411 0.7207 0.8489
No log 11.5429 404 0.7372 0.5411 0.7372 0.8586
No log 11.6 406 0.7332 0.5586 0.7332 0.8563
No log 11.6571 408 0.7869 0.5763 0.7869 0.8871
No log 11.7143 410 0.8820 0.5342 0.8820 0.9392
No log 11.7714 412 0.9658 0.5318 0.9658 0.9827
No log 11.8286 414 0.9657 0.5318 0.9657 0.9827
No log 11.8857 416 0.9285 0.5165 0.9285 0.9636
No log 11.9429 418 0.8763 0.4761 0.8763 0.9361
No log 12.0 420 0.8699 0.4565 0.8699 0.9327
No log 12.0571 422 0.8693 0.4558 0.8693 0.9324
No log 12.1143 424 0.8481 0.5114 0.8481 0.9209
No log 12.1714 426 0.8725 0.5448 0.8725 0.9341
No log 12.2286 428 0.9850 0.5504 0.9850 0.9925
No log 12.2857 430 1.0169 0.5318 1.0169 1.0084
No log 12.3429 432 0.9253 0.5468 0.9253 0.9619
No log 12.4 434 0.8046 0.5012 0.8046 0.8970
No log 12.4571 436 0.7686 0.4865 0.7686 0.8767
No log 12.5143 438 0.7600 0.4575 0.7600 0.8718
No log 12.5714 440 0.7977 0.4828 0.7977 0.8931
No log 12.6286 442 0.8495 0.5733 0.8495 0.9217
No log 12.6857 444 0.8538 0.5458 0.8538 0.9240
No log 12.7429 446 0.8301 0.5479 0.8301 0.9111
No log 12.8 448 0.7755 0.5089 0.7755 0.8807
No log 12.8571 450 0.7717 0.5491 0.7717 0.8784
No log 12.9143 452 0.7718 0.5107 0.7718 0.8785
No log 12.9714 454 0.7795 0.4023 0.7795 0.8829
No log 13.0286 456 0.7907 0.4059 0.7907 0.8892
No log 13.0857 458 0.7961 0.4220 0.7961 0.8923
No log 13.1429 460 0.8210 0.4045 0.8210 0.9061
No log 13.2 462 0.9145 0.5029 0.9145 0.9563
No log 13.2571 464 0.9934 0.4845 0.9934 0.9967
No log 13.3143 466 0.9741 0.4732 0.9741 0.9870
No log 13.3714 468 0.8847 0.4663 0.8847 0.9406
No log 13.4286 470 0.8132 0.4409 0.8132 0.9018
No log 13.4857 472 0.8092 0.4181 0.8092 0.8996
No log 13.5429 474 0.8330 0.4277 0.8330 0.9127
No log 13.6 476 0.8488 0.4202 0.8488 0.9213
No log 13.6571 478 0.8994 0.4598 0.8994 0.9484
No log 13.7143 480 0.9310 0.4655 0.9310 0.9649
No log 13.7714 482 0.8918 0.4570 0.8918 0.9443
No log 13.8286 484 0.8453 0.4712 0.8453 0.9194
No log 13.8857 486 0.8677 0.4485 0.8677 0.9315
No log 13.9429 488 0.9107 0.4958 0.9107 0.9543
No log 14.0 490 0.9260 0.5190 0.9260 0.9623
No log 14.0571 492 0.8693 0.4598 0.8693 0.9323
No log 14.1143 494 0.8654 0.4519 0.8654 0.9303
No log 14.1714 496 0.8987 0.4712 0.8987 0.9480
No log 14.2286 498 0.9189 0.4600 0.9189 0.9586
0.3683 14.2857 500 1.0119 0.5015 1.0119 1.0059
0.3683 14.3429 502 1.0283 0.4887 1.0283 1.0141
0.3683 14.4 504 0.9458 0.5094 0.9458 0.9725
0.3683 14.4571 506 0.8443 0.4704 0.8443 0.9189
0.3683 14.5143 508 0.8245 0.4219 0.8245 0.9080
0.3683 14.5714 510 0.8283 0.3838 0.8283 0.9101
0.3683 14.6286 512 0.8729 0.4359 0.8729 0.9343
0.3683 14.6857 514 0.8817 0.4359 0.8817 0.9390

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1