ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k12_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9106
  • Qwk (Quadratic Weighted Kappa): 0.4772
  • Mse (Mean Squared Error): 0.9106
  • Rmse (Root Mean Squared Error): 0.9543
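
The card does not say how these metrics are computed, so the following is a minimal sketch under the usual conventions: QWK via scikit-learn's `cohen_kappa_score` with quadratic weights, and RMSE as the square root of MSE. The `y_true`/`y_pred` arrays are illustrative toy data, not the actual evaluation set.

```python
# Hedged sketch: computing Qwk, Mse and Rmse as they are conventionally
# defined. The labels below are toy data; the card's evaluation set is
# not published.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 3, 2, 1])  # gold ordinal scores (illustrative)
y_pred = np.array([0, 1, 2, 2, 2, 0])  # model predictions (illustrative)

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```

Note that Loss and Mse coincide in the table above, which suggests the model was trained with a mean-squared-error objective (i.e. as a regressor over ordinal scores).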

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
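
The hyperparameters above map onto the usual `transformers.TrainingArguments` fields. A sketch of that mapping, as a plain dict (the actual training script is not published with this card, so this is illustrative):

```python
# Hedged sketch: the listed hyperparameters collected as keyword
# arguments for transformers.TrainingArguments. The Adam betas/epsilon
# are the values reported in the card (also the library defaults).
hyperparameters = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}

# Usage (illustrative output_dir):
#   args = TrainingArguments(output_dir="out", **hyperparameters)
print(sorted(hyperparameters))
```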

Training results

(In the table below, "No log" in the training-loss column means no training-loss entry had been emitted yet; with a logging interval of 500 steps, the first value, 0.3699, appears at step 500.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0526 2 4.5878 0.0010 4.5878 2.1419
No log 0.1053 4 2.9244 0.0026 2.9244 1.7101
No log 0.1579 6 1.7552 0.0062 1.7552 1.3249
No log 0.2105 8 1.6877 0.0277 1.6877 1.2991
No log 0.2632 10 1.3991 0.0623 1.3991 1.1828
No log 0.3158 12 1.3153 0.1616 1.3153 1.1469
No log 0.3684 14 1.3340 0.1076 1.3340 1.1550
No log 0.4211 16 1.3979 0.0537 1.3979 1.1823
No log 0.4737 18 1.2967 0.0704 1.2967 1.1387
No log 0.5263 20 1.2333 -0.0333 1.2333 1.1105
No log 0.5789 22 1.2261 0.0427 1.2261 1.1073
No log 0.6316 24 1.2382 0.0527 1.2382 1.1127
No log 0.6842 26 1.2427 0.0527 1.2427 1.1148
No log 0.7368 28 1.2204 0.0499 1.2204 1.1047
No log 0.7895 30 1.2318 0.1852 1.2318 1.1099
No log 0.8421 32 1.3799 0.0297 1.3799 1.1747
No log 0.8947 34 1.8728 0.0452 1.8728 1.3685
No log 0.9474 36 1.8777 0.1149 1.8777 1.3703
No log 1.0 38 1.4641 0.0124 1.4641 1.2100
No log 1.0526 40 1.1974 0.2491 1.1974 1.0943
No log 1.1053 42 1.1122 0.1711 1.1122 1.0546
No log 1.1579 44 1.2506 0.0789 1.2506 1.1183
No log 1.2105 46 1.2374 0.0353 1.2374 1.1124
No log 1.2632 48 1.1497 0.0841 1.1497 1.0722
No log 1.3158 50 1.1578 0.2579 1.1578 1.0760
No log 1.3684 52 1.2377 0.1379 1.2377 1.1125
No log 1.4211 54 1.3480 0.0926 1.3480 1.1610
No log 1.4737 56 1.3560 0.0926 1.3560 1.1645
No log 1.5263 58 1.2500 0.1438 1.2500 1.1180
No log 1.5789 60 1.1262 0.3431 1.1262 1.0612
No log 1.6316 62 1.0453 0.4139 1.0453 1.0224
No log 1.6842 64 1.0286 0.4428 1.0286 1.0142
No log 1.7368 66 1.0315 0.3704 1.0315 1.0156
No log 1.7895 68 1.0651 0.3083 1.0651 1.0321
No log 1.8421 70 1.0795 0.3131 1.0795 1.0390
No log 1.8947 72 1.1432 0.2351 1.1432 1.0692
No log 1.9474 74 1.0998 0.3032 1.0998 1.0487
No log 2.0 76 1.0327 0.3330 1.0327 1.0162
No log 2.0526 78 0.9986 0.3679 0.9986 0.9993
No log 2.1053 80 1.0108 0.3693 1.0108 1.0054
No log 2.1579 82 1.0111 0.3119 1.0111 1.0056
No log 2.2105 84 1.0622 0.1871 1.0622 1.0306
No log 2.2632 86 1.0952 0.2206 1.0952 1.0465
No log 2.3158 88 1.0828 0.1793 1.0828 1.0406
No log 2.3684 90 1.0629 0.2988 1.0629 1.0310
No log 2.4211 92 0.9262 0.4549 0.9262 0.9624
No log 2.4737 94 0.8497 0.4428 0.8497 0.9218
No log 2.5263 96 0.8687 0.2995 0.8687 0.9320
No log 2.5789 98 0.8743 0.2991 0.8743 0.9350
No log 2.6316 100 0.8577 0.5185 0.8577 0.9261
No log 2.6842 102 0.8855 0.5325 0.8855 0.9410
No log 2.7368 104 0.8483 0.5185 0.8483 0.9210
No log 2.7895 106 0.8177 0.4563 0.8177 0.9042
No log 2.8421 108 0.9777 0.3897 0.9777 0.9888
No log 2.8947 110 1.2101 0.3752 1.2101 1.1000
No log 2.9474 112 1.2261 0.3752 1.2261 1.1073
No log 3.0 114 1.0085 0.3887 1.0085 1.0042
No log 3.0526 116 0.9354 0.3902 0.9354 0.9672
No log 3.1053 118 0.8950 0.3970 0.8950 0.9461
No log 3.1579 120 0.8730 0.4942 0.8730 0.9344
No log 3.2105 122 0.8500 0.5319 0.8500 0.9219
No log 3.2632 124 0.8285 0.4995 0.8285 0.9102
No log 3.3158 126 0.8418 0.4575 0.8418 0.9175
No log 3.3684 128 0.9731 0.3256 0.9731 0.9864
No log 3.4211 130 1.0696 0.3367 1.0696 1.0342
No log 3.4737 132 1.0538 0.3934 1.0538 1.0265
No log 3.5263 134 1.1286 0.4449 1.1286 1.0624
No log 3.5789 136 0.9998 0.3117 0.9998 0.9999
No log 3.6316 138 0.9193 0.3502 0.9193 0.9588
No log 3.6842 140 0.8961 0.4258 0.8961 0.9466
No log 3.7368 142 0.9533 0.2636 0.9533 0.9764
No log 3.7895 144 0.9968 0.2110 0.9968 0.9984
No log 3.8421 146 0.9186 0.3045 0.9186 0.9584
No log 3.8947 148 0.8953 0.3374 0.8953 0.9462
No log 3.9474 150 0.9891 0.3452 0.9891 0.9945
No log 4.0 152 1.1239 0.4112 1.1239 1.0601
No log 4.0526 154 1.0139 0.4631 1.0139 1.0069
No log 4.1053 156 0.7932 0.5673 0.7932 0.8906
No log 4.1579 158 0.8075 0.5787 0.8075 0.8986
No log 4.2105 160 0.9845 0.4088 0.9845 0.9922
No log 4.2632 162 1.3193 0.4158 1.3193 1.1486
No log 4.3158 164 1.2599 0.4332 1.2599 1.1225
No log 4.3684 166 0.9668 0.4365 0.9668 0.9833
No log 4.4211 168 0.7938 0.5131 0.7938 0.8909
No log 4.4737 170 0.7883 0.5296 0.7883 0.8879
No log 4.5263 172 0.8716 0.4916 0.8716 0.9336
No log 4.5789 174 0.9770 0.3237 0.9770 0.9885
No log 4.6316 176 0.9445 0.3677 0.9445 0.9719
No log 4.6842 178 0.9539 0.3625 0.9539 0.9767
No log 4.7368 180 0.9225 0.2231 0.9225 0.9605
No log 4.7895 182 0.8879 0.3616 0.8879 0.9423
No log 4.8421 184 0.9229 0.2231 0.9229 0.9607
No log 4.8947 186 1.0564 0.3117 1.0564 1.0278
No log 4.9474 188 1.1633 0.3416 1.1633 1.0786
No log 5.0 190 1.1232 0.3046 1.1232 1.0598
No log 5.0526 192 1.1317 0.3742 1.1317 1.0638
No log 5.1053 194 0.9887 0.2930 0.9887 0.9943
No log 5.1579 196 0.9502 0.3165 0.9502 0.9748
No log 5.2105 198 0.9212 0.3144 0.9212 0.9598
No log 5.2632 200 1.0286 0.3838 1.0286 1.0142
No log 5.3158 202 1.1653 0.3959 1.1653 1.0795
No log 5.3684 204 1.4364 0.3489 1.4364 1.1985
No log 5.4211 206 1.4283 0.3337 1.4283 1.1951
No log 5.4737 208 1.0940 0.4073 1.0940 1.0459
No log 5.5263 210 0.8490 0.4505 0.8490 0.9214
No log 5.5789 212 0.8588 0.4819 0.8588 0.9267
No log 5.6316 214 0.8575 0.3902 0.8575 0.9260
No log 5.6842 216 1.0407 0.3607 1.0407 1.0202
No log 5.7368 218 1.1648 0.3025 1.1648 1.0793
No log 5.7895 220 1.1062 0.3289 1.1062 1.0518
No log 5.8421 222 0.9631 0.2175 0.9631 0.9814
No log 5.8947 224 0.9122 0.3343 0.9122 0.9551
No log 5.9474 226 0.9607 0.3571 0.9607 0.9801
No log 6.0 228 1.1121 0.3833 1.1121 1.0546
No log 6.0526 230 1.2212 0.3576 1.2212 1.1051
No log 6.1053 232 1.2834 0.3576 1.2834 1.1329
No log 6.1579 234 1.2572 0.3511 1.2572 1.1213
No log 6.2105 236 1.0179 0.3424 1.0179 1.0089
No log 6.2632 238 0.8742 0.4280 0.8742 0.9350
No log 6.3158 240 0.8987 0.3615 0.8987 0.9480
No log 6.3684 242 0.9230 0.3467 0.9230 0.9607
No log 6.4211 244 0.9557 0.3250 0.9557 0.9776
No log 6.4737 246 0.9746 0.3045 0.9746 0.9872
No log 6.5263 248 0.9839 0.3510 0.9839 0.9919
No log 6.5789 250 0.9539 0.3557 0.9539 0.9767
No log 6.6316 252 0.9011 0.3510 0.9011 0.9493
No log 6.6842 254 0.9136 0.3920 0.9136 0.9558
No log 6.7368 256 1.0379 0.3446 1.0379 1.0188
No log 6.7895 258 1.0189 0.3607 1.0189 1.0094
No log 6.8421 260 0.9268 0.4261 0.9268 0.9627
No log 6.8947 262 0.8516 0.4979 0.8516 0.9228
No log 6.9474 264 0.8461 0.4780 0.8461 0.9198
No log 7.0 266 0.8487 0.5678 0.8487 0.9212
No log 7.0526 268 0.9117 0.3866 0.9117 0.9548
No log 7.1053 270 0.9377 0.3954 0.9377 0.9683
No log 7.1579 272 0.8854 0.4902 0.8854 0.9410
No log 7.2105 274 0.8922 0.4690 0.8922 0.9446
No log 7.2632 276 0.9005 0.4931 0.9005 0.9490
No log 7.3158 278 0.9150 0.4069 0.9150 0.9566
No log 7.3684 280 0.8967 0.4568 0.8967 0.9469
No log 7.4211 282 0.8694 0.4385 0.8694 0.9324
No log 7.4737 284 0.8905 0.4434 0.8905 0.9436
No log 7.5263 286 0.9655 0.3996 0.9655 0.9826
No log 7.5789 288 0.9863 0.4093 0.9863 0.9931
No log 7.6316 290 0.9051 0.4264 0.9051 0.9513
No log 7.6842 292 0.8623 0.3957 0.8623 0.9286
No log 7.7368 294 0.8475 0.4203 0.8475 0.9206
No log 7.7895 296 0.8783 0.4164 0.8783 0.9372
No log 7.8421 298 0.9227 0.4318 0.9227 0.9606
No log 7.8947 300 1.0025 0.4497 1.0025 1.0013
No log 7.9474 302 0.9866 0.4383 0.9866 0.9933
No log 8.0 304 1.0421 0.4075 1.0421 1.0208
No log 8.0526 306 0.9874 0.4513 0.9874 0.9937
No log 8.1053 308 0.9972 0.3978 0.9972 0.9986
No log 8.1579 310 1.0459 0.3874 1.0459 1.0227
No log 8.2105 312 1.0179 0.3868 1.0179 1.0089
No log 8.2632 314 0.9309 0.4165 0.9309 0.9648
No log 8.3158 316 0.8985 0.3729 0.8985 0.9479
No log 8.3684 318 0.9124 0.3738 0.9124 0.9552
No log 8.4211 320 1.0301 0.3685 1.0301 1.0149
No log 8.4737 322 1.0373 0.3694 1.0373 1.0185
No log 8.5263 324 1.0169 0.4016 1.0169 1.0084
No log 8.5789 326 0.9492 0.4386 0.9492 0.9743
No log 8.6316 328 0.9064 0.3139 0.9064 0.9520
No log 8.6842 330 0.9623 0.4037 0.9623 0.9810
No log 8.7368 332 1.1122 0.3411 1.1122 1.0546
No log 8.7895 334 1.1037 0.3411 1.1037 1.0506
No log 8.8421 336 0.9668 0.4473 0.9668 0.9833
No log 8.8947 338 0.9147 0.4328 0.9147 0.9564
No log 8.9474 340 0.9238 0.4623 0.9238 0.9612
No log 9.0 342 1.0063 0.4135 1.0063 1.0031
No log 9.0526 344 0.9609 0.4490 0.9609 0.9803
No log 9.1053 346 0.9609 0.4167 0.9609 0.9802
No log 9.1579 348 0.9170 0.4540 0.9170 0.9576
No log 9.2105 350 0.9119 0.4546 0.9119 0.9550
No log 9.2632 352 0.9644 0.4043 0.9644 0.9821
No log 9.3158 354 0.9082 0.4293 0.9082 0.9530
No log 9.3684 356 0.8583 0.4180 0.8583 0.9264
No log 9.4211 358 0.8819 0.4420 0.8819 0.9391
No log 9.4737 360 0.9388 0.4468 0.9388 0.9689
No log 9.5263 362 0.9816 0.4133 0.9816 0.9907
No log 9.5789 364 0.9661 0.4133 0.9661 0.9829
No log 9.6316 366 0.9456 0.4131 0.9456 0.9724
No log 9.6842 368 0.8927 0.4587 0.8927 0.9448
No log 9.7368 370 0.9011 0.4587 0.9011 0.9493
No log 9.7895 372 0.9031 0.4587 0.9031 0.9503
No log 9.8421 374 0.8788 0.4142 0.8788 0.9375
No log 9.8947 376 0.8696 0.4519 0.8696 0.9325
No log 9.9474 378 0.9073 0.3854 0.9073 0.9525
No log 10.0 380 0.9226 0.3431 0.9226 0.9605
No log 10.0526 382 0.9456 0.3383 0.9456 0.9724
No log 10.1053 384 1.0126 0.3024 1.0126 1.0063
No log 10.1579 386 1.0347 0.3367 1.0347 1.0172
No log 10.2105 388 0.9963 0.3667 0.9963 0.9982
No log 10.2632 390 0.9344 0.3093 0.9344 0.9666
No log 10.3158 392 0.8870 0.4608 0.8870 0.9418
No log 10.3684 394 0.8858 0.4280 0.8858 0.9411
No log 10.4211 396 0.9888 0.3785 0.9888 0.9944
No log 10.4737 398 1.2503 0.3791 1.2503 1.1182
No log 10.5263 400 1.2872 0.3977 1.2872 1.1346
No log 10.5789 402 1.1008 0.4071 1.1008 1.0492
No log 10.6316 404 0.9358 0.3802 0.9358 0.9674
No log 10.6842 406 0.8915 0.3317 0.8915 0.9442
No log 10.7368 408 0.9149 0.3317 0.9149 0.9565
No log 10.7895 410 0.9771 0.3985 0.9771 0.9885
No log 10.8421 412 1.0067 0.4186 1.0067 1.0033
No log 10.8947 414 1.0001 0.3602 1.0001 1.0001
No log 10.9474 416 0.9492 0.4125 0.9492 0.9743
No log 11.0 418 0.9481 0.4357 0.9480 0.9737
No log 11.0526 420 0.9870 0.3778 0.9870 0.9935
No log 11.1053 422 1.0012 0.4003 1.0012 1.0006
No log 11.1579 424 0.9367 0.4025 0.9367 0.9678
No log 11.2105 426 0.8945 0.3753 0.8945 0.9458
No log 11.2632 428 0.8790 0.3921 0.8790 0.9376
No log 11.3158 430 0.8687 0.3927 0.8687 0.9320
No log 11.3684 432 0.8688 0.4476 0.8688 0.9321
No log 11.4211 434 0.8887 0.4898 0.8887 0.9427
No log 11.4737 436 0.8859 0.4898 0.8859 0.9412
No log 11.5263 438 0.8534 0.4676 0.8534 0.9238
No log 11.5789 440 0.8484 0.4676 0.8484 0.9211
No log 11.6316 442 0.8418 0.4575 0.8418 0.9175
No log 11.6842 444 0.8521 0.4676 0.8521 0.9231
No log 11.7368 446 0.8810 0.4865 0.8810 0.9386
No log 11.7895 448 0.9673 0.3528 0.9673 0.9835
No log 11.8421 450 1.0643 0.4069 1.0643 1.0316
No log 11.8947 452 1.0084 0.3982 1.0084 1.0042
No log 11.9474 454 0.9175 0.4345 0.9175 0.9579
No log 12.0 456 0.8564 0.4982 0.8564 0.9254
No log 12.0526 458 0.8823 0.4588 0.8823 0.9393
No log 12.1053 460 0.8384 0.4912 0.8384 0.9156
No log 12.1579 462 0.8513 0.4539 0.8513 0.9227
No log 12.2105 464 0.8727 0.4640 0.8727 0.9342
No log 12.2632 466 0.9368 0.4087 0.9368 0.9679
No log 12.3158 468 0.9202 0.4087 0.9202 0.9593
No log 12.3684 470 0.8742 0.4765 0.8742 0.9350
No log 12.4211 472 0.8536 0.4811 0.8536 0.9239
No log 12.4737 474 0.8312 0.4620 0.8312 0.9117
No log 12.5263 476 0.8486 0.4444 0.8486 0.9212
No log 12.5789 478 0.9200 0.3851 0.9200 0.9592
No log 12.6316 480 0.9685 0.4186 0.9685 0.9841
No log 12.6842 482 1.0097 0.3607 1.0097 1.0048
No log 12.7368 484 0.9756 0.4186 0.9756 0.9877
No log 12.7895 486 0.9182 0.3949 0.9182 0.9582
No log 12.8421 488 0.9162 0.3949 0.9162 0.9572
No log 12.8947 490 0.9375 0.3949 0.9375 0.9682
No log 12.9474 492 0.9704 0.4186 0.9704 0.9851
No log 13.0 494 0.9987 0.4186 0.9987 0.9993
No log 13.0526 496 0.8959 0.4783 0.8959 0.9465
No log 13.1053 498 0.7745 0.5142 0.7745 0.8801
0.3699 13.1579 500 0.7500 0.5028 0.7500 0.8660
0.3699 13.2105 502 0.7656 0.4757 0.7656 0.8750
0.3699 13.2632 504 0.8292 0.4163 0.8292 0.9106
0.3699 13.3158 506 0.9123 0.3474 0.9123 0.9551
0.3699 13.3684 508 0.9526 0.3739 0.9526 0.9760
0.3699 13.4211 510 0.9106 0.4772 0.9106 0.9543

Training ended at epoch 13.42 (step 510), well before the configured 100 epochs, presumably due to early stopping; the final row matches the evaluation results reported above.

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
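
To reproduce this environment, the versions above can be pinned. A sketch, assuming the standard PyPI packages and the cu118 wheel index for the CUDA 11.8 PyTorch build:

```shell
# Pin the framework versions listed above (illustrative; adjust to your platform).
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
# PyTorch 2.4.0 built against CUDA 11.8 comes from the matching wheel index:
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```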

Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k12_task2_organization

This model was fine-tuned from aubmindlab/bert-base-arabertv02 (which has 4023 fine-tuned derivatives).