ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k11_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not documented in this card. It achieves the following results on the evaluation set:

  • Loss: 0.7401
  • Qwk: 0.5278
  • Mse: 0.7401
  • Rmse: 0.8603
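
Here Qwk is Cohen's quadratically weighted kappa and Rmse is the square root of Mse. A minimal sketch of computing the same metrics with scikit-learn; the labels below are illustrative placeholders, not this model's evaluation data:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold organization scores and model predictions (demo only).
y_true = [0, 1, 2, 2, 3, 1]
y_pred = [0, 1, 1, 2, 3, 2]

# Quadratic weighting penalizes predictions by the squared distance
# from the gold score, which suits ordinal scoring tasks like this one.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
```

Since the logged Loss and Mse coincide in every row of the results table, the model was most likely trained with a mean-squared-error objective.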

Model description

More information needed

Intended uses & limitations

More information needed
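
A hedged inference sketch, assuming the checkpoint loads as a standard `AutoModelForSequenceClassification`. The single-output regression head is an assumption (suggested by the training loss matching the MSE in every logged row), and `predict_organization_score` is a hypothetical helper, not part of this repository:

```python
MODEL_ID = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k11_task2_organization"

def score_from_output(value: float) -> int:
    """Round the scalar regression output to the nearest integer score.

    Assumption: the head is a 1-output regressor; the logged training
    loss equals the MSE, which points at an MSE objective.
    """
    return round(value)

def predict_organization_score(text: str) -> int:
    # Imports kept local so the pure helper above has no dependencies.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        value = model(**inputs).logits.squeeze().item()
    return score_from_output(value)
```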

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
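
The linear scheduler decays the learning rate from 2e-05 to zero over training. A dependency-free sketch of that schedule (assumptions: no warmup, and 55 optimizer steps per epoch, matching the 550 steps logged over 10 epochs in the results table):

```python
BASE_LR = 2e-5
TOTAL_STEPS = 10 * 55  # num_epochs * steps_per_epoch (550, per the results table)

def linear_lr(step: int) -> float:
    """Learning rate after `step` optimizer updates under a linear decay."""
    return BASE_LR * max(0.0, 1.0 - step / TOTAL_STEPS)
```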

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0364 2 4.0781 -0.0169 4.0781 2.0194
No log 0.0727 4 2.2393 0.0264 2.2393 1.4964
No log 0.1091 6 1.5143 -0.0256 1.5143 1.2306
No log 0.1455 8 2.0767 -0.1484 2.0767 1.4411
No log 0.1818 10 1.6326 -0.1127 1.6326 1.2777
No log 0.2182 12 1.0668 -0.0849 1.0668 1.0329
No log 0.2545 14 0.7871 0.2241 0.7871 0.8872
No log 0.2909 16 0.7235 0.2285 0.7235 0.8506
No log 0.3273 18 0.7081 0.2286 0.7081 0.8415
No log 0.3636 20 0.8129 0.1112 0.8129 0.9016
No log 0.4 22 0.8208 0.1590 0.8208 0.9060
No log 0.4364 24 0.8570 0.1924 0.8570 0.9258
No log 0.4727 26 0.8893 0.2797 0.8893 0.9430
No log 0.5091 28 0.8845 0.3444 0.8845 0.9405
No log 0.5455 30 1.5500 0.1808 1.5500 1.2450
No log 0.5818 32 1.6884 0.1652 1.6884 1.2994
No log 0.6182 34 1.1657 0.2280 1.1657 1.0797
No log 0.6545 36 0.7238 0.3289 0.7238 0.8508
No log 0.6909 38 0.5789 0.4843 0.5789 0.7608
No log 0.7273 40 0.5500 0.4706 0.5500 0.7416
No log 0.7636 42 0.5426 0.4482 0.5426 0.7366
No log 0.8 44 0.5580 0.4236 0.5580 0.7470
No log 0.8364 46 0.5094 0.4698 0.5094 0.7137
No log 0.8727 48 0.5105 0.4824 0.5105 0.7145
No log 0.9091 50 0.5565 0.4581 0.5565 0.7460
No log 0.9455 52 0.6569 0.4021 0.6569 0.8105
No log 0.9818 54 0.6134 0.4575 0.6134 0.7832
No log 1.0182 56 0.5723 0.4992 0.5723 0.7565
No log 1.0545 58 0.5594 0.4655 0.5594 0.7479
No log 1.0909 60 0.5915 0.4508 0.5915 0.7691
No log 1.1273 62 0.6183 0.4742 0.6183 0.7863
No log 1.1636 64 0.6238 0.4233 0.6238 0.7898
No log 1.2 66 0.7125 0.3377 0.7125 0.8441
No log 1.2364 68 0.7739 0.3443 0.7739 0.8797
No log 1.2727 70 0.7201 0.4909 0.7201 0.8486
No log 1.3091 72 0.6923 0.4465 0.6923 0.8320
No log 1.3455 74 0.8469 0.4227 0.8469 0.9203
No log 1.3818 76 0.6963 0.4635 0.6963 0.8345
No log 1.4182 78 0.6787 0.5530 0.6787 0.8238
No log 1.4545 80 0.7533 0.4966 0.7533 0.8679
No log 1.4909 82 0.6354 0.5611 0.6354 0.7971
No log 1.5273 84 0.6071 0.4532 0.6071 0.7791
No log 1.5636 86 0.7696 0.3753 0.7696 0.8773
No log 1.6 88 0.7174 0.3365 0.7174 0.8470
No log 1.6364 90 0.5862 0.4336 0.5862 0.7656
No log 1.6727 92 0.6143 0.5368 0.6143 0.7838
No log 1.7091 94 0.7141 0.4603 0.7141 0.8450
No log 1.7455 96 0.6741 0.4934 0.6741 0.8211
No log 1.7818 98 0.6982 0.4608 0.6982 0.8356
No log 1.8182 100 0.8773 0.4114 0.8773 0.9367
No log 1.8545 102 0.9173 0.4060 0.9173 0.9577
No log 1.8909 104 0.7547 0.4758 0.7547 0.8687
No log 1.9273 106 0.6864 0.5027 0.6864 0.8285
No log 1.9636 108 0.6726 0.5223 0.6726 0.8201
No log 2.0 110 0.6774 0.4957 0.6774 0.8231
No log 2.0364 112 0.7056 0.4661 0.7056 0.8400
No log 2.0727 114 0.6948 0.5299 0.6948 0.8335
No log 2.1091 116 0.6981 0.5506 0.6981 0.8355
No log 2.1455 118 0.7213 0.5269 0.7213 0.8493
No log 2.1818 120 0.7629 0.4684 0.7629 0.8734
No log 2.2182 122 0.7998 0.4701 0.7998 0.8943
No log 2.2545 124 0.7719 0.4809 0.7719 0.8786
No log 2.2909 126 0.8044 0.5168 0.8044 0.8969
No log 2.3273 128 0.8166 0.5091 0.8166 0.9036
No log 2.3636 130 0.8134 0.5005 0.8134 0.9019
No log 2.4 132 0.7844 0.4843 0.7844 0.8856
No log 2.4364 134 0.7498 0.4865 0.7498 0.8659
No log 2.4727 136 0.7373 0.4940 0.7373 0.8587
No log 2.5091 138 0.7447 0.4782 0.7447 0.8630
No log 2.5455 140 0.7803 0.4767 0.7803 0.8834
No log 2.5818 142 0.7813 0.4530 0.7813 0.8839
No log 2.6182 144 0.8184 0.4834 0.8184 0.9047
No log 2.6545 146 0.8586 0.4875 0.8586 0.9266
No log 2.6909 148 0.8794 0.4896 0.8794 0.9378
No log 2.7273 150 0.8759 0.4958 0.8759 0.9359
No log 2.7636 152 0.8749 0.4799 0.8749 0.9354
No log 2.8 154 0.9104 0.4453 0.9104 0.9541
No log 2.8364 156 1.0094 0.4091 1.0094 1.0047
No log 2.8727 158 1.0968 0.3664 1.0968 1.0473
No log 2.9091 160 0.9717 0.4578 0.9717 0.9858
No log 2.9455 162 0.9553 0.4390 0.9553 0.9774
No log 2.9818 164 0.9845 0.4273 0.9845 0.9922
No log 3.0182 166 1.0257 0.4488 1.0257 1.0128
No log 3.0545 168 0.9025 0.4582 0.9025 0.9500
No log 3.0909 170 0.8472 0.4745 0.8472 0.9204
No log 3.1273 172 0.7790 0.4858 0.7790 0.8826
No log 3.1636 174 0.7289 0.4972 0.7289 0.8538
No log 3.2 176 0.7215 0.4802 0.7215 0.8494
No log 3.2364 178 0.7292 0.4876 0.7292 0.8539
No log 3.2727 180 0.8234 0.4802 0.8234 0.9074
No log 3.3091 182 0.8590 0.4654 0.8590 0.9268
No log 3.3455 184 0.8684 0.4420 0.8684 0.9319
No log 3.3818 186 0.7613 0.4941 0.7613 0.8725
No log 3.4182 188 0.7345 0.4782 0.7345 0.8570
No log 3.4545 190 0.7143 0.5085 0.7143 0.8451
No log 3.4909 192 0.7426 0.4268 0.7426 0.8618
No log 3.5273 194 0.7851 0.3997 0.7851 0.8861
No log 3.5636 196 0.7484 0.4531 0.7484 0.8651
No log 3.6 198 0.7602 0.5066 0.7602 0.8719
No log 3.6364 200 0.8007 0.4855 0.8007 0.8948
No log 3.6727 202 0.8407 0.4664 0.8407 0.9169
No log 3.7091 204 0.8678 0.4927 0.8678 0.9316
No log 3.7455 206 0.8706 0.4693 0.8706 0.9331
No log 3.7818 208 0.8410 0.4556 0.8410 0.9170
No log 3.8182 210 0.7655 0.4650 0.7655 0.8749
No log 3.8545 212 0.7966 0.4860 0.7966 0.8925
No log 3.8909 214 0.8129 0.4781 0.8129 0.9016
No log 3.9273 216 0.7865 0.4550 0.7865 0.8869
No log 3.9636 218 0.7829 0.5037 0.7829 0.8848
No log 4.0 220 0.8070 0.4695 0.8070 0.8983
No log 4.0364 222 0.7884 0.4997 0.7884 0.8879
No log 4.0727 224 0.7428 0.4642 0.7428 0.8618
No log 4.1091 226 0.7343 0.5222 0.7343 0.8569
No log 4.1455 228 0.7223 0.5195 0.7223 0.8499
No log 4.1818 230 0.6859 0.4899 0.6859 0.8282
No log 4.2182 232 0.7027 0.5149 0.7027 0.8383
No log 4.2545 234 0.7368 0.5187 0.7368 0.8584
No log 4.2909 236 0.7645 0.4994 0.7645 0.8743
No log 4.3273 238 0.8755 0.4825 0.8755 0.9357
No log 4.3636 240 1.0294 0.4312 1.0294 1.0146
No log 4.4 242 0.9872 0.4264 0.9872 0.9936
No log 4.4364 244 0.8352 0.4947 0.8352 0.9139
No log 4.4727 246 0.7567 0.4830 0.7567 0.8699
No log 4.5091 248 0.8110 0.4906 0.8110 0.9005
No log 4.5455 250 0.7985 0.4848 0.7985 0.8936
No log 4.5818 252 0.7401 0.4837 0.7401 0.8603
No log 4.6182 254 0.7205 0.5486 0.7205 0.8488
No log 4.6545 256 0.6859 0.4931 0.6859 0.8282
No log 4.6909 258 0.6658 0.5151 0.6658 0.8160
No log 4.7273 260 0.6701 0.5090 0.6701 0.8186
No log 4.7636 262 0.6944 0.5246 0.6944 0.8333
No log 4.8 264 0.7731 0.5050 0.7731 0.8793
No log 4.8364 266 0.8300 0.4950 0.8300 0.9111
No log 4.8727 268 0.8002 0.5221 0.8002 0.8945
No log 4.9091 270 0.7770 0.5169 0.7770 0.8815
No log 4.9455 272 0.8148 0.5085 0.8148 0.9027
No log 4.9818 274 0.7937 0.5094 0.7937 0.8909
No log 5.0182 276 0.7469 0.5106 0.7469 0.8643
No log 5.0545 278 0.7307 0.5206 0.7307 0.8548
No log 5.0909 280 0.7326 0.5133 0.7326 0.8559
No log 5.1273 282 0.7457 0.5149 0.7457 0.8635
No log 5.1636 284 0.7752 0.5198 0.7752 0.8804
No log 5.2 286 0.7719 0.5198 0.7719 0.8786
No log 5.2364 288 0.7535 0.5113 0.7535 0.8680
No log 5.2727 290 0.7344 0.5000 0.7344 0.8570
No log 5.3091 292 0.7039 0.5225 0.7039 0.8390
No log 5.3455 294 0.6762 0.5265 0.6762 0.8223
No log 5.3818 296 0.6807 0.4713 0.6807 0.8251
No log 5.4182 298 0.6756 0.5021 0.6756 0.8220
No log 5.4545 300 0.6890 0.5029 0.6890 0.8301
No log 5.4909 302 0.7075 0.5324 0.7075 0.8411
No log 5.5273 304 0.7465 0.4958 0.7465 0.8640
No log 5.5636 306 0.7725 0.5179 0.7725 0.8789
No log 5.6 308 0.7832 0.5044 0.7832 0.8850
No log 5.6364 310 0.7744 0.5282 0.7744 0.8800
No log 5.6727 312 0.7562 0.5465 0.7562 0.8696
No log 5.7091 314 0.7645 0.5430 0.7645 0.8743
No log 5.7455 316 0.7625 0.5321 0.7625 0.8732
No log 5.7818 318 0.7467 0.5258 0.7467 0.8641
No log 5.8182 320 0.7302 0.5463 0.7302 0.8545
No log 5.8545 322 0.7157 0.5444 0.7157 0.8460
No log 5.8909 324 0.7058 0.5233 0.7058 0.8401
No log 5.9273 326 0.7313 0.5184 0.7313 0.8552
No log 5.9636 328 0.7473 0.5360 0.7473 0.8645
No log 6.0 330 0.7464 0.5085 0.7464 0.8639
No log 6.0364 332 0.7767 0.4895 0.7767 0.8813
No log 6.0727 334 0.7957 0.4859 0.7957 0.8920
No log 6.1091 336 0.7982 0.5058 0.7982 0.8934
No log 6.1455 338 0.7963 0.5061 0.7963 0.8924
No log 6.1818 340 0.8162 0.5265 0.8162 0.9034
No log 6.2182 342 0.8028 0.5221 0.8028 0.8960
No log 6.2545 344 0.7592 0.4940 0.7592 0.8713
No log 6.2909 346 0.7456 0.4977 0.7456 0.8635
No log 6.3273 348 0.7275 0.4841 0.7275 0.8530
No log 6.3636 350 0.7128 0.4895 0.7128 0.8443
No log 6.4 352 0.7160 0.4948 0.7160 0.8462
No log 6.4364 354 0.7219 0.4890 0.7219 0.8496
No log 6.4727 356 0.7210 0.4913 0.7210 0.8491
No log 6.5091 358 0.7312 0.4731 0.7312 0.8551
No log 6.5455 360 0.7535 0.4884 0.7535 0.8680
No log 6.5818 362 0.7695 0.4884 0.7695 0.8772
No log 6.6182 364 0.7692 0.5085 0.7692 0.8770
No log 6.6545 366 0.7679 0.5230 0.7679 0.8763
No log 6.6909 368 0.7567 0.5095 0.7567 0.8699
No log 6.7273 370 0.7355 0.5020 0.7355 0.8576
No log 6.7636 372 0.7229 0.5248 0.7229 0.8502
No log 6.8 374 0.7099 0.5129 0.7099 0.8425
No log 6.8364 376 0.6992 0.5105 0.6992 0.8362
No log 6.8727 378 0.7064 0.5122 0.7064 0.8405
No log 6.9091 380 0.7113 0.5122 0.7113 0.8434
No log 6.9455 382 0.7184 0.5064 0.7184 0.8476
No log 6.9818 384 0.7236 0.5208 0.7236 0.8506
No log 7.0182 386 0.7227 0.5265 0.7227 0.8501
No log 7.0545 388 0.7399 0.5265 0.7399 0.8602
No log 7.0909 390 0.7591 0.5258 0.7591 0.8712
No log 7.1273 392 0.7796 0.5263 0.7796 0.8830
No log 7.1636 394 0.7962 0.5306 0.7962 0.8923
No log 7.2 396 0.8033 0.5198 0.8033 0.8963
No log 7.2364 398 0.8176 0.5255 0.8176 0.9042
No log 7.2727 400 0.8117 0.5375 0.8117 0.9009
No log 7.3091 402 0.7920 0.5363 0.7920 0.8899
No log 7.3455 404 0.7824 0.5428 0.7824 0.8846
No log 7.3818 406 0.7837 0.5372 0.7837 0.8852
No log 7.4182 408 0.7692 0.5428 0.7692 0.8771
No log 7.4545 410 0.7531 0.4932 0.7531 0.8678
No log 7.4909 412 0.7514 0.5274 0.7514 0.8668
No log 7.5273 414 0.7632 0.4803 0.7632 0.8736
No log 7.5636 416 0.7700 0.4796 0.7700 0.8775
No log 7.6 418 0.7642 0.4856 0.7642 0.8742
No log 7.6364 420 0.7550 0.4830 0.7550 0.8689
No log 7.6727 422 0.7461 0.4830 0.7461 0.8637
No log 7.7091 424 0.7461 0.4884 0.7461 0.8638
No log 7.7455 426 0.7587 0.4797 0.7587 0.8710
No log 7.7818 428 0.7619 0.4764 0.7619 0.8729
No log 7.8182 430 0.7602 0.4838 0.7602 0.8719
No log 7.8545 432 0.7587 0.4845 0.7587 0.8710
No log 7.8909 434 0.7500 0.4845 0.7500 0.8661
No log 7.9273 436 0.7450 0.4985 0.7450 0.8632
No log 7.9636 438 0.7450 0.4927 0.7450 0.8631
No log 8.0 440 0.7452 0.4812 0.7452 0.8632
No log 8.0364 442 0.7535 0.4636 0.7535 0.8681
No log 8.0727 444 0.7629 0.4636 0.7629 0.8734
No log 8.1091 446 0.7707 0.4692 0.7707 0.8779
No log 8.1455 448 0.7787 0.4937 0.7787 0.8825
No log 8.1818 450 0.7909 0.4569 0.7909 0.8893
No log 8.2182 452 0.8014 0.4620 0.8014 0.8952
No log 8.2545 454 0.8052 0.4717 0.8052 0.8973
No log 8.2909 456 0.8014 0.4835 0.8014 0.8952
No log 8.3273 458 0.7974 0.4909 0.7974 0.8930
No log 8.3636 460 0.7806 0.4558 0.7806 0.8835
No log 8.4 462 0.7641 0.4502 0.7641 0.8741
No log 8.4364 464 0.7550 0.4941 0.7550 0.8689
No log 8.4727 466 0.7431 0.5087 0.7431 0.8621
No log 8.5091 468 0.7400 0.4899 0.7400 0.8603
No log 8.5455 470 0.7351 0.4910 0.7351 0.8574
No log 8.5818 472 0.7319 0.4870 0.7319 0.8555
No log 8.6182 474 0.7335 0.4972 0.7335 0.8565
No log 8.6545 476 0.7347 0.4972 0.7347 0.8571
No log 8.6909 478 0.7380 0.4972 0.7380 0.8591
No log 8.7273 480 0.7418 0.5217 0.7418 0.8613
No log 8.7636 482 0.7447 0.5217 0.7447 0.8630
No log 8.8 484 0.7471 0.5217 0.7471 0.8644
No log 8.8364 486 0.7479 0.5217 0.7479 0.8648
No log 8.8727 488 0.7426 0.5217 0.7426 0.8617
No log 8.9091 490 0.7384 0.5217 0.7384 0.8593
No log 8.9455 492 0.7356 0.5217 0.7356 0.8577
No log 8.9818 494 0.7315 0.5217 0.7315 0.8553
No log 9.0182 496 0.7288 0.5078 0.7288 0.8537
No log 9.0545 498 0.7278 0.5078 0.7278 0.8531
0.3824 9.0909 500 0.7310 0.4985 0.7310 0.8550
0.3824 9.1273 502 0.7326 0.4985 0.7326 0.8559
0.3824 9.1636 504 0.7372 0.4927 0.7372 0.8586
0.3824 9.2 506 0.7444 0.4845 0.7444 0.8628
0.3824 9.2364 508 0.7495 0.4845 0.7495 0.8657
0.3824 9.2727 510 0.7545 0.4922 0.7545 0.8686
0.3824 9.3091 512 0.7577 0.5067 0.7577 0.8704
0.3824 9.3455 514 0.7616 0.5067 0.7616 0.8727
0.3824 9.3818 516 0.7630 0.5354 0.7630 0.8735
0.3824 9.4182 518 0.7610 0.5354 0.7610 0.8723
0.3824 9.4545 520 0.7607 0.5298 0.7607 0.8722
0.3824 9.4909 522 0.7598 0.5298 0.7598 0.8717
0.3824 9.5273 524 0.7591 0.5354 0.7591 0.8712
0.3824 9.5636 526 0.7572 0.5354 0.7572 0.8702
0.3824 9.6 528 0.7547 0.5410 0.7547 0.8687
0.3824 9.6364 530 0.7538 0.5206 0.7538 0.8682
0.3824 9.6727 532 0.7523 0.5206 0.7523 0.8674
0.3824 9.7091 534 0.7507 0.5206 0.7507 0.8664
0.3824 9.7455 536 0.7494 0.5206 0.7494 0.8657
0.3824 9.7818 538 0.7478 0.5278 0.7478 0.8648
0.3824 9.8182 540 0.7461 0.5278 0.7461 0.8638
0.3824 9.8545 542 0.7445 0.5278 0.7445 0.8628
0.3824 9.8909 544 0.7427 0.5278 0.7427 0.8618
0.3824 9.9273 546 0.7412 0.5278 0.7412 0.8609
0.3824 9.9636 548 0.7404 0.5278 0.7404 0.8604
0.3824 10.0 550 0.7401 0.5278 0.7401 0.8603

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
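
To approximate this environment, the versions above can be pinned (a sketch; the `+cu118` build of PyTorch comes from the CUDA 11.8 wheel index, and availability depends on your platform):

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```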

Safetensors weights

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k11_task2_organization

Finetuned from aubmindlab/bert-base-arabertv02 (one of 4023 fine-tunes of that base model).