ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k18_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9149
  • Qwk: 0.3270
  • Mse: 0.9149
  • Rmse: 0.9565
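Qwk (Quadratic Weighted Kappa), Mse, and Rmse are standard metrics for ordinal scoring tasks. A minimal sketch of how such values can be computed with scikit-learn (the labels below are illustrative, not from the actual evaluation set):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative integer score labels (NOT the model's real eval data)
y_true = np.array([0, 1, 2, 3, 2, 1, 0, 3])
y_pred = np.array([0, 2, 2, 3, 1, 1, 1, 3])

# Quadratic Weighted Kappa: chance-corrected agreement with a
# quadratic penalty for predictions far from the true score
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE and its square root (RMSE), as reported in the table below
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(f"Qwk: {qwk:.4f}, Mse: {mse:.4f}, Rmse: {rmse:.4f}")
```

Note that Mse and Rmse treat the class indices as numeric scores, which is why the reported Loss and Mse coincide when training directly on a regression objective.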

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
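The hyperparameters above map onto `transformers.TrainingArguments` keyword arguments roughly as follows. This is a sketch: the original training script is not published, so the mapping is an assumption based on the standard Trainer API.

```python
from typing import Any, Dict

# Hyperparameters reported above, as a kwargs dict that could be unpacked
# into transformers.TrainingArguments(output_dir=..., **hyperparams).
# (Sketch only; the original run's script and paths are unknown.)
hyperparams: Dict[str, Any] = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}
```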

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0364 2 4.7824 0.0010 4.7824 2.1869
No log 0.0727 4 2.7949 -0.0452 2.7949 1.6718
No log 0.1091 6 2.1438 0.0094 2.1438 1.4642
No log 0.1455 8 2.7067 -0.0681 2.7067 1.6452
No log 0.1818 10 2.0233 -0.0233 2.0233 1.4224
No log 0.2182 12 1.5715 0.0018 1.5715 1.2536
No log 0.2545 14 1.4950 -0.0522 1.4950 1.2227
No log 0.2909 16 1.5672 -0.0534 1.5672 1.2519
No log 0.3273 18 1.5126 -0.0603 1.5126 1.2299
No log 0.3636 20 1.4470 -0.0603 1.4470 1.2029
No log 0.4 22 1.5509 0.0124 1.5509 1.2454
No log 0.4364 24 1.5838 0.0121 1.5838 1.2585
No log 0.4727 26 1.3843 0.1472 1.3843 1.1766
No log 0.5091 28 1.2289 0.0741 1.2289 1.1085
No log 0.5455 30 1.2237 0.0700 1.2237 1.1062
No log 0.5818 32 1.2593 0.0527 1.2593 1.1222
No log 0.6182 34 1.2912 0.0353 1.2912 1.1363
No log 0.6545 36 1.3308 0.0353 1.3308 1.1536
No log 0.6909 38 1.2902 -0.0043 1.2902 1.1359
No log 0.7273 40 1.2761 0.1011 1.2761 1.1297
No log 0.7636 42 1.2610 0.1579 1.2610 1.1229
No log 0.8 44 1.2323 0.1530 1.2323 1.1101
No log 0.8364 46 1.1637 0.1904 1.1637 1.0788
No log 0.8727 48 1.1364 0.2785 1.1364 1.0660
No log 0.9091 50 1.1317 0.2038 1.1317 1.0638
No log 0.9455 52 1.1248 0.1546 1.1248 1.0606
No log 0.9818 54 1.1400 0.1706 1.1400 1.0677
No log 1.0182 56 1.1534 0.1911 1.1534 1.0740
No log 1.0545 58 1.1542 0.1860 1.1542 1.0743
No log 1.0909 60 1.0527 0.2658 1.0527 1.0260
No log 1.1273 62 1.1391 0.3433 1.1391 1.0673
No log 1.1636 64 1.5475 0.1520 1.5475 1.2440
No log 1.2 66 1.7204 0.1618 1.7204 1.3117
No log 1.2364 68 1.6726 0.1140 1.6726 1.2933
No log 1.2727 70 1.4407 0.1743 1.4407 1.2003
No log 1.3091 72 1.2061 0.2191 1.2061 1.0982
No log 1.3455 74 1.0773 0.3908 1.0773 1.0379
No log 1.3818 76 1.0337 0.3612 1.0337 1.0167
No log 1.4182 78 1.0452 0.3793 1.0452 1.0224
No log 1.4545 80 1.0376 0.3567 1.0376 1.0186
No log 1.4909 82 1.0425 0.3090 1.0425 1.0210
No log 1.5273 84 1.0665 0.2590 1.0665 1.0327
No log 1.5636 86 1.0196 0.3021 1.0196 1.0097
No log 1.6 88 1.0450 0.3615 1.0450 1.0222
No log 1.6364 90 1.0477 0.3960 1.0477 1.0236
No log 1.6727 92 1.0849 0.3476 1.0849 1.0416
No log 1.7091 94 1.0528 0.4137 1.0528 1.0260
No log 1.7455 96 0.9493 0.4059 0.9493 0.9743
No log 1.7818 98 0.8759 0.4039 0.8759 0.9359
No log 1.8182 100 0.8620 0.4394 0.8620 0.9284
No log 1.8545 102 0.8678 0.4202 0.8678 0.9316
No log 1.8909 104 0.8925 0.3169 0.8925 0.9447
No log 1.9273 106 0.9197 0.3168 0.9197 0.9590
No log 1.9636 108 0.9251 0.3627 0.9251 0.9618
No log 2.0 110 0.8916 0.3237 0.8916 0.9442
No log 2.0364 112 0.9214 0.5000 0.9214 0.9599
No log 2.0727 114 1.2592 0.4371 1.2592 1.1222
No log 2.1091 116 1.3732 0.2662 1.3732 1.1719
No log 2.1455 118 1.1125 0.4803 1.1125 1.0547
No log 2.1818 120 0.9339 0.4736 0.9339 0.9664
No log 2.2182 122 1.0177 0.4454 1.0177 1.0088
No log 2.2545 124 0.9630 0.4626 0.9630 0.9813
No log 2.2909 126 0.9312 0.3412 0.9312 0.9650
No log 2.3273 128 1.0027 0.3686 1.0027 1.0013
No log 2.3636 130 1.0006 0.2999 1.0006 1.0003
No log 2.4 132 1.0824 0.3398 1.0824 1.0404
No log 2.4364 134 1.0805 0.3068 1.0805 1.0395
No log 2.4727 136 0.9812 0.3412 0.9812 0.9905
No log 2.5091 138 0.9718 0.3309 0.9718 0.9858
No log 2.5455 140 0.9882 0.3654 0.9882 0.9941
No log 2.5818 142 1.0588 0.4167 1.0588 1.0290
No log 2.6182 144 1.0454 0.4220 1.0454 1.0224
No log 2.6545 146 0.9737 0.4132 0.9737 0.9868
No log 2.6909 148 0.8723 0.4002 0.8723 0.9339
No log 2.7273 150 0.8535 0.3920 0.8535 0.9239
No log 2.7636 152 0.8600 0.4159 0.8600 0.9274
No log 2.8 154 0.8722 0.4316 0.8722 0.9339
No log 2.8364 156 0.9142 0.4019 0.9142 0.9562
No log 2.8727 158 0.8898 0.3694 0.8898 0.9433
No log 2.9091 160 0.8890 0.3914 0.8890 0.9428
No log 2.9455 162 0.9726 0.4136 0.9726 0.9862
No log 2.9818 164 0.9785 0.4136 0.9785 0.9892
No log 3.0182 166 0.9553 0.4408 0.9553 0.9774
No log 3.0545 168 0.8823 0.4609 0.8823 0.9393
No log 3.0909 170 0.9341 0.3986 0.9341 0.9665
No log 3.1273 172 0.9255 0.3747 0.9255 0.9620
No log 3.1636 174 0.8928 0.4009 0.8928 0.9449
No log 3.2 176 0.8475 0.3873 0.8475 0.9206
No log 3.2364 178 0.8674 0.5060 0.8674 0.9313
No log 3.2727 180 0.9141 0.4717 0.9141 0.9561
No log 3.3091 182 0.8991 0.4541 0.8991 0.9482
No log 3.3455 184 0.8841 0.2464 0.8841 0.9403
No log 3.3818 186 0.9046 0.3289 0.9046 0.9511
No log 3.4182 188 0.9082 0.3211 0.9082 0.9530
No log 3.4545 190 0.9116 0.4252 0.9116 0.9548
No log 3.4909 192 1.0372 0.4620 1.0372 1.0184
No log 3.5273 194 1.1275 0.4106 1.1275 1.0618
No log 3.5636 196 1.0208 0.4991 1.0208 1.0103
No log 3.6 198 0.9621 0.3956 0.9621 0.9809
No log 3.6364 200 0.9563 0.3261 0.9563 0.9779
No log 3.6727 202 0.9642 0.3478 0.9642 0.9819
No log 3.7091 204 0.9339 0.2692 0.9339 0.9664
No log 3.7455 206 0.9314 0.3299 0.9314 0.9651
No log 3.7818 208 0.9487 0.4444 0.9487 0.9740
No log 3.8182 210 0.9312 0.3250 0.9312 0.9650
No log 3.8545 212 0.9397 0.2843 0.9397 0.9694
No log 3.8909 214 0.9377 0.2843 0.9377 0.9683
No log 3.9273 216 0.9430 0.3753 0.9430 0.9711
No log 3.9636 218 0.9625 0.2692 0.9625 0.9811
No log 4.0 220 0.9935 0.1927 0.9935 0.9968
No log 4.0364 222 1.0045 0.1717 1.0045 1.0022
No log 4.0727 224 0.9831 0.1444 0.9831 0.9915
No log 4.1091 226 0.9914 0.1343 0.9914 0.9957
No log 4.1455 228 0.9805 0.1875 0.9805 0.9902
No log 4.1818 230 0.9858 0.1546 0.9858 0.9929
No log 4.2182 232 0.9715 0.2569 0.9715 0.9857
No log 4.2545 234 0.9670 0.1711 0.9670 0.9833
No log 4.2909 236 0.9446 0.2729 0.9446 0.9719
No log 4.3273 238 0.9305 0.3307 0.9305 0.9646
No log 4.3636 240 0.9226 0.3307 0.9226 0.9605
No log 4.4 242 0.9158 0.3551 0.9158 0.9570
No log 4.4364 244 0.9113 0.2621 0.9113 0.9546
No log 4.4727 246 0.9202 0.3779 0.9202 0.9593
No log 4.5091 248 1.0050 0.3992 1.0050 1.0025
No log 4.5455 250 1.0490 0.4417 1.0490 1.0242
No log 4.5818 252 0.9896 0.3660 0.9896 0.9948
No log 4.6182 254 0.9508 0.3070 0.9508 0.9751
No log 4.6545 256 0.9598 0.2164 0.9598 0.9797
No log 4.6909 258 0.9402 0.2972 0.9402 0.9697
No log 4.7273 260 0.9043 0.3174 0.9043 0.9509
No log 4.7636 262 0.9133 0.3866 0.9133 0.9557
No log 4.8 264 0.9161 0.3909 0.9161 0.9571
No log 4.8364 266 0.8921 0.4261 0.8921 0.9445
No log 4.8727 268 0.8811 0.3866 0.8811 0.9386
No log 4.9091 270 0.8821 0.2729 0.8821 0.9392
No log 4.9455 272 0.8752 0.3045 0.8752 0.9355
No log 4.9818 274 0.8830 0.3278 0.8830 0.9397
No log 5.0182 276 0.8884 0.2834 0.8884 0.9426
No log 5.0545 278 0.8789 0.2624 0.8789 0.9375
No log 5.0909 280 0.8720 0.2887 0.8720 0.9338
No log 5.1273 282 0.8593 0.3830 0.8593 0.9270
No log 5.1636 284 0.8581 0.3804 0.8581 0.9263
No log 5.2 286 0.8912 0.3506 0.8912 0.9440
No log 5.2364 288 0.8917 0.3256 0.8917 0.9443
No log 5.2727 290 0.8836 0.3671 0.8836 0.9400
No log 5.3091 292 0.8930 0.3558 0.8930 0.9450
No log 5.3455 294 0.8871 0.3558 0.8871 0.9419
No log 5.3818 296 0.9049 0.4483 0.9049 0.9512
No log 5.4182 298 0.8914 0.4094 0.8914 0.9442
No log 5.4545 300 0.8471 0.3145 0.8471 0.9204
No log 5.4909 302 0.8428 0.3478 0.8428 0.9181
No log 5.5273 304 0.8069 0.3914 0.8069 0.8983
No log 5.5636 306 0.7940 0.5446 0.7940 0.8911
No log 5.6 308 0.8014 0.5549 0.8014 0.8952
No log 5.6364 310 0.8746 0.5416 0.8746 0.9352
No log 5.6727 312 0.8218 0.5052 0.8218 0.9065
No log 5.7091 314 0.7587 0.4626 0.7587 0.8711
No log 5.7455 316 0.7691 0.4435 0.7691 0.8770
No log 5.7818 318 0.7793 0.4527 0.7793 0.8828
No log 5.8182 320 0.7720 0.4794 0.7720 0.8787
No log 5.8545 322 0.7692 0.4334 0.7692 0.8771
No log 5.8909 324 0.7691 0.4706 0.7691 0.8770
No log 5.9273 326 0.7929 0.4242 0.7929 0.8904
No log 5.9636 328 0.8882 0.4351 0.8882 0.9424
No log 6.0 330 0.9055 0.4565 0.9055 0.9516
No log 6.0364 332 0.7990 0.4369 0.7990 0.8938
No log 6.0727 334 0.7775 0.5219 0.7775 0.8818
No log 6.1091 336 0.8401 0.5602 0.8401 0.9166
No log 6.1455 338 0.7960 0.5219 0.7960 0.8922
No log 6.1818 340 0.8099 0.4142 0.8099 0.8999
No log 6.2182 342 0.8887 0.4025 0.8887 0.9427
No log 6.2545 344 0.8553 0.3769 0.8553 0.9248
No log 6.2909 346 0.8254 0.3392 0.8254 0.9085
No log 6.3273 348 0.8022 0.4434 0.8022 0.8957
No log 6.3636 350 0.8122 0.4471 0.8122 0.9012
No log 6.4 352 0.7933 0.4434 0.7933 0.8907
No log 6.4364 354 0.8114 0.4548 0.8114 0.9008
No log 6.4727 356 0.9552 0.4128 0.9552 0.9774
No log 6.5091 358 1.0667 0.3978 1.0667 1.0328
No log 6.5455 360 0.9916 0.3141 0.9916 0.9958
No log 6.5818 362 0.8666 0.2834 0.8666 0.9309
No log 6.6182 364 0.8241 0.3392 0.8241 0.9078
No log 6.6545 366 0.8174 0.3695 0.8174 0.9041
No log 6.6909 368 0.8722 0.3966 0.8722 0.9339
No log 6.7273 370 0.8569 0.4648 0.8569 0.9257
No log 6.7636 372 0.7782 0.5410 0.7782 0.8822
No log 6.8 374 0.7867 0.4993 0.7867 0.8869
No log 6.8364 376 0.8260 0.4100 0.8260 0.9088
No log 6.8727 378 0.8760 0.4264 0.8760 0.9359
No log 6.9091 380 0.9938 0.3978 0.9938 0.9969
No log 6.9455 382 1.0102 0.4186 1.0102 1.0051
No log 6.9818 384 0.9724 0.3978 0.9724 0.9861
No log 7.0182 386 0.9157 0.4127 0.9157 0.9569
No log 7.0545 388 0.8498 0.4140 0.8498 0.9219
No log 7.0909 390 0.8366 0.4284 0.8366 0.9147
No log 7.1273 392 0.8310 0.4180 0.8310 0.9116
No log 7.1636 394 0.8770 0.4429 0.8770 0.9365
No log 7.2 396 1.0333 0.4497 1.0333 1.0165
No log 7.2364 398 1.0377 0.4497 1.0377 1.0187
No log 7.2727 400 0.9574 0.3918 0.9574 0.9785
No log 7.3091 402 0.8914 0.4302 0.8914 0.9441
No log 7.3455 404 0.8921 0.4302 0.8921 0.9445
No log 7.3818 406 0.8801 0.4482 0.8801 0.9381
No log 7.4182 408 0.8741 0.4243 0.8741 0.9350
No log 7.4545 410 0.8432 0.4385 0.8432 0.9182
No log 7.4909 412 0.8209 0.3660 0.8209 0.9060
No log 7.5273 414 0.8275 0.4444 0.8275 0.9097
No log 7.5636 416 0.8445 0.4483 0.8445 0.9190
No log 7.6 418 0.9798 0.3823 0.9798 0.9899
No log 7.6364 420 0.9966 0.4005 0.9966 0.9983
No log 7.6727 422 0.8330 0.4741 0.8330 0.9127
No log 7.7091 424 0.7621 0.4826 0.7621 0.8730
No log 7.7455 426 0.7594 0.5093 0.7594 0.8715
No log 7.7818 428 0.8154 0.4752 0.8154 0.9030
No log 7.8182 430 0.8371 0.4752 0.8371 0.9149
No log 7.8545 432 0.7472 0.4439 0.7472 0.8644
No log 7.8909 434 0.7212 0.5079 0.7212 0.8492
No log 7.9273 436 0.7183 0.5215 0.7183 0.8475
No log 7.9636 438 0.7381 0.5089 0.7381 0.8592
No log 8.0 440 0.7402 0.5524 0.7402 0.8603
No log 8.0364 442 0.7170 0.5821 0.7170 0.8468
No log 8.0727 444 0.7441 0.5528 0.7441 0.8626
No log 8.1091 446 0.7608 0.4583 0.7608 0.8723
No log 8.1455 448 0.7811 0.4491 0.7811 0.8838
No log 8.1818 450 0.7984 0.3974 0.7984 0.8935
No log 8.2182 452 0.7952 0.3974 0.7952 0.8918
No log 8.2545 454 0.8140 0.4098 0.8140 0.9022
No log 8.2909 456 0.8068 0.4098 0.8068 0.8982
No log 8.3273 458 0.7649 0.4346 0.7649 0.8746
No log 8.3636 460 0.7320 0.5257 0.7320 0.8556
No log 8.4 462 0.7223 0.5942 0.7223 0.8499
No log 8.4364 464 0.7528 0.5127 0.7528 0.8676
No log 8.4727 466 0.7852 0.5691 0.7852 0.8861
No log 8.5091 468 0.7615 0.5427 0.7615 0.8726
No log 8.5455 470 0.7358 0.5591 0.7358 0.8578
No log 8.5818 472 0.7440 0.5714 0.7440 0.8625
No log 8.6182 474 0.7299 0.5688 0.7299 0.8543
No log 8.6545 476 0.7139 0.5563 0.7139 0.8449
No log 8.6909 478 0.7109 0.5486 0.7109 0.8431
No log 8.7273 480 0.7308 0.5149 0.7308 0.8549
No log 8.7636 482 0.7476 0.5102 0.7476 0.8646
No log 8.8 484 0.7669 0.5102 0.7669 0.8757
No log 8.8364 486 0.8332 0.5124 0.8332 0.9128
No log 8.8727 488 0.9700 0.4796 0.9700 0.9849
No log 8.9091 490 0.9677 0.4806 0.9677 0.9837
No log 8.9455 492 0.8465 0.4700 0.8465 0.9200
No log 8.9818 494 0.7926 0.4714 0.7926 0.8903
No log 9.0182 496 0.7630 0.5435 0.7630 0.8735
No log 9.0545 498 0.7820 0.4934 0.7820 0.8843
0.4104 9.0909 500 0.8958 0.5090 0.8958 0.9465
0.4104 9.1273 502 1.1102 0.4349 1.1102 1.0537
0.4104 9.1636 504 1.1563 0.3715 1.1563 1.0753
0.4104 9.2 506 1.0618 0.3929 1.0618 1.0304
0.4104 9.2364 508 0.9739 0.3333 0.9739 0.9869
0.4104 9.2727 510 0.9149 0.3270 0.9149 0.9565

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
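To approximate the training environment, the listed versions can be pinned as follows (a sketch; the `cu118` index URL assumes a CUDA 11.8 PyTorch wheel, matching the `2.4.0+cu118` build tag above):

```shell
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```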
Model tree

Fine-tuned from aubmindlab/bert-base-arabertv02. Hub repository: MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k18_task2_organization.