ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k2_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0606
  • Qwk: 0.5970
  • Mse: 1.0606
  • Rmse: 1.0299
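Qwk denotes Quadratic Weighted Kappa, a chance-corrected agreement metric for ordinal labels in which predictions are penalized by the squared distance from the true class. A minimal pure-Python sketch of the metric, for illustration only (the exact label range used by this task is not stated in the card):

```python
def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Chance-corrected agreement with quadratic penalties for ordinal labels."""
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix under independence (outer product of the marginals).
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(num_classes)) for j in range(num_classes)]
    expected = [[hist_t[i] * hist_p[j] / n for j in range(num_classes)]
                for i in range(num_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance.
    denom = (num_classes - 1) ** 2
    weight = [[(i - j) ** 2 / denom for j in range(num_classes)]
              for i in range(num_classes)]
    num = sum(weight[i][j] * observed[i][j]
              for i in range(num_classes) for j in range(num_classes))
    den = sum(weight[i][j] * expected[i][j]
              for i in range(num_classes) for j in range(num_classes))
    return 1.0 - num / den
```

Perfect agreement scores 1.0 and chance-level agreement scores about 0, so the final Qwk of 0.5970 indicates moderate ordinal agreement.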

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
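With lr_scheduler_type set to linear and no warmup specified, the learning rate decays linearly from 2e-05 at the first optimizer step to 0 at the last. A small sketch of that schedule (the warmup branch mirrors the usual linear-warmup behavior; the total step count depends on the dataset size, which is not given in this card):

```python
def linear_schedule_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under linear warmup then linear decay."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr during warmup.
        return base_lr * step / max(1, warmup_steps)
    # Decay linearly from base_lr down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, if training runs for 1200 optimizer steps overall, the rate is exactly halved at step 600.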

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1667 2 6.8542 0.0308 6.8542 2.6181
No log 0.3333 4 4.2472 0.0803 4.2472 2.0609
No log 0.5 6 2.7860 0.0988 2.7860 1.6691
No log 0.6667 8 1.9509 0.2689 1.9509 1.3968
No log 0.8333 10 1.6758 0.1905 1.6758 1.2945
No log 1.0 12 1.7979 0.1538 1.7979 1.3409
No log 1.1667 14 1.9694 0.0741 1.9694 1.4033
No log 1.3333 16 2.0149 0.0367 2.0149 1.4195
No log 1.5 18 1.8017 0.2075 1.8017 1.3423
No log 1.6667 20 1.4168 0.3158 1.4168 1.1903
No log 1.8333 22 1.6558 0.3622 1.6558 1.2868
No log 2.0 24 1.3291 0.3036 1.3291 1.1529
No log 2.1667 26 1.2674 0.3750 1.2674 1.1258
No log 2.3333 28 1.2168 0.4071 1.2168 1.1031
No log 2.5 30 1.2105 0.3393 1.2105 1.1002
No log 2.6667 32 1.3981 0.4480 1.3981 1.1824
No log 2.8333 34 1.3582 0.4516 1.3582 1.1654
No log 3.0 36 1.2741 0.3717 1.2741 1.1287
No log 3.1667 38 1.5056 0.1852 1.5056 1.2270
No log 3.3333 40 1.5781 0.2407 1.5781 1.2562
No log 3.5 42 1.5285 0.4228 1.5285 1.2363
No log 3.6667 44 1.3022 0.5077 1.3022 1.1411
No log 3.8333 46 1.1540 0.5191 1.1540 1.0743
No log 4.0 48 1.4213 0.5517 1.4213 1.1922
No log 4.1667 50 1.1553 0.6014 1.1553 1.0749
No log 4.3333 52 0.8960 0.6525 0.8960 0.9466
No log 4.5 54 1.1723 0.5397 1.1723 1.0827
No log 4.6667 56 1.5095 0.3500 1.5095 1.2286
No log 4.8333 58 1.6659 0.1982 1.6659 1.2907
No log 5.0 60 1.7011 0.1930 1.7011 1.3043
No log 5.1667 62 1.3987 0.3478 1.3987 1.1826
No log 5.3333 64 1.0910 0.5669 1.0910 1.0445
No log 5.5 66 0.9351 0.6667 0.9351 0.9670
No log 5.6667 68 0.9336 0.5970 0.9336 0.9662
No log 5.8333 70 0.9880 0.5120 0.9880 0.9940
No log 6.0 72 1.2065 0.5344 1.2065 1.0984
No log 6.1667 74 1.3508 0.4848 1.3508 1.1622
No log 6.3333 76 1.3902 0.4928 1.3902 1.1791
No log 6.5 78 1.2167 0.5693 1.2167 1.1030
No log 6.6667 80 1.0365 0.6015 1.0365 1.0181
No log 6.8333 82 0.9272 0.6074 0.9272 0.9629
No log 7.0 84 0.8435 0.6475 0.8435 0.9184
No log 7.1667 86 0.8278 0.6370 0.8278 0.9098
No log 7.3333 88 0.8854 0.6222 0.8854 0.9410
No log 7.5 90 1.0308 0.5735 1.0308 1.0153
No log 7.6667 92 1.3648 0.5103 1.3648 1.1682
No log 7.8333 94 1.6154 0.3380 1.6154 1.2710
No log 8.0 96 1.6361 0.4533 1.6361 1.2791
No log 8.1667 98 1.4039 0.5541 1.4039 1.1849
No log 8.3333 100 1.0597 0.6143 1.0597 1.0294
No log 8.5 102 0.9001 0.5802 0.9001 0.9488
No log 8.6667 104 0.8869 0.5802 0.8869 0.9418
No log 8.8333 106 0.8498 0.5649 0.8498 0.9219
No log 9.0 108 0.8742 0.6176 0.8742 0.9350
No log 9.1667 110 0.9839 0.6259 0.9839 0.9919
No log 9.3333 112 1.0117 0.6400 1.0117 1.0058
No log 9.5 114 1.0781 0.6056 1.0781 1.0383
No log 9.6667 116 1.0311 0.5985 1.0311 1.0154
No log 9.8333 118 0.9423 0.6232 0.9423 0.9707
No log 10.0 120 0.8945 0.6522 0.8945 0.9458
No log 10.1667 122 1.0135 0.6370 1.0135 1.0067
No log 10.3333 124 1.2454 0.5333 1.2454 1.1160
No log 10.5 126 1.1780 0.5714 1.1780 1.0854
No log 10.6667 128 1.0108 0.5625 1.0108 1.0054
No log 10.8333 130 0.9044 0.6316 0.9044 0.9510
No log 11.0 132 0.9342 0.5909 0.9342 0.9665
No log 11.1667 134 1.1291 0.5714 1.1291 1.0626
No log 11.3333 136 1.1491 0.5547 1.1491 1.0720
No log 11.5 138 0.9331 0.5816 0.9331 0.9660
No log 11.6667 140 0.7977 0.6812 0.7977 0.8931
No log 11.8333 142 0.8422 0.6418 0.8422 0.9177
No log 12.0 144 0.9021 0.6377 0.9021 0.9498
No log 12.1667 146 0.9188 0.6377 0.9188 0.9585
No log 12.3333 148 0.9265 0.6569 0.9265 0.9625
No log 12.5 150 1.0803 0.5630 1.0803 1.0394
No log 12.6667 152 1.1806 0.5753 1.1806 1.0865
No log 12.8333 154 1.0856 0.6111 1.0856 1.0419
No log 13.0 156 0.9173 0.6277 0.9173 0.9578
No log 13.1667 158 0.7949 0.6763 0.7949 0.8916
No log 13.3333 160 0.7601 0.6861 0.7601 0.8718
No log 13.5 162 0.8229 0.6107 0.8229 0.9071
No log 13.6667 164 0.9294 0.6207 0.9294 0.9641
No log 13.8333 166 0.9686 0.6216 0.9686 0.9842
No log 14.0 168 0.9057 0.6216 0.9057 0.9517
No log 14.1667 170 0.9277 0.6250 0.9277 0.9632
No log 14.3333 172 1.0040 0.6259 1.0040 1.0020
No log 14.5 174 1.0149 0.6323 1.0149 1.0074
No log 14.6667 176 0.8408 0.6800 0.8408 0.9170
No log 14.8333 178 0.6825 0.8000 0.6825 0.8261
No log 15.0 180 0.6943 0.7261 0.6943 0.8332
No log 15.1667 182 0.6963 0.7417 0.6963 0.8345
No log 15.3333 184 0.7734 0.6861 0.7734 0.8795
No log 15.5 186 1.0053 0.6383 1.0053 1.0026
No log 15.6667 188 1.1754 0.5811 1.1754 1.0842
No log 15.8333 190 1.2510 0.5417 1.2510 1.1185
No log 16.0 192 1.2115 0.5612 1.2115 1.1007
No log 16.1667 194 1.1682 0.5957 1.1682 1.0808
No log 16.3333 196 1.0048 0.6212 1.0048 1.0024
No log 16.5 198 1.0127 0.6212 1.0127 1.0063
No log 16.6667 200 1.1671 0.5286 1.1671 1.0803
No log 16.8333 202 1.2086 0.5075 1.2086 1.0994
No log 17.0 204 1.1473 0.5231 1.1473 1.0711
No log 17.1667 206 1.1002 0.5581 1.1002 1.0489
No log 17.3333 208 1.0472 0.6015 1.0472 1.0233
No log 17.5 210 1.0552 0.6176 1.0552 1.0272
No log 17.6667 212 0.9540 0.6324 0.9540 0.9767
No log 17.8333 214 0.8882 0.6519 0.8882 0.9424
No log 18.0 216 0.9077 0.6187 0.9077 0.9527
No log 18.1667 218 1.0977 0.5946 1.0977 1.0477
No log 18.3333 220 1.3036 0.5625 1.3036 1.1418
No log 18.5 222 1.2699 0.5170 1.2699 1.1269
No log 18.6667 224 1.0396 0.5797 1.0396 1.0196
No log 18.8333 226 0.8751 0.6567 0.8751 0.9355
No log 19.0 228 0.8593 0.6515 0.8593 0.9270
No log 19.1667 230 0.8994 0.6716 0.8994 0.9483
No log 19.3333 232 1.0301 0.5970 1.0301 1.0149
No log 19.5 234 1.3205 0.5229 1.3205 1.1491
No log 19.6667 236 1.4679 0.4875 1.4679 1.2116
No log 19.8333 238 1.3591 0.4800 1.3591 1.1658
No log 20.0 240 1.1363 0.5839 1.1363 1.0660
No log 20.1667 242 1.0399 0.5455 1.0399 1.0198
No log 20.3333 244 1.0390 0.5649 1.0390 1.0193
No log 20.5 246 1.0383 0.5839 1.0383 1.0190
No log 20.6667 248 1.0184 0.6351 1.0184 1.0092
No log 20.8333 250 1.0551 0.6420 1.0551 1.0272
No log 21.0 252 0.9668 0.6839 0.9668 0.9833
No log 21.1667 254 0.7892 0.7153 0.7892 0.8884
No log 21.3333 256 0.7206 0.7500 0.7206 0.8489
No log 21.5 258 0.7157 0.7153 0.7157 0.8460
No log 21.6667 260 0.7336 0.7299 0.7336 0.8565
No log 21.8333 262 0.8832 0.6957 0.8832 0.9398
No log 22.0 264 1.0133 0.6824 1.0133 1.0066
No log 22.1667 266 0.9621 0.6667 0.9621 0.9809
No log 22.3333 268 0.8595 0.6716 0.8595 0.9271
No log 22.5 270 0.7975 0.6912 0.7975 0.8930
No log 22.6667 272 0.7895 0.6567 0.7895 0.8885
No log 22.8333 274 0.8673 0.6466 0.8673 0.9313
No log 23.0 276 1.0392 0.5874 1.0392 1.0194
No log 23.1667 278 1.1190 0.5897 1.1190 1.0578
No log 23.3333 280 1.0628 0.6053 1.0628 1.0309
No log 23.5 282 0.9097 0.6475 0.9097 0.9538
No log 23.6667 284 0.8604 0.6567 0.8604 0.9276
No log 23.8333 286 0.9133 0.6716 0.9133 0.9557
No log 24.0 288 0.9740 0.6277 0.9740 0.9869
No log 24.1667 290 1.0293 0.6087 1.0293 1.0146
No log 24.3333 292 1.1678 0.5676 1.1678 1.0807
No log 24.5 294 1.2206 0.5696 1.2206 1.1048
No log 24.6667 296 1.1185 0.5890 1.1185 1.0576
No log 24.8333 298 0.9952 0.5865 0.9952 0.9976
No log 25.0 300 0.9022 0.6047 0.9022 0.9498
No log 25.1667 302 0.8786 0.6308 0.8786 0.9373
No log 25.3333 304 0.8906 0.6466 0.8906 0.9437
No log 25.5 306 0.9679 0.5926 0.9679 0.9838
No log 25.6667 308 0.9853 0.6187 0.9853 0.9926
No log 25.8333 310 0.9594 0.5970 0.9594 0.9795
No log 26.0 312 0.9667 0.5970 0.9667 0.9832
No log 26.1667 314 0.9573 0.6338 0.9573 0.9784
No log 26.3333 316 0.8843 0.6471 0.8843 0.9404
No log 26.5 318 0.8834 0.6515 0.8834 0.9399
No log 26.6667 320 0.9355 0.5827 0.9355 0.9672
No log 26.8333 322 1.0128 0.6260 1.0128 1.0064
No log 27.0 324 1.0745 0.6029 1.0745 1.0366
No log 27.1667 326 1.0578 0.5985 1.0578 1.0285
No log 27.3333 328 0.9979 0.6176 0.9979 0.9990
No log 27.5 330 0.9936 0.6286 0.9936 0.9968
No log 27.6667 332 0.9149 0.6569 0.9149 0.9565
No log 27.8333 334 0.9318 0.6074 0.9318 0.9653
No log 28.0 336 0.9874 0.5970 0.9874 0.9937
No log 28.1667 338 0.9635 0.6074 0.9635 0.9816
No log 28.3333 340 0.9476 0.6074 0.9476 0.9734
No log 28.5 342 0.9611 0.6176 0.9611 0.9803
No log 28.6667 344 0.9106 0.6324 0.9106 0.9543
No log 28.8333 346 0.8430 0.6107 0.8430 0.9181
No log 29.0 348 0.8351 0.6107 0.8351 0.9138
No log 29.1667 350 0.8514 0.5970 0.8514 0.9227
No log 29.3333 352 0.9177 0.6176 0.9177 0.9580
No log 29.5 354 0.9338 0.6241 0.9338 0.9663
No log 29.6667 356 0.8884 0.6176 0.8884 0.9426
No log 29.8333 358 0.9039 0.6074 0.9039 0.9507
No log 30.0 360 0.9086 0.5926 0.9086 0.9532
No log 30.1667 362 0.9251 0.6074 0.9251 0.9618
No log 30.3333 364 0.9831 0.6074 0.9831 0.9915
No log 30.5 366 0.9896 0.6074 0.9896 0.9948
No log 30.6667 368 0.9647 0.6176 0.9647 0.9822
No log 30.8333 370 0.9807 0.6338 0.9807 0.9903
No log 31.0 372 0.9084 0.6324 0.9084 0.9531
No log 31.1667 374 0.8432 0.6466 0.8432 0.9183
No log 31.3333 376 0.8578 0.6324 0.8578 0.9262
No log 31.5 378 0.8740 0.6324 0.8740 0.9349
No log 31.6667 380 0.9618 0.6573 0.9618 0.9807
No log 31.8333 382 0.9807 0.6338 0.9807 0.9903
No log 32.0 384 0.9424 0.6232 0.9424 0.9708
No log 32.1667 386 0.9703 0.6241 0.9703 0.9850
No log 32.3333 388 0.9494 0.6131 0.9494 0.9744
No log 32.5 390 0.9059 0.6212 0.9059 0.9518
No log 32.6667 392 0.9284 0.5954 0.9284 0.9635
No log 32.8333 394 0.9134 0.6212 0.9134 0.9557
No log 33.0 396 0.8965 0.6466 0.8965 0.9469
No log 33.1667 398 0.8989 0.6466 0.8989 0.9481
No log 33.3333 400 0.8824 0.6331 0.8824 0.9394
No log 33.5 402 0.9086 0.6479 0.9086 0.9532
No log 33.6667 404 0.9697 0.6577 0.9697 0.9848
No log 33.8333 406 0.9465 0.6575 0.9465 0.9729
No log 34.0 408 0.9148 0.6479 0.9148 0.9564
No log 34.1667 410 0.9163 0.6466 0.9163 0.9572
No log 34.3333 412 0.8810 0.6466 0.8810 0.9386
No log 34.5 414 0.8595 0.6466 0.8595 0.9271
No log 34.6667 416 0.8202 0.6202 0.8202 0.9056
No log 34.8333 418 0.8386 0.6466 0.8386 0.9158
No log 35.0 420 0.9464 0.6338 0.9464 0.9728
No log 35.1667 422 1.1184 0.5987 1.1184 1.0576
No log 35.3333 424 1.2534 0.5799 1.2534 1.1195
No log 35.5 426 1.2576 0.5799 1.2576 1.1214
No log 35.6667 428 1.1211 0.6303 1.1211 1.0588
No log 35.8333 430 0.9402 0.6345 0.9402 0.9697
No log 36.0 432 0.7925 0.6715 0.7925 0.8902
No log 36.1667 434 0.7623 0.6716 0.7623 0.8731
No log 36.3333 436 0.7868 0.6202 0.7868 0.8870
No log 36.5 438 0.8479 0.6466 0.8479 0.9208
No log 36.6667 440 0.9833 0.6015 0.9833 0.9916
No log 36.8333 442 1.1693 0.5390 1.1693 1.0814
No log 37.0 444 1.2201 0.5696 1.2201 1.1046
No log 37.1667 446 1.1471 0.5571 1.1471 1.0710
No log 37.3333 448 1.0076 0.5954 1.0076 1.0038
No log 37.5 450 0.9028 0.5736 0.9028 0.9501
No log 37.6667 452 0.8617 0.6107 0.8617 0.9283
No log 37.8333 454 0.8541 0.5954 0.8541 0.9242
No log 38.0 456 0.9176 0.6107 0.9176 0.9579
No log 38.1667 458 1.0548 0.6069 1.0548 1.0270
No log 38.3333 460 1.0936 0.6040 1.0936 1.0458
No log 38.5 462 1.0499 0.6131 1.0499 1.0247
No log 38.6667 464 0.9557 0.6364 0.9557 0.9776
No log 38.8333 466 0.8893 0.6466 0.8893 0.9430
No log 39.0 468 0.8626 0.6466 0.8626 0.9288
No log 39.1667 470 0.8794 0.6466 0.8794 0.9378
No log 39.3333 472 0.9118 0.6466 0.9118 0.9549
No log 39.5 474 0.9480 0.6571 0.9480 0.9737
No log 39.6667 476 0.9496 0.6974 0.9496 0.9745
No log 39.8333 478 0.9137 0.6757 0.9137 0.9559
No log 40.0 480 0.8781 0.6620 0.8781 0.9371
No log 40.1667 482 0.8404 0.6569 0.8404 0.9167
No log 40.3333 484 0.8342 0.6569 0.8342 0.9133
No log 40.5 486 0.8288 0.6569 0.8288 0.9104
No log 40.6667 488 0.8554 0.6569 0.8554 0.9249
No log 40.8333 490 0.9136 0.6324 0.9136 0.9558
No log 41.0 492 0.9985 0.6790 0.9985 0.9993
No log 41.1667 494 1.0040 0.6748 1.0040 1.0020
No log 41.3333 496 0.9728 0.6709 0.9728 0.9863
No log 41.5 498 0.9948 0.6301 0.9948 0.9974
0.3411 41.6667 500 1.0167 0.6222 1.0167 1.0083
0.3411 41.8333 502 0.9912 0.6107 0.9912 0.9956
0.3411 42.0 504 1.0097 0.5938 1.0097 1.0048
0.3411 42.1667 506 1.0568 0.5938 1.0568 1.0280
0.3411 42.3333 508 1.0554 0.5938 1.0554 1.0273
0.3411 42.5 510 1.0606 0.5970 1.0606 1.0299
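In the table above, the Validation Loss and Mse columns are identical in every row, consistent with a mean-squared-error regression objective, and Rmse is simply the square root of Mse. A quick sanity check against the final row:

```python
import math

mse = 1.0606           # final-row Mse (equals the reported validation loss)
rmse = math.sqrt(mse)  # take the square root to recover Rmse
print(round(rmse, 4))  # matches the reported Rmse of 1.0299
```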

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1