ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k7_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0061
  • Qwk: 0.3826
  • Mse: 1.0061
  • Rmse: 1.0031
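Here Qwk is Cohen's quadratic weighted kappa (an agreement score for ordinal labels, at most 1), and Rmse is the square root of Mse. A minimal from-scratch sketch of both metrics, assuming integer ordinal labels — the exact metric code used for this run is not shown in the card:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights w_ij = (i - j)^2 / (n - 1)^2."""
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms of true and predicted labels.
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes))
                 for j in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]
            # Expected count under independence of the two raters.
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], 3))  # 1.0 (perfect agreement)
print(math.sqrt(mse([0, 1, 2, 2], [0, 1, 1, 2])))               # 0.5 (RMSE)
```

With quadratic weights, a prediction that misses by two classes is penalized four times as heavily as an off-by-one error, which suits ordinal scoring labels like the ones this task appears to use.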

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
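The hyperparameters above map onto the Hugging Face `TrainingArguments` roughly as sketched below; `output_dir` is a hypothetical placeholder, and the Adam betas/epsilon listed above are the Trainer's defaults, so they need no explicit optimizer setup. This is a sketch of the implied configuration, not the actual training script:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./arabert-task2-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default
    # AdamW configuration used by the Trainer.
)
```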

Training results

The evaluation set was scored every 2 optimization steps. The training loss is only logged from step 500 onward, which is why the first column reads "No log" for most of the run.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0870 2 4.6297 0.0010 4.6297 2.1517
No log 0.1739 4 2.6809 0.0620 2.6809 1.6373
No log 0.2609 6 1.7704 0.0198 1.7704 1.3306
No log 0.3478 8 2.0792 0.0022 2.0792 1.4419
No log 0.4348 10 1.7294 0.0201 1.7294 1.3151
No log 0.5217 12 1.3715 -0.0089 1.3715 1.1711
No log 0.6087 14 1.2247 0.1370 1.2247 1.1067
No log 0.6957 16 1.1787 0.0941 1.1787 1.0857
No log 0.7826 18 1.2165 0.0599 1.2165 1.1030
No log 0.8696 20 1.2819 -0.0180 1.2819 1.1322
No log 0.9565 22 1.3227 -0.0031 1.3227 1.1501
No log 1.0435 24 1.2700 0.0529 1.2700 1.1269
No log 1.1304 26 1.2040 0.1543 1.2040 1.0973
No log 1.2174 28 1.1948 0.1343 1.1948 1.0931
No log 1.3043 30 1.2432 0.0157 1.2432 1.1150
No log 1.3913 32 1.1679 0.0613 1.1679 1.0807
No log 1.4783 34 1.1117 0.1541 1.1117 1.0544
No log 1.5652 36 1.1075 0.2417 1.1075 1.0524
No log 1.6522 38 1.0679 0.2188 1.0679 1.0334
No log 1.7391 40 1.0700 0.1786 1.0700 1.0344
No log 1.8261 42 1.0743 0.2835 1.0743 1.0365
No log 1.9130 44 0.9998 0.3603 0.9998 0.9999
No log 2.0 46 0.9299 0.4180 0.9299 0.9643
No log 2.0870 48 0.9554 0.3382 0.9554 0.9775
No log 2.1739 50 0.9101 0.3728 0.9101 0.9540
No log 2.2609 52 0.9067 0.4140 0.9067 0.9522
No log 2.3478 54 0.9686 0.4157 0.9686 0.9842
No log 2.4348 56 1.0901 0.3032 1.0901 1.0441
No log 2.5217 58 1.0585 0.2640 1.0585 1.0288
No log 2.6087 60 1.0836 0.1904 1.0836 1.0410
No log 2.6957 62 1.0347 0.3487 1.0347 1.0172
No log 2.7826 64 0.9668 0.4466 0.9668 0.9833
No log 2.8696 66 1.0242 0.2893 1.0242 1.0120
No log 2.9565 68 1.0937 0.2409 1.0936 1.0458
No log 3.0435 70 1.1025 0.2610 1.1025 1.0500
No log 3.1304 72 1.1092 0.2893 1.1092 1.0532
No log 3.2174 74 1.1283 0.3387 1.1283 1.0622
No log 3.3043 76 1.0946 0.3852 1.0946 1.0462
No log 3.3913 78 1.1130 0.4435 1.1130 1.0550
No log 3.4783 80 1.0770 0.4318 1.0770 1.0378
No log 3.5652 82 1.0189 0.4440 1.0189 1.0094
No log 3.6522 84 1.0145 0.3066 1.0145 1.0072
No log 3.7391 86 1.0176 0.3615 1.0176 1.0088
No log 3.8261 88 0.9430 0.3408 0.9430 0.9711
No log 3.9130 90 1.0891 0.4936 1.0891 1.0436
No log 4.0 92 1.0968 0.4538 1.0968 1.0473
No log 4.0870 94 0.9177 0.3979 0.9177 0.9580
No log 4.1739 96 0.8677 0.4219 0.8677 0.9315
No log 4.2609 98 0.9338 0.4615 0.9338 0.9663
No log 4.3478 100 0.8743 0.4219 0.8743 0.9350
No log 4.4348 102 0.8465 0.4519 0.8465 0.9201
No log 4.5217 104 0.8659 0.4948 0.8659 0.9305
No log 4.6087 106 0.8743 0.4948 0.8743 0.9350
No log 4.6957 108 0.8859 0.4587 0.8859 0.9412
No log 4.7826 110 0.9750 0.4533 0.9750 0.9874
No log 4.8696 112 0.9139 0.4986 0.9139 0.9560
No log 4.9565 114 0.8387 0.4714 0.8387 0.9158
No log 5.0435 116 0.8457 0.4714 0.8457 0.9196
No log 5.1304 118 0.8184 0.4828 0.8184 0.9047
No log 5.2174 120 0.8046 0.5076 0.8046 0.8970
No log 5.3043 122 0.7919 0.4980 0.7919 0.8899
No log 5.3913 124 0.8171 0.4865 0.8171 0.9039
No log 5.4783 126 0.8508 0.5026 0.8508 0.9224
No log 5.5652 128 0.8432 0.4902 0.8432 0.9182
No log 5.6522 130 0.8187 0.4902 0.8187 0.9048
No log 5.7391 132 0.8113 0.4787 0.8113 0.9007
No log 5.8261 134 0.8296 0.5041 0.8296 0.9108
No log 5.9130 136 0.9029 0.5190 0.9029 0.9502
No log 6.0 138 0.8968 0.5218 0.8968 0.9470
No log 6.0870 140 0.8450 0.4808 0.8450 0.9193
No log 6.1739 142 0.8822 0.3781 0.8822 0.9392
No log 6.2609 144 0.8649 0.4813 0.8649 0.9300
No log 6.3478 146 0.9217 0.4833 0.9217 0.9601
No log 6.4348 148 1.0749 0.4332 1.0749 1.0368
No log 6.5217 150 1.0888 0.4332 1.0888 1.0435
No log 6.6087 152 0.9676 0.4408 0.9676 0.9836
No log 6.6957 154 0.9183 0.4631 0.9183 0.9583
No log 6.7826 156 0.9367 0.4898 0.9367 0.9678
No log 6.8696 158 0.9124 0.5012 0.9124 0.9552
No log 6.9565 160 0.8966 0.4823 0.8966 0.9469
No log 7.0435 162 0.9579 0.5000 0.9579 0.9787
No log 7.1304 164 1.0976 0.4964 1.0976 1.0477
No log 7.2174 166 1.0391 0.4801 1.0391 1.0194
No log 7.3043 168 0.8745 0.4676 0.8745 0.9352
No log 7.3913 170 0.8821 0.4841 0.8821 0.9392
No log 7.4783 172 0.8965 0.4144 0.8965 0.9468
No log 7.5652 174 1.0400 0.3902 1.0400 1.0198
No log 7.6522 176 1.0431 0.3792 1.0431 1.0213
No log 7.7391 178 0.8986 0.4493 0.8986 0.9479
No log 7.8261 180 1.1038 0.4002 1.1038 1.0506
No log 7.9130 182 1.3537 0.3131 1.3537 1.1635
No log 8.0 184 1.1497 0.3879 1.1497 1.0723
No log 8.0870 186 0.8736 0.4548 0.8736 0.9347
No log 8.1739 188 0.9344 0.5209 0.9344 0.9666
No log 8.2609 190 1.0527 0.5201 1.0527 1.0260
No log 8.3478 192 1.0099 0.5289 1.0099 1.0049
No log 8.4348 194 0.9357 0.4906 0.9357 0.9673
No log 8.5217 196 0.9586 0.5185 0.9586 0.9791
No log 8.6087 198 1.0296 0.4902 1.0296 1.0147
No log 8.6957 200 0.9144 0.4752 0.9144 0.9562
No log 8.7826 202 0.8371 0.4612 0.8371 0.9149
No log 8.8696 204 0.9388 0.5201 0.9388 0.9689
No log 8.9565 206 0.9676 0.5346 0.9676 0.9837
No log 9.0435 208 0.8693 0.4382 0.8693 0.9324
No log 9.1304 210 0.8774 0.5579 0.8774 0.9367
No log 9.2174 212 0.9157 0.3723 0.9157 0.9569
No log 9.3043 214 0.8325 0.5233 0.8325 0.9124
No log 9.3913 216 0.8531 0.4956 0.8531 0.9236
No log 9.4783 218 0.8810 0.5264 0.8810 0.9386
No log 9.5652 220 0.8231 0.5029 0.8231 0.9072
No log 9.6522 222 0.7988 0.4555 0.7988 0.8938
No log 9.7391 224 0.8015 0.4352 0.8015 0.8953
No log 9.8261 226 0.8090 0.5011 0.8090 0.8994
No log 9.9130 228 0.8445 0.4771 0.8445 0.9189
No log 10.0 230 0.9367 0.3941 0.9367 0.9678
No log 10.0870 232 0.9393 0.3941 0.9393 0.9692
No log 10.1739 234 0.8392 0.5194 0.8392 0.9161
No log 10.2609 236 0.9025 0.5068 0.9025 0.9500
No log 10.3478 238 0.9641 0.4945 0.9641 0.9819
No log 10.4348 240 0.8932 0.4689 0.8932 0.9451
No log 10.5217 242 0.8553 0.4884 0.8553 0.9248
No log 10.6087 244 0.8554 0.4527 0.8553 0.9249
No log 10.6957 246 0.9124 0.5090 0.9124 0.9552
No log 10.7826 248 1.0725 0.3332 1.0725 1.0356
No log 10.8696 250 1.2302 0.3902 1.2302 1.1091
No log 10.9565 252 1.1517 0.3716 1.1517 1.0732
No log 11.0435 254 0.9597 0.3556 0.9597 0.9796
No log 11.1304 256 0.8685 0.4845 0.8685 0.9319
No log 11.2174 258 0.8609 0.4979 0.8609 0.9279
No log 11.3043 260 0.8907 0.4164 0.8907 0.9438
No log 11.3913 262 1.0291 0.4218 1.0291 1.0144
No log 11.4783 264 1.0936 0.4077 1.0936 1.0458
No log 11.5652 266 1.0180 0.4418 1.0180 1.0089
No log 11.6522 268 0.8735 0.3926 0.8735 0.9346
No log 11.7391 270 0.8943 0.4587 0.8943 0.9457
No log 11.8261 272 0.9600 0.4175 0.9600 0.9798
No log 11.9130 274 0.8996 0.4587 0.8996 0.9485
No log 12.0 276 0.8484 0.3854 0.8484 0.9211
No log 12.0870 278 0.9284 0.3936 0.9284 0.9636
No log 12.1739 280 0.9754 0.3543 0.9754 0.9876
No log 12.2609 282 0.9184 0.4429 0.9184 0.9583
No log 12.3478 284 0.8644 0.4476 0.8644 0.9297
No log 12.4348 286 0.8580 0.4409 0.8580 0.9263
No log 12.5217 288 0.8534 0.4757 0.8534 0.9238
No log 12.6087 290 0.8486 0.4963 0.8486 0.9212
No log 12.6957 292 0.9043 0.4501 0.9043 0.9510
No log 12.7826 294 0.9535 0.3830 0.9535 0.9765
No log 12.8696 296 0.9223 0.4166 0.9223 0.9604
No log 12.9565 298 0.8486 0.4203 0.8486 0.9212
No log 13.0435 300 0.8627 0.4874 0.8627 0.9288
No log 13.1304 302 0.9023 0.4923 0.9023 0.9499
No log 13.2174 304 0.8819 0.5042 0.8819 0.9391
No log 13.3043 306 0.8632 0.3979 0.8632 0.9291
No log 13.3913 308 0.8867 0.4568 0.8867 0.9416
No log 13.4783 310 0.9469 0.4631 0.9469 0.9731
No log 13.5652 312 0.9591 0.4295 0.9591 0.9793
No log 13.6522 314 0.9686 0.4065 0.9686 0.9842
No log 13.7391 316 1.0282 0.2930 1.0282 1.0140
No log 13.8261 318 1.0232 0.3141 1.0232 1.0115
No log 13.9130 320 0.9772 0.3147 0.9772 0.9886
No log 14.0 322 0.9487 0.3551 0.9487 0.9740
No log 14.0870 324 0.9369 0.3820 0.9369 0.9679
No log 14.1739 326 0.9541 0.4164 0.9541 0.9768
No log 14.2609 328 1.0159 0.3220 1.0159 1.0079
No log 14.3478 330 0.9727 0.3913 0.9727 0.9862
No log 14.4348 332 0.9203 0.3926 0.9203 0.9593
No log 14.5217 334 0.8796 0.4385 0.8796 0.9379
No log 14.6087 336 0.8604 0.4526 0.8604 0.9276
No log 14.6957 338 0.8542 0.4620 0.8542 0.9242
No log 14.7826 340 0.9207 0.4166 0.9207 0.9595
No log 14.8696 342 1.0509 0.3982 1.0509 1.0251
No log 14.9565 344 1.0513 0.3563 1.0513 1.0253
No log 15.0435 346 0.9453 0.4037 0.9453 0.9723
No log 15.1304 348 0.8920 0.4242 0.8920 0.9444
No log 15.2174 350 0.8784 0.4340 0.8784 0.9372
No log 15.3043 352 0.8663 0.4764 0.8663 0.9308
No log 15.3913 354 0.8166 0.4644 0.8166 0.9037
No log 15.4783 356 0.7853 0.4778 0.7853 0.8861
No log 15.5652 358 0.7679 0.5462 0.7679 0.8763
No log 15.6522 360 0.8013 0.4802 0.8013 0.8952
No log 15.7391 362 0.8527 0.5170 0.8527 0.9234
No log 15.8261 364 0.8287 0.5135 0.8287 0.9103
No log 15.9130 366 0.7702 0.4636 0.7702 0.8776
No log 16.0 368 0.7534 0.5672 0.7534 0.8680
No log 16.0870 370 0.7614 0.5864 0.7614 0.8726
No log 16.1739 372 0.7576 0.5633 0.7576 0.8704
No log 16.2609 374 0.7520 0.5241 0.7520 0.8672
No log 16.3478 376 0.7521 0.5466 0.7521 0.8672
No log 16.4348 378 0.7612 0.5098 0.7612 0.8725
No log 16.5217 380 0.7722 0.5136 0.7722 0.8787
No log 16.6087 382 0.7794 0.5365 0.7794 0.8828
No log 16.6957 384 0.8324 0.5261 0.8324 0.9123
No log 16.7826 386 0.9542 0.5080 0.9542 0.9769
No log 16.8696 388 1.0369 0.5069 1.0369 1.0183
No log 16.9565 390 1.0169 0.5069 1.0169 1.0084
No log 17.0435 392 0.8775 0.4927 0.8775 0.9368
No log 17.1304 394 0.8060 0.4920 0.8060 0.8978
No log 17.2174 396 0.7954 0.4920 0.7954 0.8918
No log 17.3043 398 0.7675 0.4846 0.7675 0.8761
No log 17.3913 400 0.7501 0.5178 0.7501 0.8661
No log 17.4783 402 0.7490 0.5299 0.7490 0.8655
No log 17.5652 404 0.7524 0.5300 0.7524 0.8674
No log 17.6522 406 0.7786 0.5013 0.7786 0.8824
No log 17.7391 408 0.7995 0.5094 0.7995 0.8941
No log 17.8261 410 0.8756 0.4545 0.8756 0.9357
No log 17.9130 412 0.9051 0.4933 0.9051 0.9514
No log 18.0 414 0.8457 0.4321 0.8457 0.9196
No log 18.0870 416 0.7951 0.5102 0.7951 0.8917
No log 18.1739 418 0.8017 0.4563 0.8017 0.8953
No log 18.2609 420 0.8032 0.4534 0.8032 0.8962
No log 18.3478 422 0.8224 0.4142 0.8224 0.9069
No log 18.4348 424 0.8765 0.3945 0.8765 0.9362
No log 18.5217 426 0.9374 0.4771 0.9374 0.9682
No log 18.6087 428 0.9020 0.5 0.9020 0.9497
No log 18.6957 430 0.8582 0.4413 0.8582 0.9264
No log 18.7826 432 0.8637 0.5068 0.8637 0.9293
No log 18.8696 434 0.8614 0.4840 0.8614 0.9281
No log 18.9565 436 0.8272 0.4774 0.8272 0.9095
No log 19.0435 438 0.8052 0.5250 0.8052 0.8973
No log 19.1304 440 0.7896 0.5192 0.7896 0.8886
No log 19.2174 442 0.7925 0.5195 0.7925 0.8902
No log 19.3043 444 0.7894 0.5459 0.7894 0.8885
No log 19.3913 446 0.7821 0.5274 0.7821 0.8844
No log 19.4783 448 0.8058 0.4871 0.8058 0.8977
No log 19.5652 450 0.8664 0.4833 0.8664 0.9308
No log 19.6522 452 0.8871 0.5029 0.8871 0.9419
No log 19.7391 454 0.8445 0.4871 0.8445 0.9190
No log 19.8261 456 0.8146 0.4898 0.8146 0.9026
No log 19.9130 458 0.8218 0.4898 0.8218 0.9065
No log 20.0 460 0.8257 0.4637 0.8257 0.9087
No log 20.0870 462 0.8448 0.4470 0.8448 0.9191
No log 20.1739 464 0.8591 0.4065 0.8591 0.9269
No log 20.2609 466 0.9104 0.4264 0.9104 0.9541
No log 20.3478 468 0.9325 0.4262 0.9325 0.9657
No log 20.4348 470 0.9124 0.4710 0.9124 0.9552
No log 20.5217 472 0.9151 0.4700 0.9151 0.9566
No log 20.6087 474 0.9023 0.4700 0.9023 0.9499
No log 20.6957 476 0.8342 0.4789 0.8342 0.9133
No log 20.7826 478 0.8621 0.4590 0.8621 0.9285
No log 20.8696 480 0.8619 0.5180 0.8619 0.9284
No log 20.9565 482 0.8262 0.5462 0.8262 0.9090
No log 21.0435 484 0.8493 0.4805 0.8493 0.9216
No log 21.1304 486 0.8810 0.4968 0.8810 0.9386
No log 21.2174 488 0.8564 0.4876 0.8564 0.9254
No log 21.3043 490 0.8661 0.4465 0.8661 0.9306
No log 21.3913 492 0.9167 0.4390 0.9167 0.9574
No log 21.4783 494 0.9426 0.4034 0.9426 0.9709
No log 21.5652 496 0.9660 0.4129 0.9660 0.9829
No log 21.6522 498 1.0633 0.4104 1.0633 1.0312
0.3238 21.7391 500 1.0771 0.4295 1.0771 1.0378
0.3238 21.8261 502 0.9689 0.5130 0.9689 0.9843
0.3238 21.9130 504 0.8998 0.4777 0.8998 0.9486
0.3238 22.0 506 0.9188 0.4014 0.9188 0.9585
0.3238 22.0870 508 0.9218 0.4141 0.9218 0.9601
0.3238 22.1739 510 0.8810 0.4231 0.8810 0.9386
0.3238 22.2609 512 0.8767 0.4889 0.8767 0.9363
0.3238 22.3478 514 0.9332 0.4898 0.9332 0.9660
0.3238 22.4348 516 0.9063 0.4911 0.9063 0.9520
0.3238 22.5217 518 0.8511 0.4611 0.8511 0.9225
0.3238 22.6087 520 0.8679 0.4401 0.8679 0.9316
0.3238 22.6957 522 0.9306 0.3747 0.9306 0.9647
0.3238 22.7826 524 0.9247 0.3951 0.9247 0.9616
0.3238 22.8696 526 0.8724 0.3970 0.8724 0.9340
0.3238 22.9565 528 0.8834 0.4304 0.8834 0.9399
0.3238 23.0435 530 0.9533 0.3634 0.9533 0.9764
0.3238 23.1304 532 0.9513 0.3634 0.9513 0.9753
0.3238 23.2174 534 0.9036 0.4062 0.9036 0.9506
0.3238 23.3043 536 0.8696 0.4656 0.8696 0.9325
0.3238 23.3913 538 0.8600 0.4656 0.8600 0.9274
0.3238 23.4783 540 0.8538 0.4705 0.8538 0.9240
0.3238 23.5652 542 0.8417 0.4450 0.8417 0.9174
0.3238 23.6522 544 0.8359 0.5085 0.8359 0.9143
0.3238 23.7391 546 0.8223 0.5207 0.8223 0.9068
0.3238 23.8261 548 0.8206 0.5203 0.8206 0.9059
0.3238 23.9130 550 0.8766 0.4552 0.8766 0.9363
0.3238 24.0 552 0.8824 0.4858 0.8824 0.9394
0.3238 24.0870 554 0.8563 0.4439 0.8563 0.9254
0.3238 24.1739 556 0.8458 0.4757 0.8458 0.9197
0.3238 24.2609 558 0.8503 0.4757 0.8503 0.9221
0.3238 24.3478 560 0.8512 0.4757 0.8512 0.9226
0.3238 24.4348 562 0.8806 0.4553 0.8806 0.9384
0.3238 24.5217 564 0.9572 0.4463 0.9572 0.9784
0.3238 24.6087 566 1.0444 0.4138 1.0444 1.0219
0.3238 24.6957 568 1.1195 0.4216 1.1195 1.0580
0.3238 24.7826 570 1.0927 0.4216 1.0927 1.0453
0.3238 24.8696 572 1.0022 0.3945 1.0022 1.0011
0.3238 24.9565 574 0.9478 0.4533 0.9478 0.9736
0.3238 25.0435 576 0.9001 0.4612 0.9001 0.9487
0.3238 25.1304 578 0.8962 0.4808 0.8962 0.9467
0.3238 25.2174 580 0.9295 0.3375 0.9295 0.9641
0.3238 25.3043 582 1.0079 0.3826 1.0079 1.0039
0.3238 25.3913 584 1.0809 0.3525 1.0809 1.0396
0.3238 25.4783 586 1.0803 0.3536 1.0803 1.0394
0.3238 25.5652 588 1.0061 0.3826 1.0061 1.0031
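A consistency check on the log above: the table advances 23 steps per epoch (e.g. step 24 at epoch 1.0435), which at train_batch_size 8 implies a training set of roughly 177–184 examples, and logging stops at step 588 (epoch ≈ 25.57) even though num_epochs is 100 — so the run did not train for the full 100 epochs (early stopping with best-checkpoint selection is a plausible but undocumented explanation). The arithmetic:

```python
# Pure arithmetic on values taken from the table above.
steps_per_epoch = 24 / 1.0435   # step 24 was logged at epoch 1.0435
print(round(steps_per_epoch))   # 23
# With batch size 8 and 23 steps per epoch, the training set holds
# between 8*22 + 1 = 177 and 8*23 = 184 examples.
print(588 / 23)                 # 25.565..., matching the final logged epoch 25.5652
```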

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
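To reproduce this environment, the pinned versions above can be installed as below; the cu118 index URL for the PyTorch wheel is an assumption based on the `+cu118` build tag:

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```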
Model tree

MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k7_task2_organization, fine-tuned from aubmindlab/bert-base-arabertv02.